Touch detecting interactive display
First Claim
1. A method of operating an interactive display, comprising the steps of:
providing a display surface;
displaying imagery coincident with said display surface;
detecting at least one contact location at which one user contacts said surface;
storing a position history for each contact location;
determining velocity information for each contact location based on said position history;
identifying at least one user gesture based on any of:
said position history, and said velocity information;
measuring an intensity with which said user contacts said display surface;
associating each gesture and said intensity with a command;
executing said command to alter said imagery; and
performing a visual analysis of a material layer mechanically coupled to said display surface, said material layer changing optical properties in response to an applied pressure;
whereby said user controls said interactive display by direct physical interaction with said imagery.
Abstract
The invention provides an interactive display that is controlled by user gestures identified on a touch detecting display surface. In the preferred embodiment of the invention, imagery is projected onto a horizontal projection surface from a projector located above the projection surface. The locations where a user contacts the projection surface are detected using a set of infrared emitters and receivers arrayed around the perimeter of the projection surface. For each contact location, a computer software application stores a history of contact position information and, from the position history, determines a velocity for each contact location. Based upon the position history and the velocity information, gestures are identified. The identified gestures are associated with display commands that are executed to update the displayed imagery accordingly. Thus, the invention enables users to control the display through direct physical interaction with the imagery.
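The abstract's pipeline (store a position history per contact, then derive velocity from it) can be sketched as follows. This is an illustrative reconstruction, not code from the patent; all names and the finite-difference velocity estimate are assumptions.

```python
import time
from collections import defaultdict, deque

class ContactTracker:
    def __init__(self, history_len=32):
        # One bounded position history per contact id: (t, x, y) samples.
        self.history = defaultdict(lambda: deque(maxlen=history_len))

    def record(self, contact_id, x, y, t=None):
        self.history[contact_id].append((time.time() if t is None else t, x, y))

    def velocity(self, contact_id):
        """Finite-difference velocity (vx, vy) from the last two samples."""
        h = self.history[contact_id]
        if len(h) < 2:
            return (0.0, 0.0)
        (t0, x0, y0), (t1, x1, y1) = h[-2], h[-1]
        dt = (t1 - t0) or 1e-9  # guard against identical timestamps
        return ((x1 - x0) / dt, (y1 - y0) / dt)
```

A bounded deque keeps the history finite while still providing the older samples that gesture classification (below) compares against.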
79 Claims
1. A method of operating an interactive display, comprising the steps of:
-
providing a display surface;
displaying imagery coincident with said display surface;
detecting at least one contact location at which one user contacts said surface;
storing a position history for each contact location;
determining velocity information for each contact location based on said position history;
identifying at least one user gesture based on any of: said position history, and said velocity information;
measuring an intensity with which said user contacts said display surface;
associating each gesture and said intensity with a command;
executing said command to alter said imagery; and
performing a visual analysis of a material layer mechanically coupled to said display surface, said material layer changing optical properties in response to an applied pressure;
whereby said user controls said interactive display by direct physical interaction with said imagery.
-
-
2. The method of claim 1, wherein said imagery is geographic information systems imagery.
-
3. The method of claim 1, wherein said display surface is a substantially horizontal surface.
-
4. The method of claim 3, said interactive display comprising:
-
a railing; wherein said railing substantially surrounds said display surface; and
wherein said railing provides a visual cue discouraging said user from leaning onto said display surface.
-
-
5. The method of claim 1, wherein said display surface is a projection surface, and said imagery is produced by a projector.
-
6. The method of claim 5, wherein said command is a separating motion and said altered imagery is an inward zoom.
-
7. The method of claim 1, wherein said command is a display control command that when executed effects any of:
-
a panning movement of said imagery, an inward zoom of said imagery, an outward zoom of said imagery, and a rotation of said imagery.
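Claim 7's display control commands (pan, inward zoom, outward zoom, rotation) amount to a dispatch from an identified gesture to an update of the view state. A minimal sketch, with gesture names, the view-state fields, and the step sizes all being illustrative assumptions:

```python
# Map identified gestures to view-state updates; claim 6/17 pair the
# separating/approaching motions with inward/outward zooms respectively.
COMMANDS = {
    "drag":        lambda view: {**view, "x": view["x"] + 10},        # pan
    "separating":  lambda view: {**view, "zoom": view["zoom"] * 1.1}, # inward zoom
    "approaching": lambda view: {**view, "zoom": view["zoom"] / 1.1}, # outward zoom
    "twist":       lambda view: {**view, "angle": view["angle"] + 5}, # rotation
}

def execute(gesture, view):
    """Apply the command associated with a gesture; unknown gestures are no-ops."""
    return COMMANDS.get(gesture, lambda v: v)(view)
```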
-
-
8. The method of claim 1, wherein said detecting step comprises any of the steps of:
-
operating at least one infrared emitter and receiver pair; operating a capacitive touch pad; operating an ultrasound system; operating a resistive touch pad; and the step of measuring an intensity with which said user contacts said display surface.
-
-
9. The method of claim 1, wherein said identifying step, said associating step, and said executing step collectively comprise the steps of:
-
pairing each contact location with a pixel within said imagery; and updating said imagery to maintain said pixel coincident with each corresponding contact location.
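The pairing described in claim 9 (and repeated in claims 41 and 59) is the familiar direct-manipulation drag: the image pixel under the initial contact is "grabbed", and the imagery is translated so that pixel stays coincident with the moving contact. A hypothetical sketch, with all names illustrative:

```python
class DragController:
    def __init__(self, offset=(0, 0)):
        self.offset = list(offset)  # imagery translation in screen space
        self.anchor = {}            # contact id -> grabbed image pixel

    def touch_down(self, cid, sx, sy):
        # Map the screen point to the image pixel under the current offset.
        self.anchor[cid] = (sx - self.offset[0], sy - self.offset[1])

    def touch_move(self, cid, sx, sy):
        # Re-solve the offset so the grabbed pixel sits under the contact.
        px, py = self.anchor[cid]
        self.offset = [sx - px, sy - py]
        return tuple(self.offset)
```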
-
-
10. The method of claim 1, wherein said identifying step comprises the step of:
classifying said position history and said velocity information as one of a set of distinct gestures.
-
11. The method of claim 10, wherein said classifying is based on any of:
-
a direction of said velocity information for said contact location; a difference of two directions; a comparison of at least two directions; a length of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location; and a direction of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location.
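One of the classification criteria claim 11 lists, the length of the line segment between points in two contacts' histories, suffices to separate the pinch gestures of claims 6 and 17: the inter-contact distance grows for a separating motion and shrinks for an approaching one. A hypothetical two-contact classifier in that spirit, with the threshold ratio and labels being assumptions:

```python
import math

def classify_pair(history_a, history_b, ratio=1.2):
    """history_*: list of (x, y) points for one contact, oldest first."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    d_start = dist(history_a[0], history_b[0])
    d_end = dist(history_a[-1], history_b[-1])
    if d_end > d_start * ratio:
        return "separating"    # associated with an inward zoom (claim 6)
    if d_end * ratio < d_start:
        return "approaching"   # associated with an outward zoom (claim 17)
    return "other"
```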
-
-
12. The method of claim 1, wherein said command, when executed, effects any of:
-
a selection of an object represented within said imagery; a selection of a menu item represented within said imagery; a selection of a submenu item represented within said imagery; a change in a transparency of said imagery; a change in a visibility of at least one imagery layer within said imagery; and a change in a transparency of at least one imagery layer within said imagery.
-
-
13. The method of claim 1, wherein said imagery comprises a representation of at least one control interface.
-
14. The method of claim 13, wherein said at least one control interface comprises a menu.
-
15. The method of claim 13, wherein said at least one control interface is positioned proximate to and oriented towards a first location on a periphery of said display surface;
wherein said command, when executed, effects any of a movement, and a reorientation of said at least one control interface; and
wherein said at least one control interface is positioned proximate to and oriented towards a second location on a periphery of said display surface.
-
16. The method of claim 15, wherein said position history indicates a contact location proximate to said second location.
-
17. The method of claim 1, wherein said command is an approaching motion and said altered imagery is an outward zoom.
-
18. The method of claim 1, wherein said intensity comprises a force determined using measurements obtained by load cells mechanically coupled with said display surface.
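Claim 18's intensity measurement (force from load cells mechanically coupled to the display surface) can be sketched as summing the per-cell readings minus a resting tare. The function names, the tare approach, and the discrete intensity thresholds are illustrative assumptions, not from the patent:

```python
def contact_force(cell_readings, tare_readings):
    """Total contact force from per-cell readings minus resting tare values."""
    return sum(r - t for r, t in zip(cell_readings, tare_readings))

def intensity_label(force, light=2.0, firm=10.0):
    # Illustrative thresholds mapping a force to a discrete intensity class.
    if force < light:
        return "light"
    return "firm" if force >= firm else "medium"
```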
-
19. An interactive display, comprising:
-
a display surface;
imagery coincident with said display surface;
means for detecting at least one contact location at which at least one user contacts said display surface;
means for storing a position history for each contact location;
means for determining velocity information for each contact location based on said position history;
means for identifying at least one user gesture based on any of: said position history, and said velocity information;
means for measuring intensity with which said user contacts said display surface;
means for associating each gesture and said measured intensity with a command;
means for executing said command to alter said imagery;
means for performing a visual analysis of a material layer mechanically coupled to said display surface, said material layer changing at least one optical property in response to an applied pressure;
whereby said user controls said interactive display by direct physical interaction with said imagery.
-
-
20. The interactive display of claim 19, wherein said display surface is a substantially horizontal surface.
-
21. The interactive display of claim 19, further comprising:
-
a railing, wherein said railing substantially surrounds said display surface; and wherein said railing provides a visual cue discouraging said user from leaning onto said display surface.
-
-
22. The interactive display of claim 19, wherein said display surface is a projection surface, and said imagery is produced by a projector.
-
23. The interactive display of claim 22, wherein said display surface is a front projection surface.
-
24. The interactive display of claim 19, wherein said command is a display control command that when executed effects any of:
-
a panning movement of said imagery, an inward zoom of said imagery, an outward zoom of said imagery, and a rotation of said imagery.
-
-
25. The interactive display of claim 19, said means for detecting comprising:
-
at least one infrared emitter and receiver pair; a capacitive touch pad; an ultrasound system; and a resistive touch pad.
-
-
26. The interactive display of claim 19, further comprising:
means for classifying said position history and said velocity information as one of a set of distinct gestures.
-
27. The interactive display of claim 26, said classifying means performing said classifying based on any of:
-
a direction of said velocity information for said contact location; a difference of two directions; a comparison of at least two directions; a length of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location; and a direction of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location.
-
-
28. The interactive display of claim 19, wherein said imagery comprises a representation of at least one control interface.
-
29. The interactive display of claim 28, wherein said at least one control interface comprises a menu.
-
30. The interactive display of claim 19,
wherein said command is an approaching motion and said altered imagery is an outward zoom.
-
31. The interactive display of claim 19, wherein said intensity comprises a force determined using measurements obtained by load cells mechanically coupled with said display surface.
-
32. An interactive display, comprising:
-
a display surface;
imagery coincident with said display surface;
means for detecting at least one contact location at which at least one user contacts said display surface;
means for storing a position history for each contact location;
means for determining velocity information for each contact location based on said position history;
means for identifying at least one user gesture based on any of: said position history, and said velocity information;
means for measuring intensity with which said user contacts said display surface;
means for associating each gesture and said measured intensity with a command; and
means for executing said command to alter said imagery;
wherein said at least one control interface is positioned proximate to and oriented towards a first location on a periphery of said display surface;
wherein said command, when executed, effects any of a movement, and a reorientation of said at least one control interface; and
wherein said at least one control interface is positioned proximate to and oriented towards a second location on a periphery of said display surface.
-
-
33. A method of operating an interactive display, comprising the steps of:
-
providing a display surface;
displaying imagery coincident with said display surface;
detecting at least one contact location at which one user contacts said surface, said detecting further comprising performing a visual analysis of a material layer mechanically coupled to said display surface, said material layer changing optical properties in response to an applied pressure;
storing a position history for each contact location;
determining velocity information for each contact location based on said position history;
identifying at least one user gesture based on any of: said position history, and said velocity information;
associating each gesture with a command; and
executing said command to alter said imagery;
whereby said user controls said interactive display by direct physical interaction with said imagery.
-
-
34. The method of claim 33, wherein said imagery is geographic information systems imagery.
-
35. The method of claim 33, wherein said display surface is a substantially horizontal surface.
-
36. The method of claim 33, said interactive display comprising:
a railing; wherein said railing substantially surrounds said display surface; and wherein said railing provides a visual cue discouraging said user from leaning onto said display surface.
-
37. The method of claim 33, wherein said display surface is a projection surface, and said imagery is produced by a projector.
-
38. The method of claim 37, wherein said display surface is a front projection surface.
-
39. The method of claim 33, wherein said command is a display control command that when executed effects any of:
-
a panning movement of said imagery, an inward zoom of said imagery, an outward zoom of said imagery, and a rotation of said imagery.
-
-
40. The method of claim 33, wherein said detecting step comprises any of the steps of:
-
operating at least one infrared emitter and receiver pair; operating a capacitive touch pad; operating an ultrasound system; operating a resistive touch pad; and the step of measuring an intensity with which said user contacts said display surface.
-
-
41. The method of claim 33, wherein said identifying step, said associating step, and said executing step collectively comprise the steps of:
-
pairing each contact location with a pixel within said imagery; and updating said imagery to maintain said pixel coincident with each corresponding contact location.
-
-
42. The method of claim 33, wherein said identifying step comprises the step of:
classifying said position history and said velocity information as one of a set of distinct gestures.
-
43. The method of claim 42, wherein said classifying is based on any of:
-
a direction of said velocity information for said contact location; a difference of two directions; a comparison of at least two directions; a length of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location; and a direction of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location.
-
-
44. The method of claim 33, wherein said command, when executed, effects any of:
-
a selection of an object represented within said imagery; a selection of a menu item represented within said imagery; a selection of a submenu item represented within said imagery; a change in a transparency of said imagery; a change in a visibility of at least one imagery layer within said imagery; and a change in a transparency of at least one imagery layer within said imagery.
-
-
45. The method of claim 33, wherein said imagery comprises a representation of at least one control interface.
-
46. The method of claim 45, wherein said at least one control interface comprises a menu.
-
47. The method of claim 45, wherein said at least one control interface is positioned proximate to and oriented towards a first location on a periphery of said display surface;
-
wherein said command, when executed, effects any of a movement, and a reorientation of said at least one control interface; and wherein said at least one control interface is positioned proximate to and oriented towards a second location on a periphery of said display surface.
-
-
48. The method of claim 47, wherein said position history indicates a contact location proximate to said second location.
-
49. The method of claim 33, further comprising the step of:
-
measuring an intensity with which said user contacts said display surface; wherein said gesture is additionally identified based on said intensity.
-
-
50. The method of claim 49, wherein said intensity comprises a force determined using measurements obtained by load cells mechanically coupled with said display surface.
-
51. A method of operating an interactive display, comprising the steps of:
-
providing a display surface;
displaying imagery coincident with said display surface;
detecting at least one contact location at which one user contacts said surface;
storing a position history for each contact location;
determining velocity information for each contact location based on said position history;
measuring an intensity with which said user contacts said display surface;
identifying at least one user gesture based on any of: said position history, said velocity information, and said intensity;
associating each gesture with a command;
executing said command to alter said imagery; and
performing a visual analysis of a material layer mechanically coupled to said display surface, said material layer changing optical properties in response to an applied pressure;
whereby said user controls said interactive display by direct physical interaction with said imagery.
-
-
52. The method of claim 51, wherein said imagery is geographic information systems imagery.
-
53. The method of claim 51, wherein said display surface is a substantially horizontal surface.
-
54. The method of claim 53, said interactive display comprising:
a railing; wherein said railing substantially surrounds said display surface; and wherein said railing provides a visual cue discouraging said user from leaning onto said display surface.
-
55. The method of claim 51, wherein said display surface is a projection surface, and said imagery is produced by a projector.
-
56. The method of claim 55, wherein said display surface is a front projection surface.
-
57. The method of claim 51, wherein said command is a display control command that when executed effects any of:
-
a panning movement of said imagery, an inward zoom of said imagery, an outward zoom of said imagery, and a rotation of said imagery.
-
-
58. The method of claim 51, wherein said detecting step comprises any of the steps of:
-
operating at least one infrared emitter and receiver pair; operating a capacitive touch pad; operating an ultrasound system; operating a resistive touch pad; and the step of measuring an intensity with which said user contacts said display surface.
-
-
59. The method of claim 51, wherein said identifying step, said associating step, and said executing step collectively comprise the steps of:
-
pairing each contact location with a pixel within said imagery; and updating said imagery to maintain said pixel coincident with each corresponding contact location.
-
-
60. The method of claim 51, wherein said identifying step comprises the step of:
classifying said position history and said velocity information as one of a set of distinct gestures.
-
61. The method of claim 60, wherein said classifying is based on any of:
-
a direction of said velocity information for said contact location; a difference of two directions; a comparison of at least two directions; a length of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location; and a direction of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location.
-
-
62. The method of claim 51, wherein said command, when executed, effects any of:
-
a selection of an object represented within said imagery; a selection of a menu item represented within said imagery; a selection of a submenu item represented within said imagery; a change in a transparency of said imagery; a change in a visibility of at least one imagery layer within said imagery; and a change in a transparency of at least one imagery layer within said imagery.
-
-
63. The method of claim 51, wherein said imagery comprises a representation of at least one control interface.
-
64. The method of claim 63, wherein said at least one control interface comprises a menu.
-
65. The method of claim 63, wherein said at least one control interface is positioned proximate to and oriented towards a first location on a periphery of said display surface;
-
wherein said command, when executed, effects any of a movement, and a reorientation of said at least one control interface; and wherein said at least one control interface is positioned proximate to and oriented towards a second location on a periphery of said display surface.
-
-
66. The method of claim 65, wherein said position history indicates a contact location proximate to said second location.
-
67. The method of claim 51, wherein said intensity comprises a force determined using measurements obtained by load cells mechanically coupled with said display surface.
-
68. An interactive display, comprising:
-
a display surface;
imagery coincident with said display surface, wherein the imagery comprises a representation of at least one control interface, wherein said at least one control interface is positioned proximate to and oriented towards a first location on a periphery of said display surface;
wherein said command, when executed, effects any of a movement, and a reorientation of said at least one control interface;
wherein said at least one control interface is positioned proximate to and oriented towards a second location on a periphery of said display surface;
means for detecting at least one contact location at which at least one user contacts said display surface;
means for storing a position history for each contact location;
means for determining velocity information for each contact location based on said position history;
means for measuring intensity with which said user contacts said display surface;
means for identifying at least one user gesture based on any of: said intensity, said position history, and said velocity information;
means for associating each gesture with a command; and
means for executing said command to alter said imagery;
whereby said user controls said interactive display by direct physical interaction with said imagery.
-
-
69. The interactive display of claim 68, wherein said display surface is a substantially horizontal surface.
-
70. The interactive display of claim 68, further comprising:
a railing, wherein said railing substantially surrounds said display surface; and wherein said railing provides a visual cue discouraging said user from leaning onto said display surface.
-
71. The interactive display of claim 68, wherein said display surface is a projection surface, and said imagery is produced by a projector.
-
72. The interactive display of claim 71, wherein said display surface is a front projection surface.
-
73. The interactive display of claim 68, wherein said command is a display control command that when executed effects any of:
-
a panning movement of said imagery, an inward zoom of said imagery, an outward zoom of said imagery, and a rotation of said imagery.
-
-
74. The interactive display of claim 68, said means for detecting comprising:
-
at least one infrared emitter and receiver pair; a capacitive touch pad; an ultrasound system; and a resistive touch pad.
-
-
75. The interactive display of claim 68, further comprising:
means for performing a visual analysis of a material layer mechanically coupled to said display surface, said material layer changing optical properties in response to an applied pressure.
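The visual analysis of claim 75 implies a camera imaging a pressure-sensitive layer whose optical properties change where it is pressed; contacts can then be found where the current frame departs from a reference frame by more than a threshold. A minimal sketch under those assumptions, with frames as nested lists of brightness values and all names illustrative:

```python
def pressed_pixels(frame, reference, threshold=30):
    """Return (row, col) pixels whose brightness change exceeds threshold."""
    hits = []
    for r, (row, ref_row) in enumerate(zip(frame, reference)):
        for c, (v, ref) in enumerate(zip(row, ref_row)):
            if abs(v - ref) > threshold:
                hits.append((r, c))
    return hits
```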
-
76. The interactive display of claim 68, further comprising:
means for classifying said position history and said velocity information as one of a set of distinct gestures.
-
77. The interactive display of claim 76, said classifying means performing said classifying based on any of:
-
a direction of said velocity information for said contact location; a difference of two directions; a comparison of at least two directions; a length of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location; and a direction of a line segment connecting a first point within said position history of a first contact location to a second point within said position history of a second contact location.
-
-
78. The interactive display of claim 77, wherein said at least one control interface comprises a menu.
-
79. The interactive display of claim 68, wherein said intensity comprises a force determined using measurements obtained by load cells mechanically coupled with said display surface.
Specification