System and methods for providing a three-dimensional touch screen
First Claim
1. A method comprising:
managing user interaction in three dimensions with respect to a displayed user interface screen, by:
displaying the user interface screen via a display device, wherein the user interface screen is displayed on a surface;
performing a calibration process by:
detecting position and orientation of the displayed user interface screen based on a pattern displayed onto the surface;
detecting shape, size, and geometry of the surface on which the user interface screen is displayed; and
building a three-dimensional model of the surface on which the user interface screen is displayed;
receiving position information from a visual sensor, the position information reflecting the position of an object in a three-dimensional space located adjacent to the displayed user interface screen, wherein the three-dimensional space is located between the displayed user interface screen and the visual sensor;
generating a user interface event based on the received position information; and
providing the generated user interface event to an application program.
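The claimed steps — calibrating a planar model of the display surface from a displayed pattern, then mapping sensed 3-D positions to user interface events — can be sketched in code. This is a purely illustrative sketch, not the patent's implementation; all names (`calibrate`, `generate_event`, the touch threshold) are assumptions, and it models the surface as a single plane recovered from three detected pattern corners in metric sensor coordinates.

```python
# Illustrative sketch only: calibrate a planar surface model, then classify
# a sensed 3-D position as a "touch" or "hover" event. Names and the
# threshold value are assumptions, not taken from the patent.

TOUCH_THRESHOLD = 0.02  # metres from the surface that count as contact (assumed)

def calibrate(pattern_corners_3d):
    """Build a simple planar model of the projection surface from three
    detected corners of a displayed calibration pattern."""
    p0, p1, p2 = pattern_corners_3d
    # Two in-plane edge vectors from the first corner.
    u = tuple(b - a for a, b in zip(p0, p1))
    v = tuple(b - a for a, b in zip(p0, p2))
    # Plane normal = u x v (cross product).
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    return p0, n

def distance_to_surface(point, plane):
    """Perpendicular distance from a 3-D point to the calibrated plane."""
    origin, n = plane
    d = tuple(b - a for a, b in zip(origin, point))
    dot = sum(a * b for a, b in zip(d, n))
    norm = sum(c * c for c in n) ** 0.5
    return abs(dot) / norm

def generate_event(point, plane):
    """Map a sensed 3-D position to a UI event for the application."""
    if distance_to_surface(point, plane) <= TOUCH_THRESHOLD:
        return ("touch", point[0], point[1])
    return ("hover", point[0], point[1])
```

With corners at (0,0,0), (1,0,0), and (0,1,0), the recovered normal is (0,0,1), and a fingertip 1 cm from the plane produces a touch event while one 30 cm away produces a hover event.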
Abstract
Systems and techniques for providing a three-dimensional touch screen are described. Example systems include a touch screen manager that is configured to display a screen via a display device, for example by projecting a screen generated by a computing device (such as a desktop, laptop, or tablet computer) onto a wall or other surface. The manager is further configured to receive position information from a sensor that determines the positions of objects, such as a user's finger or hand, within a three-dimensional space in front of the display surface. The manager then converts the received position information into user interface events, including gestures (e.g., pinch, swipe), mouse-type events (e.g., click, drag), or the like. The user interface events are provided to an application or other module executing on the computing device.
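The abstract's conversion of sensed positions into gestures such as a pinch can be illustrated with a minimal sketch. This is an assumption about one plausible approach, not the patent's method: it classifies a two-finger gesture from the change in finger separation between two sensor frames.

```python
# Illustrative pinch classification from two successive sensor frames,
# each frame holding two fingertip positions as (x, y, z) tuples.
# This is a hypothetical sketch, not the patent's implementation.

def pinch_delta(frame_a, frame_b):
    """Return the change in finger separation between two frames;
    a negative value means the fingers moved together (pinch-in)."""
    def separation(frame):
        (x1, y1, z1), (x2, y2, z2) = frame
        return ((x2 - x1) ** 2 + (y2 - y1) ** 2 + (z2 - z1) ** 2) ** 0.5
    return separation(frame_b) - separation(frame_a)

def classify(frame_a, frame_b, tol=0.01):
    """Label the motion as pinch-in, pinch-out, or no gesture,
    using a small tolerance to ignore sensor jitter."""
    d = pinch_delta(frame_a, frame_b)
    if d < -tol:
        return "pinch-in"
    if d > tol:
        return "pinch-out"
    return "none"
```

The tolerance parameter is an assumed jitter filter; a real system would also track positions over more than two frames before committing to a gesture.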
31 Claims
1. A method comprising:
managing user interaction in three dimensions with respect to a displayed user interface screen, by:
displaying the user interface screen via a display device, wherein the user interface screen is displayed on a surface;
performing a calibration process by:
detecting position and orientation of the displayed user interface screen based on a pattern displayed onto the surface;
detecting shape, size, and geometry of the surface on which the user interface screen is displayed; and
building a three-dimensional model of the surface on which the user interface screen is displayed;
receiving position information from a visual sensor, the position information reflecting the position of an object in a three-dimensional space located adjacent to the displayed user interface screen, wherein the three-dimensional space is located between the displayed user interface screen and the visual sensor;
generating a user interface event based on the received position information; and
providing the generated user interface event to an application program.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27)
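The final claim step, providing generated events to an application program, is commonly realized with a callback-based dispatcher. The sketch below is an assumption about one conventional design (an observer-style registry), not something the patent specifies.

```python
# Hypothetical event dispatcher delivering generated UI events to
# application code via registered callbacks; illustrative only.

class EventDispatcher:
    def __init__(self):
        # Maps an event type (e.g. "touch") to its list of handlers.
        self._handlers = {}

    def register(self, event_type, handler):
        """Let an application subscribe a callback for one event type."""
        self._handlers.setdefault(event_type, []).append(handler)

    def dispatch(self, event):
        """Deliver an event tuple, e.g. ("touch", x, y), to every handler
        registered for its type; events with no handlers are dropped."""
        event_type = event[0]
        for handler in self._handlers.get(event_type, []):
            handler(event)
```

An application would call `register("touch", on_touch)` once, after which each event generated from the sensor stream is pushed to `on_touch` without the application polling the sensor itself.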
28. A non-transitory computer-readable medium including contents that are configured, when executed, to cause a computing system to perform a method comprising:
managing user interaction in three dimensions with respect to a displayed user interface screen, by:
displaying the user interface screen via a display device, wherein the user interface screen is displayed on a surface;
performing a calibration process by:
detecting position and orientation of the displayed user interface screen based on a pattern displayed onto the surface; and
building a three-dimensional model of the surface on which the user interface screen is displayed;
receiving position information from a visual sensor, the position information reflecting the position of an object in a three-dimensional space located adjacent to the displayed user interface screen, wherein the three-dimensional space is located between the displayed user interface screen and the visual sensor;
generating a user interface event based on the received position information; and
providing the generated user interface event to an application program.
29. A system comprising:
a first computing device comprising a processor and a memory; and
wherein the memory stores a code module that is configured, when executed by the processor, to manage user interaction in three dimensions with respect to a displayed user interface screen, by:
displaying the user interface screen via a display device, wherein the user interface screen is displayed on a surface;
performing a calibration process by:
detecting position and orientation of the displayed user interface screen based on a pattern displayed onto the surface; and
building a three-dimensional model of the surface on which the user interface screen is displayed;
receiving position information from a visual sensor, the position information representing the position of an object in a three-dimensional space located adjacent to the displayed user interface screen, wherein the three-dimensional space is located between the displayed user interface screen and the visual sensor;
generating a user interface event based on the received position information; and
providing the generated user interface event to an application program.
View Dependent Claims (30, 31)
Specification