GESTURE MAPPING FOR DISPLAY DEVICE
First Claim
1. A method for interacting with a computer system including a display device and a database coupled to a processor, the method comprising:
storing, in the database, a plurality of two-dimensional gestures for operating the computer system;
detecting, via at least two three-dimensional optical sensors coupled to the processor, the presence of an object within a field of view of the sensors;
associating, via the processor, positional information with movement of the object within the field of view of the sensors;
mapping, via the processor, the positional information of the object with one of the plurality of gestures stored in the database;
determining, via the processor, a control operation based on the mapped gesture and a location of the object with respect to the display.
Assignments: 1 · Petitions: 0
Abstract
Embodiments of the present invention disclose a gesture mapping method for a computer system including a display and a database coupled to a processor. According to one embodiment, the method includes storing a plurality of two-dimensional gestures for operating the computer system, and detecting the presence of an object within a field of view of at least two three-dimensional optical sensors. Positional information is associated with movement of the object, and this information is mapped to one of the plurality of gestures stored in the database. Furthermore, the processor is configured to determine a control operation for the mapped gesture based on the positional information and a location of the object with respect to the display.
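The pipeline in the abstract (project the sensed 3D positions onto the 2D display plane, then match the resulting path against the stored gesture templates) can be sketched as follows. This is a minimal illustration only: the `GESTURE_DB` templates, the path resampling, and the nearest-template matching are our assumptions, not the patent's disclosed implementation.

```python
import math

# Hypothetical gesture database: each 2D gesture is a sequence of (x, y) points.
GESTURE_DB = {
    "swipe_right": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)],
    "swipe_up":    [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)],
}

def project_to_2d(trace_3d):
    """Drop the depth (z) coordinate reported by the 3D optical sensors."""
    return [(x, y) for x, y, _z in trace_3d]

def resample(points, n=3):
    """Resample a path to n evenly spaced points via linear interpolation."""
    if len(points) == 1:
        return points * n
    dists = [0.0]  # cumulative arc length along the path
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out = []
    for i in range(n):
        target = total * i / (n - 1)
        j = max(k for k in range(len(dists)) if dists[k] <= target)
        if j == len(points) - 1:
            out.append(points[-1])
            continue
        seg = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / seg
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def match_gesture(trace_3d):
    """Map 3D positional information onto the closest stored 2D gesture."""
    path = resample(project_to_2d(trace_3d))
    def dist(template):
        return sum(math.hypot(px - tx, py - ty)
                   for (px, py), (tx, ty) in zip(path, resample(template)))
    return min(GESTURE_DB, key=lambda name: dist(GESTURE_DB[name]))
```

For instance, a slightly noisy rightward hand trace such as `[(0.0, 0.0, 0.3), (0.4, 0.02, 0.3), (0.9, 0.0, 0.25)]` resolves to the `"swipe_right"` template.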
15 Claims
1. A method for interacting with a computer system including a display device and a database coupled to a processor, the method comprising:
storing, in the database, a plurality of two-dimensional gestures for operating the computer system;
detecting, via at least two three-dimensional optical sensors coupled to the processor, the presence of an object within a field of view of the sensors;
associating, via the processor, positional information with movement of the object within the field of view of the sensors;
mapping, via the processor, the positional information of the object with one of the plurality of gestures stored in the database;
determining, via the processor, a control operation based on the mapped gesture and a location of the object with respect to the display.
Dependent claims: 2, 3, 4, 5, 6, 7, 8.
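The final step of claim 1, determining a control operation from the mapped gesture and the object's location with respect to the display, might look like the sketch below. The display resolution, the zone boundaries, and the operation names are all hypothetical; the claim does not specify any particular mapping.

```python
# Assumed display resolution for this illustration only.
DISPLAY_WIDTH, DISPLAY_HEIGHT = 1920, 1080

def determine_control_operation(gesture, location):
    """Pick an operation from the mapped gesture and the object's
    (x, y) location with respect to the display."""
    x, y = location
    on_screen = 0 <= x < DISPLAY_WIDTH and 0 <= y < DISPLAY_HEIGHT
    if not on_screen:
        return "ignore"  # object is outside the display area
    if gesture == "swipe_right":
        # Example zone rule: the same gesture does different things
        # depending on where it is performed relative to the display.
        return "next_page" if x < DISPLAY_WIDTH / 2 else "scroll_right"
    if gesture == "swipe_up":
        return "scroll_up"
    return "no_op"
```

The point of the sketch is that the operation is a function of both inputs: a `"swipe_right"` over the left half of the display yields a different operation than the same gesture over the right half.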
9. A system comprising:
a display coupled to a processor;
a database coupled to the processor and configured to store a set of two-dimensional gestures for operating the system;
at least two three-dimensional optical sensors configured to detect movement of an object within a field of view of either optical sensor;
wherein upon detection of an object within the field of view of at least one sensor, the processor is configured to:
map movement of the object with at least one gesture in the set of gestures stored in the database, and
determine an executable control operation based on the mapped gesture and a location of the object with respect to the display.
Dependent claims: 10, 11, 12, 13.
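Claim 9's detection condition (the object need only be within the field of view of at least one sensor) suggests a simple fusion rule for combining the two sensors' readings. The following is a possible sketch; the `SensorReading` type and the averaging strategy are our assumptions and are not recited in the claim.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SensorReading:
    """One 3D optical sensor's report: the object's (x, y, z) position,
    or None if the object is outside that sensor's field of view."""
    position: Optional[Tuple[float, float, float]]

def fuse_readings(a: SensorReading, b: SensorReading):
    """Detection fires when either sensor sees the object; when both do,
    average the two position estimates to reduce noise and occlusion."""
    if a.position and b.position:
        return tuple((p + q) / 2 for p, q in zip(a.position, b.position))
    return a.position or b.position  # None when neither sensor detects
```

Using two sensors this way keeps the object trackable even when one sensor's view is occluded, which is one plausible reason the claim requires at least two.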
14. A computer readable storage medium having stored executable instructions that, when executed by a processor, cause the processor to:
store a plurality of two-dimensional gestures in a database;
detect the presence of a user's hand within a field of view of at least two three-dimensional optical sensors;
associate positional information with movement of the hand within the field of view of the sensors;
map the positional information of the hand with one of the plurality of hand gestures stored in the database;
determine a control operation for the hand gesture based on the positional information and a location of the hand with respect to the display.
Dependent claim: 15.
Specification