Gesture-Based User Interface
Abstract
A user interface method, including capturing, by a computer, a sequence of images over time of at least a part of a body of a human subject, and processing the images in order to detect a gesture, selected from a group of gestures consisting of a grab gesture, a push gesture, a pull gesture, and a circular hand motion. A software application is controlled responsively to the detected gesture.
35 Claims
1. A user interface method, comprising:
capturing, by a computer, a sequence of images over time of at least a part of a body of a human subject;
processing the images in order to detect a gesture, selected from a group of gestures consisting of a grab gesture, a push gesture, a pull gesture, and a circular hand motion; and
controlling a software application responsively to the detected gesture.
(Dependent claims 2–17 not shown.)
18. An apparatus, comprising:
a display; and
a computer coupled to the display and configured to capture a sequence of images over time of at least a part of a body of a human subject, to process the images in order to detect a gesture, selected from a group of gestures consisting of a grab gesture, a push gesture, a pull gesture, and a circular hand motion, and to control a software application responsively to the detected gesture.
(Dependent claims 19–34 not shown.)
35. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer, cause the computer to capture a sequence of depth maps over time of at least a part of a body of a human subject, to process the depth maps in order to detect a gesture, selected from a group of gestures consisting of a grab gesture, a push gesture, a pull gesture, and a circular hand motion, and to control a software application responsively to the detected gesture.
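The pipeline recited in claim 35 (capture depth maps of the body over time, detect one of the enumerated gestures, control an application accordingly) can be illustrated with a minimal sketch. The code below is not taken from the patent: the function name, the representation of the hand track as (x, y, z) centroids extracted from depth maps, and all thresholds are illustrative assumptions. It classifies push and pull from net motion along the camera axis and a circular hand motion from the angular sweep of the trajectory; a grab gesture is omitted because it requires finger-level hand pose rather than a single centroid.

```python
import math

def classify_gesture(track, depth_thresh=0.15, sweep_thresh=1.5 * math.pi):
    """Classify a tracked hand trajectory as 'push', 'pull', 'circular', or None.

    `track` is a list of (x, y, z) hand-centroid positions, one per frame,
    with z the distance from the camera in meters. Thresholds are
    illustrative, not taken from the patent.
    """
    if len(track) < 3:
        return None

    # Push/pull: net displacement along the camera (z) axis dominates.
    dz = track[-1][2] - track[0][2]
    if abs(dz) > depth_thresh:
        return "push" if dz < 0 else "pull"  # motion toward the camera = push

    # Circular: accumulate angular sweep about the trajectory's centroid.
    cx = sum(p[0] for p in track) / len(track)
    cy = sum(p[1] for p in track) / len(track)
    angles = [math.atan2(p[1] - cy, p[0] - cx) for p in track]
    sweep = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # Unwrap the angle difference into (-pi, pi].
        while d <= -math.pi:
            d += 2 * math.pi
        while d > math.pi:
            d -= 2 * math.pi
        sweep += d
    if abs(sweep) > sweep_thresh:
        return "circular"
    return None
```

In a full system a controller would call such a classifier on each completed track and dispatch the returned label to the application, which is the "controlling a software application responsively to the detected gesture" step of the claims.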