Gesture-Based Human Machine Interface
Abstract
A gesture-based human machine interface, for example, a user interface for controlling a program executing on a computer, and related method are provided. Gestures of the user are monitored and a response is provided that is based upon the detected gestures. An object is used to point to information displayed on a screen. The information displayed on the screen is modified in response to a determination of the position on the screen to which the object is pointing and in response to the distance of the object from the screen.
Claims (20)
1. A method of using a computer system through a gesture-based human machine interface, the method comprising:

pointing to a position on a screen of the computer system using an object to point;

capturing images of a space in front of the screen with at least two cameras, the space including the object;

analyzing the images using a processor to identify the object, to determine the position on the screen to which the object is pointing, and to determine a distance of the object from the screen; and

modifying information displayed on the screen in response to the determination of both the position on the screen to which the object is pointing and the distance of the object from the screen.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9
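The analyzing step of claim 1 can be sketched in code. The claim does not specify an algorithm, so the following is a minimal illustrative sketch under common assumptions: two calibrated cameras with known 3x4 projection matrices, linear (DLT) triangulation of two points on the pointing object (e.g., fingertip and knuckle), and a screen modeled as the plane z = 0. All function names and the choice of DLT are hypothetical, not from the patent.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3-D point seen by two cameras.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coordinates."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                    # null vector of A (homogeneous point)
    return X[:3] / X[3]           # homogeneous -> Euclidean

def pointing_target(tip, base):
    """Intersect the ray base->tip with the screen plane z = 0.
    Returns the (x, y) hit point on the screen and the tip-to-screen distance."""
    d = tip - base
    t = -base[2] / d[2]           # solve base_z + t * d_z = 0
    hit = base + t * d
    return hit[:2], abs(tip[2])
```

Triangulating the fingertip in both camera images gives its 3-D position; a second triangulated point on the object fixes the pointing ray, and the plane intersection yields both quantities the claim requires (screen position and distance).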
10. An apparatus comprising:

a screen operable to display information;

at least two cameras configured to capture images of a space in front of the screen; and

a processor configured to receive the images, analyze the images to identify an object pointing at a position on the screen, to determine the position on the screen to which the object is pointing, and to determine a distance of the object from the screen, and modify the information displayed on the screen in response to the determination of both the position on the screen to which the object is pointing and the distance of the object from the screen.

Dependent claims: 11, 12, 13, 14, 15
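The apparatus of claim 10 is a capture-analyze-update pipeline. The sketch below shows one way such a pipeline could be wired together; the camera, analyzer, and display interfaces are stand-ins invented for illustration, not an API described in the patent.

```python
# Hypothetical pipeline for the claimed apparatus: cameras feed a processor,
# which updates the display based on pointing position and distance.
class GestureInterface:
    def __init__(self, cameras, analyzer, display):
        self.cameras = cameras    # objects with a .capture() -> image method
        self.analyzer = analyzer  # callable: images -> ((x, y), distance) or None
        self.display = display    # object with an .update(xy, distance) method

    def step(self):
        """Run one capture/analyze/update cycle; returns the analyzer result."""
        images = [cam.capture() for cam in self.cameras]
        result = self.analyzer(images)
        if result is not None:    # an object was identified as pointing
            xy, dist = result
            self.display.update(xy, dist)
        return result
```

The design keeps the three claimed elements (cameras, processor, screen) behind narrow interfaces, so the triangulation strategy or display behavior can be swapped without changing the loop.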
16. A gesture-based human interface of a computer system, the interface comprising:

a screen operable to display information;

at least two cameras configured to capture images of a space in front of the screen; and

a processor configured to receive the images, analyze the images to identify an object pointing at a position on the screen, to determine the position on the screen to which the object is pointing, to determine a distance of the object from the screen, and to determine a speed of movement of the object, and modify the information displayed on the screen in response to the determination of the position on the screen to which the object is pointing, the distance of the object from the screen, and the speed of movement.

Dependent claims: 17, 18, 19, 20
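Claim 16 adds speed of movement as a third input to the display response. One plausible use, sketched below, is estimating speed from successive pointer samples and combining it with distance to distinguish a deliberate "press" (fast approach near the screen) from hovering; the thresholds and action names are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class PointerSample:
    x: float  # screen-plane x of the pointing target
    y: float  # screen-plane y of the pointing target
    z: float  # distance of the object from the screen
    t: float  # timestamp in seconds

def speed(prev: PointerSample, curr: PointerSample) -> float:
    """Magnitude of pointer velocity between two samples (units per second)."""
    dt = curr.t - prev.t
    dx, dy, dz = curr.x - prev.x, curr.y - prev.y, curr.z - prev.z
    return (dx * dx + dy * dy + dz * dz) ** 0.5 / dt

def classify(prev, curr, press_speed=0.5, hover_dist=0.15):
    """Map distance and speed to a UI action (illustrative thresholds)."""
    approaching = prev.z - curr.z > 0
    if curr.z < hover_dist and approaching and speed(prev, curr) > press_speed:
        return "select"   # fast approach close to the screen
    if curr.z < hover_dist:
        return "hover"
    return "track"
```

A fast approach ending near the screen is classified as a selection, a slow or stationary pointer near the screen as a hover, and anything farther away is merely tracked.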