SYSTEM AND METHOD FOR CLOSE-RANGE MOVEMENT TRACKING
First Claim
1. A method for operating a user interface, the method comprising:

acquiring close range depth images of a part of a user's body with a depth sensor;
identifying from the depth images movement within a designated zone of the part of the user's body;
tracking the movement within the zone of the part of the user's body;
displaying the part of the user's body as a first object on a screen, wherein the first object on the screen is shown performing a gesture corresponding to the identified movement of the part of the user's body.
Abstract
A system and method for close-range object tracking are described. Close-range depth images of a user's hands and fingers or other objects are acquired using a depth sensor. Using depth image data obtained from the depth sensor, movements of the user's hands and fingers or other objects are identified and tracked, thus permitting the user to interact with an object displayed on a screen by using the positions and movements of his hands and fingers or other objects.
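The pipeline in the abstract (acquire close-range depth images, segment the hand, track it) can be sketched minimally as follows. Everything here is an assumption for illustration: the patent names no API, so the function names, the 2-D millimetre depth frame, and the 200-600 mm close-range band are all hypothetical.

```python
import numpy as np

def segment_hand(depth_frame, near_mm=200, far_mm=600):
    """Keep only pixels inside the close-range depth band where the hand is expected."""
    return (depth_frame >= near_mm) & (depth_frame <= far_mm)

def hand_centroid(mask):
    """Centroid (x, y) of the segmented hand region, or None if nothing is in range."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 8x8 depth frame in millimetres: a "hand" patch 40 cm from the sensor.
frame = np.full((8, 8), 1000)
frame[2:5, 3:6] = 400
mask = segment_hand(frame)
print(hand_centroid(mask))   # -> (4.0, 3.0)
```

Tracking over time would repeat this per frame and feed the centroid to whatever maps positions onto the displayed object.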
21 Claims
1. A method for operating a user interface, the method comprising:

acquiring close range depth images of a part of a user's body with a depth sensor;
identifying from the depth images movement within a designated zone of the part of the user's body;
tracking the movement within the zone of the part of the user's body;
displaying the part of the user's body as a first object on a screen, wherein the first object on the screen is shown performing a gesture corresponding to the identified movement of the part of the user's body.

View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
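Claim 1's "designated zone" can be read as a sub-volume of the sensor's view: movement counts only while the body part is inside it, and its position within the zone drives the on-screen object. A hedged sketch, assuming an axis-aligned zone in image/depth coordinates; the coordinate conventions and numbers are invented for illustration.

```python
def in_zone(x, y, z, zone):
    """zone = (x0, x1, y0, y1, z0, z1): image x/y bounds plus a depth band."""
    x0, x1, y0, y1, z0, z1 = zone
    return x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1

def to_screen(x, y, zone, screen_w, screen_h):
    """Map a position inside the zone to screen coordinates for the displayed object."""
    x0, x1, y0, y1, *_ = zone
    sx = (x - x0) / (x1 - x0) * screen_w
    sy = (y - y0) / (y1 - y0) * screen_h
    return sx, sy

zone = (100, 540, 80, 400, 200, 600)      # x0, x1, y0, y1 in pixels; z0, z1 in mm
print(in_zone(320, 240, 400, zone))        # -> True (hand is inside the zone)
print(to_screen(320, 240, zone, 1920, 1080))  # -> (960.0, 540.0), screen centre
```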
10. A method comprising:
acquiring depth images of a user's hand or hands and fingers with a depth sensor;
tracking movements of the user's hand or hands and fingers from the depth images, wherein the movements correspond to a gesture that performs a function.

View Dependent Claims (11, 12, 13, 14, 15)
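Claim 10 binds a recognized gesture to a function. One natural realization is a dispatch table from gesture labels to callables; the gesture names and actions below are purely illustrative, not taken from the patent.

```python
# Hypothetical gesture -> function dispatch table.
actions = {
    "swipe_left":  lambda: "previous page",
    "swipe_right": lambda: "next page",
    "pinch":       lambda: "zoom out",
}

def perform(gesture):
    """Run the function bound to a recognized gesture; None if unbound."""
    action = actions.get(gesture)
    return action() if action else None

print(perform("pinch"))   # -> zoom out
print(perform("wave"))    # -> None (no function bound)
```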
16. A method of translating gestures made as part of a gesture-based language, comprising:
acquiring depth images of a user's hands and fingers with a depth sensor;
identifying a gesture made by the user's hands and fingers from the depth images as a pre-defined gesture of the gesture-based language;
providing a translation of the pre-identified gesture as an output.

View Dependent Claims (17)
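Claim 16's translation step amounts to looking up an identified gesture in a lexicon of the gesture-based language (a fingerspelling alphabet is one example). A sketch under that assumption; the feature tuples and letters are invented stand-ins for whatever the recognizer actually emits.

```python
# Hypothetical lexicon mapping recognized hand configurations to output text.
lexicon = {
    ("fist",): "A",
    ("flat", "thumb_out"): "B",
}

def translate(observed_features):
    """Return the translation of a pre-defined gesture, or a marker if unknown."""
    return lexicon.get(tuple(observed_features), "<unrecognized>")

print(translate(["fist"]))            # -> A
print(translate(["open", "spread"]))  # -> <unrecognized>
```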
18. A system comprising:
a depth sensor configured to acquire at close range depth images of at least a part of a user's body;
a tracking module configured to identify the at least the part of the user's body from the depth image data and track a movement of the at least the part of the user's body from the depth image data;
an output module configured to permit the movement of the at least the part of the user's body to interact with one or more user interface elements;
a display configured to show a representation of at least part of the user's body and the one or more user interface elements.

View Dependent Claims (19, 20)
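Claim 18 names four cooperating modules but specifies no interfaces, so the wiring below is entirely hypothetical: the tracking and output modules are sketched as plain classes, with the depth sensor stubbed out and interaction reduced to a hit test against UI element bounds.

```python
class TrackingModule:
    def track(self, depth_frame):
        # A real module would identify and track the body part in the
        # depth data; here the position is read from a stubbed frame dict.
        return depth_frame["hand_xy"]

class OutputModule:
    def interact(self, position, ui_elements):
        # Hit-test the tracked position against element bounds
        # (x0, y0, x1, y1) and return the elements it touches.
        x, y = position
        return [e for e in ui_elements
                if e["bounds"][0] <= x <= e["bounds"][2]
                and e["bounds"][1] <= y <= e["bounds"][3]]

tracker, output = TrackingModule(), OutputModule()
pos = tracker.track({"hand_xy": (120, 80)})   # stubbed depth sensor frame
ui = [{"name": "play", "bounds": (100, 60, 200, 120)},
      {"name": "stop", "bounds": (300, 60, 400, 120)}]
print([e["name"] for e in output.interact(pos, ui)])   # -> ['play']
```

The display module would render both the body-part representation and the UI elements; it is omitted here since the claim ties it to no specific behaviour beyond showing them.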
21. A system, comprising:
means for acquiring close range depth images of a part of a user's body with a depth sensor;
means for identifying from the depth images movement within a designated zone of the part of the user's body;
means for tracking the movement within the zone of the part of the user's body;
means for displaying the part of the user's body as a first object on a screen, wherein the first object on the screen is shown performing a gesture corresponding to the identified movement of the part of the user's body.
Specification