Method and system implementing user-centric gesture control
First Claim
1. A machine system implemented method providing user-centric gesture recognition, the method comprising:
receiving three-dimensional (3D) tracking data representative of respective positionings of respective ones of plural user body parts of an automatically tracked first user of the machine system, the plural user body parts being in a field of sensing of one or more sensors of the machine system;
defining a first movable 3D frame of reference that moves in coordination with movement of a first subset of one or more of the plural user body parts of the tracked first user but not in coordination with movement of one or more other body parts;
defining a major interaction enabled zone positioned within the first movable 3D frame of reference and moving as the first movable 3D frame of reference moves;
determining based at least on the received 3D tracking data if at least one of the other body parts of the tracked first user is within the defined major interaction enabled zone;
based at least on the received 3D tracking data, recognizing a gesture made by the at least one of the other body parts that is determined to be within the defined major interaction enabled zone;
comparing said recognized gesture to contents of a gesture library of one or more pre-defined gestures that are useable to control, by way of gesture, a gesture controllable device of the machine system; and
based at least on finding a match between a recognized said user-gesture and contents of said gesture library, causing a carrying out of a corresponding gesture initiated action by the gesture controllable device.
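The claimed pipeline (track body parts, anchor a movable frame of reference to one subset of them, test whether another body part is inside a zone defined in that frame, then match the gesture against a library) can be sketched in code. This is a hypothetical illustration, not the patent's implementation: the class and method names, the torso-anchored frame, the box-shaped zone, and the half-extent values are all assumptions.

```python
import numpy as np

class GestureController:
    """Hypothetical sketch of the claim-1 pipeline."""

    def __init__(self, gesture_library):
        # gesture_library maps a pre-defined gesture name to an action callback
        self.gesture_library = gesture_library

    def frame_of_reference(self, torso_pos):
        # The movable 3D frame of reference tracks the torso (the "first
        # subset" of body parts), not the gesturing hand.
        return np.asarray(torso_pos, dtype=float)

    def in_interaction_zone(self, hand_pos, origin,
                            half_extent=(0.6, 0.5, 0.4)):
        # Major interaction enabled zone: an axis-aligned box centered on
        # the frame origin; it moves because the origin moves with the torso.
        offset = np.abs(np.asarray(hand_pos, dtype=float) - origin)
        return bool(np.all(offset <= np.asarray(half_extent)))

    def process(self, torso_pos, hand_pos, recognized_gesture):
        origin = self.frame_of_reference(torso_pos)
        if not self.in_interaction_zone(hand_pos, origin):
            return None  # gestures made outside the zone are ignored
        # Compare against the library; on a match, carry out the action.
        action = self.gesture_library.get(recognized_gesture)
        return action() if action else None
```

Under these assumptions, a gesture made near the torso triggers its mapped action, while the same gesture made outside the zone, or an unrecognized gesture inside it, is ignored.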
Abstract
A user-centric method and system identifies user-made gestures for controlling a remote device. The system images the user with a three-dimensional imaging system and defines at least one user-centric three-dimensional detection zone dynamically sized appropriately for the user, who is free to move about. Gestures made within the detection zone are compared to a library of stored gestures, and the thus identified gesture is mapped to an appropriate control command signal coupleable to the remote device. The method and system also provide for a first user to hand off control of the remote device to a second user.
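The hand-off behavior mentioned in the abstract can be illustrated with a short sketch. All names here, including the "hand_off" gesture token and the returned strings, are assumptions for illustration only, not anything specified by the patent:

```python
class ControlSession:
    """Minimal sketch: only the active user's gestures are acted on,
    and a dedicated hand-off gesture transfers control to a second user."""

    def __init__(self, first_user_id):
        self.active_user = first_user_id

    def handle(self, user_id, gesture, target_user=None):
        if user_id != self.active_user:
            return None  # gestures from non-controlling users are ignored
        if gesture == "hand_off" and target_user is not None:
            self.active_user = target_user  # hand control to the second user
            return f"control handed to {target_user}"
        return f"command:{gesture}"  # map gesture to a control command signal
```

For example, starting with `ControlSession("user_a")`, a gesture from `user_b` returns `None`; after `user_a` performs the hand-off gesture targeting `user_b`, gestures from `user_b` are mapped to command signals instead.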
15 Claims
1. A machine system implemented method providing user-centric gesture recognition (recited in full above under First Claim). - View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A machine system having user-centric gesture recognition, the system including stored code configured to cause one or more programmable parts of the machine system to carry out a method comprising:
receiving three-dimensional (3D) image data representative of respective positionings of respective ones of plural user body parts of an automatically tracked first user of the machine system, the plural user body parts being in a field of view of one or more image capture devices of the machine system;
identifying the automatically tracked first user when said user is within said field of view;
defining at least a first movable 3D frame of reference that moves in coordination with movement of a first subset of one or more of the plural user body parts of the identified first user but not in coordination with movement of one or more other body parts;
defining a major interaction enabled zone positioned within the first movable 3D frame of reference, the defining of the major interaction enabled zone including sizing and positioning the major interaction enabled zone in accordance with parameters of an accessible body profile of the identified first user;
determining if at least one of the other user body parts of the identified first user, not in the first subset of body parts, is within the defined major interaction enabled zone;
recognizing a gesture made by the at least one of the other body parts that is determined to be within the defined major interaction enabled zone;
comparing said recognized gesture to contents of a gesture library including a plurality of pre-defined gestures defined as occurring within at least one subregion of the major interaction enabled zone, at least one of the pre-defined gestures being defined as useable to control a gesture controllable device of the machine system; and
based at least on finding a match between a recognized said user-gesture and contents of said gesture library, outputting a signal useable to control said gesture controllable device. - View Dependent Claims (10, 11, 12, 13, 14, 15)
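Claim 9 adds two elements over claim 1: the zone is sized and positioned from parameters of the identified user's body profile, and gestures are defined as occurring within subregions of the zone. A minimal sketch of both, assuming hypothetical profile fields (`shoulder_height_m`, `arm_length_m`), scale factors, and a simple left/right subregion split, none of which come from the patent:

```python
def size_zone_from_profile(profile):
    # Size and position the interaction zone from the user's accessible
    # body profile; field names and scale factors are assumptions.
    shoulder_y = profile["shoulder_height_m"]
    reach = profile["arm_length_m"]
    center = (0.0, shoulder_y, reach * 0.5)          # in front of the shoulders
    half_extent = (reach, reach * 0.6, reach * 0.5)  # proportional to reach
    return center, half_extent

def subregion_of(hand_pos, center):
    # Claim 9 defines gestures per subregion of the zone; here the zone is
    # split into left/right halves about the frame's vertical mid-plane.
    return "left" if hand_pos[0] < center[0] else "right"
```

A gesture recognizer built this way would match a candidate gesture against only the library entries defined for the subregion the hand currently occupies.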
Specification