Method and system implementing user-centric gesture control
Abstract
A user-centric method and system to identify user-made gestures to control a remote device images the user using a three-dimensional imaging system, and defines at least one user-centric three-dimensional detection zone dynamically sized appropriately for the user, who is free to move about. Gestures made within the detection zone are compared to a library of stored gestures, and the gesture thus identified is mapped to an appropriate control command signal coupleable to the remote device. The method and system also provide for a first user to hand off control of the remote device to a second user.
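The hand-off of control from a first user to a second, described at the end of the abstract, can be sketched as a small control arbiter. This is a minimal illustration, not the patent's implementation: the class name, integer user IDs, and method names are all assumptions.

```python
class ControlArbiter:
    """Sketch of single-controller hand-off: one tracked user holds
    control of the remote device at a time, and a recognized hand-off
    gesture transfers control to a second user."""

    def __init__(self):
        self.active_user = None  # no one controls the device yet

    def acquire(self, user_id):
        # The first user to request control becomes the active controller.
        if self.active_user is None:
            self.active_user = user_id
        return self.active_user == user_id

    def hand_off(self, from_user, to_user):
        # Only the current controller may hand control to another user.
        if self.active_user == from_user:
            self.active_user = to_user
            return True
        return False
```

A second user's requests are refused until the first user hands control over, which mirrors the abstract's first-user-to-second-user transfer.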
19 Claims
1. A method implementing user-centric gesture recognition enabling a user to remotely gesture-control a device, the method comprising:
disposing a three-dimensional imaging system whose field of view encompasses a three-dimensional space in which at least one user may be present, and recognizing a user when said user is within said field of view;
providing a gesture recognition system that defines a world coordinate system within said three-dimensional space based upon location of said user as determined by said imaging system, said world coordinate system moving with movement of said user;
said gesture recognition system defining, relative to a current position of said user as determined by said imaging system, a dynamically-sized and positioned three-dimensional zone of interaction within which zone said user may make at least one gesture;
said gesture recognition system storing a gesture library of at least one pre-defined gesture defined in three dimensions in said zone of interaction, the at least one pre-defined gesture useable to control said device;
said gesture recognition system recognizing a gesture made by said user within said three-dimensional space and comparing said gesture to contents of said gesture library;
upon finding a match between a recognized said user-gesture and contents of said gesture library, outputting a signal useable to control said device.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9
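As a rough illustration only, the claimed steps — a world coordinate frame that moves with the user, a reach-sized zone of interaction, and comparison against a gesture library — might be sketched as below. Every name, the zone dimensions, and the distance threshold are assumptions for illustration; the claim specifies no particular matching algorithm.

```python
import numpy as np

def to_user_coords(point_world, user_origin):
    """Express a world-frame 3-D point relative to the user's current
    position, so the coordinate frame moves with the user."""
    return np.asarray(point_world, dtype=float) - np.asarray(user_origin, dtype=float)

def interaction_zone(arm_reach):
    """Axis-aligned box in front of the user, in user-centric coordinates,
    sized to the user's reach (a stand-in for the claimed dynamically-sized
    zone of interaction)."""
    lo = np.array([-arm_reach, 0.0, 0.2])
    hi = np.array([arm_reach, 2.0, 0.2 + arm_reach])
    return lo, hi

def in_zone(point_user, zone):
    """True when a user-centric point lies inside the interaction zone."""
    lo, hi = zone
    return bool(np.all(point_user >= lo) and np.all(point_user <= hi))

def match_gesture(trajectory, library, threshold=0.3):
    """Compare a recorded trajectory (N x 3 array of user-centric points)
    against stored templates; return the best-matching gesture name, or
    None when nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, template in library.items():
        if template.shape != trajectory.shape:
            continue  # a real matcher would resample; here lengths must match
        dist = np.linalg.norm(template - trajectory) / len(trajectory)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

Because every trajectory is expressed relative to the user's own origin before matching, the same stored template matches the same gesture wherever the user stands, which is the point of the user-centric frame.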
10. A system enabling a user to remotely gesture-control a device using user-centric gesture recognition, the system comprising:
a three-dimensional imaging system whose field of view encompasses a three-dimensional space in which at least one user may be present, and recognizing a user when said user is within said field of view;
a gesture recognition system, coupled to said three-dimensional imaging system, that defines a world coordinate system within said three-dimensional space based upon location of said user as determined by said imaging system, said world coordinate system moving with movement of said user;
said gesture recognition system defining, relative to a current position of said user as determined by said imaging system, a dynamically-sized and positioned three-dimensional zone of interaction within which zone said user may make at least one gesture; and
a gesture library including a plurality of pre-defined gestures defined in three dimensions in said zone of interaction, the gestures useable to control said device;
said gesture recognition system recognizing a gesture made by said user within said three-dimensional space and comparing said gesture to contents of said gesture library;
upon finding a match between a recognized said user-gesture and contents of said gesture library, said gesture recognition system outputting a signal useable to control said device.
Dependent claims: 11, 12, 13, 14, 15, 16, 17, 18
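The final claimed step, outputting a signal useable to control the device, amounts to mapping a matched library gesture onto a device command. A minimal sketch follows; the gesture names, command codes, and table contents are purely illustrative assumptions.

```python
# Hypothetical mapping from matched library gestures to device command
# signals; the claim only requires "a signal useable to control said device".
COMMAND_MAP = {
    "swipe_left": "PREV_CHANNEL",
    "swipe_right": "NEXT_CHANNEL",
    "palm_push": "PAUSE",
}

def output_signal(matched_gesture):
    """Return the control signal for a matched gesture, or None when the
    gesture has no mapped command (no signal is emitted)."""
    if matched_gesture is None:
        return None
    return COMMAND_MAP.get(matched_gesture)
```

Keeping the mapping in a table rather than in the recognizer lets the same gesture library drive different devices by swapping the table.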
19. A method implementing user-centric gesture recognition enabling a user to remotely gesture-control a device, the method comprising:
disposing a three-dimensional imaging system whose field of view encompasses a three-dimensional space in which at least one user may be present, and recognizing a user when said user is within said field of view;
providing a gesture recognition system that defines a world coordinate system within said three-dimensional space based upon location of said user as determined by said imaging system, said world coordinate system moving with movement of said user;
said gesture recognition system defining, relative to a current position of said user as determined by said imaging system, a dynamically-sized and positioned three-dimensional zone of interaction within which zone said user may make at least one gesture;
said gesture recognition system storing a gesture library of at least one pre-defined gesture useable to control said device;
said gesture recognition system recognizing a gesture made by said user within said three-dimensional space and comparing said gesture to contents of said gesture library;
causing said device to alert a user that a user-gesture may not have been correctly recognized by said gesture recognition system, and inviting said user to confirm improper recognition of a gesture; and
upon finding a match between a recognized said user-gesture and contents of said gesture library, outputting a signal useable to control said device.
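Claim 19's distinguishing step — alerting the user that a gesture may have been misrecognized and inviting confirmation before acting — can be sketched as a gate in front of the output signal. The function name, confidence threshold, callback, and signal format are all illustrative assumptions; the claim does not prescribe how the alert or confirmation is implemented.

```python
def handle_recognition(match, confidence, confirm_misrecognition, threshold=0.8):
    """Emit a control signal only after the confirmation step: when match
    confidence is low, the device alerts the user and invites confirmation.
    confirm_misrecognition is a hypothetical callback that returns True when
    the user confirms the gesture was NOT what they intended."""
    if match is None:
        return None  # nothing in the library matched; no signal
    if confidence < threshold and confirm_misrecognition(match):
        return None  # user confirmed improper recognition; suppress signal
    return f"CTRL:{match}"  # signal useable to control the device
```

High-confidence matches pass straight through; only borderline recognitions cost the user a confirmation prompt.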
Specification