Customized gesture interpretation
First Claim
1. A method of a gesture system performing user-customized gesture recognition in a three-dimensional (3D) sensory space, the method including:
in a user-initiated training environment, training the gesture system to recognize a user-defined customized gesture by:
detecting, by an electronic sensor, a training gesture performed by the user in the 3D sensory space;
determining a set of characteristic values for the detected training gesture using data captured by the electronic sensor; and
storing the set of characteristic values for recognizing the training gesture as the user-defined customized gesture; and
in a gesture interaction environment, performing gesture recognition of the user-defined customized gesture by:
detecting an actual gesture performed by the user in the 3D sensory space using an electronic sensor;
determining a set of characteristic values for the detected actual gesture using data from the electronic sensor;
matching the determined set of characteristic values for the detected actual gesture with the stored set of characteristic values for the training gesture from among multiple stored sets of characteristic values;
identifying the training gesture, for which the set of characteristic values matched the set of characteristic values of the detected actual gesture, as a gesture of interest; and
displaying an on-screen indicator to reflect a degree of completion of the gesture of interest as the detected actual gesture is being performed by the user in real time, thereby enabling the user to dynamically control a relationship between actual movement of the user in real time and a corresponding action displayed in real time on the screen, wherein the degree of completion is displayed by the on-screen indicator in real time as the user continues to perform the detected actual gesture to reflect the degree of completion of the actual gesture, in real time, as performed by the user while dynamically interacting with virtual objects displayed on the screen.
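The matching step recited in the claim, comparing characteristic values of a detected actual gesture against multiple stored sets, can be sketched as a nearest-match lookup over a feature vector. This is only an illustrative sketch: the claim does not name specific characteristics, so the features below (traveled path length, net extent, and their ratio) are assumptions.

```python
import math

def characteristic_values(samples):
    """Derive a hypothetical set of characteristic values from a list of
    (x, y, z) sensor samples: traveled path length, net extent, and their
    ratio (a rough measure of how curved the motion was)."""
    length = sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))
    extent = math.dist(samples[0], samples[-1])
    curviness = length / max(extent, 1e-9)  # guard against zero extent
    return (length, extent, curviness)

def match_gesture(actual_samples, stored_sets):
    """Identify the stored training gesture whose characteristic values are
    closest (by Euclidean distance) to those of the detected actual gesture."""
    actual = characteristic_values(actual_samples)
    return min(stored_sets, key=lambda name: math.dist(actual, stored_sets[name]))
```

In a real system the sensor stream would update continuously and a match threshold would reject gestures too far from every stored set; in this sketch the closest stored set always wins.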
Abstract
The technology disclosed relates to filtering gestures, according to one implementation. In particular, it relates to distinguishing interesting gestures from non-interesting gestures in a three-dimensional (3D) sensory space by comparing characteristics of user-defined reference gestures against characteristics of actual gestures performed in the 3D sensory space. Based on the comparison, a set of gestures of interest is filtered from all the gestures performed in the 3D sensory space. The technology disclosed also relates to customizing gesture interpretation for a particular user, according to another implementation. In particular, it relates to setting parameters for recognizing gestures by prompting the user to select values for characteristics of the gestures. In one implementation, the technology disclosed includes performing characteristic-focused demonstrations of boundaries of the gesture. It further includes testing the interpretation of gestures by prompting the user to perform complete gesture demonstrations and receiving evaluation from the user regarding the interpretation.
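The filtering implementation summarized above can be sketched as a bounds check: each user-defined reference gesture supplies an allowed range per characteristic, and only actual gestures whose characteristics fall inside some reference's ranges are kept as gestures of interest. The characteristic names below ("speed", "extent") are hypothetical placeholders, not terms from the disclosure.

```python
def within_bounds(gesture, reference):
    """True if every characteristic of the gesture falls inside the
    (low, high) range the reference gesture defines for it."""
    return all(lo <= gesture[name] <= hi for name, (lo, hi) in reference.items())

def filter_gestures_of_interest(actual_gestures, references):
    """Keep only the actual gestures that match at least one
    user-defined reference gesture's characteristic ranges."""
    return [
        g for g in actual_gestures
        if any(within_bounds(g, ref) for ref in references.values())
    ]
```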
Claims (32)
1. A method of a gesture system performing user-customized gesture recognition in a three-dimensional (3D) sensory space, the method including:
in a user-initiated training environment, training the gesture system to recognize a user-defined customized gesture by:
detecting, by an electronic sensor, a training gesture performed by the user in the 3D sensory space;
determining a set of characteristic values for the detected training gesture using data captured by the electronic sensor; and
storing the set of characteristic values for recognizing the training gesture as the user-defined customized gesture; and
in a gesture interaction environment, performing gesture recognition of the user-defined customized gesture by:
detecting an actual gesture performed by the user in the 3D sensory space using an electronic sensor;
determining a set of characteristic values for the detected actual gesture using data from the electronic sensor;
matching the determined set of characteristic values for the detected actual gesture with the stored set of characteristic values for the training gesture from among multiple stored sets of characteristic values;
identifying the training gesture, for which the set of characteristic values matched the set of characteristic values of the detected actual gesture, as a gesture of interest; and
displaying an on-screen indicator to reflect a degree of completion of the gesture of interest as the detected actual gesture is being performed by the user in real time, thereby enabling the user to dynamically control a relationship between actual movement of the user in real time and a corresponding action displayed in real time on the screen, wherein the degree of completion is displayed by the on-screen indicator in real time as the user continues to perform the detected actual gesture to reflect the degree of completion of the actual gesture, in real time, as performed by the user while dynamically interacting with virtual objects displayed on the screen.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30)
31. A non-transitory computer readable medium storing computer program instructions to customize gesture interpretation for a particular user, the instructions, when executed on a processor, implement a method including:
in a user-initiated training environment, training a gesture system to recognize a user-defined customized gesture by:
detecting, by an electronic sensor, a training gesture performed by the user in a three-dimensional (3D) sensory space;
determining a set of characteristic values for the detected training gesture using data captured by the electronic sensor; and
storing the set of characteristic values for recognizing the training gesture as the user-defined customized gesture; and
in a gesture interaction environment, performing gesture recognition of the user-defined customized gesture by:
detecting an actual gesture performed by the user in the 3D sensory space using an electronic sensor;
determining a set of characteristic values for the detected actual gesture using data from the electronic sensor;
matching the determined set of characteristic values for the detected actual gesture with the stored set of characteristic values for the training gesture from among multiple stored sets of characteristic values;
identifying the training gesture, for which the set of characteristic values matched the set of characteristic values of the detected actual gesture, as a gesture of interest; and
displaying an on-screen indicator to reflect a degree of completion of the gesture of interest as the detected actual gesture is being performed by the user in real time, thereby enabling the user to dynamically control a relationship between actual movement of the user in real time and a corresponding action displayed in real time on the screen, wherein the degree of completion is displayed by the on-screen indicator in real time as the user continues to perform the detected actual gesture to reflect the degree of completion of the actual gesture, in real time, as performed by the user while dynamically interacting with virtual objects displayed on the screen.
- View Dependent Claims (32)
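The on-screen degree-of-completion indicator recited in both independent claims can be sketched as a progress readout driven by how much of the stored training gesture's path the in-progress actual gesture has covered. The path-length heuristic and the ASCII progress bar below are assumptions for illustration; a real system would render to a display and handle live sensor updates.

```python
import math

def path_length(points):
    """Total distance traveled along a sequence of (x, y, z) samples."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def degree_of_completion(partial, training):
    """Fraction (0.0 to 1.0) of the training gesture's path length that the
    in-progress actual gesture has covered so far."""
    total = path_length(training)
    return min(path_length(partial) / total, 1.0) if total else 0.0

def render_indicator(fraction, width=10):
    """Render the degree of completion as a text progress bar."""
    filled = round(fraction * width)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {fraction:.0%}"
```

Calling `render_indicator(degree_of_completion(samples_so_far, stored_training_path))` on each sensor frame would update the indicator in real time as the gesture proceeds.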
Specification