Customized gesture interpretation

  • US 10,620,709 B2
  • Filed: 01/15/2014
  • Issued: 04/14/2020
  • Est. Priority Date: 04/05/2013
  • Status: Active Grant
First Claim
1. A method of a gesture system performing user-customized gesture recognition in a three-dimensional (3D) sensory space, the method including:

  • in a user-initiated training environment, training the gesture system to recognize a user-defined customized gesture by:

    detecting, by an electronic sensor, a training gesture performed by the user in the 3D sensory space;

    determining a set of characteristic values for the detected training gesture using data captured by the electronic sensor; and

    storing the set of characteristic values for recognizing the training gesture as the user-defined customized gesture; and

  • in a gesture interaction environment, performing gesture recognition of the user-defined customized gesture by:

    detecting an actual gesture performed by the user in the 3D sensory space using an electronic sensor;

    determining a set of characteristic values for the detected actual gesture using data from the electronic sensor;

    matching the determined set of characteristic values for the detected actual gesture with the stored set of characteristic values for the training gesture from among multiple stored sets of characteristic values;

    identifying the training gesture, for which the set of characteristic values matched the set of characteristic values of the detected actual gesture, as a gesture of interest; and

    displaying an on-screen indicator to reflect a degree of completion of the gesture of interest as the detected actual gesture is being performed by the user in real time, thereby enabling the user to dynamically control a relationship between actual movement of the user in real time and a corresponding action displayed in real time on the screen, wherein the degree of completion is displayed by the on-screen indicator in real time as the user continues to perform the detected actual gesture to reflect the degree of completion of the actual gesture, in real time, as performed by the user while dynamically interacting with virtual objects displayed on the screen.
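The claimed train/match/feedback flow can be sketched in code. This is a minimal illustrative assumption, not the patent's implementation: the function names and the particular characteristic values chosen here (path length, net displacement, sample count, computed from 3D sensor samples) are hypothetical stand-ins for whatever feature set an actual gesture system would use.

```python
import math

def characteristic_values(samples):
    """Reduce raw 3D sensor samples (x, y, z tuples) to a set of
    characteristic values. The three features below are illustrative:
    total path length, net displacement, and sample count."""
    path = sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))
    net = math.dist(samples[0], samples[-1])
    return (path, net, float(len(samples)))

def train(store, name, samples):
    """Training environment: store the characteristic values that will
    later identify this user-defined customized gesture."""
    store[name] = characteristic_values(samples)

def match(store, samples):
    """Interaction environment: compare the actual gesture's values
    against multiple stored sets and return the best-matching gesture
    (the 'gesture of interest')."""
    actual = characteristic_values(samples)
    return min(store, key=lambda name: math.dist(store[name], actual))

def degree_of_completion(store, name, partial_samples):
    """Fraction of the trained gesture's path length covered so far,
    clamped to 1.0 — the value an on-screen indicator would display."""
    done = characteristic_values(partial_samples)[0] if len(partial_samples) > 1 else 0.0
    trained_path = store[name][0]
    return min(1.0, done / trained_path) if trained_path else 1.0
```

For example, after training a "swipe" gesture along two meters of the x-axis, a live gesture that has covered one meter would match "swipe" and report a completion of 0.5, which the interaction environment could render as a progress indicator updated each frame.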

  • 11 Assignments