
Dynamic user interactions for display control and measuring degree of completeness of user gestures

  • US 10,042,510 B2
  • Filed: 01/15/2014
  • Issued: 08/07/2018
  • Est. Priority Date: 01/15/2013
  • Status: Active Grant
First Claim

1. A method of creating interface elements in a three-dimensional (3D) sensory space, the method including:

  • detecting a first motion made by a finger in a three-dimensional (3D) sensory space using an electronic sensor, including capturing, by cameras of the electronic sensor, images of the first motion made by the finger and analyzing the images to detect edges of objects including the finger and other fingers;

    determining that the first motion by the finger includes a first swirl gesture by representing gesture-relevant body parts including the finger and other fingers using a simplified skeletal model and determining therefrom gestures made by individual fingers, wherein:

    the finger making the first motion is a finger that is determined to be moving in a swirling motion; and

    the finger is attached to a hand that is determined to not move along with the finger that is determined to be moving in the swirling motion;

    determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the first motion of the finger and (b) lack of motion of the hand to which the finger is attached;

    recognizing a gesture associated with the first motion of the finger without motion of the hand to which the finger is attached;

    constructing an on-screen puck having at least one associated function, wherein the on-screen puck is constructed responsive to the first motion made by the finger without motion of the hand to which the finger is attached;

    detecting, in additional images captured using the cameras of the electronic sensor, a subsequent second motion of both the finger and an extended second finger in the 3D sensory space, wherein:

    the finger making the first motion and the extended second finger are determined to be moving in the second motion; and

    the detected subsequent second motion being different from the first motion;

    determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the second motion of the finger and (b) the extended second finger;

    recognizing a gesture associated with the second motion of the finger and the extended second finger that is different from the gesture associated with the first motion of the finger; and

    rotating the on-screen puck responsive to the subsequent second motion of the finger and the extended second finger moving in the 3D sensory space and performing the at least one associated function.
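
A minimal sketch of how the claimed flow might be prototyped is shown below. It is a hypothetical illustration only: the names (PointTrack, Puck, is_swirl_without_hand_motion, is_two_finger_motion) and the trajectory-magnitude thresholds are assumptions made for this example, not the patent's implementation, and camera capture, edge detection, and skeletal-model fitting are replaced by fabricated fingertip and palm trajectories.

```python
# Hypothetical sketch of the claimed gesture flow. PointTrack, Puck, the
# threshold values, and all trajectories below are illustrative assumptions,
# not the patent's implementation.
from dataclasses import dataclass, field
import math


@dataclass
class PointTrack:
    """Successive 3D positions of one tracked point (fingertip or palm)
    taken from a simplified skeletal model of the hand."""
    positions: list = field(default_factory=list)  # [(x, y, z), ...]

    def trajectory_magnitude(self) -> float:
        """Total path length of the tracked point, used to gate gesture recognition."""
        return sum(math.dist(a, b) for a, b in zip(self.positions, self.positions[1:]))


@dataclass
class Puck:
    """On-screen puck with a rotation angle and at least one associated function."""
    angle: float = 0.0

    def rotate(self, delta: float, on_rotate=None):
        self.angle = (self.angle + delta) % 360.0
        if on_rotate:  # the puck's associated function
            on_rotate(self.angle)


MOTION_THRESHOLD = 5.0  # assumed minimum trajectory magnitude to count as motion
STILL_THRESHOLD = 1.0   # assumed maximum trajectory magnitude to count as "not moving"


def is_swirl_without_hand_motion(finger: PointTrack, hand: PointTrack) -> bool:
    """First gesture: the finger moves in a swirl while its hand stays still."""
    return (finger.trajectory_magnitude() > MOTION_THRESHOLD
            and hand.trajectory_magnitude() < STILL_THRESHOLD)


def is_two_finger_motion(finger: PointTrack, second_finger: PointTrack) -> bool:
    """Second gesture: both the first finger and an extended second finger move."""
    return (finger.trajectory_magnitude() > MOTION_THRESHOLD
            and second_finger.trajectory_magnitude() > MOTION_THRESHOLD)


if __name__ == "__main__":
    # Fabricated trajectories standing in for positions derived from camera images.
    swirl = PointTrack([(math.cos(t), math.sin(t), 0.0) for t in range(12)])
    still_hand = PointTrack([(0.0, 0.0, 0.0)] * 12)

    puck = None
    if is_swirl_without_hand_motion(swirl, still_hand):
        puck = Puck()  # construct the on-screen puck in response to the first gesture

    second_finger = PointTrack([(t * 0.5, 0.0, 0.0) for t in range(12)])
    if puck and is_two_finger_motion(swirl, second_finger):
        puck.rotate(30.0, on_rotate=lambda a: print(f"puck rotated to {a:.1f} degrees"))
```

Running the sketch constructs the puck when the single-finger swirl is recognized with the hand held still, then rotates the puck and invokes its associated function when the two-finger motion is recognized, mirroring the final limitation of the claim.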
