Dynamic user interactions for display control and measuring degree of completeness of user gestures
First Claim
1. A method of creating interface elements in a three-dimensional (3D) sensory space, the method including:
detecting a first motion made by a finger in the 3D sensory space using an electronic sensor, including capturing, by cameras of the electronic sensor, images of the first motion made by the finger and analyzing the images to detect edges of objects including the finger and other fingers;
determining that the first motion by the finger includes a first swirl gesture by representing gesture-relevant body parts including the finger and other fingers using a simplified skeletal model and determining therefrom gestures made by individual fingers, wherein:
the finger making the first motion is a finger that is determined to be moving in a swirling motion; and
the finger is attached to a hand that is determined to not move along with the finger that is determined to be moving in the swirling motion;
determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the first motion of the finger and (b) lack of motion of the hand to which the finger is attached;
recognizing a gesture associated with the first motion of the finger without motion of the hand to which the finger is attached;
constructing an on-screen puck having at least one associated function, wherein the on-screen puck is constructed responsive to the first motion made by the finger without motion of the hand to which the finger is attached;
detecting in additional images, captured using the cameras of the electronic sensor, a subsequent second motion of both the finger and an extended second finger in the 3D sensory space, wherein:
the finger making the first motion and the extended second finger are determined to be moving in the second motion; and
the detected subsequent second motion is different from the first motion;
determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the second motion of the finger and (b) the extended second finger;
recognizing a gesture associated with the second motion of the finger and the extended second finger that is different from the gesture associated with the first motion of the finger and the extended second finger; and
rotating the on-screen puck responsive to the subsequent second motion of the finger and the extended second finger moving in the 3D sensory space and performing the at least one associated function.
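The claimed sequence can be illustrated with a minimal sketch. All names, data, and the path-length rule here are hypothetical illustrations of the claim language, not the patented implementation: a trajectory's "magnitude" is taken as its path length, the dominant gesture is the one with the largest magnitude, and the puck exposes rotation as its associated function.

```python
import math

def path_length(points):
    """Total length of a spatial trajectory given as (x, y, z) samples."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def dominant_gesture(trajectories):
    """Pick the gesture whose trajectory has the largest magnitude.

    `trajectories` maps a gesture label to its sampled 3D path.
    """
    return max(trajectories, key=lambda g: path_length(trajectories[g]))

class Puck:
    """On-screen puck with one associated function (here: rotation)."""
    def __init__(self):
        self.angle = 0.0

    def rotate(self, degrees):
        self.angle = (self.angle + degrees) % 360.0

# First motion: the finger swirls while the hand stays essentially still.
finger_path = [(0, 0, 0), (1, 1, 0), (0, 2, 0), (-1, 1, 0), (0, 0, 0)]
hand_path = [(0, 0, 0), (0.01, 0, 0)]  # lack of motion of the hand

if dominant_gesture({"finger_swirl": finger_path, "hand": hand_path}) == "finger_swirl":
    puck = Puck()  # construct the puck responsive to the swirl gesture

# Second motion: the finger plus an extended second finger → rotate the puck.
puck.rotate(45.0)
```

The key design point the claim turns on is the comparison step: both candidate trajectories are measured, and only the dominant one produces a recognized gesture.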
Abstract
The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing within the 3D sensory space, or virtual object density in the gestural interface. The technology disclosed further relates to detecting if a user has intended to interact with a virtual object based on measuring a degree of completion of gestures and creating interface elements in the 3D sensory space.
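The abstract's auto-scaling idea can be read as a gain that maps physical gesture distance to on-screen displacement, adjusted by the user's distance from the sensor and by virtual object density. The formula and parameter names below are illustrative assumptions, not taken from the patent text:

```python
def onscreen_displacement(gesture_mm, user_distance_m, object_density,
                          base_gain=2.0):
    """Map a physical gesture distance (mm) to on-screen pixels.

    Hypothetical scaling: farther users get a larger gain so small
    gestures still reach across the screen; denser interfaces get a
    smaller gain so individual targets are easier to hit.
    """
    distance_gain = 1.0 + user_distance_m        # farther -> more gain
    density_gain = 1.0 / (1.0 + object_density)  # denser -> less gain
    return gesture_mm * base_gain * distance_gain * density_gain

# A 50 mm swipe from 1 m away in a sparse interface moves the cursor 200 px;
# the same swipe in a dense interface (density 1.0) moves it only 100 px.
sparse_px = onscreen_displacement(50, user_distance_m=1.0, object_density=0.0)
dense_px = onscreen_displacement(50, user_distance_m=1.0, object_density=1.0)
```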
18 Claims
1. A method of creating interface elements in a three-dimensional (3D) sensory space, the method including:
detecting a first motion made by a finger in the 3D sensory space using an electronic sensor, including capturing, by cameras of the electronic sensor, images of the first motion made by the finger and analyzing the images to detect edges of objects including the finger and other fingers;
determining that the first motion by the finger includes a first swirl gesture by representing gesture-relevant body parts including the finger and other fingers using a simplified skeletal model and determining therefrom gestures made by individual fingers, wherein:
the finger making the first motion is a finger that is determined to be moving in a swirling motion; and
the finger is attached to a hand that is determined to not move along with the finger that is determined to be moving in the swirling motion;
determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the first motion of the finger and (b) lack of motion of the hand to which the finger is attached;
recognizing a gesture associated with the first motion of the finger without motion of the hand to which the finger is attached;
constructing an on-screen puck having at least one associated function, wherein the on-screen puck is constructed responsive to the first motion made by the finger without motion of the hand to which the finger is attached;
detecting in additional images, captured using the cameras of the electronic sensor, a subsequent second motion of both the finger and an extended second finger in the 3D sensory space, wherein:
the finger making the first motion and the extended second finger are determined to be moving in the second motion; and
the detected subsequent second motion is different from the first motion;
determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the second motion of the finger and (b) the extended second finger;
recognizing a gesture associated with the second motion of the finger and the extended second finger that is different from the gesture associated with the first motion of the finger and the extended second finger; and
rotating the on-screen puck responsive to the subsequent second motion of the finger and the extended second finger moving in the 3D sensory space and performing the at least one associated function.
- View Dependent Claims (2, 3, 4, 5, 6, 7)
8. A system including one or more processors coupled to memory storing computer instructions to create interface elements in a three-dimensional (3D) sensory space, which computer instructions, when executed on the processors, implement actions comprising:
detecting a first motion made by a finger in the 3D sensory space using an electronic sensor, including capturing, by cameras of the electronic sensor, images of the first motion made by the finger and analyzing the images to detect edges of objects including the finger and other fingers;
determining that the first motion by the finger includes a first swirl gesture by representing gesture-relevant body parts including the finger and other fingers using a simplified skeletal model and determining therefrom gestures made by individual fingers, wherein:
the finger making the first motion is a finger that is determined to be moving in a swirling motion; and
the finger is attached to a hand that is determined to not move along with the finger that is determined to be moving in the swirling motion;
determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the first motion of the finger and (b) lack of motion of the hand to which the finger is attached;
recognizing a gesture associated with the first motion of the finger without motion of the hand to which the finger is attached;
constructing an on-screen puck having at least one associated function, wherein the on-screen puck is constructed responsive to the first motion made by the finger without motion of the hand to which the finger is attached;
detecting in additional images, captured using the cameras of the electronic sensor, a subsequent second motion of both the finger and an extended second finger in the 3D sensory space, wherein:
the finger making the first motion and the extended second finger are determined to be moving in the second motion; and
the detected subsequent second motion is different from the first motion;
determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the second motion of the finger and (b) the extended second finger;
recognizing a gesture associated with the second motion of the finger and the extended second finger that is different from the gesture associated with the first motion of the finger and the extended second finger; and
rotating the on-screen puck responsive to the subsequent second motion of the finger and the extended second finger moving in the 3D sensory space and performing the at least one associated function.
- View Dependent Claims (9, 10, 11, 12)
13. A non-transitory computer readable storage medium impressed with computer program instructions to create interface elements in a three-dimensional (3D) sensory space, which computer program instructions, when executed on one or more processors, implement actions comprising:
detecting a first motion made by a finger in the 3D sensory space using an electronic sensor, including capturing, by cameras of the electronic sensor, images of the first motion made by the finger and analyzing the images to detect edges of objects including the finger and other fingers;
determining that the first motion by the finger includes a first swirl gesture by representing gesture-relevant body parts including the finger and other fingers using a simplified skeletal model and determining therefrom gestures made by individual fingers, wherein:
the finger making the first motion is a finger that is determined to be moving in a swirling motion; and
the finger is attached to a hand that is determined to not move along with the finger that is determined to be moving in the swirling motion;
determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the first motion of the finger and (b) lack of motion of the hand to which the finger is attached;
recognizing a gesture associated with the first motion of the finger without motion of the hand to which the finger is attached;
constructing an on-screen puck having at least one associated function, wherein the on-screen puck is constructed responsive to the first motion made by the finger without motion of the hand to which the finger is attached;
detecting in additional images, captured using the cameras of the electronic sensor, a subsequent second motion of both the finger and an extended second finger in the 3D sensory space, wherein:
the finger making the first motion and the extended second finger are determined to be moving in the second motion; and
the detected subsequent second motion is different from the first motion;
determining, based on magnitudes of their respective spatial trajectories, whether to recognize a gesture associated with (a) the second motion of the finger and (b) the extended second finger;
recognizing a gesture associated with the second motion of the finger and the extended second finger that is different from the gesture associated with the first motion of the finger and the extended second finger; and
rotating the on-screen puck responsive to the subsequent second motion of the finger and the extended second finger moving in the 3D sensory space and performing the at least one associated function.
- View Dependent Claims (14, 15, 16, 17)
18. A method of creating interface elements in a three-dimensional (3D) sensory space, the method comprising:
detecting a first motion made by a finger in the 3D sensory space using an electronic sensor;
detecting, by the electronic sensor, an arm waving, wherein:
the finger making the first motion is a finger on a hand attached to the waving arm;
the arm moves a greater distance through the 3D sensory space than a distance that the finger moves relative to the hand; and
the finger moves a distance of 1 to 5 millimeters relative to the hand;
detecting positions of the arm, the hand attached to the arm, and fingers attached to the hand in the 3D sensory space;
calculating from the detected positions a spatial trajectory of a waving motion executed by the arm and of the motion made by the finger;
determining, based on magnitudes of the respective spatial trajectories, whether to recognize a gesture associated with (a) the waving motion or (b) the motion of the finger;
recognizing a gesture associated with the first motion of the finger without recognizing a gesture associated with the arm waving motion; and
manipulating an on-screen item, constructed in response to a detected motion, according to the recognized gesture.
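Claim 18 recognizes the small finger gesture even though the arm's trajectory is far larger, which suggests measuring the finger's motion relative to the hand's frame rather than in absolute space. The sketch below illustrates one such rule under that assumption; the function names and the decision rule are hypothetical, with only the 1 to 5 mm band taken from the claim:

```python
import math

def relative_finger_motion(finger_path, hand_path):
    """Finger path length measured in the hand's own frame (mm)."""
    rel = [tuple(f - h for f, h in zip(fp, hp))
           for fp, hp in zip(finger_path, hand_path)]
    return sum(math.dist(a, b) for a, b in zip(rel, rel[1:]))

def recognize(finger_path, hand_path, lo_mm=1.0, hi_mm=5.0):
    """Recognize the finger gesture when its hand-relative motion falls
    in the claim's 1-5 mm band, even though the arm sweeps much farther."""
    rel = relative_finger_motion(finger_path, hand_path)
    return "finger_gesture" if lo_mm <= rel <= hi_mm else "arm_wave"

# The arm waves 300 mm, but the finger moves only 3 mm relative to the hand.
hand_path = [(0, 0, 0), (100, 0, 0), (200, 0, 0), (300, 0, 0)]
finger_path = [(0, 0, 0), (101, 0, 0), (202, 0, 0), (303, 0, 0)]
gesture = recognize(finger_path, hand_path)
```

Subtracting the hand's position sample-by-sample removes the gross arm sweep, leaving only the deliberate fingertip displacement to compare against the millimeter band.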
Specification