Dynamic user interactions for display control and manipulation of display objects
First Claim
1. A method of creating interface elements in a three-dimensional (3D) sensory space, the method including:
- detecting a first motion in a three-dimensional (3D) sensory space;
- constructing, responsive to the first motion, a user interface element for display on a gestural interface, the user interface element being specifically constructed according to the detected first motion and having at least one associated function;
- wherein the associated function of the user interface element is selected based on a location of the user interface element on the gestural interface, the location of the user interface element on the gestural interface being movable and dependent upon the detected first motion;
- detecting, using an electronic sensor, a subsequent second motion in the 3D sensory space, the second motion different from the first motion and proximate to the location of the constructed user interface element on the gestural interface;
- changing state of the user interface element responsive to the subsequent second motion; and
- responsive to the user interface element changing state, performing the at least one associated function selected.
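The claimed sequence — first motion constructs an element whose function depends on its location, a second motion proximate to that element changes its state and triggers the function — can be sketched roughly as follows. The region map, the location-to-region rule, and the proximity threshold are illustrative assumptions for the sketch; the claim itself does not specify them.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

# Hypothetical mapping from on-screen regions to functions; the claim only
# requires that the element's function be selected based on its location.
REGION_FUNCTIONS: Dict[str, Callable[[], str]] = {
    "top": lambda: "open menu",
    "bottom": lambda: "show keyboard",
}

@dataclass
class InterfaceElement:
    location: Tuple[float, float]      # position on the gestural interface
    function: Callable[[], str]        # the at least one associated function
    active: bool = False               # element state

def region_for(location: Tuple[float, float]) -> str:
    # Illustrative location-to-region rule (assumed, not from the claim text).
    return "top" if location[1] < 0.5 else "bottom"

def construct_element(first_motion_end: Tuple[float, float]) -> InterfaceElement:
    """Construct the element at the location implied by the first motion;
    its function is selected from that (motion-dependent) location."""
    loc = first_motion_end
    return InterfaceElement(location=loc, function=REGION_FUNCTIONS[region_for(loc)])

def handle_second_motion(element: InterfaceElement,
                         motion_pos: Tuple[float, float],
                         threshold: float = 0.1):
    """If the second motion is proximate to the element's location, change
    the element's state and perform the associated function."""
    dx = motion_pos[0] - element.location[0]
    dy = motion_pos[1] - element.location[1]
    if (dx * dx + dy * dy) ** 0.5 <= threshold:
        element.active = not element.active   # state change
        return element.function()             # perform the selected function
    return None                               # motion not proximate: no effect
```

For example, a first motion ending at (0.3, 0.2) constructs a "top"-region element; a second motion at (0.32, 0.22) is proximate, toggles its state, and performs its function.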
Abstract
The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing within the 3D sensory space, or virtual object density in the gestural interface. The technology disclosed further relates to detecting whether a user has intended to interact with a virtual object based on measuring a degree of completion of gestures and creating interface elements in the 3D space.
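The dominant-gesture idea in the abstract — compute the spatial trajectory of each concurrent gesture and treat the one with the largest magnitude as meaningful — can be sketched as below. The arc-length magnitude and the linear responsiveness-scaling model are illustrative assumptions; the disclosure does not fix a specific formula.

```python
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]  # a sample of a gesture path in 3D space

def trajectory_magnitude(path: List[Point]) -> float:
    """Magnitude of a gesture's spatial trajectory, taken here as its
    arc length (sum of distances between consecutive samples)."""
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def dominant_gesture(gestures: Dict[str, List[Point]]) -> str:
    """Pick the gesture whose trajectory magnitude is largest; proximate
    gestures with small trajectories are treated as non-meaningful."""
    return max(gestures, key=lambda name: trajectory_magnitude(gestures[name]))

def scaled_response(motion_distance: float, user_distance: float,
                    base_scale: float = 1.0) -> float:
    """Illustrative responsiveness scaling: on-screen response proportioned
    to the physical motion distance, adapted by the user's spacing from
    the sensor (assumed linear model)."""
    return base_scale * motion_distance * user_distance
```

A deliberate swipe then dominates an incidental tremor made near it, and the same physical motion produces a proportionally larger on-screen response for a user standing farther from the sensor.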
22 Claims
1. A method of creating interface elements in a three-dimensional (3D) sensory space, the method including:
- detecting a first motion in a three-dimensional (3D) sensory space;
- constructing, responsive to the first motion, a user interface element for display on a gestural interface, the user interface element being specifically constructed according to the detected first motion and having at least one associated function;
- wherein the associated function of the user interface element is selected based on a location of the user interface element on the gestural interface, the location of the user interface element on the gestural interface being movable and dependent upon the detected first motion;
- detecting, using an electronic sensor, a subsequent second motion in the 3D sensory space, the second motion different from the first motion and proximate to the location of the constructed user interface element on the gestural interface;
- changing state of the user interface element responsive to the subsequent second motion; and
- responsive to the user interface element changing state, performing the at least one associated function selected.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A system including one or more processors coupled to memory storing computer instructions to create interface elements in a three-dimensional (3D) sensory space, which computer instructions, when executed on the one or more processors, implement actions comprising:
- detecting a first motion in a three-dimensional (3D) sensory space;
- constructing, responsive to the first motion, a user interface element for display on a gestural interface, the user interface element being specifically constructed according to the detected first motion and having at least one associated function;
- wherein the associated function of the user interface element is selected based on a location of the user interface element on the gestural interface, the location of the user interface element on the gestural interface being movable and dependent upon the detected first motion;
- detecting, using an electronic sensor, a subsequent second motion in the 3D sensory space, the second motion different from the first motion and proximate to the location of the constructed user interface element on the gestural interface;
- changing state of the user interface element responsive to the subsequent second motion; and
- responsive to the user interface element changing state, performing the at least one associated function selected.
- View Dependent Claims (10, 11, 12, 13, 14, 15)
16. A non-transitory computer readable storage medium impressed with computer program instructions to create interface elements in a three-dimensional (3D) sensory space, which computer program instructions, when executed on one or more processors, implement actions comprising:
- detecting a first motion in a three-dimensional (3D) sensory space;
- constructing, responsive to the first motion, a user interface element for display on a gestural interface, the user interface element being specifically constructed according to the detected first motion and having at least one associated function;
- wherein the associated function of the user interface element is selected based on a location of the user interface element on the gestural interface, the location of the user interface element on the gestural interface being movable and dependent upon the detected first motion;
- detecting a subsequent second motion in the 3D sensory space, the second motion different from the first motion and proximate to the location of the constructed user interface element on the gestural interface;
- changing state of the user interface element responsive to the subsequent second motion; and
- responsive to the user interface element changing state, performing the at least one associated function selected.
- View Dependent Claims (17, 18, 19, 20, 21, 22)
Specification