Dynamic user interactions for display control and scaling responsiveness of display objects
First Claim

1. A method of uniformly responding to gestural inputs from multiple users within a three-dimensional (3D) sensory space, the method including:

automatically adapting a responsiveness scale between gestures in a physical space from multiple users and resulting responses in a shared gestural interface by:

calculating a user spacing within a three-dimensional (3D) sensory space based on a spacing of users detected in the 3D sensory space; and

automatically proportioning on-screen responsiveness of the shared gestural interface responsive to the user spacing when interpreting movement distances of the gestures in the physical space.
Abstract
The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of the user's position. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing within the 3D sensory space, or virtual object density in the gestural interface. The technology disclosed further relates to detecting whether a user intended to interact with a virtual object by measuring a degree of completion of gestures, and to creating interface elements in the 3D space.
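The spacing-based responsiveness scaling described in the abstract and claims can be illustrated with a short sketch. This is not the patented implementation; it is a minimal Python illustration under simple assumptions (user spacing modeled as mean pairwise distance between tracked user positions, and on-screen response scaled linearly with that spacing). All names here (`user_spacing`, `scale_movement`, `reference_spacing_m`) are hypothetical.

```python
import itertools
import math


def user_spacing(positions):
    """Hypothetical user-spacing metric: mean pairwise Euclidean distance
    between tracked user positions (3D points, in meters)."""
    pairs = list(itertools.combinations(positions, 2))
    if not pairs:
        return 0.0
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return sum(dist(a, b) for a, b in pairs) / len(pairs)


def scale_movement(movement_m, spacing_m, reference_spacing_m=1.0):
    """Proportion on-screen responsiveness to user spacing: the farther
    apart the detected users are, the more on-screen travel a given
    physical movement distance produces (linear model assumed)."""
    gain = spacing_m / reference_spacing_m
    return movement_m * gain
```

For example, with two users two meters apart, a 10 cm hand movement would be interpreted with twice the on-screen travel it would receive at the assumed one-meter reference spacing.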
19 Claims
1. A method of uniformly responding to gestural inputs from multiple users within a three-dimensional (3D) sensory space, the method including:

automatically adapting a responsiveness scale between gestures in a physical space from multiple users and resulting responses in a shared gestural interface by:

calculating a user spacing within a three-dimensional (3D) sensory space based on a spacing of users detected in the 3D sensory space; and

automatically proportioning on-screen responsiveness of the shared gestural interface responsive to the user spacing when interpreting movement distances of the gestures in the physical space.

(Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9)
10. A non-transitory computer readable medium storing instructions for uniformly responding to gestural inputs from multiple users within a three-dimensional (3D) sensory space, which instructions when executed by one or more processors perform:

automatically adapting a responsiveness scale between gestures in a physical space from multiple users and resulting responses in a shared gestural interface by:

calculating a user spacing within a three-dimensional (3D) sensory space based on a spacing of users detected in the 3D sensory space; and

automatically proportioning on-screen responsiveness of the shared gestural interface responsive to the user spacing when interpreting movement distances of the gestures in the physical space.

(Dependent claims: 11, 12, 13, 14, 15, 16, 17, 18)
19. A system for uniformly responding to gestural inputs from multiple users within a three-dimensional (3D) sensory space, the system including:

a camera; and

one or more processors coupled to the camera and to a non-transitory computer readable medium storing instructions thereon, which when executed by the processors perform:

automatically adapting a responsiveness scale between gestures performed in a physical space within a field of view of the camera from multiple users and resulting responses in a shared gestural interface by:

calculating a user spacing within a three-dimensional (3D) sensory space based on a spacing of users detected in the 3D sensory space; and

automatically proportioning on-screen responsiveness of the shared gestural interface responsive to the user spacing when interpreting movement distances of the gestures in the physical space.
Specification