Velocity field interaction for free space gesture interface and control
Abstract
The technology disclosed relates to automatically interpreting a gesture of a control object in a three-dimensional (3D) sensor space by sensing a movement of the control object in the 3D sensor space, sensing an orientation of the control object, defining a control plane tangential to a surface of the control object, and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane.
16 Claims
1. A method of automatically interpreting a gesture of a control object, in a three-dimensional (3D) sensor space using a 3D sensor, as a first gesture or a second gesture, the method including:

defining a control plane that remains tangent to a surface of the control object throughout a sensed movement of the control object in any direction in the 3D sensor space, the control plane being defined by an orientation of the control object, as sensed by a video camera of the 3D sensor; and

interpreting the gesture of the control object as the first gesture or the second gesture by comparing a direction of a sensed trajectory of the movement of the control object to (i) a normal vector that is normal to the defined control plane and (ii) a surface of the defined control plane, wherein:

the gesture is the first gesture when the direction of the trajectory of the movement of the control object is within a pre-determined range of the normal vector of the control plane; and

the gesture is the second gesture when the direction of the trajectory of the movement of the control object is parallel to the surface of the control plane, within a pre-determined range;

such that the gesture is the first gesture when the direction of the trajectory of the movement of the control object is more normal to the surface of the control plane than parallel to it, and the second gesture when the direction is more parallel to the surface of the control plane than normal to it.

Dependent claims: 2, 3, 4, 5.
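The classification step of claim 1 reduces to comparing the angle between the movement direction and the control-plane normal against a threshold. The sketch below illustrates one way this could be done; the function name, the 45-degree default threshold, and the choice to treat pull (anti-normal) movements the same as push movements are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def classify_gesture(trajectory_dir, plane_normal, threshold_deg=45.0):
    """Classify a movement as the 'first' (normal-dominant) or 'second'
    (parallel-dominant) gesture relative to the control plane.

    trajectory_dir : 3-vector giving the sensed movement direction.
    plane_normal   : 3-vector normal to the control plane.
    threshold_deg  : half-angle of the cone around the normal; directions
                     inside the cone count as "more normal than parallel"
                     (assumed value for this sketch).
    """
    t = np.asarray(trajectory_dir, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    t /= np.linalg.norm(t)
    n /= np.linalg.norm(n)
    # Angle between trajectory and normal; abs() makes a pull opposite the
    # normal classify the same as a push along it (an assumption here).
    angle = np.degrees(np.arccos(np.clip(abs(np.dot(t, n)), 0.0, 1.0)))
    return "first" if angle <= threshold_deg else "second"
```

A push straight along the normal yields the first gesture; a swipe along the plane surface yields the second.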
6. A method of automatically interpreting a gesture of a control object, in a three-dimensional (3D) sensor space relative to a flow depicted in a display, as a first gesture or a second gesture, the method including:

defining a control plane that remains tangent to a surface of the control object throughout a sensed movement of the control object in any direction in the 3D sensor space, the control plane being defined by an orientation of the control object, as sensed by a video camera of the 3D sensor; and

interpreting the gesture of the control object as the first gesture or the second gesture by comparing a direction of the flow to (i) a normal vector that is normal to the defined control plane and (ii) a surface of the defined control plane, wherein:

the gesture is the first gesture when the direction of the flow is within a pre-determined range of the normal vector of the control plane; and

the gesture is the second gesture when the direction of the flow is parallel to the surface of the control plane, within a pre-determined range;

such that the gesture is the first gesture when the direction of the flow is more normal to the surface of the control plane than parallel to it, and the second gesture when the direction of the flow is more parallel to the surface of the control plane than normal to it.

Dependent claims: 7, 8, 9.
10. A method of navigating a multi-layer presentation tree using gestures of a control object in a three-dimensional (3D) sensor space, using a 3D sensor, by distinguishing between the control object and a sub-object of the control object, the method including:

processing an output of a video camera of the 3D sensor, thereby sensing a movement of the control object in any direction in the 3D sensor space using a control plane that remains tangent to a surface of the control object throughout the movement, the control plane being defined by processing the output of the video camera of the 3D sensor and a sensed orientation of the control object; and

interpreting, by a computing device, a direction of the movement of the control object as scrolling through a particular level of the multi-layer presentation tree when the direction of the movement is more normal with respect to the surface of the tangent control plane than parallel with respect to it.

Dependent claims: 11, 12, 13.
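Claim 10 applies the same normal-versus-parallel test to navigation: a movement that is more normal to the control plane scrolls within the current tree level. A minimal sketch, assuming a 45-degree threshold and assuming (beyond what the claim recites) that parallel-dominant movement maps to some other action such as changing levels; the function and action names are hypothetical.

```python
import numpy as np

def navigation_action(movement_dir, plane_normal, threshold_deg=45.0):
    """Map a sensed movement direction to a presentation-tree action.

    Movement more normal to the control plane scrolls through the current
    level; movement more parallel is mapped here to a level change, which
    is an assumption of this sketch rather than a recitation of the claim.
    """
    d = np.asarray(movement_dir, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    d /= np.linalg.norm(d)
    n /= np.linalg.norm(n)
    # Angle from the plane normal; small angle means normal-dominant motion.
    angle = np.degrees(np.arccos(np.clip(abs(np.dot(d, n)), 0.0, 1.0)))
    return "scroll_level" if angle <= threshold_deg else "change_level"
```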
14. A method of interpreting a gesture of a control object in a 3D sensor space relative to one or more objects depicted in a display, the method including:

sensing a speed of a movement of the control object moving in any direction through the 3D sensor space using a 3D sensor;

interpreting, by a computing device, the movement as a path on the display if the speed of the movement exceeds a pre-determined threshold; and

duplicating one or more of the objects in the display that intersect the path as interpreted.

Dependent claims: 15, 16.
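Claim 14 gates the path interpretation on a speed threshold and then duplicates the intersected display objects. The sketch below shows one possible realization; the axis-aligned bounding-box object representation, the point-in-box intersection test, and the threshold value are all assumptions made for illustration.

```python
def duplicate_intersected(objects, path_points, speed, speed_threshold=1.5):
    """If the control object moves faster than the threshold, treat the
    movement as a path on the display and duplicate every depicted object
    the path intersects.

    objects     : list of dicts, each with a 'bbox' = (xmin, ymin, xmax, ymax)
                  axis-aligned box in display coordinates (assumed format).
    path_points : list of (x, y) display coordinates along the movement.
    speed       : sensed speed of the control object.
    """
    if speed <= speed_threshold:
        return objects  # below threshold: movement is not a path gesture

    def path_hits(bbox):
        xmin, ymin, xmax, ymax = bbox
        # Simple test: any sampled path point falls inside the box.
        return any(xmin <= x <= xmax and ymin <= y <= ymax
                   for x, y in path_points)

    duplicates = [dict(obj) for obj in objects if path_hits(obj["bbox"])]
    return objects + duplicates
```

A fast swipe through one of two displayed objects leaves three objects on the display; a slow movement leaves the display unchanged.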
Specification