Gesture detection and interactions
First Claim
1. A method of controlling operation of a computing device based on gesture detection, the method comprising:
determining, via touchscreen functionality of the computing device and without reliance on a three-dimensional object detection system of the computing device, a first input based on a touch-input performed by a first hand of a user selecting an item displayed by the touchscreen functionality;
determining, via the three-dimensional object detection system without reliance on the touchscreen functionality, a second input based on a three-dimensional gesture performed by a second hand of the user that defines an operation for the item; and
controlling performance of the operation on the item based on the first and second inputs.
2 Assignments
0 Petitions
Abstract
Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
42 Claims
1. A method of controlling operation of a computing device based on gesture detection, the method comprising:
determining, via touchscreen functionality of the computing device and without reliance on a three-dimensional object detection system of the computing device, a first input based on a touch-input performed by a first hand of a user selecting an item displayed by the touchscreen functionality;

determining, via the three-dimensional object detection system without reliance on the touchscreen functionality, a second input based on a three-dimensional gesture performed by a second hand of the user that defines an operation for the item; and

controlling performance of the operation on the item based on the first and second inputs.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9.
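The two-input flow of claim 1 can be illustrated with a minimal sketch: one input from the touchscreen selects an item, a separate input from the 3D detection system names an operation, and the device applies that operation to the selected item. All names, types, and operations below are hypothetical illustrations, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class TouchInput:
    item_id: str  # item selected on the touchscreen by the first hand

@dataclass
class GestureInput:
    operation: str  # operation defined by the second hand's 3D gesture

def control_operation(touch: TouchInput, gesture: GestureInput,
                      handlers: dict) -> str:
    """Perform the gesture-defined operation on the touch-selected item."""
    handler = handlers.get(gesture.operation)
    if handler is None:
        raise ValueError(f"unrecognized gesture operation: {gesture.operation}")
    return handler(touch.item_id)

# Hypothetical operation handlers keyed by recognized gesture name.
handlers = {
    "rotate": lambda item: f"rotated {item}",
    "delete": lambda item: f"deleted {item}",
}

result = control_operation(TouchInput("photo-42"), GestureInput("rotate"), handlers)
```

The point of the structure is the claim's separation of concerns: the touch input only identifies *which* item, while the 3D gesture only identifies *what* to do, and neither input alone triggers the operation.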
10. A method of controlling operation of a computing device based on gesture detection, the method comprising:
detecting, using a three-dimensional object detection system of the computing device that is configured to detect objects in three-dimensional space, a first collection of inputs involving identification, orientation, or movement of one or more hands of a user within a threshold distance from the computing device;

causing performance of a first operation by the computing device responsive to the detection of the first collection of inputs;

detecting, using the three-dimensional object detection system of the computing device, a second collection of inputs involving identification, orientation, or movement of the one or more hands of the user outside the threshold distance from the computing device, the identification, orientation, or movement of the one or more hands of the user associated with the second collection of inputs being similar to the identification, orientation, or movement of the one or more hands of the user associated with the first collection of inputs; and

causing performance of a second operation by the computing device responsive to the detection of the second collection of inputs, the second operation being different than the first operation.

Dependent claims: 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21.
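The distance-based differentiation in claims 10 and 31 can be sketched as a dispatch on the detected motion plus whether it occurred within a threshold distance of the device: the same motion maps to different operations depending on the distance band. The threshold value, motion names, and operation names below are illustrative assumptions, not from the patent.

```python
THRESHOLD_CM = 10.0  # hypothetical threshold distance from the device

# Hypothetical mapping: (motion, within_threshold?) -> operation.
# The same motion yields different operations inside vs. outside the threshold.
OPERATIONS = {
    ("swipe", True): "scroll_page",   # performed near the device
    ("swipe", False): "switch_app",   # same motion, performed farther away
}

def dispatch(motion: str, distance_cm: float) -> str:
    """Select an operation from a detected hand motion and its distance."""
    within = distance_cm <= THRESHOLD_CM
    op = OPERATIONS.get((motion, within))
    if op is None:
        raise ValueError(f"no operation mapped for motion {motion!r}")
    return op
```

For example, `dispatch("swipe", 5.0)` and `dispatch("swipe", 30.0)` return different operations for an identical motion, which is the claimed behavior of similar inputs inside versus outside the threshold producing different operations.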
22. A computing device comprising:
a three-dimensional object detection system configured to detect objects in three-dimensional space;

a touchscreen; and

a gesture module implemented at least partially in hardware, the gesture module configured to:

determine, via the touchscreen and without reliance on the three-dimensional object detection system, a first input based on a touch-input performed by a first hand of a user selecting an item displayed by the touchscreen;

determine, via the three-dimensional object detection system without reliance on the touchscreen, a second input based on a three-dimensional gesture performed by a second hand of the user that defines an operation for the item; and

control performance of the operation on the item based on the first and second inputs.

Dependent claims: 23, 24, 25, 26, 27, 28, 29, 30.
31. A computing device comprising:
a three-dimensional object detection system configured to detect objects in three-dimensional space; and

a gesture module implemented at least partially in hardware, the gesture module configured to:

cause performance of a first operation responsive to detection, by the three-dimensional object detection system, of a first collection of inputs involving identification, orientation, or movement of one or more hands of a user within a threshold distance from the computing device; and

cause performance of a second operation responsive to detection, by the three-dimensional object detection system, of a second collection of inputs involving identification, orientation, or movement of the one or more hands of the user that is not within the threshold distance from the computing device, the second operation being different than the first operation, the identification, orientation, or movement of the one or more hands of the user associated with the second collection of inputs being similar to the identification, orientation, or movement of the one or more hands of the user associated with the first collection of inputs.

Dependent claims: 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42.
Specification