Gesture detection and interactions
Abstract
Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
20 Claims
1. A method of controlling operation of a computing device based on gesture detection, the method comprising:
detecting a first input by the computing device by recognizing a first gesture performed by a first hand of a user;
detecting a second input by the computing device by recognizing a second gesture performed by a second hand of the user, the second input defining one or more operations to be performed; and
controlling, by the computing device, performance of the one or more operations by dynamically adjusting, based on the first input, an amount of scale used in the performance of the one or more operations.
Dependent claims: 2, 3, 4, 5, 6, 7, 8.
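The two-handed method recited in claim 1 can be sketched as follows. This is purely an illustrative reading of the claim language, not an implementation from the patent: the gesture names, scale factors, and the `perform` helper are all hypothetical, since the claim does not specify concrete gestures or values.

```python
# Hypothetical sketch of claim 1: a gesture by the first hand selects a
# scaling factor, a gesture by the second hand selects the operation,
# and the operation is performed at the dynamically adjusted scale.

# First-hand gestures mapped to scale factors (assumed values).
SCALE_GESTURES = {"pinch": 0.25, "open_palm": 1.0, "spread": 4.0}

# Second-hand gestures mapped to operations on a value.
OPERATION_GESTURES = {
    "swipe_right": lambda value, scale: value + 10 * scale,  # e.g. scroll
    "rotate": lambda value, scale: value + 90 * scale,       # e.g. rotate view
}

def perform(first_gesture: str, second_gesture: str, value: float) -> float:
    """Perform the operation defined by the second gesture, scaled by
    the factor set by the first gesture (claim 1's controlling step)."""
    scale = SCALE_GESTURES[first_gesture]
    operation = OPERATION_GESTURES[second_gesture]
    return operation(value, scale)

# The same second-hand swipe moves by a smaller amount under a "pinch"
# (fine scale) than under a "spread" (coarse scale).
print(perform("pinch", "swipe_right", 0.0))   # 2.5
print(perform("spread", "swipe_right", 0.0))  # 40.0
```

The point of the sketch is that the second hand's gesture alone determines *which* operation runs, while the first hand's gesture only modulates *how much* of it is applied.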
9. A computing device comprising:
a housing;
a three dimensional object detection system disposed within the housing and configured to identify and detect orientation and movement of first and second hands of a user in three dimensional space using radio waves; and
a gesture module implemented at least partially in hardware and disposed within the housing, the gesture module configured to control performance of one or more operations by the computing device responsive to detection of first orientation and movement of the first hand of the user to modify a scaling factor used in the performance of the one or more operations defined by a gesture corresponding to second orientation and movement of the second hand of the user.
Dependent claims: 10, 11, 12, 13, 20.
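The device architecture in claim 9, a detection system feeding a gesture module that applies a scaling factor, might be sketched as below. The class names, the placeholder hand readings standing in for radar returns, and the scaling values are all assumptions for illustration; the patent does not disclose source code.

```python
from dataclasses import dataclass

@dataclass
class HandState:
    """Orientation and movement reported for one hand."""
    orientation: str
    movement: str

class ObjectDetectionSystem:
    """Stand-in for claim 9's three dimensional object detection system;
    a real device would derive hand state from reflected radio waves."""
    def read_hands(self) -> tuple:
        # Placeholder readings in lieu of actual radar returns.
        return (HandState("palm_down", "pinch"),
                HandState("palm_side", "swipe"))

class GestureModule:
    """The first hand's movement modifies the scaling factor; the
    second hand's gesture defines the operation to perform."""
    SCALING = {"pinch": 0.1, "open": 1.0}  # assumed factors

    def __init__(self, detector: ObjectDetectionSystem):
        self.detector = detector

    def step(self, position: float) -> float:
        first, second = self.detector.read_hands()
        scale = self.SCALING.get(first.movement, 1.0)
        if second.movement == "swipe":
            position += 100 * scale  # operation defined by second hand
        return position

module = GestureModule(ObjectDetectionSystem())
print(module.step(0.0))  # 10.0
```

Separating detection from interpretation this way mirrors the claim's division between the detection system and the hardware gesture module.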
14. A computing device comprising:
a three dimensional object detection system configured to detect orientation or movement of objects in three dimensional space; and
a gesture module implemented at least partially in hardware, the gesture module configured to:
detect a first input, received by the computing device, by recognizing a first gesture defined by a first movement or orientation of a first hand of a user;
detect a second input, received by the computing device, by recognizing a second gesture defined by a second movement or orientation of a second hand of the user; and
control, by the computing device, performance of one or more operations defined by the second gesture by dynamically adjusting an amount of scale used in performance of the one or more operations based on the first gesture.
Dependent claims: 15, 16, 17, 18, 19.