Gesture detection and interactions
First Claim

1. A method of controlling operation of a computing device based on three-dimensional gesture detection, the method comprising:
    receiving a touch input, by a touchscreen of the computing device, indicating a request from a user of the computing device to enter a gesture recognition mode;
    entering the gesture recognition mode based on the touch input; and
    responsive to entering the gesture recognition mode:
        transmitting, by a three-dimensional object detection system of the computing device, a signal configured to:
            pass through an intervening article worn by or associated with the user; and
            reflect off human tissue of the user;
        receiving, by the three-dimensional object detection system, a reflection of the signal off a portion of the user;
        determining, by the computing device, a gesture performed by the portion of the user in three-dimensional space based on the reflection of the signal; and
        causing performance of one or more operations, by the computing device, based on the determined gesture.
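As a rough illustration only, the control flow recited above can be sketched as a small state machine: a touch input enters the recognition mode, and only then are reflections classified as gestures and mapped to operations. The claim recites no particular implementation; all names, reflection profiles, and the toy classifier below are hypothetical.

```python
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()
    GESTURE_RECOGNITION = auto()

class GestureController:
    """Hypothetical sketch of the claim-1 flow: touch input enters a
    gesture recognition mode; reflections are then classified as
    gestures and mapped to device operations."""

    def __init__(self):
        self.mode = Mode.IDLE
        self.performed = []  # stands in for "causing performance of operations"

    def on_touch_input(self):
        # Touch input indicating a request to enter the recognition mode.
        self.mode = Mode.GESTURE_RECOGNITION

    def on_reflection(self, reflection_profile):
        # Reflections are ignored unless the recognition mode is active.
        if self.mode is not Mode.GESTURE_RECOGNITION:
            return None
        gesture = self._classify(reflection_profile)
        if gesture is not None:
            self.performed.append(gesture)
        return gesture

    @staticmethod
    def _classify(reflection_profile):
        # Toy classifier; a real system would process range/Doppler data.
        return {"swipe_profile": "swipe", "tap_profile": "tap"}.get(reflection_profile)

ctrl = GestureController()
assert ctrl.on_reflection("swipe_profile") is None  # mode not yet entered
ctrl.on_touch_input()
assert ctrl.on_reflection("swipe_profile") == "swipe"
```

The gating on `Mode.GESTURE_RECOGNITION` mirrors the "responsive to entering the gesture recognition mode" clause: transmission, reflection handling, and gesture determination are all conditioned on the mode having been entered.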
Abstract
Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
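The abstract's distance-based differentiation can be pictured as a lookup keyed on both the motion and how far the hand is from the device. This is a minimal sketch under assumed values: the 0.15 m cutoff and all operation names are illustrative, not from the patent.

```python
def operation_for(motion, distance_m):
    """Map the same motion to different operations depending on the
    detected distance of the hand from the device. The cutoff and the
    operation names are hypothetical."""
    near = distance_m < 0.15
    table = {
        ("swipe", True): "scroll_list",   # close to the device
        ("swipe", False): "next_track",   # farther away
        ("tap", True): "select",
        ("tap", False): "play_pause",
    }
    return table.get((motion, near))

assert operation_for("swipe", 0.05) == "scroll_list"
assert operation_for("swipe", 0.50) == "next_track"
```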
34 Claims
1. A method of controlling operation of a computing device based on three-dimensional gesture detection, the method comprising:
    receiving a touch input, by a touchscreen of the computing device, indicating a request from a user of the computing device to enter a gesture recognition mode;
    entering the gesture recognition mode based on the touch input; and
    responsive to entering the gesture recognition mode:
        transmitting, by a three-dimensional object detection system of the computing device, a signal configured to:
            pass through an intervening article worn by or associated with the user; and
            reflect off human tissue of the user;
        receiving, by the three-dimensional object detection system, a reflection of the signal off a portion of the user;
        determining, by the computing device, a gesture performed by the portion of the user in three-dimensional space based on the reflection of the signal; and
        causing performance of one or more operations, by the computing device, based on the determined gesture.

Dependent claims: 2, 3, 4, 5.

6. A method of controlling operation of a computing device based on three-dimensional gesture detection, the method comprising:
    transmitting, by a three-dimensional object detection system of the computing device, one or more signals configured to:
        pass through an intervening article worn by or associated with a user of the computing device; and
        reflect off human tissue of the user;
    receiving, by the three-dimensional object detection system, a first reflection of a first signal of the signals off a first portion of the user;
    determining, by the computing device, a first instance of a first gesture performed by the first portion of the user in three-dimensional space based on the first reflection of the first signal;
    receiving, by the three-dimensional object detection system, a second reflection of the first signal or a second signal of the signals off a second portion of the user;
    determining, by the computing device, a second gesture performed by the second portion of the user in three-dimensional space based on the second reflection of the first signal or the second signal;
    recognizing, by the computing device, the second gesture as an intended gesture based on the determination of the first instance of the first gesture; and
    causing performance of one or more operations, by the computing device, based on the second gesture.

Dependent claims: 7, 8, 9, 10.

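The two-part recognition in claim 6, where a gesture by one portion of the user (for example, one hand) gates recognition of a gesture by another portion, can be sketched as below. The claim does not specify which body parts or gestures are involved; the portion labels and the "flat_palm" context gesture are assumptions for illustration.

```python
class TwoPartGestureRecognizer:
    """Hypothetical sketch of claim 6: a first portion of the user
    performs a context-setting first gesture; a gesture by a second
    portion is recognized as intended only after that first gesture
    has been determined."""

    def __init__(self, context_gesture="flat_palm"):
        self.context_gesture = context_gesture
        self.context_set = False

    def observe(self, portion, gesture):
        if portion == "first" and gesture == self.context_gesture:
            self.context_set = True
            return None
        if portion == "second" and self.context_set:
            return gesture  # recognized as intended; would trigger operations
        return None  # ignored: no contextual first gesture yet

rec = TwoPartGestureRecognizer()
assert rec.observe("second", "twist") is None      # no context yet
rec.observe("first", "flat_palm")                  # context-setting gesture
assert rec.observe("second", "twist") == "twist"   # now recognized
```

This matches the abstract's first example, in which one hand sets a context for a gesture defined by the other hand.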
11. A computing device comprising:
    a three-dimensional object detection system configured to:
        transmit signals configured to:
            pass through intervening articles worn by or associated with users of the computing device; and
            reflect off human tissue of the users;
        receive reflections of the signals reflected off portions of the users; and
        determine spatial information about the portions of the users based on the received reflections of the signals; and
    a gesture module implemented at least partially in hardware, the gesture module configured to:
        receive, from the three-dimensional object detection system, first spatial information about a first portion of the user;
        determine a first instance of a first gesture performed by the first portion of the user based on the first spatial information;
        receive, from the three-dimensional object detection system, second spatial information about a second portion of the user;
        determine a second gesture performed by the second portion of the user based on the second spatial information;
        recognize the second gesture as an intended gesture based on the determination of the first instance of the first gesture; and
        cause performance of one or more operations of the computing device based on the second gesture.

Dependent claims: 12, 13, 14, 15, 16, 17.

18. A method of controlling operation of a computing device based on three-dimensional gesture detection, the method comprising:
    displaying, by the computing device, a user interface;
    determining, by a three-dimensional object detection system of the computing device, that a portion of a user has entered a predefined threshold distance with the computing device;
    determining, by the computing device, a location within a user interface corresponding to a location of the portion of the user relative the user interface when the predefined threshold distance was met;
    displaying, by the computing device, an indication of the location on the user interface of the computing device;
    receiving, from the three-dimensional object detection system, spatial information about the portion of the user while in the predefined threshold distance; and
    changing the indication based on the spatial information; or
    performing an operation on an object of the user interface proximal the indication based on the spatial information.

Dependent claims: 19, 20.

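The threshold-and-indication behavior of claim 18 can be sketched as a mapping from a detected hand position to a screen location: no indication is shown until the hand is within the threshold distance, after which the indication follows the hand. The claim fixes no units, threshold value, or coordinate mapping; the 0.10 m threshold, normalized coordinates, and screen size below are all assumptions.

```python
def ui_indication(hand_pos, threshold_m=0.10, screen=(1920, 1080)):
    """Hypothetical sketch of claim 18: once the hand is within a
    predefined threshold distance, its normalized (x, y) position maps
    to a screen location where an indication (e.g. a cursor) would be
    displayed; motion within the threshold moves the indication."""
    x, y, z = hand_pos  # x, y in [0, 1]; z = distance from device in meters
    if z > threshold_m:
        return None     # outside the threshold: no indication displayed
    w, h = screen
    return (round(x * w), round(y * h))

assert ui_indication((0.5, 0.5, 0.30)) is None         # too far away
assert ui_indication((0.5, 0.5, 0.05)) == (960, 540)   # indication at center
assert ui_indication((0.25, 0.5, 0.05)) == (480, 540)  # indication follows hand
```

The claim's final alternative (changing the indication, or performing an operation on a proximal object) would sit on top of this mapping, dispatching on the subsequent spatial information.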
21. A method of controlling operation of a computing device based on three-dimensional gesture detection, the method comprising:
    transmitting, by a three-dimensional object detection system of the computing device, a first signal configured to reflect off of human tissue;
    receiving, by the three-dimensional object detection system, a first reflection of the first signal off of human tissue of a user;
    determining, by the computing device, a first gesture performed by the user in three-dimensional space based on the first reflection of the first signal, the first gesture corresponding to a request from the user to enter a gesture recognition mode;
    entering the gesture recognition mode based on the determined first gesture; and
    responsive to entering the gesture recognition mode:
        transmitting, by the three-dimensional object detection system, a second signal;
        receiving, by the three-dimensional object detection system, a second reflection of the second signal off of the user;
        determining, by the computing device, a second gesture performed by the user in three-dimensional space based on the second reflection of the second signal; and
        causing performance of one or more operations, by the computing device, based on the determined second gesture.

Dependent claims: 22, 23.

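Claim 21 differs from claim 1 in that the recognition mode is entered by a gesture rather than by touch. As a sketch under stated assumptions (the "pinch" wake gesture and the `perform:` result format are hypothetical), the mode gate looks like:

```python
class WakeGestureController:
    """Hypothetical sketch of claim 21: a dedicated first gesture,
    itself detected from signal reflections, enters the gesture
    recognition mode; subsequent gestures then drive operations."""

    def __init__(self, wake_gesture="pinch"):
        self.wake_gesture = wake_gesture
        self.active = False

    def on_gesture(self, gesture):
        if not self.active:
            # Before the wake gesture, no other gesture is acted upon.
            if gesture == self.wake_gesture:
                self.active = True
            return None
        return f"perform:{gesture}"

ctrl = WakeGestureController()
assert ctrl.on_gesture("swipe") is None  # mode not yet entered
assert ctrl.on_gesture("pinch") is None  # wake gesture enters the mode
assert ctrl.on_gesture("swipe") == "perform:swipe"
```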
24. A computing device comprising:
    a touchscreen configured to receive a touch input indicative of a request of a user to enter a gesture recognition mode;
    a three-dimensional object detection system configured to:
        transmit one or more signals over time, the signals capable of:
            passing through an intervening article worn by or associated with the user; and
            reflecting off human tissue of the user;
        receive one or more reflections of the signals reflected off a portion of the user over time; and
        determine, based on the received reflections of the signals, spatial information about the portion of the user over time;
    at least one processor; and
    at least one computer-readable memory device comprising instructions that, when executed by the processor, cause the processor to implement a gesture module configured to:
        enter, based on the touch input, a gesture recognition mode; and
        responsive to entering the gesture recognition mode:
            receive, from the three-dimensional object detection system, a first portion of the spatial information;
            determine, based on the first portion of the spatial information, a gesture performed by the portion of the user in three-dimensional space; and
            send an indication of the gesture to cause performance of one or more operations.

Dependent claims: 25, 26, 27, 28.

29. A computing device comprising:
    a touchscreen configured to display a user interface;
    a three-dimensional object detection system configured to:
        transmit one or more signals over time, the signals capable of:
            passing through an intervening article worn by or associated with the user; and
            reflecting off human tissue of the user;
        receive one or more reflections of the signals reflected off a portion of the user over time; and
        determine, based on the received reflections of the signals, spatial information about the portion of the user over time;
    at least one processor; and
    at least one computer-readable memory device comprising instructions that, when executed by the processor, cause the processor to implement a gesture module configured to:
        receive a first portion of the spatial information;
        determine, based on the first portion of the spatial information, that the portion of the user has entered a predefined threshold distance with the computing device;
        determine, based on the first portion of the spatial information, a location within the user interface corresponding to a location of the portion of the user relative the user interface when the predefined threshold distance was met;
        cause the touchscreen to display an indication of the location on the user interface; and
        cause, based on a second portion of the spatial information corresponding to the portion of the user while in the predefined threshold distance, the indication to change; or
        cause, based on the second portion of the spatial information, a performance of an operation on an object of the user interface proximal the indication.

Dependent claims: 30, 31.

32. A computing device comprising:
    a three-dimensional object detection system configured to:
        transmit one or more signals over time, the signals capable of:
            passing through an intervening article worn by or associated with the user; and
            reflecting off human tissue of the user;
        receive one or more reflections of the signals reflected off the user over time; and
        determine, based on the received reflections of the signals, spatial information about the user over time;
    at least one processor; and
    at least one computer-readable memory device comprising instructions that, when executed by the processor, cause the processor to implement a gesture module configured to:
        receive the spatial information;
        determine, based on a first portion of the spatial information, a first gesture performed by the user in three-dimensional space, the first gesture corresponding to a request from the user to enter a gesture recognition mode;
        enter, based on determining the first gesture, the gesture recognition mode; and
        responsive to entering the gesture recognition mode:
            determine, based on a second portion of the spatial information, a second gesture performed by the user in three-dimensional space; and
            cause, based on the determination of the second gesture, performance of one or more operations.

Dependent claims: 33, 34.

Specification