Human computer interaction using wearable device
Abstract
Embodiments of apparatus and methods for human-computer interaction are described. An apparatus for human-computer interaction may have one or more processors, multiple sensors to measure motion of a body part of a user, a communication module to communicate with a remote computing device, and an interpretation module to interpret the motion of the body part of the user to be associated with an indication of a user input to the remote computing device. The components may be encased in a body configured to be worn by the user. Other embodiments may be described and/or claimed.
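The component arrangement described in the abstract (sensor module, interpretation module backed by a trained gesture estimation model, and communication module, all encased in a wearable body) can be sketched in code. This is an illustrative mock-up only, assuming a callable stands in for the trained model; every class, method, and field name here is hypothetical, not from the patent.

```python
class SensorModule:
    """Measures motion of a body part of the user (e.g., IMU samples)."""
    def measure(self):
        # Stub: a real wearable would return live accelerometer readings.
        return {"ax": 0.1, "ay": -0.2, "az": 9.8}

class InterpretationModule:
    """Interprets measured motion into an indication of a user input."""
    def __init__(self, gesture_model):
        self.gesture_model = gesture_model  # trained gesture estimation model

    def interpret(self, motion, available_actions):
        # Translate the motion into a gesture, then associate it with one
        # of the actions the external application currently offers.
        gesture = self.gesture_model(motion)
        if gesture in available_actions:
            return {"user_input": gesture}
        return None  # motion did not match any available action

class CommunicationModule:
    """Sends user-input indications to the external computing device."""
    def __init__(self):
        self.sent = []

    def send(self, indication):
        self.sent.append(indication)
        # Stub: the external device acknowledges the input with a
        # feedback signal.
        return "accepted"
```

A usage pass might wire the three modules together: measure a motion, interpret it against the currently available actions, send the resulting indication, and surface the feedback signal to the user.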
20 Claims
1. A wearable apparatus for human-computer interaction, comprising:
one or more processors;
a sensor module coupled to the one or more processors to measure a motion of a body part of a user, wherein the user is interacting with an application of a computing device external to the apparatus;
an interpretation module, coupled with the one or more processors and the sensor module, and configured to:
interpret the motion of the body part of the user;
receive an indication of available actions based on a current state of the computing device or the application, wherein the indication of the available actions is received when the wearable apparatus enters a new state based on the current state of the computing device or the application prompting a new user interaction screen;
output the available actions for display on a display of the wearable apparatus;
associate the interpreted motion with an available action of the available actions; and
translate, using a trained gesture estimation model, the motion of the body part into an indication of a user input for the application based on the associated available action;
a communication module, coupled to the interpretation module, configured to:
send the indication of the user input to the computing device for the application, and receive a feedback signal from the computing device notifying the user that the user input has been successfully accepted by the computing device; and
a body encasing the one or more processors, the sensor module, the communication module, and the interpretation module, wherein the body is configured for the apparatus to be worn by the user.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
11. A method for human-computer interaction, comprising:
measuring, by a wearable computing device, motion of a body part of a user, wherein the user is interacting with an application of a computing device external to the wearable computing device;
receiving, at the wearable computing device, an indication of available actions based on a current state of the computing device or the application, wherein the indication of the available actions is received when the wearable computing device enters a new state based on the current state of the computing device or the application prompting a new user interaction screen;
displaying, by the wearable computing device, the available actions;
associating, by the wearable computing device, the measured motion with an available action of the available actions;
interpreting, by the wearable computing device, using a trained gesture estimation model, the motion of the body part of the user into an indication of a user input for the application based on the associated available action;
sending, by the wearable computing device, the indication of the user input to the computing device; and
receiving, by the wearable computing device, a feedback signal from the computing device notifying the user that the user input has been successfully accepted by the computing device.
View Dependent Claims (12, 13, 14, 15)
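The sequence of steps recited in claim 11 (measure, receive available actions, display, associate, interpret, send, receive feedback) can be illustrated as a single end-to-end flow. This is a minimal sketch under stated assumptions: a plain callable stands in for the trained gesture estimation model, a stub class represents the external computing device, and every name is hypothetical.

```python
def interact(wearable_sensor, gesture_model, computing_device):
    # 1. Measure motion of a body part of the user.
    motion = wearable_sensor()
    # 2. Receive the actions available in the device's or application's
    #    current state.
    available_actions = computing_device.available_actions()
    # 3. Display the available actions (a string stands in for a screen).
    display = ", ".join(available_actions)
    # 4-5. Associate the measured motion with an available action by
    #    interpreting it through the trained gesture estimation model.
    action = gesture_model(motion)
    if action not in available_actions:
        return display, None  # motion matched no currently available action
    # 6. Send the indication of the user input to the computing device.
    feedback = computing_device.send({"user_input": action})
    # 7. Receive the feedback signal confirming the input was accepted.
    return display, feedback

class DeviceStub:
    """Hypothetical stand-in for the external computing device."""
    def available_actions(self):
        return ["select", "back"]

    def send(self, indication):
        return "accepted"
```

Note that steps 2 and 4-5 are state-dependent: the same motion may map to different user inputs depending on which actions the external application exposes at that moment, which is why the wearable receives the action list before interpreting the gesture.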
16. At least one non-transitory computer-readable storage medium, comprising a plurality of instructions, which when executed by at least one processor of a wearable computing device, cause the at least one processor to:
measure motion of a body part of a user, wherein the user is interacting with an application of a computing device external to the wearable computing device;
receive an indication of available actions based on a current state of the computing device or the application, wherein the indication of the available actions is received when the wearable computing device enters a new state based on the current state of the computing device or the application prompting a new user interaction screen;
output the available actions for display on a display of the wearable computing device;
associate the measured motion with an available action of the available actions;
interpret, using a trained gesture estimation model, the motion of the body part of the user into an indication of a user input for the application based on the associated available action;
send the indication of the user input to the computing device; and
receive, from the computing device, a feedback signal notifying the user that the user input has been successfully accepted by the computing device.
View Dependent Claims (17, 18, 19, 20)
Specification