Gesture fingerprinting
First Claim
1. A computer-implemented method comprising:
receiving, using one or more computing devices, input from a user entered via an input device;
determining, using the one or more computing devices, a gesture and one or more attributes associated with the gesture based on the input;
matching, using the one or more computing devices, the gesture to a gesture model for the user using the one or more attributes, the gesture model being initially patterned after how a segment of users input the gesture to perform an action;
receiving, using the one or more computing devices, one or more subsequent inputs from the user entered via the input device to perform the action;
determining, by the one or more computing devices, based on the one or more subsequent inputs, whether the gesture model is appropriate for the user; and
responsive to determining the gesture model is not appropriate for the user, automatically adjusting, using the one or more computing devices, the gesture model by self-adapting the gesture model to improve overall experience for the user in a future iteration.
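The claim above describes an adaptive gesture model: seeded from how a segment of users performs the gesture, checked against the user's subsequent inputs, and re-centered when it stops fitting. The patent does not specify which attributes are tracked or how the model self-adapts; the sketch below is a minimal illustration only, with hypothetical attributes (velocity, path length), tolerances, and an averaging-based adaptation rule invented for the example.

```python
from dataclasses import dataclass, field
from statistics import fmean

@dataclass
class GestureModel:
    """Expected attribute values for a gesture, initially seeded from a
    segment of users (the seed numbers here are hypothetical)."""
    name: str
    expected: dict[str, float]          # e.g. {"velocity": 1.2, "length": 300.0}
    tolerance: float = 0.25             # relative mismatch allowed per attribute
    history: list[dict] = field(default_factory=list)

    def matches(self, attrs: dict[str, float]) -> bool:
        # A gesture matches if every attribute is within tolerance of the model.
        return all(
            abs(attrs[k] - v) <= self.tolerance * abs(v)
            for k, v in self.expected.items()
        )

    def observe(self, attrs: dict[str, float]) -> None:
        # Record a subsequent input performing the same action.
        self.history.append(attrs)

    def is_appropriate(self, min_samples: int = 5, min_hit_rate: float = 0.6) -> bool:
        # The model is "appropriate" if most recent inputs still match it.
        recent = self.history[-min_samples:]
        if len(recent) < min_samples:
            return True
        hits = sum(self.matches(a) for a in recent)
        return hits / len(recent) >= min_hit_rate

    def self_adapt(self) -> None:
        # Re-center each expected attribute on the user's recent behaviour,
        # so future iterations match this user rather than the seed segment.
        recent = self.history[-5:]
        for k in self.expected:
            self.expected[k] = fmean(a[k] for a in recent)

# Usage: a user who consistently swipes faster and farther than the seed model.
model = GestureModel("swipe", {"velocity": 1.2, "length": 300.0})
for attrs in [{"velocity": 2.0, "length": 450.0}] * 5:
    model.observe(attrs)
if not model.is_appropriate():
    model.self_adapt()
```

After adaptation the model is centered on the user's own inputs, so the same gesture now matches where it previously would not.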
Abstract
Various implementations related to gesture fingerprinting are described. In one such implementation, a computer-implemented method includes receiving input from a user entered via an input device; determining a gesture and one or more attributes associated with the gesture based on the input; matching the gesture to a gesture model for the user using the one or more attributes; and optimizing the gesture model based on subsequent input received from the user.
17 Claims
1. A computer-implemented method comprising:
receiving, using one or more computing devices, input from a user entered via an input device;
determining, using the one or more computing devices, a gesture and one or more attributes associated with the gesture based on the input;
matching, using the one or more computing devices, the gesture to a gesture model for the user using the one or more attributes, the gesture model being initially patterned after how a segment of users input the gesture to perform an action;
receiving, using the one or more computing devices, one or more subsequent inputs from the user entered via the input device to perform the action;
determining, by the one or more computing devices, based on the one or more subsequent inputs, whether the gesture model is appropriate for the user; and
responsive to determining the gesture model is not appropriate for the user, automatically adjusting, using the one or more computing devices, the gesture model by self-adapting the gesture model to improve overall experience for the user in a future iteration.
View Dependent Claims: 2, 3, 4, 5, 6
7. A system comprising:
one or more processors; and
one or more memories storing instructions that, when executed by the one or more processors, cause the system to perform operations including:
receiving input from a user entered via an input device;
determining a gesture and one or more attributes associated with the gesture based on the input;
matching the gesture to a gesture model for the user using the one or more attributes, the gesture model being initially patterned after how a segment of users input the gesture to perform an action;
receiving one or more subsequent inputs from the user entered via the input device to perform the action;
determining, based on the one or more subsequent inputs, whether the gesture model is appropriate for the user; and
responsive to determining the gesture model is not appropriate for the user, automatically adjusting the gesture model by self-adapting the gesture model to improve overall experience for the user in a future iteration.
View Dependent Claims: 8, 9, 10, 11, 12
13. A system comprising:
one or more processors;
an interpretation module executable by the one or more processors to interpret a gesture and one or more attributes associated with the gesture based on input received from a user on a computing device via an input device;
an application module coupled to the interpretation module or a data store to receive a gesture model and executable by the one or more processors to match the gesture to a gesture model for the user using the one or more attributes, the gesture model being initially patterned after how a segment of users input the gesture to perform an action; and
a learning module executable by the one or more processors to:
receive one or more subsequent inputs from the user entered via the input device to perform the action;
determine, based on the one or more subsequent inputs, whether the gesture model is appropriate for the user; and
responsive to determining the gesture model is not appropriate for the user, automatically adjust the gesture model by self-adapting the gesture model to improve overall experience for the user in a future iteration.
View Dependent Claims: 14, 15, 16, 17
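Claim 13 splits the same method across three cooperating modules: an interpretation module derives the gesture and its attributes, an application module matches them against a model held in a data store, and a learning module adjusts the stored model when it no longer fits the user. A hypothetical Python decomposition is sketched below; the module names follow the claim, but the gesture attribute, thresholds, and data-store layout are invented for illustration.

```python
class InterpretationModule:
    def interpret(self, raw_points):
        """Derive a gesture label and attributes from raw (x, y) input events."""
        dx = raw_points[-1][0] - raw_points[0][0]
        dy = raw_points[-1][1] - raw_points[0][1]
        gesture = "swipe-right" if abs(dx) > abs(dy) and dx > 0 else "other"
        return gesture, {"distance": (dx * dx + dy * dy) ** 0.5}

class ApplicationModule:
    def __init__(self, data_store):
        self.data_store = data_store      # maps gesture name -> model parameters

    def match(self, gesture, attrs):
        # Compare the interpreted attributes against the stored gesture model.
        model = self.data_store[gesture]
        return abs(attrs["distance"] - model["distance"]) <= model["slack"]

class LearningModule:
    def __init__(self, data_store):
        self.data_store = data_store
        self.misses = {}

    def record(self, gesture, matched):
        # Track consecutive mismatches per gesture.
        self.misses[gesture] = 0 if matched else self.misses.get(gesture, 0) + 1

    def adjust_if_needed(self, gesture, attrs, max_misses=3):
        # Model deemed "not appropriate" after repeated misses: re-center it.
        if self.misses.get(gesture, 0) >= max_misses:
            self.data_store[gesture]["distance"] = attrs["distance"]
            self.misses[gesture] = 0

# Usage: seed model from a user segment, then observe one mismatching swipe.
store = {"swipe-right": {"distance": 300.0, "slack": 50.0}}
interp = InterpretationModule()
app = ApplicationModule(store)
learn = LearningModule(store)

gesture, attrs = interp.interpret([(0, 0), (500, 10)])   # a long, fast swipe
matched = app.match(gesture, attrs)                      # falls outside the seed model
learn.record(gesture, matched)
```

The data store is shared by the application and learning modules, matching the claim's arrangement in which the learning module's adjustments feed the model the application module reads on the next iteration.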
Specification