SYSTEM, DEVICE, AND METHOD OF DETECTING USER IDENTITY BASED ON MOTOR-CONTROL LOOP MODEL
Abstract
Device, system, and method of detecting identity of a user based on motor-control loop model. A method includes: during a first session of a user who utilizes a pointing device for interacting with a computerized service, monitoring the pointing device dynamics and gestures of the user; based on the monitored dynamics and gestures, estimating parameters that characterize a sensorimotor control loop model of the user; storing in a database a record indicating that the user is associated with the parameters that characterize the sensorimotor control loop model of the user.
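The abstract describes estimating parameters of a sensorimotor control loop from observed pointing-device dynamics and storing them per user, but the patent does not fix a specific model form. A minimal sketch of the idea, assuming a hypothetical proportional-derivative (PD) loop in one dimension and a plain dictionary standing in for the database:

```python
import numpy as np

def estimate_control_loop_params(xs, target, dt=0.01):
    """Fit an assumed PD sensorimotor loop  x'' = Kp*(target - x) - Kd*x'
    to an observed 1-D pointer trajectory, returning per-user gains
    (Kp, Kd). The PD form is illustrative, not taken from the patent."""
    xs = np.asarray(xs, dtype=float)
    v = np.gradient(xs, dt)            # velocity estimate (central differences)
    a = np.gradient(v, dt)             # acceleration estimate
    # Least-squares fit:  a ≈ Kp*(target - xs) - Kd*v
    A = np.column_stack([target - xs, -v])
    (kp, kd), *_ = np.linalg.lstsq(A, a, rcond=None)
    return kp, kd

# Hypothetical enrollment "database": user id -> control-loop parameters.
profiles = {}

def enroll(user_id, trajectory, target):
    """Store the record associating the user with the estimated parameters."""
    profiles[user_id] = estimate_control_loop_params(trajectory, target)
```

The recovered gains (reaction stiffness, damping) would serve as the stored per-user signature; any control-theoretic model with identifiable parameters could play the same role.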
75 Citations
29 Claims
1. A method comprising:
(a) during a usage session of a user who utilizes a pointing device for interacting with a computerized service, monitoring on-screen movements;
(b) analyzing the on-screen movements, (I) to derive from them estimated dynamics of the pointing device as utilized by said user, and (II) to determine motor capabilities and cognitive capabilities that characterize the utilization of said pointing device by said user;
(c) differentiating between (i) said user and (ii) one or more other users, based on a subsequent analysis of subsequent on-screen movements which correspond to device dynamics that do not match at least one of: (I) the motor capabilities that were identified for said user based on his prior on-screen movements, (II) the cognitive capabilities that were identified for said user based on his prior on-screen movements.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9.
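Step (c) of claim 1 flags a session whenever either the motor or the cognitive profile fails to match. A minimal thresholding sketch of that step; the feature names ("peak_speed" for a motor capability, "correction_rate" for a cognitive one) and the 3-sigma tolerance are hypothetical stand-ins, not disclosed by the claim:

```python
def matches_profile(fresh, profile, tolerance=3.0):
    """Return True only if every fresh feature lies within `tolerance`
    standard deviations of the enrolled mean; a single out-of-range
    feature (motor OR cognitive) suffices to differentiate the users."""
    for name, (mean, std) in profile.items():
        if abs(fresh[name] - mean) > tolerance * std:
            return False
    return True

# Enrolled profile learned from prior on-screen movements: (mean, std).
profile = {"peak_speed": (820.0, 90.0),      # px/s   (motor capability)
           "correction_rate": (0.12, 0.03)}  # per-move (cognitive capability)

same_user = matches_profile({"peak_speed": 790.0, "correction_rate": 0.14}, profile)
imposter  = matches_profile({"peak_speed": 310.0, "correction_rate": 0.40}, profile)
```

Here `same_user` evaluates to True and `imposter` to False, since the second session's speed falls far outside the enrolled range.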
10. A process comprising:
(a) monitoring spatial acceleration of a mobile electronic device, that is utilized by a user for interacting with a computerized service, by an accelerometer of the mobile electronic device;
(b) based on analysis of the spatial acceleration of the mobile electronic device, generating a function that corresponds to a sensorimotor control loop of said user who utilizes spatial gestures to interact with said mobile electronic device;
(c) differentiating between said user, and one or more other users, based on whether or not fresh gestures of user interactions correspond to said sensorimotor control loop of said user.
Dependent claims: 11, 12, 13, 14, 15, 16, 17.
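Claim 10's "function that corresponds to a sensorimotor control loop" could in practice be any compact summary of the accelerometer trace that is stable per user. A sketch under that assumption, reducing a 3-axis trace to three illustrative features (mean magnitude, jerk-based smoothness, dominant frequency) and comparing fresh gestures against it:

```python
import numpy as np

def gesture_signature(accel, fs=100.0):
    """Reduce a 3-axis accelerometer trace (N x 3, m/s^2) to a small
    feature vector standing in for the per-user control-loop function.
    The specific features are illustrative assumptions."""
    mag = np.linalg.norm(accel, axis=1)          # acceleration magnitude
    jerk = np.diff(mag) * fs                     # rate of change (smoothness)
    spectrum = np.abs(np.fft.rfft(mag - mag.mean()))
    freqs = np.fft.rfftfreq(mag.size, d=1.0 / fs)
    return np.array([mag.mean(),
                     np.sqrt((jerk ** 2).mean()),
                     freqs[spectrum.argmax()]])

def same_user(sig_enrolled, sig_fresh, rel_tol=0.25):
    """Step (c): accept only if every feature of the fresh gesture agrees
    with the enrolled signature within a relative tolerance."""
    return bool(np.all(np.abs(sig_enrolled - sig_fresh)
                       <= rel_tol * (np.abs(sig_enrolled) + 1e-9)))
```

A gesture performed with a different rhythm or vigor shifts the jerk and frequency features and is rejected, while a re-performance by the same user stays within tolerance.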
18. A process comprising:
(a) monitoring spatial orientation of a mobile electronic device, that is utilized by a user for interacting with a computerized service, by a gyroscope of the mobile electronic device;
(b) based on analysis of the spatial orientation of the mobile electronic device, generating a function that corresponds to a sensorimotor control loop of said user who utilizes spatial gestures to interact with said mobile electronic device;
(c) differentiating between said user, and one or more other users, based on whether or not fresh gestures of user interactions correspond to said sensorimotor control loop of said user.
Dependent claims: 19, 20, 21, 22, 23, 24, 25.
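Claim 18 is the gyroscope analogue of claim 10: the monitored quantity is orientation rather than acceleration. One way to obtain a control-loop-like function from an orientation trace, assuming (purely for illustration) that the user's tilt correction behaves as a first-order loop with a personal gain and preferred holding angle:

```python
import numpy as np

def orientation_loop_gain(pitch, dt=0.02):
    """Fit an assumed first-order correction loop
        pitch' = -k * (pitch - setpoint)
    to a pitch-angle trace (radians) and recover the per-user pair
    (k, setpoint) by least squares. The first-order form is an
    illustrative assumption, not mandated by the claim."""
    pitch = np.asarray(pitch, dtype=float)
    rate = np.gradient(pitch, dt)
    # rate ≈ -k*pitch + k*setpoint  ->  regress rate on [pitch, 1]
    A = np.column_stack([pitch, np.ones_like(pitch)])
    slope, intercept = np.linalg.lstsq(A, rate, rcond=None)[0]
    k = -slope
    setpoint = intercept / k if k != 0 else 0.0
    return k, setpoint
```

The gain k (how briskly the user re-levels the device) and the setpoint (the angle at which they habitually hold it) would then serve as the stored signature, compared against fresh gestures exactly as in claim 10's step (c).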
26. A method comprising:
(a) during a usage session of a user who utilizes a pointing device for interacting with a computerized service, (a1) monitoring on-screen movements of an on-screen pointer controlled by said user, and (a2) monitoring gestures performed by said user via said pointing device;
(b) based on analysis of: (b1) the on-screen movements of the on-screen pointer controlled by said user, and (b2) the gestures performed by said user via said pointing device, determining a user-specific characteristic that quantifies a level of eye-hand coordination of said user;
(c) differentiating between said user, and one or more other users, based on said user-specific characteristic that quantifies the level of eye-hand coordination of said user.
Dependent claims: 27, 28, 29.
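Claim 26 leaves open how the eye-hand coordination level is quantified. One common proxy, offered here only as an illustrative choice, is path efficiency: how directly the pointer travels from start to target, since poorly coordinated movements wander and overshoot:

```python
import numpy as np

def eye_hand_coordination(points):
    """Quantify one point-to-target movement as path efficiency:
    straight-line distance divided by actual path length
    (1.0 = perfectly direct). This metric is an illustrative stand-in
    for the claim's user-specific characteristic."""
    p = np.asarray(points, dtype=float)
    path_len = np.sum(np.linalg.norm(np.diff(p, axis=0), axis=1))
    direct = np.linalg.norm(p[-1] - p[0])
    return float(direct / path_len) if path_len > 0 else 1.0
```

Averaged over many movements in a session, this score could be compared across sessions for step (c); a detour-heavy trajectory like (0,0) → (1,1) → (2,0) scores about 0.71, while a straight sweep scores 1.0.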
Specification