Recognition of free-form gestures from orientation tracking of a handheld or wearable device
Abstract
A user performs a gesture with a hand-held or wearable device capable of sensing its own orientation. Orientation data, in the form of a sequence of rotation vectors, is collected throughout the duration of the gesture. To construct a trace representing the shape of the gesture and the direction of device motion, the orientation data is processed by a robotic chain model with four or fewer degrees of freedom, simulating a set of joints moved by the user to perform the gesture (e.g., a shoulder and an elbow). To classify the gesture, the trace is compared to contents of a training database including many different users' versions of the gesture and analyzed by a learning module such as a support vector machine.
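The orientation-to-trace step described in the abstract can be sketched as follows. This is a minimal illustration, not the patent's actual kinematic model: the link lengths, the shoulder fixed at the origin, and the assumption that the device's rotation vector orients both links of the shoulder/elbow chain are all hypothetical simplifications.

```python
import numpy as np

def rotvec_to_matrix(rv):
    """Rodrigues' formula: axis-angle rotation vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(rv)
    if theta < 1e-12:
        return np.eye(3)
    k = rv / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def trace_from_rotvecs(rotvecs, upper_arm=0.30, forearm=0.25):
    """Map each device rotation vector to a hand position via a two-link
    shoulder/elbow chain, then connect the positions into a trace.
    Hypothetical simplification: the shoulder stays at the origin and the
    device orientation orients both links."""
    points = []
    for rv in rotvecs:
        R = rotvec_to_matrix(np.asarray(rv, dtype=float))
        elbow = R @ np.array([upper_arm, 0.0, 0.0])       # shoulder -> elbow
        hand = elbow + R @ np.array([forearm, 0.0, 0.0])  # elbow -> hand/device
        points.append(hand)
    return np.vstack(points)  # ordered positions approximating the gesture shape
```

Sweeping the rotation vector from zero to a quarter turn about the vertical axis, for example, traces the hand along an arc from the arm's rest direction to 90 degrees away.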
25 Claims
1. An apparatus, comprising:
hardware logic configurable to cause a computing device to perform actions, the actions comprising:
capturing a sequence of rotation vectors from a device being moved to perform a gesture;
converting the sequence of rotation vectors to a sequence of corresponding device positions using a robotic chain model with at most 4 degrees of freedom, wherein the robotic chain model simulates a shoulder and an elbow of a user holding the device while performing the gesture;
connecting the corresponding device positions to form a trace, wherein the trace approximates a shape of the gesture; and
extracting features of the trace and using the features to classify the gesture;
wherein the gesture is classified by an algorithm comprising a support vector machine comparing the trace to contents of a training database and performing statistical analysis.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
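The feature-extraction and classification steps recited in claim 1 can be sketched as below. Both pieces are stand-ins, not the claimed implementation: the arc-length resampling featurization is one plausible choice the claim does not fix, and a bare-bones linear SVM trained by sub-gradient descent on the hinge loss substitutes for the patent's support-vector-machine module operating over a training database.

```python
import numpy as np

def trace_features(trace, n_points=16):
    """Resample a (k, 3) trace to n_points by linear interpolation over
    arc length, normalize position and scale, and flatten into a feature
    vector. (A hypothetical featurization for illustration.)"""
    seg = np.linalg.norm(np.diff(trace, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, s[-1], n_points)
    resampled = np.column_stack(
        [np.interp(targets, s, trace[:, d]) for d in range(trace.shape[1])])
    centered = resampled - resampled.mean(axis=0)
    scale = np.abs(centered).max()
    if scale == 0.0:
        scale = 1.0
    return (centered / scale).ravel()

def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    """Minimal linear SVM: sub-gradient descent on the regularized hinge
    loss. Labels y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1.0:      # margin violated: hinge gradient
                w = w + lr * (yi * xi - lam * w)
                b = b + lr * yi
            else:                            # only the regularizer acts
                w = w - lr * lam * w
    return w, b

def classify(w, b, trace):
    """Sign of the decision function over the trace's feature vector."""
    return np.sign(trace_features(trace) @ w + b)
```

With a training database of labeled example traces from many users, `train_linear_svm` would be fit on their stacked feature vectors; here even two toy traces (a circle versus a straight segment) become separable after resampling and normalization.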
10. A non-transitory machine-readable storage medium programmed with instructions for components of a machine to perform actions, the actions comprising:
capturing a sequence of rotation vectors from a device being moved to perform a gesture;
converting the sequence of rotation vectors to a sequence of corresponding device positions using a robotic chain model with at most 4 degrees of freedom, wherein the robotic chain model simulates a shoulder and an elbow of a user holding the device while performing the gesture;
connecting the corresponding device positions to form a trace, wherein the trace approximates a shape of the gesture; and
extracting features of the trace and using the features to classify the gesture;
wherein the gesture is classified by an algorithm comprising a support vector machine comparing the trace to contents of a training database and performing statistical analysis.
- View Dependent Claims (11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25)
Specification