Efficient gesture processing
Abstract
Embodiments of the invention describe a system to efficiently execute gesture recognition algorithms: a power-efficient, staged gesture recognition pipeline comprising multimodal interaction detection, context-based optimized recognition, and context-based optimized training with continuous learning. Embodiments of the invention further describe a system to accommodate many types of algorithms depending on the type of gesture needed in any particular situation. Examples of recognition algorithms include, but are not limited to, Hidden Markov Models (HMM) for complex dynamic gestures (e.g., writing a number in the air), Decision Trees (DT) for static poses, peak detection for coarse shake/whack gestures, and inertial methods (INS) for pitch/roll detection.
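The staged pipeline in the abstract can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: every function name, threshold, and the single registered recognizer are invented for the example; a cheap activity check runs first, and a context-selected recognizer runs only when motion is present.

```python
# Hypothetical sketch of the staged recognition pipeline: a low-cost
# interaction-detection stage gates a context-selected recognizer.
# All names and thresholds here are illustrative, not from the patent.

def detect_interaction(samples):
    """Stage 1: coarse check -- is the user moving at all?"""
    return max(abs(s) for s in samples) > 0.5  # arbitrary activity threshold

def peak_detect(samples):
    """Cheap recognizer for coarse shake/whack gestures: count sharp peaks."""
    peaks = sum(1 for a, b, c in zip(samples, samples[1:], samples[2:])
                if b > a and b > c and b > 1.0)
    return "shake" if peaks >= 2 else None

# Context-based recognizer registry; heavier recognizers (HMM for dynamic
# gestures, DT for static poses, INS for pitch/roll) would slot in here.
RECOGNIZERS = {
    "coarse": peak_detect,
}

def recognize(samples, context="coarse"):
    """Stage 2: run only the recognizer the current context calls for."""
    if not detect_interaction(samples):
        return None  # stay in the low-power detection stage
    return RECOGNIZERS[context](samples)
```

The point of the staging is power efficiency: the expensive recognizer never runs while the interaction detector sees no motion.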
31 Citations
23 Claims
1. At least one tangible computer readable storage medium having instructions stored thereon that, when executed on a machine, cause the machine to:
receive data from a motion sensor;
select a subset of one or more gesture recognition algorithms from a plurality of gesture recognition algorithms, wherein selecting the subset comprises:
comparing characteristics of a partial data segment against a gesture-matching database, the partial data segment being a subset of a gesture's complete motion data segment received from the motion sensor,
enabling one or more gesture-recognition algorithms with training gesture motion characteristics in the gesture-matching database that match the data characteristics of the partial data, and
identifying a candidate gesture by using the one or more enabled gesture-recognition algorithms to analyze the gesture's complete data segment and comparing the analysis of each gesture-recognition algorithm to a template-matching database;
determine a gesture based on the best match obtained from comparison of the candidate gestures with the template-matching database; and
trigger an event on the machine, wherein the event corresponds to the determined gesture.
- View Dependent Claims (2, 4, 5, 6, 7, 21)
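The select/enable/identify flow of claim 1 can be sketched as below. This is an illustrative toy, not the patent's actual implementation: the "characteristic" feature, both databases, the bands, and the algorithm names are all invented for the example.

```python
# Illustrative sketch of claim 1's flow: characteristics of a partial
# segment gate which algorithms run on the complete segment, and template
# matching across the enabled algorithms picks the final gesture.
# Databases, thresholds, and algorithm names below are invented.

def characteristics(segment):
    """A toy 'characteristic' of a data segment: its peak magnitude."""
    return max(abs(x) for x in segment)

# Gesture-matching database: training-motion characteristic band -> algorithm.
GESTURE_DB = [
    ((0.0, 1.5), "decision_tree"),    # gentle motion -> static-pose classifier
    ((1.5, 10.0), "peak_detection"),  # energetic motion -> shake/whack detector
]

# Template-matching database: per-algorithm gesture templates (toy scores).
TEMPLATES = {
    "decision_tree":  {"hold": 0.3, "tilt": 0.8},
    "peak_detection": {"shake": 2.0, "whack": 3.5},
}

def select_algorithms(partial):
    """Enable algorithms whose training characteristics match the partial data."""
    c = characteristics(partial)
    return [alg for (lo, hi), alg in GESTURE_DB if lo <= c < hi]

def recognize(partial, full):
    """Analyze the complete segment with each enabled algorithm, then
    return the candidate gesture with the closest template match."""
    candidates = []
    for alg in select_algorithms(partial):
        analysis = characteristics(full)
        for gesture, template in TEMPLATES[alg].items():
            candidates.append((abs(template - analysis), gesture))
    return min(candidates)[1] if candidates else None
```

The key property the claim describes is that only the partial (early) data decides which recognizers are enabled, so expensive recognizers never see data they are unlikely to match.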
3. The at least one tangible computer readable storage medium of claim 2, wherein the machine is to select the subset of gesture recognition algorithm(s) based, at least in part, on a comparison of a total energy magnitude of the data with a total energy magnitude value associated with each of the plurality of gesture algorithms.
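Claim 3's energy-based selection can be sketched as a simple gate: compute the total energy magnitude of the data window and enable only algorithms associated with that energy range. The energy bands and algorithm names below are assumptions invented for the sketch, not values from the patent.

```python
# Minimal sketch of claim 3's idea: gate each algorithm on the total
# energy magnitude of the incoming motion data. Bands are made up.
import math

def total_energy(samples):
    """Total energy magnitude of a window of motion-sensor samples."""
    return sum(x * x for x in samples)

# Each algorithm is associated with the energy band it is trained for.
ENERGY_BANDS = {
    "inertial_pitch_roll": (0.0, 1.0),        # near-static tilt
    "decision_tree":       (0.0, 4.0),        # static poses, mild motion
    "hmm":                 (2.0, 25.0),       # dynamic air-writing
    "peak_detection":      (10.0, math.inf),  # shake/whack
}

def algorithms_for(samples):
    """Enable only the algorithms whose energy band contains the data."""
    e = total_energy(samples)
    return sorted(alg for alg, (lo, hi) in ENERGY_BANDS.items() if lo <= e < hi)
```

A low-energy window enables only the cheap static recognizers, while a high-energy window enables the dynamic ones, which is the power-saving effect the claim is after.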
8. A mobile computing device comprising:
a motion sensor;
a memory;
at least one processor;
an algorithm selection module, stored in the memory and executed via the at least one processor, to select a subset of one or more gesture recognition algorithms from a plurality of gesture recognition algorithms, wherein selecting the subset comprises:
comparing characteristics of a partial data segment against a gesture-matching database, the partial data segment being a subset of a gesture's complete motion data segment received from the motion sensor,
enabling one or more gesture-recognition algorithms with training gesture motion characteristics in the gesture-matching database that match the data characteristics of the partial data, and
identifying a candidate gesture by using the one or more enabled gesture-recognition algorithms to analyze the gesture's complete data segment and comparing the analysis of each gesture-recognition algorithm to a template-matching database; and
a gesture recognition module, stored in the memory and executed via the at least one processor, to determine a gesture based on the best match obtained from comparison of the candidate gestures with the template-matching database;
wherein the determined gesture triggers an event on the processor, the event corresponding to the determined gesture.
- View Dependent Claims (9, 10, 11, 12, 13, 14, 15, 16, 17, 22)
18. A machine-implemented method comprising:
receiving data from a motion sensor;
selecting a subset of one or more gesture recognition algorithms from a plurality of gesture recognition algorithms, wherein selecting the subset comprises:
comparing characteristics of a partial data segment against a gesture-matching database, the partial data segment being a subset of a gesture's complete motion data segment received from the motion sensor,
enabling one or more gesture-recognition algorithms with training gesture motion characteristics in the gesture-matching database that match the data characteristics of the partial data, and
identifying a candidate gesture by using the one or more enabled gesture-recognition algorithms to analyze the gesture's complete data segment and comparing the analysis of each gesture-recognition algorithm to a template-matching database;
determining a gesture based on the best match obtained from comparison of the candidate gestures with the template-matching database; and
triggering an event that corresponds to the determined gesture.
- View Dependent Claims (19, 20, 23)
Specification