TOUCH CLASSIFICATION
First Claim
1. A computer-implemented method comprising:
obtaining frame data representative of a plurality of frames captured by a touch-sensitive device;
analyzing the frame data to define a respective blob in each frame of the plurality of frames, the blobs being indicative of a touch event;
computing a plurality of feature sets for the touch event, each feature set specifying properties of the respective blob in each frame of the plurality of frames; and
determining a type of the touch event via machine learning classification configured to provide multiple non-bimodal classification scores based on the plurality of feature sets for the plurality of frames, each non-bimodal classification score being indicative of an ambiguity level in the machine learning classification.
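The method of claim 1 can be read as a three-stage pipeline: threshold a sensor frame into a blob (a connected component of active cells, per the wording of claim 27), compute a per-frame feature set, then classify with graded ("non-bimodal") scores whose spread indicates ambiguity. The sketch below illustrates one way such a pipeline could look; all names (`find_blob`, `blob_features`, `classify_touch`), the features chosen, and the toy two-class softmax classifier are illustrative assumptions, not taken from the patent.

```python
import math
from collections import deque

def find_blob(frame, threshold=0.5):
    """Return the largest 4-connected component of cells above threshold."""
    rows, cols = len(frame), len(frame[0])
    seen, best = set(), []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] > threshold and (r, c) not in seen:
                # breadth-first flood fill of one connected component
                comp, queue = [], deque([(r, c)])
                seen.add((r, c))
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] > threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                if len(comp) > len(best):
                    best = comp
    return best

def blob_features(frame, blob):
    """One feature set per frame: area, total intensity, centroid."""
    area = len(blob)
    intensity = sum(frame[y][x] for y, x in blob)
    cy = sum(y for y, _ in blob) / area
    cx = sum(x for _, x in blob) / area
    return {"area": area, "intensity": intensity, "centroid": (cy, cx)}

def classify_touch(feature_sets):
    """Toy classifier: graded (non-bimodal) scores for finger vs. palm
    from mean blob area over all frames; scores near 0.5 flag ambiguity."""
    mean_area = sum(f["area"] for f in feature_sets) / len(feature_sets)
    logits = {"finger": 4.0 - mean_area, "palm": mean_area - 4.0}
    z = sum(math.exp(v) for v in logits.values())
    return {k: math.exp(v) / z for k, v in logits.items()}
```

Because the classifier emits a probability per class rather than a hard yes/no, the output is non-bimodal in the claim's sense: a score distribution close to uniform signals high ambiguity, while a peaked one signals a confident classification.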
Abstract
A method for touch classification includes obtaining frame data representative of a plurality of frames captured by a touch-sensitive device, analyzing the frame data to define a respective blob in each frame of the plurality of frames, the blobs being indicative of a touch event, computing a plurality of feature sets for the touch event, each feature set specifying properties of the respective blob in each frame of the plurality of frames, and determining a type of the touch event via machine learning classification configured to provide multiple non-bimodal classification scores based on the plurality of feature sets for the plurality of frames, each non-bimodal classification score being indicative of an ambiguity level in the machine learning classification.
Claims (29)
1. A computer-implemented method comprising:
obtaining frame data representative of a plurality of frames captured by a touch-sensitive device;
analyzing the frame data to define a respective blob in each frame of the plurality of frames, the blobs being indicative of a touch event;
computing a plurality of feature sets for the touch event, each feature set specifying properties of the respective blob in each frame of the plurality of frames; and
determining a type of the touch event via machine learning classification configured to provide multiple non-bimodal classification scores based on the plurality of feature sets for the plurality of frames, each non-bimodal classification score being indicative of an ambiguity level in the machine learning classification.
Dependent claims: 2-19.
20. A touch-sensitive device comprising:
a touch-sensitive surface;
a memory in which blob definition instructions, feature computation instructions, and machine learning classification instructions are stored; and
a processor coupled to the memory, configured to obtain frame data representative of a plurality of frames captured via the touch-sensitive surface and configured to execute the blob definition instructions to analyze the frame data to define a respective blob in each frame of the plurality of frames, the blobs being indicative of a touch event;
wherein the processor is further configured to execute the feature computation instructions to compute a plurality of feature sets for the touch event, each feature set specifying properties of the respective blob in each frame of the plurality of frames; and
wherein the processor is further configured to execute the machine learning classification instructions to determine a type of the touch event via machine learning classification configured to provide multiple non-bimodal classification scores based on the plurality of feature sets for the plurality of frames, each non-bimodal classification score being indicative of an ambiguity level in the machine learning classification.
Dependent claims: 21-26.
27. A touch-sensitive device comprising:
a touch-sensitive surface;
a memory in which a plurality of instruction sets are stored; and
a processor coupled to the memory and configured to execute the plurality of instruction sets, wherein the plurality of instruction sets comprise:
first instructions to cause the processor to obtain frame data representative of a plurality of sensor images captured by the touch-sensitive device;
second instructions to cause the processor to analyze the frame data to define a respective connected component in each sensor image of the plurality of sensor images, the connected components being indicative of a touch event;
third instructions to cause the processor to compute a plurality of feature sets for the touch event, each feature set specifying properties of the respective connected component in each sensor image of the plurality of sensor images;
fourth instructions to cause the processor to determine a type of the touch event via machine learning classification configured to provide multiple non-bimodal classification scores based on the plurality of feature sets for the plurality of frames, each non-bimodal classification score being indicative of an ambiguity level in the machine learning classification; and
fifth instructions to cause the processor to provide an output to the computing system, the output being indicative of the type of the touch event;
wherein the fourth instructions comprise aggregation instructions to cause the processor to aggregate information representative of the touch event over the plurality of sensor images.
Dependent claims: 28, 29.
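Claim 27's "aggregation instructions" call for combining information about the touch event across the plurality of sensor images. One simple realization is to average per-frame class scores and then quantify the "ambiguity level" of the result; the sketch below uses normalized entropy as that measure. Both the averaging rule and the entropy-based ambiguity score are assumptions for illustration (`aggregate_scores` and `ambiguity` are hypothetical names), not the patent's specific mechanism.

```python
import math

def aggregate_scores(per_frame_scores):
    """Average per-frame class scores over the plurality of sensor images."""
    classes = per_frame_scores[0].keys()
    n = len(per_frame_scores)
    return {c: sum(s[c] for s in per_frame_scores) / n for c in classes}

def ambiguity(scores):
    """Normalized entropy in [0, 1]: 0 = fully confident, 1 = maximally ambiguous."""
    h = -sum(p * math.log(p) for p in scores.values() if p > 0)
    return h / math.log(len(scores))
```

For example, per-frame scores of {finger: 0.9, palm: 0.1} and {finger: 0.7, palm: 0.3} aggregate to {finger: 0.8, palm: 0.2}, whose ambiguity is well below that of a uniform 0.5/0.5 split; a downstream consumer could use this graded value to defer or reject borderline touch events.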
Specification