Depth-based feature systems for classification applications
First Claim
1. A computer system configured to recognize a user gesture from depth data, the computer system comprising:
at least one processor;
at least one memory comprising instructions configured to cause the computer system to perform a method comprising:
receiving depth data captured while a user performed a gesture;
applying a template to the depth data to generate a plurality of features; and
identifying the gesture based upon the plurality of features, wherein the template was generated, at least in part, by:
determining a Gaussian function;
sampling the Gaussian function to generate a first distribution; and
generating a second distribution by iteratively sampling the first distribution and selecting points of the first distribution using a metric, wherein selecting points of the first distribution using a metric comprises determining a minimum of a plurality of Sub_Metric values, each Sub_Metric value determined using the formula
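The template-generation steps recited in the claim (determine a Gaussian function, sample it to form a first distribution, then iteratively select points using a minimum over Sub_Metric values) can be sketched as follows. The Sub_Metric formula itself is not reproduced in this listing, so this sketch assumes Euclidean distance as the Sub_Metric and farthest-point selection as the iteration rule; both choices are assumptions for illustration, not the claimed formula.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: determine a Gaussian function and sample it to generate a
# first distribution of candidate 2-D points.
first_distribution = rng.normal(loc=0.0, scale=1.0, size=(500, 2))


def select_template(candidates, n_points):
    """Build a second distribution by iteratively selecting candidates.

    For each candidate we compute a Sub_Metric value (assumed here to
    be Euclidean distance) against every already-selected point, take
    the minimum of those values, and keep the candidate whose minimum
    is largest (farthest-point selection).
    """
    selected = [candidates[0]]
    for _ in range(n_points - 1):
        # Distances from every candidate to every selected point.
        dists = np.linalg.norm(
            candidates[:, None, :] - np.array(selected)[None, :, :], axis=2
        )
        # The metric: minimum of the per-candidate Sub_Metric values.
        min_dists = dists.min(axis=1)
        selected.append(candidates[int(min_dists.argmax())])
    return np.array(selected)


template = select_template(first_distribution, 32)
print(template.shape)  # (32, 2)
```

A spread-out template like this is one plausible reading of why the abstract says such features "may more quickly distinguish portions of the user's anatomy": well-separated sample points avoid redundant probes.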
Abstract
Human Computer Interfaces (HCI) may allow a user to interact with a computer via a variety of mechanisms, such as hand, head, and body gestures. Various of the disclosed embodiments allow information captured from a depth camera on an HCI system to be used to recognize such gestures. Particularly, the HCI system's depth sensor may capture depth frames of the user's movements over time. To discern gestures from these movements, the system may group portions of the user's anatomy represented by the depth data into classes. "Features" which reflect distinguishing features of the user's anatomy may be used to accomplish this classification. Some embodiments provide improved systems and methods for generating and/or selecting these features. Features prepared by various of the disclosed embodiments may be less susceptible to overfitting training data and may more quickly distinguish portions of the user's anatomy.
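As one illustration of how a template might be applied to a depth frame to generate features for the classification the abstract describes: this listing does not spell out the feature computation, so the depth-difference form below (template offsets scaled by the probe pixel's depth, a common choice in depth-based body-part classification) is an assumption, not the patented method.

```python
import numpy as np


def apply_template(depth, pixel, offsets):
    """Compute one feature per template offset at the given pixel.

    Each feature (assumed form) is the depth difference between an
    offset pixel and the probe pixel, with offsets divided by the
    probe depth so the features are roughly depth-invariant.
    """
    y, x = pixel
    d = depth[y, x]
    features = []
    for dy, dx in offsets:
        oy = int(np.clip(y + dy / d, 0, depth.shape[0] - 1))
        ox = int(np.clip(x + dx / d, 0, depth.shape[1] - 1))
        features.append(depth[oy, ox] - d)
    return np.array(features)


# Toy 8x8 depth frame: a nearer square region (e.g. a hand at 1.0 m)
# against a farther background (2.0 m).
depth = np.full((8, 8), 2.0)
depth[2:6, 2:6] = 1.0
template = [(0.0, 2.0), (2.0, 0.0), (-2.0, 0.0)]
feats = apply_template(depth, (4, 4), template)
print(feats)  # [1. 1. 0.]
```

Offsets probing past the object's edge return large depth differences while offsets staying on the object return near-zero ones, which is what lets a classifier separate anatomical regions.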
14 Citations
20 Claims
1. A computer system configured to recognize a user gesture from depth data, the computer system comprising:
at least one processor;
at least one memory comprising instructions configured to cause the computer system to perform a method comprising:
receiving depth data captured while a user performed a gesture;
applying a template to the depth data to generate a plurality of features; and
identifying the gesture based upon the plurality of features, wherein the template was generated, at least in part, by:
determining a Gaussian function;
sampling the Gaussian function to generate a first distribution; and
generating a second distribution by iteratively sampling the first distribution and selecting points of the first distribution using a metric, wherein selecting points of the first distribution using a metric comprises determining a minimum of a plurality of Sub_Metric values, each Sub_Metric value determined using the formula
View Dependent Claims (2, 3, 4, 5, 6, 7)
8. A computer-implemented method to recognize a user gesture from depth data, the method comprising:
receiving depth data captured while a user performed a gesture;
applying a template to the depth data to generate a plurality of features; and
identifying the gesture based upon the plurality of features, wherein the template was generated, at least in part, by:
determining a Gaussian function;
sampling the Gaussian function to generate a first distribution; and
generating a second distribution by iteratively sampling the first distribution and selecting points of the first distribution using a metric, wherein selecting points of the first distribution using a metric comprises determining a minimum of a plurality of Sub_Metric values, each Sub_Metric value determined using the formula
View Dependent Claims (9, 10, 11, 12, 13, 14)
15. A non-transitory computer-readable medium comprising instructions configured to cause a computer system to perform a method to recognize a user gesture from depth data, the method comprising:
receiving depth data captured while a user performed a gesture;
applying a template to the depth data to generate a plurality of features; and
identifying the gesture based upon the plurality of features, wherein the template was generated, at least in part, by:
determining a Gaussian function;
sampling the Gaussian function to generate a first distribution; and
generating a second distribution by iteratively sampling the first distribution and selecting points of the first distribution using a metric, wherein selecting points of the first distribution using a metric comprises determining a minimum of a plurality of Sub_Metric values, each Sub_Metric value determined using the formula
View Dependent Claims (16, 17, 18, 19, 20)
Specification