System for recognizing an open or closed hand
First Claim
1. A method for generating a model of a user's hand including one or more fingers, comprising:
- (a) receiving position data in a computing device representing a position of a user interacting with a sensor associated with the computing device, the position data including at least one of depth and image data representing the user's hand; and
- (b) analyzing the position data to identify whether the hand is in an open or closed state, said step (b) including the steps of:
- (b)(1) analyzing depth data from the position data captured in said step (a) to segment the position data into data of the hand, and
- (b)(2) extracting a set of feature descriptors by applying one or more filters to the image data of the hand identified in said step (b)(1), the one or more filters analyzing image data of the hand as compared to image data outside of a boundary of the hand to discern features of the hand including a shape of the hand.
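Step (b)(1) segments the position data into hand data using depth. A minimal sketch of one way such segmentation can work, assuming depth arrives as a millimetre-valued grid and that a hand-center pixel estimate is available from upstream skeleton tracking; the function and parameter names (`segment_hand`, `hand_center`, `radius_mm`) are illustrative, not claim language:

```python
def segment_hand(depth_map, hand_center, radius_mm=150):
    """Keep only pixels whose depth is near the hand-center depth.

    A hedged sketch of step (b)(1), not the patent's disclosed
    algorithm: `hand_center` is a (row, col) pixel assumed to come
    from an upstream skeleton tracker, and `radius_mm` is an
    illustrative depth tolerance. `depth_map` is a list of lists of
    depth values in millimetres; returns a boolean mask of the same
    shape, True where a pixel is classified as hand data.
    """
    r, c = hand_center
    center_depth = depth_map[r][c]
    return [
        [abs(d - center_depth) <= radius_mm for d in row]
        for row in depth_map
    ]
```

Thresholding around a seed depth is only one common approach; the claim covers depth-based segmentation generally.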
Abstract
A system and method are disclosed relating to a pipeline for generating a computer model of a target user, including a hand model of the user's hands, captured by an image sensor in a NUI system. The computer model represents a best estimate of the position of a user's hand or hands and whether the hand or hands are in an open or closed state. The generated hand model may be used by a gaming or other application to determine such things as user gestures and control actions.
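The abstract describes a pipeline from sensor capture to an application-facing hand model. A schematic sketch of how such stages might chain together; all function and key names here are illustrative placeholders, not components named by the patent:

```python
def hand_model_pipeline(frame, track_skeleton, segment_hand, extract_descriptors):
    """Chain the pipeline stages the abstract describes.

    `frame` is raw depth/image data from the sensor; the three stage
    functions are caller-supplied placeholders standing in for the
    skeleton tracker, the hand segmenter, and the descriptor
    extractor. Returns a dict the consuming application can query
    for gesture and control decisions.
    """
    skeleton = track_skeleton(frame)          # locate the body/skeleton
    hand_region = segment_hand(frame, skeleton)  # isolate hand data
    descriptors = extract_descriptors(hand_region)  # open/closed features
    return {
        "hand_region": hand_region,
        "is_open": descriptors["is_open"],
    }
```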
17 Claims
9. A system for determining whether a detected hand is open or closed, the system including a sensing mechanism operatively coupled to a computing device, the system comprising:
- a skeletal recognition engine for recognizing at least a portion of a skeleton of a user from received data including at least one of image and depth data;
- an image segmentation engine for analyzing depth data received from the skeletal recognition engine to segment one or more regions of the body into a region representing a hand of the user; and
- a descriptor extraction engine for extracting data representative of a hand including one or more fingers and whether the hand is open or closed, the descriptor extraction engine applying a plurality of filters for analyzing pixels in the region representing the hand, each filter in the plurality of filters determining a position and open or closed state of the hand, the descriptor extraction engine combining the results of each filter to arrive at a best estimate of whether the hand is opened or closed.
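Claim 9's descriptor extraction engine combines the results of each filter into a best estimate. A hedged sketch of one plausible combination rule, assuming each filter emits an `(is_open, confidence)` vote and the engine takes a confidence-weighted majority; both the vote format and the weighting are assumptions, not the patent's disclosed method:

```python
def combine_filter_results(votes):
    """Fuse per-filter open/closed votes into a single best estimate.

    Each vote is an (is_open, confidence) pair; the side with the
    larger total confidence wins (ties favour open). This weighting
    scheme is an illustrative assumption, not the combination rule
    disclosed in the patent.
    """
    open_score = sum(conf for is_open, conf in votes if is_open)
    closed_score = sum(conf for is_open, conf in votes if not is_open)
    return open_score >= closed_score
```

Weighted voting lets a single high-confidence filter outvote several weak ones, which is one common way to merge heterogeneous per-filter classifiers.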
Specification