Real Time Hand Tracking, Pose Classification and Interface Control
First Claim
1. A method of controlling an electronics device via hand gestures, comprising:
detecting, via an image processing module of the electronics device, a bare-hand position via a camera input based upon a detected sequence of bare-hand positions;
determining whether the bare-hand position has been detected for a threshold duration of time;
identifying the detected bare-hand position from a vocabulary of hand gestures, where the identified bare-hand position comprises a hand gesture associated with powering on the electronics device; and
controlling, in response to determining that the bare-hand position has been detected for the threshold duration of time, the electronics device in response to the identified bare-hand position by powering on the electronics device.
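The threshold-duration logic recited above can be sketched as a small hold-timer: a detected bare-hand position must persist for a threshold duration before the device is powered on. The gesture name, class name, and two-second threshold below are illustrative assumptions, not part of the claim.

```python
# Hypothetical sketch of claim 1's threshold-duration check. The detector
# and gesture identifier are assumed to run upstream, once per frame.

POWER_ON_GESTURE = "open_palm"   # assumed vocabulary entry
THRESHOLD_SECONDS = 2.0          # assumed threshold duration

class GestureHold:
    """Tracks how long the same identified gesture has been held."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.current = None
        self.since = None

    def update(self, gesture, now):
        """Feed one identified gesture per frame; return True once that
        gesture has been held for at least the threshold duration."""
        if gesture != self.current:
            self.current = gesture
            self.since = now
            return False
        return (now - self.since) >= self.threshold

hold = GestureHold(THRESHOLD_SECONDS)
# Simulated frame timestamps one second apart, all showing the same gesture:
fired = [hold.update(POWER_ON_GESTURE, t) for t in (0.0, 1.0, 2.0, 3.0)]
# The hold condition is met from the two-second mark onward.
```

Resetting the timer whenever the identified gesture changes is what distinguishes "held for a threshold duration" from merely "detected once".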
Abstract
A hand gesture from a camera input is detected using an image processing module of a consumer electronics device. The detected hand gesture is identified from a vocabulary of hand gestures. The electronics device is controlled in response to the identified hand gesture. This abstract is not to be considered limiting, since other embodiments may deviate from the features described in this abstract.
77 Citations
56 Claims
1. A method of controlling an electronics device via hand gestures, comprising:
    detecting, via an image processing module of the electronics device, a bare-hand position via a camera input based upon a detected sequence of bare-hand positions;
    determining whether the bare-hand position has been detected for a threshold duration of time;
    identifying the detected bare-hand position from a vocabulary of hand gestures, where the identified bare-hand position comprises a hand gesture associated with powering on the electronics device; and
    controlling, in response to determining that the bare-hand position has been detected for the threshold duration of time, the electronics device in response to the identified bare-hand position by powering on the electronics device.
    (Dependent claims: 2.)
3. A method of controlling an electronics device via hand gestures, comprising:
    detecting, via an image processing module of the electronics device, a hand gesture via a camera input;
    identifying the detected hand gesture from a vocabulary of hand gestures; and
    controlling the electronics device in response to the identified hand gesture.
    (Dependent claims: 4-12.)
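Claim 3's detect, identify, and control steps reduce to a lookup against the gesture vocabulary. The sketch below assumes the vocabulary is a table from gesture names to device actions; the detector is stubbed out and all names are illustrative.

```python
# Minimal sketch of a detect -> identify -> control loop over a gesture
# vocabulary. Gesture and action names are assumptions for illustration.

VOCABULARY = {
    "open_palm": "power_on",
    "closed_fist": "power_off",
    "swipe_left": "previous_channel",
    "swipe_right": "next_channel",
}

def identify(detected_gesture):
    """Map a detected gesture onto the vocabulary; unknown gestures yield None."""
    return VOCABULARY.get(detected_gesture)

def control(action, issued):
    """Issue the action to the device (recorded in a list for this sketch)."""
    if action is not None:
        issued.append(action)

issued = []
for frame_gesture in ["swipe_right", "wave", "closed_fist"]:
    control(identify(frame_gesture), issued)
# "wave" is not in the vocabulary, so only two actions are issued.
```

Keeping identification separate from control mirrors the claim's structure: the vocabulary defines what counts as a gesture, and control consumes only identified gestures.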
13. A method of hand position detection, comprising:
    extracting, via an image processing module of an electronics device, a feature set associated with hand gesture detection and hand pose inference from a plurality of input images by:
        tracking a region of interest (ROI) between subsequent video frames of the plurality of input images as a flock of features;
        triggering scale-invariant feature transform (SIFT) feature extraction;
        calculating an optical flow path of the flock of features;
        measuring brightness gradients in multiple directions across the plurality of input images;
        generating image pyramids from the measured brightness gradients;
        extracting pixel intensity/displacement features and the SIFT features using the generated image pyramids; and
        applying a cascade filter in association with extracting the pixel intensity/displacement features and the SIFT features from the generated image pyramids;
    inferring a hand pose type using a trained multiclass support vector machine (SVM) by:
        detecting at least one feature within a training image and the plurality of input images; and
        performing a one-to-one mapping of instances of the at least one feature within the plurality of input images with at least one label drawn from a finite set of elements, where the at least one label comprises at least one label generated during a training phase based upon a motion capture three-dimensional (3D) data set; and
    approximating the hand pose using inverse kinematics (IK) optimization by:
        partitioning the plurality of input images into a plurality of processing regions;
        determining a centroid of features within each of the plurality of processing regions;
        mapping a location of each feature centroid onto three-dimensional (3D) pose data associated with a motion capture data set;
        comparing variances from each feature centroid to a closest match within the 3D pose data;
        determining which of a plurality of joint constraints affect the IK optimization;
        mapping each feature centroid to a closest joint stored within the 3D pose data;
        minimizing a distance of each mapped closest joint within the training image based upon the 3D pose data; and
        determining a final hand position based upon the minimized distance of each mapped closest joint within the training image.
    (Dependent claims: 14.)
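Two of the extraction sub-steps in claim 13, measuring brightness gradients in multiple directions and generating image pyramids, can be illustrated with NumPy alone. For brevity this sketch builds the pyramid from the raw frame rather than from the gradient images, and a real system would typically use a pyramidal Lucas-Kanade tracker (e.g. OpenCV's `cv2.calcOpticalFlowPyrLK`) for the optical-flow step; everything below is a stand-in for the idea only.

```python
import numpy as np

def brightness_gradients(img):
    """Finite-difference brightness gradients along x (columns) and y (rows)."""
    gy, gx = np.gradient(img.astype(float))  # np.gradient returns axis-0 then axis-1
    return gx, gy

def build_pyramid(img, levels):
    """Halve resolution at each pyramid level by 2x2 block averaging."""
    pyramid = [img.astype(float)]
    for _ in range(levels - 1):
        h, w = pyramid[-1].shape
        cropped = pyramid[-1][: h // 2 * 2, : w // 2 * 2]  # even dimensions
        pyramid.append(cropped.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
    return pyramid

frame = np.arange(64, dtype=float).reshape(8, 8)  # stand-in for a video frame
gx, gy = brightness_gradients(frame)
pyramid = build_pyramid(frame, levels=3)
# Three levels: 8x8, 4x4, and 2x2.
```

Coarse-to-fine pyramids are what let pixel intensity/displacement features handle large inter-frame motion: displacement is estimated at the coarse level and refined at each finer one.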
15. A method of hand position detection, comprising:
    extracting, via an image processing module of an electronics device, a feature set associated with hand gesture detection and hand pose inference from at least one input image;
    inferring a hand pose type using a trained multiclass support vector machine (SVM); and
    approximating the hand pose using inverse kinematics (IK) optimization.
    (Dependent claims: 16-30.)
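The pose-type inference step, a multiclass SVM over a finite label set, can be sketched with scikit-learn. The 2-D features and three pose labels below are synthetic stand-ins for the feature set and motion-capture-derived labels the claims describe.

```python
import numpy as np
from sklearn.svm import SVC

POSES = ["fist", "open_palm", "point"]  # assumed finite set of pose labels
rng = np.random.default_rng(0)

# One tight synthetic cluster of 2-D "feature vectors" per pose type.
centers = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 5.0]])
X = np.vstack([c + 0.1 * rng.standard_normal((20, 2)) for c in centers])
y = np.repeat(POSES, 20)

# SVC handles the multiclass case internally via one-vs-one voting.
clf = SVC(kernel="rbf").fit(X, y)
pred = clf.predict([[5.1, 4.9], [0.1, -0.1]])
# Query points near the "open_palm" and "fist" cluster centres.
```

Training amounts to the one-to-one mapping the claims recite: each training feature vector is paired with exactly one label from the finite set, and prediction returns one such label per input.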
31. An apparatus for controlling an electronics device via hand gestures, comprising:
    a camera; and
    a processor programmed to:
        detect a bare-hand position via the camera based upon a detected sequence of bare-hand positions;
        determine whether the bare-hand position has been detected for a threshold duration of time;
        identify the detected bare-hand position from a vocabulary of hand gestures, where the identified bare-hand position comprises a hand gesture associated with powering on the electronics device; and
        control, in response to determining that the bare-hand position has been detected for the threshold duration of time, the electronics device in response to the identified bare-hand position by powering on the electronics device.
32. An apparatus for controlling an electronics device via hand gestures, comprising:
    a camera; and
    a processor programmed to:
        detect a hand gesture via the camera;
        identify the detected hand gesture from a vocabulary of hand gestures; and
        control the electronics device in response to the identified hand gesture.
    (Dependent claims: 33-40.)
41. An apparatus for hand position detection, comprising:
    a camera; and
    a processor programmed to:
        extract a feature set associated with hand gesture detection and hand pose inference from a plurality of input images received via the camera, where the processor is further programmed to:
            track a region of interest (ROI) between subsequent video frames of the plurality of input images as a flock of features;
            trigger scale-invariant feature transform (SIFT) feature extraction;
            calculate an optical flow path of the flock of features;
            measure brightness gradients in multiple directions across the plurality of input images;
            generate image pyramids from the measured brightness gradients;
            extract pixel intensity/displacement features and the SIFT features using the generated image pyramids; and
            apply a cascade filter in association with extracting the pixel intensity/displacement features and the SIFT features from the generated image pyramids;
        infer a hand pose type using a trained multiclass support vector machine (SVM), where the processor is further programmed to:
            detect at least one feature within a training image and the plurality of input images; and
            perform a one-to-one mapping of instances of the at least one feature within the plurality of input images with at least one label drawn from a finite set of elements, where the at least one label comprises at least one label generated during a training phase based upon a motion capture three-dimensional (3D) data set; and
        approximate the hand pose using inverse kinematics (IK) optimization, where the processor is further programmed to:
            partition the plurality of input images into a plurality of processing regions;
            determine a centroid of features within each of the plurality of processing regions;
            map a location of each feature centroid onto three-dimensional (3D) pose data associated with a motion capture data set;
            compare variances from each feature centroid to a closest match within the 3D pose data;
            determine which of a plurality of joint constraints affect the IK optimization;
            map each feature centroid to a closest joint stored within the 3D pose data;
            minimize a distance of each mapped closest joint within the training image based upon the 3D pose data; and
            determine a final hand position based upon the minimized distance of each mapped closest joint within the training image.
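The IK-approximation limb of claims 13 and 41 begins by partitioning the input into processing regions and taking the centroid of the features in each region. The NumPy sketch below shows just that partitioning step; matching the centroids against 3D motion-capture pose data is not shown, and all data is synthetic.

```python
import numpy as np

def region_centroids(points, bounds, grid=(2, 2)):
    """Split the bounding box into grid cells; return a centroid per non-empty cell."""
    (x0, y0), (x1, y1) = bounds
    cell_w = (x1 - x0) / grid[0]
    cell_h = (y1 - y0) / grid[1]
    centroids = {}
    for i in range(grid[0]):
        for j in range(grid[1]):
            # Collect the tracked feature points falling inside cell (i, j).
            in_cell = [
                p for p in points
                if x0 + i * cell_w <= p[0] < x0 + (i + 1) * cell_w
                and y0 + j * cell_h <= p[1] < y0 + (j + 1) * cell_h
            ]
            if in_cell:
                centroids[(i, j)] = np.mean(in_cell, axis=0)
    return centroids

features = np.array([[1.0, 1.0], [2.0, 2.0], [7.0, 7.0]])  # tracked feature points
cents = region_centroids(features, bounds=((0.0, 0.0), (8.0, 8.0)))
# Cell (0, 0) averages two features; cell (1, 1) holds one.
```

Working with per-region centroids rather than raw feature points reduces each frame to a handful of stable locations that can be mapped onto the nearest joints in the 3D pose data.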
42. An apparatus for hand position detection, comprising:
    a camera; and
    a processor programmed to:
        extract a feature set associated with hand gesture detection and hand pose inference from at least one input image received via the camera;
        infer a hand pose type using a trained multiclass support vector machine (SVM); and
        approximate the hand pose using inverse kinematics (IK) optimization.
    (Dependent claims: 43-56.)
Specification