Gesture recognition system for TV control
First Claim
1. A gesture recognition system for remote control of a device, comprising:
(a) a sensor for capturing video data of a user's hand at a location near the device;
(b) a processor for processing the captured video data of the hand; and
(c) programming executable on said processor for carrying out steps comprising:
segmenting each hand image in the captured video based on skin color;
extracting one or more parameters of the segmented hand image;
wherein the one or more parameters comprises a palm center location of the user's hand based on an extracted contour and skeleton of the hand;
wherein said palm center location comprises a center of the user's palm during an “open” configuration of the user's hand;
said “open” configuration corresponding to a condition wherein all of the user's fingers are extended to expose the user's palm;
wherein the one or more parameters further comprises a palm bottom location;
measuring the distance between the palm center location and a furthest contour point on the extracted image contour;
the furthest contour point being within a predetermined angular range of a vector passing through the palm bottom location and the palm center location;
tracking the one or more parameters in the hand image;
classifying a hand gesture as “open” or “closed” based on the one or more parameters; and
operating the device based on recognition of said hand gesture and tracking the position of said one or more parameters.
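The "furthest contour point within a predetermined angular range" measurement recited above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the contour representation as a list of (x, y) points, and the 60-degree default angular range are all assumptions.

```python
import math

def furthest_contour_point(contour, palm_center, palm_bottom, max_angle_deg=60.0):
    """Find the contour point furthest from the palm center, restricted to
    points whose direction from the palm center lies within max_angle_deg of
    the vector passing through the palm bottom and the palm center.
    (Illustrative sketch; the angular range is an assumed value.)"""
    cx, cy = palm_center
    bx, by = palm_bottom
    # Reference direction: vector from palm bottom through palm center.
    ref = math.atan2(cy - by, cx - bx)
    best_point, best_dist = None, -1.0
    for (px, py) in contour:
        ang = math.atan2(py - cy, px - cx)
        # Smallest absolute difference between the two angles, wrapped to [0, pi].
        diff = abs((ang - ref + math.pi) % (2 * math.pi) - math.pi)
        if diff <= math.radians(max_angle_deg):
            d = math.hypot(px - cx, py - cy)
            if d > best_dist:
                best_point, best_dist = (px, py), d
    return best_point, best_dist
```

For an upright open hand, the palm-bottom-to-palm-center vector points toward the fingertips, so the angular restriction keeps the search in the finger region and the returned distance grows when the fingers are extended.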
Abstract
A gesture recognition system using a skin-color based method combined with motion information to achieve real-time segmentation. A Kalman filter is used to track the centroid of the hand. The palm center, the palm bottom, and the largest distance from the palm center to the contour of the extracted hand mask are computed. The computed distance is then compared to a threshold to decide whether the current posture is “open” or “closed.” In a preferred embodiment, the transition between the “open” and “closed” postures is used to decide whether the current gesture is in the “select” or “grab” state.
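The abstract's Kalman-filter tracking of the hand centroid could be sketched in a simplified scalar form, filtering each image axis independently. This is an illustrative constant-velocity sketch under assumed noise parameters, not the patent's filter design:

```python
class Kalman1D:
    """Scalar constant-velocity Kalman filter, applied per image axis.
    The process/measurement noise values q and r are illustrative."""
    def __init__(self, x0, q=1e-2, r=1.0):
        self.x, self.v = x0, 0.0   # position and velocity estimates
        self.p = 1.0               # simplified scalar error variance
        self.q, self.r = q, r      # process / measurement noise
    def update(self, z, dt=1.0):
        # Predict: advance position by estimated velocity, inflate variance.
        self.x += self.v * dt
        self.p += self.q
        # Correct with the measured centroid coordinate z.
        k = self.p / (self.p + self.r)   # Kalman gain
        innovation = z - self.x
        self.x += k * innovation
        self.v += k * innovation / dt
        self.p *= (1.0 - k)
        return self.x

def track_centroid(measurements):
    """Track a 2-D hand centroid by filtering each axis independently."""
    fx = Kalman1D(measurements[0][0])
    fy = Kalman1D(measurements[0][1])
    return [(fx.update(mx), fy.update(my)) for mx, my in measurements]
```

Filtering the raw per-frame centroid this way smooths segmentation jitter before the tracked position is used to drive the device's user interface.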
16 Claims
1. A gesture recognition system for remote control of a device, comprising:
(a) a sensor for capturing video data of a user's hand at a location near the device;
(b) a processor for processing the captured video data of the hand; and
(c) programming executable on said processor for carrying out steps comprising:
segmenting each hand image in the captured video based on skin color;
extracting one or more parameters of the segmented hand image;
wherein the one or more parameters comprises a palm center location of the user's hand based on an extracted contour and skeleton of the hand;
wherein said palm center location comprises a center of the user's palm during an “open” configuration of the user's hand;
said “open” configuration corresponding to a condition wherein all of the user's fingers are extended to expose the user's palm;
wherein the one or more parameters further comprises a palm bottom location;
measuring the distance between the palm center location and a furthest contour point on the extracted image contour;
the furthest contour point being within a predetermined angular range of a vector passing through the palm bottom location and the palm center location;
tracking the one or more parameters in the hand image;
classifying a hand gesture as “open” or “closed” based on the one or more parameters; and
operating the device based on recognition of said hand gesture and tracking the position of said one or more parameters.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A gesture recognition system for remote control of a device having a user interface for visualization on a display, comprising:
(a) a sensor for capturing video data of a user's hand at a location near the device;
(b) a processor for processing the captured video data of the hand; and
(c) programming executable on said processor for carrying out steps comprising:
segmenting each hand image in the captured video based on skin color;
extracting one or more parameters of the segmented hand image;
wherein the one or more parameters comprises a palm center location of the user's hand based on an extracted contour and skeleton of the hand;
wherein said palm center location comprises a center of the user's palm during an “open” configuration of the user's hand;
said “open” configuration corresponding to a condition wherein all of the user's fingers are extended to expose the user's palm;
wherein the one or more parameters further comprises a palm bottom location;
measuring the distance between the palm center location and a furthest contour point on the extracted image contour;
the furthest contour point being within a predetermined angular range of a vector passing through the palm bottom location and the palm center location;
tracking the one or more parameters in the hand image;
classifying a hand gesture as “open” or “closed” based on said one or more parameters; and
operating the device based on recognition of said hand gesture and tracking of said one or more parameters;
wherein operating the device comprises sending a command to the user interface based on recognition of said hand gesture and tracking the position of said one or more parameters.
- View Dependent Claims (10, 11, 12, 13, 14)
15. A method for remotely controlling a device using hand gestures, the device having a user interface for visualization on a display, the method comprising:
capturing video of a user's hand with a sensor at a location near said device;
segmenting each hand image in the captured video based on skin color;
extracting one or more parameters of the segmented hand image;
wherein the one or more parameters comprises a palm center location of the user's hand based on an extracted contour and skeleton of the hand;
wherein said palm center location comprises a center of the user's palm during an “open” configuration of the user's hand;
said “open” configuration corresponding to a condition wherein all of the user's fingers are extended to expose the user's palm;
wherein the one or more parameters further comprises a palm bottom location;
measuring the distance between the palm center location and a furthest contour point on the extracted image contour;
the furthest contour point being within a predetermined angular range of a vector passing through the palm bottom location and the palm center location;
tracking at least one of the one or more parameters in the hand image;
classifying a hand gesture as “open” or “closed” based on said one or more parameters;
wherein classifying a hand gesture as “open” or “closed” comprises:
comparing the distance to the furthest contour point to a threshold value; and
classifying the image as “open” if the distance to the furthest contour point is above the threshold value; and
classifying the image as “closed” if the distance to the furthest contour point is below the threshold value;
operating the device based on recognition of said hand gesture and tracking of said one or more parameters;
sending a command to the user interface based on recognition of said hand gesture and tracking of said one or more parameters; and
outputting the sent command on the display to operate the device.
- View Dependent Claims (16)
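The threshold comparison recited in claim 15, and the abstract's mapping of posture transitions to "select"/"grab" states, can be sketched as follows. The threshold value and the direction of the transition mapping are assumptions for illustration; the patent leaves both unspecified:

```python
import math

def classify_posture(palm_center, furthest_point, threshold):
    """Classify the posture as "open" or "closed" by comparing the distance
    from the palm center to the furthest contour point against a threshold
    (the threshold value itself is an assumed, illustrative parameter)."""
    d = math.hypot(furthest_point[0] - palm_center[0],
                   furthest_point[1] - palm_center[1])
    return "open" if d > threshold else "closed"

def select_or_grab(prev_posture, cur_posture):
    """Map the open/closed posture transition to a "select" or "grab" state.
    The direction of each mapping is an assumption, not fixed by the source."""
    if prev_posture == "closed" and cur_posture == "open":
        return "select"
    if prev_posture == "open" and cur_posture == "closed":
        return "grab"
    return None  # no transition, state unchanged
```

An extended finger pushes the furthest contour point well past the threshold, so an open hand yields a large distance and a fist a small one, which is what makes the single scalar comparison sufficient for the two-posture classification.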
Specification