GESTURE-CONTROLLED INTERFACES FOR SELF-SERVICE MACHINES AND OTHER APPLICATIONS
Abstract
A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. Gestures are defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measurement is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description. The disclosure details methods for gesture recognition, as well as the overall architecture for using gesture recognition to control devices, including self-service machines.
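The abstract's linear-in-parameters model and predictor-bin scheme can be sketched as follows. This is a minimal illustration, assuming a hypothetical second-order model x'' = a·x + b·x' for each oscillatory gesture (the patent does not specify this exact form); parameters are fit by linear least squares, and a bank of parameter "bins" is scored by prediction residual to find the best-fitting gesture.

```python
import numpy as np

def fit_gesture_params(x, dt):
    """Fit (a, b) so that x'' ~= a*x + b*x' by linear least squares.

    x: sampled 1-D feature position trace; dt: sample spacing in seconds.
    The regressor matrix is linear in the parameters, as the abstract describes.
    """
    v = np.gradient(x, dt)            # velocity estimate (central differences)
    acc = np.gradient(v, dt)          # acceleration estimate
    A = np.column_stack([x, v])       # regressors: [x, x']
    theta, *_ = np.linalg.lstsq(A, acc, rcond=None)
    return theta                      # array([a, b])

def best_predictor_bin(x, dt, bins):
    """Score each parameter bin by its prediction residual; return the best index.

    bins: list of (a, b) parameter vectors, one per known gesture (hypothetical
    values seeded ahead of time, standing in for the patent's predictor bins).
    """
    v = np.gradient(x, dt)
    acc = np.gradient(v, dt)
    A = np.column_stack([x, v])
    residuals = [np.sum((acc - A @ theta) ** 2) for theta in bins]
    return int(np.argmin(residuals))
```

For a pure oscillation x = sin(2t), the fit recovers a ≈ -4 and b ≈ 0, and a bin seeded with those parameters wins the residual comparison.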
18 Claims
1. A method of gesture recognition, comprising the steps of:
imaging a gesture-making target;
deriving the start position of the target, the end position of the target, and the velocity between the start and end positions;
comparing the velocity of the target to a threshold value; and
identifying the gesture as a static gesture if the velocity is below the threshold value, otherwise, identifying the gesture as a dynamic gesture.

Dependent claims: 2–12.
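The velocity test in claim 1 can be sketched directly. This is a minimal illustration, not the patent's implementation: the coordinate units, the duration argument, and the threshold value are all assumptions introduced for the example.

```python
import math

def classify_gesture(start, end, duration_s, threshold=0.2):
    """Classify a gesture per claim 1's velocity test.

    start, end: (x, y) feature positions of the imaged target (units assumed);
    duration_s: elapsed time between the two positions, in seconds;
    threshold: speed threshold (an assumed value, not specified by the claim).
    Returns "static" if the derived velocity is below the threshold,
    otherwise "dynamic".
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    velocity = math.hypot(dx, dy) / duration_s
    return "static" if velocity < threshold else "dynamic"
```

A nearly stationary target (e.g. 2 cm of drift over one second) classifies as static; a fast sweep classifies as dynamic.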
13. A gesture-controlled interface for self-service machines and other applications, comprising:
a sensor module for visually analyzing a gesture made by a human or machine, and outputting image data including position and velocity information associated with the gesture;
an identification module operative to identify the gesture based upon the image data output by the sensor module; and
a transformation module operative to generate a command based upon the gesture identified by the identification module.

Dependent claims: 14–18.
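The three-module architecture of claim 13 can be sketched as a simple pipeline. This is a hedged illustration only: the class names, gesture labels, threshold, and command map below are all hypothetical stand-ins, not the patent's actual modules.

```python
from dataclasses import dataclass

@dataclass
class ImageData:
    """Position and velocity information output by the sensor module."""
    position: tuple
    velocity: float

class SensorModule:
    """Reduces a sequence of tracked feature positions to ImageData."""
    def observe(self, positions, dt):
        (x0, y0), (x1, y1) = positions[0], positions[-1]
        elapsed = dt * (len(positions) - 1)
        speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / elapsed
        return ImageData(position=(x1, y1), velocity=speed)

class IdentificationModule:
    """Labels the gesture from the image data (threshold is an assumed value)."""
    def __init__(self, threshold=0.2):
        self.threshold = threshold
    def identify(self, data):
        return "static_point" if data.velocity < self.threshold else "dynamic_wave"

class TransformationModule:
    """Maps an identified gesture to a machine command."""
    def __init__(self, command_map):
        self.command_map = command_map
    def to_command(self, gesture):
        return self.command_map.get(gesture, "noop")

def run_pipeline(positions, dt):
    """Wire the three modules together (gesture names and commands are made up)."""
    data = SensorModule().observe(positions, dt)
    gesture = IdentificationModule().identify(data)
    return TransformationModule({"dynamic_wave": "start_service"}).to_command(gesture)
```

The pipeline mirrors the claim's data flow: the sensor module emits position and velocity, the identification module names the gesture, and the transformation module turns that name into a device command.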
Specification