Method and apparatus for translating hand gestures
Abstract
A sign language recognition apparatus and method are provided for translating hand gestures into speech or written text. The apparatus includes a number of sensors on the hand, arm, and shoulder to measure dynamic and static gestures. The sensors are connected to a microprocessor that searches a library of gestures and generates output signals, which can then be used to produce a synthesized voice or written text. The apparatus includes sensors such as accelerometers on the fingers and thumb and two accelerometers on the back of the hand to detect motion and orientation of the hand. Also provided are a sensor on the back of the hand or wrist to detect forearm rotation, an angle sensor to detect flexing of the elbow, two sensors on the upper arm to detect arm elevation and rotation, and a sensor on the upper arm to detect arm twist. The sensors transmit their data to the microprocessor, which determines the shape, position, and orientation of the hand relative to the body of the user.
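The sensor suite described in the abstract can be summarized as a single data record per sample. Below is a minimal sketch in Python; the field names, units, and example values are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]  # one 3-axis accelerometer reading

@dataclass(frozen=True)
class SensorFrame:
    """One sample from the sensor suite in the abstract (names assumed)."""
    finger_accel: Tuple[Vec3, ...]   # accelerometers on the four fingers and thumb
    hand_accel: Tuple[Vec3, Vec3]    # two accelerometers on the back of the hand
    forearm_rotation_deg: float      # forearm rotation, sensed at the back of hand or wrist
    elbow_angle_deg: float           # angle sensor measuring elbow flex
    arm_elevation_deg: float         # upper-arm sensor: elevation
    arm_rotation_deg: float          # upper-arm sensor: rotation
    arm_twist_deg: float             # upper-arm twist sensor

# Example frame: hand at rest, arm hanging down (values illustrative only).
rest = SensorFrame(
    finger_accel=tuple(((0.0, 0.0, 9.8),) * 5),
    hand_accel=((0.0, 0.0, 9.8), (0.0, 0.0, 9.8)),
    forearm_rotation_deg=0.0,
    elbow_angle_deg=180.0,
    arm_elevation_deg=0.0,
    arm_rotation_deg=0.0,
    arm_twist_deg=0.0,
)
```

A frame like this is what the microprocessor would consume when determining the shape, position, and orientation of the hand relative to the body.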
34 Claims
1. A gesture recognition apparatus comprising:
an input assembly configured to detect a user gesture and a computer connected to said input assembly configured to generate an output signal based on said user gesture, said input assembly comprising:

a glove to be worn by a user, said glove having sensors configured to detect dynamic hand movements of each finger and thumb;

an elbow sensor configured to detect and measure flexing and positioning of the forearm about the elbow;

a shoulder sensor configured to detect movement and position of the user's arm with respect to the user's shoulder; and

a frame having a first section configured to couple to the upper arm of the user and a second section configured to couple to the forearm of the user, said first and second sections being coupled together by a hinge, said elbow sensor being positioned on said frame and configured to measure flexing and positioning of the forearm and second section.

(Dependent claims: 2-17)
18. A gesture recognition apparatus comprising:
an input assembly comprising a glove having hand sensors configured to detect hand movement, an elbow sensor configured to detect forearm movement, an arm sensor configured to detect arm orientation, and a shoulder sensor configured to detect arm rotation, said input assembly further comprising a frame having an upper arm section and a forearm section, said upper arm section and said forearm section coupled together by a hinge; and

a computer connected to said input assembly and configured to generate an output corresponding to the detected hand movement, forearm movement, or arm rotation.

(Dependent claims: 19-25)
26. A method for translating a user's gesture composed of an initial pose, movement, hand location, and a final pose, the method comprising:

determining the initial pose, the hand location, and the final pose of the gesture, and a movement of the gesture, the movement occurring between the initial pose and the final pose;

matching the determined initial pose with one or more initial poses of all known gestures, and defining a first list of candidate gestures as those whose pose matches the determined initial pose or, if there is only one match, returning a first most likely gesture corresponding to the match;

matching the determined hand location with one or more hand locations of the first list of candidate gestures, and defining a second list of candidate gestures as those whose hand locations match the determined hand location or, if there is only one match, returning a second most likely gesture corresponding to the match;

matching the determined movement with one or more movements of the second list of candidate gestures, and defining a third list of candidate gestures as those whose movements match the determined movement or, if there is only one match, returning a third most likely gesture corresponding to the match;

matching the determined final pose with one or more poses of the third list of candidate gestures, and defining a fourth list of candidate gestures as those whose final pose matches the determined final pose or, if there is only one match, returning a fourth most likely gesture corresponding to the match;

matching the determined final hand location of the gesture with a hand location of the fourth list of candidate gestures, and returning a fifth most likely gesture corresponding to the match; and

converting the first, second, third, fourth, or fifth gesture into an output based on the gesture.

(Dependent claims: 27-29)
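The staged narrowing recited in claim 26 can be sketched as a cascade of filters over a gesture library, each stage returning early if it leaves exactly one candidate. A minimal Python illustration, in which the `Gesture` fields and library entries are invented for this sketch and the final hand-location stage is folded into the generic loop for brevity:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)
class Gesture:
    name: str
    initial_pose: str
    hand_location: str
    movement: str
    final_pose: str

def match_gesture(library: List[Gesture], initial_pose: str,
                  hand_location: str, movement: str,
                  final_pose: str) -> Optional[Gesture]:
    """Narrow candidates stage by stage; return as soon as one stage
    leaves exactly one candidate, as in the claimed method."""
    # Stage 1: filter the full library by initial pose.
    candidates = [g for g in library if g.initial_pose == initial_pose]
    if len(candidates) == 1:
        return candidates[0]
    # Stages 2-4: hand location, then movement, then final pose.
    for attr, observed in (("hand_location", hand_location),
                           ("movement", movement),
                           ("final_pose", final_pose)):
        candidates = [g for g in candidates if getattr(g, attr) == observed]
        if len(candidates) == 1:
            return candidates[0]
    # No stage produced a unique match.
    return None

# Toy gesture library (entries invented for illustration).
LIBRARY = [
    Gesture("HELLO", "open-hand", "forehead", "outward-arc", "open-hand"),
    Gesture("THANK-YOU", "open-hand", "chin", "outward-arc", "open-hand"),
]
```

With this library, an open hand at the chin moving in an outward arc is ambiguous after the initial-pose stage (both entries match) and resolves at the hand-location stage, mirroring how each matching step in the claim only runs when the previous one left more than one candidate.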
30. A hand gesture recognition apparatus comprising:
an input assembly configured to detect a hand gesture, said input assembly comprising a glove configured to be worn on a hand of a user, said glove having a first sensor on the back of said glove for detecting position, orientation and movement of a user's palm; and

a processor connected to said input assembly and configured to generate an output signal corresponding to the detected hand gesture based on the detected hand movement, glove position, glove orientation, and glove movement, wherein said processor is further configured to translate the detected hand movement, glove position, glove orientation, and glove movement into a meaning, and said processor is further configured to generate an output corresponding to the meaning.

(Dependent claims: 31-34)
Specification