MOTOR VEHICLE CONTROL INTERFACE WITH GESTURE RECOGNITION
Abstract
A method for operating a motor vehicle control panel having a camera system and a gesture recognition device includes filming a person gesticulating using at least one arm in the interior of the vehicle. Description data relating to a position and/or a sequence of movements of the at least one gesticulating arm is determined by the gesture recognition device and is associated with a control gesture. At least one situation parameter, which describes a gesticulation context of the person, is determined by a plausibility device, which decides whether the person has performed a possible control gesture or only a gesture which is to be ignored. If it is determined to be a possible control gesture, a control command is generated for the control gesture.
1-15. (canceled)
16. A method for operating a user interface in a motor vehicle, wherein the user interface comprises a camera, the method comprising:
filming a user who is gesticulating with a gesticulating arm within an interior of the motor vehicle, the user being filmed by the camera to thereby produce image data;
generating descriptive data of a position or movement of the gesticulating arm by a gesture recognition device, based on the image data produced by the camera;
recognizing an operating gesture by associating the descriptive data with one of a plurality of predefined operating gestures based on a first classification;
determining, using a plausibility verification device of the user interface, at least one situation parameter that describes a gesticulation context of the user;
determining, using the at least one situation parameter and a second classification, whether the user is performing a possible operating gesture or only a gesture that is not defined for operation and that is to be ignored;
outputting a control command to a receiver for a motor vehicle system, depending on the operating gesture recognized based on the first classification, if it is determined that the user is performing a possible operating gesture; and
inhibiting the output of the control command if it is determined that the gesture is to be ignored.
(Dependent claims: 17-28, 31-33)
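The two-stage scheme of claim 16 can be sketched in a few lines. This is an illustrative sketch only, not the patent's implementation; the gesture table, the `talking` situation parameter, and all function names are assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical mapping of predefined operating gestures to commands
# (illustrative names, not taken from the patent).
GESTURES = {"swipe_left": "prev_track", "swipe_right": "next_track"}

@dataclass
class Observation:
    descriptive_data: str  # arm position/movement label derived from the image data
    situation: dict        # gesticulation context, e.g. {"talking": True}

def classify_gesture(descriptive_data):
    """First classification: associate descriptive data with a predefined gesture."""
    return descriptive_data if descriptive_data in GESTURES else None

def is_plausible(situation):
    """Second classification: decide from the situation parameter whether the
    user is performing a possible operating gesture at all."""
    # Assumed example context: a gesture made while talking to a passenger
    # is treated as conversational and ignored.
    return not situation.get("talking", False)

def control_command(obs: Observation):
    gesture = classify_gesture(obs.descriptive_data)
    if gesture is None:
        return None              # not one of the predefined operating gestures
    if not is_plausible(obs.situation):
        return None              # inhibit output: gesture is to be ignored
    return GESTURES[gesture]     # output control command to the vehicle system
```

The key point the sketch shows is that the plausibility check gates the already-recognized gesture: recognition (first classification) and intent verification (second classification) are independent decisions.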
29. The method as claimed in claim 16, wherein
in addition to the gesticulating arm, image data is captured relating to a body model and a current body position of the user,
the current body position of the user is used to identify control commands that are currently appropriate, and
all control commands other than the control commands that are currently appropriate are inhibited.
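A minimal sketch of the filtering in claim 29, assuming a hypothetical mapping from body positions to the commands appropriate in each (the position labels and command names are invented for illustration):

```python
# Commands assumed appropriate per body position (illustrative, not from the patent).
APPROPRIATE = {
    "leaning_forward": {"zoom_map", "select_poi"},
    "upright": {"next_track", "prev_track", "zoom_map", "select_poi"},
}

def allowed(command, body_position):
    """Inhibit every control command not currently appropriate for the
    body position reported by the body model."""
    return command in APPROPRIATE.get(body_position, set())
```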
30. The method as claimed in claim 16, wherein
in addition to the gesticulating arm, image data is captured relating to a body model and a current body position of the user,
the at least one situation parameter relates to at least one of a position of the gesticulating arm, a speed of movement of the gesticulating arm, an acceleration of the gesticulating arm, and an angle present in the gesticulating arm, and
the body model is used to define a range for each situation parameter such that which positions or movements of the gesticulating arm will produce the control command is varied based on the current body position of the user.
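The range adaptation in claim 30 can be illustrated as follows. The parameter names, the nominal ranges, and the idea of shifting the arm-angle window by a lean offset are all assumptions for the sketch, not values from the patent:

```python
# Nominal (min, max) ranges per situation parameter for an upright driver
# (illustrative values).
NOMINAL_RANGES = {
    "arm_angle_deg": (30.0, 90.0),
    "speed_m_s": (0.2, 1.5),
}

def adapted_ranges(lean_offset_deg):
    """Shift the valid arm-angle window when the body model reports the
    user is leaning by `lean_offset_deg` degrees."""
    lo, hi = NOMINAL_RANGES["arm_angle_deg"]
    return {
        "arm_angle_deg": (lo + lean_offset_deg, hi + lean_offset_deg),
        "speed_m_s": NOMINAL_RANGES["speed_m_s"],
    }

def produces_command(params, ranges):
    """A movement produces the control command only if every situation
    parameter lies inside the range derived from the body model."""
    return all(ranges[k][0] <= v <= ranges[k][1] for k, v in params.items())
```

The design point is that the same physical arm movement may or may not count as an operating gesture, depending on the body position the ranges were derived from.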
34. A user interface for operating at least one device in a motor vehicle, comprising:
a camera to film a user who is gesticulating with a gesticulating arm in the interior of the motor vehicle, and to produce image data;
a gesture recognition device coupled to the camera, to determine a position or a series of motions of the gesticulating arm based on the image data of the camera, to recognize an operating gesture by associating the image data with one of a plurality of predefined operating gestures, and to output a control command to the at least one device depending on the operating gesture recognized; and
a plausibility verification device to determine at least one situation parameter that describes a gesticulation context of the user, to determine, using the at least one situation parameter and a second classification, whether the user is performing a possible operating gesture or only a gesture that is not defined for operation and that is to be ignored, and to inhibit the output of the control command if it is determined that the gesture is to be ignored.
(Dependent claim: 35)
Specification