APPARATUS AND METHOD OF USER INTERFACE FOR MANIPULATING MULTIMEDIA CONTENTS IN VEHICLE
Abstract
Disclosed are an apparatus and a method of a user interface for manipulating multimedia contents for a vehicle. An apparatus of a user interface for manipulating multimedia contents for a vehicle according to an embodiment of the present invention includes: a transparent display module displaying an image including one or more multimedia objects; an ultrasonic detection module detecting a user indicating means by using an ultrasonic sensor in a 3D space close to the transparent display module; an image detection module tracking and photographing the user indicating means; and a head unit judging whether or not any one of the multimedia objects is selected by the user indicating means by using information received from at least one of the image detection module and the ultrasonic detection module and performing a control corresponding to the selected multimedia object.
20 Claims
1. An apparatus of a user interface for manipulating multimedia contents for a vehicle, comprising:
a transparent display module displaying an image including one or more multimedia objects;
an ultrasonic detection module detecting a user indicating means by using an ultrasonic sensor in a 3D space close to the transparent display module;
an image detection module tracking and photographing the user indicating means; and
a head unit judging whether or not any one of the multimedia objects is selected by the user indicating means by using information received from at least one of the image detection module and the ultrasonic detection module and performing a control corresponding to the selected multimedia object.
(Dependent claims 2-8 not shown.)
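The head unit's selection judgment in claim 1 amounts to a hit-test: given a touch or pointing coordinate derived from the ultrasonic and camera data, find which on-screen multimedia object, if any, contains it. A minimal sketch, assuming a hypothetical layout format in which each object is an axis-aligned rectangle given as (x, y, width, height); the patent does not specify how object geometry is represented:

```python
def selected_object(touch_xy, objects):
    """Return the id of the multimedia object whose rectangle contains
    the touch point, or None if the point hits no object.

    `objects` maps an object id to an (x, y, width, height) rectangle
    in display coordinates (hypothetical format, for illustration only).
    """
    x, y = touch_xy
    for obj_id, (ox, oy, w, h) in objects.items():
        if ox <= x <= ox + w and oy <= y <= oy + h:
            return obj_id
    return None
```

A real head unit would additionally debounce the selection over several frames and fall back to ultrasonic-only data when the camera loses track, but the core judgment is this containment check.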
9. An apparatus of a user interface for manipulating multimedia contents for a vehicle, comprising:
a transparent display displaying an image including one or more multimedia objects;
an ultrasonic sensor detecting an object in a 3D space close to the transparent display;
a stereo camera stereo-photographing the 3D space;
a motion tracker judging whether or not the detected object is a hand and, when the object is the hand in accordance with the judgment result, tracking a motion of the hand;
a first coordinate detector detecting a first coordinate corresponding to a 3D position of an end point of the hand;
a second coordinate detector acquiring 3D coordinates of both user's pupils from the image photographed by the stereo camera and detecting a second coordinate corresponding to a point where an indication vector linking the first coordinate with a center position of the both pupils meets the transparent display;
a motion analyzer acquiring a user's gesture from a motion of the hand;
an integrator acquiring a final intention of the user by integrating the gesture, the first coordinate, and the second coordinate; and
a controller performing predetermined control depending on the acquired final intention.
(Dependent claims 10-15 not shown.)
16. A method of a user interface for manipulating multimedia contents for a vehicle, comprising:
when an object is detected in a 3D space, verifying whether or not the detected object is a user's hand;
detecting a first coordinate which is a 3D coordinate corresponding to an end point of the hand when the object is the hand;
detecting 3D coordinates corresponding to both pupils of the user and detecting a second coordinate corresponding to a point where an indication vector linking the first coordinate with a center position of the both pupils meets a transparent display;
acquiring a user's gesture by tracking the hand;
acquiring a final intention of the user by integrating the gesture, the first coordinate, and the second coordinate; and
performing predetermined control depending on the acquired final intention.
(Dependent claims 17-20 not shown.)
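The "second coordinate" step above is a ray-plane intersection: cast a ray from the midpoint of the two pupils through the fingertip (the first coordinate) and find where it meets the plane of the transparent display. A minimal sketch, assuming all points are expressed in a common 3D frame (the patent does not fix a particular coordinate frame, so the frame choice here is an assumption):

```python
import numpy as np

def second_coordinate(pupil_left, pupil_right, fingertip,
                      plane_point, plane_normal):
    """Intersect the indication vector with the display plane.

    The indication vector runs from the midpoint of the two pupils
    through the fingertip (the "first coordinate"); the returned point
    is where it meets the transparent display (the "second coordinate").
    All arguments are 3-vectors in a shared frame; the display plane is
    given by any point on it plus its normal. Returns None when the
    vector is parallel to the plane or points away from the display.
    """
    origin = (np.asarray(pupil_left, float) + np.asarray(pupil_right, float)) / 2.0
    direction = np.asarray(fingertip, float) - origin
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # indication vector parallel to the display plane
    t = np.dot(plane_normal, np.asarray(plane_point, float) - origin) / denom
    if t < 0:
        return None  # intersection lies behind the user
    return origin + t * direction
```

For example, with the display in the plane z = 0, pupils near (0, 0, 1), and the fingertip at (0, 0, 0.5), the indication vector meets the display at the origin.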
Specification