APPARATUS AND METHOD FOR CONTROLLING INTERFACE
First Claim
1. An apparatus for controlling an interface, comprising:
a receiver to receive image information comprising a depth image related to a user from a sensor;
a processor to generate, based on the image information, at least one of motion information regarding a hand motion of the user and gaze information regarding a gaze of the user; and
a controller to control a 2-dimensional or 3-dimensional graphical user interface (2D/3D GUI) based on at least one of the motion information and the gaze information.
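The claim recites a three-stage pipeline: a receiver takes a depth image from a sensor, a processor derives motion and/or gaze information, and a controller maps that onto a 2D/3D GUI. A minimal sketch of that structure, assuming hypothetical class and method names (the patent does not specify an implementation; the nearest-pixel hand detector is a placeholder):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

import numpy as np


@dataclass
class MotionInfo:
    hand_position: np.ndarray  # (x, y, depth) of the tracked hand


@dataclass
class GazeInfo:
    origin: np.ndarray     # 3D position of the user's eyes
    direction: np.ndarray  # unit vector of the gaze direction


class InterfaceController:
    """Hypothetical receiver/processor/controller pipeline from claim 1."""

    def receive(self, depth_image: np.ndarray) -> np.ndarray:
        # Receiver: accept a depth image related to the user from a sensor.
        return depth_image

    def process(self, depth_image: np.ndarray) -> Tuple[Optional[MotionInfo], Optional[GazeInfo]]:
        # Processor: generate at least one of motion information and gaze
        # information. Placeholder heuristic: treat the nearest pixel in the
        # depth image as the hand position.
        y, x = np.unravel_index(np.argmin(depth_image), depth_image.shape)
        motion = MotionInfo(np.array([x, y, depth_image[y, x]], dtype=float))
        return motion, None

    def control(self, motion: Optional[MotionInfo], gaze: Optional[GazeInfo]):
        # Controller: turn the available information into a GUI action.
        if motion is not None:
            return ("move_cursor", tuple(motion.hand_position[:2]))
        return ("no_op", None)
```

A real system would replace the placeholder processor with hand tracking and gaze estimation models; the split shown here only mirrors the receiver/processor/controller decomposition of the claim.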
1 Assignment
0 Petitions
Abstract
In an apparatus and method for controlling an interface, a user interface (UI) may be controlled using information on a hand motion and a gaze of a user, without separate tools such as a mouse and a keyboard. That is, the UI control method provides more intuitive, immersive, and unified control of the UI. Since a region of interest (ROI) for sensing the hand motion of the user is calculated from the UI object that the hand motion controls, the user may control the UI object in the same manner and with the same feel regardless of the distance from the user to the sensor. In addition, since the positions and directions of viewpoints are adjusted based on the position and direction of the gaze, a binocular 2D/3D image based on motion parallax may be provided.
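The abstract names two mechanisms: scaling the hand-motion ROI with user distance so control feel is distance-invariant, and shifting rendering viewpoints with the gaze to produce motion parallax. A minimal sketch of both ideas, assuming hypothetical function names and a simple linear model (the patent states only the goals, not these formulas):

```python
import numpy as np


def roi_size(object_size_px: float, distance_m: float, reference_m: float = 1.0) -> float:
    """Scale the hand-motion ROI inversely with the user's distance to the
    sensor, so the same hand sweep covers the UI object at any range.
    Hypothetical linear model: at 2x the reference distance the hand appears
    half as large in the image, so the ROI is halved to match."""
    return object_size_px * reference_m / max(distance_m, 1e-6)


def viewpoint_offset(gaze_origin: np.ndarray,
                     screen_center: np.ndarray,
                     strength: float = 0.1) -> np.ndarray:
    """Shift the rendering viewpoint opposite to the displacement of the
    user's eyes from the screen center, producing motion parallax.
    `strength` is a hypothetical gain, not a value from the patent."""
    return -strength * (gaze_origin - screen_center)
```

For example, an object occupying 200 px at the 1 m reference distance would get a 100 px ROI when the user stands 2 m away, and a 400 px ROI at 0.5 m, so the hand-to-object mapping stays constant.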
44 Citations
22 Claims
1. An apparatus for controlling an interface, comprising:

a receiver to receive image information comprising a depth image related to a user from a sensor;
a processor to generate, based on the image information, at least one of motion information regarding a hand motion of the user and gaze information regarding a gaze of the user; and
a controller to control a 2-dimensional or 3-dimensional graphical user interface (2D/3D GUI) based on at least one of the motion information and the gaze information.

View Dependent Claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 21
20. A method of controlling an interface, comprising:

receiving, by a processor, image information comprising a depth image related to a user from a sensor;
generating, by the processor, based on the image information, at least one of motion information regarding a hand motion of the user and gaze information regarding a gaze of the user; and
controlling, by the processor, a 2D/3D GUI based on at least one of the motion information and the gaze information.

View Dependent Claims: 22
Specification