Three-dimensional space touch apparatus using multiple infrared cameras
1 Assignment
0 Petitions
Abstract
Disclosed herein is a 3D space touch apparatus. The 3D space touch apparatus includes a support, an infrared LED array, left and right infrared cameras, and a space touch sensor module. The support supports the infrared LED array and the left and right infrared cameras. The infrared LED array emits infrared rays, which form an infrared screen in a space above the support. The left and right infrared cameras are disposed on the left and right sides of the support so that the lenses thereof can be oriented to the infrared screen. The space touch sensor module calculates the X-, Y- and Z-axis coordinates of a location of the infrared screen, touched by user pointing means, using images captured by the left and right infrared cameras and information about the resolutions and angles of view of the left and right infrared cameras.
Citations
19 Claims
1. A three-dimensional (3D) space touch apparatus, comprising:
an infrared Light-Emitting Diode (LED) array;

left and right infrared cameras;

a support that supports the infrared LED array and the left and right infrared cameras, wherein the infrared LED array emits infrared rays to form an infrared screen having a predetermined shape in a space above the support, and wherein the left and right infrared cameras are disposed on left and right sides of the support so that lenses thereof can be oriented to the infrared screen; and

a space touch sensor module that calculates X-axis, Y-axis and Z-axis coordinates of a selected location in the infrared screen, touched by user pointing means, using scattering or diffusion of the infrared rays generated in images including the selected location captured by the left and right infrared cameras when the selected location is touched by the user pointing means, wherein the space touch sensor module:

calculates the Z-axis coordinate of the selected location in the infrared screen by dividing a vertical-axis coordinate of locations, corresponding to the selected location of the camera images touched by the user pointing means, by a resolution of the left and right infrared cameras, and multiplying a result of the division by a predetermined distance of the Z-axis; and

calculates the X-axis and Y-axis coordinates of the selected location in the infrared screen using lateral-axis coordinates of the locations in the camera images touched by the user pointing means and information about angles of view of the left and right infrared cameras.

View Dependent Claims (2, 3, 4, 5)
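The coordinate calculation recited in claim 1 can be sketched as follows. This is an illustrative reconstruction, not the patentee's implementation: it assumes the two cameras sit at the ends of a baseline along one edge of the infrared screen, that each lateral pixel coordinate maps linearly to a line-of-sight angle within the camera's angle of view, and that all function names, the 45-degree optical-axis angle, and the example parameter values are invented for this sketch.

```python
import math

def pixel_to_angle(u, h_resolution, fov_deg, axis_deg=45.0):
    """Map a lateral pixel coordinate to a line-of-sight angle (radians),
    using the camera's horizontal resolution and angle of view.
    axis_deg is the assumed angle of the optical axis from the baseline."""
    return math.radians(axis_deg + (u / h_resolution - 0.5) * fov_deg)

def triangulate_xy(theta_l, theta_r, baseline):
    """Intersect the two lines of sight: left camera at (0, 0), right
    camera at (baseline, 0), angles measured from the baseline.
    From tan(theta_l) = y/x and tan(theta_r) = y/(baseline - x)."""
    y = baseline / (1.0 / math.tan(theta_l) + 1.0 / math.tan(theta_r))
    x = y / math.tan(theta_l)
    return x, y

def z_coordinate(v_pixel, v_resolution, z_range):
    """Z as recited in the claim: the vertical pixel coordinate divided
    by the camera's resolution, multiplied by the predetermined
    Z-axis distance."""
    return (v_pixel / v_resolution) * z_range
```

For example, a touch seen at a 45-degree line of sight from both cameras lies above the midpoint of the baseline at a depth of half the baseline length.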
6. A space touch apparatus, comprising:
at least one infrared light source which emits at least one infrared ray to form an infrared screen having a predetermined coordinate frame in a space;

at least two cameras oriented to the infrared screen to respectively capture at least two images of an area including a selected location, touched by user pointing means, in the infrared screen; and

a space touch sensor module which calculates coordinates of the selected location in the infrared screen using scattering or diffusion of the infrared ray generated in the at least two images of the area including the selected location respectively captured by the at least two cameras when the selected location is touched by the user pointing means, wherein the space touch sensor module:

calculates a Z-axis coordinate of the selected location in the infrared screen by dividing a vertical-axis coordinate of locations, corresponding to the selected location of the camera images touched by the user pointing means, by a resolution of the at least two cameras, and multiplying a result of the division by a predetermined distance of the Z-axis; and

calculates X-axis and Y-axis coordinates of the selected location in the infrared screen using lateral-axis coordinates of the locations of the camera images touched by the user pointing means and information about angles of view of the at least two cameras.

View Dependent Claims (7, 8, 9, 10, 11)
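The independent claims locate the touch through the scattering or diffusion that the pointing means produces when it crosses the infrared screen, which appears as a bright region in each camera image. The claims recite the optical effect rather than any particular detection algorithm, so the following is only an assumed detection step; the function name and threshold value are invented for illustration.

```python
def find_touch_pixel(image, threshold=200):
    """Return the centroid (u, v) of all pixels at or above the brightness
    threshold, or None when nothing crosses the infrared screen.
    `image` is a list of rows of 8-bit grayscale intensities."""
    total = sum_u = sum_v = 0
    for v, row in enumerate(image):
        for u, intensity in enumerate(row):
            if intensity >= threshold:
                total += 1
                sum_u += u
                sum_v += v
    if total == 0:
        return None
    return (sum_u / total, sum_v / total)
```

The lateral coordinate u of the centroid would feed the X/Y calculation, and the vertical coordinate v the Z calculation, in each of the two camera images.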
12. A space touch apparatus, comprising:
at least one light source which emits at least one selected wavelength light ray to form a screen having a predetermined shape in a space;

at least two cameras, including a left and a right infrared camera, which respectively capture at least two images of an area including a selected location using scattering or diffusion of the selected wavelength light rays when the selected location is touched by user pointing means in the screen, and filter out different wavelength lights; and

a space touch sensor module which calculates coordinates of the selected location in the screen, touched by the user pointing means, using the at least two images of the area including the selected location respectively captured by the at least two cameras, wherein the space touch sensor module:

calculates a Z-axis coordinate of the selected location in the screen by dividing a vertical-axis coordinate of locations, corresponding to the selected location of the camera images touched by the user pointing means, by a resolution of the left and right infrared cameras, and multiplying a result of the division by a predetermined distance of the Z-axis; and

calculates X-axis and Y-axis coordinates of the selected location in the screen using lateral-axis coordinates of the locations in the camera images touched by the user pointing means and information about angles of view of the left and right infrared cameras.

View Dependent Claims (13, 14, 15, 16, 17, 18, 19)
Specification