System and method for predicting a touch position of a pointer on a touch-enabled unit or determining a pointing direction in 3D space
Abstract
A system for predicting a touch position of a pointer on a touch-enabled unit includes a 3D imaging unit, a processing unit coupled to the 3D imaging unit, and a touch-enabled unit coupled to the processing unit. The 3D imaging unit is configured to monitor an interaction zone in front of the touch-enabled unit. The processing unit includes a prediction module that is configured to predict where a pointer that approaches the touch-enabled unit and is monitored by the 3D imaging unit will touch the touch-enabled unit, and a calibration module that is configured to generate at least one calibration parameter by comparing the predicted touch position to the actual touch position of the pointer detected by the touch-enabled unit and to transfer the at least one calibration parameter to the prediction module in order to calibrate the prediction module.
20 Claims
1. A system for predicting a touch position of a pointer on a touch-enabled unit, said system comprising:
a 3D imaging unit;
a processing unit coupled to the 3D imaging unit;
a touch-enabled unit coupled to the processing unit, wherein the 3D imaging unit is configured to monitor an interaction zone in front of the touch-enabled unit, and the processing unit comprises a prediction module that is configured to predict where a pointer that approaches the touch-enabled unit and is monitored by the 3D imaging unit will touch the touch-enabled unit; and
a calibration module that is configured to:
generate at least one calibration parameter by comparing the predicted touch position to the actual touch position of the pointer detected by the touch-enabled unit,
use a plurality of pairs for the generation of the at least one calibration parameter, wherein each of the pairs contains a predicted touch position and a corresponding actual touch position, and
transfer the at least one calibration parameter to the prediction module to calibrate the prediction module.
View Dependent Claims (2, 3, 4, 5, 6, 7)
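The calibration loop of claim 1 can be illustrated with a minimal sketch. The claim does not prescribe a particular parameter model, so a constant-offset correction averaged over the collected pairs is assumed here; all function names are hypothetical:

```python
import numpy as np

def generate_calibration_parameter(pairs):
    """Average offset between actual and predicted touch positions.

    `pairs` is a list of (predicted_xy, actual_xy) tuples, matching the
    claim's plurality of predicted/actual touch-position pairs.
    """
    predicted = np.array([p for p, _ in pairs], dtype=float)
    actual = np.array([a for _, a in pairs], dtype=float)
    return (actual - predicted).mean(axis=0)  # 2D offset parameter

def calibrated_prediction(raw_prediction, offset):
    """Prediction module applying the transferred calibration parameter."""
    return np.asarray(raw_prediction, dtype=float) + offset

# example pairs with a systematic (+3, -2) pixel bias
pairs = [((100.0, 50.0), (103.0, 48.0)),
         ((200.0, 80.0), (203.0, 78.0)),
         ((150.0, 60.0), (153.0, 58.0))]
offset = generate_calibration_parameter(pairs)
print(offset)
print(calibrated_prediction((120.0, 70.0), offset))
```

With consistent pairs, the averaged offset recovers the systematic bias and subsequent raw predictions are shifted accordingly.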
8. A system for determining a pointing direction of a pointer in 3D space, said system comprising:
a 3D imaging unit, and
a processing unit coupled to the 3D imaging unit,
wherein the processing unit comprises a determination module that is configured to determine a pointing direction of a pointer of a user that is monitored by the 3D imaging unit when the user has the intention to point to an object, and a calibration module that is configured to generate at least one calibration parameter by comparing the determined pointing direction to the position of the object and to transfer the at least one calibration parameter to the determination module in order to calibrate the determination module,
wherein the determination module is configured to determine the pointing direction of the pointer pointing to the same object for a plurality of times and to determine a peak in a distribution of the determined pointing directions, and the calibration module is configured to generate the at least one calibration parameter by comparing the pointing direction corresponding to the peak in the distribution of the determined pointing directions to the position of the object.
View Dependent Claims (9, 10, 11)
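The peak-of-distribution calibration in claim 8 can be sketched for a single pointing angle; the claim concerns full 3D directions, but a 1D histogram-mode example shows the idea. The bin width, function names, and noise model are illustrative assumptions:

```python
import numpy as np

def peak_direction(samples_deg, bin_width=2.0):
    """Peak (mode) of a distribution of repeatedly measured pointing angles."""
    samples = np.asarray(samples_deg, dtype=float)
    bins = np.arange(samples.min(), samples.max() + bin_width, bin_width)
    hist, edges = np.histogram(samples, bins=bins)
    i = np.argmax(hist)
    return 0.5 * (edges[i] + edges[i + 1])  # centre of the peak bin

def calibration_offset(samples_deg, true_direction_deg):
    """Correction: known direction to the object minus the measured peak."""
    return true_direction_deg - peak_direction(samples_deg)

rng = np.random.default_rng(0)
# the user points at the same object (actually at 30 deg) many times;
# the determination module has a systematic +5 deg bias plus noise
samples = 35.0 + rng.normal(0.0, 1.0, 200)
print(calibration_offset(samples, 30.0))
```

Because many noisy measurements of the same object concentrate around the biased direction, the peak isolates the systematic error that the calibration parameter then cancels.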
12. A system for determining a pointing direction of a pointer in 3D space, said system comprising:
a 3D imaging unit, and
a processing unit coupled to the 3D imaging unit,
wherein the processing unit comprises a determination module that is configured to determine a pointing direction of a pointer of a user that is monitored by the 3D imaging unit when the user has the intention to point to an object, and a calibration module that is configured to generate at least one calibration parameter by comparing the determined pointing direction to the position of the object and to transfer the at least one calibration parameter to the determination module in order to calibrate the determination module;
wherein the 3D imaging unit is configured to capture consecutive images of a scene, wherein at least a first one of the images comprises a 2D intensity image and a depth map, and at least a second one of the images comprises only a 2D intensity image, but not a depth map,
wherein the scene comprises a display showing an element and the determination module is configured to determine the coordinates of the element in a coordinate system associated with the 3D imaging unit and the calibration module is configured to generate at least one further calibration parameter for calibrating the coordinate system associated with the 3D imaging unit by comparing the coordinates of the element in the coordinate system associated with the 3D imaging unit to predetermined coordinates of the element in a fixed coordinate system.
View Dependent Claims (13)
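The further calibration parameter of claim 12 compares where the camera sees a known display element with where that element is known to be. As a sketch, a pure-translation model between the two coordinate systems is assumed (the claim does not restrict the parameter's form), and all names and values are illustrative:

```python
import numpy as np

def extrinsic_offset(observed_cam_xyz, known_fixed_xyz):
    """Translation aligning the camera coordinate system with the fixed one,
    derived from a single display element at a predetermined position."""
    return np.asarray(known_fixed_xyz, float) - np.asarray(observed_cam_xyz, float)

def to_fixed(point_cam_xyz, offset):
    """Map a camera-frame point into the fixed coordinate system."""
    return np.asarray(point_cam_xyz, float) + offset

# element detected at (0.12, 0.30, 0.90) m in the camera frame, while its
# predetermined position in the fixed frame is (0.10, 0.32, 0.95) m
offset = extrinsic_offset((0.12, 0.30, 0.90), (0.10, 0.32, 0.95))
print(offset)
print(to_fixed((0.20, 0.40, 1.00), offset))
```

Once the offset is known, every subsequent camera-frame measurement (e.g. a determined pointing direction's anchor point) can be expressed in the fixed coordinate system.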
14. A system for predicting a touch position of a pointer on a touch-enabled unit, said system comprising:
a 3D imaging unit, wherein a 3D image of a scene captured by the 3D imaging unit comprises a 2D intensity image of the scene and a depth map of the scene;
a processing unit coupled to the 3D imaging unit;
a touch-enabled unit coupled to the processing unit, wherein the 3D imaging unit is configured to monitor an interaction zone in front of the touch-enabled unit, and the processing unit comprises a prediction module that is configured to predict where a pointer that approaches the touch-enabled unit and is monitored by the 3D imaging unit will touch the touch-enabled unit; and
a calibration module that is configured to generate at least one calibration parameter by comparing the predicted touch position to the actual touch position of the pointer detected by the touch-enabled unit and to transfer the at least one calibration parameter to the prediction module to calibrate the prediction module, and
wherein the at least one calibration parameter comprises an offset for the values of the depth map.
View Dependent Claims (15, 16)
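The depth-map offset of claim 14 can be derived from the touch event itself: at the instant the touch-enabled unit reports contact, the pointer tip is on the screen surface, so any disagreement between the measured depth and the known screen depth is sensor bias. The names and the assumption of a known, constant screen depth are illustrative:

```python
import numpy as np

def depth_offset_at_touch(measured_depth_at_touch, screen_depth):
    """Depth bias observed when the touch sensor confirms surface contact."""
    return screen_depth - measured_depth_at_touch

def correct_depth_map(depth_map, offset):
    """Apply the calibration offset to every value of a depth map."""
    return np.asarray(depth_map, float) + offset

# example: camera measures 0.62 m at the touch instant, but the screen
# surface is known to lie at 0.60 m from the camera
offset = depth_offset_at_touch(0.62, 0.60)
corrected = correct_depth_map([[0.62, 0.80], [0.75, 0.90]], offset)
print(offset)
print(corrected)
```

Applying the offset to subsequent depth maps shifts all measured distances so that predicted touch positions land on the true screen plane.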
17. A system for predicting a touch position of a pointer on a touch-enabled unit, said system comprising:
a 3D imaging unit;
a processing unit coupled to the 3D imaging unit;
a touch-enabled unit coupled to the processing unit, wherein the 3D imaging unit is configured to monitor an interaction zone in front of the touch-enabled unit, and the processing unit comprises a prediction module that is configured to predict where a pointer that approaches the touch-enabled unit and is monitored by the 3D imaging unit will touch the touch-enabled unit; and
a calibration module that is configured to generate at least one calibration parameter by comparing the predicted touch position to the actual touch position of the pointer detected by the touch-enabled unit and to transfer the at least one calibration parameter to the prediction module to calibrate the prediction module;
wherein the prediction module is configured to provide coordinates of the predicted touch position in a coordinate system of the 3D imaging unit and the touch-enabled unit is configured to provide coordinates of the actual touch position in a coordinate system of the touch-enabled unit, and
wherein the at least one calibration parameter comprises translating and/or rotating and/or scaling parameters for at least one of the coordinate systems such that a deviation between the coordinates of the predicted touch position in the coordinate system of the 3D imaging unit and the coordinates of the actual touch position in the coordinate system of the touch-enabled unit is minimized.
View Dependent Claims (18, 19, 20)
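The translating/rotating/scaling parameters of claim 17 that minimize the deviation between the two coordinate systems can be found in closed form. The sketch below uses a least-squares similarity-transform fit (an Umeyama-style estimate) on 2D point pairs; the claim does not name this method, and all identifiers are hypothetical:

```python
import numpy as np

def fit_similarity(predicted, actual):
    """Least-squares scale s, rotation R, translation t mapping predicted
    touch coordinates (camera frame) onto actual ones (touch frame).
    `predicted` and `actual` are (N, 2) arrays of corresponding points."""
    P = np.asarray(predicted, float)
    A = np.asarray(actual, float)
    mu_p, mu_a = P.mean(axis=0), A.mean(axis=0)
    Pc, Ac = P - mu_p, A - mu_a
    # cross-covariance between the centred point sets
    U, S, Vt = np.linalg.svd(Ac.T @ Pc / len(P))
    d = np.sign(np.linalg.det(U @ Vt))      # guard against reflections
    D = np.diag([1.0, d])
    R = U @ D @ Vt
    var_p = (Pc ** 2).sum() / len(P)
    s = np.trace(np.diag(S) @ D) / var_p
    t = mu_a - s * R @ mu_p
    return s, R, t

def apply_transform(point, s, R, t):
    """Map a camera-frame prediction into the touch-unit frame."""
    return s * R @ np.asarray(point, float) + t

# synthetic correspondences: actual = 1.1 * rot(10 deg) @ predicted + (5, -3)
theta = np.radians(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
pred = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [50.0, 80.0]])
act = 1.1 * pred @ R_true.T + np.array([5.0, -3.0])
s, R, t = fit_similarity(pred, act)
print(s)
```

On noise-free correspondences the fit recovers the generating scale, rotation, and translation exactly; with real touch data it returns the parameters minimizing the squared deviation, as the claim requires.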
Specification