SYSTEM AND METHOD FOR PREDICTING A TOUCH POSITION OF A POINTER ON A TOUCH-ENABLED UNIT OR DETERMINING A POINTING DIRECTION IN 3D SPACE
Abstract
A system for predicting a touch position of a pointer on a touch-enabled unit includes a 3D imaging unit, a processing unit coupled to the 3D imaging unit, and a touch-enabled unit coupled to the processing unit. The 3D imaging unit is configured to monitor an interaction zone in front of the touch-enabled unit. The processing unit includes a prediction module that is configured to predict where a pointer that approaches the touch-enabled unit and is monitored by the 3D imaging unit will touch the touch-enabled unit, and a calibration module that is configured to generate at least one calibration parameter by comparing the predicted touch position to the actual touch position of the pointer detected by the touch-enabled unit and to transfer the at least one calibration parameter to the prediction module in order to calibrate the prediction module.
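The abstract describes two cooperating steps: extrapolating the tracked pointer trajectory to the touch surface, and folding the prediction error back in as a calibration parameter. A minimal sketch of that idea in Python, assuming (purely for illustration; the patent prescribes neither) a linear-extrapolation prediction model and a 2D offset as the calibration parameter:

```python
import numpy as np

def predict_touch(samples, dt):
    """Extrapolate 3D pointer samples to the touch surface at z = 0.

    samples: sequence of recent pointer positions (x, y, z), ordered in
    time, with z the distance to the touch-enabled unit.
    dt: sampling interval in seconds.
    Returns the predicted (x, y) touch position.
    """
    p = np.asarray(samples, dtype=float)
    v = (p[-1] - p[-2]) / dt            # latest velocity estimate
    if v[2] >= 0:                       # pointer not approaching the surface
        return p[-1, :2]
    t_hit = -p[-1, 2] / v[2]            # time until z reaches 0
    return p[-1, :2] + v[:2] * t_hit    # extrapolated touch point

def update_calibration(offset, predicted, actual, alpha=0.2):
    """Blend the latest prediction error into a running 2D offset."""
    error = np.asarray(actual, float) - np.asarray(predicted, float)
    return (1 - alpha) * np.asarray(offset, float) + alpha * error
```

The linear model and the exponential blending weight `alpha` are placeholder choices; any trajectory model whose output can be compared against the detected touch position would fit the same structure.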
Claims (15)
1. A system (10) for predicting a touch position of a pointer (21) on a touch-enabled unit (13), said system comprising:
a 3D imaging unit (11);
a processing unit (12) coupled to the 3D imaging unit (11); and
a touch-enabled unit (13) coupled to the processing unit (12), wherein the 3D imaging unit (11) is configured to monitor an interaction zone in front of the touch-enabled unit (13), and the processing unit (12) comprises a prediction module (14) that is configured to predict where a pointer (21) that approaches the touch-enabled unit (13) and is monitored by the 3D imaging unit (11) will touch the touch-enabled unit (13); and
a calibration module (15) that is configured to generate at least one calibration parameter (31) by comparing the predicted touch position (28) to the actual touch position (29) of the pointer (21) detected by the touch-enabled unit (13) and to transfer the at least one calibration parameter (31) to the prediction module (14) to calibrate the prediction module (14).
(Dependent claims 2-8 not shown.)
9. A method for predicting a touch position of a pointer (21) on a touch-enabled unit (13), said method comprising:
monitoring an interaction zone in front of the touch-enabled unit (13) by using a 3D imaging unit (11);
predicting, by using a prediction method, where a pointer (21) that approaches the touch-enabled unit (13) and is monitored by the 3D imaging unit (11) will touch the touch-enabled unit (13); and
generating at least one calibration parameter (31) by comparing the predicted touch position (28) to the actual touch position (29) of the pointer (21) detected by the touch-enabled unit (13), and using the at least one calibration parameter (31) to calibrate the prediction method.
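The three steps of claim 9 form a closed loop: each detected touch supplies ground truth against which the preceding prediction is scored. A hedged sketch of that loop, assuming for illustration only that the calibration parameter (31) is a 2D correction offset updated with an exponential moving average (the claim does not fix a particular parameter form):

```python
class TouchPredictor:
    """Illustrative loop for the claim-9 method: predict a touch,
    compare against the detected touch, update a calibration offset."""

    def __init__(self, alpha=0.25):
        self.alpha = alpha              # blending weight (assumed, not from the claim)
        self.calibration = (0.0, 0.0)   # calibration parameter (31): a 2D offset

    def predict(self, raw_xy):
        """Apply the current calibration to a raw model prediction."""
        cx, cy = self.calibration
        return (raw_xy[0] + cx, raw_xy[1] + cy)

    def calibrate(self, predicted_xy, actual_xy):
        """Blend the latest prediction error into the calibration offset."""
        ex = actual_xy[0] - predicted_xy[0]
        ey = actual_xy[1] - predicted_xy[1]
        cx, cy = self.calibration
        self.calibration = ((1 - self.alpha) * cx + self.alpha * ex,
                            (1 - self.alpha) * cy + self.alpha * ey)
```

For example, if the raw model predicts (10, 10) but the touch-enabled unit reports (12, 10), the next prediction at (10, 10) is shifted toward the observed error.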
10. A system (50) for determining a pointing direction of a pointer (21) in 3D space, said system comprising:
a 3D imaging unit (11); and
a processing unit (51) coupled to the 3D imaging unit (11),
wherein the processing unit (51) comprises a determination module (52) that is configured to determine a pointing direction (55) of a pointer (21) of a user that is monitored by the 3D imaging unit (11) when the user has the intention to point to an object (54), and a calibration module (53) that is configured to generate at least one calibration parameter (56) by comparing the determined pointing direction (55) to the position of the object (54) and to transfer the at least one calibration parameter (56) to the determination module (52) in order to calibrate the determination module (52).
(Dependent claims 11-14 not shown.)
15. A method for determining a pointing direction of a pointer (21) in 3D space, said method comprising:
monitoring a pointer (21) of a user by a 3D imaging unit (11) when the user has the intention to point to an object (54);
determining a pointing direction (55) of the pointer (21) by using a determination method; and
generating at least one calibration parameter (56) by comparing the determined pointing direction (55) to the position of the object (54) and using the at least one calibration parameter (56) to calibrate the determination method.
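Comparing the determined direction (55) against the known position of the object (54) naturally yields a rotational correction. A sketch in Python, assuming the pointer is tracked as two 3D points and modeling the calibration parameter (56), for illustration only, as a rotation matrix obtained via Rodrigues' formula (the claim does not specify the parameter's form):

```python
import numpy as np

def pointing_direction(base, tip):
    """Unit direction of the pointer from two tracked 3D points."""
    d = np.asarray(tip, float) - np.asarray(base, float)
    return d / np.linalg.norm(d)

def calibration_rotation(determined, toward_object):
    """Rotation matrix mapping the determined pointing direction onto
    the true direction toward the object (Rodrigues' formula).

    Assumes the two directions are not exactly opposite, in which case
    the axis of rotation would be undefined.
    """
    a = np.asarray(determined, float)
    a = a / np.linalg.norm(a)
    b = np.asarray(toward_object, float)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)                  # rotation axis (unnormalized)
    c = np.dot(a, b)                    # cosine of the angle between them
    if np.isclose(c, 1.0):              # already aligned: identity correction
        return np.eye(3)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + K + K @ K / (1.0 + c)
```

Once stored by the determination module, applying this rotation to each subsequently determined direction would correct the systematic pointing error.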
Specification