Command of a device by gesture emulation of touch gestures
Abstract
A user to machine interface emulates a touch interface on a screen. The interface is configured to operate in a touch emulation mode based on a triggering event. A triggering event may be a rotation around a first axis by an angle higher than a first threshold. Analysis of the amount of rotation around a second axis may be used to determine the number of fingers defining a specific touch gesture. An infinite variety of touch gestures may therefore be emulated by a remote control based on application context, allowing the touch-screen machine to be used in many ways from a distance.
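The trigger and finger-count analysis described in the abstract can be sketched as follows. The axis assignments (roll as the first axis, pitch as the second) and all threshold values are illustrative assumptions, not taken from the patent:

```python
# Sketch of the triggering and finger-count analysis from the abstract.
# Axis assignments and thresholds are illustrative assumptions.

def touch_emulation_triggered(roll_deg: float, threshold_deg: float = 30.0) -> bool:
    """Trigger: rotation around a first axis exceeds a first threshold."""
    return abs(roll_deg) > threshold_deg

def finger_count(pitch_deg: float) -> int:
    """Map the amount of rotation around a second axis to the number of
    fingers defining the emulated touch gesture."""
    if abs(pitch_deg) < 15.0:
        return 1  # e.g. single-finger tap or swipe
    if abs(pitch_deg) < 30.0:
        return 2  # e.g. pinch or two-finger scroll
    return 3      # three-finger gesture

# A 45-degree roll starts emulation; a 20-degree pitch selects a
# two-finger gesture.
```

Binning a continuous angle into discrete finger counts is one simple way to realize the "analysis of the amount of rotation" the abstract describes; an actual implementation could use any partition of the angle range.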
Claims (19)
1. A user to machine interface comprising:
- at least a remote control with at least one motion sensor,
- a communication link between the remote control and the machine, and
- at least one processor for processing the motion signals from said at least one motion sensor,
wherein said machine runs an operating system configured to receive touch input points when a user touches a screen connected to the machine, said at least one processor being further configured for:
- initiating a touch emulation mode based on a triggering event,
- using the at least one motion sensor to perform a gesture recognition of at least a 3D gesture of a user in the air with the remote control,
- generating at least one emulated touch point corresponding to a touch point on the screen, based on said at least a 3D gesture of a user in the air, and
- sending the at least one emulated touch point to the operating system of the machine to simulate the user touching the screen at the position of the at least one emulated touch point.

(Dependent claims: 2-14.)
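As a rough illustration of the "generating at least one emulated touch point" element of claim 1, the sketch below maps the remote control's pointing orientation, as derived from its motion sensors, to a screen coordinate. The ±45° pointing range, the linear angle-to-pixel mapping, and the screen resolution are assumptions for the example only:

```python
from dataclasses import dataclass

@dataclass
class EmulatedTouchPoint:
    """A touch input point as the operating system would receive it
    from a real touch screen."""
    x: int
    y: int
    pressed: bool

def gesture_to_touch_point(yaw_deg: float, pitch_deg: float,
                           screen_w: int = 1920, screen_h: int = 1080) -> EmulatedTouchPoint:
    """Map remote-control orientation to a screen position, producing one
    emulated touch point. The +/-45 degree field and linear mapping are
    assumed, not specified by the claim."""
    x = int((yaw_deg + 45.0) / 90.0 * (screen_w - 1))
    y = int((pitch_deg + 45.0) / 90.0 * (screen_h - 1))
    # Clamp so out-of-range gestures still land on valid pixels.
    x = max(0, min(screen_w - 1, x))
    y = max(0, min(screen_h - 1, y))
    return EmulatedTouchPoint(x=x, y=y, pressed=True)

pt = gesture_to_touch_point(0.0, 0.0)
# Pointing straight ahead maps to the center of the screen.
```

A multi-finger gesture would generate several such points per frame, offset from this anchor position, before they are handed to the operating system.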
15. A method of operating a user to machine interface comprising:
- moving in space, by at least a user, a remote control with at least one motion sensor,
- using a communication link between the remote control and the machine, and
- processing signals from the at least one motion sensor,
wherein said machine runs an operating system configured to receive touch input points when a user touches a screen connected to the machine, said method further comprising:
- initiating a touch emulation mode based on a triggering event,
- using the at least one motion sensor to perform a gesture recognition of at least a 3D gesture imparted by a user in the air to the remote control,
- generating at least one emulated touch point corresponding to a touch point on the screen based on said at least a 3D gesture imparted by a user in the air, and
- sending the at least one emulated touch point to the operating system of the machine to simulate the user touching the screen at the position of the at least one emulated touch point.

(Dependent claims: 16-19.)
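The final "sending" step of the claimed method can be sketched against a stand-in for the operating system's touch-input interface. The `FakeTouchDriver` below is hypothetical: a real system would use a platform-specific injection mechanism (for instance a virtual touch-input device on Linux, or the touch-injection facilities of Windows or Android), which this queue only models:

```python
import queue

class FakeTouchDriver:
    """Hypothetical stand-in for the OS touch-input interface.
    Only models the 'send emulated touch point to the OS' step."""
    def __init__(self) -> None:
        self.events: "queue.Queue[tuple[int, int, bool]]" = queue.Queue()

    def inject(self, x: int, y: int, pressed: bool) -> None:
        # A real driver would deliver this to the OS input stack.
        self.events.put((x, y, pressed))

def emulate_tap(driver: FakeTouchDriver, x: int, y: int) -> None:
    """Simulate the user touching the screen at (x, y): press, then release."""
    driver.inject(x, y, True)
    driver.inject(x, y, False)

driver = FakeTouchDriver()
emulate_tap(driver, 400, 300)
# The OS-side queue now holds a press and a release at (400, 300).
```

Because the operating system receives these events through the same path as real touch input, applications need no modification to be controlled by the remote, which is the point of the emulation approach.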
Specification