Touchless tablet method and system thereof
Abstract
A system (100) and method for a touchless tablet that produces a touchless sensory field over a form (111). The touchless tablet includes a touchless sensing unit (110) for identifying a finger action above the form, and a controller (130) communicatively coupled to the sensing unit for associating the finger action with at least one form component on the form. The touchless tablet identifies a selection of a form component (122) based on a location and action of the finger above the form. A display (140) connected to the touchless tablet can expose a graphical application, wherein a touchless selection of a form component corresponds to a selection of a graphical component on the graphical application.
18 Claims
1. A touchless tablet comprising:
a camera that takes a picture of a form;
a touchless sensing unit operatively coupled to the camera for identifying a touchless finger action above the form within a three-dimensional (3D) ultrasonic sensory space; and
a controller communicatively coupled to said touchless sensing unit, wherein said controller:
determines a location of form components in the picture captured by said camera;
creates a virtual layout of virtual form components in a three-dimensional ultrasonic touchless sensing space based on the location of the form components in the picture;
presents on a display a visual layout of visual form components also based on the location of the form components in the picture;
applies a combinational weighting of a Time of Flight (TOF) ultrasonic distance measurement corresponding to a coarse estimated location of the finger and a differential Time of Flight (dTOF) ultrasonic measurement corresponding to a relative displacement as the finger accelerates or decelerates between a far distance and a close distance;
creates a vector from a first location at the far distance to a second location at the close distance for a forward finger movement, or a vector from the first location at the close distance to the second location at the far distance for a retracting finger movement in the ultrasonic sensory space; and
predicts a destination of the finger on said form from said vector to provide zoom in and zoom out and associates the touchless finger action in the ultrasonic sensory space on at least one virtual form component in the virtual layout of the three-dimensional ultrasonic touchless sensing space with at least one visual form component of the visual layout on said form and presented on the display based on the predicted destination,
wherein said touchless tablet identifies a selection of said virtual form component based on said location and action of said finger in the three-dimensional ultrasonic touchless sensing space.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
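As an illustrative sketch only (not part of the claim language), the combinational TOF/dTOF weighting and the vector-based destination prediction recited in claim 1 could be modeled as follows. All function names, the speed-based weighting scheme, and the assumption that the form lies on the plane z = 0 are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def fuse_position(tof_position, prev_position, dtof_displacement, speed, max_speed=1.0):
    """Blend a coarse absolute TOF position with a dTOF relative displacement.
    The faster the finger accelerates, the more weight the relative (dTOF)
    term receives; at rest the absolute (TOF) term dominates.
    The linear weighting scheme is an illustrative assumption."""
    w = min(abs(speed) / max_speed, 1.0)            # 0 = slow, 1 = fast
    absolute = np.asarray(tof_position, float)      # coarse TOF estimate
    relative = np.asarray(prev_position, float) + np.asarray(dtof_displacement, float)
    return (1.0 - w) * absolute + w * relative

def movement_vector(far_point, close_point, forward=True):
    """Vector from the far location to the close location for a forward
    (push) movement, or close-to-far for a retracting (pull) movement."""
    a, b = (far_point, close_point) if forward else (close_point, far_point)
    return np.asarray(b, float) - np.asarray(a, float)

def predict_destination(start, vector, form_plane_z=0.0):
    """Extend the movement vector until it intersects the form plane
    (assumed at z = form_plane_z) to predict where the finger is headed."""
    start, vector = np.asarray(start, float), np.asarray(vector, float)
    if vector[2] == 0.0:
        return start[:2]                            # moving parallel to the form
    t = (form_plane_z - start[2]) / vector[2]
    return (start + t * vector)[:2]                 # predicted (x, y) on the form
```

For example, a finger at (0, 0, 10) pushing toward (1, 1, 5) yields the vector (1, 1, -5), which extrapolates to the destination (2, 2) on the form plane.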
10. A virtual screen comprising:
a sensing unit containing an arrangement of sensing elements for generating an ultrasonic touchless sensory field representing the virtual screen;
a camera that takes a picture of a form, where the form is thereafter not needed and removed; and
a processor communicatively coupled to said touchless sensing unit and the camera for:
determining a location of form components in the picture captured by said camera;
creating a virtual layout of virtual form components in said ultrasonic touchless sensory field based on the location of the form components in the picture;
presenting on a display a visual layout of visual form components also based on the location of the form components in the picture;
determining a touchless finger location and a touchless finger action within said touchless sensory field; and
associating the touchless finger location and touchless finger action on a virtual form component in the virtual layout with a visual form component in the visual layout, and controlling a graphical application of the visual form component according to said touchless finger location and said touchless finger action by applying a combinational weighting of a Time of Flight (TOF) ultrasonic distance measurement corresponding to a coarse estimated location of the finger and a differential Time of Flight (dTOF) ultrasonic measurement corresponding to a relative displacement as the finger accelerates or decelerates between a far distance and a close distance to the sensing unit.
- View Dependent Claims (11, 12, 13, 14, 15)
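Claim 10's mapping step, placing form components found in the camera picture into a virtual layout and then hit-testing the finger against it, can be sketched as below. The data model, normalized picture coordinates, field dimensions, and the `radius` threshold are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class FormComponent:
    name: str
    x: float  # normalized [0, 1] position in the camera picture
    y: float

def build_virtual_layout(components, field_width, field_height, field_depth):
    """Place each form component detected in the picture at the matching
    (x, y) position in the ultrasonic sensory field; z is the activation
    depth. Geometry is an illustrative assumption."""
    return {
        c.name: (c.x * field_width, c.y * field_height, field_depth)
        for c in components
    }

def hit_test(layout, finger_xyz, radius=20.0):
    """Return the name of the virtual component nearest the finger's (x, y)
    if it lies within `radius` units, else None."""
    fx, fy, _ = finger_xyz
    best, best_d = None, radius
    for name, (x, y, _) in layout.items():
        d = ((fx - x) ** 2 + (fy - y) ** 2) ** 0.5
        if d < best_d:
            best, best_d = name, d
    return best
```

Once the layout is built, the paper form itself is no longer needed (as the claim notes); the sensory field alone carries the component positions.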
16. A method of navigation in a virtual screen comprising:
capturing by way of a camera a picture of a user interface and determining a location of user interface components in the picture;
creating by way of an ultrasonic sensing unit a three-dimensional (3D) sensory space with a virtual layout of virtual user interface components corresponding to the location of user interface components in the picture;
presenting on a display a visual layout of visual user interface components also based on the location of user interface components in the picture;
estimating a time of flight (TOF) between when an ultrasonic pulse is transmitted from a first ultrasonic transducer and when a reflection of said ultrasonic pulse off the finger in said 3D sensory space is received from a plurality of ultrasonic transducers;
estimating a differential time of flight (dTOF) between a first reflected ultrasonic signal and a second reflected ultrasonic signal received from the ultrasonic transducer, for the plurality of ultrasonic transducers;
estimating a finger position within said 3D sensory space corresponding to one of the virtual user interface components in the virtual layout by applying a combinational weighting of the TOF and the dTOF as the finger accelerates and decelerates between a far distance and a close distance, where the TOF corresponds to an estimated location of the finger and the dTOF corresponds to a relative displacement of the finger;
determining one of a forward or retracting finger movement associated with said finger position; and
controlling the visual user interface component in the visual layout corresponding to the virtual user interface component according to said finger movement by adjusting a graphical portion of said user interface to where the finger is pointed with respect to said forward or retracting finger movement.
- View Dependent Claims (17, 18)
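The TOF and dTOF estimation steps of claim 16 reduce to standard pulse-echo ranging arithmetic: a round-trip echo time gives an absolute range, and the difference between two successive echo times from the same transducer gives a relative radial displacement, whose sign distinguishes a forward from a retracting movement. The sketch below assumes still air at roughly 20 °C and an illustrative movement threshold; the function names are not from the patent.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumption)

def tof_distance(t_transmit, t_receive):
    """Round-trip time of flight: the pulse travels to the finger and back,
    so the one-way range is half the acoustic path."""
    return SPEED_OF_SOUND * (t_receive - t_transmit) / 2.0

def dtof_displacement(tof_prev, tof_curr):
    """Differential TOF between two successive echoes from the same
    transducer gives the finger's relative radial displacement
    (negative when the finger approaches the transducer)."""
    return SPEED_OF_SOUND * (tof_curr - tof_prev) / 2.0

def classify_movement(displacement, threshold=1e-3):
    """Forward (push) when the range shrinks, retracting when it grows;
    the 1 mm threshold is an illustrative noise guard."""
    if displacement < -threshold:
        return "forward"
    if displacement > threshold:
        return "retracting"
    return "still"
```

For instance, a 2 ms round trip corresponds to a range of 0.343 m, and a shrinking echo time between pulses classifies as a forward movement.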
Specification