Virtual user interface method and system thereof
Abstract
A virtual user interface (VUI) is provided. The VUI (120) can include a touchless sensing unit (110) for identifying and tracking at least one object in a touchless sensory field, a processor (130) communicatively coupled to the sensing unit for capturing a movement of the object within the touchless sensory field, and a driver (132) for converting the movement to a coordinate object (133). In one aspect, the VUI can implement an applications program interface (134) for receiving the coordinate object and providing the coordinate object to the virtual user interface (VUI). An object movement within the sensory field of the VUI can activate user components in a User Interface (150).
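The coordinate object (133) is the data the driver (132) hands to the applications program interface (134). The patent does not fix its layout; the following is a minimal sketch, with invented field names, of what such an object could carry given the separate 2D <x,y> navigation control and 1D <z> user event control recited in the claims.

```python
from dataclasses import dataclass

@dataclass
class CoordinateObject:
    """Hypothetical coordinate object; field names are illustrative only."""
    x: float            # 2D <x,y> navigation control, VUI coordinate space
    y: float
    z: float            # 1D <z> user event control (forward/retract depth)
    dwell_ms: float     # length of time the finger has held its position
    dx: float = 0.0     # relative displacement since the previous frame
    dy: float = 0.0
    dz: float = 0.0
```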
Claims
1. A virtual user interface (VUI) that maps touchless and non-visible virtual components in a three-dimensional ultrasonic touchless sensing field to touchable and visible user components in a Graphical User Interface (GUI) of a differing size managed by a computing device, and translates touchless finger actions applied to the virtual non-visible components in the VUI to actions on the touchable user components in the GUI, where the VUI comprises:
an ultrasonic sensing unit that generates the 3D ultrasonic touchless sensory field providing separate 2D <x,y> navigation control and 1D <z> user event control;
a processor communicatively coupled to the sensing unit that identifies and tracks a finger movement within the ultrasonic touchless sensory field;
a timer that determines a length of time the finger is at a position nearest to a VUI component during 2D <x,y> navigation control and prior to 1D <z> user event control;
a coordinator to adjust a sensitivity of the ultrasonic touchless sensing field at a VUI component associated with the finger position and length of time;
a driver communicatively coupled to the processor for converting the finger movement to a coordinate object that includes the navigation control, user event control, and length of time, and providing the coordinate object to the computing device; and
an applications program interface (API) running on the computing device and communicatively coupled to the driver that:
exposes programmable methods and variables to provide call control of the coordinate object to the UI for handling sensory events in the ultrasonic touchless sensory field; and
provides a visual indicator that expands or shrinks a GUI component along its boundary mapped to the VUI component according to the adjusted sensitivity and mapping;
wherein the touchless sensing unit:
emits a plurality of ultrasonic pulses from a first ultrasonic transmitter configured to transmit the ultrasonic pulses;
estimates, for a plurality of ultrasonic receivers, a time of flight between transmitting one of the ultrasonic pulses and receiving a reflected ultrasonic signal corresponding to a reflection off the finger;
calculates, for the plurality of ultrasonic receivers, a phase differential between the reflected ultrasonic signal and a previously received reflected ultrasonic signal; and
determines a location and relative displacement of the finger from a mathematical weighting of said time of flight by said phase differential for mapping the virtual components of the VUI to the user components of the UI.
Dependent claims: 2, 3, 4, 5, 6.
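Claim 1 leaves the "mathematical weighting of said time of flight by said phase differential" unspecified. One hedged reading, sketched below with invented names and an assumed 40 kHz carrier, fuses the coarse time-of-flight range with the fine phase-derived displacement in a complementary filter; the resulting per-receiver ranges would then feed a trilateration step to recover the <x,y,z> finger location.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 C
CARRIER_HZ = 40_000.0    # assumed ultrasonic carrier frequency

def echo_range(tof_s: float) -> float:
    # Coarse range to the finger: round-trip time of flight, halved.
    return tof_s * SPEED_OF_SOUND / 2.0

def phase_displacement(phase_now: float, phase_prev: float) -> float:
    # Fine range change between two successive echoes at one receiver.
    # A displacement d changes the round-trip path by 2d, so
    # d = dphi / (2*pi) * wavelength / 2.
    wavelength = SPEED_OF_SOUND / CARRIER_HZ             # ~8.6 mm at 40 kHz
    dphi = math.atan2(math.sin(phase_now - phase_prev),
                      math.cos(phase_now - phase_prev))  # wrap to (-pi, pi]
    return dphi / (2.0 * math.pi) * wavelength / 2.0

def fused_range(prev_range: float, tof_s: float,
                phase_now: float, phase_prev: float,
                alpha: float = 0.98) -> float:
    # Complementary filter: the phase term tracks fine motion, the coarse
    # TOF range corrects slow drift. alpha is an assumed tuning constant.
    fine = prev_range + phase_displacement(phase_now, phase_prev)
    return alpha * fine + (1.0 - alpha) * echo_range(tof_s)
```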
7. A method for supplying a coordinate object to a computing device and controlling at least a portion of a three-dimensional (3D) ultrasonic user interface (UI) managed by the computing device, the method comprising the steps of:
detecting a generally stationary finger and a length of time the finger is at a location nearest to a Virtual User Interface (VUI) component in the 3D ultrasonic UI;
adjusting a sensitivity of the VUI component in the 3D ultrasonic UI according to the length of time at the finger position to magnify a corresponding Graphical User Interface (GUI) component on the computing device;
detecting thereafter a forward and retracting touchless finger action applied to the VUI component at the location in the 3D ultrasonic UI generated by an ultrasonic sensing unit providing separate 2D <x,y> navigation control and 1D <z> user event control;
converting the finger action to a coordinate object that includes the navigation control, user event control, and length of time, and communicating the coordinate object to the computing device by way of a driver implemented on the computing device that communicates with the ultrasonic sensing unit;
providing a visual indicator that expands or shrinks the GUI component along its boundary according to the coordinate object and mapping of the adjusted VUI component size; and
controlling at least a portion of the GUI using the coordinate object by way of an Applications Programming Interface (API) on the computing device that communicates with the driver through native functions and methods of the ultrasonic sensing unit to implement sensitivity adjustment;
wherein the converting further includes translating a coordinate space of the touchless sensory field to a coordinate space of the UI, and wherein the coordinate object identifies at least one among an absolute location, a relative difference, a velocity, and an acceleration of the finger in the touchless sensory field, where the touchless sensing unit:
emits a sequence of ultrasonic pulses from a first ultrasonic transmitter configured to transmit the ultrasonic pulses;
estimates, for a plurality of ultrasonic receivers, a time of flight between transmitting one of the ultrasonic pulses and receiving a reflected ultrasonic signal corresponding to a reflection off the finger;
calculates, for the plurality of ultrasonic receivers, a phase differential between a first reflected ultrasonic signal at a first time and a second reflected ultrasonic signal at a second time; and
determines a location and relative displacement of the finger from a mathematical weighting of said time of flight by said phase differential for mapping the virtual components of the VUI to the user components of the UI.
Dependent claims: 8, 9, 10, 11.
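Claim 7's dwell-then-magnify behavior (the timer combined with the sensitivity adjustment) can be pictured as a small per-frame state machine. The sketch below is illustrative only; the class name, growth rate, and scale cap are all assumptions, not taken from the patent.

```python
import time

class DwellMagnifier:
    """The longer the finger hovers near one VUI component, the larger the
    mapped GUI component grows, up to a cap. All constants are guesses."""

    def __init__(self, grow_per_s: float = 0.5, max_scale: float = 2.0):
        self.grow_per_s = grow_per_s      # scale gained per second of dwell
        self.max_scale = max_scale        # upper bound on magnification
        self._component = None
        self._dwell_start = None

    def update(self, nearest_component, now: float = None):
        """Call once per tracking frame with the VUI component nearest the
        finger; returns (dwell_seconds, scale) for driving the GUI boundary."""
        now = time.monotonic() if now is None else now
        if nearest_component is not self._component or self._dwell_start is None:
            self._component = nearest_component
            self._dwell_start = now       # finger moved on: restart the timer
        dwell = now - self._dwell_start
        scale = min(1.0 + self.grow_per_s * dwell, self.max_scale)
        return dwell, scale

# Usage: one update() per frame.
mag = DwellMagnifier()
print(mag.update("ok_button", now=0.0))   # (0.0, 1.0)
print(mag.update("ok_button", now=1.0))   # (1.0, 1.5)
```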
12. A communication device for presenting a touchless virtual user interface (VUI), the communication device having a controlling element that:
receives a coordinate object from a touchless ultrasonic sensing unit and controls at least a portion of a user interface (UI) using the coordinate object; and
exposes an open-language programmable Applications Programming Interface (API) with methods and variables for coding portability to provide call control to the UI for handling touchless sensory events in the VUI and accessing the coordinate object through a driver that communicates with native functions and methods of the touchless ultrasonic sensing unit;
wherein the controlling element controls at least one user component in the UI in accordance with touchless finger movements applied to at least one virtual component in the touchless virtual user interface (VUI);
wherein the controlling element is a Digital Signal Processor (DSP) that:
emits a plurality of ultrasonic pulses from a first ultrasonic transmitter;
estimates a time of flight between when said ultrasonic pulse was transmitted from said first ultrasonic transmitter and when a reflection of said ultrasonic pulse off the finger in said three-dimensional space was received at a plurality of ultrasonic receivers;
estimates a phase differential between a first reflected ultrasonic signal and a second reflected ultrasonic signal, both received from a same ultrasonic receiver, for the plurality of ultrasonic receivers; and
determines a location and relative displacement of the finger from a mathematical weighting of said time of flight by said phase differential, and generates the coordinate object for mapping the virtual components of the VUI to the user components of the UI, the coordinate object identifying at least one among an absolute location, a relative difference, a velocity, a length of time, and an acceleration of a finger producing the touchless finger movements for controlling at least a portion of the UI.
Dependent claims: 13, 14, 15, 16, 17.
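Claim 12's open-language programmable API with call control might look like a thin event pump over the driver. Everything below (the class and method names, the read_coordinate_object driver call, the push threshold) is assumed rather than taken from the patent; the sketch reuses the CoordinateObject fields outlined after the abstract.

```python
from typing import Callable, List

class TouchlessAPI:
    """Hedged sketch of a claim-12-style API surface. A GUI toolkit codes
    against these methods while the driver handles the sensing unit's
    native functions underneath."""

    PUSH_Z = 0.03   # metres; assumed <z> depth threshold for a push

    def __init__(self, driver):
        self._driver = driver
        self._nav_handlers: List[Callable] = []
        self._select_handlers: List[Callable] = []
        self._pushed = False

    def on_navigate(self, handler: Callable) -> None:
        """Register for 2D <x,y> navigation-control events."""
        self._nav_handlers.append(handler)

    def on_select(self, handler: Callable) -> None:
        """Register for 1D <z> user events (forward-and-retract select)."""
        self._select_handlers.append(handler)

    def poll(self) -> None:
        """Pump one coordinate object from the driver to the handlers."""
        obj = self._driver.read_coordinate_object()   # assumed driver call
        if obj is None:
            return
        for handler in self._nav_handlers:
            handler(obj.x, obj.y, obj.dwell_ms)
        pushed_now = obj.z < self.PUSH_Z
        if self._pushed and not pushed_now:           # retract ends the push
            for handler in self._select_handlers:
                handler(obj.x, obj.y)
        self._pushed = pushed_now
```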
Specification