Application programming interface (API) for sensory events
First Claim
1. An Applications Programming Interface (API), executing from a computer system suitable for use in providing sensory information, comprising:
- a set of programmable touchless API methods, wherein the set of programmable touchless API methods expose sensory information related to an object within a three-dimensional sensory space for rendering onto a Graphical User Interface (GUI), and a low-level driver on the computer system configured to communicate over a wireless communication link to an ultrasonic sensing unit providing the sensory information, wherein the ultrasonic sensing unit:
stores a digital ultrasonic reflection wave in a local memory of the ultrasonic sensing unit, and generates a history of sensory information consisting of Time of Flights (TOFs) and phase differentials calculated from the stored digital ultrasonic reflection wave, including an absolute location and relative movement of the object with respect to an origin of a three-dimensional coordinate system defining the touchless sensory space, wherein the ultrasonic sensing unit contains an on-board digital signal processor (DSP), the local memory and a battery to perform pulse-echo location of the object by way of a transmitter and three or more receivers, wherein a transmitter-receiver pair provides a one-dimensional range measurement, establishes the three-dimensional coordinate system with X, Y and Z principal axes, and wherein the low-level driver receives the history of sensory information for each transmitter and receiver pair from the local memory by the DSP performing precise tracking and angular resolution of the object along range measurement projections of the X, Y and Z principal axes in the three-dimensional coordinate system.
1 Assignment
0 Petitions
Abstract
An Applications Programming Interface (API) provides coordinate and movement information of an object within a sensory field. The API can provide touchless API methods for identifying a position, a displacement, a velocity, an acceleration, and a length of time an object is within a sensory field. The API can include an event listener for receiving at least one sensory event, and an event handler for processing sensory events. A GUI can implement the API to provide touchless navigation and control.
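The abstract describes the API surface only in prose: methods for position, displacement, velocity, and dwell time, plus an event listener/handler pattern for sensory events. A minimal Python sketch of such an interface is below; all names (`SensoryEvent`, `TouchlessAPI`, the method names) are hypothetical illustrations, not identifiers taken from the patent.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class SensoryEvent:
    position: Tuple[float, float, float]  # (x, y, z) in the sensory field
    timestamp: float                      # seconds

class TouchlessAPI:
    """Exposes coordinate and movement information of a tracked object."""

    def __init__(self) -> None:
        self._listeners: List[Callable[[SensoryEvent], None]] = []
        self._history: List[SensoryEvent] = []

    def add_event_listener(self, handler: Callable[[SensoryEvent], None]) -> None:
        """Register an event handler to be called on each sensory event."""
        self._listeners.append(handler)

    def dispatch(self, event: SensoryEvent) -> None:
        """Record the event and notify every registered handler."""
        self._history.append(event)
        for handler in self._listeners:
            handler(event)

    def get_position(self) -> Tuple[float, float, float]:
        return self._history[-1].position

    def get_displacement(self) -> Tuple[float, float, float]:
        (x0, y0, z0) = self._history[0].position
        (x1, y1, z1) = self._history[-1].position
        return (x1 - x0, y1 - y0, z1 - z0)

    def get_velocity(self) -> Tuple[float, ...]:
        """Finite-difference velocity from the last two events."""
        a, b = self._history[-2], self._history[-1]
        dt = b.timestamp - a.timestamp
        return tuple((p1 - p0) / dt for p0, p1 in zip(a.position, b.position))

    def get_dwell_time(self) -> float:
        """Length of time the object has been within the sensory field."""
        return self._history[-1].timestamp - self._history[0].timestamp
```

A GUI would register a handler via `add_event_listener` and translate each event's position into cursor movement; acceleration could be added the same way as velocity, by differencing one more time.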
29 Citations
18 Claims
1. An Applications Programming Interface (API), executing from a computer system suitable for use in providing sensory information, comprising:
a set of programmable touchless API methods, wherein the set of programmable touchless API methods expose sensory information related to an object within a three-dimensional sensory space for rendering onto a Graphical User Interface (GUI), and a low-level driver on the computer system configured to communicate over a wireless communication link to an ultrasonic sensing unit providing the sensory information, wherein the ultrasonic sensing unit: stores a digital ultrasonic reflection wave in a local memory of the ultrasonic sensing unit, and generates a history of sensory information consisting of Time of Flights (TOFs) and phase differentials calculated from the stored digital ultrasonic reflection wave, including an absolute location and relative movement of the object with respect to an origin of a three-dimensional coordinate system defining the touchless sensory space, wherein the ultrasonic sensing unit contains an on-board digital signal processor (DSP), the local memory and a battery to perform pulse-echo location of the object by way of a transmitter and three or more receivers, wherein a transmitter-receiver pair provides a one-dimensional range measurement, establishes the three-dimensional coordinate system with X, Y and Z principal axes, and wherein the low-level driver receives the history of sensory information for each transmitter and receiver pair from the local memory by the DSP performing precise tracking and angular resolution of the object along range measurement projections of the X, Y and Z principal axes in the three-dimensional coordinate system. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
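Claim 1 recites pulse-echo location with a transmitter and three or more receivers, where each transmitter-receiver pair yields a one-dimensional range measurement and the set of ranges fixes the object along X, Y and Z axes. The sketch below illustrates that geometry under assumptions not stated in the claim: a co-located transmitter and receiver (so one-way range is half the round-trip TOF path), and three receivers placed at hypothetical positions (0, 0, 0), (d, 0, 0) and (0, d, 0) so the sphere intersection simplifies to closed form.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumption)

def tof_to_range(tof_round_trip: float) -> float:
    """Pulse-echo: the wave travels transmitter -> object -> receiver, so
    half the round-trip path approximates the one-way range when the
    transmitter and receiver sit close together."""
    return SPEED_OF_SOUND * tof_round_trip / 2.0

def locate(r1: float, r2: float, r3: float, d: float):
    """Intersect three range spheres for receivers at (0,0,0), (d,0,0)
    and (0,d,0); the positive-z root picks the object above the unit.

    From r1^2 = x^2 + y^2 + z^2 and r2^2 = (x-d)^2 + y^2 + z^2 the
    difference is linear in x, and likewise r3 gives y."""
    x = (r1**2 - r2**2 + d**2) / (2.0 * d)
    y = (r1**2 - r3**2 + d**2) / (2.0 * d)
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    return (x, y, z)
```

With a fourth receiver (as "three or more" permits), the system becomes overdetermined and a least-squares fit can absorb measurement noise instead of trusting three exact sphere intersections.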
13. A computer-readable storage medium, comprising computer instructions that implement a touchless applications programming interface (API) comprising a set of programmable touchless API methods and touchless API fields,
wherein the set of programmable touchless API methods programmatically manipulate sensory information related to an object within a three-dimensional sensory space for building a Graphical User Interface (GUI), and a low-level driver on the computer system configured to:
communicate over a wireless communication link to an ultrasonic sensing unit providing the sensory information, wherein the ultrasonic sensing unit: calculates phase differentials between previously received reflected ultrasonic waves stored in a local memory of the ultrasonic sensing unit, tracks a history of Time of Flights weighted by phase differentials for predicting an error estimate to produce a fine location of the object; and
generates sensory information from the tracking of the Time of Flights (TOFs) and phase differentials in the history for each transmitter-receiver pair on the ultrasonic sensing unit, wherein the low-level driver generates an absolute location and relative movement of the object with respect to an origin of a three-dimensional coordinate system defining the touchless sensory space from the sensory information, wherein the ultrasonic sensing unit contains an on-board digital signal processor (DSP), the local memory, ultrasonic sensors and a power supply to perform pulse-echo location of the object by way of ultrasonic sensors comprising a transmitter and three or more receivers, establishes the three-dimensional coordinate system with respect to an approximately symmetric arrangement of the sensors located thereon to create X, Y and Z principal axes of the three-dimensional coordinate system, and stores and references the received ultrasonic wave in the local memory for performing millimeter-accuracy tracking and angular resolution of the object along range measurement projections of the X, Y and Z principal axes in the three-dimensional coordinate system. - View Dependent Claims (14, 15, 16, 17, 18)
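Claim 13 adds a refinement step: phase differentials between successively stored echoes weight a history of TOFs, and the disagreement between the coarse TOF range and the phase-predicted range serves as an error estimate producing a fine location. One way such a scheme could work is sketched below; the 40 kHz carrier, the blending weight, and the class itself are assumptions for illustration, not details from the claim. The key relation is that one full cycle of carrier phase change corresponds to half a wavelength of one-way range change (the round trip doubles the path).

```python
import math

SPEED_OF_SOUND = 343.0    # m/s (assumption)
CARRIER_FREQ = 40_000.0   # Hz, a common ultrasonic transducer frequency (assumption)
WAVELENGTH = SPEED_OF_SOUND / CARRIER_FREQ  # roughly 8.6 mm

class FineRangeTracker:
    """Refine a coarse TOF range with phase differentials between echoes.

    The coarse TOF fixes range to within one sample period; the phase
    differential between successive stored echoes resolves sub-wavelength
    displacement, which accumulates onto the tracked history."""

    def __init__(self, alpha: float = 0.25) -> None:
        self.alpha = alpha        # weight given to the coarse/fine error
        self.range_est = None     # smoothed fine range, metres
        self.prev_phase = None    # carrier phase of the last echo, radians

    def update(self, tof: float, phase: float) -> float:
        coarse = SPEED_OF_SOUND * tof / 2.0  # pulse-echo one-way range
        if self.range_est is None:
            self.range_est, self.prev_phase = coarse, phase
            return self.range_est
        # Wrap the phase differential into (-pi, pi], then convert it to
        # displacement: one carrier cycle = half a wavelength of one-way range.
        dphi = math.atan2(math.sin(phase - self.prev_phase),
                          math.cos(phase - self.prev_phase))
        fine = self.range_est + dphi / (2.0 * math.pi) * (WAVELENGTH / 2.0)
        # The coarse-vs-fine disagreement is the error estimate; blend a
        # fraction of it back so the history cannot drift from the TOFs.
        error = coarse - fine
        self.range_est = fine + self.alpha * error
        self.prev_phase = phase
        return self.range_est
```

Running one such tracker per transmitter-receiver pair and feeding the refined ranges into the sphere-intersection step would yield the sub-wavelength (millimeter-scale) location accuracy the claim describes.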
Specification