Touchless user interface for handheld and wearable computers
Abstract
A user interface includes both a touchscreen for tactile input and one or more lensless optical sensors for sensing additional, remote gestures. Users can interact with the user interface in a volume of space near the display, and are thus not constrained to the relatively small area of the touchscreen. Remote hand or face gestures can be used to turn on or otherwise alter the tactile user interface. Shared user interfaces can operate without touch, and thus avoid cross-contamination by, e.g., viruses and bacteria.
11 Claims
1. A wearable computer comprising:
a display to present an image;
a touchscreen to sense tactile input from a human finger;
an optical grating having adjacent features that diffract incident light reflected off the human finger into curtains of minimum intensity separated by foci;
an image sensor in a path of the curtains and the foci, the curtains and foci extending to the image sensor, the image sensor to capture patterns of the curtains and the foci; and
at least one processor to detect a remote gesture from the patterns of the curtains and the foci captured by the image sensor, to alter the image responsive to the detected remote gesture, to detect a tactile gesture from the sensed tactile input, and to alter the image responsive to the detected tactile gesture.
Dependent claims: 2, 3, 4, 5, 6
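Claim 1's processor handles two parallel input paths: tactile gestures decoded from the touchscreen and remote gestures decoded from the captured curtain/foci patterns, with either path able to alter the displayed image. The claim recites functions, not an implementation, so the following is only a minimal sketch of that dual-path dispatch; every class, method, and state name here is hypothetical.

```python
# Illustrative sketch only: models claim 1's two input paths feeding one
# displayed image. All names below are hypothetical, not from the patent.

class WearableUI:
    """One displayed image, altered by either tactile or remote gestures."""

    def __init__(self):
        self.image = "clock_face"  # the presented image (placeholder state)

    def on_tactile_gesture(self, gesture):
        """Tactile path: gesture detected from touchscreen input."""
        if gesture == "tap":
            self.image = "menu"

    def on_remote_gesture(self, gesture):
        """Remote path: gesture detected from curtain/foci patterns."""
        if gesture == "wave":
            self.image = "clock_face"  # e.g., dismiss without touching

ui = WearableUI()
ui.on_tactile_gesture("tap")   # image becomes "menu"
ui.on_remote_gesture("wave")   # image reverts to "clock_face"
```

The point of the sketch is the symmetry the claim requires: both detection paths terminate in the same operation, altering the presented image.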
7. A method comprising:
diffracting incident light from a human finger using an optical phase grating that diffracts the incident light into curtains of minimum intensity separated by foci to produce an interference pattern;
capturing the interference pattern with a photodetector array, the curtains of minimum intensity and the foci extending to the photodetector array;
detecting a remote gesture from the captured interference pattern;
displaying an image responsive to the detected remote gesture;
detecting a tactile gesture on the image; and
altering the image responsive to the tactile gesture.
Dependent claims: 8, 9, 10, 11
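Claim 7's central step, detecting a remote gesture from captured interference patterns, is recited without an algorithm. One simple (assumed) approach is to track how the intensity centroid of successive captured patterns drifts as the finger moves, and classify the drift as a swipe; the sketch below illustrates that idea on synthetic frames, and none of its function names or thresholds come from the patent.

```python
# Illustrative sketch only: claim 7 recites steps, not algorithms. Tracking
# the intensity centroid of successive interference patterns is one assumed
# way finger motion could register as a remote swipe.

def centroid_x(frame):
    """X-coordinate of the intensity centroid of one captured pattern."""
    total = sum(sum(row) for row in frame)
    weighted = sum(x * v for row in frame for x, v in enumerate(row))
    return weighted / total

def detect_swipe(frames, threshold=1.0):
    """Classify a left/right swipe from centroid drift across frames."""
    shift = centroid_x(frames[-1]) - centroid_x(frames[0])
    if shift > threshold:
        return "swipe_right"
    if shift < -threshold:
        return "swipe_left"
    return None

# Synthetic frames: a bright blob (the finger's diffraction signature)
# moving rightward across a one-row, eight-pixel photodetector array.
def blob(cx, width=8):
    return [[1.0 if abs(x - cx) <= 1 else 0.0 for x in range(width)]]

frames = [blob(1), blob(3), blob(5)]
print(detect_swipe(frames))  # swipe_right
```

The detected gesture would then drive the claim's remaining steps (displaying and altering the image), as in the device of claim 1.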
Specification