Light-based finger gesture user interface
First Claim
1. A light-based finger gesture user interface for an electronic device comprising:
- a housing for an electronic device comprising inner sidewalls extending from a bottom opening to a top opening to form a cavity;
- a touch screen display mounted in said housing such that the whole cavity is disposed along a side of the display, separate from the display;
- a detector mounted in said inner sidewalls, comprising a plurality of light emitters that project light beams into the cavity and a plurality of light receivers that measure intensities of blocked and unblocked ones of the projected light beams; and
- a processor connected to said detector and to said display for generating touch position data in response to a first object touching said display based on output received from said display, for generating cavity insertion data in response to a second object, different than the first object, being inside the cavity based on output received from said detector, and for combining the touch position data and the cavity insertion data into input for the electronic device, wherein the cavity insertion data includes location and orientation of the second object inside the cavity.
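The claim's detection step can be illustrated with a sketch. The assumptions here are hypothetical and not taken from the patent: the cavity is crossed by two parallel layers of light beams; an inserted object blocks a contiguous run of beams in each layer, the midpoint of each run gives its position, and the offset between the two layers' midpoints gives a tilt (orientation) estimate. All names, the beam pitch, and the layer gap are illustrative.

```python
# Hypothetical sketch of deriving "location and orientation of the
# second object inside the cavity" from blocked/unblocked beam
# intensities measured at two beam layers.
import math

BEAM_PITCH_MM = 2.0   # assumed spacing between adjacent beams
LAYER_GAP_MM = 8.0    # assumed gap between the two beam layers

def blocked_midpoint(intensities, threshold=0.5):
    """Midpoint (in beam-index units) of the blocked run, or None."""
    blocked = [i for i, v in enumerate(intensities) if v < threshold]
    if not blocked:
        return None
    return (blocked[0] + blocked[-1]) / 2.0

def locate(top_layer, bottom_layer):
    """Estimate position (mm) and tilt (degrees) of an inserted object."""
    top = blocked_midpoint(top_layer)
    bottom = blocked_midpoint(bottom_layer)
    if top is None or bottom is None:
        return None  # nothing inserted, or only partially inserted
    position_mm = bottom * BEAM_PITCH_MM
    tilt_rad = math.atan2((top - bottom) * BEAM_PITCH_MM, LAYER_GAP_MM)
    return position_mm, math.degrees(tilt_rad)

# A finger blocking beams 3-5 on the top layer and 4-6 on the bottom:
top = [1, 1, 1, 0, 0, 0, 1, 1]
bottom = [1, 1, 1, 1, 0, 0, 0, 1]
print(locate(top, bottom))
```

In this sketch a straight insertion blocks the same beams in both layers (zero tilt), while a slanted insertion offsets the two midpoints, which is what lets a single cavity report orientation as well as location.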
Abstract
A light-based finger gesture user interface for an electronic device including a housing for an electronic device, a display mounted in the housing, a cavity, separated from the display, penetrating two opposite sides of the housing, a detector mounted in the housing operative to detect an object inserted in the cavity, and a processor connected to the detector and to the display for causing the display to render a visual representation in response to output from the detector.
30 Claims
1. A light-based finger gesture user interface for an electronic device (set forth in full under First Claim above). - View Dependent Claims (2, 3, 4, 5, 6, 7)
8. A light-based finger gesture user interface for an electronic device comprising:
- a housing for an electronic device comprising inner sidewalls extending from a bottom opening to a top opening to form a cavity;
- a touch screen display mounted in said housing for rendering output generated by an application program running on the electronic device, wherein the whole cavity is disposed along a side of the display, separate from the display;
- a sensor mounted in said inner sidewalls, comprising a plurality of light emitters that project light beams into the cavity and a plurality of light receivers that measure intensities of blocked and unblocked ones of the projected light beams; and
- a processor in said housing connected to said display and to said sensor, for executing the application program, for generating touch position data in response to a first object touching said display based on output received from said display, for generating cavity insertion data in response to a second object, different than the first object, being inside the cavity based on output received from said sensor, and for combining the touch position data and the cavity insertion data into input for the application program wherein the cavity insertion data includes location and orientation of the second object inside the cavity. - View Dependent Claims (9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30)
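Claim 8's final element, combining touch position data and cavity insertion data into input for the application program, can also be sketched. Everything below is an illustrative assumption, not the patent's implementation: the event structure, the `to_app_command` mapping, and the choice of a zoom gesture (touch picks the zoom centre, the cavity object's tilt sets the factor) are all hypothetical.

```python
# Hypothetical sketch: merge touch data (first object, on the display)
# with cavity insertion data (second object, in the cavity) into one
# command for the application program.
from dataclasses import dataclass

@dataclass
class CombinedInput:
    touch_xy: tuple           # where the first object touches the display
    cavity_pos_mm: float      # location of the second object in the cavity
    cavity_tilt_deg: float    # orientation of the second object

def to_app_command(event, sensitivity=0.02):
    """One plausible mapping: touch selects the zoom centre and the
    cavity object's tilt controls the zoom factor."""
    return {
        "action": "zoom",
        "center": event.touch_xy,
        "factor": 1.0 + sensitivity * event.cavity_tilt_deg,
    }

cmd = to_app_command(CombinedInput((120, 340), 10.0, -14.0))
print(cmd)
```

The point of the sketch is the separation the claim draws: the two input streams come from different sensors (the touch screen and the cavity's emitter/receiver array) and from different objects, yet reach the application as a single combined event.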
Specification