Data input device
Abstract
A data input device and method including an illuminator (106) operative to illuminate at least one engagement plane (102) by directing light along the at least one engagement plane, a two-dimensional imaging sensor (112) viewing the at least one engagement plane from a location outside the at least one engagement plane for sensing light from the illuminator (106) scattered by engagement of a data entry object (110) with the at least one engagement plane and a data entry processor receiving an output from the two-dimensional imaging sensor (112) and providing a data input to utilization circuitry.
133 Claims
-
1. A data input device comprising:
-
an illuminator operative to illuminate at least one engagement plane by directing light along said at least one engagement plane;
a two-dimensional imaging sensor, including multiple rows of detector arrays, viewing said at least one engagement plane at a non-zero angle from a location outside said at least one engagement plane for sensing light from said illuminator scattered by engagement of a data entry object with said at least one engagement plane; and
a data entry processor receiving an output from said two-dimensional imaging sensor and providing a data entry input to utilization circuitry.
- View Dependent Claims (2-66, 74)
-
2. A data input device according to claim 1 and also comprising a data entry matrix projector operative to project at least one visually sensible data entry matrix onto a projection surface underlying said at least one engagement plane.
-
3. A data input device according to claim 2 and wherein said at least one visually sensible data entry matrix defines a keyboard.
-
4. A data input device according to claim 2 and wherein said projector comprises:
-
a projector light source; and a spatial light modulation element operative to receive light from said projector light source and to project at least one visually sensible data entry matrix onto a surface underlying said at least one engagement plane.
-
5. A data input device according to claim 4 and wherein said spatial light modulation element comprises a diffractive optical element.
-
6. A data input device according to claim 4 and wherein said spatial light modulation element comprises a transparency.
-
7. A data input device according to claim 1 and wherein said two-dimensional imaging sensor comprises a solid state imaging sensor.
-
8. A data input device according to claim 2 and wherein said data entry processor correlates said output from said two-dimensional imaging sensor with said at least one visually sensible data entry matrix.
-
9. A data input device according to claim 2 and wherein said data entry matrix projector comprises a diffractive optical element which receives light from a diode laser via a collimating lens.
-
10. A data input device according to claim 9 and wherein light passing through said diffractive optical element is reflected by a curved mirror having optical power via a lens onto said projection surface.
-
11. A data input device according to claim 10 and wherein said diffractive optical element, said mirror and said lens are all integrally formed in a prism.
-
12. A data input device according to claim 2 and wherein said data entry matrix projector comprises an integrally formed beam splitter and diffractive optical elements.
-
13. A data input device according to claim 12 and wherein in said data entry matrix projector, a beam of light from a diode laser passes through a collimating lens and impinges on two mutually angled surfaces of said beam splitter, which breaks the beam of light into two beams, each of which passes through a separate diffractive optical element and impinges on said projection surface.
-
14. A data input device according to claim 12 and wherein said diffractive optical elements are integrally formed with said beam splitter in a prism.
-
15. A data input device according to claim 13 and wherein said diffractive optical elements are integrally formed with said beam splitter in a prism.
-
16. A data input device according to claim 2 and wherein said data entry matrix projector comprises a plurality of different diffractive optical elements, each of which typically corresponds to a different matrix configuration, which are selectably positionable along a projection light path.
-
17. A data input device according to claim 2 and wherein said data entry matrix projector comprises a diffractive optical element having a multiplicity of diffraction orders selected to provide a matrix configuration which has a relatively low maximum diffraction angle.
-
18. A data input device according to claim 2 and wherein said data entry matrix projector comprises a diffractive optical element having a multiplicity of diffraction orders selected to provide a keyboard configuration which has a generally trapezoidal configuration.
-
19. A data input device according to claim 2 and wherein said data entry matrix projector comprises a diffractive optical element having a multiplicity of diffraction orders selected to compensate for geometrical distortions inherent in the operation of said diffractive optical element, particularly at high diffraction angles.
-
20. A data input device according to claim 2 and wherein said data entry matrix projector comprises a diffractive optical element having a multiplicity of diffraction orders selected to compensate for geometrical distortions occasioned by a highly oblique angle of projection.
-
21. A data input device according to claim 2 and wherein in said data entry matrix projector, light from a pair of point light sources is combined by a beam combiner, such that two light beams emerge from said beam combiner and appear to originate in a single virtual light source positioned behind said beam combiner.
-
22. A data input device according to claim 21 and wherein said light beams pass through a shadow mask onto said projection surface.
-
23. A data input device according to claim 2 and wherein said data entry matrix projector comprises an array of light emitting elements and microlenses.
-
24. A data input device according to claim 23 and wherein said light emitting elements are individually controllable.
-
25. A data input device according to claim 2 and wherein said data entry matrix projector comprises a monolithic pattern of LEDs formed on a unitary substrate.
-
26. A data input device according to claim 1 and wherein said two-dimensional imaging sensor is located on the opposite side of a transparent engagement surface from said at least one engagement plane, whereby the presence of said data entry object at said at least one engagement plane causes light from said illuminator to be scattered and to pass through said transparent engagement surface so as to be detected by said two-dimensional imaging sensor.
-
27. A data input device according to claim 1 and wherein a transparent engagement surface is coextensive with said at least one engagement plane, whereby touching engagement of said data entry object with said transparent engagement surface causes light from said illuminator to be scattered and to pass through said transparent engagement surface so as to be detected by said two-dimensional imaging sensor.
-
28. A data input device according to claim 27 and wherein said transparent engagement surface exhibits total internal reflection of a planar beam of light emitted by an illuminator and coupled to an edge of said transparent engagement surface, whereby touching engagement of said data entry object with said transparent engagement surface causes light from said illuminator to be scattered due to frustrated total internal reflection.
-
29. A data input device according to claim 1 and wherein said data entry processor is operative to map locations on said two-dimensional image sensor to data entry functions.
-
30. A data input device according to claim 29 and wherein said data entry processor is operative to map received light intensity at said locations on said two-dimensional image sensor to said data entry functions.
-
31. A data input device according to claim 1 and wherein said data entry processor comprises the following functionality:
-
acquiring pixel values for various pixel coordinates;
as each pixel value is acquired, determining, using the pixel coordinates, whether that pixel lies within a predefined keystroke region;
adding or subtracting each pixel value to or from a pixel total maintained for each said keystroke region based on determining a pixel function of each pixel;
comparing said pixel total for each said keystroke region with a current key actuation threshold;
if the pixel total exceeds the key actuation threshold for a given keystroke region in a given frame and in the previous frame the pixel total did not exceed the key actuation threshold for that keystroke region, providing a key actuation output; and
if the pixel total does not exceed the key actuation threshold for a given keystroke region in a given frame and in the previous frame the pixel total did exceed the key actuation threshold for that keystroke region, providing a key deactuation output.
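The per-region accumulation and hysteresis recited in claim 31 can be sketched in Python. This is an illustrative reconstruction, not the patented implementation: the frame layout, the region-index table, and the dictionary-based threshold and total tables are all assumptions.

```python
import numpy as np

def detect_keystrokes(frame, region_index, thresholds, prev_totals):
    """Accumulate pixel values per keystroke region and compare each
    total against a per-region key actuation threshold.

    frame        -- 2-D array of pixel values from the imaging sensor
    region_index -- 2-D array mapping each pixel to a keystroke region
                    id (-1 for pixels outside every region); this plays
                    the role of the pixel index table of claim 32
    thresholds   -- dict: region id -> current key actuation threshold
    prev_totals  -- dict: region id -> pixel total from the previous frame
    """
    totals = {region: 0 for region in thresholds}
    # As each pixel value is acquired, use its coordinates to decide
    # whether it lies within a predefined keystroke region.
    for (y, x), value in np.ndenumerate(frame):
        region = region_index[y, x]
        if region in totals:
            totals[region] += value  # a border pixel would subtract instead

    events = {}
    for region, total in totals.items():
        above = total > thresholds[region]
        was_above = prev_totals.get(region, 0) > thresholds[region]
        if above and not was_above:
            events[region] = "actuation"    # crossed the threshold this frame
        elif not above and was_above:
            events[region] = "deactuation"  # dropped below it this frame
    return totals, events
```

Comparing against the previous frame's totals gives the edge-triggered behavior the claim describes: an output is produced only when a region crosses its threshold, not on every frame it stays pressed.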
-
32. A data input device according to claim 31 and wherein said determining whether that pixel lies within a predefined keystroke region is made by employing a pixel index table which indicates for each pixel, whether that pixel lies within a predetermined keystroke region and, if so, within which keystroke region it lies.
-
33. A data input device according to claim 31 and wherein both of said determining steps employ said pixel index table.
-
34. A data input device according to claim 31 and wherein said pixel total is maintained for each keystroke region in a keystroke region accumulator table.
-
35. A data input device according to claim 31 and wherein said comparing employs a keystroke region threshold table.
-
36. A data input device according to claim 35 and also comprising the following functionality:
-
once all of the pixels in a frame have been processed, determining an updated background level for a frame; and determining a key actuation threshold for said keystroke region threshold table by subtracting the updated background level from a predetermined threshold level which is established for each keystroke region.
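The threshold update of claim 36 reduces to one subtraction per region. A minimal sketch, assuming the per-region threshold table is a plain dictionary and the background level is a single scalar estimated per frame:

```python
def update_thresholds(predetermined, background_level):
    """Derive the current key actuation threshold for each keystroke
    region by subtracting the updated background level (determined once
    all pixels in the frame have been processed) from the predetermined
    threshold level established for that region."""
    return {region: level - background_level
            for region, level in predetermined.items()}
```

Subtracting the background each frame keeps the effective sensitivity constant as ambient light changes.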
-
37. A data input device according to claim 31 and wherein said pixel function comprises adding the pixel values of a plurality of pixels in said keystroke region.
-
38. A data input device according to claim 31 and wherein said pixel function comprises adding the pixel values of said plurality of pixels in said keystroke region and subtracting therefrom pixel values of a plurality of pixels in a keystroke region border outside said keystroke region.
-
39. A data input device according to claim 31 and wherein said pixel function comprises adding the pixel values of said plurality of pixels in said keystroke region, ignoring the pixel values of a plurality of pixels in a first keystroke region border outside said keystroke region and subtracting pixel values of a plurality of pixels in a second keystroke region border, outside said first keystroke region border.
-
40. A data input device according to claim 1 and wherein said data entry processor is operative to determine the “center of gravity” of pixel values of pixels in said two-dimensional image sensor.
-
41. A data input device according to claim 1 and wherein said data entry processor comprises the following functionality:
-
acquiring pixel values for various pixel coordinates; as each pixel value is acquired, determining, using the pixel coordinates, whether that pixel lies within a predefined active region; and determining the “center of gravity” of the pixel values.
-
42. A data input device according to claim 41 and wherein determining the “center of gravity” is achieved by: multiplying said pixel values by X and Y values representing the geographic position of each pixel; summing the results along mutually perpendicular axes X and Y; summing the total of the pixel values for all relevant pixels for said active region; and dividing said summed results by said total of said pixel values to determine the X and Y coordinates of the “center of gravity”, which represents a desired engagement location.
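The “center of gravity” computation of claims 41 and 42 is an intensity-weighted centroid. A minimal NumPy sketch (the function name and the optional thresholding step of claim 43 are illustrative, not taken from the patent):

```python
import numpy as np

def center_of_gravity(pixels, threshold=0):
    """Compute the intensity-weighted centroid ("center of gravity"):
    multiply each pixel value by its X and Y coordinates, sum along
    both axes, and divide by the total of the pixel values."""
    # Optional thresholding prior to summing, per claim 43.
    values = np.where(pixels >= threshold, pixels, 0)
    total = values.sum()
    ys, xs = np.indices(values.shape)   # geographic position of each pixel
    x = (values * xs).sum() / total     # X coordinate of the centroid
    y = (values * ys).sum() / total     # Y coordinate of the centroid
    return x, y
```

The resulting sub-pixel coordinate pair is what the claim calls the desired engagement location.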
-
43. A data input device according to claim 41 and wherein said pixel values are thresholded prior to summing thereof.
-
44. A data input device according to claim 42 and wherein said pixel values are thresholded prior to summing thereof.
-
45. A data input device according to claim 1 and wherein at least said engagement plane is associated with a pull-down tray in a vehicle.
-
46. A data input device according to claim 45 and wherein said pull-down tray defines an engagement surface which is configured by projection.
-
47. A data input device according to claim 1 and wherein at least said two-dimensional detector and illuminator are associated with a camera.
-
48. A data input device according to claim 1 and wherein at least said two-dimensional detector and illuminator are associated with a home entertainment system.
-
49. A data input device according to claim 48 and wherein said engagement plane overlies a television screen forming part of said home entertainment system.
-
50. A data input device according to claim 1 and wherein at least said engagement plane is associated with a table.
-
51. A data input device according to claim 1 and wherein at least said engagement plane is associated with a remote control device.
-
52. A data input device according to claim 1 and wherein at least said engagement plane is located within a restricted particulate matter environment.
-
53. A data input device according to claim 1 and wherein at least said engagement plane is located within an industrial environment unsuitable for a conventional keyboard.
-
54. A data input device according to claim 1 and wherein at least said two-dimensional detector and illuminator are associated with a video projector.
-
55. A data input device according to claim 1 and wherein at least said two-dimensional detector and illuminator are associated with a restaurant patron interface system.
-
56. A data input device according to claim 1 and wherein at least said two-dimensional detector and illuminator are associated with a mobile audio player.
-
57. A data input device according to claim 1 and wherein at least said two-dimensional detector and illuminator provide touch screen functionality.
-
58. A data input device according to claim 57 and wherein said touch screen functionality employs a video display screen.
-
59. A data input device according to claim 1 and wherein at least said two-dimensional detector and illuminator provide access control functionality.
-
60. A data input device according to claim 1 and wherein at least said engagement plane is associated with a game board.
-
61. A data input device according to claim 60 and wherein said game board defines an engagement surface which is configured by projection.
-
62. A data input device according to claim 1 and wherein at least said engagement plane is associated with a musical instrument.
-
63. A data input device according to claim 62 and wherein said musical instrument defines an engagement surface which is configured by projection.
-
64. A data input device according to claim 1 and wherein at least said two-dimensional detector and illuminator provide vehicle telematics functionality.
-
65. A data input device according to claim 64 and wherein said vehicle defines an engagement surface which is configured by projection.
-
66. A data input device according to claim 1 and wherein at least said two-dimensional detector and illuminator provide automatic vending user interface functionality.
-
74. A data input device according to claim 21 and wherein said two-dimensional imaging sensor comprises a solid state imaging sensor.
-
67. A data input method comprising:
-
illuminating at least one engagement plane by directing light along said at least one engagement plane;
employing a two-dimensional imaging sensor including multiple rows of detector arrays for viewing said at least one engagement plane at a non-zero angle from a location outside said at least one engagement plane for sensing light from said illumination scattered by engagement of a data entry object with said at least one engagement plane; and
receiving and processing an output from said two-dimensional imaging sensor and providing a data entry input to utilization circuitry.
- View Dependent Claims (68-73, 75-133)
-
68. A data input method according to claim 67 and also comprising projecting at least one visually sensible data entry matrix onto a projection surface underlying said at least one engagement plane.
-
69. A data input method according to claim 68 and wherein said at least one visually sensible data entry matrix defines a keyboard.
-
70. A data input method according to claim 68 and wherein said projecting employs a projector light source and a spatial light modulation element operative to receive light from said projector light source and to project at least one visually sensible data entry matrix onto a surface underlying said at least one engagement plane.
-
71. A data input method according to claim 70 and wherein said spatial light modulation element comprises a diffractive optical element.
-
72. A data input method according to claim 70 and wherein said spatial light modulation element comprises an aspheric optical element.
-
73. A data input method according to claim 70 and wherein said spatial light modulation element comprises a transparency.
-
75. A data input method according to claim 68 and wherein said processing correlates said output from said two-dimensional imaging sensor with said at least one visually sensible data entry matrix.
-
76. A data input method according to claim 68 and wherein said projecting employs a diffractive optical element which receives light from a diode laser via a collimating lens.
-
77. A data input method according to claim 76 and wherein light passing through said diffractive optical element is reflected by a curved mirror having optical power via a lens onto said projection surface.
-
78. A data input method according to claim 77 and wherein said diffractive optical element, said mirror and said lens are all integrally formed in a prism.
-
79. A data input method according to claim 68 and wherein said projecting employs an integrally formed beam splitter and diffractive optical elements.
-
80. A data input method according to claim 79 and wherein in said projecting, a beam of light from a diode laser passes through a collimating lens and impinges on two mutually angled surfaces of said beam splitter, which breaks the beam of light into two beams, each of which passes through a separate diffractive optical element and impinges on said projection surface.
-
81. A data input method according to claim 79 and wherein said diffractive optical elements are integrally formed with said beam splitter in a prism.
-
82. A data input method according to claim 80 and wherein said diffractive optical elements are integrally formed with said beam splitter in a prism.
-
83. A data input method according to claim 68 and wherein said projecting employs a plurality of different diffractive optical elements, each of which typically corresponds to a different matrix configuration, which are selectably positionable along a projection light path.
-
84. A data input method according to claim 68 and wherein said projecting employs a diffractive optical element having a multiplicity of diffraction orders selected to provide a matrix configuration which has a relatively low maximum diffraction angle.
-
85. A data input method according to claim 68 and wherein said projecting employs a diffractive optical element having a multiplicity of diffraction orders selected to provide a keyboard configuration which has a generally trapezoidal configuration.
-
86. A data input method according to claim 68 and wherein said projecting employs a diffractive optical element having a multiplicity of diffraction orders selected to compensate for geometrical distortions inherent in the operation of said diffractive optical element, particularly at high diffraction angles.
-
87. A data input method according to claim 68 and wherein said projecting employs a diffractive optical element having a multiplicity of diffraction orders selected to compensate for geometrical distortions occasioned by a highly oblique angle of projection.
-
88. A data input method according to claim 68 and wherein in said projecting, light from a pair of point light sources is combined by a beam combiner, such that two light beams emerge from said beam combiner and appear to originate in a single virtual light source positioned behind said beam combiner.
-
89. A data input method according to claim 88 and wherein said light beams pass through a shadow mask onto said projection surface.
-
90. A data input method according to claim 68 and wherein said projecting employs an array of light emitting elements and microlenses.
-
91. A data input method according to claim 90 and wherein said light emitting elements are individually controllable.
-
92. A data input method according to claim 68 and wherein said projecting employs a monolithic pattern of LEDs formed on a unitary substrate.
-
93. A data input method according to claim 67 and wherein said two-dimensional imaging sensor is located on the opposite side of a transparent engagement surface from said at least one engagement plane, whereby the presence of said data entry object at said at least one engagement plane causes light from said illuminator to be scattered and to pass through said transparent engagement surface so as to be detected by said two-dimensional imaging sensor.
-
94. A data input method according to claim 67 and wherein a transparent engagement surface is coextensive with said at least one engagement plane, whereby touching engagement of said data entry object with said transparent engagement surface causes light from said illuminator to be scattered and to pass through said transparent engagement surface so as to be detected by said two-dimensional imaging sensor.
-
95. A data input method according to claim 94 and wherein said transparent engagement surface exhibits total internal reflection of a planar beam of light emitted by an illuminator and coupled to an edge of said transparent engagement surface, whereby touching engagement of said data entry object with said transparent engagement surface causes light from said illuminator to be scattered due to frustrated total internal reflection.
-
96. A data input method according to claim 67 and wherein said processing is operative to map locations on said two-dimensional image sensor to data entry functions.
-
97. A data input method according to claim 96 and wherein said processing is operative to map received light intensity at said locations on said two-dimensional image sensor to said data entry functions.
-
98. A data input method according to claim 67 and wherein said processing comprises the following:
-
acquiring pixel values for various pixel coordinates;
as each pixel value is acquired, determining, using the pixel coordinates, whether that pixel lies within a predefined keystroke region;
adding or subtracting each pixel value to or from a pixel total maintained for each said keystroke region based on determining a pixel function of each pixel;
comparing said pixel total for each said keystroke region with a current key actuation threshold;
if the pixel total exceeds the key actuation threshold for a given keystroke region in a given frame and in the previous frame the pixel total did not exceed the key actuation threshold for that keystroke region, providing a key actuation output; and
if the pixel total does not exceed the key actuation threshold for a given keystroke region in a given frame and in the previous frame the pixel total did exceed the key actuation threshold for that keystroke region, providing a key deactuation output.
-
99. A data input method according to claim 98 and wherein said determining whether that pixel lies within a predefined keystroke region is made by employing a pixel index table which indicates for each pixel, whether that pixel lies within a predetermined keystroke region and, if so, within which keystroke region it lies.
-
100. A data input method according to claim 98 and wherein both of said determining steps employ said pixel index table.
-
101. A data input method according to claim 98 and wherein said pixel total is maintained for each keystroke region in a keystroke region accumulator table.
-
102. A data input method according to claim 98 and wherein said comparing employs a keystroke region threshold table.
-
103. A data input method according to claim 102 and also comprising the following:
-
once all of the pixels in a frame have been processed, determining an updated background level for a frame; and determining a key actuation threshold for said keystroke region threshold table by subtracting the updated background level from a predetermined threshold level which is established for each keystroke region.
-
104. A data input method according to claim 98 and wherein said pixel function comprises adding the pixel values of a plurality of pixels in said keystroke region.
-
105. A data input method according to claim 98 and wherein said pixel function comprises adding the pixel values of said plurality of pixels in said keystroke region and subtracting therefrom pixel values of a plurality of pixels in a keystroke region border outside said keystroke region.
-
106. A data input method according to claim 98 and wherein said pixel function comprises adding the pixel values of said plurality of pixels in said keystroke region, ignoring the pixel values of a plurality of pixels in a first keystroke region border outside said keystroke region and subtracting pixel values of a plurality of pixels in a second keystroke region border, outside said first keystroke region border.
-
107. A data input method according to claim 67 and wherein said processing is operative to determine the “center of gravity” of pixel values of pixels in said two-dimensional image sensor.
-
108. A data input method according to claim 67 and wherein said processing comprises the following:
-
acquiring pixel values for various pixel coordinates; as each pixel value is acquired, determining, using the pixel coordinates, whether that pixel lies within a predefined active region; and determining the “center of gravity” of the pixel values.
-
109. A data input method according to claim 108 and wherein determining the “center of gravity” is achieved by: multiplying said pixel values by X and Y values representing the geographic position of each pixel; summing the results along mutually perpendicular axes X and Y; summing the total of the pixel values for all relevant pixels for said active region; and dividing said summed results by said total of said pixel values to determine the X and Y coordinates of the “center of gravity”, which represents a desired engagement location.
-
110. A data input method according to claim 108 and wherein said pixel values are thresholded prior to summing thereof.
-
111. A data input method according to claim 109 and wherein said pixel values are thresholded prior to summing thereof.
-
112. A data input method according to claim 98 and wherein at least said engagement plane is associated with a pull-down tray in a vehicle.
-
113. A data input method according to claim 112 and wherein said pull-down tray defines an engagement surface which is configured by projection.
-
114. A data input method according to claim 67 and wherein said receiving and processing are associated with a camera.
-
115. A data input method according to claim 67 and wherein said receiving and processing are associated with a home entertainment system.
-
116. A data input method according to claim 115 and wherein said engagement plane overlies a television screen forming part of said home entertainment system.
-
117. A data input method according to claim 67 and wherein at least said engagement plane is associated with a table.
-
118. A data input method according to claim 67 and wherein at least said engagement plane is associated with a remote control method.
-
119. A data input method according to claim 67 and wherein at least said engagement plane is located within a restricted particulate matter environment.
-
120. A data input method according to claim 67 and wherein at least said engagement plane is located within an industrial environment unsuitable for a conventional keyboard.
-
121. A data input method according to claim 67 and wherein said receiving and processing are associated with a video projector.
-
122. A data input method according to claim 67 and wherein said receiving and processing are associated with a restaurant patron interface system.
-
123. A data input method according to claim 67 and wherein said receiving and processing are associated with a mobile audio player.
-
124. A data input method according to claim 67 and wherein said receiving and processing provide touch screen functionality.
-
125. A data input method according to claim 124 and wherein said touch screen functionality employs a video display screen.
-
126. A data input method according to claim 67 and wherein said receiving and processing provide access control functionality.
-
127. A data input method according to claim 67 and wherein at least said engagement plane is associated with a game board.
-
128. A data input method according to claim 127 and wherein said game board defines an engagement surface which is configured by projection.
-
129. A data input method according to claim 67 and wherein at least said engagement plane is associated with a musical instrument.
-
130. A data input method according to claim 129 and wherein said musical instrument defines an engagement surface which is configured by projection.
-
131. A data input method according to claim 67 wherein said receiving and processing provide vehicle telematics functionality.
-
132. A data input method according to claim 131 and wherein said vehicle defines an engagement surface which is configured by projection.
-
133. A data input method according to claim 67 and wherein said receiving and processing provide automatic vending user interface functionality.
-
- Current Assignee: VKB, Inc.
- Original Assignee: VKB, Inc.
- Inventors: Lieberman, Klony; Sharon, Yuval; Maor, Yaniv; Naimi, Eyal; Tsachi, Mattan; Turm, Amichai; Levy, Amiram; Arnon, Boas
- Primary Examiner: Hjerpe, Richard
- Assistant Examiner: Shapiro, Leonid
- Application Number: US 10/250,350
- Time in Patent Office: 2,052 Days
- Field of Search: 345/158, 345/156, 345/168-169, 345/173, 345/174, 345/175
- US Class (Current): 345/158
- CPC Class Codes:
  - G05B 2219/36156 Keyboard as a drawer
  - G06F 2203/04109 FTIR in optical digitiser, ...
  - G06F 3/0426 tracking fingers with respe...
  - G07F 7/10 together with a coded signa...
  - G07F 7/1041 PIN input keyboard gets new...
  - G10H 1/34 Switch arrangements, e.g. k...
  - G10H 2220/101 for graphical creation, edi...
  - G10H 2220/161 with 2D or x/y surface coor...
  - G10H 2220/221 Keyboards, i.e. configurati...
  - G10H 2220/305 using a light beam to detec...
  - G10H 2220/455 Camera input, e.g. analyzin...
  - H03K 17/941 using an optical detector
  - H03K 17/9631 using a light source as par...
  - H03K 17/9638 using a light guide
  - H03K 2217/96046 Key-pad combined with displ...