Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
Abstract
A system used with a virtual device inputs or transfers information to a companion device, and includes two optical systems OS1, OS2. In a structured-light embodiment, OS1 emits a fan beam plane of optical energy parallel to and above the virtual device. When a user-object penetrates the beam plane of interest, OS2 registers the event. Triangulation methods can locate the virtual contact, and transfer user-intended information to the companion system. In a non-structured active light embodiment, OS1 is preferably a digital camera whose field of view defines the plane of interest, which is illuminated by an active source of optical energy. Preferably the active source, OS1, and OS2 operate synchronously to reduce effects of ambient light. A non-structured passive light embodiment is similar except the source of optical energy is ambient light. A subtraction technique preferably enhances the signal/noise ratio. The companion device may in fact house the present invention.
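The triangulation mentioned in the abstract can be pictured as intersecting a camera pixel ray with the known fan-beam plane: once the sensor sees reflected light at some pixel, the ray through that pixel is extended until it hits the light plane, giving the contact position. A minimal sketch, assuming a pinhole camera model; all intrinsics, the plane geometry, and the function names are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def pixel_ray(u, v, fx, fy, cx, cy):
    """Unit ray through pixel (u, v) for a pinhole camera with focal
    lengths fx, fy (pixels) and principal point (cx, cy)."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def intersect_plane(origin, direction, plane_point, plane_normal):
    """Point where the ray origin + t*direction crosses the plane
    through plane_point with normal plane_normal."""
    t = np.dot(plane_point - origin, plane_normal) / np.dot(direction, plane_normal)
    return origin + t * direction

# Example: camera at the origin looking along +Z; the fan-beam light
# plane is z = 0.5 m; a reflection is detected at pixel (400, 300).
ray = pixel_ray(400, 300, fx=800, fy=800, cx=320, cy=240)
contact = intersect_plane(np.zeros(3), ray,
                          np.array([0.0, 0.0, 0.5]),
                          np.array([0.0, 0.0, 1.0]))
```

The same intersection works for any light-plane pose; only the plane point and normal change.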
20 Claims
1. A method to obtain information from an interaction of at least one user-object with a virtual input device defined on a device plane, the method comprising the following steps:
(a) generating with a fan beam a plane of light substantially parallel to and spaced-above said device plane a distance sufficiently small such that penetration of said plane of light by said at least one user-object is equivalent to interactingly touching a virtual input device position on said device plane;
(b) using a single sensor that acquires data representing a single image at a given time to determine from light reflected from said plane of light if and when at least a portion of said at least one user-object penetrates said plane of light and thus interactingly touches a position on said device plane; and
(c) for each contact with said device plane determined to occur at step (b) determining contact position relative to said single sensor;
wherein a function of said virtual input device associated with each contact position determined at step (c) is ascertainable.
2. The method of claim 1, further including:
(d) transferring to a companion device information commensurate with contact position determined at step (c) relative to said virtual input device;
wherein interaction of said at least one user-object with said virtual input device affects operation of said companion device.
3. The method of claim 2, wherein said companion device includes at least one of (i) a PDA, (ii) a portable communication device, (iii) an electronic device, (iv) an electronic game device, and (v) a musical instrument, and said virtual input device is at least one of (I) a virtual keyboard, (II) a virtual mouse, (III) a virtual trackball, (IV) a virtual pen, (V) a virtual trackpad, and (VI) a user-interface selector.
4. The method of claim 2, wherein:
said user-object includes at least a portion of a user's hand;
said virtual input device includes a virtual keyboard; and
said companion device includes at least one of (i) a PDA, (ii) a mobile telephone, and (iii) a computer.
5. The method of claim 2, wherein:
said user-object includes at least a portion of a user's hand;
said virtual input device includes a virtual mouse, and said companion device includes at least one of (i) a PDA, (ii) a mobile telephone, and (iii) a computer.
6. The method of claim 2, wherein:
said user-object includes at least a portion of a user's hand;
said virtual input device includes a virtual trackball, and said companion device includes at least one of (i) a PDA, (ii) a mobile telephone, and (iii) a computer.
7. The method of claim 2, wherein:
said user-object includes at least a portion of a user's hand;
said virtual input device includes a virtual pen, and said companion device includes at least one of (i) a PDA, (ii) a mobile telephone, and (iii) a computer.
8. The method of claim 1, wherein step (a) includes generating said plane of light using optical energy, and wherein step (b) includes detecting a reflected portion of said optical energy when at least a portion of said user-object penetrates said plane of light.
9. The method of claim 1, wherein at least one of step (b) and step (c) is carried out using triangulation analysis.
10. The method of claim 1, wherein said virtual input device is mapped to a work surface selected from at least one of (i) a table top, (ii) a desk top, (iii) a wall, (iv) a point-of-sale appliance, (v) a point-of-service appliance, (vi) a kiosk, (vii) a surface in a vehicle, (viii) a projected display, (ix) a physical display, (x) a CRT, and (xi) an LCD.
11. The method of claim 1, wherein step (b) includes providing a camera having a lens with a lens optical axis, and providing an image plane, and further including improving at least one of resolution and depth of field of said camera by tilting said image plane relative to said lens optical axis.
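Claim 11's tilted image plane is the classic Scheimpflug arrangement: tilting the sensor relative to the lens axis rotates the plane of sharp focus so it can lie along the near-grazing light plane, improving effective depth of field. A hedged sketch under a thin-lens model (the Scheimpflug relation tan β = (v/u)·tan α; focal length, distances, and angles below are invented for illustration):

```python
import math

def scheimpflug_image_tilt(u, f, object_tilt_deg):
    """Image-plane tilt (degrees, measured from the lens plane) that
    brings a plane of sharp focus tilted by object_tilt_deg into focus.
    Thin-lens equation 1/u + 1/v = 1/f gives the image distance v;
    Scheimpflug condition: tan(beta) = (v/u) * tan(alpha)."""
    v = 1.0 / (1.0 / f - 1.0 / u)      # image distance behind the lens
    return math.degrees(math.atan((v / u) * math.tan(math.radians(object_tilt_deg))))

# Example: 8 mm lens focused at 0.3 m, plane of sharp focus tilted
# 85 degrees from the lens plane (nearly grazing the work surface).
beta = scheimpflug_image_tilt(u=0.3, f=0.008, object_tilt_deg=85.0)
```

Because magnification v/u is small at these working distances, even a steeply tilted focus plane needs only a modest sensor tilt.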
12. The method of claim 1, wherein:
step (a) includes defining said plane of light using an optical source; and
step (b) includes providing a camera to sense penetration of said plane of light.
13. The method of claim 12, further including:
synchronizing operation of said optical source and said camera;
wherein effects of ambient light upon accuracy of information obtained at at least one of step (b) and step (c) are reduced.
14. The method of claim 12, wherein said optical source emits optical energy bearing a signature used to reject ambient light.
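One way to picture claim 14's "signature" is lock-in style detection: the source is modulated with a known on/off pattern, and each pixel's intensity series is correlated against that pattern, so unmodulated ambient light integrates to zero. A sketch under those assumptions; the pattern, levels, and frame count are illustrative, not taken from the patent:

```python
import numpy as np

def demodulate(samples, pattern):
    """Correlate a pixel's intensity time series with the source's known
    on/off modulation pattern.  Using a zero-mean copy of the pattern
    makes any constant (ambient) component integrate to zero."""
    ref = np.asarray(pattern, dtype=float)
    ref -= ref.mean()
    return float(np.dot(np.asarray(samples, dtype=float), ref) / len(ref))

pattern = [1, 0] * 4                        # source toggled over 8 frames
ambient = [100.0] * 8                       # pixel lit only by ambient light
lit = [100.0 + 20.0 * p for p in pattern]   # pixel also seeing the source

score_ambient = demodulate(ambient, pattern)   # ~0: no correlated signal
score_lit = demodulate(lit, pattern)           # positive: source detected
```

Thresholding the correlation score then separates beam reflections from ambient-only pixels.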
15. The method of claim 1, wherein:
step (b) includes acquiring ambient light information generated by ambient light by determining when said user-object is distant from said plane of light; and
at least one of step (b) and step (c) includes subtracting said ambient light information from information acquired when said user-object interacts with said virtual input device;
wherein effects of ambient light are reduced.
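Claim 15's subtraction step can be sketched as a frame difference: capture an ambient-only reference frame while the user-object is away from the light plane, subtract it during interaction, and threshold what remains. Frame shapes, levels, and the threshold value are illustrative assumptions:

```python
import numpy as np

def beam_reflection_mask(frame, ambient_ref, threshold=30):
    """Boolean mask of pixels clearly brighter than the stored
    ambient-only reference frame.  Signed arithmetic avoids uint8
    wrap-around on the subtraction."""
    diff = frame.astype(np.int16) - ambient_ref.astype(np.int16)
    return diff > threshold

# Synthetic 4x4 frames: uniform ambient level 80, plus one bright pixel
# where the user-object reflects the fan beam.
ambient_ref = np.full((4, 4), 80, dtype=np.uint8)
frame = ambient_ref.copy()
frame[2, 1] = 200                     # reflection from the light plane
mask = beam_reflection_mask(frame, ambient_ref)
```

The surviving pixels feed the triangulation step; everything attributable to steady ambient illumination cancels in the difference.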
16. The method of claim 1, wherein said user-object includes at least a portion of a user's hand.
17. The method of claim 1, wherein said virtual input device includes a virtual keyboard.
18. The method of claim 1, wherein said virtual input device includes a virtual mouse.
19. The method of claim 1, wherein said virtual input device includes a virtual trackball.
20. The method of claim 1, wherein said virtual input device includes a virtual pen.
Specification