Visual feedback by identifying anatomical features of a hand
First Claim
1. A system usable by an intended user having a hand to operate a graphical user interface, said hand defining predetermined anatomical features, said graphical user interface including a window, said system comprising:
a display;
an input device, said input device including:
an input element defining an input surface and an opposed undersurface, said input element being at least partially transparent; and
an image sensor configured, located, and operative for capturing a hand image of said hand through said input element when said hand is located substantially adjacent to said input surface and either entirely spaced apart therefrom or contacting said input surface;
an image processor coupled to said image sensor and operative for receiving said hand image from said image sensor and for processing said hand image to identify said predetermined anatomical features of said hand in said hand image and associate with each of said predetermined anatomical features a corresponding feature location, said feature locations being indicative of positions of said predetermined anatomical features relative to said input surface; and
a display interface coupled to said image processor and to said display and operative for displaying on said display said window and displaying cursors in said window, each of said cursors being associated with a corresponding one of said predetermined anatomical features, said cursors being positioned so that the position of said cursors relative to said window corresponds to said feature locations relative to said input surface.
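The correspondence the claim recites between feature locations on the input surface and cursor positions in the window amounts to a coordinate mapping. A minimal sketch, assuming normalized input-surface coordinates and a pixel-addressed window; all function and variable names here are hypothetical and not taken from the claims:

```python
# Sketch of the claimed correspondence: each anatomical feature's
# location on the input surface (normalized 0..1 coordinates) is
# scaled into the window's pixel space to position its cursor.

def cursors_for_features(feature_locations, window_width, window_height):
    """feature_locations: {name: (x, y)} with x, y in [0, 1] relative
    to the input surface. Returns {name: (px, py)} cursor positions
    relative to the window, preserving the relative layout."""
    return {
        name: (round(x * window_width), round(y * window_height))
        for name, (x, y) in feature_locations.items()
    }

# Example: an index fingertip at the centre of the input surface maps
# to the centre of an 800x600 window.
features = {"index_tip": (0.5, 0.5), "thumb_tip": (0.25, 0.75)}
cursors = cursors_for_features(features, 800, 600)
# cursors["index_tip"] → (400, 300)
```

Because the mapping is purely relative, the same feature locations position cursors consistently in windows of any size.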
Abstract
A method (62) for providing visual feedback to a user having a hand (14) and operating a graphical user interface (28) on a display (24) using an input device (10). The input device (10) includes an image sensor (46) and an input element (40) defining an input surface (42), the input element (40) being at least partially transparent. The method (62) includes: with the hand (14) substantially adjacent to the input element (40), acquiring a hand image of the hand (14) through the input element (40) using the image sensor (46); identifying predetermined anatomical features (13) in the hand image and associating with each a corresponding feature location indicative of a position of the predetermined anatomical features (13) relative to the input surface (42); and displaying cursors (34, 61) in a window (30) of the graphical user interface (28) at positions relative to the window (30) that correspond to the feature locations relative to the input surface (42).
36 Claims
1. (As recited in the First Claim above.) - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19)
20. An input device usable with a computing device by an intended user having a hand, said input device comprising:
an input element defining an input surface and an opposed undersurface, said input element being at least partially transparent;
an image sensor for capturing through said input element a hand image of said hand when said hand is located substantially adjacent to said input surface and either entirely spaced apart therefrom or contacting said input surface;
an input device interface couplable to said computing device and operative for transmitting to said computing device image information obtained from said hand image;
wherein said hand defines at least two predetermined anatomical features, said input device further including an image processor operative for receiving said hand image from said image sensor and for processing said hand image to identify said at least two predetermined anatomical features of said hand in said hand image and associate with each of said at least two predetermined anatomical features a corresponding feature location, said feature locations being indicative of a position of said at least two predetermined anatomical features relative to said input surface; and
said image information includes said feature locations. - View Dependent Claims (21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36)
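The claims leave open how the image processor identifies anatomical features in the hand image. A deliberately simple stand-in, not the patent's method: threshold the captured image to a hand mask and report the topmost bright pixel as a fingertip candidate, expressed as a location relative to the input surface. All names are hypothetical:

```python
# Simplified stand-in for feature identification: scan a grayscale
# hand image top-to-bottom and return the first pixel at or above a
# brightness threshold as a fingertip candidate, normalized to [0, 1]
# so the result can serve directly as a feature location.

def fingertip_candidate(gray, threshold=128):
    """gray: 2-D list of grayscale values (rows of pixels).
    Returns (x, y) in [0, 1] of the topmost bright pixel, or None."""
    h, w = len(gray), len(gray[0])
    for r, row in enumerate(gray):
        for c, v in enumerate(row):
            if v >= threshold:
                return (c / (w - 1), r / (h - 1))
    return None

# Tiny 5x5 synthetic "hand image": one bright finger column whose
# topmost bright pixel sits at row 1, column 2.
img = [[0] * 5 for _ in range(5)]
for r in range(1, 5):
    img[r][2] = 255
tip = fingertip_candidate(img)  # → (0.5, 0.25)
```

A practical implementation would identify multiple features (fingertips, joints) robustly, e.g. from a segmented hand contour, but the claim only requires that each identified feature yield a location relative to the input surface, as this sketch produces.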
Specification