Method for user input from alternative touchpads of a handheld computerized device
First Claim
1. A method of creating a virtual image of at least portions of the user's hand or hands while operating a handheld computerized device, said handheld computerized device comprising at least one touchpad, at least one graphics display screen, at least one image sensor configured to photograph at least part of the region of space proximate said at least one touchpad, but not configured to photograph the surface of said touchpad, at least one processor, memory, and software, said method comprising:
- obtaining data on the location and movement of the user's fingers and/or hand using said touchpad, said user's fingers and/or hand being positioned in an arbitrary manner with respect to said touchpad when contacting said touchpad, said data not being associated with an image of the user's fingers from said image sensor, said at least one image sensor comprising at least one video camera;
- said touchpad being located in a location of said handheld computerized device that is different from the location of said at least one display screen;
- analyzing said data from said touchpad on the location and movement of said user's fingers and/or hand according to a model of a human hand, and assigning said data on the location and movement of said user's fingers and/or hand to specific fingers on said model of said human hand, thereby making predictions as to the location of the user's hand and fingers;
- using said model of said human hand, and said predictions of the location of the user's hand and fingers, to compute a graphical representation of at least said user's fingers;
- further using images of portions of the user's hand or hands that do not contain images pertaining to the position of the user's fingers to further supplement or refine said model of said human hand; and
- displaying said graphical representation of at least said user's fingers on said at least one graphics display screen of said handheld computerized device.
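The claim's core loop (assign touchpad contacts to specific fingers of a hand model, then use the resulting predictions to render the fingers) can be sketched as follows. This is a minimal illustration, not the patented implementation: the rest-pose coordinates and the nearest-finger heuristic are assumptions standing in for the biomechanical and anatomical hand model the claim recites.

```python
import math

# Hypothetical rest positions (x, y) of five fingertips on the touchpad,
# a crude stand-in for the claim's "model of a human hand".
FINGER_REST = {
    "thumb":  (0.20, 0.10),
    "index":  (0.35, 0.60),
    "middle": (0.50, 0.70),
    "ring":   (0.65, 0.60),
    "pinky":  (0.80, 0.45),
}

def assign_touches_to_fingers(touches, model=FINGER_REST):
    """Greedily assign each touchpad contact to the nearest unassigned
    finger of the hand model; fingers with no contact keep their
    model-predicted (rest) position. Returns predicted fingertip
    positions suitable for rendering on the front display."""
    predicted = dict(model)   # start from the model's predicted pose
    free = set(model)
    for tx, ty in touches:
        # Nearest-finger heuristic; a real system would use the full
        # anatomical model plus motion history to disambiguate.
        finger = min(free, key=lambda f: math.dist(model[f], (tx, ty)))
        predicted[finger] = (tx, ty)
        free.discard(finger)
    return predicted
```

For example, a single contact near the index finger's rest position updates only that finger, while the other four are drawn at the positions the model predicts.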
1 Assignment
0 Petitions
Abstract
The present invention relates to a handheld computerized device with a bit-mapped display screen on the front panel, and a touchpad installed in an alternative location. More particularly, the invention relates to a method and graphical user interface that can, for example, enable the user to see the user's finger position and motion from behind the device superimposed upon a virtual keyboard layout on the front panel. This can allow the user to use a touchpad keypad on the back of the device to input keystrokes and mouse actions, and these will be reflected on the display screen on the front of the handheld computerized device as “virtual fingers” or equivalent. The system operates by inputting touchpad data into a software biomechanical and anatomical model of the human hand, optionally supplemented by video or image information to refine this model.
Citations: 50
Claims: 20
1. Independent claim, set forth above under "First Claim". Dependent claims 2-12 depend from claim 1.
13. A method of creating a virtual image of at least portions of the user's hand or hands while operating a handheld computerized device, said handheld computerized device comprising at least one touchpad, at least one graphics display screen, at least one image sensor configured to photograph at least part of the region of space proximate said at least one touchpad, but not configured to photograph the surface of said touchpad, at least one processor, memory, and software, said method comprising:
- further displaying at least one data entry location on said at least one graphics display screen of said handheld computerized device;
- obtaining data on the location and movement of the user's fingers and/or hand using said touchpad, said user's fingers and/or hand being positioned in an arbitrary manner with respect to said touchpad when contacting said touchpad, said data not being associated with an image of the user's fingers from said image sensor, said at least one image sensor comprising at least one video camera;
- said touchpad being located in a location of said handheld computerized device that is different from the location of said at least one display screen;
- analyzing said data from said touchpad on the location and movement of said user's fingers and/or hand according to a model of a human hand, assigning said data on the location and movement of said user's fingers and/or hand to specific fingers on said model of said human hand, thereby making predictions as to the location of the user's hand and fingers;
- wherein the model of a human hand is supplemented or refined by obtaining at least one full or partial image of the user's hand or hands;
- using said model of said human hand, and said predictions of the location of the user's hand and fingers, to compute a graphical representation of at least said user's fingers;
- further using images of portions of the user's hand or hands that do not contain images pertaining to the position of the user's fingers to further supplement or refine said model of said human hand;
- displaying said graphical representation of at least said user's fingers on said at least one graphics display screen of said handheld computerized device;
- wherein distances between said graphical representation of at least said user's fingers on said at least one graphics display screen, and said at least one data entry location, give information to said user to facilitate said user to position said user's fingers and/or hand on said at least one touchpad to enter user data into said at least one data entry location.

Dependent claims 14-20 depend from claim 13.
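Claim 13's final limitation (distances between the rendered fingers and a data entry location give the user positioning feedback) can be illustrated with a small helper. A minimal sketch only: the normalized screen coordinates, the key-radius threshold, and the function name are hypothetical, not taken from the patent.

```python
import math

def key_guidance(finger_pos, key_center, key_radius=0.05):
    """Return (distance, within_key): the screen-space distance from a
    rendered fingertip to a data-entry location (e.g. a virtual key),
    and whether the fingertip is close enough to register a keystroke.
    Coordinates are assumed normalized to [0, 1]; the threshold is a
    hypothetical key radius."""
    d = math.dist(finger_pos, key_center)
    return d, d <= key_radius
```

In use, the UI would shrink this distance as the user slides a finger on the rear touchpad, and trigger the keystroke once `within_key` becomes true.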
Specification