Virtual touchpads for wearable and portable devices
Abstract
Systems and methods for using portions of the surface around a wearable smart device as virtual touchpads for data and command entry are disclosed. For wearable smart devices worn on the wrist, such as smartwatches, the skin area on the back of the hand, as well as the surface area of the arm adjacent to the device, may serve as virtual touchpads. One or more side-looking cameras facing each virtual touchpad may be used to detect touches and movements by one or more fingers on the virtual touchpads to simulate equivalent tactile gestures that control the device. Movements of various surface artifacts captured by the cameras may be used to confirm that a finger is actually touching the virtual touchpad rather than merely hovering above it. Sound sensors that independently detect and locate finger touches, as well as motion sensors that detect vibration corresponding to actual touch tap events, may be combined with the cameras to improve the detection accuracy of the tactile gestures. Touch position may also be detected by triangulating encoded RF signals that propagate from one side of the device, via the operator's body, to the back-of-hand virtual touchpad. Two or more virtual touchpads can operate in interchangeable or complementary modes. In interchangeable mode, all virtual touchpads respond to all tactile gestures supported by the device; in complementary mode, some virtual touchpads may be restricted to recognize only a specific subset of the supported gestures.
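The artifact-correlation idea is the abstract's key discriminator: a real touch deforms the skin, so artifacts (wrinkles, blood vessels, blemishes) near the fingertip shift between frames, while a hovering finger leaves them still. A minimal sketch of that decision rule, assuming pre-matched artifact coordinates and using hypothetical thresholds (the patent specifies neither values nor a matching method):

```python
from math import hypot

# Hypothetical tuning constants, not taken from the patent.
ARTIFACT_MOVE_THRESHOLD = 2.0   # min artifact displacement (px) implying skin deformation
NEAR_FINGER_RADIUS = 40.0       # max distance (px) for an artifact to count as "adjacent"

def classify_touch(fingertip, artifacts_before, artifacts_after):
    """Classify a frame pair as 'touch' or 'hover'.

    fingertip        -- (x, y) fingertip position in the later frame
    artifacts_before -- list of (x, y) artifact positions in frame t
    artifacts_after  -- matched artifact positions in frame t+1

    A real touch deforms the skin, so artifacts near the fingertip
    move between frames; during hovering they stay put.
    """
    near = moved = 0
    for (x0, y0), (x1, y1) in zip(artifacts_before, artifacts_after):
        if hypot(x1 - fingertip[0], y1 - fingertip[1]) > NEAR_FINGER_RADIUS:
            continue  # artifact too far from the finger to be informative
        near += 1
        if hypot(x1 - x0, y1 - y0) >= ARTIFACT_MOVE_THRESHOLD:
            moved += 1
    # Require a majority of nearby artifacts to have shifted.
    return "touch" if near and moved * 2 > near else "hover"
```

In a real system the artifact positions would come from feature tracking across the camera frames; the sketch only shows the correlation step that separates touching from hovering.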
24 Claims
1. An apparatus optically interfacing with an operator responsive to a predefined set of tactile gestures performed by one or more fingers on one or more virtual touchpads assigned on portions of the surface adjacent to said apparatus, comprising:
- one or more cameras placed along each side of the apparatus facing each one of said virtual touchpads, said cameras configured to obtain sequences of captured images of said virtual touchpads and their surroundings;
- a display; and
- a computer coupled to said cameras and said display, and configured to capture the positions of one or more finger tips and one or more visible surface artifacts that include blemishes, blood vessels, wrinkles, hair, stains, or textures on said virtual touchpads, correlate movements of said surface artifacts with said finger tip positions to distinguish between hovering and touching activities, analyze said hovering and touching activities to detect an operator tactile gesture, and simulate, based on the detected tactile gesture, a control input to modify said display responsive to said gesture.

(Dependent claims: 2, 3, 4, 5, 6, 7, 8, 24)
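The abstract also proposes corroborating camera-detected touches with sound or motion sensors that register the vibration of an actual tap. A minimal fusion sketch, assuming each subsystem reports event timestamps and using a hypothetical coincidence window (the patent does not give one):

```python
# Hypothetical rule: a camera-detected touch at time t is accepted only if a
# vibration (or sound) event occurred within a short window around t, which is
# how the abstract combines sensors to improve detection accuracy.
CONFIRM_WINDOW_S = 0.05  # assumed coincidence window, in seconds

def confirm_touches(camera_touch_times, vibration_times, window=CONFIRM_WINDOW_S):
    """Keep only camera touch events corroborated by a vibration event."""
    confirmed = []
    for t in camera_touch_times:
        if any(abs(t - v) <= window for v in vibration_times):
            confirmed.append(t)
    return confirmed
```

A touch seen by the cameras with no matching vibration is treated as a hover or a false positive and dropped.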
9. A computer-implemented method comprising:
- assigning one or more adjacent portions of the body surface area of a human operator wearing a wearable device as virtual touchpads, said virtual touchpads being placed along one or more sides of said wearable device;
- capturing a sequence of images of said one or more virtual touchpads;
- extracting from said captured images a sequence of positions of one or more finger tips and the adjacent surface artifacts on said virtual touchpads, said surface artifacts including blood vessels, blemishes, wrinkles, hair, stains, or textures;
- correlating movements of said surface artifacts with said finger tip positions to distinguish between hovering and touching activities;
- analyzing said hovering and touching activities to detect a predefined gesture from a set of predefined gestures;
- simulating a generic tactile touchpad input command that corresponds to said detected gesture; and
- controlling said wearable device in response to said simulated tactile input command.

(Dependent claims: 10, 11, 20, 21, 22, 23)
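The method's final steps (detecting a predefined gesture from the touch activity, simulating a generic touchpad command, and controlling the device) can be sketched as follows. The gesture set, distance and frame thresholds, and command names are illustrative assumptions, not taken from the patent:

```python
from math import hypot

# Hypothetical tuning constants.
SWIPE_MIN_DISTANCE = 30.0   # pixels of net travel to count as a swipe
TAP_MAX_FRAMES = 5          # a tap is a short contact with little travel

def detect_gesture(touch_positions):
    """Map a sequence of (x, y) touch positions to a predefined gesture.

    Returns 'tap', 'swipe_left', 'swipe_right', or None, standing in for
    the claim's "set of predefined gestures".
    """
    if not touch_positions:
        return None
    (x0, y0), (x1, y1) = touch_positions[0], touch_positions[-1]
    travel = hypot(x1 - x0, y1 - y0)
    if travel >= SWIPE_MIN_DISTANCE:
        return "swipe_right" if x1 > x0 else "swipe_left"
    if len(touch_positions) <= TAP_MAX_FRAMES:
        return "tap"
    return None  # long dwell with no travel: no recognized gesture

GESTURE_TO_COMMAND = {      # simulated generic touchpad input commands
    "tap": "SELECT",
    "swipe_left": "PAGE_BACK",
    "swipe_right": "PAGE_FORWARD",
}

def simulate_command(touch_positions):
    """Simulate the generic tactile input command for a detected gesture."""
    return GESTURE_TO_COMMAND.get(detect_gesture(touch_positions))
```

The returned command string stands in for whatever input event the wearable's UI layer actually consumes.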
12. A system comprising a portable device placed on a surface, said portable device further comprising:
- one or more processors;
- a display;
- one or more cameras coupled to said one or more processors and configured to detect one or more finger touches on one or more virtual touchpads assigned on portions of said surface, said virtual touchpads being adjacent to said portable device;
- memory; and
- one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for:
  - capturing from said cameras a sequence of operator touch positions on said virtual touchpads;
  - extracting the positions of surface artifacts from images of said virtual touchpads when fingers are detected, said surface artifacts including blood vessels, stains, wrinkles, hair, blemishes, or textures;
  - correlating movements of said surface artifacts with said sequence of operator touch positions to distinguish between hovering and touching activities;
  - analyzing said sequence of operator touch positions to detect a predefined tactile gesture from a set of predefined tactile gestures; and
  - generating a tactile input command corresponding to said detected gesture to modify said display in response to the detected tactile gesture.

(Dependent claims: 13, 14, 15, 16, 17, 18, 19)
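The abstract's interchangeable and complementary operating modes amount to a per-touchpad gesture allowlist: in interchangeable mode every pad accepts every supported gesture, while in complementary mode a pad may be restricted to a subset. A sketch under that reading, with hypothetical pad and gesture names:

```python
# Hypothetical gesture vocabulary; the patent does not enumerate one.
SUPPORTED_GESTURES = {"tap", "double_tap", "swipe", "pinch", "scroll"}

class VirtualTouchpad:
    """A surface region acting as a touchpad for the wearable device."""

    def __init__(self, name, allowed=None):
        self.name = name
        # None means "all supported gestures" (interchangeable operation);
        # a set restricts the pad to a subset (complementary operation).
        self.allowed = set(allowed) if allowed is not None else None

    def accepts(self, gesture):
        """Return True if this pad should respond to the given gesture."""
        if gesture not in SUPPORTED_GESTURES:
            return False
        return self.allowed is None or gesture in self.allowed

# Complementary configuration: the back-of-hand pad handles pointing
# gestures while the forearm pad only scrolls (illustrative split).
back_hand = VirtualTouchpad("back_hand", {"tap", "double_tap", "swipe"})
forearm = VirtualTouchpad("forearm", {"scroll"})
```

Switching every pad's `allowed` back to `None` restores interchangeable operation, so the two modes differ only in configuration, not code paths.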
Specification