SYSTEM AND METHOD FOR DIFFERENTIATING BETWEEN INTENDED AND UNINTENDED USER INPUT ON A TOUCHPAD
First Claim

1. A method for distinguishing intended user input from unintended user interaction with a touchpad comprising:

receiving a touch sensor signal from the touchpad indicating points of contact between sensors of the touchpad and a user, wherein a point of contact is a location of a sensor on the touchpad that is in contact with the user;

identifying points of contact corresponding to at least one of the user's fingers and points of contact corresponding to at least one of a user's thumb and a user's palm based on the touch sensor signal and a model of a hand; and

identifying intended user input from the touch sensor signal, wherein the intended user input corresponds to a portion of the touch sensor signal originating from the points of contact corresponding to a user's finger.
Abstract
A method and system for differentiating between intended user input and inadvertent or incidental contact with a touchpad is herein disclosed. When a user engages the touchpad, sensors on the touchpad are activated and generate touch sensor signals. Based on the pattern of engaged sensors, a hand pattern can be determined. From the hand pattern, a hand model may be retrieved. The hand model may indicate passive zones and active zones. Contact in the active zones may be considered intentional, while contact in the passive zones may be considered unintended or incidental. Moreover, a global shift may be calculated, and input from the active zones may be compensated for the global shift. The input from the active zones can then be used to control a graphical user interface.
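The abstract's pipeline (contact points, then a hand model with active and passive zones, then filtering) can be sketched as follows. This is a minimal illustration only: the normalized coordinates, rectangular zone shapes, and all names are assumptions, not details taken from the patent.

```python
# Hypothetical sketch of the zone-based filtering described in the abstract.
# Zone geometry and signal format are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Contact:
    x: float  # touchpad coordinates, normalized to [0, 1]
    y: float

@dataclass
class HandModel:
    # Rectangles (x0, y0, x1, y1) where contact is treated as intentional.
    active_zones: list

    def is_active(self, c: Contact) -> bool:
        return any(x0 <= c.x <= x1 and y0 <= c.y <= y1
                   for (x0, y0, x1, y1) in self.active_zones)

def filter_intended(contacts, model):
    """Keep only contacts in active zones; the rest is incidental."""
    return [c for c in contacts if model.is_active(c)]

model = HandModel(active_zones=[(0.0, 0.0, 1.0, 0.4)])  # fingers near top edge
touches = [Contact(0.5, 0.2), Contact(0.6, 0.9)]        # fingertip + resting palm
print(len(filter_intended(touches, model)))             # 1
```

Only the fingertip contact survives the filter; the palm contact in the passive zone is discarded rather than passed to the user interface.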
129 Citations
34 Claims
1. A method for distinguishing intended user input from unintended user interaction with a touchpad comprising:

receiving a touch sensor signal from the touchpad indicating points of contact between sensors of the touchpad and a user, wherein a point of contact is a location of a sensor on the touchpad that is in contact with the user;

identifying points of contact corresponding to at least one of the user's fingers and points of contact corresponding to at least one of a user's thumb and a user's palm based on the touch sensor signal and a model of a hand; and

identifying intended user input from the touch sensor signal, wherein the intended user input corresponds to a portion of the touch sensor signal originating from the points of contact corresponding to a user's finger.

(Dependent Claims: 2, 3, 4, 5, 6, 7, 8, 9, 10)
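The finger-versus-thumb/palm identification step in claim 1 could be approximated by a patch-size heuristic: fingertips produce small, compact contact patches, while a palm or thumb base activates a large cluster of sensors. The threshold and data layout below are illustrative assumptions, not the patent's disclosed hand model.

```python
# Hypothetical heuristic for the finger-vs-palm step of claim 1: small
# contact patches are treated as fingertips, large patches as palm or
# thumb contact. The area threshold is an assumed, illustrative value.
def classify_contacts(patches):
    """patches: list of (x, y, area_in_sensor_cells) contact points."""
    fingers, palm_or_thumb = [], []
    for x, y, area in patches:
        (fingers if area <= 4 else palm_or_thumb).append((x, y))
    return fingers, palm_or_thumb

fingers, rest = classify_contacts([(10, 3, 2), (14, 3, 3), (12, 20, 30)])
print(len(fingers), len(rest))  # 2 1
```

A fuller implementation would also consult the hand model, since the claim identifies contacts "based on the touch sensor signal and a model of a hand", not on patch size alone.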
11. A method for controlling a user interface using a touchpad comprising:

receiving a touch sensor signal from the touchpad indicating points of contact between sensors of the touchpad and a user, wherein a point of contact is a location of a sensor on the touchpad that is in contact with the user;

estimating a user hand pattern indicating an orientation of the user's hand with respect to the touchpad based on the touch sensor signals;

retrieving a hand model from a hand model database based on the user hand pattern, wherein the hand model indicates active spatial locations relative to the touchpad where the user's hand motions are classified as intended motions and passive spatial locations relative to the touchpad where the user's hand motions are classified as inadvertent contact;

calculating a global shift indicating movement of a majority of the user's hand relative to the touchpad;

adjusting touch sensor signals corresponding to the active spatial locations based on the calculated global shift; and

controlling a user interface based on the adjusted touch sensor signals.

(Dependent Claims: 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23)
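The global-shift steps of claim 11 can be sketched as follows: estimate the common drift of the whole hand between frames, then subtract it from the active-zone motion so that only deliberate finger movement drives the interface. Using the per-axis median as the shift estimate is an assumption for illustration; the patent does not commit to this estimator.

```python
# Sketch of the global-shift compensation in claim 11: if most of the hand
# drifts relative to the touchpad (e.g. the user's grip shifts), that common
# motion is subtracted from active-zone contacts. Median estimate is assumed.
from statistics import median

def global_shift(prev, curr):
    """Median per-axis displacement across all tracked contact points."""
    dx = median(c[0] - p[0] for p, c in zip(prev, curr))
    dy = median(c[1] - p[1] for p, c in zip(prev, curr))
    return dx, dy

def compensate(active_moves, shift):
    """Remove the whole-hand drift from active-zone motion vectors."""
    dx, dy = shift
    return [(mx - dx, my - dy) for mx, my in active_moves]

prev = [(0, 0), (5, 0), (10, 0)]
curr = [(1, 2), (6, 2), (13, 2)]      # whole hand drifted by about (1, 2)
shift = global_shift(prev, curr)
print(compensate([(3, 2)], shift))    # [(2, 0)]
```

The median makes the estimate robust to one contact (the moving fingertip) disagreeing with the majority of the hand, which matches the claim's phrasing "movement of a majority of the user's hand".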
24. A method for controlling a user interface using a touchpad comprising:

receiving touch sensor signals from a touchpad, wherein a touch sensor signal includes a plurality of contact points between a user's hand and the touchpad;

estimating a user hand pattern that indicates an orientation of the user's hand with respect to the touchpad based on the touch sensor signals;

retrieving a hand model from a hand model database based on the user hand pattern, wherein the hand model indicates active spatial locations relative to the touchpad where the user's hand motions are classified as intended motions and passive spatial locations relative to the touchpad where the user's hand motions are classified as inadvertent; and

controlling a graphical user interface based on touch sensor signals received from sensors in the active spatial locations of the touchpad.

(Dependent Claims: 25, 26, 27, 28, 29, 30, 31, 32, 33)
34. A device having a graphical user interface comprising:

a front surface and a rear surface;

a display unit on the front surface of the device;

a touchpad on the rear surface of the device having sensors dispersed along an outer surface of the touchpad, wherein the sensors are sensitive to a user's touch, and wherein the touchpad generates a touch sensor signal indicating locations of points of contact between the user's hand and the touchpad; and

a signal processing module that receives the touch sensor signal and determines a user hand pattern based on the touch sensor signal and a model of a hand, wherein a hand pattern includes points of contact of the user's fingers and points of contact of at least one of the user's palm and the user's thumb, and wherein the signal processing module identifies intended user input from the touch sensor signal, wherein the intended user input corresponds to a portion of the touch sensor signal originating from the points of contact corresponding to a user's finger.
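The device of claim 34 (rear touchpad, front display, signal processing module) could be wired together as in the sketch below. The class names, the area-based finger test, and the cursor behavior are all hypothetical; the patent claims the architecture, not this implementation.

```python
# Hypothetical end-to-end wiring of the device in claim 34: a rear touchpad
# feeds a signal-processing module, which drops palm/thumb contact and
# forwards only fingertip positions to the front display's UI.
class SignalProcessingModule:
    FINGER_MAX_AREA = 4  # sensor cells; assumed threshold for a fingertip

    def intended_input(self, touch_signal):
        """touch_signal: list of (x, y, area) points of contact."""
        return [(x, y) for x, y, area in touch_signal
                if area <= self.FINGER_MAX_AREA]

class Device:
    def __init__(self):
        self.module = SignalProcessingModule()
        self.cursor = (0, 0)  # stands in for the graphical user interface

    def on_touch(self, touch_signal):
        intended = self.module.intended_input(touch_signal)
        if intended:
            self.cursor = intended[0]  # drive the GUI from finger contact only

d = Device()
d.on_touch([(12, 20, 30), (10, 3, 2)])  # large palm patch + small fingertip
print(d.cursor)  # (10, 3)
```

The palm patch gripping the rear touchpad leaves the cursor untouched; only the fingertip contact reaches the interface, which is the behavior the independent claims describe.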
Specification