Using pressure differences with a touch-sensitive display screen
First Claim
1. On a personal electronic device with a touch-sensitive screen, a method for responding to user input, the method comprising:
- receiving a plurality of datapoints from the touch-sensitive screen, each datapoint comprising position information;
- for each of a plurality of the received datapoints, associating a pressure with the datapoint;
- associating at least one rate of change of pressure with at least a subset of the datapoints; and
- based, at least in part, on the associated rate-of-change-of-pressure information, performing a user-interface action.
3 Assignments
0 Petitions
Abstract
Disclosed is a user interface that responds to differences in pressure detected by a touch-sensitive screen. The user selects one type of user-interface action by “lightly” touching the screen and selects another type of action by exerting more pressure. Embodiments can respond to single touches, to gestural touches that extend across the face of the touch-sensitive screen, and to touches in which the user-exerted pressure varies during the course of the touch. Some embodiments respond to how quickly the user changes the amount of pressure applied. In some embodiments, the location and pressure of the user's input are compared against a stored gesture profile. Action is taken only if the input matches “closely enough” to the stored gesture profile. In some embodiments, a notification is sent to the user when the pressure exceeds a threshold between a light and a heavy press.
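The abstract's light-versus-heavy distinction can be sketched as a simple threshold test. This is an illustrative example only, not the patent's implementation: the function names, the normalized pressure scale, and the threshold value are all assumptions made for the sketch.

```python
# Hypothetical sketch: classify a press as "light" or "heavy" by a
# pressure threshold, and notify the user when the heavy threshold is
# crossed (as the abstract describes). The 0.6 threshold and the
# normalized [0, 1] pressure scale are assumed, not from the patent.

LIGHT_HEAVY_THRESHOLD = 0.6  # assumed normalized pressure scale

def classify_press(pressure: float) -> str:
    """Classify a touch by its pressure reading."""
    return "heavy" if pressure >= LIGHT_HEAVY_THRESHOLD else "light"

def handle_touch(pressure: float, notify) -> str:
    """Select an action type; notify the user when the press is heavy."""
    kind = classify_press(pressure)
    if kind == "heavy":
        notify("pressure exceeded light/heavy threshold")
    return kind
```

A light touch (e.g. pressure 0.2) would select one action type without notifying; a heavy press (e.g. 0.9) would select the other and trigger the notification.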
Citations

20 Claims
1. On a personal electronic device with a touch-sensitive screen, a method for responding to user input, the method comprising:
- receiving a plurality of datapoints from the touch-sensitive screen, each datapoint comprising position information;
- for each of a plurality of the received datapoints, associating a pressure with the datapoint;
- associating at least one rate of change of pressure with at least a subset of the datapoints; and
- based, at least in part, on the associated rate-of-change-of-pressure information, performing a user-interface action.

Dependent claims: 2, 3, 4, 5, 6, 7
8. A personal electronic device comprising:
- a touch-sensitive screen; and
- a processor operatively connected to the touch-sensitive screen and configured for:
  - receiving a plurality of datapoints from the touch-sensitive screen, each datapoint comprising position information;
  - for each of a plurality of the received datapoints, associating a pressure with the datapoint;
  - associating at least one rate of change of pressure with at least a subset of the datapoints; and
  - based, at least in part, on the associated rate-of-change-of-pressure information, performing a user-interface action.

Dependent claims: 9, 10
11. On a personal electronic device with a touch-sensitive screen, a method for responding to user input, the method comprising:
- receiving a plurality of datapoints from the touch-sensitive screen, each datapoint comprising position information;
- for each of a plurality of the received datapoints, associating a pressure with the datapoint;
- for a plurality of the datapoints with associated pressure, comparing the position information and the associated pressure against a stored gesture profile; and
- if the compared datapoints are within a threshold of the stored gesture profile, then performing a user-interface action based, at least in part, on the compared datapoints.

Dependent claims: 12, 13, 14, 15
16. A personal electronic device comprising:
- a touch-sensitive screen; and
- a processor operatively connected to the touch-sensitive screen and configured for:
  - receiving a plurality of datapoints from the touch-sensitive screen, each datapoint comprising position information;
  - for each of a plurality of the received datapoints, associating a pressure with the datapoint;
  - for a plurality of the datapoints with associated pressure, comparing the position information and the associated pressure against a stored gesture profile; and
  - if the compared datapoints are within a threshold of the stored gesture profile, then performing a user-interface action based, at least in part, on the compared datapoints.

Dependent claims: 17, 18, 19, 20
Specification