Determining user handedness and orientation using a touchscreen device
First Claim
1. A computer-implemented method comprising:
receiving a touch input through a proximity-sensitive display on a user device;
classifying the touch input as likely a left-handed touch input or a right-handed touch input;
identifying a particular portion of the user interface that is likely to not be visually obscured based on classifying the touch input as likely a left-handed touch input or a right-handed touch input;
determining to position a user interface element on the particular portion of the user interface that, based on classifying the touch input as likely a left-handed touch input or a right-handed touch input, is identified as likely to not be visually obscured; and
providing, for display on the proximity-sensitive display on the user device, the user interface including the user interface element positioned on the particular portion of the user interface that, based on classifying the touch input as likely a left-handed touch input or a right-handed touch input, is identified as likely to not be visually obscured.
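As a non-authoritative illustration, the claimed steps might be sketched as follows. The function names, the contact-ellipse-angle heuristic, and the screen-halving rule are assumptions for the sketch, not details from the patent:

```python
# Hypothetical sketch of claim 1: classify a touch as left- or
# right-handed, then place a UI element in the screen region least
# likely to be visually obscured by the touching hand.

def classify_hand(touch_x: float, touch_major_axis_deg: float) -> str:
    """Classify a touch as likely left- or right-handed.

    Assumption (not from the patent): the contact ellipse of a
    right-handed touch tends to tilt one way, a left-handed touch
    the other, so the ellipse's major-axis angle is used here.
    """
    return "right" if touch_major_axis_deg < 90.0 else "left"

def unobscured_region(hand: str, screen_width: int) -> tuple:
    """Return the x-range of the screen half least likely to be
    covered: the left half for a right-handed touch, the right half
    for a left-handed one."""
    half = screen_width // 2
    return (0, half) if hand == "right" else (half, screen_width)

def place_element(touch_x: float, touch_angle_deg: float,
                  screen_width: int) -> int:
    """Choose an x-coordinate for a UI element in the region
    identified as likely not visually obscured."""
    hand = classify_hand(touch_x, touch_angle_deg)
    lo, hi = unobscured_region(hand, screen_width)
    # Centre the element within the unobscured region.
    return (lo + hi) // 2
```

On real platforms the contact-ellipse angle would come from the touch event itself (for example, a touch's orientation and major-axis values), and the classifier would typically combine several such signals rather than a single threshold.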
Abstract
The present disclosure provides techniques for determining the position and/or orientation of a pointing device relative to the screen on a touchscreen device. A method may include receiving first orientation data from a first device that may include a capacitive touch surface. A touch point may be received indicating a location of a touch by a user on the capacitive touch surface. Second orientation data may be received from a second device. The first and second orientation data may be correlated to determine a relative orientation of the first device to the second device. A position of a pointing device may be determined based on the touch point and the relative orientation of the first and second devices. Additionally, multiple distances relative to a capacitive touch surface may be received, and based on the multiple distances, a position of a user's finger, hand, and/or arm may be determined.
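A minimal sketch of the abstract's correlation step, under the simplifying assumption that each device's orientation reduces to a single yaw angle and that the pointing device sits a fixed reach from the touch point. All names and the 60-unit default reach are hypothetical, not taken from the disclosure:

```python
import math

def relative_yaw(first_yaw_deg: float, second_yaw_deg: float) -> float:
    """Correlate the two devices' orientation data: return the yaw of
    the first device relative to the second, wrapped to (-180, 180]."""
    diff = (first_yaw_deg - second_yaw_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff

def pointer_position(touch_point: tuple, rel_yaw_deg: float,
                     reach: float = 60.0) -> tuple:
    """Estimate where the pointing device (e.g. a stylus or the
    user's hand) sits relative to the touch surface: offset the touch
    point along the relative-orientation direction by an assumed
    reach."""
    tx, ty = touch_point
    rad = math.radians(rel_yaw_deg)
    return (tx + reach * math.cos(rad), ty + reach * math.sin(rad))
```

A fuller implementation would work with full 3-D rotation data (e.g. rotation matrices or quaternions from each device's sensors) and would refine the estimate with the multiple capacitive distance readings the abstract mentions.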
20 Claims
1. A computer-implemented method comprising:
receiving a touch input through a proximity-sensitive display on a user device;
classifying the touch input as likely a left-handed touch input or a right-handed touch input;
identifying a particular portion of the user interface that is likely to not be visually obscured based on classifying the touch input as likely a left-handed touch input or a right-handed touch input;
determining to position a user interface element on the particular portion of the user interface that, based on classifying the touch input as likely a left-handed touch input or a right-handed touch input, is identified as likely to not be visually obscured; and
providing, for display on the proximity-sensitive display on the user device, the user interface including the user interface element positioned on the particular portion of the user interface that, based on classifying the touch input as likely a left-handed touch input or a right-handed touch input, is identified as likely to not be visually obscured.
View Dependent Claims (2, 3, 4, 5, 6, 7)
8. A non-transitory computer-readable storage medium encoded with a computer program, the computer program comprising instructions that, upon execution by a computer, cause the computer to perform operations comprising:
receiving a touch input through a proximity-sensitive display on a user device;
classifying the touch input as likely a left-handed touch input or a right-handed touch input;
identifying a particular portion of the user interface that is likely to not be visually obscured based on classifying the touch input as likely a left-handed touch input or a right-handed touch input;
determining to position a user interface element on the particular portion of the user interface that, based on classifying the touch input as likely a left-handed touch input or a right-handed touch input, is identified as likely to not be visually obscured; and
providing, for display on the proximity-sensitive display on the user device, the user interface including the user interface element positioned on the particular portion of the user interface that, based on classifying the touch input as likely a left-handed touch input or a right-handed touch input, is identified as likely to not be visually obscured.
View Dependent Claims (9, 10, 11, 12, 13, 14)
15. A system comprising:
one or more processors and one or more computer storage media storing instructions that are operable and, when executed by the one or more processors, cause the one or more processors to perform operations comprising:
receiving a touch input through a proximity-sensitive display on a user device;
classifying the touch input as likely a left-handed touch input or a right-handed touch input;
identifying a particular portion of the user interface that is likely to not be visually obscured based on classifying the touch input as likely a left-handed touch input or a right-handed touch input;
determining to position a user interface element on the particular portion of the user interface that, based on classifying the touch input as likely a left-handed touch input or a right-handed touch input, is identified as likely to not be visually obscured; and
providing, for display on the proximity-sensitive display on the user device, the user interface including the user interface element positioned on the particular portion of the user interface that, based on classifying the touch input as likely a left-handed touch input or a right-handed touch input, is identified as likely to not be visually obscured.
View Dependent Claims (16, 17, 18, 19, 20)
Specification