Identifying a target touch region of a touch-sensitive surface based on an image
First Claim
1. A computing system comprising:
- a camera to capture an image representing an object disposed between the camera and a touch-sensitive surface, the touch-sensitive surface to detect a touch input at a location of the touch-sensitive surface;
a processor; and
a non-transitory storage medium storing instructions executable on the processor to:
- identify, based on a characteristic of the object represented in the image, a target touch region of the touch-sensitive surface, the target touch region for receiving touch input, wherein the identifying of the target touch region comprises determining that a first region of the touch-sensitive surface is within a threshold distance from a target portion of the object, determining that a second region of the touch-sensitive surface is outside the threshold distance from the target portion of the object, and including the first region in the target touch region and excluding the second region from the target touch region,
- determine that the location of the detected touch input is not within the identified target touch region of the touch-sensitive surface, and
- reject the detected touch input in response to the determining.
Abstract
Examples disclosed herein relate to identifying a target touch region of a touch-sensitive surface based on an image. Examples include detecting a touch input at a location of a touch-sensitive surface, capturing an image representing an object disposed between the camera that captures the image and the touch-sensitive surface, identifying a target touch region of the touch-sensitive surface based on the image, and rejecting the detected touch input when the location of the detected touch input is not within any identified target touch region of the touch-sensitive surface.
21 Claims
1. A computing system comprising:
- a camera to capture an image representing an object disposed between the camera and a touch-sensitive surface, the touch-sensitive surface to detect a touch input at a location of the touch-sensitive surface;
- a processor; and
- a non-transitory storage medium storing instructions executable on the processor to:
  - identify, based on a characteristic of the object represented in the image, a target touch region of the touch-sensitive surface, the target touch region for receiving touch input, wherein the identifying of the target touch region comprises determining that a first region of the touch-sensitive surface is within a threshold distance from a target portion of the object, determining that a second region of the touch-sensitive surface is outside the threshold distance from the target portion of the object, and including the first region in the target touch region and excluding the second region from the target touch region,
  - determine that the location of the detected touch input is not within the identified target touch region of the touch-sensitive surface, and
  - reject the detected touch input in response to the determining.

Dependent claims: 2, 3, 4, 5, 6, 7, 8.
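Claim 1 recites a distance test: surface regions within a threshold distance of a target portion of the imaged object form the target touch region, and a detected touch outside that region is rejected. A minimal sketch of that logic, assuming axis-aligned rectangular regions, region centers, and a Euclidean distance metric (all of which are illustrative assumptions, not taken from the claim):

```python
# Hypothetical sketch of the claim 1 logic. Region layout, the use of
# region centers, and Euclidean distance are assumptions for illustration.
import math

def identify_target_touch_region(regions, target_portion, threshold):
    """Include each region whose center is within `threshold` of the
    object's target portion; exclude regions outside the threshold."""
    target_region = []
    tx, ty = target_portion
    for region in regions:
        cx, cy = region["center"]
        if math.hypot(cx - tx, cy - ty) <= threshold:
            target_region.append(region)  # the "first region" case: included
        # the "second region" case: outside the threshold, excluded
    return target_region

def handle_touch(touch_location, target_region):
    """Reject the detected touch input when its location is not
    within the identified target touch region."""
    def contains(region, point):
        x0, y0, x1, y1 = region["bounds"]
        return x0 <= point[0] <= x1 and y0 <= point[1] <= y1
    if any(contains(r, touch_location) for r in target_region):
        return "accept"
    return "reject"
```

For example, with a region centered near the object's fingertip and another far away, only the nearby region enters the target touch region, and a touch landing in the distant region is rejected.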
9. A non-transitory machine-readable storage medium comprising instructions executable by a processing resource of a computing system comprising a touch-sensitive surface and a camera disposed above and pointed at the touch-sensitive surface, the instructions executable to:
- acquire, from the touch-sensitive surface, a location of a touch input detected by the touch-sensitive surface;
- acquire, from the camera, an image representing an object disposed between the camera and the touch-sensitive surface;
- identify a current touch input scenario for the computing system of a plurality of touch input scenarios for the computing system, including a hand input scenario and a stylus input scenario;
- identify a target touch region of the touch-sensitive surface based on a location of a portion of the object as represented in the image, wherein the target touch region is identified differently when the hand input scenario is identified than when the stylus input scenario is identified, wherein the identifying of the target touch region comprises determining that a first region of the touch-sensitive surface is within a threshold distance from the portion of the object, determining that a second region of the touch-sensitive surface is outside the threshold distance from the portion of the object, and including the first region in the target touch region and excluding the second region from the target touch region;
- determine whether the location of the touch input is within the identified target touch region of the touch-sensitive surface; and
- reject the touch input in response to determining that the location of the touch input is not within the identified target touch region of the touch-sensitive surface.

Dependent claims: 10, 11, 12, 13, 14, 15, 16.
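Claim 9 adds that the target touch region is identified differently per input scenario (hand vs. stylus). One plausible reading is a scenario-dependent threshold; the sketch below assumes a tighter threshold for a stylus tip than for a hand, with both numeric values purely illustrative:

```python
# Hypothetical sketch of claim 9's scenario-dependent identification.
# The thresholds and the point-based surface model are assumptions;
# the claim only requires that identification differ by scenario.
import math

def threshold_for_scenario(scenario):
    # Assumed values: a stylus tip localizes touch more tightly than a hand.
    return {"stylus": 10.0, "hand": 40.0}[scenario]

def identify_target_touch_region(points, object_portion, scenario):
    """Return the surface points within the scenario-specific threshold
    distance of the imaged object portion; points beyond it are excluded."""
    limit = threshold_for_scenario(scenario)
    ox, oy = object_portion
    return [p for p in points
            if math.hypot(p[0] - ox, p[1] - oy) <= limit]
```

Under these assumptions, a point 28 units from the object portion falls inside the target touch region in the hand scenario but outside it in the stylus scenario.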
17. A method comprising:
- detecting, with a touch-sensitive surface, a touch input at a location of the touch-sensitive surface;
- capturing, with a camera of a computer system disposed above and pointed at the touch-sensitive surface, an image representing an object disposed between the camera and the touch-sensitive surface;
- identifying a target touch region of the touch-sensitive surface based on a characteristic of the object represented in the image, wherein the identifying of the target touch region comprises determining that a first region of the touch-sensitive surface is within a threshold distance from a target portion of the object, determining that a second region of the touch-sensitive surface is outside the threshold distance from the target portion of the object, and including the first region in the target touch region and excluding the second region from the target touch region;
- comparing the location of the touch input to the identified target touch region of the touch-sensitive surface; and
- in response to the comparing indicating that the location of the touch input is not within the identified target touch region, rejecting the detected touch input.

Dependent claims: 18, 19, 20, 21.
Specification