Apparatus and methods for dynamically correlating virtual keyboard dimensions to user finger size
First Claim
1. A method for providing a user interface on a computing device equipped with a touchscreen display and a digital camera configured to be able to image the face of a user when the user is viewing the touchscreen display, comprising:
displaying a moving icon on the touchscreen display for a predetermined eye-tracking training period;
obtaining a first plurality of images of the eyes of the user with the digital camera during the predetermined eye-tracking training period and when the face of the user is at a first distance from the digital camera;
obtaining a second plurality of images of the eyes of the user with the digital camera when the face of the user is at a second distance from the digital camera;
comparing the obtained first plurality of images to known locations of the moving icon during the predetermined eye-tracking training period to determine an image analysis rule;
comparing the obtained second plurality of images to known locations of the moving icon to adjust the image analysis rule;
storing the adjusted image analysis rule in memory;
obtaining a digital image of the eyes of the user of the computing device with the digital camera after the predetermined eye-tracking training period;
determining a location of a gaze of the user based on the obtained digital image and the adjusted image analysis rule;
determining whether the gaze of the user is directed to a portion of the touchscreen display containing an image element without requiring additional user interaction with the user interface;
determining a center of the portion of the touchscreen display to which the gaze of the user is directed; and
enlarging the image element on the touchscreen display based upon a finger size calibration factor related to a size of a finger of the user and a distance from the center of the portion of the touchscreen display to which the gaze of the user is directed in response to determining that the gaze of the user is directed to the portion of the touchscreen display containing the image element.
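The training steps recited above can be sketched in a few lines. The pupil-offset feature, the per-axis least-squares line, and the linear interpolation between the two calibration distances are illustrative assumptions; the claim leaves the form of the "image analysis rule" and of its adjustment open.

```python
def fit_axis(offsets, positions):
    """Least-squares line mapping a pupil offset to a screen coordinate."""
    n = len(offsets)
    mean_x = sum(offsets) / n
    mean_y = sum(positions) / n
    var = sum((x - mean_x) ** 2 for x in offsets)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(offsets, positions))
    slope = cov / var
    return slope, mean_y - slope * mean_x

def train_rule(first_samples, second_samples, d1, d2):
    """first_samples / second_samples: lists of (pupil_offset, icon_position)
    pairs captured while the user follows the moving icon at face-to-camera
    distances d1 and d2, respectively."""
    slope1, b1 = fit_axis(*zip(*first_samples))    # initial rule from distance d1
    slope2, b2 = fit_axis(*zip(*second_samples))   # observations at distance d2
    # Adjusted rule: interpolate the model parameters in distance so the rule
    # can be applied at any face-to-camera distance (one plausible adjustment).
    def rule(pupil_offset, distance):
        t = (distance - d1) / (d2 - d1)
        s = slope1 + t * (slope2 - slope1)
        b = b1 + t * (b2 - b1)
        return s * pupil_offset + b
    return rule
```

After training, the returned `rule` plays the role of the stored "adjusted image analysis rule": given a pupil offset from a later camera image and the current face distance, it yields an estimated gaze coordinate.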
Abstract
Embodiments provide a user interface for computing devices equipped with a touchscreen user interface/display and a digital camera that enhances a portion of a displayed image within a user's gaze. A user may calibrate their mobile device by touching a portion of the touchscreen with one or more fingers and following a moving image on the display with their eyes. The mobile device may track where a user is looking, and if the user is looking at the mobile device display, a portion of the display in the vicinity of the user's gaze may be enhanced in size. In an embodiment, if the user is looking at a virtual keyboard, key icons near the user's gaze may be increased in size commensurate with the user's fingertip size. In this manner, a user can accurately select individual keys in a virtual keyboard that fits within a mobile device display.
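The enlargement behavior described in the abstract can be sketched as a per-key scale factor. The linear falloff with distance from the gaze center and the use of the fingertip-to-key width ratio as the calibration factor are assumptions for illustration; the claims only require that enlargement depend on a finger size calibration factor and on distance from the gaze center.

```python
def enlarged_scale(key_center, gaze_center, finger_width, base_key_width, radius):
    """Scale factor for one key icon: large enough for the calibrated fingertip
    at the gaze center, tapering back to 1.0 at `radius` pixels away."""
    dx = key_center[0] - gaze_center[0]
    dy = key_center[1] - gaze_center[1]
    dist = (dx * dx + dy * dy) ** 0.5
    # Finger size calibration factor: how much wider the fingertip is than a key.
    max_scale = max(1.0, finger_width / base_key_width)
    if dist >= radius:
        return 1.0  # keys far from the gaze keep their normal size
    return 1.0 + (max_scale - 1.0) * (1.0 - dist / radius)
```

For example, with a 60 px fingertip on 30 px keys, a key at the gaze center doubles in size, while a key 80 px away (at the falloff radius) is unchanged.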
32 Claims
1. (Reproduced above under "First Claim.") Dependent claims: 2-8.
9. A computing device, comprising:
a processor;
a memory coupled to the processor;
a digital camera coupled to the processor and configured to be able to image the eyes of a user of the computing device; and
a touchscreen display coupled to the processor, wherein the processor is configured with processor-executable instructions to perform operations comprising:
displaying a moving icon on the touchscreen display for a predetermined eye-tracking training period;
obtaining a first plurality of images of the eyes of the user with the digital camera during the predetermined eye-tracking training period and when the face of the user is at a first distance from the digital camera;
obtaining a second plurality of images of the eyes of the user with the digital camera when the face of the user is at a second distance from the digital camera;
comparing the obtained first plurality of images to known locations of the moving icon during the predetermined eye-tracking training period to determine an image analysis rule;
comparing the obtained second plurality of images to known locations of the moving icon to adjust the image analysis rule;
storing the adjusted image analysis rule in the memory;
obtaining a digital image of the eyes of the user of the computing device with the digital camera after the predetermined eye-tracking training period;
determining a location of a gaze of the user based on the obtained digital image and the adjusted image analysis rule;
determining whether the gaze of the user is directed to a portion of the touchscreen display containing an image element without requiring additional user interaction with a user interface;
determining a center of the portion of the touchscreen display to which the gaze of the user is directed; and
enlarging the image element on the touchscreen display based upon a finger size calibration factor related to a size of a finger of the user and a distance from the center of the portion of the touchscreen display to which the gaze of the user is directed in response to determining that the gaze of the user is directed to the portion of the touchscreen display containing the image element.
Dependent claims: 10-16.
17. A computing device, comprising:
means for displaying a moving icon on a touchscreen display for a predetermined eye-tracking training period;
means for obtaining a first plurality of images of eyes of a user with a digital camera during the predetermined eye-tracking training period and when the face of the user is at a first distance from the digital camera;
means for obtaining a second plurality of images of the eyes of the user when the face of the user is at a second distance from the digital camera;
means for comparing the obtained first plurality of images to known locations of the moving icon during the predetermined eye-tracking training period to determine an image analysis rule;
means for comparing the obtained second plurality of images to known locations of the moving icon to adjust the image analysis rule;
means for storing the adjusted image analysis rule in memory;
means for obtaining a digital image of the eyes of the user of the computing device after the predetermined eye-tracking training period;
means for determining a location of a gaze of the user based on the obtained digital image and the adjusted image analysis rule;
means for determining whether the gaze of the user is directed to a portion of the touchscreen display containing an image element without requiring additional user interaction with a user interface;
means for determining a center of the portion of the touchscreen display to which the gaze of the user is directed; and
means for enlarging the image element on the touchscreen display based upon a finger size calibration factor related to a size of a finger of the user and a distance from the center of the portion of the touchscreen display to which the gaze of the user is directed in response to determining that the gaze of the user is directed to the portion of the touchscreen display containing the image element.
Dependent claims: 18-24.
25. A non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of a computing device to perform operations comprising:
displaying a moving icon on a touchscreen display for a predetermined eye-tracking training period;
obtaining a first plurality of images of eyes of a user of the computing device with a digital camera during the predetermined eye-tracking training period and when the face of the user is at a first distance from the digital camera;
obtaining a second plurality of images of the eyes of the user with the digital camera when the face of the user is at a second distance from the digital camera;
comparing the obtained first plurality of images to known locations of the moving icon during the predetermined eye-tracking training period to determine an image analysis rule;
comparing the obtained second plurality of images to known locations of the moving icon to adjust the image analysis rule;
storing the adjusted image analysis rule in memory;
obtaining a digital image of the eyes of the user of the computing device with the digital camera after the predetermined eye-tracking training period;
determining a location of a gaze of the user based on the obtained digital image and the adjusted image analysis rule;
determining whether the gaze of the user is directed to a portion of the touchscreen display containing an image element without requiring additional user interaction with a user interface;
determining a center of the portion of the touchscreen display to which the gaze of the user is directed; and
enlarging the image element on the touchscreen display based upon a finger size calibration factor related to a size of a finger of the user and a distance from the center of the portion of the touchscreen display to which the gaze of the user is directed in response to determining that the gaze of the user is directed to the portion of the touchscreen display containing the image element.
Dependent claims: 26-32.