User interface control based on head orientation
First Claim
1. A system for distinguishing among a plurality of user interface elements based on head orientation, said system comprising:
- a memory area associated with a computing device, said memory area storing a set of facial reference points including coordinates for at least two eyes of a subject, said memory area further storing a plurality of user interface elements for display on a user interface; and
a processor programmed to:
receive a height of the subject and a distance of the subject from a camera associated with the computing device;
calculate a reference point for a nose of the subject based on the height and the distance of the subject from the camera;
access the set of facial reference points stored in the memory area;
determine a first distance between a first one of the eyes and the nose;
determine a second distance between a second one of the eyes and the nose;
compare the determined first distance to the determined second distance to calculate a head orientation value, the first distance and the second distance being weighted to calculate the head orientation value; and
apply the calculated head orientation value to the user interface to identify at least one of the plurality of user interface elements displayed by the user interface.
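The core comparison the claim recites can be sketched as follows. This is a minimal illustration, assuming 2-D pixel coordinates, a simple signed difference as the weighted comparison, and an arbitrary threshold for mapping the value to an element; the patent text does not prescribe these particulars.

```python
import math

# Minimal sketch of the claimed steps: measure the two eye-to-nose
# distances, weight and compare them to get a head orientation value,
# and use that value to identify a user interface element. The weighting
# scheme, threshold, and element layout are illustrative assumptions.
def head_orientation_value(eye1, eye2, nose, weight=1.0):
    first = math.dist(eye1, nose)     # first distance: eye 1 to nose
    second = math.dist(eye2, nose)    # second distance: eye 2 to nose
    return weight * (first - second)  # weighted comparison of the distances

def identify_element(elements, value, threshold=5.0):
    # Map the signed orientation value to an on-screen element.
    if value > threshold:
        return elements["right"]
    if value < -threshold:
        return elements["left"]
    return elements["center"]
```

With the nose midway between the eyes the two distances are equal, the value is zero, and the center element is identified; as the head turns, one distance shrinks relative to the other and the value moves past the threshold.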
Abstract
Embodiments distinguish among user interface elements based on head orientation. Coordinates representing a set of at least three reference points in an image of a subject gazing on the user interface elements are received by a computing device. The set includes a first reference point and a second reference point located on opposite sides of a third reference point. A first distance between the first reference point and the third reference point is determined. A second distance between the second reference point and the third reference point is determined. The computing device compares the first distance to the second distance to calculate a head orientation value. The computing device selects at least one of the user interface elements based on the head orientation value. In some embodiments, the head orientation value enables the user to navigate a user interface menu or control a character in a game.
20 Claims
1. A system for distinguishing among a plurality of user interface elements based on head orientation, said system comprising:
- a memory area associated with a computing device, said memory area storing a set of facial reference points including coordinates for at least two eyes of a subject, said memory area further storing a plurality of user interface elements for display on a user interface; and
- a processor programmed to:
  - receive a height of the subject and a distance of the subject from a camera associated with the computing device;
  - calculate a reference point for a nose of the subject based on the height and the distance of the subject from the camera;
  - access the set of facial reference points stored in the memory area;
  - determine a first distance between a first one of the eyes and the nose;
  - determine a second distance between a second one of the eyes and the nose;
  - compare the determined first distance to the determined second distance to calculate a head orientation value, the first distance and the second distance being weighted to calculate the head orientation value; and
  - apply the calculated head orientation value to the user interface to identify at least one of the plurality of user interface elements displayed by the user interface.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A method comprising:
- receiving, by a computing device, coordinates representing a set of at least three reference points in an image of a subject gazing on a plurality of user interface elements, the set including a first reference point and a second reference point located on opposite sides of a third reference point, wherein the subject is viewed by a camera, and wherein receiving the coordinates comprises receiving one or more of the following: a height of the subject and a distance from the computing device of the subject;
- calculating the third reference point based on the height and the distance;
- determining a first distance between the first reference point and the third reference point;
- determining a second distance between the second reference point and the third reference point;
- comparing, by the computing device, the determined first distance to the determined second distance to calculate a head orientation value for the subject, the first distance and the second distance being weighted to calculate the head orientation value; and
- selecting, by the computing device, at least one of the plurality of user interface elements based on the calculated head orientation value.
- View Dependent Claims (10, 11, 12, 13, 14, 15, 16)
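The step of calculating the third reference point from the subject's height and distance can be sketched with a pinhole-projection model. The focal length, image size, camera mounting height, and nose-to-height ratio below are all assumptions made for illustration; the claim does not prescribe a particular camera model or proportions.

```python
# Hypothetical sketch of "calculating the third reference point based on
# the height and the distance": project the expected nose position into
# the image with a simple pinhole camera model. All constants here
# (focal length, image size, camera height, nose ratio) are assumptions.
def estimate_third_reference_point(subject_height_m, distance_m,
                                   focal_px=500.0,
                                   image_w_px=640, image_h_px=480,
                                   camera_height_m=1.0,
                                   nose_ratio=0.93):
    nose_height_m = subject_height_m * nose_ratio        # nose sits near the top of the subject
    dy_m = nose_height_m - camera_height_m               # vertical offset from the optical axis
    row = image_h_px / 2 - focal_px * dy_m / distance_m  # image row (origin at top-left)
    col = image_w_px / 2                                 # assume the subject is horizontally centered
    return (col, row)
```

For a 1.8 m subject standing 2 m from the camera, this places the expected nose point a little above the image center, which can then seed the eye/nose distance measurements of the later steps.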
17. One or more computer storage media embodying computer-executable components, said components comprising:
- a communications interface component that when executed causes at least one processor to receive coordinates representing a first set of at least three reference points from a first video frame of a subject gazing on a plurality of user interface elements, the set including a first reference point and a second reference point located on opposite sides of a third reference point, wherein the subject is viewed by a capture device, and wherein receiving the coordinates comprises receiving a height of the subject and a distance of the subject from the capture device, wherein the third reference point is calculated based on the height and the distance of the subject;
- a detection component that when executed causes at least one processor to determine a first distance between the first reference point and the third reference point and to determine a second distance between the second reference point and the third reference point;
- a score component that when executed causes at least one processor to compare the first distance determined by the detection component to the second distance determined by the detection component to calculate a first head orientation value for the subject, the first distance and the second distance being weighted to calculate the head orientation value, wherein the detection component and the score component operate on a second set of at least three reference points from a second video frame to calculate a second head orientation value for the subject; and
- an interaction component that when executed causes at least one processor to select at least one of the plurality of user interface elements based on a comparison between the first head orientation value calculated by the score component and the second head orientation value calculated by the score component.
- View Dependent Claims (18, 19, 20)
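The frame-to-frame flow of the score and interaction components can be sketched as below. The per-frame scoring reuses the eye-to-nose distance comparison; the threshold and the rule that maps the change between the two head orientation values to an element are illustrative assumptions, not taken from the claim.

```python
import math

# Sketch of the component flow: score each video frame, then let the
# interaction step compare the two head orientation values. The
# min_delta threshold and the first/last selection rule are assumptions.
def score(eye1, eye2, nose, weight=1.0):
    # Score component: weighted comparison of the two eye-to-nose distances.
    return weight * (math.dist(eye1, nose) - math.dist(eye2, nose))

def interact(elements, frame1_points, frame2_points, min_delta=2.0):
    # Interaction component: select an element based on how the head
    # orientation value changed between the first and second frames.
    delta = score(*frame2_points) - score(*frame1_points)
    if abs(delta) < min_delta:
        return None  # head essentially still: no selection
    return elements[0] if delta > 0 else elements[-1]
```

A small change between frames selects nothing, so the subject can hold still without triggering the interface; a pronounced turn between the two frames selects an element at the corresponding edge.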
Specification