Enhanced gesture-based image manipulation
First Claim
1. A computer-implemented method comprising:
- causing a display to render an image object in a user interface;
- determining a position of a user's torso;
- defining, with a processing unit, a plane relative to the user's torso, based on the determined position;
- ignoring, by the processing unit, movements of the user that are made on a far side of the plane from a camera capturing images of the user;
- determining, with the processing unit, an extent to which the user is able to reach;
- recognizing, from first and second images captured by the camera, a user's gesture performed between the camera and the plane;
- determining, with the processing unit, an interaction command corresponding to the recognized user's gesture; and
- manipulating, with the processing unit and based on the determined interaction command, the image object displayed in the user interface, wherein a magnitude of the manipulation is related to a proximity of the user's gesture to the determined extent to which the user is able to reach.
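As an illustrative sketch only (not part of the patent, and with all names and the fixed plane offset chosen here as assumptions), the plane-defining and filtering steps recited above could be modeled as a depth threshold between the camera and the torso:

```python
# Illustrative sketch (assumed names/values, not from the patent): define a
# plane relative to the user's torso and ignore movements behind it.
from dataclasses import dataclass


@dataclass
class Point3D:
    x: float
    y: float
    z: float  # depth: distance from the camera, in metres


def define_plane_depth(torso: Point3D, offset: float = 0.2) -> float:
    """Place the plane a fixed offset in front of the torso, toward the camera.

    The 0.2 m offset is a hypothetical choice for this sketch.
    """
    return torso.z - offset


def is_tracked(hand: Point3D, plane_depth: float) -> bool:
    """A movement counts only if it occurs between the camera and the plane."""
    return hand.z < plane_depth


torso = Point3D(0.0, 0.0, 2.0)
plane = define_plane_depth(torso)                 # plane at ~1.8 m depth
print(is_tracked(Point3D(0.1, 0.2, 1.5), plane))  # gesture in front of plane
print(is_tracked(Point3D(0.1, 0.2, 1.9), plane))  # movement behind plane, ignored
```

In this simplification the "plane" is a camera-parallel depth threshold; the claim itself does not restrict the plane's orientation.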
2 Assignments
0 Petitions
Abstract
Enhanced image viewing, in which a user's gesture is recognized from first and second images, an interaction command corresponding to the recognized user's gesture is determined, and, based on the determined interaction command, an image object displayed in a user interface is manipulated.
43 Citations
22 Claims
1. A computer-implemented method comprising:
- causing a display to render an image object in a user interface;
- determining a position of a user's torso;
- defining, with a processing unit, a plane relative to the user's torso, based on the determined position;
- ignoring, by the processing unit, movements of the user that are made on a far side of the plane from a camera capturing images of the user;
- determining, with the processing unit, an extent to which the user is able to reach;
- recognizing, from first and second images captured by the camera, a user's gesture performed between the camera and the plane;
- determining, with the processing unit, an interaction command corresponding to the recognized user's gesture; and
- manipulating, with the processing unit and based on the determined interaction command, the image object displayed in the user interface, wherein a magnitude of the manipulation is related to a proximity of the user's gesture to the determined extent to which the user is able to reach.

Dependent claims: 2-19, 22.
20. An apparatus comprising:
a processor configured to:
- cause a display to render an image object in a user interface;
- determine a position of a user's torso;
- define a plane relative to the user's torso, based on the determined position;
- ignore movements of the user that are made on a far side of the plane from a camera capturing images of the user;
- determine an extent to which the user is able to reach;
- recognize, from first and second images captured by the camera, a user's gesture performed between the camera and the plane;
- determine an interaction command corresponding to the recognized user's gesture; and
- manipulate, based on the determined interaction command, the image object displayed in the user interface, wherein a magnitude of the manipulation is related to a proximity of the user's gesture to the determined extent to which the user is able to reach.
-
21. A non-transitory processor-readable medium comprising processor-readable instructions configured to cause a processor to:
-
causing a display to render an image object in a user interface; determine a position of a user'"'"'s torso; define a plane relative to the user'"'"'s torso, based on the determined position; ignore movements of the user that are made on a far side of the plane from a camera capturing images of the user; determine an extent to which the user is able to reach; recognize, from first and second images captured by the camera, a user'"'"'s gesture performed between the camera and the plane; determine an interaction command corresponding to the recognized user'"'"'s gesture; and manipulate, based on the determined interaction command, the image object displayed in the user interface, wherein a magnitude of the manipulation is related to a proximity of the user'"'"'s gesture to the determined extent to which the user is able to reach.
-
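The claims tie the magnitude of the manipulation to how close the gesture comes to the user's determined reach. As a hedged sketch (all names, the clamping behavior, and the linear mapping are assumptions of this illustration, not recited in the claims), that relationship could look like:

```python
# Illustrative sketch (assumed names, not from the patent): scale the magnitude
# of an image manipulation by the gesture's proximity to the user's full reach.
def manipulation_magnitude(gesture_extension: float,
                           reach_extent: float,
                           max_magnitude: float = 100.0) -> float:
    """Map gesture extension (hand distance from torso, metres) to a
    manipulation magnitude: a gesture at the limit of the user's reach
    yields the maximum effect (e.g. fastest zoom or pan).
    """
    proximity = min(gesture_extension / reach_extent, 1.0)  # clamp to [0, 1]
    return proximity * max_magnitude


# A half-extended arm produces half the maximum magnitude.
print(manipulation_magnitude(0.35, 0.70))  # → 50.0
```

A linear mapping is only one possibility; the claim language ("related to a proximity") would also cover, for example, a nonlinear or thresholded mapping.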
Specification