SYSTEMS AND METHODS FOR GAZE-BASED MEDIA SELECTION AND EDITING
Abstract
Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
Claims
1. A system for editing media images, comprising:

a wearable device;

a scene camera mounted on the device such that the scene camera captures media images of a user's surroundings;

an eye tracking subsystem that projects a reference frame onto the eye and associates the projected reference frame with a second reference frame of a display for capturing eye tracking data of at least one eye of a user; and

one or more processors communicating with the scene camera and eye tracking subsystem for tagging media images captured by the scene camera based at least in part on the eye tracking data.

(Dependent claims: 2-14.)
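The tagging element of claim 1 can be illustrated with a minimal sketch: a processor links each captured frame to any sufficiently long fixation that overlaps it in time. The event schema, field names, and the duration threshold below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GazeEvent:
    """A fixation reported by the eye tracking subsystem (hypothetical schema)."""
    start_s: float  # event start, seconds from capture start
    end_s: float    # event end
    x: float        # gaze point in the display reference frame
    y: float

def tag_frames(frame_times: List[float], fixations: List[GazeEvent],
               min_duration_s: float = 0.2) -> List[dict]:
    """Tag each frame whose timestamp falls inside a fixation that lasted
    at least min_duration_s, recording the gaze point with the frame."""
    tags = []
    for i, t in enumerate(frame_times):
        for fx in fixations:
            if fx.start_s <= t <= fx.end_s and (fx.end_s - fx.start_s) >= min_duration_s:
                tags.append({"frame": i, "gaze": (fx.x, fx.y)})
    return tags
```

For example, with frames at 0.1 s intervals and a single 0.3 s fixation starting at 0.05 s, only the frames captured during that fixation receive a tag; a fixation shorter than the threshold tags nothing.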
15-18. (canceled)
19. A system to quantitatively assess comparative saliency in video images as determined by proximally-located wearable devices with the purpose of recording relevant events from different viewpoints, comprising:

a plurality of wearable devices configured to be worn by individual users, each wearable device including a scene camera mounted thereon such that the scene camera captures media images of the individual user's surroundings, one or more sensors, a communication interface, and an eye tracking subsystem that projects a reference frame onto the user's eye and associates the projected reference frame with a second reference frame of a display for capturing eye tracking data of the user's eye; and

a server for communicating with the wearable devices via each wearable device's communication interface.

(Dependent claims: 20-23.)
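One way a server in a system like claim 19 might compare saliency across proximally-located devices is to collect per-frame saliency scores from each device over the same time window and pick the viewpoint with the highest average. The reporting schema and the mean-score criterion are assumptions for illustration only.

```python
from typing import Dict, List

def pick_best_viewpoint(saliency_by_device: Dict[str, List[float]]) -> str:
    """Return the device ID whose reported per-frame saliency scores
    (over a shared time window) have the highest mean. Ties resolve
    to the first device in iteration order."""
    def mean(scores: List[float]) -> float:
        return sum(scores) / len(scores) if scores else 0.0
    return max(saliency_by_device, key=lambda dev: mean(saliency_by_device[dev]))
```

A real server would also need to align the devices' clocks and weight scores by sensor data (e.g., head motion), which the sketch omits.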
24. A method for selecting or editing media images from a wearable device worn by a user, comprising:

capturing media images of the user's surroundings using a scene camera on the wearable device;

capturing eye tracking data using an eye tracking subsystem that projects a reference frame onto the eye and associates the projected reference frame with a second reference frame of a display for capturing eye tracking data of at least one eye of a user; and

at least one of selecting and editing the media images based at least in part on eye events identified from the eye tracking data.
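The selecting step of claim 24 can be sketched as turning discrete eye events into clip intervals, e.g., treating a deliberate "mark" event (such as a long blink or dwell) as a request to keep footage around that moment. The event names and padding value are hypothetical.

```python
from typing import List, Tuple

def select_clips(events: List[Tuple[str, float]],
                 pad_s: float = 1.0) -> List[Tuple[float, float]]:
    """Convert ('mark', timestamp) eye events into (start, end) clip
    intervals padded by pad_s seconds on each side, clamped at 0.
    Other event kinds (e.g., 'saccade') are ignored."""
    return [(max(0.0, t - pad_s), t + pad_s)
            for kind, t in events if kind == "mark"]
```

For instance, a mark at 5.0 s with 1.0 s padding selects the interval from 4.0 s to 6.0 s of the captured media.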
25-39. (canceled)
Specification