System, method and apparatus of eye tracking or gaze detection applications including facilitating action on or interaction with a simulated object
First Claim
1. A method to facilitate action by a user via a device on a simulated object in an augmented reality environment, the method comprising:
- detecting a gesture of the user in a real environment via a sensor of the device;
wherein the gesture includes one or more of:
movement of one or more eyes of the user, and
movement of an eye focal point of the one or more eyes of the user;
capturing the gesture to implement the action on the simulated object in the augmented reality environment;
wherein the gesture is detected based on a given speed or velocity of the movement of the one or more eyes while the one or more eyes are in motion relative to the user;
wherein the action to be performed on the simulated object is based on the gesture that is detected;
wherein the gesture includes locating the eye focal point of the one or more eyes of the user to target the simulated object that is in the eye focal point;
wherein the eye focal point is determined from the given speed or velocity.
Abstract
Techniques are disclosed for facilitating action by a user on a simulated object in an augmented reality environment. In some embodiments, a method includes detecting a gesture of the user in a real environment via a sensor of the device, wherein the gesture includes movement of an eye ball or eye focal point of one or more eyes of the user. The gesture can be detected by tracking: a movement of one or more eyes of the user, a non-movement of one or more eyes of the user, a location of a focal point of one or more eyes of the user, and/or a movement of an eye lid of one or more eyes of the user. The gesture can be captured to implement the action on the simulated object in the augmented reality environment.
258 Citations
30 Claims
1. A method to facilitate action by a user via a device on a simulated object in an augmented reality environment, the method comprising:
detecting a gesture of the user in a real environment via a sensor of the device;
wherein the gesture includes one or more of:
movement of one or more eyes of the user, and
movement of an eye focal point of the one or more eyes of the user;
capturing the gesture to implement the action on the simulated object in the augmented reality environment;
wherein the gesture is detected based on a given speed or velocity of the movement of the one or more eyes while the one or more eyes are in motion relative to the user;
wherein the action to be performed on the simulated object is based on the gesture that is detected;
wherein the gesture includes locating the eye focal point of the one or more eyes of the user to target the simulated object that is in the eye focal point;
wherein the eye focal point is determined from the given speed or velocity.
Dependent claims: 2-16.
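Claim 1's "locating the eye focal point ... to target the simulated object" could be sketched as averaging the fixation samples into a focal point and hit-testing it against object bounds. The object representation (named 2D bounding boxes) and function names below are hypothetical, chosen only for illustration:

```python
# Sketch: estimate the eye focal point as the centroid of samples
# already classified as fixation, then select the simulated object
# whose bounds contain that point. Bounding-box representation is an
# illustrative assumption, not from the patent.

def focal_point(fixation_samples):
    """fixation_samples: list of (x, y) gaze points during a fixation.
    Returns their centroid as the estimated eye focal point."""
    n = len(fixation_samples)
    return (sum(p[0] for p in fixation_samples) / n,
            sum(p[1] for p in fixation_samples) / n)

def target_object(point, objects):
    """objects: dict mapping object name -> (xmin, ymin, xmax, ymax).
    Returns the name of the first object containing the point, or None."""
    x, y = point
    for name, (xmin, ymin, xmax, ymax) in objects.items():
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return name
    return None
```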
17. A system to enable interaction with a simulated object by a user in a simulated environment, the system comprising:
a processor;
a memory having stored thereon instructions which, when executed by the processor, cause the system to:
detect a gesture of the user in a real environment;
wherein the gesture includes one or more of:
movement of an eye ball of one or more eyes of the user, and
movement of an eye focal point of the one or more eyes of the user;
capture the gesture to facilitate the interaction with the simulated object in the simulated environment;
wherein the gesture is captured using a given speed or velocity of the movement of the one or more eyes while the one or more eyes are in motion relative to the user;
wherein the interaction with the simulated object that is enabled in the simulated environment is based on the gesture of the user;
wherein the gesture includes locating the eye focal point of the one or more eyes of the user to target the simulated object that is in the eye focal point;
wherein the eye focal point is determined from the given speed or velocity.
Dependent claims: 18-25.
26. An apparatus which facilitates action by a user on a virtual object in a digital environment, the apparatus comprising:
a sensor which detects a gesture of the user in a real environment;
wherein the gesture includes one or more of:
movement of an eye ball of one or more eyes of the user, and
movement of an eye focal point of the one or more eyes of the user;
the gesture being captured to implement the action to be performed on the virtual object in the digital environment;
wherein the gesture is detected based on a given speed or velocity of the movement of the one or more eyes while the one or more eyes are in motion relative to the user;
wherein the action to be performed on the virtual object is based on the gesture that is detected;
wherein the gesture includes locating the eye focal point of the one or more eyes of the user to target the virtual object that is in the eye focal point;
wherein the eye focal point is determined from the given speed or velocity of the movement of the one or more eyes.
Dependent claims: 27-29.
30. A non-transitory machine-readable storage medium, having stored thereon instructions which, when executed by a processor, cause the processor to perform a method to facilitate a transaction, via an online environment, of a physical product in a real environment that is represented by a simulated object, the method comprising:
capturing a gesture performed with respect to the simulated object;
wherein the gesture is performed by a user of the real environment to initiate the transaction of the physical product associated with the simulated object in the online environment;
wherein the gesture performed by the user with respect to the simulated object includes movement of one or more eyes of the user;
wherein the gesture is captured using a given speed or velocity of the movement of the one or more eyes while the one or more eyes are in motion relative to the user;
wherein the gesture includes locating an eye focal point of the one or more eyes of the user to target the simulated object that is in the eye focal point;
wherein the eye focal point is determined from the given speed or velocity;
conducting the transaction of the physical product in the online environment, according to the gesture performed by the user with respect to the simulated object associated with the physical product.
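Claim 30 ties the gaze gesture to initiating a transaction on the represented physical product. One plausible reading is a dwell-based trigger, where a sustained fixation on the product's simulated object places the order. The dwell threshold, event format, and `place_order` callback below are all assumptions for illustration, not details from the claim:

```python
# Sketch: initiate a transaction when gaze dwells on the product's
# simulated object for a minimum accumulated duration. The 1-second
# dwell threshold and the order-placing callback are illustrative
# assumptions.
DWELL_SECONDS = 1.0

def maybe_transact(fixation_events, product_object, place_order):
    """fixation_events: list of (object_name, duration_s) fixations.
    Calls place_order(product_object) once if the accumulated dwell on
    product_object reaches DWELL_SECONDS; returns True if triggered."""
    dwell = sum(d for name, d in fixation_events if name == product_object)
    if dwell >= DWELL_SECONDS:
        place_order(product_object)
        return True
    return False
```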
Specification