
Eye tracking systems and method for augmented or virtual reality

  • US 10,825,248 B2
  • Filed: 05/08/2015
  • Issued: 11/03/2020
  • Est. Priority Date: 04/18/2014
  • Status: Active Grant
First Claim

1. A method, comprising:

  • rendering one or more virtual objects in a virtual or augmented reality environment with a virtual image generation subsystem of a virtual content display system, altering, with a variable focus element, one or more rays or cones of light among multiple planes that correspond to respective virtual depths towards at least one eye of a user, wherein:

      the virtual image generation subsystem comprises a graphics processing unit, at least one projector, the variable focus element, a first assembly, and a second assembly that is operatively coupled to the first assembly,

      the first assembly and the second assembly receive power from a portable power source separate from the virtual image generation subsystem,

      the first assembly includes a first board and a projector driver providing image information and signals to the at least one projector,

      the second assembly includes a second board, a microprocessor, the graphics processing unit, and one or more motion sensors or transducers, and

      the at least one projector projects image information of the one or more virtual objects to the at least one eye of the user;

    representing the at least one eye with an eye model that comprises a first circular shape or circle and a second circular shape or circle, wherein the second circular shape or circle represents a cornea of the at least one eye and is layered on top of the first circular shape or circle;

    detecting, with at least an eye tracking device, one or more characteristics pertaining to an interaction between the at least one eye of the user and reflected light from the at least one eye at least by:

    capturing, by a first set of sensors or transducers in the first assembly, a first light pattern emitted or reflected from one or more ambient light sources in a real-world environment;

    projecting, with at least the projector driver in the first assembly and the at least one projector, a second light pattern generated by a set of light sources in the first assembly to the at least one eye of the user;

    in response to the first light pattern, detecting the reflected light from the at least one eye using one or more second sensors or transducers in the first assembly; and

    determining, by the microprocessor in the second assembly, the interaction at least by correlating the reflected light with the first light pattern and the second light pattern;

    determining, by the microprocessor in the second assembly, an eye pointing vector and a center of rotation of the at least one eye in the eye model using at least the first circular shape or circle and the second circular shape or circle based at least in part upon a characteristic of a cross-section of the at least one eye and a range of movement of the at least one eye;

    determining, by the microprocessor in the second assembly, a vectored distance for the eye pointing vector based at least in part upon the one or more characteristics and the center of rotation for the at least one eye in the eye model; and

    determining, by the microprocessor in the second assembly, at least one movement or pose for both the at least one eye and another eye of the user at least by using the vectored distance of the at least one eye and further by extrapolating one or more eye movement or pose characteristics with at least one or more parameters pertaining to the interaction and captured by one or more sensors for the at least one eye of the user.
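The geometric steps recited above (a two-circle eye model, a center of rotation, an eye pointing vector, a vectored distance, and extrapolation to the other eye) can be sketched roughly as follows. This is an illustrative sketch only, not the patented implementation: the names (`EyeModel`, `pointing_vector`, `vectored_distance`, `extrapolate_other_eye`), the 2-D cross-section, the radii, and the interpupillary offset are all assumptions made for illustration.

```python
import math
from dataclasses import dataclass

Vec = tuple[float, float]

def sub(a: Vec, b: Vec) -> Vec:
    return (a[0] - b[0], a[1] - b[1])

def unit(v: Vec) -> Vec:
    n = math.hypot(v[0], v[1])
    return (v[0] / n, v[1] / n)

@dataclass
class EyeModel:
    """Two-circle eye model: a first circle for the eyeball cross-section
    and a second, smaller circle, layered on top of it, for the cornea."""
    eyeball_center: Vec   # taken here as the center of rotation
    eyeball_radius: float
    cornea_center: Vec    # center of the corneal circle
    cornea_radius: float

def pointing_vector(eye: EyeModel) -> Vec:
    """Eye pointing vector: unit vector from the center of rotation
    through the center of the corneal circle."""
    return unit(sub(eye.cornea_center, eye.eyeball_center))

def vectored_distance(eye: EyeModel, gaze_target: Vec) -> float:
    """A stand-in for the claim's 'vectored distance': distance from the
    center of rotation to a fixated point along the pointing vector."""
    d = sub(gaze_target, eye.eyeball_center)
    return math.hypot(d[0], d[1])

def extrapolate_other_eye(eye: EyeModel, gaze_target: Vec, ipd: float) -> Vec:
    """Rough vergence extrapolation: place the other eye's center of
    rotation one interpupillary distance away and point it at the same
    fixated target."""
    other_center = (eye.eyeball_center[0] + ipd, eye.eyeball_center[1])
    return unit(sub(gaze_target, other_center))
```

For example, an eye at the origin with its corneal circle displaced up-right yields a pointing vector of roughly (0.707, 0.707), and a target at (0.35, 0.35) gives a vectored distance of 0.35·√2 ≈ 0.495.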
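The detection steps (capturing an ambient first light pattern, projecting a second light pattern, and correlating the reflected light with both) can be sketched as a normalized correlation over sampled intensities. This is a minimal sketch under the assumption of 1-D intensity traces sampled at a common rate; the function names and the Pearson-style correlation measure are illustrative choices, not taken from the patent.

```python
import math

def correlate(reflected, pattern):
    """Normalized (Pearson-style) cross-correlation between a reflected
    intensity trace and a known light pattern of the same length."""
    n = len(reflected)
    mr = sum(reflected) / n
    mp = sum(pattern) / n
    num = sum((r - mr) * (p - mp) for r, p in zip(reflected, pattern))
    den = math.sqrt(sum((r - mr) ** 2 for r in reflected) *
                    sum((p - mp) ** 2 for p in pattern))
    return num / den if den else 0.0

def attribute_reflection(reflected, first_pattern, second_pattern):
    """Decide which source the reflection correlates with more strongly:
    the ambient (first) pattern or the projected (second) pattern."""
    c1 = correlate(reflected, first_pattern)
    c2 = correlate(reflected, second_pattern)
    return ("second", c2) if c2 >= c1 else ("first", c1)
```

A reflection that tracks the projected pattern closely will score near 1.0 against it and near 0 against an uncorrelated ambient pattern, which is what lets the microprocessor attribute glints to the system's own light sources.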

  • 3 Assignments