FOCUS ADJUSTING HEADSET
First Claim
1. A method comprising:
receiving, by a virtual reality headset, virtual scene data to display a virtual scene on an electronic display element of the virtual reality headset;
determining an eye position and gaze lines for each eye of a user wearing the virtual reality headset via images of each eye captured by an eye tracking system included in the virtual reality headset;
determining a vergence point for the user in the virtual scene based on an estimated intersection of the gaze lines for each eye of the user;
displaying the virtual scene on the electronic display element, the virtual reality headset including an optics block and a spatial light modulator (SLM) in optical series that receive a wavefront of light of the virtual scene displayed on the electronic display element; and
adjusting, via the SLM, the wavefront of the light from the electronic display element based at least in part on the determined vergence point for the user, the SLM directing the adjusted wavefront of the light of the virtual scene to the user via an exit pupil of the virtual reality headset.
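The claim's vergence point is the estimated intersection of the two gaze lines, which in practice rarely meet exactly in 3D. Below is a minimal numerical sketch, assuming each gaze line is given as an eye position and a direction vector in a common headset coordinate frame; the function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def estimate_vergence_point(eye_left, dir_left, eye_right, dir_right):
    """Midpoint of the shortest segment between the two gaze lines,
    used as the estimated intersection (vergence point)."""
    d1 = dir_left / np.linalg.norm(dir_left)
    d2 = dir_right / np.linalg.norm(dir_right)
    r = eye_left - eye_right
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        # Gaze lines are (nearly) parallel: project one eye position
        # onto the other gaze line instead.
        t1, t2 = 0.0, e / c
    else:
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom
    p1 = eye_left + t1 * d1    # closest point on the left gaze line
    p2 = eye_right + t2 * d2   # closest point on the right gaze line
    return 0.5 * (p1 + p2)

# Example: eyes 63 mm apart, both converging on a point 1 m straight ahead
p = estimate_vergence_point(np.array([-0.0315, 0.0, 0.0]), np.array([0.0315, 0.0, -1.0]),
                            np.array([0.0315, 0.0, 0.0]), np.array([-0.0315, 0.0, -1.0]))
print(p)  # approximately [0, 0, -1]
```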
Abstract
A virtual reality (VR) headset adjusts the phase of light of a virtual scene received from a display element using a spatial light modulator (SLM) to accommodate changes in vergence for a user viewing objects in the virtual scene. The VR headset receives virtual scene data that includes depth information for components of the virtual scene, and the SLM adjusts a wavefront of the light of the virtual scene by generating a phase function that applies phase delays based on the depth values. Individual phase delays shift components of the virtual scene, according to their depth values, to a target focal plane that accommodates the user at the vergence depth for a frame of the virtual scene. Further, the SLM can provide optical defocus by shifting components of the virtual scene with the phase delays to produce depth-of-field blur.
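The abstract describes a phase function built from per-component phase delays derived from the depth values. Below is a rough numerical sketch of one way such a phase map could be formed, assuming each delay acts like a thin-lens quadratic phase whose optical power moves content from its rendered depth to the target focal plane; the design wavelength, sign convention, and per-pixel (rather than per-component) treatment are assumptions for illustration, not details from the patent.

```python
import numpy as np

WAVELENGTH = 532e-9  # assumed design wavelength in meters

def phase_map(depth_m, x_m, y_m, target_depth_m):
    """Thin-lens-style phase delays that shift scene content from its own
    depth toward the target focal plane (the frame's vergence depth).

    depth_m        : HxW array of depth values for the displayed frame (m)
    x_m, y_m       : HxW arrays of SLM-plane coordinates (m)
    target_depth_m : scalar focal distance to accommodate for this frame (m)
    """
    # Change in optical power (diopters) needed per pixel.
    delta_power = 1.0 / target_depth_m - 1.0 / depth_m
    # Quadratic phase of a thin lens with that power: phi = -(pi / lambda) * P * r^2
    r2 = x_m ** 2 + y_m ** 2
    phi = -np.pi / WAVELENGTH * delta_power * r2
    # Phase-only SLMs address phase modulo 2*pi.
    return np.mod(phi, 2.0 * np.pi)
```

Content whose depth already matches the target plane receives no added power, while content far from it retains residual power, which corresponds to the optical defocus and depth-of-field blur described in the abstract.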
20 Claims
1. A method comprising:
receiving, by a virtual reality headset, virtual scene data to display a virtual scene on an electronic display element of the virtual reality headset;
determining an eye position and gaze lines for each eye of a user wearing the virtual reality headset via images of each eye captured by an eye tracking system included in the virtual reality headset;
determining a vergence point for the user in the virtual scene based on an estimated intersection of the gaze lines for each eye of the user;
displaying the virtual scene on the electronic display element, the virtual reality headset including an optics block and a spatial light modulator (SLM) in optical series that receive a wavefront of light of the virtual scene displayed on the electronic display element; and
adjusting, via the SLM, the wavefront of the light from the electronic display element based at least in part on the determined vergence point for the user, the SLM directing the adjusted wavefront of the light of the virtual scene to the user via an exit pupil of the virtual reality headset.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
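Claim 1 ties the wavefront adjustment to the determined vergence point. A closely related quantity is the vergence depth, which in the simplified planar case follows directly from the interpupillary distance and the horizontal gaze angles. A small sketch of that relationship, with all names and the 63 mm example value chosen for illustration:

```python
import math

def vergence_depth(ipd_m, left_angle_rad, right_angle_rad):
    """Planar vergence-depth estimate from the horizontal gaze angles of
    the two eyes (measured from straight ahead, convergence positive):
    tan(a_left) + tan(a_right) = ipd / depth."""
    denom = math.tan(left_angle_rad) + math.tan(right_angle_rad)
    if denom <= 0.0:
        return float("inf")  # parallel or diverging gaze: effectively at infinity
    return ipd_m / denom

# Example: 63 mm interpupillary distance, each eye rotated about 3 degrees inward
print(vergence_depth(0.063, math.radians(3.0), math.radians(3.0)))  # ~0.60 m
```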
9. A method comprising:
receiving, by a virtual reality headset, virtual scene data for displaying a virtual scene on an electronic display element of the virtual reality headset;
determining a position and an orientation of the virtual reality headset;
determining a portion of the virtual scene to display on the electronic display element based on the position and the orientation of the virtual reality headset;
displaying the portion of the virtual scene on the electronic display element, the virtual reality headset including a spatial light modulator (SLM) that receives a wavefront of light for the portion of the virtual scene displayed on the electronic display element; and
adjusting, via the SLM, the wavefront of the light received for the portion of the virtual scene displayed on the electronic display element based at least in part on a plurality of depth values associated with different locations within the portion of the virtual scene, the SLM directing the adjusted wavefront received from the electronic display element to an exit pupil of the virtual reality headset.
View Dependent Claims (10, 11, 12, 13, 14)
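Claim 9 conditions the displayed portion of the virtual scene on the headset's position and orientation without specifying how that portion is selected. One simple illustration, assuming an equirectangular scene, a -Z forward axis, and a y-up coordinate frame (all assumptions made for the sketch, not taken from the claim):

```python
import numpy as np

def rotate_by_quaternion(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def visible_window(q, pano_w, pano_h, fov_x_deg=90.0, fov_y_deg=90.0):
    """Pixel window of an equirectangular scene to display for the current
    headset orientation (seam wrap-around at the +/-180 degree edge ignored)."""
    forward = rotate_by_quaternion(np.asarray(q, dtype=float),
                                   np.array([0.0, 0.0, -1.0]))
    yaw = np.arctan2(forward[0], -forward[2])            # [-pi, pi]
    pitch = np.arcsin(np.clip(forward[1], -1.0, 1.0))    # [-pi/2, pi/2]
    cx = (yaw / (2.0 * np.pi) + 0.5) * pano_w            # window center (pixels)
    cy = (0.5 - pitch / np.pi) * pano_h
    half_w = (fov_x_deg / 360.0) * pano_w / 2.0
    half_h = (fov_y_deg / 180.0) * pano_h / 2.0
    return (int(cx - half_w), int(cy - half_h), int(cx + half_w), int(cy + half_h))
```

The depth values associated with that window can then drive the same kind of phase map sketched after the abstract.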
15. A virtual reality headset comprising:
at least one processor;
an eye tracking system including an image capturing element, the eye tracking system configured to determine an eye position of each eye of a user wearing the virtual reality headset and gaze lines for each eye of the user;
memory including instructions that, when executed by the at least one processor, cause the at least one processor to determine a viewing location of the user within the virtual scene based on an estimated intersection of the gaze lines for each eye;
an electronic display element configured to display a virtual scene; and
an optics block including a spatial light modulator (SLM) configured to:
receive light of the virtual scene displayed by the electronic display element, the light including a wavefront;
adjust the wavefront of the light for the virtual scene based at least in part on the viewing location of the user; and
direct the adjusted wavefront of the light to an exit pupil of the virtual reality headset, the adjusted wavefront providing optical defocus to the virtual scene based on the viewing location of the user.
View Dependent Claims (16, 17, 18, 19, 20)
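Claim 15 recites the same flow as a system: an eye tracking system, an electronic display element, and an optics block with an SLM. A compact sketch of how those pieces could be tied together per frame, where the injected callables are hypothetical stand-ins for hardware interfaces rather than anything named in the patent:

```python
import numpy as np

class FocusAdjustingHeadset:
    """Per-frame flow suggested by claim 15: eye tracking yields a viewing
    location, the SLM adjusts the wavefront, and the display shows the scene.
    All three callables are hypothetical stand-ins, not patent interfaces."""

    def __init__(self, get_viewing_depth, set_slm_phase, show_on_display,
                 slm_x, slm_y, wavelength=532e-9):
        self.get_viewing_depth = get_viewing_depth  # () -> viewing depth (m)
        self.set_slm_phase = set_slm_phase          # (HxW phase array) -> None
        self.show_on_display = show_on_display      # (image array) -> None
        self.slm_x, self.slm_y = slm_x, slm_y       # SLM-plane coordinates (m)
        self.wavelength = wavelength                # assumed design wavelength

    def render_frame(self, image, depth):
        # 1. Viewing location from the eye tracking system (the gaze-line
        #    intersection itself is sketched earlier in this document).
        target = self.get_viewing_depth()
        # 2. Thin-lens-style phase delays moving each pixel's content from
        #    its own depth toward the viewing depth; residual power on
        #    off-target content provides optical defocus (retinal blur).
        power = 1.0 / target - 1.0 / depth
        r2 = self.slm_x ** 2 + self.slm_y ** 2
        phase = np.mod(-np.pi / self.wavelength * power * r2, 2.0 * np.pi)
        self.set_slm_phase(phase)
        # 3. Display the frame; the optics block and SLM relay the adjusted
        #    wavefront to the exit pupil of the headset.
        self.show_on_display(image)
```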
Specification