Waveguide display with multiple focal depths
First Claim
1. A near-eye optical display system configured to enable a user to view a mixed-reality environment comprising real-world images and virtual images, comprising:
an imager configured to generate virtual images;
a waveguide configured to enable the user to see through the waveguide to view real-world images and including an in-coupling element for in-coupling virtual images into the waveguide, and an out-coupling element for out-coupling virtual images from the see-through waveguide;
an array of lenses, the array configured to impart variable focal depth to virtual images out-coupled from the waveguide to the user's eye and further configured to pass real-world images to the user's eye without imparting change in focal depth;
a first diffractive optical element (DOE) having an input surface and configured as an in-coupling grating to receive imaging light incorporating the virtual images from the imager as an input;
a second DOE configured for pupil expansion of the imaging light along a first direction; and
a third DOE having an output surface and configured for pupil expansion of the imaging light along a second direction, and further configured as an out-coupling element to out-couple, as an output to the array of lenses from the output surface, imaging light with expanded pupil relative to the input.
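The three-DOE pipeline recited above (in-couple the imaging light, expand the exit pupil along one direction, then expand along the second direction and out-couple) can be sketched as a simple data-flow model. This is an illustration only; the `Pupil` type, the millimetre dimensions, and the expansion factors are assumptions, not values from the patent:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Pupil:
    """Illustrative exit-pupil extent in millimetres."""
    width_mm: float
    height_mm: float

def in_couple(p: Pupil) -> Pupil:
    """First DOE: in-coupling grating; pupil extent unchanged."""
    return p

def expand_first_axis(p: Pupil, factor: float) -> Pupil:
    """Second DOE: pupil expansion along the first direction."""
    return replace(p, width_mm=p.width_mm * factor)

def expand_second_axis_and_out_couple(p: Pupil, factor: float) -> Pupil:
    """Third DOE: expansion along the second direction, then out-coupling."""
    return replace(p, height_mm=p.height_mm * factor)

pupil = in_couple(Pupil(2.0, 2.0))
pupil = expand_first_axis(pupil, 5.0)
pupil = expand_second_axis_and_out_couple(pupil, 5.0)
# pupil is now expanded relative to the input along both directions
```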
Abstract
A near-eye optical display system utilized in augmented reality devices includes a see-through waveguide display having optical elements configured for in-coupling virtual images from an imager, exit pupil expansion, and out-coupling virtual images with expanded pupil to the user's eye. The near-eye optical display system further includes a curved two-sided array of electrically-activated tunable liquid crystal (LC) microlenses that is located between the waveguide and the user's eye. The LC microlenses are distributed in layers on each side of the two-sided array. Each pixel in the waveguide display is mapped to an LC microlens in the array, and multiple nearby pixels may be mapped to the same LC microlens. A region of the waveguide display that the user is gazing upon is detected and the LC microlens that is mapped to that region may be electrically activated to thereby individually shape the wavefront of each pixel in a virtual image.
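The pixel-to-microlens mapping and gaze-driven activation described in the abstract can be sketched as follows. This is a minimal illustration: the grid dimensions, the integer-division mapping, and the `activate_for_gaze` interface are assumptions for the sketch, not details disclosed in the patent:

```python
# Sketch of the abstract's mapping: multiple nearby display pixels map to
# one tunable LC microlens, and only the microlens under the detected gaze
# region is electrically activated.

DISPLAY_W, DISPLAY_H = 1280, 720   # waveguide display pixels (assumed)
LENS_COLS, LENS_ROWS = 40, 24      # microlens array size (assumed)

def pixel_to_lens(px: int, py: int) -> tuple[int, int]:
    """Map a display pixel to the microlens covering it."""
    lx = px * LENS_COLS // DISPLAY_W
    ly = py * LENS_ROWS // DISPLAY_H
    return lx, ly

def activate_for_gaze(gaze_px: int, gaze_py: int, active: set) -> None:
    """Electrically activate only the microlens mapped to the gaze point."""
    active.clear()
    active.add(pixel_to_lens(gaze_px, gaze_py))

active_lenses: set = set()
activate_for_gaze(640, 360, active_lenses)
# nearby pixels such as (641, 361) map to the same microlens as (640, 360)
```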
19 Claims
1. A near-eye optical display system configured to enable a user to view a mixed-reality environment comprising real-world images and virtual images, comprising:
an imager configured to generate virtual images;
a waveguide configured to enable the user to see through the waveguide to view real-world images and including an in-coupling element for in-coupling virtual images into the waveguide, and an out-coupling element for out-coupling virtual images from the see-through waveguide;
an array of lenses, the array configured to impart variable focal depth to virtual images out-coupled from the waveguide to the user's eye and further configured to pass real-world images to the user's eye without imparting change in focal depth;
a first diffractive optical element (DOE) having an input surface and configured as an in-coupling grating to receive imaging light incorporating the virtual images from the imager as an input;
a second DOE configured for pupil expansion of the imaging light along a first direction; and
a third DOE having an output surface and configured for pupil expansion of the imaging light along a second direction, and further configured as an out-coupling element to out-couple, as an output to the array of lenses from the output surface, imaging light with expanded pupil relative to the input.
Dependent claims: 2, 3, 4.
5. An electronic device supporting an augmented reality experience including virtual images and real-world images for a user, comprising:
a virtual image processor configured to provide virtual image data;
an optical engine configured to produce virtual images from the virtual image data;
an exit pupil expander, responsive to one or more input optical beams incorporating the virtual images, comprising a structure on which multiple diffractive optical elements (DOEs) are disposed including an out-coupling DOE; and
a curved array of electrically-modulated tunable lenses, each lens configured to assume a particular wavefront shape to thereby impart multiple focal depths to the virtual images, wherein the array is located on the electronic device between an eye of the user and the out-coupling DOE when the user operates the electronic device, and wherein the exit pupil expander is configured to provide one or more out-coupled optical beams at the out-coupling DOE to the array with an expanded exit pupil relative to the one or more input optical beams.
Dependent claims: 6, 7, 8, 9, 10, 11, 12, 13, 14.
15. A method for selectively providing variable focus to virtual images in an augmented reality display system that supports virtual images and real-world images, comprising:
receiving, from an imager, imaging light incorporating a virtual image at an in-coupling diffractive optical element (DOE) disposed in an exit pupil expander;
expanding an exit pupil of the received imaging light along a first coordinate axis in an intermediate DOE disposed in the exit pupil expander;
expanding the exit pupil along a second coordinate axis in an out-coupling DOE disposed in the exit pupil expander;
outputting the virtual images using the out-coupling DOE to an array of tunable liquid crystal (LC) microlenses, the output virtual images having an expanded exit pupil relative to the received light at the in-coupling DOE along the first and second coordinate axes; and
electrically controlling one or more LC microlenses in the array to focus the virtual image on a virtual image plane, a location of the virtual image plane being at a selectively variable distance from the system based on the electrical control.
Dependent claims: 16, 17, 18, 19.
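The final step of the method claim, electrically controlling an LC microlens so the virtual image plane sits at a selectively variable distance, can be sketched with a thin-lens approximation. The linear voltage-to-optical-power model and the `volts_per_diopter` constant are illustrative assumptions only; the patent does not specify a drive model:

```python
# Sketch: choose a virtual image plane distance, compute the optical power
# the tunable microlens must assume, then convert that power to an assumed
# LC drive voltage.

def required_power_diopters(image_distance_m: float) -> float:
    """Optical power (diopters) to place the virtual image plane at the
    given distance, using the thin-lens approximation P = 1 / d."""
    return 1.0 / image_distance_m

def drive_voltage(power_d: float, volts_per_diopter: float = 1.5) -> float:
    """Convert optical power to an LC drive voltage (assumed linear)."""
    return power_d * volts_per_diopter

# Place the virtual image plane at 2 m: 0.5 D of power is required.
p = required_power_diopters(2.0)
v = drive_voltage(p)
```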