Auto-stereoscopic augmented reality display
First Claim
1. An imaging structure implemented in a display device, the imaging structure comprising:
a waveguide configured for see-through viewing of an environment, the waveguide further configured to transmit light of a virtual image that is generated as a near-display object to appear at a distance in the environment when the environment is viewed through the waveguide;
one or more sensors configured to provide reference data related to at least a position and an orientation of the imaging structure in the environment with respect to a real object in the environment; and
switchable diffractive elements integrated in the waveguide and configured in display zones of the display device, the display zones including vector adjustments, based in part on the reference data, to account for the position and the orientation of the imaging structure and enable the virtual image that appears at the distance in the environment to be generated with an accurate viewing angle relative to a viewing angle of the real object in the environment, the switchable diffractive elements switchable to independently activate the display zones to correct for an accurate stereopsis view of the virtual image that appears at the distance in the environment, wherein:
one or more first display zones can be activated to provide a representation of the virtual image for a right eye of a user based on tracked pupil positions of the user,
one or more second display zones can be activated to provide a different representation of the virtual image for a left eye of the user based on the tracked pupil positions of the user, and
the one or more first display zones and the one or more second display zones are determined by calculating a ray-trace bisector for each of one or more tiles of the display device relative to a current bisector eye position.
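The zone-selection rule in the claim above, calculating a ray-trace bisector per tile relative to the current bisector eye position, can be illustrated with a short sketch. Everything here is hypothetical (the claim fixes neither coordinates, nor data structures, nor a sign convention): take the midpoint of the tracked pupils as the bisector eye position, trace a ray from each display tile to that midpoint, and use which pupil the ray leans toward along the interocular axis to pick the eye served by that tile's zone.

```python
import numpy as np

def assign_zones(tile_centers, left_pupil, right_pupil):
    """For each display tile, decide whether its switchable zone should
    present the left-eye or the right-eye view (illustrative rule only)."""
    # Midpoint of the tracked pupils: the "current bisector eye position".
    bisector_eye = (left_pupil + right_pupil) / 2.0
    axis = right_pupil - left_pupil  # interocular axis
    zones = []
    for center in tile_centers:
        # Ray traced from the tile center toward the bisector eye position.
        ray = bisector_eye - center
        # A ray leaning toward the right pupil (negative projection onto the
        # interocular axis) marks the tile as a right-eye zone.
        zones.append("right" if np.dot(ray, axis) < 0 else "left")
    return zones
```

With pupils at x = -3.25 and x = +3.25 (a typical ~65 mm interocular distance) 60 units in front of the display, tiles on the negative-x half are assigned the left-eye view and tiles on the positive-x half the right-eye view.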
Abstract
In embodiments of an auto-stereoscopic augmented reality display, the display device is implemented with an imaging structure that includes a waveguide for see-through viewing of an environment. The waveguide also transmits light of a virtual image that is generated as a near-display object to appear at a distance in the environment. The imaging structure includes switchable diffractive elements that are integrated in the waveguide and configured in display zones. The switchable diffractive elements are switchable to independently activate the display zones effective to correct for an accurate stereopsis view of the virtual image that appears at the distance in the environment.
19 Claims
1. An imaging structure implemented in a display device, the imaging structure comprising:
a waveguide configured for see-through viewing of an environment, the waveguide further configured to transmit light of a virtual image that is generated as a near-display object to appear at a distance in the environment when the environment is viewed through the waveguide;
one or more sensors configured to provide reference data related to at least a position and an orientation of the imaging structure in the environment with respect to a real object in the environment; and
switchable diffractive elements integrated in the waveguide and configured in display zones of the display device, the display zones including vector adjustments, based in part on the reference data, to account for the position and the orientation of the imaging structure and enable the virtual image that appears at the distance in the environment to be generated with an accurate viewing angle relative to a viewing angle of the real object in the environment, the switchable diffractive elements switchable to independently activate the display zones to correct for an accurate stereopsis view of the virtual image that appears at the distance in the environment, wherein:
one or more first display zones can be activated to provide a representation of the virtual image for a right eye of a user based on tracked pupil positions of the user,
one or more second display zones can be activated to provide a different representation of the virtual image for a left eye of the user based on the tracked pupil positions of the user, and
the one or more first display zones and the one or more second display zones are determined by calculating a ray-trace bisector for each of one or more tiles of the display device relative to a current bisector eye position.
- View Dependent Claims (2, 3, 4, 5, 6)
7. A computing device, comprising:
a see-through display device configured as an auto-stereoscopic augmented reality display to display a virtual image as a near-display object that appears at a distance in an environment that is viewable through the see-through display device;
one or more sensors configured to provide reference data related to at least a position and an orientation of the see-through display device in the environment with respect to a real object in the environment; and
a processing system to implement an imaging controller that is configured to control activation of switchable diffractive elements configured in display zones of the see-through display device, the display zones of the see-through display device including vector adjustments, based in part on the reference data, to account for the position and the orientation of the see-through display device and enable the virtual image that appears at the distance in the environment to be generated with an accurate viewing angle relative to a viewing angle of the real object in the environment, and the display zones independently controllable to correct for an accurate stereopsis view of the virtual image that appears at the distance in the environment,
the see-through display device configured to activate one or more first display zones to display a representation of the virtual image for a right eye of a user based on tracked pupil positions of the user, and activate one or more second display zones to display a different representation of the virtual image for a left eye of the user based on the tracked pupil positions of the user,
wherein the one or more first display zones and the one or more second display zones are determined by calculating a ray-trace bisector for each of one or more tiles of the see-through display device relative to a current bisector eye position.
- View Dependent Claims (8, 9, 10, 11, 12, 13, 14)
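The "vector adjustments" element of claim 7, using sensor pose data so the virtual image holds an accurate viewing angle relative to the real object, can be sketched as a small geometric computation. The frame conventions, the 3x3 rotation matrix, and every name below are assumptions made for illustration, not the patent's implementation:

```python
import numpy as np

def zone_vector_adjustment(device_pos, device_rot, object_pos, virtual_offset):
    """Unit direction, in the display frame, along which display zones would
    steer the virtual image so it appears at `virtual_offset` from the real object.

    device_pos     -- device position in the world frame (from the sensors)
    device_rot     -- 3x3 world-to-display rotation (device orientation)
    object_pos     -- tracked real-object position in the world frame
    virtual_offset -- desired placement of the virtual image relative to the object
    """
    target = object_pos + virtual_offset      # anchor the virtual image to the real object
    ray = device_rot @ (target - device_pos)  # world-frame ray, rotated into the display frame
    return ray / np.linalg.norm(ray)          # unit steering vector for the zones
```

As the sensors report a new position or orientation, the rotation and translation change and the steering vector follows, which is how the virtual image can keep an accurate viewing angle relative to the real object as the device moves.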
15. A method, comprising:
generating a virtual image for display on a see-through display device;
displaying the virtual image as a near-display object that appears at a distance in an environment that is viewable through the see-through display device;
controlling activation of switchable diffractive elements configured in display zones of the see-through display device, the display zones independently controllable to correct for an accurate stereopsis view of the virtual image that appears at the distance in the environment, the controlling activation further comprising:
tracking pupil positions of left and right eyes of a user; and
controlling at least one of the display zones to be switched on to provide a representation of the virtual image for the right eye of the user based on the pupil positions, and controlling at least one other of the display zones to be switched on to provide a different representation of the virtual image for the left eye of the user based on the pupil positions,
wherein the at least one of the display zones and the at least one other of the display zones are determined by calculating a ray-trace bisector for each of one or more tiles of the see-through display device relative to a current bisector eye position.
- View Dependent Claims (16, 17, 18, 19)
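The method steps of claim 15, tracking both pupils and switching complementary zone sets on per eye, amount to a small per-frame control loop. The classes and the one-dimensional side-of-midpoint rule below are stand-ins for illustration (in the claim the zone split comes from the per-tile ray-trace bisector calculation), not an API from the patent:

```python
class Zone:
    """One switchable diffractive display zone."""
    def __init__(self, center):
        self.center = center  # zone position along the display (arbitrary units)
        self.shown = None     # which image representation the zone is driving
    def switch_on(self, image):
        self.shown = image

class Display:
    def __init__(self, centers):
        self.zones = [Zone(c) for c in centers]

def render_stereo_frame(display, pupils, left_image, right_image, assign):
    """Track both pupils, then switch each zone on with the representation
    for whichever eye the `assign` rule maps it to."""
    left_pupil, right_pupil = pupils  # tracked pupil positions for this frame
    for zone in display.zones:
        eye = assign(zone.center, left_pupil, right_pupil)
        zone.switch_on(right_image if eye == "right" else left_image)

def side_of_midpoint(center, left_pupil, right_pupil):
    """Toy 1-D stand-in for the per-tile ray-trace bisector test."""
    return "right" if center > (left_pupil + right_pupil) / 2 else "left"
```

Running this loop once per tracked frame keeps the left-eye and right-eye zone sets aligned with the user's current pupil positions, which is what lets the display correct the stereopsis view as the user moves.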
Specification