Augmented reality visualization system
First Claim
1. An augmented reality (AR) system comprising:
a head mounted display (HMD) comprising a visual display system and a head mounting structure adapted to fix the HMD with respect to a human operator's field of view (FOV), and a camera system oriented towards said FOV configured to capture a plurality of video images of entities, elements or objects within the FOV;
a plurality of sensors comprising a head mounted position sensor and a first position sensor receiver positioned on a first device, said plurality of sensors comprising one or more inertial navigation units (INU) or inertial measurement units (IMU), said INU comprising a plurality of accelerometers, said IMU comprising an INU and a global positioning system (GPS) configured to determine a location of the GPS system, wherein said INU and IMU are configured to determine an orientation of at least the first device, and each said sensor is configured to output said location and orientation as orientation data;
a laser rangefinder mounted to said first device and configured to measure a distance between a projectile launcher and an impact point, or between a moving object and an intended flight path;
a control system comprising a storage medium comprising a plurality of machine readable instructions operable to control elements of said AR system and a processor adapted to execute said plurality of machine readable instructions, wherein said control system is configured to control said AR system, said sensors, and said HMD;
a communication system configured to couple said HMD, said sensors, and said control system, wherein said communication system is further configured to transmit electronic data between said HMD and a remote source using a radio frequency system or a laser data transfer system;
wherein said plurality of machine readable instructions comprises:
a first plurality of instructions configured to operate the laser rangefinder, a ballistic trajectory calculation system, and an impact point determination system based on at least a first, second, and third data associated with at least one said first device, wherein said first data comprises an aim point of said first device, said second data comprises a computed impact point of a projectile fired from said first device, and said third data comprises a path of travel of said projectile from a point of origin at said first device to said impact point, wherein said first, second and third data each comprise one or more geo-reference system position coordinates;
a second plurality of instructions configured to control said processor to operate said plurality of sensors to generate data used by said control system to determine three dimensional orientations of each of said sensors based on determinations of location of the devices relative to at least the head mounted position sensor (HMPS) and said device mounted position sensor;
a third plurality of instructions configured to operate a video processing system to determine and create wire frame model data of objects within the FOV based on the plurality of video images captured by the camera system, using a photogrammetry processing system;
a fourth plurality of instructions configured to control said processor to operate said HMD to generate and display on said HMD a plurality of AR visualizations comprising a first visualization within said FOV of said operator comprising at least a first, second, third, and fourth visualization element, wherein said first and second visualization elements are generated based on outputs of said first, second, and third pluralities of instructions, said first visualization element comprises a visual designator showing said aim point on a surface defined by said wire frame model data, said impact point on said wire frame model data, or said path of travel with respect to at least a portion of said wire frame model data, said second visualization element comprises a visual damage overlay displaying a predicted blast radius associated with a projectile fired from said first device, said third visualization element comprises a visual overlay highlighting or designating a plurality of entities, elements or objects, and said fourth visualization element comprises a plurality of metadata associated with said plurality of entities, elements or objects highlighted or designated by said third visualization element; and
a fifth plurality of instructions configured to provide a graphical user interface (GUI) comprising control interfaces that enable said operator to select and control one or more of said plurality of visualization elements, wherein a cursor control software module generates and displays a cursor on said HMD, and wherein said operator is able to manipulate said cursor to interact with said plurality of visualization elements.
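The first visualization element above places an aim point designator on a surface defined by the wire frame model data. The core geometric step behind such a designator is a ray-surface intersection. The following is a minimal sketch only, assuming planar wire-frame facets; the function name `aim_point_on_surface` and its signature are illustrative, not from the patent:

```python
def aim_point_on_surface(origin, direction, plane_point, plane_normal):
    """Intersect a device's aim ray with a planar facet of a wire-frame
    model; returns the hit point, or None if the ray is parallel to the
    facet or the facet lies behind the ray origin."""
    dot = lambda u, v: u[0]*v[0] + u[1]*v[1] + u[2]*v[2]
    denom = dot(plane_normal, direction)
    if abs(denom) < 1e-9:                       # ray parallel to the facet
        return None
    diff = tuple(p - o for p, o in zip(plane_point, origin))
    t = dot(plane_normal, diff) / denom
    if t < 0:                                   # facet behind the muzzle
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

# Example: device at the origin aiming along +x at a wall facet whose
# plane passes through (10, 0, 0) with inward normal (-1, 0, 0).
hit = aim_point_on_surface((0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
                           (10.0, 0.0, 0.0), (-1.0, 0.0, 0.0))
```

The hit point is independent of the aim direction's magnitude, so the direction vector need not be normalized; a full system would intersect against each facet and keep the nearest positive hit.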
Abstract
An augmented reality (AR) system comprising a head mounted display (HMD) configured to display one or more AR visualizations within an operator's field of view (FOV), a control system including a processor and a storage system configured to store machine readable instructions, sensors configured to determine at least location and/or orientation of said sensors, including a head mounted and a device mounted sensor, and a communication system configured to communicate data between elements of the AR system. The machine readable instructions include various subroutines, including orientation/location instructions for determining orientation and/or position of the sensors, and visualization generation instructions configured to generate a visualization showing an aim point of a device coupled to said device mounted sensor, a path of travel of a projectile launched from said device, or an impact point of said projectile. Embodiments can include one or more photogrammetry processing sections.
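The impact point the abstract describes depends on a ballistic solution. As an illustration only, the sketch below uses a drag-free point-mass model over flat ground; a real fire-control solver would also incorporate the drag, wind, and weather inputs the claims recite, and all names here are hypothetical:

```python
import math

G = 9.80665  # standard gravity, m/s^2

def flat_ground_impact(muzzle_speed, elevation_deg, launch_height=0.0):
    """Time of flight and downrange impact distance for a drag-free
    projectile launched over flat ground (deliberately simplified)."""
    theta = math.radians(elevation_deg)
    vx = muzzle_speed * math.cos(theta)
    vy = muzzle_speed * math.sin(theta)
    # Solve launch_height + vy*t - 0.5*G*t^2 = 0 for the positive root.
    t = (vy + math.sqrt(vy * vy + 2.0 * G * launch_height)) / G
    return t, vx * t

# Example: 100 m/s muzzle speed at 45 degrees from ground level.
t, x = flat_ground_impact(100.0, 45.0)
```

At 45 degrees from ground level this reduces to the textbook range v²·sin(2θ)/g, which serves as a quick sanity check on the solver.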
14 Claims
1. An augmented reality (AR) system comprising the elements recited in the First Claim above. Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9.
10. A process for operating an augmented reality control system comprising:
operating a laser rangefinder and ballistic trajectory calculation system comprising computing a ballistic trajectory and an impact point of a projectile or moving object based on a first, second, third, and fourth ballistic solution data, wherein said first ballistic solution data comprises a laser rangefinder output, said second ballistic solution data comprises weather data associated with an environment a projectile launcher or moving object is operating within, said third ballistic solution data comprises weapon angle data obtained from a position sensor mounted on the projectile launcher or moving object, and said fourth ballistic solution data comprises ballistic or flight performance characteristics of said projectiles or moving objects;
operating a position tracker system comprising measuring a relative position between a projectile launcher's launch axis or a moving object's flight orientation with respect to a three axis frame of reference and an HMD camera's or a set of HMD cameras' field of view (FOV) orientation with respect to the three axis frame of reference, based on a first and second plurality of outputs respectively from a plurality of position tracking nodes comprising a first and second position sensor respectively mounted on said HMD and said projectile launcher or moving object;
executing a photogrammetry process comprising using the HMD camera or set of HMD cameras mounted on the HMD to capture a plurality of images and create a wire frame model data structure, computing offset measurements of a plurality of objects within said plurality of images by using geometric triangulation computations to determine distances between said plurality of objects, and calculating range by determining an angle offset between said plurality of objects from at least two of said plurality of images, wherein the range and distance data are saved into a wire frame model object entities data structure;
generating visual overlays for display over an HMD FOV comprising receiving inputs from said laser rangefinder and ballistic trajectory calculation process, said position tracker process, and said photogrammetry process, computing parallax for a line of sight within said HMD FOV versus the point of impact, calculating the impact within the wire frame model data structure, generating shaped overlays including occlusions based on the wire frame model data structure, and outputting said generated video overlays to said HMD positioned and aligned with said HMD FOV; and
projecting the video overlays on said HMD.
Dependent claims: 11, 12, 13, 14.
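The photogrammetry step in claim 10 calculates range from the angle offset of an object between two images. A minimal triangulation sketch under assumed ideal bearings follows; the function name `range_from_parallax` and the bearing convention (angles measured from the camera baseline) are assumptions for illustration, not from the patent:

```python
import math

def range_from_parallax(baseline_m, angle_left_deg, angle_right_deg):
    """Estimate the perpendicular range to a feature seen by two cameras
    separated by a known baseline, given each camera's bearing to the
    feature measured from the baseline.  The two camera centers and the
    feature form a triangle solved with the law of sines."""
    a = math.radians(angle_left_deg)
    b = math.radians(angle_right_deg)
    apex = math.pi - a - b               # angle at the feature (parallax)
    if apex <= 0.0:
        raise ValueError("bearings do not converge")
    # Distance from the left camera to the feature (law of sines),
    # then its perpendicular range from the baseline.
    side_left = baseline_m * math.sin(b) / math.sin(apex)
    return side_left * math.sin(a)

# Symmetric example: 1 m baseline, both bearings at 80 degrees.
r = range_from_parallax(1.0, 80.0, 80.0)
```

In the symmetric case the result matches (baseline/2)·tan(bearing); the smaller the parallax angle at the feature, the more sensitive the range estimate is to bearing error, which is why the claimed process uses at least two well-separated images.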