Visualization of 3-D GPR data in augmented reality
First Claim
1. A method, comprising:
displaying an augmented reality view on a display screen of an electronic device;
creating a virtual excavation in a surface shown in the augmented reality view, the virtual excavation having a plurality of boundary surfaces that are defined by positions in three-dimensional (3-D) space of a 3-D model;
calculating, by the electronic device, an intersection of positions of the plurality of boundary surfaces of the virtual excavation in the 3-D space of the 3-D model and 3-D ground penetrating radar (GPR) data that has been indexed in the 3-D space of the 3-D model;
extracting, by the electronic device, data items of the 3-D GPR data that intersect the plurality of boundary surfaces of the virtual excavation to create a plurality of data sets, each data set corresponding to a respective one of the boundary surfaces of the virtual excavation; and
for each data set, projecting a two-dimensional (2-D) image based on the extracted 3-D GPR data items of the data set onto the corresponding boundary surface of the virtual excavation in the augmented reality view, the 2-D image including colors or shading to show subsurface features determined from the extracted 3-D GPR data items.
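The extraction step above can be sketched in code. This is a minimal illustration, assuming the 3-D GPR data has been resampled onto a regular voxel grid aligned with the 3-D model and that the virtual excavation is an axis-aligned box (the claim only requires the data to be indexed in the model's 3-D space; the function name and face labels are hypothetical):

```python
import numpy as np

def extract_face_slices(volume, box_min, box_max):
    """Extract one 2-D data set per boundary surface of an axis-aligned
    virtual excavation box cut into a voxel-indexed GPR volume.

    volume  -- 3-D numpy array of GPR amplitudes, indexed (x, y, z)
    box_min -- (x0, y0, z0) voxel indices of the corner nearest the origin
    box_max -- (x1, y1, z1) voxel indices of the opposite corner (exclusive)
    """
    x0, y0, z0 = box_min
    x1, y1, z1 = box_max
    return {
        # four side walls of the excavation
        "west":  volume[x0, y0:y1, z0:z1],
        "east":  volume[x1 - 1, y0:y1, z0:z1],
        "south": volume[x0:x1, y0, z0:z1],
        "north": volume[x0:x1, y1 - 1, z0:z1],
        # bottom of the excavation
        "bottom": volume[x0:x1, y0:y1, z1 - 1],
    }
```

Each returned 2-D array is one of the claimed "data sets," ready to be rendered as an image on its corresponding boundary surface.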
Abstract
In one embodiment, an augmented reality application generates an augmented reality view that displays three-dimensional (3-D) ground penetrating radar (GPR) data on boundary surfaces of a virtual excavation. The augmented reality application calculates an intersection of the one or more boundary surfaces of the virtual excavation and the 3-D GPR data, and extracts data items of the 3-D GPR data that intersect the one or more boundary surfaces of the virtual excavation. The augmented reality application then projects two-dimensional (2-D) images based on the extracted data items onto the one or more boundary surfaces of the virtual excavation to show subsurface features in the augmented reality view, which a user can manipulate (e.g., move, rotate, scale, change the depth of, etc.).
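The projected 2-D images use colors or shading to reveal subsurface features. A minimal grayscale mapping over an extracted data set might look like the following sketch, assuming scalar amplitude values (the normalization scheme and function name are illustrative, not taken from the patent):

```python
import numpy as np

def amplitudes_to_image(data_set):
    """Normalize a 2-D array of extracted GPR amplitudes to 0-255 grayscale,
    so strong reflectors (possible subsurface features) appear bright."""
    a = np.abs(np.asarray(data_set, dtype=float))
    lo, hi = a.min(), a.max()
    if hi > lo:
        a = (a - lo) / (hi - lo)   # rescale to [0, 1]
    else:
        a = np.zeros_like(a)       # flat data: render as uniform black
    return (a * 255).astype(np.uint8)
```

The resulting 8-bit image could then be applied as a texture to the corresponding boundary surface in the augmented reality view.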
20 Claims
1. A method, comprising:
displaying an augmented reality view on a display screen of an electronic device;
creating a virtual excavation in a surface shown in the augmented reality view, the virtual excavation having a plurality of boundary surfaces that are defined by positions in three-dimensional (3-D) space of a 3-D model;
calculating, by the electronic device, an intersection of positions of the plurality of boundary surfaces of the virtual excavation in the 3-D space of the 3-D model and 3-D ground penetrating radar (GPR) data that has been indexed in the 3-D space of the 3-D model;
extracting, by the electronic device, data items of the 3-D GPR data that intersect the plurality of boundary surfaces of the virtual excavation to create a plurality of data sets, each data set corresponding to a respective one of the boundary surfaces of the virtual excavation; and
for each data set, projecting a two-dimensional (2-D) image based on the extracted 3-D GPR data items of the data set onto the corresponding boundary surface of the virtual excavation in the augmented reality view, the 2-D image including colors or shading to show subsurface features determined from the extracted 3-D GPR data items.

View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 19, 20)
11. An apparatus, comprising:
a display screen;
a processor; and
a memory coupled to the processor and configured to store instructions for an augmented reality application that are executable on the processor, the instructions for the augmented reality application, when executed, operable to:
display an augmented reality view on the display screen,
create a virtual excavation in a surface shown in the augmented reality view, the virtual excavation having a plurality of boundary surfaces that are defined by positions in three-dimensional (3-D) space of a 3-D model,
calculate an intersection of positions of the plurality of boundary surfaces of the virtual excavation in the 3-D space of the 3-D model and three-dimensional (3-D) ground penetrating radar (GPR) data that has been indexed in the 3-D space of the 3-D model,
extract data items of the 3-D GPR data that intersect the plurality of boundary surfaces of the virtual excavation to create a plurality of data sets, each data set corresponding to a respective one of the boundary surfaces of the virtual excavation, and
for each data set, project a two-dimensional (2-D) image based on the extracted 3-D GPR data items of the data set onto the corresponding boundary surface of the virtual excavation in the augmented reality view, the 2-D image including colors or shading to show subsurface features from the extracted 3-D GPR data items.

View Dependent Claims (12, 13)
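The intersection calculation recited in the claims can also be read against scattered GPR data items that carry explicit 3-D positions, rather than a voxel grid. A hedged sketch of such a test, assuming a planar boundary surface and a small distance tolerance (the tolerance value and function name are assumptions for illustration):

```python
import numpy as np

def items_on_plane(points, plane_point, plane_normal, tol=0.05):
    """Return indices of GPR data items whose 3-D positions lie within
    `tol` (model-space units) of a planar boundary surface.

    points       -- (N, 3) array of data-item positions in model space
    plane_point  -- any point on the boundary plane
    plane_normal -- normal vector of the boundary plane
    """
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)                      # unit normal
    offsets = np.asarray(points, dtype=float) - plane_point
    dist = np.abs(offsets @ n)                  # perpendicular distance
    return np.nonzero(dist <= tol)[0]
```

Running this once per boundary surface yields the per-surface data sets the claims describe, which are then rendered as 2-D images.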
14. A non-transitory computer-readable medium that includes instructions executable on a processor, the instructions, when executed, operable to:
display an augmented reality view;
create a virtual excavation in a surface shown in the augmented reality view, the virtual excavation having a plurality of boundary surfaces that are defined by positions in three-dimensional (3-D) space of a 3-D model;
calculate an intersection of positions of the plurality of boundary surfaces of the virtual excavation in the 3-D space of the 3-D model and penetrating radar data descriptive of subsurface features that has been indexed in the 3-D space of the 3-D model;
extract data items of the penetrating radar data descriptive of subsurface features that intersect the plurality of boundary surfaces of the virtual excavation to create data sets, each data set corresponding to a respective one of the boundary surfaces of the virtual excavation; and
for each data set, project an image based on the extracted penetrating radar data items of the data set onto the corresponding boundary surface of the virtual excavation in the augmented reality view, the image including colors or shading to show one or more subsurface features determined from the extracted penetrating radar data items.

View Dependent Claims (15, 16, 17, 18)
Specification