Method and system for determining a relation between a first scene and a second scene
First Claim
1. Method for determining a relation between a first scene and a second scene, said method comprising the steps of:
generating at least one sensor image of a first scene with at least one sensor;
accessing information related to at least one second scene by a processing and control module, said second scene encompassing said first scene; and
matching, by the processing and control module, the sensor image with the second scene to map the sensor image onto the second scene, wherein:
the step of accessing information related to the at least one second scene comprises:
accessing a textured 3D map comprising geocoded 3D coordinate data by the processing and control module, wherein the textured 3D map comprises a 3D model having texture information associated to one or more surfaces of the 3D model; and
extracting, from the textured 3D map and by the processing and control module, the texture information comprising the corresponding geocoded 3D coordinate data related to the at least one second scene, said extracting generating 2D extracted texture information; and
the step of matching the sensor image with the second scene to map the sensor image onto the second scene comprises:
matching, by the processing and control module, the 2D extracted texture information corresponding to at least one of the one or more surfaces of the 3D model and comprising the corresponding geocoded 3D coordinate data with 2D image texture information of the sensor image, the 2D image texture information of the sensor image corresponding to one or more surfaces within the sensor image, and
associating, by the processing and control module, the extracted corresponding geocoded 3D coordinate data to a plurality of positions in the sensor image based on the matching of the 2D extracted texture information with the 2D image texture information of the sensor image.
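The claimed matching and associating steps can be illustrated with a minimal NumPy-only sketch (the patent names no algorithm; the helper names, normalized cross-correlation matcher, and toy geocode table below are all illustrative assumptions). A 2D texture patch extracted from the textured 3D map is located within the sensor image, and the patch's geocoded 3D coordinates are then associated with the matched sensor-image positions:

```python
import numpy as np

def match_texture(sensor_img: np.ndarray, texture_patch: np.ndarray):
    """Locate texture_patch in sensor_img by normalized cross-correlation.
    Returns the (row, col) of the best-matching top-left corner."""
    ph, pw = texture_patch.shape
    H, W = sensor_img.shape
    t = (texture_patch - texture_patch.mean()) / (texture_patch.std() + 1e-9)
    best, best_rc = -np.inf, (0, 0)
    for r in range(H - ph + 1):
        for c in range(W - pw + 1):
            w = sensor_img[r:r + ph, c:c + pw]
            wn = (w - w.mean()) / (w.std() + 1e-9)
            score = float((t * wn).mean())   # peaks at 1.0 for an exact copy
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc

def associate_geocodes(match_rc, patch_geocodes):
    """Associate each geocoded 3D coordinate of the patch (keyed by its
    pixel offset within the patch) with an absolute sensor-image position."""
    r0, c0 = match_rc
    return {(r0 + dr, c0 + dc): xyz for (dr, dc), xyz in patch_geocodes.items()}

# Synthetic demo: embed a patch at a known offset and recover it.
rng = np.random.default_rng(0)
img = rng.random((32, 32))                 # sensor image (grayscale)
patch = img[10:18, 5:13].copy()            # "2D extracted texture information"
geo = {(0, 0): (59.33, 18.06, 25.0)}       # patch pixel -> geocoded (lat, lon, h)
rc = match_texture(img, patch)
mapped = associate_geocodes(rc, geo)
print(rc)       # (10, 5)
print(mapped)   # {(10, 5): (59.33, 18.06, 25.0)}
```

The brute-force correlation loop stands in for whatever 2D texture matcher an implementation would actually use; the point is only the data flow: extracted texture in, matched positions out, geocodes attached afterward.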
Abstract
The present invention relates to a system (200) and method for determining a relation between a first scene and a second scene. The method comprises the steps of generating at least one sensor image of a first scene with at least one sensor; accessing information related to at least one second scene, said second scene encompassing said first scene; and matching the sensor image with the second scene to map the sensor image onto the second scene. The step of accessing information related to the at least one second scene comprises accessing a 3D map comprising geocoded 3D coordinate data. The mapping involves associating geocoding information to a plurality of positions in the sensor image based on the coordinate data of the second scene.
33 Claims
1. Method for determining a relation between a first scene and a second scene, said method comprising the steps of:
generating at least one sensor image of a first scene with at least one sensor;
accessing information related to at least one second scene by a processing and control module, said second scene encompassing said first scene; and
matching, by the processing and control module, the sensor image with the second scene to map the sensor image onto the second scene, wherein:
the step of accessing information related to the at least one second scene comprises:
accessing a textured 3D map comprising geocoded 3D coordinate data by the processing and control module, wherein the textured 3D map comprises a 3D model having texture information associated to one or more surfaces of the 3D model; and
extracting, from the textured 3D map and by the processing and control module, the texture information comprising the corresponding geocoded 3D coordinate data related to the at least one second scene, said extracting generating 2D extracted texture information; and
the step of matching the sensor image with the second scene to map the sensor image onto the second scene comprises:
matching, by the processing and control module, the 2D extracted texture information corresponding to at least one of the one or more surfaces of the 3D model and comprising the corresponding geocoded 3D coordinate data with 2D image texture information of the sensor image, the 2D image texture information of the sensor image corresponding to one or more surfaces within the sensor image, and
associating, by the processing and control module, the extracted corresponding geocoded 3D coordinate data to a plurality of positions in the sensor image based on the matching of the 2D extracted texture information with the 2D image texture information of the sensor image.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 27, 28, 29, 30, 32)
12. System for determining a relation between a first scene and a second scene, said system comprising:
at least one sensor configured to capture at least one image of the first scene;
a map comprising geographic information; and
a processing and control module comprising a processor and coupled to a storage medium,
wherein the processing and control module accesses from the map information related to at least one second scene, said second scene encompassing said sensor image, and matches the sensor image with the second scene to map the sensor image onto the second scene, wherein:
the map is a textured 3D map comprising geocoded 3D coordinate data, wherein the textured 3D map comprises a 3D model having texture information associated to one or more surfaces of the 3D model;
the accessing of the map by the processing and control module comprises:
accessing the textured 3D map, and
extracting, from the textured 3D map, the texture information comprising the corresponding geocoded 3D coordinate data related to the at least one second scene, said extracting generating 2D extracted texture information; and
the processing and control module matches the 2D extracted texture information corresponding to at least one of the one or more surfaces of the 3D model and comprising the corresponding geocoded 3D coordinate data with 2D image texture information of the sensor image, the 2D image texture information of the sensor image corresponding to one or more surfaces within the sensor image, and associates the extracted corresponding geocoded 3D coordinate data to a plurality of positions in the sensor image based on the matching of the 2D extracted texture information with the 2D image texture information of the sensor image.
- View Dependent Claims (13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 31, 33)
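The system claim's component split (sensor, textured 3D map, processing and control module) can be sketched as a minimal object model. This is a structural illustration only; all class and attribute names are hypothetical, and an exact-lookup dictionary stands in for real image-texture matching:

```python
from dataclasses import dataclass, field

@dataclass
class TexturedMap:
    """Textured 3D map: per-surface 2D texture plus geocoded 3D coordinates."""
    surfaces: dict = field(default_factory=dict)  # surface_id -> (texture, geocode)

    def extract(self, surface_id):
        # Extraction yields 2D texture information together with the
        # corresponding geocoded 3D coordinate data.
        return self.surfaces[surface_id]

@dataclass
class ProcessingControlModule:
    """Accesses the map, matches textures, and associates geocodes."""
    tmap: TexturedMap

    def map_sensor_image(self, sensor_texture_index, surface_id):
        texture, geocode = self.tmap.extract(surface_id)
        # Match the extracted 2D texture against the sensor image's texture
        # index (exact lookup here, standing in for 2D image matching).
        position = sensor_texture_index.get(texture)
        return {position: geocode} if position is not None else {}

# Toy wiring: one textured surface, one sensor observation of that texture.
m = TexturedMap(surfaces={"roof": ("brick", (59.33, 18.06, 25.0))})
pcm = ProcessingControlModule(tmap=m)
sensor_index = {"brick": (120, 80)}   # texture observed at sensor pixel (120, 80)
result = pcm.map_sensor_image(sensor_index, "roof")
print(result)   # {(120, 80): (59.33, 18.06, 25.0)}
```

The design point the claim makes is that the map stores geocodes per textured surface, while the processing and control module owns both the matching and the final position-to-geocode association.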
26. A computer program product for determining a relation between a first scene and a second scene, the computer program product comprising at least one non-transitory computer-readable storage medium having computer-readable program code portions stored therein, the computer-readable program code portions comprising:
an executable portion configured for receiving first data related to at least one sensor image of a first scene from at least one sensor;
an executable portion configured for receiving information related to at least one second scene, said second scene encompassing said first scene; and
an executable portion configured for matching the sensor image with the second scene to map the sensor image onto the second scene, wherein:
the receiving of information related to the at least one second scene comprises:
accessing a textured 3D map comprising geocoded 3D coordinate data, wherein the textured 3D map comprises a 3D model having texture information associated to one or more surfaces of the 3D model; and
extracting, from the textured 3D map, the texture information comprising the corresponding geocoded 3D coordinate data related to the at least one second scene, said extracting generating 2D extracted texture information; and
the matching of the sensor image with the second scene to map the sensor image onto the second scene comprises:
matching the 2D extracted texture information corresponding to at least one of the one or more surfaces of the 3D model and comprising the corresponding geocoded 3D coordinate data with 2D image texture information of the sensor image, the 2D image texture information of the sensor image corresponding to one or more surfaces within the sensor image, and
associating the extracted corresponding geocoded 3D coordinate data to a plurality of positions in the sensor image based on the matching of the 2D extracted texture information with the 2D image texture information of the sensor image.
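The three "executable portions" of the program-product claim map naturally onto three composable functions. The sketch below is an assumption-laden illustration (the function names and toy dictionary data structures are invented; real portions would wrap sensor I/O and image matching):

```python
def receive_sensor_image(source):
    """Executable portion: receive first data related to the sensor image."""
    return source()

def receive_second_scene(map_accessor, scene_id):
    """Executable portion: access the textured 3D map and extract 2D texture
    information together with its geocoded 3D coordinate data."""
    return map_accessor(scene_id)

def match_and_associate(sensor_image, extracted):
    """Executable portion: match extracted 2D texture with the sensor image's
    texture and associate geocodes to the matched positions."""
    out = {}
    for texture, xyz in extracted:
        pos = sensor_image.get(texture)   # stand-in for 2D texture matching
        if pos is not None:
            out[pos] = xyz
    return out

# Wiring the portions together on toy data:
image = receive_sensor_image(lambda: {"asphalt": (10, 4), "brick": (3, 7)})
extracted = receive_second_scene(lambda sid: [("brick", (59.0, 18.0, 12.5))],
                                 "scene-1")
georef = match_and_associate(image, extracted)
print(georef)   # {(3, 7): (59.0, 18.0, 12.5)}
```

Factoring the pipeline this way mirrors the claim structure: each portion is independently replaceable (e.g. swapping the exact-lookup matcher for a correlation- or feature-based one) without changing the others.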
Specification