Inertial measurement with an imaging sensor and a digitized map
Abstract
The present invention relates to a system and method for determining vehicle attitude and position from image data detected by sensors in a vehicle. The invention uses calculated differences between the locations of selected features in an image plane and the location of corresponding features in a terrain map to determine the attitude of the vehicle carrying the sensors with respect to a ground frame of reference.
20 Claims
-
1. A method of electro-optical absolute attitude determination of an object comprising:
capturing a frame of image data of a first scene with a detector or sensor that measures angles of detected features within the captured frame of image data with respect to the detector's boresight reference relative to the object;

identifying at least three features from the captured frame of image data of said first scene by selecting features within the pixel space of the captured frame of image data; and

computationally correlating said identified features to features in a map to determine corresponding map locations of said identified features and a location of the object within the map;

calculating, with a processor, the absolute attitude based on the difference between the locations of the at least three features in the captured frame of image data and the locations of the correlated features in the map by transforming the correlated features from the map into a first set of object space angular coordinates based on the location of the object within the map and the measured angles of the correlated features to generate horizontal, vertical, and arc coordinate values.

View Dependent Claims (2, 3, 4, 5, 6, 7)
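The core computation of claim 1 — recovering absolute attitude from the difference between at least three feature directions measured in the sensor frame and the directions of the same features computed from the map and the object's location — is, in textbook form, an instance of Wahba's problem. The sketch below is illustrative only and is not the claimed method: the function names `unit_los` and `attitude_from_features` are hypothetical, azimuth/elevation pairs stand in for the claim's horizontal, vertical, and arc coordinate values, and the SVD solution shown is one standard technique among several.

```python
import numpy as np

def unit_los(az, el):
    """Unit line-of-sight vector from azimuth/elevation angles (radians)."""
    return np.array([np.cos(el) * np.cos(az),
                     np.cos(el) * np.sin(az),
                     np.sin(el)])

def attitude_from_features(sensor_angles, map_angles):
    """Estimate the rotation (absolute attitude) that maps map-frame
    line-of-sight directions onto sensor-frame directions, given at
    least three non-coplanar feature correspondences, via the SVD
    solution to Wahba's problem."""
    # Attitude profile matrix: sum of outer products of paired unit vectors.
    B = sum(np.outer(unit_los(*s), unit_los(*m))
            for s, m in zip(sensor_angles, map_angles))
    U, _, Vt = np.linalg.svd(B)
    # Force a proper rotation (determinant +1) rather than a reflection.
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    return U @ np.diag([1.0, 1.0, d]) @ Vt
```

With exact correspondences the recovered matrix equals the true attitude; with noisy measurements the same SVD step returns the least-squares optimal rotation.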
6. The method of claim 1, the method further comprising:
capturing a second frame of image data of the first scene with a second detector that measures angles of detected features within the second frame of image data with respect to the second detector's boresight reference relative to the object; and

identifying said selected features in the captured second frame of image data of said first scene;

where transforming the selected features into a set of object space angular coordinates further includes transforming the selected features into said set of object space angular coordinates based on the measured angles from the second detector, and a known line-of-sight (LOS) boresight angle between the first and second detectors.
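Claim 6 folds measurements from a second detector into the same transformation via a known LOS boresight angle between the detectors. One conventional way to use such a calibration — not necessarily the claimed one — is to rotate each second-detector measurement into the first detector's frame before the object-space transformation. In the sketch below, the boresight relationship is assumed to be available as a rotation matrix `R_boresight` (second-detector frame to first-detector frame); the function name and az/el parameterization are illustrative.

```python
import numpy as np

def los_in_primary_frame(az2, el2, R_boresight):
    """Re-express a feature's unit line-of-sight, measured by the second
    detector as azimuth/elevation (radians), in the first detector's
    frame, using the known inter-detector boresight rotation."""
    v2 = np.array([np.cos(el2) * np.cos(az2),
                   np.cos(el2) * np.sin(az2),
                   np.sin(el2)])
    v1 = R_boresight @ v2          # rotate into the first detector's frame
    az1 = np.arctan2(v1[1], v1[0])
    el1 = np.arcsin(np.clip(v1[2], -1.0, 1.0))
    return az1, el1
```

Once both detectors' measurements are expressed in a common frame, they can feed the same feature-correlation and attitude-calculation steps as a single-detector measurement set.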
7. The method of claim 6, where the first and second detectors have different detection wavelength ranges.
8. A device for electro-optically determining absolute attitude of an object, the device comprising:
an imaging sensor that captures a frame of image data of a first scene and that measures the angles of detected features within the captured frame of image data with respect to the sensor's boresight reference relative to the object;

a feature identification unit that identifies at least three features in the captured frame of image data, the feature identification unit comprising a feature selection unit that selects features within the pixel space of the captured frame of image data; and a feature correlation unit that correlates said identified features to a map;

a feature location unit that determines corresponding map locations of the correlated features and a location of the object within the map; and

an attitude calculator that calculates the absolute attitude based on the difference in the locations of at least three features in the frame of image data and the locations of the correlated features in the map, the attitude calculator comprising a space transformation module that transforms the correlated features from the map into a first set of object space angular coordinates based on the location of the object within the map and the measured angles of the correlated features to generate horizontal, vertical, and arc coordinate values.

View Dependent Claims (9, 10, 11, 12, 13)
14. A non-transitory computer-readable medium having embodied thereon instructions which, when executed by a computing device, cause the device to perform a method of electro-optical absolute attitude determination of an object comprising:
capturing a frame of image data of a first scene with a detector or sensor that measures angles of detected features within the captured frame of image data with respect to the detector's boresight reference relative to the object;

identifying at least three features from the captured frame of image data of said first scene by selecting features within the pixel space of the captured frame of image data; and

computationally correlating said identified features to features in a map to determine corresponding map locations of said identified features and a location of the object within the map;

calculating, with a processor, the absolute attitude based on the difference between the locations of the at least three features in the captured frame of image data and the locations of the correlated features in the map by transforming the correlated features from the map into a first set of object space angular coordinates based on the location of the object within the map and the measured angles of the correlated features to generate horizontal, vertical, and arc coordinate values.

View Dependent Claims (15, 16, 17, 18, 19, 20)
19. The medium of claim 14, the method further comprising:
capturing a second frame of image data of the first scene with a second detector that measures angles of detected features within the second frame of image data with respect to the second detector's boresight reference relative to the object; and

identifying said selected features in the captured second frame of image data of said first scene;

where transforming the selected features into a set of object space angular coordinates further includes transforming the selected features into said set of object space angular coordinates based on the measured angles from the second detector, and a known line-of-sight (LOS) boresight angle between the first and second detectors.
20. The medium of claim 19, where the first and second detectors have different detection wavelength ranges.
Specification