Method for registration of range images from multiple LiDARs
First Claim
1. A method for fusing sensor signals from at least two LiDAR sensors having overlapping fields of view so as to track objects detected by the sensors, said method comprising:
- defining a transformation value for at least one of the LiDAR sensors that identifies an orientation angle and position of the sensor;
- providing target scan points from the objects detected by the sensors, where the target scan points for each sensor provide a separate target point map;
- projecting the target point map from the at least one sensor to another one of the LiDAR sensors using a current transformation value to overlap the target scan points from the sensors;
- determining a plurality of weighting values using the current transformation value, where each weighting value identifies a positional change of one of the scan points for the at least one sensor to the location of a scan point for the other sensor, including using the equation;
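The claim's weighting equation is not reproduced in this text, so it cannot be restated here. As an illustration only, the sketch below shows the two steps the claim describes: projecting one sensor's scan points through a rigid transformation (orientation angle plus position offset), then weighting each projected point against the other sensor's target point map. A Gaussian kernel on pairwise distance is assumed for the weights, a common choice in point-set registration; all function names are hypothetical.

```python
import numpy as np

def project_points(points, theta, t):
    """Apply a transformation value to one sensor's scan points.

    points: (N, 2) array of planar scan points from one LiDAR.
    theta:  orientation angle of the sensor (radians).
    t:      (2,) position offset of the sensor.
    """
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return points @ R.T + t

def soft_weights(projected, target, sigma=0.5):
    """Weight each projected point against each target-map point.

    A Gaussian kernel on squared distance is an assumption here;
    the patent's own weighting equation is not reproduced in this
    text. Rows are normalized so each projected point's weights
    sum to 1, giving soft correspondences between the two maps.
    """
    d2 = ((projected[:, None, :] - target[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)
```

With a good current transformation, each projected point's largest weight falls on its nearest target-map point, which is what lets the overlap between the two sensors' maps be scored and refined.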
Abstract
A system and method for registering range images from objects detected by multiple LiDAR sensors on a vehicle. The method includes aligning frames of data from at least two LiDAR sensors having overlapping fields of view in a sensor-signal fusion operation so as to track objects detected by the sensors. The method defines a transformation value for at least one of the LiDAR sensors that identifies an orientation angle and position of the sensor, and provides target scan points from the objects detected by the sensors, where the target scan points for each sensor provide a separate target point map. The method projects the target point map from the at least one sensor to another one of the LiDAR sensors using a current transformation value to overlap the target scan points from the sensors.
18 Claims
1. A method for fusing sensor signals from at least two LiDAR sensors having overlapping fields of view so as to track objects detected by the sensors, said method comprising:
- defining a transformation value for at least one of the LiDAR sensors that identifies an orientation angle and position of the sensor;
- providing target scan points from the objects detected by the sensors, where the target scan points for each sensor provide a separate target point map;
- projecting the target point map from the at least one sensor to another one of the LiDAR sensors using a current transformation value to overlap the target scan points from the sensors;
- determining a plurality of weighting values using the current transformation value, where each weighting value identifies a positional change of one of the scan points for the at least one sensor to the location of a scan point for the other sensor, including using the equation;
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
13. A method for fusing sensor signals from at least two LiDAR sensors having overlapping fields of view so as to track objects detected by the sensors, said LiDAR sensors being on a vehicle, said method comprising:
- defining a current transformation value for at least one of the LiDAR sensors that identifies an orientation angle and position of the sensor at one sample time;
- providing target scan points from the objects detected by the sensors, where the target scan points for each sensor provide a separate target point map;
- projecting the target point map from the at least one sensor to another one of the LiDAR sensors using the current transformation value to overlap the target scan points from the sensors;
- determining a plurality of weighting values using the current transformation value, where each weighting value identifies a positional change of one of the scan points for the at least one sensor to the location of a scan point for the other sensor, including using the equation;
Dependent claims: 14, 15, 16, 17, 18
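Claim 13 ties the current transformation value to a sample time, which suggests the transformation is re-estimated as new frames arrive. The patent's own update rule is not given in this text; a standard way to re-estimate a rigid transform from soft correspondence weights is a weighted Procrustes (Kabsch/ICP-style) step, sketched below under that assumption. The function name and the choice of update are illustrative, not the claimed method.

```python
import numpy as np

def update_transform(src, dst, w):
    """One weighted rigid-alignment update (2-D Kabsch step).

    src: (N, 2) scan points from the sensor being registered.
    dst: (M, 2) target point map from the other sensor.
    w:   (N, M) correspondence weights under the current transform.

    Returns the rotation R and translation t that best map src
    onto its weighted correspondences in dst. This closed-form SVD
    step is a generic ICP-style update, used as a stand-in for the
    patent's unspecified update rule.
    """
    # Weighted "virtual" correspondence for each source point.
    corr = w @ dst / w.sum(axis=1, keepdims=True)
    mu_s = src.mean(axis=0)
    mu_c = corr.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - mu_s).T @ (corr - mu_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_c - R @ mu_s
    return R, t
```

Alternating this update with a re-weighting step (as in the sketch after claim 1) converges toward the transformation that overlaps the two sensors' target point maps at each sample time.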
Specification