Lidar and camera synchronization
First Claim
1. A method for synchronizing a light detection and ranging sensor (lidar) and a camera on an autonomous vehicle, comprising:
selecting a plurality of track samples for a route, each track sample comprising an image captured at a camera timestamp by the camera and a lidar scan captured by the lidar;
for each track sample, calculating a time shift comprising:
    for each time delta of a plurality of time deltas:
        adjusting the camera timestamp for the image by the time delta,
        projecting the lidar scan into image coordinates of the image as a lidar projection according to the adjusted camera timestamp, and
        calculating an alignment score of the lidar projection indicative of alignment of the lidar projection to the image;
    defining the time shift of the track sample as the time delta with an optimal alignment score;
modeling time drift of the camera compared to the lidar over the route based on the calculated time shifts for the track samples; and
synchronizing the lidar and the camera according to the modeled time drift.
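The per-sample search in claim 1 can be sketched as a grid search over candidate time deltas. This is an illustrative reading of the claim, not the patented implementation; `project_lidar` and `alignment_score` are hypothetical callables standing in for the projection and scoring steps, whose details the claim leaves open:

```python
def best_time_shift(camera_timestamp, lidar_scan, image,
                    time_deltas, project_lidar, alignment_score):
    """Return (time delta, score) whose lidar projection best aligns to the image.

    project_lidar(lidar_scan, adjusted_timestamp) -> lidar projection in image
    coordinates; alignment_score(projection, image) -> higher is better.
    Both are assumptions for illustration; the claim does not fix them.
    """
    best_delta, best_score = None, float("-inf")
    for delta in time_deltas:
        # Adjust the camera timestamp for the image by this time delta.
        adjusted = camera_timestamp + delta
        # Project the lidar scan into image coordinates at the adjusted time.
        projection = project_lidar(lidar_scan, adjusted)
        # Score how well the projection lines up with the image.
        score = alignment_score(projection, image)
        if score > best_score:
            best_delta, best_score = delta, score
    # The time shift of the track sample is the delta with the optimal score.
    return best_delta, best_score
```

In practice the alignment score might, for example, compare projected lidar depth discontinuities against image intensity edges; the claim does not commit to a particular metric.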
Abstract
A method and system for synchronizing a lidar and a camera on an autonomous vehicle. The system selects a plurality of track samples for a route, each including a lidar scan and an image. For each track sample, the system calculates a time shift by iterating over a plurality of time deltas. For each time delta, the system adjusts the camera timestamp by that time delta, projects the lidar scan onto the image as a lidar projection according to the adjusted camera timestamp, and calculates an alignment score of the lidar projection for that time delta. The system defines the time shift for each track sample as the time delta with the highest alignment score. The system then models time drift of the camera compared to the lidar based on the calculated time shifts for the track samples and synchronizes the lidar and the camera according to the modeled time drift.
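The drift-modeling and synchronization steps described above can be sketched under the assumption of a linear drift model (the abstract does not specify the model's form). `fit_linear_drift` and `synchronized_timestamp` are hypothetical names, and the sign convention for applying the modeled shift is likewise an assumption:

```python
def fit_linear_drift(sample_times, time_shifts):
    """Least-squares fit of time_shift ~ slope * t + offset over the route.

    sample_times: when each track sample was captured along the route.
    time_shifts: the per-sample best time deltas found by the search step.
    A linear model is an illustrative assumption, not the patented method.
    """
    n = len(sample_times)
    mean_t = sum(sample_times) / n
    mean_s = sum(time_shifts) / n
    cov = sum((t - mean_t) * (s - mean_s)
              for t, s in zip(sample_times, time_shifts))
    var = sum((t - mean_t) ** 2 for t in sample_times)
    slope = cov / var
    offset = mean_s - slope * mean_t
    return slope, offset


def synchronized_timestamp(camera_timestamp, slope, offset):
    """Correct a raw camera timestamp using the modeled drift.

    Adds the modeled shift to map camera time toward the lidar clock;
    the sign convention depends on how the per-sample shifts were defined.
    """
    return camera_timestamp + slope * camera_timestamp + offset
```

Fitting a model over the whole route, rather than applying each sample's shift directly, smooths out noisy per-sample estimates and lets the correction be extrapolated between samples.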
31 Claims
1. A method for synchronizing a light detection and ranging sensor (lidar) and a camera on an autonomous vehicle, comprising:
selecting a plurality of track samples for a route, each track sample comprising an image captured at a camera timestamp by the camera and a lidar scan captured by the lidar;
for each track sample, calculating a time shift comprising:
    for each time delta of a plurality of time deltas:
        adjusting the camera timestamp for the image by the time delta,
        projecting the lidar scan into image coordinates of the image as a lidar projection according to the adjusted camera timestamp, and
        calculating an alignment score of the lidar projection indicative of alignment of the lidar projection to the image;
    defining the time shift of the track sample as the time delta with an optimal alignment score;
modeling time drift of the camera compared to the lidar over the route based on the calculated time shifts for the track samples; and
synchronizing the lidar and the camera according to the modeled time drift.
Dependent claims: 2-13.
14. A non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform steps for synchronizing a lidar and a camera on an autonomous vehicle, the steps comprising:
selecting a plurality of track samples for a route, each track sample comprising an image captured at a camera timestamp by the camera and a lidar scan captured by the lidar;
for each track sample, calculating a time shift comprising:
    for each time delta of a plurality of time deltas:
        adjusting the camera timestamp for the image by the time delta,
        projecting the lidar scan into image coordinates of the image as a lidar projection according to the adjusted camera timestamp, and
        calculating an alignment score of the lidar projection indicative of alignment of the lidar projection to the image;
    defining the time shift of the track sample as the time delta with an optimal alignment score;
modeling time drift of the camera compared to the lidar over the route based on the calculated time shifts for the track samples; and
synchronizing the lidar and the camera according to the modeled time drift.
Dependent claims: 15-22.
23. A system comprising:
a lidar on an autonomous vehicle;
a camera on the autonomous vehicle;
a processor; and
a non-transitory computer-readable storage medium storing instructions that, when executed by the processor, cause the processor to perform steps for synchronizing the lidar and the camera, the steps comprising:
    selecting a plurality of track samples for a route, each track sample comprising an image captured at a camera timestamp by the camera and a lidar scan captured by the lidar;
    for each track sample, calculating a time shift comprising:
        for each time delta of a plurality of time deltas:
            adjusting the camera timestamp for the image by the time delta,
            projecting the lidar scan into image coordinates of the image as a lidar projection according to the adjusted camera timestamp, and
            calculating an alignment score of the lidar projection indicative of alignment of the lidar projection to the image;
        defining the time shift of the track sample as the time delta with an optimal alignment score;
    modeling time drift of the camera compared to the lidar over the route based on the calculated time shifts for the track samples; and
    synchronizing the lidar and the camera according to the modeled time drift.
Dependent claims: 24-31.
Specification