Multi-Range Three-Dimensional Imaging Systems
First Claim
1. A system for performing three-dimensional imaging of a scene, the system comprising:
a first lidar sensor having a first optical axis oriented at a first angle toward the scene, the first lidar sensor including:
a first laser source configured to emit a first plurality of laser pulses;
a first emission optical element configured to collimate and direct the first plurality of laser pulses at a first plurality of corresponding incidence angles with respect to the first optical axis toward one or more first objects in the scene, wherein a portion of each of the first plurality of laser pulses is reflected off of the one or more first objects;
a first receiving optical element configured to receive and focus the portion of each of the first plurality of laser pulses reflected off of the one or more first objects; and
a first photodetector configured to receive and detect the portion of each of the first plurality of laser pulses focused by the first receiving optical element;
a second lidar sensor having a second optical axis oriented at a second angle toward the scene, the second lidar sensor comprising:
a second laser source configured to emit a second plurality of laser pulses;
a second emission optical element configured to collimate and direct the second plurality of laser pulses at a second plurality of corresponding incidence angles with respect to the second optical axis toward one or more second objects in the scene, wherein a portion of each of the second plurality of laser pulses is reflected off of the one or more second objects;
a second receiving optical element configured to receive and focus the portion of each of the second plurality of laser pulses reflected off of the one or more second objects; and
a second photodetector configured to receive and detect the portion of each of the second plurality of laser pulses focused by the second receiving optical element; and
a processor including one or more processing units coupled to the first lidar sensor and the second lidar sensor, the processor configured to:
determine a time of flight for each of the first plurality of laser pulses and each of the second plurality of laser pulses from emission to detection; and
construct a three-dimensional image of the scene based on the determined time of flight for each of the first plurality of laser pulses and each of the second plurality of laser pulses, the first angle of the first optical axis, the first plurality of incidence angles, the second angle of the second optical axis, and the second plurality of incidence angles.
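The claim's recited computation (time of flight from emission to detection, then a 3-D point from the optical-axis angle and the pulse's incidence angles) can be sketched numerically. This is an illustrative sketch only: the function name, the azimuth/elevation decomposition of the incidence angles, and the coordinate convention (y forward, z up) are assumptions, not part of the claim.

```python
from math import cos, sin, radians

C = 299_792_458.0  # speed of light, m/s

def pulse_to_point(tof_s, axis_az_deg, inc_az_deg, inc_el_deg):
    """Convert one pulse's round-trip time of flight and angles to an (x, y, z) point.

    axis_az_deg -- azimuth of the sensor's optical axis (the claim's "first angle")
    inc_az_deg, inc_el_deg -- the pulse's incidence angles with respect to that axis
    """
    r = 0.5 * C * tof_s                      # the pulse travels out and back, so halve
    az = radians(axis_az_deg + inc_az_deg)   # absolute azimuth of this pulse
    el = radians(inc_el_deg)
    return (r * cos(el) * sin(az),           # x: right
            r * cos(el) * cos(az),           # y: forward
            r * sin(el))                     # z: up
```

A return from an object 100 m straight down the optical axis would arrive after `2 * 100 / C` seconds and map to the point (0, 100, 0) in this convention.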
3 Assignments
0 Petitions
Abstract
A three-dimensional imaging system includes a lidar sensor (a first three-dimensional sensor) having a first optical axis oriented at a first angle toward a scene and configured to construct a three-dimensional image of one or more first objects in the scene, and an optical three-dimensional sensor (a second three-dimensional sensor) having a second optical axis oriented at a second angle toward the scene and configured to construct a three-dimensional image of one or more second objects in the scene. The first three-dimensional sensor is characterized by a first angular field of view, and the second three-dimensional sensor by a second angular field of view different from the first.
24 Claims
1. A system for performing three-dimensional imaging of a scene (independent; recited in full under "First Claim" above). Dependent claims: 2, 3, 4, 5, 6, 7, 8.
9. A three-dimensional imaging system comprising:
a first three-dimensional sensor having a first optical axis oriented at a first angle toward a scene, the first three-dimensional sensor including:
a laser source configured to emit a plurality of laser pulses;
an emission optical element configured to collimate and direct the plurality of laser pulses at a plurality of corresponding incidence angles with respect to the first optical axis toward one or more first objects in the scene, wherein a portion of each of the plurality of laser pulses is reflected off of the one or more first objects;
a receiving optical element configured to receive and focus the portion of each of the plurality of laser pulses reflected off of the one or more first objects;
a photodetector configured to receive and detect the portion of each of the plurality of laser pulses focused by the receiving optical element; and
a processor including one or more processing units coupled to the laser source and the photodetector and configured to:
determine a time of flight for each of the plurality of laser pulses; and
construct a three-dimensional image of the one or more first objects based on the determined time of flight for each of the plurality of laser pulses, the first angle of the first optical axis, and the plurality of incidence angles;
wherein the first three-dimensional sensor is characterized by a first angular field of view; and
a second three-dimensional sensor configured to construct a three-dimensional image of one or more second objects in the scene, the second three-dimensional sensor characterized by a second angular field of view;
wherein the processor is coupled to the second three-dimensional sensor and configured to construct a three-dimensional image of the scene based on the three-dimensional image of the one or more first objects and the three-dimensional image of the one or more second objects.
Dependent claims: 10, 11, 12, 13, 14, 15, 16, 17.
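Claim 9's final step, building one image of the scene from two sensors mounted at different angles, amounts to transforming each sensor's points into a common frame before merging them. The sketch below assumes a flat yaw-only mounting, an x-right/y-forward/z-up frame, and illustrative function names; none of these details come from the claim.

```python
from math import cos, sin, radians

def to_common_frame(points, yaw_deg):
    """Rotate sensor-frame (x, y, z) points about the vertical axis by the
    sensor's mount yaw, expressing them in the shared scene frame."""
    c, s = cos(radians(yaw_deg)), sin(radians(yaw_deg))
    return [(c * x - s * y, s * x + c * y, z) for (x, y, z) in points]

def combined_cloud(first_pts, first_yaw_deg, second_pts, second_yaw_deg):
    """Merge the two sensors' three-dimensional images into one scene image."""
    return (to_common_frame(first_pts, first_yaw_deg)
            + to_common_frame(second_pts, second_yaw_deg))
```

For example, a point 1 m straight ahead of a sensor yawed 90° to the left lands at (-1, 0, 0) in the common frame, while the other sensor's points are rotated by its own mount angle before the two lists are concatenated.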
18. A method of three-dimensional sensing for an autonomous vehicle, the method comprising:
sensing one or more first objects in a forward direction using a lidar sensor including a laser source, a photodetector, an emission optical element, a receiving optical element, and a processor including one or more processing units coupled to the laser source and the photodetector, by:
emitting, using the laser source, a plurality of laser pulses;
collimating and directing, using the emission optical element, the plurality of laser pulses at a plurality of corresponding incidence angles with respect to the forward direction toward the one or more first objects, wherein a portion of each of the plurality of laser pulses is reflected off of the one or more first objects;
receiving and focusing, using the receiving optical element, the portion of each of the plurality of laser pulses reflected off of the one or more first objects;
detecting, using the photodetector, the portion of each of the plurality of laser pulses focused by the receiving optical element;
determining, using the processor, a time of flight for each of the plurality of laser pulses from emission to detection; and
constructing, using the processor, a three-dimensional image of the one or more first objects based on the determined time of flight for each of the plurality of laser pulses and the plurality of incidence angles;
sensing one or more second objects in a direction to the left or right using an optical three-dimensional sensor to obtain a three-dimensional image of the one or more second objects; and
combining, using the processor, the three-dimensional image of the one or more first objects and the three-dimensional image of the one or more second objects.
Dependent claims: 19, 20, 21, 22, 23, 24.
Specification