Method and apparatus for picking up a three-dimensional range image
Abstract
A method and apparatus for generating three-dimensional range images of spatial objects, wherein the object is briefly illuminated, for instance by laser diodes. The image sensor used has high light sensitivity and pixel resolution, can be read out randomly, and has an integration time that can be adjusted for each pixel. By evaluating the backscattered laser pulses in two integration windows with different integration times and by averaging over several laser pulses, three-dimensional range images can be picked up with high reliability in, for example, at most 5 ms.
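The "averaging over several laser pulses" mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation; the per-pulse reading format, the function name, and the choice to average raw intensities before any distance evaluation are all assumptions.

```python
def average_intensities(pulse_readings):
    """Average raw per-pulse integrals before any distance evaluation.

    pulse_readings: list of (u_short, u_long) pairs, one pair of
    integrated intensities (short and long integration window) per
    laser pulse.  Averaging N pulses suppresses uncorrelated shot
    noise by roughly 1/sqrt(N), one plausible route to the "high
    reliability" within the 5 ms budget stated in the abstract.
    """
    n = len(pulse_readings)
    mean_short = sum(r[0] for r in pulse_readings) / n
    mean_long = sum(r[1] for r in pulse_readings) / n
    return mean_short, mean_long
```

For example, `average_intensities([(1, 2), (3, 4)])` yields the pair of mean integrals `(2.0, 3.0)`.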
45 Claims
1. A method for picking up a three-dimensional range image of spatial objects using an optoelectronic sensor with pixel resolution having electronic short-time integrators for each pixel element within the sensor, wherein an integration time can be adjusted, comprising the steps of:
illuminating an object having a plurality of object points with one or more light pulses each having a predetermined period ΔL;
sensing light pulses with the sensor that have been backscattered by object points of the object at corresponding pixels of the sensor within a predetermined short integration time ΔA, where ΔA ≦ ΔL, and wherein a time instant for a beginning of the predetermined short integration time ΔA precedes incidence of the first backscattered light pulse at the sensor, which corresponds to a nearest object point;
registering intensities of each of the sensed light pulses that have been backscattered by the object points; and
computing distance values from different registered intensities of the backscattered light pulses resulting from their different transit times. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20)
wherein the two or more light sources are activated in series; and
evaluation is performed for each respective partial illumination.
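Under a simple model of the evaluation recited in claim 1 — a rectangular pulse of duration ΔL, a short window ΔA opening at time zero, and a second long window ΔB ≧ ΔL (as in the dependent claims) that captures the whole pulse and so cancels the unknown object reflectance — the transit time falls out of the intensity ratio. The function below is a hedged sketch under those assumptions, not the patented formula:

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_intensities(u_a, u_b, delta_a, delta_l):
    """Estimate an object-point distance from two integrated intensities.

    Model (an assumption, not claim language): the short window
    [0, delta_a] clips a rectangular pulse arriving after transit time t,
    so u_a = I * (delta_a - t), while the long window integrates the
    full pulse, u_b = I * delta_l.  The ratio u_a / u_b eliminates the
    unknown received intensity I.
    """
    transit = delta_a - (u_a / u_b) * delta_l  # round-trip time, seconds
    return C * transit / 2.0  # light travels to the object and back
```

Example: with a 60 ns pulse and a 30 ns short window, readings of 10 (short) versus 60 (long) imply a 20 ns round trip, i.e. roughly 3 m to the object point.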
7. The method as claimed in claim 1, wherein the integration time is configured to be adjusted pixel by pixel.
8. The method as claimed in claim 1, wherein a beginning of integration times ΔA and ΔB is delayed by a trigger impulse delay relative to the emitted pulse.

9. The method as claimed in claim 1, wherein the integration time ΔA is less than 100 ns.

10. The method as claimed in claim 1, wherein the integration time ΔB is approximately 1 μs.
11. The method as claimed in claim 1, wherein different integration times ΔA and ΔB are respectively adjusted row by row in order to simultaneously acquire a three-dimensional image and a gray value image on the sensor.

12. The method as claimed in claim 1, wherein different integration times ΔA and ΔB are respectively adjusted pixel by pixel in order to simultaneously acquire a three-dimensional image and a gray value image on the sensor.
13. The method as claimed in claim 1, wherein the sensor is read out randomly.
14. The method as claimed in claim 1, wherein the sensor is a CMOS sensor.

15. The method as claimed in claim 1, wherein the distance from the sensor to at least one reference point is set as a reference distance.

16. The method as claimed in claim 15, wherein the reference point is located at a door frame of a vehicle.

17. The method as claimed in claim 1, wherein at least one of static objects and motion sequences is detected.

18. The method as claimed in claim 17, wherein the object is one of physical objects and persons located in at least one of defined spaces and vehicles being monitored.

19. The method as claimed in claim 18, wherein at least one of a seat occupancy and a seat position of a person is detected.

20. The method as claimed in claim 17, wherein at least one of vehicles and crane systems is monitored and wherein a general position determination is executed in a navigation system.
21. A method for picking up a three-dimensional range image of spatial objects using an optoelectronic sensor with pixel resolution having electronic short-time integrators for each pixel element, wherein an integration time can be adjusted, comprising the steps of:
picking up and integrating a sensor signal at the sensor from a beginning of the picking up and integration to a defined integration time T2, the integration representing a dark current and environmental light;
illuminating an object by an illumination device simultaneous to the beginning of the picking up and integration of the sensor signal at the sensor, wherein integration occurs within a light intensity rise of the light received at the sensor up to an integration time T1, and T1 is less than T2;
repeatedly illuminating the object by the illumination device with simultaneous starting of the picking up and integration of the sensor signal at the sensor, wherein integration occurs within the light intensity rise of the light received at the sensor up to the integration time T2;
reading out and storing for all pixels the respectively integrated value of the sensor signal at the times T1 and T2; and
calculating for each pixel a transit time T0 of the light from the illumination device to the sensor via the object and a corresponding distance value based on the stored integrated values. - View Dependent Claims (22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41)
wherein the two or more light sources are activated in series and the evaluation is performed for each respective partial illumination.
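Claim 21's three measurements — a dark/ambient integral up to T2, a signal integral within the rising edge up to T1, and a signal integral up to T2 — determine the light transit time T0 if the received intensity rises linearly once light arrives at the pixel. That linearity, the constant dark rate, and the variable names below are assumptions of this sketch, not claim language:

```python
C = 299_792_458.0  # speed of light in m/s

def transit_time_and_distance(u1, u2, t1, t2, dark_t2=0.0):
    """Recover T0 and a distance from the three integrals of claim 21.

    u1: integral (illumination on) up to T1
    u2: integral (illumination on) up to T2
    dark_t2: dark/ambient integral up to T2 from the first measurement;
             a constant dark rate is assumed, so the dark share up to T1
             is dark_t2 * t1 / t2.
    """
    s1 = u1 - dark_t2 * t1 / t2
    s2 = u2 - dark_t2
    # Linear rise after arrival: s1 = a*(t1 - t0), s2 = a*(t2 - t0).
    # Eliminating the unknown slope a gives:
    t0 = (s2 * t1 - s1 * t2) / (s2 - s1)
    return t0, C * t0 / 2.0  # round trip covers twice the distance
```

For instance, integrals of 0.15 (to T1 = 50 ns) and 0.4 (to T2 = 100 ns) with negligible dark current solve to T0 = 20 ns, about 3 m of range.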
28. The method as claimed in claim 21, wherein the integration time is configured to be adjusted pixel by pixel.

29. The method as claimed in claim 21, wherein a beginning of an integration time is delayed by a trigger impulse delay relative to the emitted pulse.

30. The method as claimed in claim 21, wherein the integration time is less than 100 ns.

31. The method as claimed in claim 21, wherein the integration time is approximately 1 μs.
32. The method as claimed in claim 21, wherein different integration times T1 and T2 are respectively adjusted row by row in order to simultaneously acquire a three-dimensional image and a gray value image on the sensor.
33. The method as claimed in claim 21, wherein different integration times T1 and T2 are respectively adjusted pixel by pixel in order to simultaneously acquire a three-dimensional image and a gray value image on the sensor.

34. The method as claimed in claim 21, wherein the sensor is read out randomly.

35. The method as claimed in claim 21, wherein the sensor is a CMOS sensor.

36. The method as claimed in claim 21, wherein the distance from the sensor to at least one reference point is set as a reference distance.

37. The method as claimed in claim 36, wherein the reference point is located at a door frame of a vehicle.

38. The method as claimed in claim 21, wherein at least one of static objects and motion sequences is detected.

39. The method as claimed in claim 38, wherein the object is one of physical objects and persons located in at least one of defined spaces and vehicles being monitored.

40. The method as claimed in claim 39, wherein at least one of a seat occupancy and a seat position of a person is detected.

41. The method as claimed in claim 38, wherein at least one of vehicles and crane systems is monitored and wherein a general position determination is executed in a navigation system.

42. An apparatus for picking up a three-dimensional range image, comprising:
an illuminating device that emits light pulses onto an object via a first optical system;
an optoelectronic sensor with a second optical system, configured to sense received light pulses backscattered by the object within an adjustable integration time, the sensor comprising a plurality of pixel elements to provide a pixel resolution, the pixel elements being randomly readable and configured to adjust the integration time pixel by pixel;
a triggering mechanism configured to provide time synchronization between the illumination device and the sensor; and
a computing unit for calculating a three-dimensional image from corresponding charges of pixel elements of the sensor that have been charged by the received light pulses. - View Dependent Claims (43, 44, 45)
Specification