THREE-DIMENSIONAL OBJECT DETECTION DEVICE, AND THREE-DIMENSIONAL OBJECT DETECTION METHOD
First Claim
1. A three-dimensional object detection device comprising:
an image capturing device mounted on a host vehicle, and comprising a lens for capturing images behind the host vehicle;
a three-dimensional object detection unit programmed to detect a presence of a three-dimensional object rearward of the host vehicle based on captured images captured by the image capturing device;
a first edge intensity calculation unit programmed to extract edges of a subject in a first edge extraction area that includes at least a horizon reference area that references a horizon, and to calculate an intensity of the edges in the first edge extraction area as a first edge intensity based on a distribution of the edges extracted from the first edge extraction area;
a second edge intensity calculation unit programmed to extract edges of a subject in a second edge extraction area that includes at least a road edge reference area that references a road edge, and to calculate an intensity of the edges in the second edge extraction area as a second edge intensity based on a distribution of the edges that are extracted from the second edge extraction area;
a day/night assessment unit programmed to assess whether currently daytime or nighttime exists; and
a controller programmed to control detection of the three-dimensional object by the three-dimensional object detection unit based on the first edge intensity in the first edge extraction area when an assessment has been made that daytime currently exists by the day/night assessment unit, and programmed to control the detection of the three-dimensional object by the three-dimensional object detection unit based on the second edge intensity in the second edge extraction area when an assessment has been made that nighttime currently exists.
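Claim 1's controller selects a reference area according to the day/night assessment and controls detection from the resulting edge intensity. A minimal sketch of that selection logic in Python follows; the function names (`edge_intensity`, `detection_threshold`) and the linear base/gain mapping from edge intensity to a detection threshold are illustrative assumptions, not the patented implementation:

```python
def edge_intensity(edge_strengths):
    """Mean edge strength over an extraction area -- one simple proxy for an
    'intensity based on a distribution of edges' in the claim language."""
    return sum(edge_strengths) / len(edge_strengths) if edge_strengths else 0.0

def detection_threshold(is_daytime, horizon_area_edges, road_edge_area_edges,
                        base=10.0, gain=0.5):
    """Daytime: use edges from the horizon reference area; nighttime: use
    edges from the road-edge reference area. The base/gain mapping is a
    hypothetical stand-in for the claimed threshold control."""
    area = horizon_area_edges if is_daytime else road_edge_area_edges
    return base + gain * edge_intensity(area)
```

In this sketch, stronger reference-area edges (a clear lens and scene) raise the detection threshold, making detection stricter when image conditions are good.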
Abstract
A three-dimensional object detection device has an image capturing unit, an object detection unit, first and second edge intensity calculation units, a day/night assessment unit and a controller. The day/night assessment unit assesses whether it is currently daytime or nighttime when a three-dimensional object is being detected based on the captured images. Upon assessing that it is daytime, edges of a subject are extracted from a first edge extraction area, which includes a horizon reference area, and a threshold value for detecting the three-dimensional object is set based on the intensity of the edges in the first edge extraction area. Upon assessing that it is nighttime, edges of a subject are extracted from a second edge extraction area, which includes a road edge reference area, and a threshold value for detecting the three-dimensional object is set based on the intensity of the edges extracted from the second edge extraction area.
22 Claims
1. A three-dimensional object detection device comprising:
an image capturing device mounted on a host vehicle, and comprising a lens for capturing images behind the host vehicle;
a three-dimensional object detection unit programmed to detect a presence of a three-dimensional object rearward of the host vehicle based on captured images captured by the image capturing device;
a first edge intensity calculation unit programmed to extract edges of a subject in a first edge extraction area that includes at least a horizon reference area that references a horizon, and to calculate an intensity of the edges in the first edge extraction area as a first edge intensity based on a distribution of the edges extracted from the first edge extraction area;
a second edge intensity calculation unit programmed to extract edges of a subject in a second edge extraction area that includes at least a road edge reference area that references a road edge, and to calculate an intensity of the edges in the second edge extraction area as a second edge intensity based on a distribution of the edges that are extracted from the second edge extraction area;
a day/night assessment unit programmed to assess whether currently daytime or nighttime exists; and
a controller programmed to control detection of the three-dimensional object by the three-dimensional object detection unit based on the first edge intensity in the first edge extraction area when an assessment has been made that daytime currently exists by the day/night assessment unit, and programmed to control the detection of the three-dimensional object by the three-dimensional object detection unit based on the second edge intensity in the second edge extraction area when an assessment has been made that nighttime currently exists.
Dependent claims: 2-18.
19. A three-dimensional object detection method comprising:
converting viewpoints of captured images into bird's-eye view images;
aligning positions of the bird's-eye view images that are obtained at different times in a bird's-eye view;
generating differential waveform information by counting a number of pixels that indicate a predetermined difference that is equal to or greater than a predetermined first threshold value in a differential image of the aligned bird's-eye view images to form a frequency distribution, and detecting a presence of a three-dimensional object based on the differential waveform information when a peak value of the differential waveform information is equal to or greater than a predetermined second threshold value;
assessing whether currently daytime or nighttime exists;
extracting edges of a subject in a first edge extraction area, including at least a horizon reference area that references the horizon, when the assessment has been made that currently daytime exists;
calculating an intensity of the edges in the first edge extraction area as a first edge intensity based on a distribution of the edges extracted from the first edge extraction area, when the assessment has been made that currently daytime exists;
setting one of the first threshold value and the second threshold value based on the first edge intensity, when the assessment has been made that currently daytime exists;
extracting edges of a subject in a second edge extraction area, including a road edge reference area that references the road edge, when an assessment has been made that currently nighttime exists;
calculating an intensity of the edges in the second edge extraction area as a second edge intensity based on a distribution of the edges extracted from the second edge extraction area, when the assessment has been made that currently nighttime exists; and
setting one of the first threshold value and the second threshold value based on the second edge intensity, when the assessment has been made that currently nighttime exists.
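The differential-waveform step of claim 19 can be sketched as follows. The column-wise pixel counting and the `object_present` peak test are a simplified stand-in for the claimed frequency-distribution procedure, assuming two already-aligned bird's-eye images given as 2-D luminance lists:

```python
def differential_waveform(bev_prev, bev_curr, first_thresh):
    """Per column, count pixels whose absolute frame-to-frame difference is at
    least the first threshold. The resulting list is a simple frequency
    distribution over image columns (the 'differential waveform')."""
    rows, cols = len(bev_curr), len(bev_curr[0])
    return [sum(1 for r in range(rows)
                if abs(bev_curr[r][c] - bev_prev[r][c]) >= first_thresh)
            for c in range(cols)]

def object_present(waveform, second_thresh):
    """Detect an object when the waveform's peak reaches the second threshold."""
    return max(waveform) >= second_thresh
```

A stationary road surface yields near-zero counts, while a vehicle moving relative to the host produces a column of large differences and hence a peak.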
20. A three-dimensional object detection method comprising:
converting viewpoints of captured images into bird's-eye view images;
detecting edge components having a luminance difference between adjacent pixel areas equal to or greater than a predetermined first threshold value from the bird's-eye view images;
detecting a presence of a three-dimensional object based on edge information when a quantity of the edge information based on the edge components is equal to or greater than a predetermined second threshold value;
assessing whether currently daytime or nighttime exists;
extracting edges of a subject in a first edge extraction area, including at least a horizon reference area that references the horizon, when an assessment has been made that currently daytime exists;
calculating an intensity of the edges in the first edge extraction area as a first edge intensity based on a distribution of the edges extracted from the first edge extraction area, when the assessment has been made that currently daytime exists;
setting one of the first threshold value and the second threshold value based on the first edge intensity, when the assessment has been made that currently daytime exists;
extracting edges of a subject in a second edge extraction area, including a road edge reference area that references the road edge, when an assessment has been made that currently nighttime exists;
calculating an intensity of the edges in the second edge extraction area as a second edge intensity based on a distribution of the edges extracted from the second edge extraction area, when the assessment has been made that currently nighttime exists; and
setting one of the first threshold value and the second threshold value based on the second edge intensity, when the assessment has been made that currently nighttime exists.
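The edge-component step of claim 20 can be illustrated in one dimension: mark adjacent-pixel pairs whose luminance difference meets the first threshold, then detect an object when enough components are found. The names `edge_components` and `object_from_edges` are hypothetical, and a real bird's-eye image would be scanned in 2-D:

```python
def edge_components(bev_row, first_thresh):
    """Flag each adjacent-pixel pair whose luminance difference is at least
    the first threshold (a 1-D stand-in for the bird's-eye edge scan)."""
    return [abs(bev_row[i + 1] - bev_row[i]) >= first_thresh
            for i in range(len(bev_row) - 1)]

def object_from_edges(bev_row, first_thresh, second_thresh):
    """Quantity of edge information = number of qualifying edge components;
    detect an object when it reaches the second threshold."""
    return sum(edge_components(bev_row, first_thresh)) >= second_thresh
```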
21. A three-dimensional object detection method comprising:
converting viewpoints of captured images from an image capturing device into bird's-eye view images;
generating differential waveform information based on a difference between the bird's-eye view images obtained at different times, and detecting a presence of a three-dimensional object based on the differential waveform information;
assessing whether currently daytime or nighttime exists;
extracting edges of a subject in a first edge extraction area, including at least a horizon reference area that references the horizon, when the assessment has been made that currently daytime exists;
calculating an intensity of the edges in the first edge extraction area as a first edge intensity based on a distribution of the edges extracted from the first edge extraction area, when the assessment has been made that currently daytime exists;
conducting exposure control of the image capturing device based on the first edge intensity, when the assessment has been made that currently daytime exists;
extracting edges of a subject in a second edge extraction area, including a road edge reference area that references the road edge, when an assessment has been made that currently nighttime exists;
calculating an intensity of the edges in the second edge extraction area as a second edge intensity based on a distribution of the edges extracted from the second edge extraction area, when an assessment has been made that currently nighttime exists; and
conducting exposure control of the image capturing device based on the second edge intensity, when an assessment has been made that currently nighttime exists.
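The exposure-control step of claims 21 and 22 can be sketched as a simple rule: weak edges in the active reference area (e.g. a dim scene or soiled lens) lengthen exposure, while strong edges shorten it. The inverse intensity-to-exposure mapping and the clamp limits below are invented for illustration only, since the claims do not specify a control law:

```python
def exposure_setting(is_daytime, horizon_edges, road_edge_edges,
                     nominal_exposure=1.0):
    """Pick the day/night reference area, compute its mean edge intensity,
    and scale exposure inversely with that intensity (hypothetical rule),
    clamped to a plausible [0.5x, 2.0x] range around nominal."""
    ref = horizon_edges if is_daytime else road_edge_edges
    intensity = sum(ref) / len(ref) if ref else 0.0
    return max(0.5, min(2.0, nominal_exposure * (50.0 / max(intensity, 25.0))))
```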
22. A three-dimensional object detection method comprising:
converting viewpoints of captured images from an image capturing device into bird's-eye view images;
detecting edge information based on the bird's-eye view images;
detecting a presence of a three-dimensional object based on the edge information;
assessing whether currently daytime or nighttime exists;
extracting edges of a subject in a first edge extraction area, including at least a horizon reference area that references the horizon, when the assessment has been made that currently daytime exists;
calculating an intensity of the edges in the first edge extraction area as a first edge intensity based on a distribution of the edges extracted from the first edge extraction area, when the assessment has been made that currently daytime exists;
conducting exposure control of the image capturing device based on the first edge intensity, when the assessment has been made that currently daytime exists;
extracting edges of a subject in a second edge extraction area, including a road edge reference area that references the road edge, when an assessment has been made that currently nighttime exists;
calculating an intensity of the edges in the second edge extraction area as a second edge intensity based on a distribution of the edges extracted from the second edge extraction area, when an assessment has been made that currently nighttime exists; and
conducting exposure control of the image capturing device based on the second edge intensity in the second edge extraction area, when an assessment has been made that currently nighttime exists.
Specification