Headlight, taillight and streetlight detection
First Claim
1. In a computerized vehicle control system including an image sensor mounted on a moving vehicle, wherein the image sensor captures consecutively in real time a plurality of image frames, a method comprising the steps of:
(a) in at least one of the image frames, detecting a spot of measurable brightness;
(b) matching in at least one subsequent image frame of the image frames, a corresponding spot, wherein said spot and said corresponding spot are images of a light source;
(c) acquiring data respectively from said spot and from said corresponding spot, wherein said data includes at least one of position characteristics, intensity characteristics, size characteristics, and shape characteristics, but not color characteristics, of said light source; and
(d) processing said data, thereby classifying the light source based on said data, and producing a light source classification, wherein substantially all the image frames are available to the computerized vehicle control system and at least one other vehicle control system;
wherein the vehicle control system controls headlights of the moving vehicle based on said light source classification and said at least one other vehicle control system is selected from the group consisting of a lane departure warning system, a collision warning system, and an ego-motion estimation system.
Abstract
A method in a computerized system including an image sensor mounted in a moving vehicle. The image sensor captures image frames consecutively in real time. In one of the image frames, a spot of measurable brightness is detected; the spot is matched in subsequent image frames. The image frames are available for sharing between the computerized system and another vehicle control system. The spot and the corresponding spot are images of the same object. The object is typically one or more of headlights from an oncoming vehicle, taillights of a leading vehicle, streetlights, street signs and/or traffic signs. Data is acquired from the spot and from the corresponding spot. By processing the data, the object (or spot) is classified, producing an object classification. The vehicle control system preferably controls the headlights of the moving vehicle based on the object classification. The other vehicle control system using the image frames is one or more of a lane departure warning system, a collision warning system and/or an ego-motion estimation system.
158 Citations
METHOD AND CAMERA SYSTEM FOR THE GENERATION OF IMAGES FOR THE TRANSMISSION TO AN EXTERNAL CONTROL UNIT | ||
Patent #
US 20110080495A1
Filed 10/01/2010
|
Current Assignee
First Sensor Mobility GmbH
|
Original Assignee
First Sensor Mobility GmbH
|
Method and a system for detecting a road at night | ||
Patent #
US 7,936,903 B2
Filed 05/16/2006
|
Current Assignee
Koninklijke Philips N.V.
|
Original Assignee
Koninklijke Philips N.V.
|
Multi-Function Summing Machine | ||
Patent #
US 20110280495A1
Filed 05/14/2010
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
Mobileye Technologies Limited
|
Detecting and recognizing traffic signs | ||
Patent #
US 8,064,643 B2
Filed 12/06/2007
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
Mobileye Technologies Limited
|
Vehicle detecting apparatus | ||
Patent #
US 7,839,303 B2
Filed 10/04/2007
|
Current Assignee
DENSO Corporation
|
Original Assignee
DENSO Corporation
|
ENHANCED CLEAR PATH DETECTION IN THE PRESENCE OF TRAFFIC INFRASTRUCTURE INDICATOR | ||
Patent #
US 20100100268A1
Filed 10/19/2009
|
Current Assignee
GM Global Technology Operations LLC
|
Original Assignee
GM Global Technology Operations Incorporated
|
MULTIOBJECT FUSION MODULE FOR COLLISION PREPARATION SYSTEM | ||
Patent #
US 20100191391A1
Filed 01/19/2010
|
Current Assignee
GM Global Technology Operations LLC
|
Original Assignee
GM Global Technology Operations Incorporated
|
BUNDLING OF DRIVER ASSISTANCE SYSTEMS | ||
Patent #
US 20100172542A1
Filed 10/05/2009
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
Mobileye Technologies Limited
|
Vehicle detecting apparatus | ||
Patent #
US 20080088481A1
Filed 10/04/2007
|
Current Assignee
DENSO Corporation
|
Original Assignee
DENSO Corporation
|
DETECTING AND RECOGNIZING TRAFFIC SIGNS | ||
Patent #
US 20080137908A1
Filed 12/06/2007
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
Mobileye Technologies Limited
|
Method and a System For Detecting a Road at Night | ||
Patent #
US 20080199045A1
Filed 05/16/2006
|
Current Assignee
Koninklijke Philips N.V.
|
Original Assignee
Koninklijke Philips N.V.
|
VEHICLE DETECTION APPARATUS | ||
Patent #
US 20120027255A1
Filed 07/22/2011
|
Current Assignee
Koito Manufacturing Company Limited
|
Original Assignee
Koito Manufacturing Company Limited
|
Bundling of driver assistance systems | ||
Patent #
US 8,254,635 B2
Filed 10/05/2009
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
Mobileye Technologies Limited
|
WIDTH CALIBRATION OF LANE DEPARTURE WARNING SYSTEM | ||
Patent #
US 20130027195A1
Filed 07/25/2011
|
Current Assignee
Ford Global Technologies LLC
|
Original Assignee
Ford Global Technologies LLC
|
Method and device for classifying a light object located ahead of a vehicle | ||
Patent #
US 20130058592A1
Filed 08/22/2012
|
Current Assignee
Robert Bosch GmbH
|
Original Assignee
Gregor Schwarzenberg
|
Multi-function summing machine | ||
Patent #
US 8,538,205 B2
Filed 05/14/2010
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
Mobileye Technologies Limited
|
Width calibration of lane departure warning system | ||
Patent #
US 8,665,078 B2
Filed 07/25/2011
|
Current Assignee
Ford Global Technologies LLC
|
Original Assignee
Ford Global Technologies LLC
|
Enhanced clear path detection in the presence of traffic infrastructure indicator | ||
Patent #
US 8,751,154 B2
Filed 10/19/2009
|
Current Assignee
GM Global Technology Operations LLC
|
Original Assignee
GM Global Technology Operations LLC
|
Multiobject fusion module for collision preparation system | ||
Patent #
US 8,812,226 B2
Filed 01/19/2010
|
Current Assignee
GM Global Technology Operations LLC
|
Original Assignee
GM Global Technology Operations LLC
|
APPARATUS TO RECOGNIZE ILLUMINATION ENVIRONMENT OF VEHICLE AND CONTROL METHOD THEREOF | ||
Patent #
US 20140232854A1
Filed 02/11/2014
|
Current Assignee
Mando Corporation
|
Original Assignee
Mando Corporation
|
Driver assistance system for vehicle | ||
Patent #
US 8,818,042 B2
Filed 11/18/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Automatic vehicle exterior light control | ||
Patent #
US 8,842,176 B2
Filed 01/15/2010
|
Current Assignee
Donnelly Corporation
|
Original Assignee
Donnelly Corporation
|
Vehicular vision system | ||
Patent #
US 8,917,169 B2
Filed 12/02/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method and device for classifying a light object located ahead of a vehicle | ||
Patent #
US 8,965,142 B2
Filed 08/22/2012
|
Current Assignee
Robert Bosch GmbH
|
Original Assignee
Robert Bosch GmbH
|
Driver assistance system for vehicle | ||
Patent #
US 8,977,008 B2
Filed 07/08/2013
|
Current Assignee
Donnelly Corporation
|
Original Assignee
Donnelly Corporation
|
Detecting and recognizing traffic signs | ||
Patent #
US 8,995,723 B2
Filed 09/19/2011
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
MobilEye Vision Technologies Ltd.
|
Driver assistance system for a vehicle | ||
Patent #
US 8,993,951 B2
Filed 07/16/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 9,008,369 B2
Filed 08/25/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Driver assistance system for vehicle | ||
Patent #
US 9,014,904 B2
Filed 09/23/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle detection apparatus | ||
Patent #
US 9,042,600 B2
Filed 07/22/2011
|
Current Assignee
Koito Manufacturing Company Limited
|
Original Assignee
Koito Manufacturing Company Limited
|
Multi-camera vision system for a vehicle | ||
Patent #
US 9,131,120 B2
Filed 05/15/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 9,171,217 B2
Filed 03/03/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular vision system | ||
Patent #
US 9,191,574 B2
Filed 03/13/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 9,191,634 B2
Filed 04/03/2015
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Driver assistance system for vehicle | ||
Patent #
US 9,193,303 B2
Filed 04/20/2015
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system using kinematic model of vehicle motion | ||
Patent #
US 9,205,776 B2
Filed 05/20/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Automatic vehicle equipment monitoring, warning, and control system | ||
Patent #
US 9,230,183 B2
Filed 05/20/2013
|
Current Assignee
Gentex Corporation
|
Original Assignee
Gentex Corporation
|
Vehicle vision system with lens pollution detection | ||
Patent #
US 9,319,637 B2
Filed 03/27/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method and system for dynamically calibrating vehicular cameras | ||
Patent #
US 9,357,208 B2
Filed 01/20/2012
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Trailer lane departure warning system | ||
Patent #
US 9,373,044 B2
Filed 03/03/2014
|
Current Assignee
Ford Global Technologies LLC
|
Original Assignee
Ford Global Technologies LLC
|
Driver assist system for vehicle | ||
Patent #
US 9,376,060 B2
Filed 11/16/2015
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 9,428,192 B2
Filed 11/16/2015
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method for controlling a light emission of a headlight of a vehicle | ||
Patent #
US 9,428,103 B2
Filed 08/22/2012
|
Current Assignee
Robert Bosch GmbH
|
Original Assignee
Robert Bosch GmbH
|
Vehicle vision system | ||
Patent #
US 9,436,880 B2
Filed 01/13/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with dirt detection | ||
Patent #
US 9,445,057 B2
Filed 02/19/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 9,440,535 B2
Filed 01/27/2014
|
Current Assignee
Magna Mirrors of America Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Adjustable camera mount for a vehicle windshield | ||
Patent #
US 9,459,515 B2
Filed 05/31/2012
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
MobilEye Vision Technologies Ltd.
|
Vehicle control system with adaptive wheel angle correction | ||
Patent #
US 9,487,235 B2
Filed 04/01/2015
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle camera alignment system | ||
Patent #
US 9,491,450 B2
Filed 07/30/2012
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Calibration system and method for vehicular surround vision system | ||
Patent #
US 9,491,451 B2
Filed 11/14/2012
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular multi-camera vision system | ||
Patent #
US 9,508,014 B2
Filed 05/05/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Image processing method for detecting objects using relative motion | ||
Patent #
US 9,547,795 B2
Filed 01/20/2012
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Driver assistance system for vehicle | ||
Patent #
US 9,555,803 B2
Filed 05/16/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with targetless camera calibration | ||
Patent #
US 9,563,951 B2
Filed 05/20/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 9,609,289 B2
Filed 08/29/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 9,643,605 B2
Filed 10/26/2015
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Enhanced clear path detection in the presence of traffic infrastructure indicator | ||
Patent #
US 9,652,980 B2
Filed 04/23/2014
|
Current Assignee
GM Global Technology Operations LLC
|
Original Assignee
GM Global Technology Operations LLC
|
Driver assist system for vehicle | ||
Patent #
US 9,656,608 B2
Filed 06/13/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Calibration system and method for multi-camera vision system | ||
Patent #
US 9,688,200 B2
Filed 03/03/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system using kinematic model of vehicle motion | ||
Patent #
US 9,701,246 B2
Filed 12/07/2015
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Multi-camera image stitching calibration system | ||
Patent #
US 9,723,272 B2
Filed 10/04/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 9,736,435 B2
Filed 03/20/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with customized display | ||
Patent #
US 9,762,880 B2
Filed 12/07/2012
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision-based object detection using a polar grid | ||
Patent #
US 9,766,628 B1
Filed 04/04/2014
|
Current Assignee
Waymo LLC
|
Original Assignee
Waymo LLC
|
Systems and methods for identifying traffic control devices and testing the retroreflectivity of the same | ||
Patent #
US 9,767,371 B2
Filed 03/27/2015
|
Current Assignee
Georgia Tech Research Corporation
|
Original Assignee
Georgia Tech Research Corporation
|
Vehicular multi-camera vision system | ||
Patent #
US 9,769,381 B2
Filed 11/28/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method and apparatus for tunnel decision | ||
Patent #
US 9,785,843 B2
Filed 07/16/2015
|
Current Assignee
Hyundai Motor Company
|
Original Assignee
Hyundai Motor Company
|
Apparatus to recognize illumination environment of vehicle and control method thereof | ||
Patent #
US 9,787,949 B2
Filed 02/11/2014
|
Current Assignee
Mando Corporation
|
Original Assignee
Mando Corporation
|
Driving assist system for vehicle | ||
Patent #
US 9,834,142 B2
Filed 05/19/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular control system using cameras and radar sensor | ||
Patent #
US 9,834,216 B2
Filed 01/24/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method and system for dynamically calibrating vehicular cameras | ||
Patent #
US 9,834,153 B2
Filed 04/25/2012
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
System and method of establishing a multi-camera image using pixel remapping | ||
Patent #
US 9,900,522 B2
Filed 12/01/2011
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with calibration algorithm | ||
Patent #
US 9,916,660 B2
Filed 01/15/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Driver assistance system for vehicle | ||
Patent #
US 9,940,528 B2
Filed 11/20/2015
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 9,948,904 B2
Filed 08/14/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Dense structure from motion | ||
Patent #
US 9,959,595 B2
Filed 02/18/2014
|
Current Assignee
Mobileye Technologies Limited
|
Original Assignee
MobilEye Vision Technologies Ltd.
|
Trailer sway warning system and method | ||
Patent #
US 9,963,004 B2
Filed 03/25/2015
|
Current Assignee
Ford Global Technologies LLC
|
Original Assignee
Ford Global Technologies LLC
|
Vehicular imaging system comprising an imaging device with a single image sensor and image processor for determining a totally blocked state or partially blocked state of the single image sensor as well as an automatic correction for misalignment of the imaging device | ||
Patent #
US 9,972,100 B2
Filed 04/23/2015
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with targetless camera calibration | ||
Patent #
US 9,979,957 B2
Filed 01/26/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular control system | ||
Patent #
US 10,015,452 B1
Filed 04/16/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with lens pollution detection | ||
Patent #
US 10,021,278 B2
Filed 04/18/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Spectral filtering for vehicular driver assistance systems | ||
Patent #
US 10,027,930 B2
Filed 03/28/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Image processing method for detecting objects using relative motion | ||
Patent #
US 10,043,082 B2
Filed 01/16/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Control system for vehicle | ||
Patent #
US 10,046,702 B2
Filed 12/04/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular multi-camera vision system | ||
Patent #
US 10,057,489 B2
Filed 09/18/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 10,071,676 B2
Filed 09/12/2016
|
Current Assignee
Magna Mirrors of America Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 10,071,687 B2
Filed 11/27/2012
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Barrier and guardrail detection using a single camera | ||
Patent #
US 10,078,788 B2
Filed 02/01/2016
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
MobilEye Vision Technologies Ltd.
|
Vehicle vision system with dirt detection | ||
Patent #
US 10,089,540 B2
Filed 09/12/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 10,099,614 B2
Filed 11/27/2012
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular control system | ||
Patent #
US 10,110,860 B1
Filed 07/02/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Barrier and guardrail detection using a single camera | ||
Patent #
US 10,115,027 B2
Filed 03/06/2017
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
MobilEye Vision Technologies Ltd.
|
Vehicular control system using cameras and radar sensor | ||
Patent #
US 10,118,618 B2
Filed 12/04/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with customized display | ||
Patent #
US 10,129,518 B2
Filed 09/11/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Adjustable camera mount for a vehicle windshield | ||
Patent #
US 10,139,708 B2
Filed 08/24/2016
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
MobilEye Vision Technologies Ltd.
|
Vison-based object detection using a polar grid | ||
Patent #
US 10,168,712 B1
Filed 08/08/2017
|
Current Assignee
Waymo LLC
|
Original Assignee
Waymo LLC
|
Multi-camera dynamic top view vision system | ||
Patent #
US 10,179,543 B2
Filed 02/27/2014
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Multi-camera vehicle vision system with image gap fill | ||
Patent #
US 10,187,590 B2
Filed 10/26/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular control system | ||
Patent #
US 10,187,615 B1
Filed 10/22/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle control system with adaptive wheel angle correction | ||
Patent #
US 10,202,147 B2
Filed 11/07/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method for dynamically calibrating vehicular cameras | ||
Patent #
US 10,202,077 B2
Filed 05/23/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with calibration algorithm | ||
Patent #
US 10,235,775 B2
Filed 03/07/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Real time traffic sign recognition | ||
Patent #
US 10,255,511 B2
Filed 12/30/2016
|
Current Assignee
Texas Instruments Inc.
|
Original Assignee
Texas Instruments Inc.
|
Calibration system and method for vehicular surround vision system | ||
Patent #
US 10,264,249 B2
Filed 11/07/2016
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system using kinematic model of vehicle motion | ||
Patent #
US 10,266,115 B2
Filed 07/10/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Multi-camera image stitching calibration system | ||
Patent #
US 10,284,818 B2
Filed 07/31/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Multi-sensor interior mirror device with image adjustment | ||
Patent #
US 10,300,859 B2
Filed 06/08/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular control system | ||
Patent #
US 10,306,190 B1
Filed 01/21/2019
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
METHOD AND APPARATUS FOR TUNNEL DECISION | ||
Patent #
US 20160162741A1
Filed 07/16/2015
|
Current Assignee
Hyundai Motor Company
|
Original Assignee
Hyundai Motor Company
|
Device and method of controlling the device | ||
Patent #
US 10,341,442 B2
Filed 01/08/2016
|
Current Assignee
Samsung Electronics Co. Ltd.
|
Original Assignee
Samsung Electronics Co. Ltd.
|
Vehicular control system using cameras and radar sensor | ||
Patent #
US 10,351,135 B2
Filed 11/01/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with lens pollution detection | ||
Patent #
US 10,397,451 B2
Filed 07/09/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Barrier and guardrail detection using a single camera | ||
Patent #
US 10,445,595 B2
Filed 08/23/2018
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
MobilEye Vision Technologies Ltd.
|
Vehicle vision system with adjustable computation and data compression | ||
Patent #
US 10,452,076 B2
Filed 12/19/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Processing method for distinguishing a three dimensional object from a two dimensional object using a vehicular system | ||
Patent #
US 10,452,931 B2
Filed 08/06/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with multi-paned view | ||
Patent #
US 10,457,209 B2
Filed 03/28/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular control system | ||
Patent #
US 10,462,426 B2
Filed 05/16/2019
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Multi-camera dynamic top view vision system | ||
Patent #
US 10,486,596 B2
Filed 01/14/2019
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle camera system with image manipulation | ||
Patent #
US 10,493,916 B2
Filed 02/22/2013
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular vision system | ||
Patent #
US 10,509,972 B2
Filed 04/09/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle vision system with customized display | ||
Patent #
US 10,542,244 B2
Filed 11/12/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Targetless vehicular camera calibration method | ||
Patent #
US 10,567,748 B2
Filed 05/21/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method for displaying video images for a vehicular vision system | ||
Patent #
US 10,574,885 B2
Filed 08/20/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Video processor module for vehicle | ||
Patent #
US 10,611,306 B2
Filed 08/09/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Driver assistance system for vehicle | ||
Patent #
US 10,623,704 B2
Filed 03/09/2015
|
Current Assignee
Donnelly Corporation
|
Original Assignee
Donnelly Corporation
|
Method for dynamically calibrating vehicular cameras | ||
Patent #
US 10,640,041 B2
Filed 02/04/2019
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vision system for vehicle | ||
Patent #
US 10,640,040 B2
Filed 09/10/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method and system for dynamically ascertaining alignment of vehicular cameras | ||
Patent #
US 10,654,423 B2
Filed 12/04/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Real time traffic sign recognition | ||
Patent #
US 10,657,395 B2
Filed 04/05/2019
|
Current Assignee
Texas Instruments Inc.
|
Original Assignee
Texas Instruments Inc.
|
Vison-based object detection using a polar grid | ||
Patent #
US 10,678,258 B2
Filed 11/15/2018
|
Current Assignee
Waymo LLC
|
Original Assignee
Waymo LLC
|
Vehicular driving assist system using forward-viewing camera | ||
Patent #
US 10,683,008 B2
Filed 07/15/2019
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Dense structure from motion | ||
Patent #
US 10,685,424 B2
Filed 04/19/2018
|
Current Assignee
Mobileye Technologies Limited
|
Original Assignee
MobilEye Vision Technologies Ltd.
|
Vehicle vision system with collision mitigation | ||
Patent #
US 10,692,380 B2
Filed 11/20/2017
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular imaging system with blockage determination and misalignment correction | ||
Patent #
US 10,726,578 B2
Filed 05/14/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicular control system with traffic lane detection | ||
Patent #
US 10,735,695 B2
Filed 10/28/2019
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Stereo assist with rolling shutters | ||
Patent #
US 10,764,517 B2
Filed 01/08/2019
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
MobilEye Vision Technologies Ltd.
|
Method for stitching images captured by multiple vehicular cameras | ||
Patent #
US 10,780,827 B2
Filed 11/25/2019
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method for determining misalignment of a vehicular camera | ||
Patent #
US 10,780,826 B2
Filed 04/22/2019
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Adaptive forward lighting system for vehicle comprising a control that adjusts the headlamp beam in response to processing of image data captured by a camera | ||
Patent #
US 10,787,116 B2
Filed 09/10/2018
|
Current Assignee
Magna Mirrors of America Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Imaging system for vehicle | ||
Patent #
US 10,793,067 B2
Filed 07/25/2012
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Method for determining alignment of vehicular cameras | ||
Patent #
US 10,868,974 B2
Filed 02/19/2018
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Magna Electronics Incorporated
|
Vehicle detecting method, nighttime vehicle detecting method based on dynamic light intensity and system thereof | ||
Patent #
US 10,878,259 B2
Filed 11/23/2018
|
Current Assignee
Automotive Research Testing Center
|
Original Assignee
Automotive Research Testing Center
|
Automatic headlamp control | ||
Patent #
US 7,004,606 B2
Filed 04/23/2003
|
Current Assignee
Donnelly Corporation
|
Original Assignee
Donnelly Corporation
|
System and method for detecting obstacles to vehicle motion and determining time to contact therewith using sequences of images | ||
Patent #
US 7,113,867 B1
Filed 11/26/2000
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
Mobileye Technologies Limited
|
Image processing system to control vehicle headlamps or other vehicle equipment | ||
Patent #
US 7,149,613 B2
Filed 03/15/2005
|
Current Assignee
Gentex Corporation
|
Original Assignee
Gentex Corporation
|
Automotive lane change aid | ||
Patent #
US 6,882,287 B2
Filed 07/31/2002
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Donnelly Corporation
|
System and method for estimating ego-motion of a moving vehicle using successive images recorded along the vehicle's path of motion | ||
Patent #
US 6,704,621 B1
Filed 11/26/2000
|
Current Assignee
MobilEye Vision Technologies Ltd.
|
Original Assignee
Mobileye Technologies Limited
|
Vehicle lamp control | ||
Patent #
US 6,728,393 B2
Filed 12/19/2002
|
Current Assignee
Gentex Corporation
|
Original Assignee
Gentex Corporation
|
Image acquisition and processing methods for automatic vehicular exterior lighting control | ||
Patent #
US 20040143380A1
Filed 08/20/2003
|
Current Assignee
Gentex Corporation
|
Original Assignee
Gentex Corporation
|
Vehicle headlight control using imaging sensor | ||
Patent #
US 6,831,261 B2
Filed 04/30/2003
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Donnelly Corporation
|
System for controlling exterior vehicle lights | ||
Patent #
US 6,587,573 B1
Filed 03/05/2001
|
Current Assignee
Gentex Corporation
|
Original Assignee
Gentex Corporation
|
Continuously variable headlamp control | ||
Patent #
US 6,593,698 B2
Filed 07/18/2002
|
Current Assignee
Gentex Corporation
|
Original Assignee
Gentex Corporation
|
Occupant sensing system | ||
Patent #
US 20030209893A1
Filed 04/14/2003
|
Current Assignee
American Vehicular Sciences LLC
|
Original Assignee
Automotive Technologies International Incorporated
|
Continuously variable headlamp control | ||
Patent #
US 6,429,594 B1
Filed 08/24/2001
|
Current Assignee
Gentex Corporation
|
Original Assignee
Gentex Corporation
|
Continuously variable headlamp control | ||
Patent #
US 20020195949A1
Filed 07/18/2002
|
Current Assignee
Gentex Corporation
|
Original Assignee
Frederick T. Bauer, Joseph S. Stam, Jon H. Bechtel
|
Continuously variable headlamp control | ||
Patent #
US 6,281,632 B1
Filed 04/10/2000
|
Current Assignee
Gentex Corporation
|
Original Assignee
Gentex Corporation
|
Continuously variable headlamp control | ||
Patent #
US 6,049,171 A
Filed 09/18/1998
|
Current Assignee
Gentex Corporation
|
Original Assignee
Gentex Corporation
|
Vehicle headlight control using imaging sensor | ||
Patent #
US 5,796,094 A
Filed 03/25/1996
|
Current Assignee
Magna Electronics Incorporated
|
Original Assignee
Donnelly Corporation
|
27 Claims
1. In a computerized vehicle control system including an image sensor mounted on a moving vehicle, wherein the image sensor captures consecutively in real time a plurality of image frames, a method comprising the steps of:
(a) in at least one of the image frames, detecting a spot of measurable brightness;
(b) matching in at least one subsequent image frame of the image frames, a corresponding spot, wherein said spot and said corresponding spot are images of a light source;
(c) acquiring data respectively from said spot and from said corresponding spot, wherein said data includes at least one of position characteristics, intensity characteristics, size characteristics, and shape characteristics, but not color characteristics, of said light source; and
(d) processing said data, thereby classifying the light source based on said data, and producing a light source classification, wherein substantially all the image frames are available to the computerized vehicle control system and at least one other vehicle control system;
wherein the vehicle control system controls headlights of the moving vehicle based on said light source classification and said at least one other vehicle control system is selected from the group consisting of a lane departure warning system, a collision warning system, and an ego-motion estimation system.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
12. In a computerized system including an image sensor mounted on a moving vehicle, wherein the image sensor captures in real time an image frame, a method comprising the steps of:
(a) detecting in the image frame a plurality of spots of measurable brightness, wherein said spots are respective images of a plurality of light sources;
(b) acquiring data from said spots; and
(c) processing said data, thereby classifying said light sources based on said data;
wherein said classifying is performed by previously training with a plurality of known images, and wherein at least one edge of at least one of said spots is detected and an intensity profile across said at least one edge is used for said classifying.
View Dependent Claims (13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23)
24. In a computerized system including an image sensor mounted on a moving vehicle, wherein the image sensor captures in real time an image frame, a method comprising the steps of:
(a) detecting in the image frame a spot of measurable brightness, wherein said spot is an image of at least one light source, said spot including a cluster of patches of intensity;
(b) acquiring data from said spot; and
(c) processing said data, thereby classifying said at least one light source based on said patches, wherein said classifying is based on a symmetry of said cluster, wherein said at least one light source is classified as a traffic sign based on a lack of said symmetry, and wherein the image sensor has a high dynamic range with a linear response corresponding to more than eight bits or by having a nonlinear response to light intensity and wherein said acquiring data is performed using said high dynamic range.
View Dependent Claims (25, 26, 27)
1 Specification
The present application claims the benefit under 35 USC 119(e) of U.S. provisional application 60/785,351 filed on Mar. 24, 2006, and 60/836,670 filed Aug. 10, 2006, the disclosures of which are incorporated herein by reference.
The present invention relates to vehicle control systems in vehicles. Specifically, the present invention detects and classifies objects in real time, e.g. oncoming vehicle headlights, leading vehicle taillights and streetlights, in a series of images obtained from a camera mounted on a vehicle. The images are used in parallel by a number of vehicle control systems including lane departure detection, forward collision control and headlight control systems. The classification of objects is preferably used by more than one of the vehicle control systems. In a headlight control system, the classification of objects is used to provide a signal for switching the headlights between high beams and low beams.
Headlight high beams are a distraction and create a safety hazard by blinding the driver of an oncoming vehicle or leading vehicle. It is not uncommon for a driver to forget to lower the high beams, thus creating a safety hazard for another driver. It is thus desirable to automatically control the state of a vehicle's headlights. Automatic vehicle headlight control also increases the use of high beams in conditions which allow their use, increasing safety, and reduces the hazard caused when a driver occasionally fails to deactivate high beams that distract other drivers.
Prior art control systems which automatically control the vehicle headlights have included a single light sensor which integrates light in the scene forward of the vehicle. When the integrated light exceeds a threshold, the vehicle headlights are dimmed.
Vehicle headlight control using cameras has also been described. In a system, as disclosed by Schofield et al. (U.S. Pat. No. 6,831,261), a headlight control device capable of identifying characteristics of light sources is based upon an evaluation of light source characteristics in the scene forward of the vehicle. In the vicinity of each light source, each light source is separated from the remainder of the scene and analyzed to determine characteristics of the light source. One characteristic used to identify a light source is the spectral characteristic, which is compared with spectral signatures of known light sources, such as those of automobile headlights and taillights. Another characteristic used in identifying a light source is the spatial layout of the light source. By providing the ability to identify the headlights of oncoming vehicles and the taillights of leading vehicles, the state of the headlights of the controlled vehicle may be adjusted in response to the presence or absence of either of these light sources or the intensity of these light sources. In order to respond to the different characteristics of headlights and taillights, a different exposure period is provided for the array in order to detect and identify each light source. In particular, the exposure period may be longer for detecting leading taillights and significantly shorter for detecting oncoming headlights. A solid-state light imaging array is provided that is made up of sensors arranged in a matrix on at least one semiconductor substrate. The light-imaging array includes a spectral separation device, and each of the sensors responds to light in a particular spectral region. The control circuit responds to the sensors in order to determine if spatially adjacent regions of the field of view forward of the vehicle include light of a particular spectral signature above a particular intensity level. In this manner, the control identifies light sources that are either oncoming headlights or leading taillights by identifying such light sources according to their spectral makeup. Spatial evaluation may be implemented by selecting characteristics of an optical device provided with the imaging sensor, such as providing increased magnification at the center of the forward scene, or providing a wide horizontal view and narrow vertical view.
In the vehicle headlight control system, as disclosed in U.S. Pat. No. 6,831,261, special controls are required for camera settings including exposure time and magnification, for instance multiple exposures each with a different exposure time.
Reference is now made to
Exemplary Prior Art Vehicle Controls are:
Step 17-Collision Warning is disclosed in U.S. Pat. No. 7,113,867 by Stein, which is included herein by reference for all purposes as if entirely set forth herein. Time to collision is determined based on information from multiple images 15 captured in real time using camera 12 mounted in vehicle 18.
Step 19-Lane Departure Warning (LDW) is disclosed in U.S. Pat. No. 6,882,287 by Schofield. If a moving vehicle has inadvertently moved out of its lane of travel, based on image information from images 15 from forward looking camera 12, then system 16 signals the driver accordingly.
Step 21-Ego-motion estimation is disclosed in U.S. Pat. No. 6,704,621 by Stein, which is included herein by reference for all purposes as if entirely set forth herein. Image information is received from images 15 recorded as the vehicle moves along a roadway. The image information is processed to generate an ego-motion estimate of the vehicle, including the translation of the vehicle in the forward direction and the rotation. Vehicle control systems, such as disclosed in U.S. Pat. No. 6,831,261, which rely on changing exposure parameters (i.e. aperture, exposure, magnification, etc.) in order to detect headlights have a difficult time supporting other control systems which rely on the same camera, e.g. Lane Departure Warning, Forward Collision Warning, etc. As a result of changing exposure parameters, half or more of the (possibly critical) frames may not be available to the other control systems. This greatly affects the performance of the other control systems.
Hence, in the vehicle headlight control system as disclosed in U.S. Pat. No. 6,831,261 (or in any other disclosure where special control is required of camera settings including aperture, exposure time and magnification), the same camera cannot be conveniently used for other simultaneously operable vehicle control systems such as LDW 19 or collision warning 17.
Additionally, the use of color cameras with infrared filters, required to achieve good spectral separation, reduces imaging sensitivity by a factor of six or more. A reduction in sensitivity by such a factor has an adverse impact on other vehicle control applications, such as LDW performance in dark scenes. The presence of an infrared filter also negates the use of the camera as a near infrared sensor for applications such as pedestrian detection. Thus, headlight control systems which make strong use of color or spectral analysis in the captured images (such as in U.S. Pat. No. 6,831,261) will tend not to be compatible with other applications without sacrificing performance.
There is thus a need for, and it would be highly advantageous to have, a method of detecting and classifying objects in real time, e.g. oncoming vehicle headlights, leading vehicle taillights and streetlights, in a series of image frames 15 obtained from a camera mounted in a vehicle to provide a signal, specifically with image frames 15 available for use by a number of vehicle control applications.
The term “classification” as used herein refers to classifying the real object, e.g. a headlight or taillights, of which a spot in an image frame is the image. The term “classification” is used to refer interchangeably to the spot or to the object.
Radial basis functions (RBFs) are used for interpolation and approximation in a stream of data. RBFs differ from classical statistical approaches in that approximations are performed on streams of data rather than on complete data sets. RBF networks use supervised, and sometimes unsupervised, learning to minimize approximation error in a stream of data. They are used in function approximation, time series prediction, and control (see http://en.wikipedia.org/wiki/Radial_basis_function).
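To make the role of an RBF network in spot classification concrete, the following is a minimal sketch in Python/NumPy. The two-feature setup, the prototype centers, the weights and the sigma value are illustrative assumptions; only the use of Gaussian radial basis functions over a feature vector of spot statistics follows the text.

```python
import numpy as np

def rbf_classify(feature_vec, centers, weights, sigma=1.0):
    """Score a spot feature vector with a (previously trained) RBF network.

    feature_vec : 1-D array of spot statistics (e.g. brightness, size, motion).
    centers     : (k, d) array of prototype feature vectors learned in training.
    weights     : (k,) array of output weights, one per prototype.
    Returns a scalar score; a sign test or threshold gives the class.
    """
    dists = np.linalg.norm(centers - feature_vec, axis=1)      # distance to each prototype
    activations = np.exp(-(dists ** 2) / (2.0 * sigma ** 2))   # Gaussian radial basis
    return float(np.dot(weights, activations))

# Hypothetical usage: centers/weights would come from supervised training on known images
centers = np.array([[0.9, 0.1], [0.2, 0.8]])   # e.g. "headlight-like" vs. "sign-like" prototypes
weights = np.array([+1.0, -1.0])
spot_features = np.array([0.85, 0.2])
print("headlight score:", rbf_classify(spot_features, centers, weights))
```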
The terms referring to image space such as “downward”, “inward”, “upward”, “upper”, “lower”, “bottom” and “top” refer to a non-inverted image as viewed on a monitor from camera 12.
According to the present invention there is provided a method in a computerized system including an image sensor mounted in a moving vehicle. The image sensor captures image frames consecutively in real time. In one of the image frames, a spot of measurable brightness is detected; the spot is matched in subsequent image frames. The spot and the corresponding spot are images of the same object. The object is typically one or more of headlights from an oncoming vehicle, taillights of a leading vehicle, streetlights, street signs and/or traffic signs. Data is acquired from the spot and from the corresponding spot. By processing the data, the object (or spot) is classified, producing an object classification. Substantially all the image frames are available to the computerized vehicle control system and at least one other vehicle control system. The vehicle control system preferably controls the headlights of the moving vehicle based on the object classification. The other vehicle control system using the image frames is one or more of: a lane departure warning system, a collision warning system and/or an ego-motion estimation system. The object classification is typically provided to and used by one or more of the other vehicle control systems. The data typically relates to a property of the spot and the corresponding spot, such as position in the image frame, shape, brightness, motion, color and spatial alignment. The motion of the spot is tracked (in image space) by comparing respective image frame positions of the spot and the corresponding spot. Spots can be classified using radial basis functions. The spots are classified as headlights, taillights, streetlights, or road signs. Streetlights are preferably classified using spatial alignment. Preferably, classification is performed using texture and/or edge characteristics of the spots. For each of the spots, the classification uses the data from an area centered on each spot including N×N picture elements of the image frame, wherein N is an integer preferably less than twenty.
Typically, the object is not classified as a street sign when the motion is outward and upward. High beams are deactivated upon classifying the object as being a part of a vehicle such as a passing vehicle, a preceding vehicle or an oncoming vehicle. The high beams are reactivated based on the tracked motion. Preferably, the data is related to a shape of the spot, wherein the shape is indicative of the spot splitting into multiple spots in one or more subsequent image frames. A second spot of measurable brightness is preferably detected in the first image frame, and motion is tracked of the spot and the second spot between the first image frame and one or more subsequent image frames. The first and second spots are paired based on comparing the motion of the first spot to the motion of the second spot. The object is classified as a taillight when the motion of the first and second spots is inward. Compensation for the motion of the vehicle, e.g. yaw and/or pitch, is preferably performed prior to classification based on the motion of the first and second spots.
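A minimal sketch of the pairing and inward-motion test described above, assuming each spot is tracked as a list of (x, y) image positions over consecutive frames. The tolerance value and the interpretation of "inward" as a shrinking horizontal gap are illustrative assumptions.

```python
import numpy as np

def motion_vector(track):
    """Average frame-to-frame displacement of a tracked spot.
    track: list of (x, y) image positions over consecutive frames."""
    pts = np.asarray(track, dtype=float)
    return (pts[-1] - pts[0]) / max(len(pts) - 1, 1)

def pair_as_taillights(track_a, track_b, tol=1.5):
    """Pair two spots when they move consistently, and flag the pair as
    taillight-like when the motion is inward (horizontal gap shrinking)."""
    va, vb = motion_vector(track_a), motion_vector(track_b)
    similar = float(np.linalg.norm(va - vb)) < tol        # nearly identical motion -> pair
    gap_start = abs(track_a[0][0] - track_b[0][0])
    gap_end = abs(track_a[-1][0] - track_b[-1][0])
    inward = gap_end < gap_start                          # spots converging in the image
    return similar and inward
```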
According to the present invention there is provided a method for controlling headlights of the vehicle in a computerized system including an image sensor mounted on a moving vehicle. The image sensor captures in real time an image frame. In the image frame spots of measurable brightness are detected, the spots being respective images of objects. Data is acquired from the spots and the data is processed. The objects are classified based on the data, and the classification is performed by previously training with known images. The known images include images from objects such as taillights, oncoming headlights, streetlights, and traffic signs. The image sensor preferably includes a filter. The filter has a spatial profile, e.g. a checkerboard profile, including portions which preferentially transmit red light. The spots of the image are correlated with the spatial profile for classifying the spots. Upon classifying the objects as taillights of a leading vehicle, the distance to the leading vehicle is determined and the headlights of the moving vehicle are preferably adaptively controlled based on the distance. Upon classifying the objects as three streetlights along a road, the curvature of the road is determined and the headlights of the moving vehicle are adaptively controlled based on the curvature. The detection includes determining a quadrant of the spots within the image frame. Only one headlight of the moving vehicle is preferably controlled based on the quadrant. The training and the classification are preferably performed using radial basis functions. The classification preferably uses pairs of the spots to identify vehicle taillights. The classification preferably includes spatial alignment of spots to detect streetlights. Preferably, classification is performed using texture and/or edge characteristics of the spots. For each of the spots, the classification uses the data from an area centered around each spot including N by N picture elements of the image frame, wherein N is an integer less than twenty.
According to the present invention there is provided a method for controlling headlights of the vehicle in a computerized system including an image sensor mounted on a moving vehicle. The image sensor captures in real time an image frame. In the image frame spots of measurable brightness are detected, the spots being respective images of objects. Corresponding spots are matched in subsequent image frames. The corresponding spots are respective images of the objects. Data is acquired from the spots and from the corresponding spots. The data is processed and the objects are classified based on the data. The classification is preferably performed based on previous training with known images. Motion of the spots is tracked by comparing respective image frame positions of the spots and the corresponding spots. High beams are deactivated upon classifying the objects as being a part of a vehicle such as a passing vehicle, a preceding vehicle or an oncoming vehicle, and high beams are reactivated based on the tracked motion.
According to the present invention there is provided a system which performs vehicle headlight control according to the methods as disclosed herein.
The invention is herein described, by way of example only, with reference to the accompanying drawings, wherein:
The present invention is of a system and method for detecting and classifying objects in real time, e.g. oncoming vehicle headlights, leading vehicle taillights and streetlights, in a series of image frames 15 obtained from a camera mounted on an automobile, to provide a signal for switching the headlights between high beams and low beams, specifically with image frames 15 available for use by other vehicle control applications.
The principles and operation of a system and method of detecting and classifying objects in real time, in a series of images obtained from a camera mounted on an automobile to provide a signal for switching the headlights between high beams and low beams, according to the present invention, may be better understood with reference to the drawings and the accompanying description.
Before explaining embodiments of the invention in detail, it is to be understood that the invention is not limited in its application to the details of design and the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
By way of introduction, a principal intention of the present invention is to provide a method of vehicle headlight control which does not place constraints on the image sensor in use nor require control of the exposure parameters. It is clearly an advantage to be able to use the same image sensor that is used for other vehicle control applications such as lane departure warning and forward collision warning. Bundling multiple vehicle control applications (e.g. collision warning (step 17), lane departure warning (step 19) and ego-motion estimation (step 21)) with the same hardware (system 16) reduces cost and, more importantly, saves space. Since the vehicle control systems are typically mounted on the windshield near the rear view mirror, they must be small so as not to block the driver's view of the road. It is not enough to share the imager hardware, e.g. camera 12, between the different vehicle control applications. It is important to share image frames 15 themselves. In other words, images captured with one gain and exposure setting are preferably used for both headlight control and the additional applications (e.g. steps 17, 19, 21). The strength of the methods, according to embodiments of the present invention, is in the image processing algorithms used rather than a particular image sensor 12 configuration. These algorithms first detect bright spots in the image and then track these spots over multiple frames and accumulate statistics of shape, motion, brightness, spatial alignment and possibly color when available. Based on these statistics the spots are classified as coming from oncoming vehicle headlights, preceding vehicle taillights, streetlights or other light sources.
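The detect/track/accumulate/classify loop described in the preceding paragraph can be sketched as follows. The four callables and the label names are placeholders for the steps named in the text, not an actual API of the system.

```python
def run_headlight_control(frames, detect_spots, match, classify, set_beams):
    """Skeleton of the loop described above: detect bright spots, track them
    over image frames, accumulate per-spot statistics, classify each track,
    and switch the beams accordingly."""
    tracks = {}      # track id -> list of per-frame spot measurements
    next_id = 0
    for frame in frames:
        for spot in detect_spots(frame):
            sid = match(spot, tracks)             # id of an existing track, or None
            if sid is None:
                sid, next_id = next_id, next_id + 1
                tracks[sid] = []
            tracks[sid].append(spot)              # accumulate shape/motion/brightness statistics
        labels = {sid: classify(obs) for sid, obs in tracks.items()}
        # Switch to low beams whenever any tracked light belongs to another vehicle
        vehicle_seen = any(lbl in ("headlight", "taillight") for lbl in labels.values())
        set_beams("low" if vehicle_seen else "high")
```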
Implementation of the method and system of the present invention involves performing or completing selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present invention, several selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof. For example, as hardware, selected steps of the invention could be implemented as a chip or a circuit. As software, selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
Referring now to the drawings,
Embodiments of the present invention use a forward looking image sensor 12 such as a CMOS sensor mounted on a vehicle 18. Sensor 12 is connected to an image processing unit 24 which is a general purpose microprocessor, a processor implemented using digital signal processing (DSP) or an application specific integrated circuit (ASIC) or a combination of the different technologies.
High dynamic range images 15 are preferred to avoid saturation of spots, for example, so that a bright spot imaged from a headlight of an oncoming vehicle may be distinguished from a less bright reflection from a traffic sign. Ideally, image sensor 12 has a high dynamic range with a linear response corresponding to more than 8 bits, or has a nonlinear, e.g. logarithmic, response to light intensity. Image sensor 12 is preferably, but not limited to, a standard monochrome or color camera in different embodiments of the present invention. Reference is now made to
In well lit road scenes, headlights are always set (step 23) to low beam. However, at night with a dark road scene, headlight control sets (step 23) high beam or low beam depending on the presence of oncoming and/or preceding vehicles or streetlights. Once a taillight of a leading vehicle is detected, the distance from vehicle 18 may also be determined. Determining distance from multiple image frames under scale change is disclosed in the co-pending application entitled “Estimating distance to an object using a sequence of images recorded by a monocular camera”, with priority from U.S. provisional application 60/755,778. The headlamps may be adaptively controlled to high, medium, low or very low settings, based on technologies which allow for “adaptive headlight control”. The paper “Adaptive Light Control—A New Light Concept Controlled by Vehicle Dynamics and Navigation,” by J. P. Lowenau, J. H. Bernasch, H. G. Rieker, P. J. Th. Venhovens, J. P. Huber, and W. Huhn, SAE Paper No. 980007, pp. 33-38, is included herein by reference for all purposes as if entirely set forth herein.
Similarly, once three or more streetlights are classified according to embodiments of the present invention, techniques of adaptive light control can be used to determine the road curvature and to send a signal to aim the headlights in a particular direction using horizontal adaptive light controls.
Reference is also now made to
Step 301: Dark Scene Detection
A lit scene can be detected by measuring the ambient light. One method useful for detecting the brightness simply counts the number of pixels in image frames 15 that are above a previously defined brightness threshold. If the number of pixels above the threshold is larger than a previously defined threshold number, the scene is classified as bright and the headlight beams are switched (step 23) to low.
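A minimal sketch of this pixel-count test follows, assuming a grayscale frame held as a NumPy array; the threshold values are illustrative assumptions only and are not taken from the specification.

import numpy as np

def is_lit_scene(frame: np.ndarray,
                 brightness_threshold: int = 200,
                 count_threshold: int = 500) -> bool:
    """Classify a scene as lit when enough pixels exceed a brightness threshold.

    frame: 2-D array of pixel intensities (e.g. 8-bit grayscale).
    brightness_threshold, count_threshold: illustrative values to be tuned.
    """
    bright_pixels = int(np.count_nonzero(frame > brightness_threshold))
    return bright_pixels > count_threshold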
Another method for lit scene/dark scene detection is to tessellate one or more image frames 15 into patches and classify the ambient light based on histograms of pixel values in each patch. In one implementation, the subsampled 160×120 size image was tessellated into 5×5 rectangles of 32×24 pixels each. A 16 bin histogram was created for each bin, Thus, 5×5×16=400 values were created for each image. A classifier was then trained on example images of dark and lit scenes. Neural networks and homogeneous kernel classifiers were both used and gave good results. The second method also works well for nearby cars (typically a distance under 20 m) which are lit by headlights of vehicle 18.
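A minimal sketch of the patch-histogram feature extraction described above, assuming a 160×120 grayscale frame as a NumPy array; the classifier itself (neural network or kernel classifier) is not shown and would be trained separately on these 400-value feature vectors.

import numpy as np

def patch_histogram_features(frame: np.ndarray,
                             grid: tuple = (5, 5),
                             bins: int = 16) -> np.ndarray:
    """Tessellate a 160x120 frame into a 5x5 grid of 32x24-pixel patches and
    return the concatenated per-patch histograms (5*5*16 = 400 values)."""
    rows, cols = grid
    h, w = frame.shape                      # expected (120, 160)
    ph, pw = h // rows, w // cols           # 24 x 32 pixels per patch
    features = []
    for r in range(rows):
        for c in range(cols):
            patch = frame[r * ph:(r + 1) * ph, c * pw:(c + 1) * pw]
            hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
            features.append(hist)
    return np.concatenate(features).astype(np.float32)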
Step 303: Detection of Spots in Image Frame
In dark scenes, the only features detectable with measurable brightness in image frames 15 are spots imaged from lights in the road scene. Three types of lights are of special interest: headlights of oncoming vehicles, taillights of preceding vehicles and streetlights. Thus, detection of vehicles in the road scene is based on detection (step 303) of spots in image frames 15 and on verifying that the spots are in fact images of one of the three types of lights of special interest. Other spots in image frames 15, not of special interest, that is not imaged from vehicles or streetlights, include: reflections from railings, lights on buildings in the distance, bus stops, “neon” signs and lit traffic signs on highways. Different features distinguish between the spots of special interest and the other spots. Each feature is used to determine a score and the scores are combined together for classification of the objects imaged by the spots.
Oncoming vehicle headlights are typically very bright. A threshold can be determined such that if a spot has enough pixels above the threshold, the spot is with high confidence an image of a headlight of an oncoming vehicle and the possibility is very low that the bright spot is a reflection from a traffic sign. When oncoming vehicles are more distant or not exactly in line with vehicle 18, the imaged spots of headlights are less bright and are less likely to be detected as being above the threshold.
Step 313: Spot Classifier
A spot imaged from a light source, for example spot 43, which is an image of a streetlight, appears different or has a different texture than spot 45, which is an image of a reflection from a traffic sign. The spot from a light source typically has fuzzy edges and a bright center (almost Gaussian in cross section). Spot 45, the image of a traffic sign at medium distance, typically has sharper edges and a uniform appearance inside. A classifier trained on examples of light sources and traffic signs can produce correct classification rates with confidence above 85%. Spot classification, when combined with other features according to embodiments of the present invention, improves classification to close to 100% correct.
According to an embodiment of the present invention there is provided a method for detecting on which side of vehicle 18 the object is located, e.g. headlights from the right or left. The detector of image sensor 12 is preferably divided into four quadrants. By determining in which quadrant the image spot of the object, e.g. a light of the approaching or leading vehicle, appears, a signal can be generated to control, for instance, an individual headlamp.
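A minimal sketch of the quadrant test, assuming image coordinates with the origin at the top-left of the frame; the frame dimensions and the mapping of a quadrant to an individual headlamp are illustrative assumptions.

def spot_quadrant(x: float, y: float,
                  frame_width: int = 640, frame_height: int = 480) -> str:
    """Return which quadrant of the detector a spot centroid falls in."""
    horizontal = "left" if x < frame_width / 2 else "right"
    vertical = "upper" if y < frame_height / 2 else "lower"
    return f"{vertical}-{horizontal}"

# Example: a spot at (500, 400) lies in the lower-right quadrant, which could
# be used, for instance, to generate a signal for the right headlamp only.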
Texture Based Filtering
Reference is now made to
- A single bright spot, as illustrated in FIG. 5a.
- Two bright spots at equal image height.
- Three bright spots with the middle one higher than the other two.
- Four bright spots arranged as two above two, where the lower two are either due to fog lights or, more typically, reflections of the headlights from the road surface, as illustrated in FIG. 5b.
According to a feature of the present invention, a score is computed regarding structure (or texture) based on the number of clusters and their position. Alternatively, a classifier can be trained on image moments of the cluster using moments up to the 6th moment.
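A minimal sketch of computing raw image moments of a spot cluster from a binary mask, assuming NumPy; the function names are illustrative, and a classifier as described above could be trained on the moments of order up to 6.

import numpy as np

def raw_moment(mask: np.ndarray, p: int, q: int) -> float:
    """Raw image moment M_pq = sum over mask pixels of x^p * y^q."""
    ys, xs = np.nonzero(mask)
    return float(np.sum((xs.astype(np.float64) ** p) * (ys.astype(np.float64) ** q)))

def cluster_moments(mask: np.ndarray, max_order: int = 6) -> dict:
    """Collect all raw moments M_pq with p + q <= max_order as classifier features."""
    return {(p, q): raw_moment(mask, p, q)
            for p in range(max_order + 1)
            for q in range(max_order + 1)
            if p + q <= max_order}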
Edge Based Sign Filter
According to another feature of the present invention, spots are classified according to the edges of the spots. Reference is now made to
Streetlight Structure
Three or more spots that are located above the horizon and lie along a line passing close to the vanishing point of the road indicate a streetlight structure. Two pairs of two lights, one pair on each side of the image, are also classified as streetlights 43, as illustrated in
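A minimal sketch of the streetlight-structure test, assuming spot centroids and the road vanishing point are given in image coordinates; the least-squares line fit and the distance tolerance are illustrative assumptions, not taken from the specification.

import numpy as np

def is_streetlight_line(spots: list, vanishing_point: tuple,
                        tolerance_px: float = 10.0) -> bool:
    """Check whether three or more spots lie along a line that passes close
    to the road's vanishing point (spots given as (x, y) centroids)."""
    if len(spots) < 3:
        return False
    pts = np.asarray(spots, dtype=np.float64)
    # Fit a line y = a*x + b through the spot centroids by least squares.
    a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
    # The spots themselves should be nearly collinear.
    if np.max(np.abs(a * pts[:, 0] + b - pts[:, 1])) > tolerance_px:
        return False
    # The fitted line should pass close to the vanishing point.
    vx, vy = vanishing_point
    distance = abs(a * vx - vy + b) / np.hypot(a, 1.0)
    return distance < tolerance_px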
Taillight RBF Classifier
According to a feature of the present invention, taillight detection is based on spot pairs. If two spots of light are detected (step 303) and, after data acquisition step 307a, are found to have similar size, moderate brightness and similar vertical position, the spots are assumed initially to be vehicle taillights. A classifier is trained using a patch, e.g. 9 by 9 pixels, of raw image data centered around the center of the spot. A radial basis function (RBF) classifier is preferable, and a confidence level of approximately 92% is typically found. Other classifiers are expected to operate similarly.
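A minimal sketch of training such a patch classifier, here using scikit-learn's SVC with an RBF kernel as a stand-in for the RBF classifier named above; the patch extraction, the labeled training data and the use of scikit-learn are all illustrative assumptions.

import numpy as np
from sklearn.svm import SVC

def extract_patch(frame: np.ndarray, cx: int, cy: int, size: int = 9) -> np.ndarray:
    """Extract a size x size raw-intensity patch centered on a spot.
    Assumes the spot center is far enough from the image border."""
    half = size // 2
    patch = frame[cy - half:cy + half + 1, cx - half:cx + half + 1]
    return patch.astype(np.float32).ravel()        # 81 features for a 9x9 patch

def train_taillight_classifier(patches: np.ndarray, labels: np.ndarray) -> SVC:
    """Train an RBF-kernel classifier on labeled 9x9 patches
    (label 1 = taillight, 0 = other light source)."""
    clf = SVC(kernel="rbf", probability=True)
    clf.fit(patches, labels)
    return clf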
Step 311: Multi-Frame Processing
According to different features of the present invention, several multi-frame processing techniques are available. Data is acquired (steps 307a, 307b) and statistics of the data may be calculated. For instance, scores are averaged over several image frames 15. In addition, spot motion is computed. For instance, spots that move up and outward are images of traffic signs 45 and not of vehicle lights, e.g. headlights 41.
According to the aforementioned feature of the present invention, taillight detection is based on spot pairs. If two spots of light are detected (step 303) and, after data acquisition (step 307), are found to have similar size, moderate brightness and similar vertical position, the spots are assumed initially to be vehicle taillights. If the spots are far apart, then to continue with the classification as taillights the spots are required to be brighter, since they must come from a close, e.g. 10-20 meter, preceding vehicle. After tracking (step 311), specific constraints on the spot motions are required to continue with the classification as taillights. If fast lateral motion of one spot in the pair is detected (step 311), the taillight classification is removed. The distance between the spots under scale change with distance from vehicle 18 is limited to be between specific values. Distance between spots under scale change with distance is disclosed in the co-pending application entitled: “Estimating distance to an object using a sequence of images recorded by a monocular camera”, with priority from U.S. provisional application 60/755,778.
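A minimal sketch of the initial pair gating described above, assuming each spot is summarized by its centroid, size (pixel count) and mean brightness; all numeric tolerances are illustrative assumptions.

def plausible_taillight_pair(spot1: dict, spot2: dict,
                             size_ratio_max: float = 1.5,
                             vertical_tolerance_px: float = 5.0,
                             brightness_range: tuple = (50.0, 200.0)) -> bool:
    """Gate a pair of spots as candidate taillights: similar size,
    moderate brightness and similar vertical position."""
    s1, s2 = spot1["size"], spot2["size"]
    similar_size = max(s1, s2) / max(min(s1, s2), 1) <= size_ratio_max
    similar_height = abs(spot1["y"] - spot2["y"]) <= vertical_tolerance_px
    lo, hi = brightness_range
    moderate = all(lo <= s["brightness"] <= hi for s in (spot1, spot2))
    return similar_size and similar_height and moderate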
Taillight Passing Score
According to another feature of the present invention, spots at the sides of image 15 that have inward motion are typically from vehicles traveling faster and passing host vehicle 18. Upon detecting the inward motion, these spots are immediately tagged as taillights. The passing score is an accurate way to tag passing cars and motorcycles and quickly lower (step 23) host vehicle 18 headlights. Tagging based on taillight passing is optionally made more robust upon compensation for lateral motion or yaw of host vehicle 18 by processing of images 15 or with an external inertial sensor.
Light Control Policy
The basic light control policy is that headlights are lowered when any oncoming headlights or preceding taillights are detected, or when more than N streetlights are detected. The number of required streetlights can be defined for each country or car manufacturer. When all the above conditions cease to exist, the high beams are switched on after a grace period. The exact length of the grace period can depend on various parameters. For example, if oncoming headlights are tracked (step 311) to the edge of image frame 15, high beams can be switched on (step 23) immediately as they disappear. However, if oncoming headlights disappear while still in the center of image frame 15, a grace period of a few seconds is allowed to account for a momentary obstruction and to avoid unwanted flickering of the lights.
Algorithm Flow
One possible implementation of the present invention uses a data structure which is a list of spots. Each spot includes the following data (a sketch of the spot and pair data structures as code is given after the pair listing below):
- (integer) ID.
- (image coordinate) Last XY.
- (image coordinate array) motion history.
- (integer array) shape score history.
- (integer array) brightness score history.
- (integer) Type: oncoming, taillight, streetlight, untagged.
- (integer) Age.
- (integer) Approval delay.
A spot typically starts off as type: untagged. During the “lifetime” of the spot, the spot may be tagged as oncoming, taillight or streetlight. Once a spot is tagged as oncoming or taillight, the spot keeps the tag until the spot is “killed” or re-evaluated and tagged with a different type. For instance, a spot tagged as a streetlight might be reevaluated and tagged as oncoming.
Since taillights of cars often appear in the image as two spots, it is also useful to have a data structure to hold possible pairings. Each pair includes the following information:
- (pointer to spot) spot1.
- (pointer to spot) spot2.
- (int array) pair scores history.
- (image coordinate array) pair motion history.
- (int) Age.
- (int) pair Not Found Grace.
- (int) one Light Not Found Grace.
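A minimal sketch of the two data structures listed above as Python dataclasses; the field names and types follow the listings, while the container types used for the history fields are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Coordinate = Tuple[int, int]            # (x, y) image coordinate

@dataclass
class Spot:
    id: int
    last_xy: Coordinate
    motion_history: List[Coordinate] = field(default_factory=list)
    shape_score_history: List[int] = field(default_factory=list)
    brightness_score_history: List[int] = field(default_factory=list)
    type: str = "untagged"              # oncoming, taillight, streetlight, untagged
    age: int = 0
    approval_delay: int = 0

@dataclass
class SpotPair:
    spot1: Optional[Spot]
    spot2: Optional[Spot]
    pair_score_history: List[int] = field(default_factory=list)
    pair_motion_history: List[Coordinate] = field(default_factory=list)
    age: int = 0
    pair_not_found_grace: int = 0
    one_light_not_found_grace: int = 0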
According to embodiments of the present invention, the following algorithm uses the above data structures to analyze an image and detect oncoming headlights, preceding taillights and streetlights.
Detect Spots in New Image:
- (a) Find local maxima in the image with value above a threshold,
- (b) Extend the spot laterally in image space until the region drops to 90% of the local maximum, and determine the size and shape of the spot,
- (c) Merge spots that overlap,
- (d) Compute zeroth, first and second moments of each spot (i.e. size, position, and orientation and eccentricity of the approximating ellipse).
- (e) Compute average brightness and variance of each spot.
- (f) Match old spots to new image spots:
- i. Predict location of old spots based on previous motion.
- ii. Find closest spot (given predicted motion) with most similar size and brightness.
The score S for matching old spots to new spots may be determined using:
S = α(x_t − x_{t−1})² + β(y_t − y_{t−1})² + γ(b_t − b_{t−1})² + δ(s_t − s_{t−1})²   (1)
where (x_t, y_t) and (x_{t−1}, y_{t−1}) are the spot centroid position at time t and the position predicted from time t−1, respectively; b and s represent the brightness and size values, respectively; and S is the resulting score.
- iii. Both position and size must be within given respective thresholds.
- iv. If the shape of the spot in the previous image is indicative of two headlights close together, a split of the one spot into two spots is allowed. For example, the shape of the spot might be elongated beyond a certain threshold or the contour might be non-convex. This often occurs with oncoming vehicles. If two new spots both give high scores and neither is matched to another old spot, then the old spot is matched to both new spots.
- v. A spot that is not matched (step 305) is kept alive for N frames 15.
- (g) Match (step 305) old spot pairs to two new image spots:
- i. For each pair search for best possible two spots.
- ii. If no pair is found, find the best single spot match within the threshold and assume the other spot has not moved relative to it but is obscured. Keep alive for a number of frames.
- iii. If neither spot is matched (step 305), keep alive for N frames with same motion.
- (h) For matched spots add new spot data to old data structure. For unmatched new spots create new data structures.
- (i) Unmatched old spots that have not been detected for the past N frames are killed (i.e. removed from the list).
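A minimal sketch of the matching score of equation (1), assuming each spot provides a predicted centroid together with brightness and size values; the weight values are illustrative assumptions.

def match_score(old_spot: dict, new_spot: dict,
                alpha: float = 1.0, beta: float = 1.0,
                gamma: float = 0.1, delta: float = 0.1) -> float:
    """Score S of equation (1): weighted squared differences in predicted
    position, brightness and size between an old spot and a candidate new
    spot. Lower values indicate a better match."""
    dx = new_spot["x"] - old_spot["predicted_x"]
    dy = new_spot["y"] - old_spot["predicted_y"]
    db = new_spot["brightness"] - old_spot["brightness"]
    ds = new_spot["size"] - old_spot["size"]
    return alpha * dx ** 2 + beta * dy ** 2 + gamma * db ** 2 + delta * ds ** 2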
Classification (step 313) as Oncoming:
(a) Test untagged spots below a certain row in the image using oncoming classifier path:
- i. If a spot has N pixels above threshold T then tag the spot as oncoming.
- ii. Compute spot classifier score.
- iii. Compute structure symmetry score.
- iv. Compute vertical edge score.
- v. Compute edge score; if a large edge ratio is detected, increase the ‘approval delay’ value for the spot by N.
- vi. Test for approval as oncoming:
- A. If spot age is greater than 10+ approval delay then if all scores from last 10 frames are greater than T1 then tag spot as oncoming.
- B. If spot age is greater than 20+ approval delay then if all scores from last 20 frames are greater than T2 then tag spot as oncoming.
- C. If spot age is greater than 40+ approval delay then if all scores from last 40 frames are greater than T3 then tag spot as oncoming.
- D. If spot age is greater than 80+ approval delay then if all scores from last 80 frames are greater than T4 then tag spot as oncoming.
where T1>T2>T3>T4.
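A minimal sketch of the age-dependent approval test above, assuming a per-frame history of combined scores for the spot; the schedule values are illustrative thresholds satisfying T1 > T2 > T3 > T4.

def approve_oncoming(scores: list, age: int, approval_delay: int,
                     schedule=((10, 40), (20, 30), (40, 20), (80, 10))) -> bool:
    """Tag a spot as oncoming if, for some window length W in the schedule,
    the spot age exceeds W + approval_delay and all scores from the last
    W frames exceed the corresponding threshold (thresholds decrease as W grows)."""
    for window, threshold in schedule:
        if age > window + approval_delay and len(scores) >= window:
            if all(s > threshold for s in scores[-window:]):
                return True
    return False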
Classification (Step 313) as Streetlights or Taillights
- (b) Test untagged spots above a certain row in the image for approval as streetlights:
- i. Compute spot classifier score
- ii. Discard spots whose score is below a threshold.
- iii. If three or more spots form a line passing near the vanishing point of the road tag all these spots as streetlights.
- iv. If two pairs of such spots form two lines intersecting near the vanishing point of the road then tag four spots all as streetlights.
- (c) Classify untagged spots and pairs with the taillight classifier:
- i. Test for inward motion (passing score). If passing then tag as taillight.
- ii. Compute taillight RBF score: if last 10 scores are above a threshold T1 then tag as taillights.
- iii. If spot is part of a pair and pair age is greater than N and both spots are visible this frame and both spots have taillight RBF score above a threshold T2 then tag as taillights.
Light Control Policy
- (a) If oncoming spots exist and they are all leaving the scene, set a grace counter to a small value (nearly zero). If oncoming spots exist and at least one is not leaving the scene, set the grace counter to a larger value. If no oncoming spots exist, decrement the grace counter.
- (b) If taillight spots exist, set a grace counter to a given value. If no taillight spots exist, decrement the taillights grace counter.
- (c) If streetlight spots exist, set a grace counter to a given value. If no streetlight spots exist, decrement the streetlights grace counter.
- (d) If any grace counter is greater than zero do not switch on high beams.
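A minimal sketch of the grace-counter policy above; the counter values and the per-frame update structure are illustrative assumptions rather than values from the specification.

class LightControlPolicy:
    """Decide each frame whether high beams may be switched on, using one
    grace counter per light-source class (oncoming, taillight, streetlight)."""

    def __init__(self, oncoming_grace: int = 60, short_grace: int = 5,
                 taillight_grace: int = 60, streetlight_grace: int = 60):
        self.oncoming_grace = oncoming_grace
        self.short_grace = short_grace
        self.taillight_grace = taillight_grace
        self.streetlight_grace = streetlight_grace
        self.counters = {"oncoming": 0, "taillight": 0, "streetlight": 0}

    def update(self, oncoming: bool, oncoming_all_leaving: bool,
               taillights: bool, streetlights: bool) -> bool:
        """Update the counters from this frame's detections and return True
        if high beams are allowed (all grace counters have expired)."""
        if oncoming:
            self.counters["oncoming"] = (self.short_grace if oncoming_all_leaving
                                         else self.oncoming_grace)
        else:
            self.counters["oncoming"] = max(0, self.counters["oncoming"] - 1)

        for key, present, grace in (("taillight", taillights, self.taillight_grace),
                                    ("streetlight", streetlights, self.streetlight_grace)):
            if present:
                self.counters[key] = grace
            else:
                self.counters[key] = max(0, self.counters[key] - 1)

        return all(c == 0 for c in self.counters.values())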
Reference is now made to
Therefore, the foregoing is considered as illustrative only of the principles of the invention. Further, since numerous modifications and changes will readily occur to those skilled in the art, it is not desired to limit the invention to the exact design and operation shown and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention.
While the invention has been described with respect to a limited number of embodiments, it will be appreciated that many variations, modifications and other applications of the invention may be made.