Object motion detection system based on combining 3D warping techniques and a proper object motion detection
Abstract
The invention proposes a method for detecting dynamic objects in the visually sensed scene of a driver assistance system of a vehicle, comprising the steps of:
- feeding signals from sensors of the vehicle (internal sensors, 3D sensors, cameras) to the driver assistance system,
- generating a surrounding model using 3D world positions based on the sensor signals, and
- combining 3D-warping-based and optical-flow-based approaches, the novel key points gained being:
- a) restriction of the search space of the optical flow method based on the computation results of the 3D warping method,
- b) in addition to a), an optimal parameterization of the optical flow approach in terms of search direction, amplitude, etc.,
- c) refinement and verification of the detection results of one method based on the computation results of the other (when both approaches run in parallel),
- storing the detected dynamic objects and their measured motion parameters for use in, e.g., a collision avoidance or path planning system.
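Key point a) above — restricting the optical-flow search space to regions flagged by the 3D warping stage — can be sketched in numpy as follows. This is a minimal illustrative sketch, not the patent's implementation: simple block matching stands in for the (unspecified) optical flow method, and the function name, patch size, and search `radius` (the "amplitude" parameterization of key point b)) are assumptions.

```python
import numpy as np

def flow_in_candidate_regions(img_prev, img_curr, candidate_mask, radius=3):
    """Block-matching optical flow evaluated only at pixels the 3D-warping
    stage flagged as motion candidates, restricting the flow search space."""
    H, W = img_prev.shape
    flow = np.zeros((H, W, 2))
    ys, xs = np.nonzero(candidate_mask)
    for y, x in zip(ys, xs):
        if y < 1 or x < 1 or y >= H - 1 or x >= W - 1:
            continue
        patch = img_prev[y - 1:y + 2, x - 1:x + 2]
        best, best_dxy = np.inf, (0, 0)
        # Search only a small neighbourhood (parameterized search amplitude).
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                yy, xx = y + dy, x + dx
                if yy < 1 or xx < 1 or yy >= H - 1 or xx >= W - 1:
                    continue
                cand = img_curr[yy - 1:yy + 2, xx - 1:xx + 2]
                cost = np.abs(patch - cand).sum()
                if cost < best:
                    best, best_dxy = cost, (dx, dy)
        flow[y, x] = best_dxy
    return flow
```

Because flow is computed only inside the candidate mask, the quadratic per-pixel search cost is paid only where the 3D warping residuals suggest independent object motion.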
19 Claims
1. A method for detecting dynamic objects in the visually sensed scene of a driver assistance system of a vehicle or robot with ego-motion, comprising the steps of:
- providing visual signals of an environment of a vision sensor attached to a vehicle or robot with ego-motion, wherein the vision sensor is a dense depth sensor providing 3D data and wherein an additional sensor provides information on ego-motion,
- detecting a proper motion of objects on the input field of the vision sensor based on a detected optical flow,
- detecting the motion of objects based on a 3D representation model of the environment and using a 3D warping on the basis of predicted and measured 3D data, the predicted 3D data being generated based on measured 3D data and data representing the ego-motion,
- combining the 3D-warping-based and the optical-flow-based object motion detection by using both detecting steps in parallel, wherein the results of one detecting step are used to verify and refine the results of the other detecting step, wherein the 3D warping defines regions where the optical-flow-based object motion detection is applied and the optical-flow-based object motion detection defines regions where the 3D warping is applied, and
- storing information on the detected dynamic objects and their measured motion parameters.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 19.
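The 3D warping step of claim 1 — predicting 3D data from measured 3D data and ego-motion, then comparing prediction against measurement — can be sketched as follows: back-project the previous depth map to 3D, apply the measured ego-motion, re-project into the current frame, and flag pixels whose measured depth deviates from the prediction. This is a minimal numpy sketch assuming a pinhole camera model; the function name, residual threshold, and nearest-neighbour projection are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def warp_3d_residuals(depth_prev, depth_curr, K, R, t, thresh=0.5):
    """Flag pixels whose measured depth deviates from the ego-motion-
    predicted depth (candidate dynamic objects).

    depth_prev, depth_curr : HxW depth maps (metres)
    K : 3x3 camera intrinsics; R, t : ego-motion rotation and translation
    """
    H, W = depth_prev.shape
    # Back-project previous-frame pixels to 3D camera coordinates.
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3xN
    pts = np.linalg.inv(K) @ pix * depth_prev.reshape(1, -1)           # 3xN

    # Predict where a static world point would appear after ego-motion.
    pts_pred = R @ pts + t[:, None]
    proj = K @ pts_pred
    z = proj[2]
    valid = z > 1e-6
    up = np.round(proj[0] / np.maximum(z, 1e-6)).astype(int)
    vp = np.round(proj[1] / np.maximum(z, 1e-6)).astype(int)
    valid &= (up >= 0) & (up < W) & (vp >= 0) & (vp < H)

    # Residual between predicted and measured depth marks motion candidates.
    mask = np.zeros((H, W), dtype=bool)
    idx = np.where(valid)[0]
    resid = np.abs(depth_curr[vp[idx], up[idx]] - z[idx])
    mask[vp[idx], up[idx]] = resid > thresh
    return mask
```

In the combination scheme of claim 1, such a residual mask would supply the regions in which the optical-flow-based detection is applied, and vice versa.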
17. A driver assistance system with a driving path and/or surrounding model generation apparatus, the model generation apparatus comprising:
- providing means for providing visual signals of the environment of a vision sensor with ego-motion, wherein the vision sensor is a dense depth sensor providing 3D data and wherein an additional sensor provides information on ego-motion,
- detecting means for detecting the proper motion of objects on the input field of the vision sensor based on the detected optical flow,
- detecting means for detecting the motion of objects based on a 3D representation model of the environment and using a 3D warping on the basis of predicted and measured 3D data, the predicted 3D data being generated based on measured 3D data and data representing the ego-motion,
- combining means for combining the 3D-warping-based and the optical-flow-based object motion detection by using both detecting means in parallel, wherein the results of one detecting means are used to verify and refine the results of the other detecting means, wherein the 3D warping defines regions where the optical-flow-based object motion detection is applied and the optical-flow-based object motion detection defines regions where the 3D warping is applied, and
- storing means for storing information on the detected dynamic objects and their measured motion parameters.
Dependent claim: 18.
Specification