Method and system for recapturing a trajectory of an object
Abstract
The present invention is in the field of sensor fusion, and discloses in one embodiment a method for extracting the trajectories of moving objects from an assembly of low-resolution sensors whose spatial relationships are initially unknown, except that their fields of view are known to overlap so as to form a continuous coverage region, which may be much larger than the field of view of any individual sensor. Segments of object trajectories may be extracted from the data of each sensor and then stitched together to reconstruct the trajectories of the objects. The stitching process also allows determination of the spatial relationships between the sensors, so that, from initially knowing little or nothing about the sensor arrangement or the paths of the objects, both may be reconstructed unambiguously.
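The abstract's stitching idea can be illustrated with a minimal hypothetical sketch. The 1-D example below (all names and numbers are illustrative, not drawn from the specification) shows two sensors with overlapping fields of view, self-configuration by computing the offset between time-matched sub-trajectory samples in the overlap, and stitching into a single estimated trajectory:

```python
import numpy as np

# Hypothetical illustration: two 1-D sensors observe the same moving object
# in their own local coordinates. The offset between time-matched samples in
# the overlap region reveals the sensors' spatial relationship (a translation).

t = np.arange(0.0, 10.0, 0.5)          # shared time stamps
world = 2.0 * t                         # true object position (unknown to the system)

offset_true = 12.0                      # sensor 2's origin in sensor 1 coordinates
mask1 = world <= 14.0                   # sensor 1's field of view covers 0..14
t1, sub1 = t[mask1], world[mask1]
mask2 = world >= 10.0                   # sensor 2's field of view starts at 10
t2 = t[mask2]
sub2 = world[mask2] - offset_true       # sensor 2 reports local coordinates

# Self-configuration: estimate the offset from time-matched overlap samples.
common = np.intersect1d(t1, t2)
est_offset = np.mean(sub1[np.isin(t1, common)] - sub2[np.isin(t2, common)])

# Stitch: map sensor 2's sub-trajectory into sensor 1's frame and merge.
stitched_t = np.concatenate([t1, t2[~np.isin(t2, common)]])
stitched_x = np.concatenate([sub1, sub2[~np.isin(t2, common)] + est_offset])
order = np.argsort(stitched_t)
stitched_t, stitched_x = stitched_t[order], stitched_x[order]
```

The recovered offset doubles as the sensors' spatial relationship, and the stitched trajectory spans a region larger than either field of view alone.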
22 Claims
1. A method for recapturing a trajectory of an object, comprising:
employing first and second sensors, respectively capable of gathering information about an object's trajectory within first and second at least partially overlapping fields of view, thereby obtaining a sub-trajectory in each field of view;
self-configuring said first and second sensors by computing a spatial relationship from the offsets between sub-trajectories; and
stitching the sub-trajectories together in sequence to form a single estimated trajectory.
(Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10)
3. A method according to claim 1, further comprising:
time stamping each sub-trajectory.
4. A method according to claim 3, further comprising:
using the time stamp for identifying sub-trajectories corresponding to the same object.
5. A method according to claim 3, further comprising:
using the time stamps for rejecting sub-trajectories which, if stitched together, would give rise to a discontinuous estimated trajectory.
6. A method according to claim 1, wherein said stitching comprises choosing among sub-trajectories candidates for stitching, based on the sub-trajectories' spatial proximity in the overlap region.
7. A method according to claim 1, wherein each sub-trajectory is time stamped, and wherein said stitching comprises choosing among sub-trajectories candidates for stitching, based on the sub-trajectories' spatial proximity in the overlap region.
8. A method according to claim 7, further comprising:
using the time stamps for rejecting stitching candidates which, if stitched together, would give rise to a discontinuous estimated trajectory.
9. A method according to claim 1, further comprising:
employing a third sensor capable of gathering information about an object's trajectory within a third field of view overlapping both the first and second fields of view, in which the mutual consistency of two estimated spatial relationships between sensors is improved by adjusting the estimates to reduce the error residuals from pairwise comparison of a trajectory in the overlap regions.
10. The method according to claim 1, wherein said data used for self-configuration of said first and second sensors is used to form said single estimated trajectory.
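Claims 5 and 8 recite using time stamps to reject stitching candidates that would yield a discontinuous estimated trajectory. One hypothetical form of such a test is sketched below; the `max_speed` bound, the sample format, and the function name are illustrative assumptions, not taken from the specification:

```python
# Hypothetical sketch: reject a stitching candidate whose time-stamped samples
# would imply a physically implausible jump between the sub-trajectory ends.

def is_continuous(sub_a, sub_b, max_speed=5.0):
    """sub_a, sub_b: lists of (t, x) samples; assumes sub_a ends before sub_b
    begins. Accept the pair only if bridging the gap needs at most max_speed."""
    t_end, x_end = sub_a[-1]
    t_start, x_start = sub_b[0]
    if t_start <= t_end:
        return False                      # candidate runs backwards in time
    speed_needed = abs(x_start - x_end) / (t_start - t_end)
    return speed_needed <= max_speed

a = [(0.0, 0.0), (1.0, 2.0)]
b_good = [(1.5, 3.0), (2.0, 4.0)]         # continues smoothly
b_bad = [(1.5, 40.0), (2.0, 41.0)]        # would require an impossible jump
```

A candidate pair failing this test would be rejected, in the spirit of the discontinuity rejection recited above.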
11. A method for recapturing a trajectory of an object, the method comprising the steps of:
1) employing first and second sensors, respectively capable of gathering information about an object's trajectory in first and second fields of view, thereby obtaining a sub-trajectory in each field of view, and so disposed that a known characteristic of the object's motion places bounds on an error committed in extrapolating its sub-trajectories between the edges of the fields of view; and
2) stitching the sub-trajectories into a single estimated trajectory.
(Dependent claim: 12)
13. A system suitable for recapturing a trajectory of an object, the system comprising:
first and second sensors, respectively capable of gathering information about an object's trajectory within first and second at least partially overlapping fields of view, for obtaining a sub-trajectory in each field of view; and
means for stitching the sub-trajectories together in sequence to form a single estimated trajectory, wherein said first and second sensors are self-configured by computing a spatial relationship from the offsets between sub-trajectories.
(Dependent claim: 14)
15. A method for determining a spatial relationship between the coordinate systems corresponding to two multi-pixel sensors, whose fields of view are known to overlap at least partially, the method comprising:
gathering sub-trajectory data for an object moving in the fields of view;
selecting sub-trajectories which in sequence form a trajectory of the object; and
computing the spatial relationship from the offsets between the selected sub-trajectories.
(Dependent claims: 16, 17, 18, 19, 20)
20. A method according to claim 15, further comprising:
employing a third multi-pixel infrared sensor, in which the mutual consistency of two spatial relationships between sensors is improved by the imposition of a priori geometrical constraints.
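Claims 9 and 20 recite improving the mutual consistency of spatial-relationship estimates among three sensors. For simple 1-D translational offsets, one hypothetical illustration (not the claimed method itself) is loop closure: the three pairwise offsets among sensors A, B, C should satisfy o_AC = o_AB + o_BC, and any residual can be distributed in a least-squares sense. The function name and equal weighting below are assumptions:

```python
# Hypothetical sketch: reconcile three pairwise 1-D offset estimates among
# sensors A, B, C. Consistency requires o_AC == o_AB + o_BC; the least-squares
# correction under that constraint gives each estimate one third of the residual.

def reconcile(o_ab, o_bc, o_ac):
    r = o_ac - (o_ab + o_bc)              # mutual-consistency (loop-closure) residual
    return o_ab + r / 3.0, o_bc + r / 3.0, o_ac - r / 3.0

# Noisy pairwise estimates that do not quite close the loop (15.0 != 10.2 + 5.1).
o_ab, o_bc, o_ac = reconcile(10.2, 5.1, 15.0)
```

After reconciliation the adjusted offsets close the loop exactly, which is one concrete sense in which "mutual consistency ... is improved."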
21. A method for determining a spatial relationship between the coordinate systems corresponding to two multi-pixel sensors, the method comprising the steps of:
1) gathering sub-trajectory data for an object moving in the fields of view of the sensors, where a known characteristic of the object's motion places bounds on an error committed in extrapolating its sub-trajectories between the edges of the fields of view;
2) selecting from the sub-trajectories those which correspond to the object;
3) extrapolating the sub-trajectories until they overlap in time; and
4) computing the spatial relationship from the offsets between the selected sub-trajectories.
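The extrapolation steps of claim 21 can be sketched as follows, assuming, purely for illustration, constant-velocity motion as the "known characteristic" that bounds the extrapolation error; the sensor geometry and all values are hypothetical:

```python
import numpy as np

# Hypothetical sketch of claim-21-style self-configuration: each sub-trajectory
# is fit with a constant-velocity model and extrapolated to a common time; the
# positional offset at that time gives the sensors' spatial relationship.

def linear_fit(t, x):
    """Least-squares line x(t) = v*t + x0 for one sub-trajectory."""
    v, x0 = np.polyfit(t, x, 1)
    return v, x0

# Sensor A sees the object (world position x = 3t) for t in [0, 2]; sensor B,
# whose origin sits at +20 in A's frame, sees it only for t in [4, 6].
t_a = np.linspace(0.0, 2.0, 5); x_a = 3.0 * t_a
t_b = np.linspace(4.0, 6.0, 5); x_b = 3.0 * t_b - 20.0

v_a, c_a = linear_fit(t_a, x_a)
v_b, c_b = linear_fit(t_b, x_b)

# Extrapolate both fits to a common time between the observation windows and
# read off the offset between the two local coordinate systems.
t_mid = 3.0
offset = (v_a * t_mid + c_a) - (v_b * t_mid + c_b)
```

Because the motion model is exact here, the recovered offset equals the true origin separation; with real data, the claim's error bound limits how far such extrapolation can be trusted.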
22. A system suitable for determining a spatial relationship between the coordinate systems corresponding to two multi-pixel sensors whose fields of view are known to overlap at least partially, the system comprising:
means for gathering sub-trajectory data for an object moving in the fields of view;
means for selecting sub-trajectories which in sequence form a trajectory of the object; and
means for computing the spatial relationship from the offsets between the selected sub-trajectories.
Specification