Methods for enhancing performance and data acquired from three-dimensional image systems
Abstract
A three-dimensional distance time-of-flight system is disclosed in which distance values are acquired by a plurality of sensors independently of each other. For use with this and similar systems, Z-distance accuracy and resolution are enhanced using various techniques, including over-sampling acquired sensor data and forming running averages, or forming moving averages. Acquired data may be rejected if it fails to meet criteria associated with distance, luminosity, velocity, or estimated shape information reported by neighboring sensors. A sub-target having at least one pre-calibrated reflectance zone is used to improve system measurement accuracy. Elliptical error is corrected for using a disclosed method, and reversible mapping of Z-values into RGB is provided.
20 Claims
1. For use with a system that acquires at least one of (x,y,z) distance and brightness measurements from a target source using independent sensors, a method of improving distance measurements comprising the following steps:
(a) causing said sensors to acquire measurement data at a repetition rate higher than required for operation of said system;
(b) combining output signals from said sensors to obtain statistical average output signals;
wherein random noise in said system is reduced proportionally to the square root of the number of said output signals averaged, and accuracy of said distance measurements is enhanced.
(Dependent claims 2-7 not shown.)
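The oversample-and-average technique of claim 1 can be illustrated numerically. The sketch below is not from the patent specification; the function name, Gaussian noise model, and parameter values are assumptions, but it exhibits the claimed square-root-of-N noise reduction:

```python
import numpy as np

rng = np.random.default_rng(0)

def averaged_distance(true_z, sigma, n_samples):
    """Oversample a noisy Z reading and return the statistical average.

    Each acquisition is modeled as the true distance plus independent
    zero-mean Gaussian noise of standard deviation sigma.
    """
    samples = true_z + rng.normal(0.0, sigma, n_samples)
    return float(samples.mean())

# Residual noise in the averaged output falls as 1/sqrt(N):
single = np.std([averaged_distance(2.0, 0.05, 1) for _ in range(2000)])
avg100 = np.std([averaged_distance(2.0, 0.05, 100) for _ in range(2000)])
# single / avg100 comes out near sqrt(100) = 10
```

Averaging 100 oversampled readings shrinks the standard deviation of the reported distance by roughly a factor of 10, which is the sqrt(N) behavior recited in the wherein clause.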
8. For use with a system that acquires at least one of (x,y,z) distance and brightness measurements using energy transmitted from an emitter at a first location on a plane, said energy reflecting at least in part from a target source and being detected by independent sensors defining a sensor array on said plane but spaced apart from said first location, a method of improving distance measurements comprising the following steps:
(a) defining a spherical coordinate for each sensor in said array, and constructing a look-up table containing spherical coordinates for each said sensor;
(b) defining a spatial coordinate of said emitter;
(c) for each sensor <i,j>, calculating constants kij and hij as follows:
kij = Cx^2 + Cy^2 + Cz^2, and
hij = 2(pij + Cx cos(aij)sin(bij) + Cy sin(aij)sin(bij) + Cz cos(bij)),
wherein sensor p has spherical coordinate (pij, -aij, -bij) and Cartesian coordinate (px, py, pz) = (pij cos(-aij)sin(-bij), pij sin(-aij)sin(-bij), pij cos(-bij));
(d) constructing a look-up table containing said calculated values of kij and hij;
(e) identifying sensors <i,j> that actually detect energy reflected from said target object;
(f) for each sensor <i,j> identified at step (e), calculating rA according to rA = ((2d - pij)^2 - kij)/(4d - hij), where in a spherical coordinate system point A is representable as (rA, aij, bij), using the values of kij and hij from step (d);
(g) calculating roundtrip distance 2d from said target object to sensor <i,j>; and
(h) calculating the actual coordinate of said target object detected at sensor <i,j> according to Ax = rA cos(aij)sin(bij), Ay = rA sin(aij)sin(bij), and Az = rA cos(bij).
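Steps (c) through (h) of claim 8 admit a self-contained numerical check. The path model used below, a measured round trip 2d = |C-A| + rA + pij with the target at spherical (rA, aij, bij) and sensor <i,j> at (pij, -aij, -bij), is inferred from the claim's algebra rather than stated in it, and the helper names are illustrative assumptions:

```python
import math

def spherical_to_cart(r, a, b):
    """Spherical (radius r, azimuth a, polar b) to Cartesian coordinates."""
    return (r * math.cos(a) * math.sin(b),
            r * math.sin(a) * math.sin(b),
            r * math.cos(b))

def recover_range(two_d, pij, a, b, emitter):
    """Steps (c) and (f): recover target range rA from round-trip path 2d."""
    Cx, Cy, Cz = emitter
    kij = Cx**2 + Cy**2 + Cz**2                 # step (c), first constant
    hij = 2.0 * (pij + Cx * math.cos(a) * math.sin(b)
                     + Cy * math.sin(a) * math.sin(b)
                     + Cz * math.cos(b))        # step (c), second constant
    # rA = ((2d - pij)^2 - kij) / (4d - hij), with 4d = 2 * (2d)
    return ((two_d - pij)**2 - kij) / (2.0 * two_d - hij)

# Synthetic scene: target at rA = 2.5 along line of sight (a, b), sensor at
# radial offset pij, emitter slightly off-axis on the sensor plane.
a, b, pij, rA_true = 0.3, 1.1, 0.02, 2.5
emitter = (0.05, -0.01, 0.0)
A_true = spherical_to_cart(rA_true, a, b)
two_d = math.dist(A_true, emitter) + rA_true + pij  # emitter -> target -> sensor
rA = recover_range(two_d, pij, a, b, emitter)       # recovers 2.5 exactly
Ax, Ay, Az = spherical_to_cart(rA, a, b)            # step (h)
```

Under this path model the formula is exact, not approximate: substituting 2d = rA + pij + |C-A| into the claimed expression cancels to rA identically.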
9. For use with a video imaging system that can encode Z values, a method of encoding said Z values as part of YIQ encoding, the method comprising the following steps:
(a) converting an RGB value for each sensor to an RGB matrix and converting said RGB matrix to a YIQ matrix;
(b) partitioning said YIQ matrix into Y, I, and Q planes;
(c) Fourier transforming the I and Q dimensions of said YIQ matrix;
(d) allocating segments of said I and Q dimensions in a frequency domain to store Z-values, wherein said segments correspond to frequencies that, if eliminated from a reverse transformation, would not substantially alter color perception by a human viewer;
(e) locating segments having at least one characteristic selected from a group consisting of (i) said segments are not used, and (ii) said segments fall below a predetermined threshold of visibility; said segments being sufficiently large to store Z-values for all sensors;
(f) encoding Z(X,Y) coordinates of each sensor using said segments;
(g) adjusting amplitude coefficients of said segments;
(h) transforming I″Q″ from the frequency domain to the time domain, and appending Y thereto to create a YI″Q″ matrix; and
(i) transforming from said YI″Q″ matrix to an R″G″B″ matrix.
(Dependent claims 10-13 not shown.)
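Steps (a) through (i) of claim 9 can be sketched with numpy. This is a simplified stand-in, not the patent's method: the segment choice (bins in the Nyquist row of the I plane), the function name, and folding the amplitude adjustment of step (g) into a plain overwrite are all assumptions; the patent instead selects unused or below-visibility-threshold segments.

```python
import numpy as np

# FCC NTSC RGB -> YIQ conversion matrix.
RGB2YIQ = np.array([[0.299, 0.587, 0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523, 0.312]])

def embed_z_in_yiq(rgb, z):
    """Sketch of steps (a)-(i): hide Z values in high-frequency I bins.

    rgb : (H, W, 3) float image; z : 1-D array of Z values to embed.
    """
    yiq = rgb @ RGB2YIQ.T                      # (a)-(b): RGB -> Y, I, Q planes
    I_f = np.fft.fft2(yiq[..., 1])             # (c): Fourier transform I plane
    Q_f = np.fft.fft2(yiq[..., 2])
    h, w = I_f.shape
    # (d)-(g): overwrite bins in the Nyquist row (the highest vertical
    # frequency in numpy's FFT layout) with the Z values.
    I_f[h // 2, 1:1 + len(z)] = z
    I2 = np.fft.ifft2(I_f).real                # (h): back to the time domain
    Q2 = np.fft.ifft2(Q_f).real
    yiq2 = np.stack([yiq[..., 0], I2, Q2], axis=-1)
    return yiq2 @ np.linalg.inv(RGB2YIQ).T     # (i): YI"Q" -> R"G"B" matrix

rgb = np.random.default_rng(1).random((16, 16, 3))
out = embed_z_in_yiq(rgb, np.array([0.5, 1.0, 1.5]))
# out is visually near-identical to rgb; Y (luminance) is untouched.
```

Because only high-frequency chrominance bins are modified and the Y plane passes through unchanged, the perturbation to the reconstructed R″G″B″ image stays small, which is the perceptual premise of step (d).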
14. A computer-readable storage medium wherein is located a computer program that causes a computer sub-system having at least a processor unit to control a system that acquires at least one of (x,y,z) distance and brightness measurements from a target source using independent sensors, to enhance performance of said system by:
(a) causing said sensors to acquire measurement data at a repetition rate higher than required for operation of said system;
(b) combining output signals from said sensors to obtain statistical average output signals;
wherein random noise in said system is reduced proportionally to the square root of the number of said output signals averaged, and accuracy of said distance measurements is enhanced.
(Dependent claims 15-17 not shown.)
18. A computer-readable storage medium wherein is located a computer program that causes a computer sub-system having at least a processor unit to improve distance measurements in a system that acquires at least one of (x,y,z) distance and brightness measurements using energy transmitted from an emitter at a first location on a plane, said energy reflecting at least in part from a target source and being detected by independent sensors defining a sensor array on said plane but spaced apart from said first location, by carrying out the following steps:
(a) defining a spherical coordinate for each sensor in said array, and constructing a look-up table containing spherical coordinates for each said sensor;
(b) defining a spatial coordinate of said emitter;
(c) for each sensor <i,j>, calculating constants kij and hij as follows:
kij = Cx^2 + Cy^2 + Cz^2, and
hij = 2(pij + Cx cos(aij)sin(bij) + Cy sin(aij)sin(bij) + Cz cos(bij)),
wherein sensor p has spherical coordinate (pij, -aij, -bij) and Cartesian coordinate (px, py, pz) = (pij cos(-aij)sin(-bij), pij sin(-aij)sin(-bij), pij cos(-bij));
(d) constructing a look-up table containing said calculated values of kij and hij;
(e) identifying sensors <i,j> that actually detect energy reflected from said target object;
(f) for each sensor <i,j> identified at step (e), calculating rA according to rA = ((2d - pij)^2 - kij)/(4d - hij), where in a spherical coordinate system point A is representable as (rA, aij, bij), using the values of kij and hij from step (d);
(g) calculating roundtrip distance 2d from said target object to sensor <i,j>; and
(h) calculating the actual coordinate of said target object detected at sensor <i,j> according to Ax = rA cos(aij)sin(bij), Ay = rA sin(aij)sin(bij), and Az = rA cos(bij).
19. A computer-readable storage medium wherein is located a computer program that causes a computer sub-system having a processor unit, for use with a video imaging system that can encode Z values, to encode Z values as part of YIQ encoding by:
(a) converting an RGB value for each sensor to an RGB matrix and converting said RGB matrix to a YIQ matrix;
(b) partitioning said YIQ matrix into Y, I, and Q planes;
(c) Fourier transforming the I and Q dimensions of said YIQ matrix;
(d) allocating segments of said I and Q dimensions in a frequency domain to store Z-values, wherein said segments correspond to frequencies that, if eliminated from a reverse transformation, would not substantially alter color perception by a human viewer;
(e) locating segments having at least one characteristic selected from a group consisting of (i) said segments are not used, and (ii) said segments fall below a predetermined threshold of visibility; said segments being sufficiently large to store Z-values for all sensors;
(f) encoding Z(X,Y) coordinates of each sensor using said segments;
(g) adjusting amplitude coefficients of said segments;
(h) transforming I″Q″ from the frequency domain to the time domain, and appending Y thereto to create a YI″Q″ matrix; and
(i) transforming from said YI″Q″ matrix to an R″G″B″ matrix.
(Dependent claim 20 not shown.)
Specification