CALIBRATION OF MULTI-CAMERA DEVICES USING REFLECTIONS THEREOF
Abstract
The technology disclosed provides capabilities such as calibrating an imaging device based on images, taken by the device's own cameras, of reflections of the device itself. Implementations exploit device components that are easily recognizable in the images, such as one or more light-emitting diodes (LEDs) or other light sources. This eliminates the need for specialized calibration hardware; calibration can instead be accomplished with hardware readily available to a user of the device: the device itself and a reflecting surface, such as a computer screen. The user may hold the device near the screen under varying orientations and capture a series of images of the reflection with the device's cameras. These images are analyzed to determine camera parameters based on the known positions of the light sources. If the positions of the light sources are themselves subject to errors requiring calibration, they may be solved for as unknowns in the analysis.
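The abstract's core idea, that observed reflections of light sources at known positions constrain camera parameters, can be illustrated with a minimal numeric sketch. Everything below is an assumption for illustration, not the patent's method: a bare pinhole model, three LEDs, and a fit of focal length and principal point only (the actual system would also solve extrinsics and the reflection geometry across multiple device orientations). Because pixel coordinates are linear in these three unknowns, a direct least-squares solve suffices here.

```python
import numpy as np

def project(points_3d, focal, cx, cy):
    """Pinhole projection of Nx3 camera-frame points to Nx2 pixel coords."""
    x = points_3d[:, 0] / points_3d[:, 2]
    y = points_3d[:, 1] / points_3d[:, 2]
    return np.stack([focal * x + cx, focal * y + cy], axis=1)

def fit_intrinsics(led_positions, observed_px):
    """Recover (focal, cx, cy) by linear least squares: the known LED
    positions and their observed pixel locations pin the unknowns down."""
    x = led_positions[:, 0] / led_positions[:, 2]
    y = led_positions[:, 1] / led_positions[:, 2]
    n = len(led_positions)
    # Rows alternate: [x_norm, 1, 0] -> u  and  [y_norm, 0, 1] -> v
    a = np.zeros((2 * n, 3))
    a[0::2, 0], a[0::2, 1] = x, 1.0
    a[1::2, 0], a[1::2, 2] = y, 1.0
    params, *_ = np.linalg.lstsq(a, observed_px.ravel(), rcond=None)
    return params  # focal, cx, cy

# Synthetic check: observe three LEDs under "true" parameters, then recover them.
true_params = np.array([800.0, 320.0, 240.0])          # focal, cx, cy
leds = np.array([[-0.05, 0.0, 0.5],
                 [0.00, 0.0, 0.5],
                 [0.05, 0.0, 0.5]])
observed = project(leds, *true_params)
recovered = fit_intrinsics(leds, observed)
```

The same shape carries over to the real problem: the residual between predicted and observed light-source locations drives the parameter estimate, and uncertain light-source positions can enter the solve as additional unknowns, as the abstract notes.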
20 Claims
1. A method of calibrating an imaging device including a plurality of cameras, the method including:

capturing reflection images of an imaging device using at least two cameras;

analyzing the reflection images to locate at least one feature of the imaging device therein and an error thereof, using a current calibration parameter set; and

determining an improved current calibration parameter set for the imaging device based at least in part on location of the at least one feature of the imaging device.

- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15)
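The analyze-then-improve loop of claim 1 can be sketched as iterative refinement. The sketch is hypothetical throughout: the error function below is a stand-in quadratic (a real implementation would relocate the device's features in the reflection images under the trial parameters), and the patent does not prescribe coordinate search; the sketch only shows the shape of "evaluate the error under the current calibration parameter set, then improve the set."

```python
import numpy as np

def locate_feature_error(params, images):
    """Stand-in for locating a device feature in the reflection images and
    measuring its error under `params`. Here: distance to an assumed
    ground-truth parameter set (a toy quadratic); `images` is unused."""
    target = np.array([800.0, 320.0, 240.0])
    return float(np.sum((params - target) ** 2))

def refine_calibration(params, images, step=0.1, tol=1e-6, max_iter=10000):
    """Greedy coordinate search: nudge each parameter in whichever
    direction lowers the feature-location error, shrinking the step
    when no nudge helps (claim 1's 'improved parameter set' step)."""
    params = np.asarray(params, dtype=float).copy()
    err = locate_feature_error(params, images)
    for _ in range(max_iter):
        improved = False
        for i in range(len(params)):
            for delta in (step, -step):
                trial = params.copy()
                trial[i] += delta
                e = locate_feature_error(trial, images)
                if e < err:
                    params, err, improved = trial, e, True
        if not improved:
            step *= 0.5
        if err < tol:
            break
    return params, err

params, err = refine_calibration(np.array([799.0, 321.0, 239.5]), images=None)
```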
16. A method of determining calibration parameters for an imaging device including three light sources disposed before a reflective surface, including:

capturing reflection images of the imaging device, the three light sources being positioned substantially along a straight line;

reconstructing a set of three-dimensional (3D) positions for the three light sources of the imaging device captured in the reflection images using stereo matching;

computing a set of bisecting planes, each between an origin (actual) location of one of the three light sources and an expected 3D position for the reflected light source as reconstructed;

computing a set of dot products, each of a combination of a normal corresponding to a bisecting plane and a bisecting point, wherein the plane and the point are between the origin location of a particular light source and the expected location of the reflected particular light source;

computing an error comprising the variance of the set of all dot products; and

finding a calibration parameter set corresponding to a value of the error less than a threshold.

- View Dependent Claims (17)
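The bisecting-plane error of claim 16 has a clean geometric reading: for a planar mirror, the plane bisecting each segment from a light source to its reflection is the mirror plane itself, so all three bisecting planes share one normal and one offset. The dot product of the normal with the bisecting point recovers that offset, and its variance across the three light sources vanishes only when the stereo reconstruction, and hence the calibration, is consistent. A minimal numeric sketch, with assumed coordinates and function names:

```python
import numpy as np

def mirror_consistency_error(origins, reflected):
    """Variance of the per-LED mirror-plane offsets. Each row of
    `origins` is a light source's actual 3D position; each row of
    `reflected` is its stereo-reconstructed reflection."""
    diffs = reflected - origins                      # along the mirror normal
    normals = diffs / np.linalg.norm(diffs, axis=1, keepdims=True)
    midpoints = (origins + reflected) / 2.0          # bisecting points
    dots = np.einsum('ij,ij->i', normals, midpoints) # per-LED plane offsets
    return float(np.var(dots))

# Consistent case: reflect three collinear LEDs across the plane z = 1.
origins = np.array([[-0.05, 0.0, 0.2],
                    [0.00, 0.0, 0.2],
                    [0.05, 0.0, 0.2]])
reflected = origins.copy()
reflected[:, 2] = 2.0 - origins[:, 2]   # z -> 2*1 - z
err = mirror_consistency_error(origins, reflected)  # ~0 when consistent
```

Perturbing any reconstructed reflection (as a miscalibrated parameter set would) makes the three offsets disagree and the variance positive, which is why claim 16 can threshold this error to accept a calibration parameter set.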
18. A computer system for calibrating an imaging device including a plurality of cameras, the system including:

an interface to receive images from at least two cameras;

a memory to store the images and instructions for execution by a processor; and

a processor to execute the instructions to analyze reflection images of the imaging device captured with the at least two cameras to thereby locate at least one feature of the imaging device therein, and to determine calibration parameters of the imaging device based on the at least one located feature.

- View Dependent Claims (19, 20)
Specification