Image processing method and apparatus for calibrating depth of depth sensor
Abstract
An image processing apparatus and method for calibrating a depth of a depth sensor. The image processing method may include obtaining a depth image of a target object captured by a depth sensor and a color image of the target object captured by a color camera; and calibrating a depth of the depth sensor by calibrating a geometrical relation between a projector and a depth camera, which are included in the depth sensor, based on the obtained depth and color images, and calculating a correct feature point on an image plane of the depth camera that corresponds to a feature point of an image plane of the projector.
8 Claims
1. An image processing method for calibrating a depth of a depth sensor, the image processing method comprising:
obtaining a depth image of a target object captured by the depth sensor and a color image of the target object captured by a color camera;
calibrating a geometrical relation between a projector of the depth sensor and a depth camera based on the obtained depth image and the color image; and
calibrating a depth of the depth sensor by calculating a correct feature point on an image plane of the depth camera that corresponds to a feature point of an image plane of the projector,
wherein the calibrating of the depth of the depth sensor comprises:
calculating the feature point on the image plane of the projector that corresponds to a feature point with a predetermined depth value on the image plane of the depth camera,
calculating the correct feature point on the image plane of the depth camera that corresponds to the feature point on the image plane of the projector that is calculated by a first calculator when structured light is projected from an estimated correct position of the projector, and
calculating an actual depth value of the calculated correct feature point on the image plane of the depth camera, and
wherein the calculating of the correct feature point on the image plane of the depth camera comprises:
calculating a rotation matrix correction value and a translation vector correction value,
calculating a corrected epipolar line on the image plane of the depth camera using the calculated feature point on the image plane of the projector, the calculated rotation matrix correction value, and the calculated translation vector correction value, and
searching for a new feature point on the corrected epipolar line based on the rotation matrix correction value and the translation vector correction value, wherein the new feature point corresponds to the correct feature point on the image plane of the depth camera that corresponds to the feature point on the image plane of the projector. - View Dependent Claims (2, 3, 4, 5)
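The corrected-epipolar-line step of claim 1 can be sketched as follows. This is an illustrative interpretation only, not the patented implementation: the function names and all intrinsic matrices, extrinsics, and correction values (`K_proj`, `K_cam`, `R`, `t`, `dR`, `dt`) are hypothetical placeholders. The sketch forms a fundamental matrix from the corrected rotation and translation and maps a projector feature point to its epipolar line on the depth-camera image plane.

```python
import numpy as np

def skew(t):
    # Cross-product (skew-symmetric) matrix of a 3-vector.
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def corrected_epipolar_line(x_proj, K_proj, K_cam, R, t, dR, dt):
    """Epipolar line on the depth-camera image plane for a projector
    feature point x_proj = (u, v), after applying a rotation-matrix
    correction dR and a translation-vector correction dt."""
    R_corr = dR @ R          # corrected rotation (one plausible composition)
    t_corr = t + dt          # corrected translation
    # Fundamental matrix from the corrected geometrical relation.
    F = np.linalg.inv(K_cam).T @ skew(t_corr) @ R_corr @ np.linalg.inv(K_proj)
    x = np.array([x_proj[0], x_proj[1], 1.0])   # homogeneous coordinates
    line = F @ x             # line coefficients (a, b, c): a*u + b*v + c = 0
    # Normalize so that line @ (u, v, 1) is the point-to-line distance,
    # convenient for the subsequent search for the new feature point.
    return line / np.linalg.norm(line[:2])
```

The search in the final claim step would then evaluate candidate points along this line (e.g. by structured-light pattern matching) and keep the best match as the corrected feature point.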
6. An image processing apparatus for calibrating a depth sensor, the image processing apparatus comprising:
a projector of the depth sensor configured to project structured light;
a depth camera comprising a color camera, the depth camera configured to capture the structured light and the color camera configured to capture a color image of a target object; and
an image receiver, executed by a processor, causing the processor to obtain a depth image of the target object captured by the depth sensor and the color image of the target object captured by the color camera,
wherein the processor is configured to calibrate a geometrical relation between the projector and the depth camera based on the obtained depth image and the color image, and to correct a depth of the depth sensor by calculating a correct feature point on an image plane of the depth camera that corresponds to a feature point of an image plane of the projector,
wherein the processor comprises:
a first calculator configured to calculate a feature point on an image plane of the projector that corresponds to a feature point with a predetermined depth value on an image plane of the depth camera,
a second calculator configured to calculate the correct feature point on the image plane of the depth camera that corresponds to the feature point on the image plane of the projector that is calculated by the first calculator when structured light is projected from an estimated correct position of the projector, and
a third calculator configured to calculate an actual depth value and actual three-dimensional (3D) coordinates of the correct feature point on the image plane of the depth camera that has been calculated by the second calculator, and
wherein the second calculator is configured to calculate a rotation matrix correction value and a translation vector correction value, to calculate a corrected epipolar line on the image plane of the depth camera using the calculated rotation matrix correction value and translation vector correction value and the feature point on the image plane of the projector that has been calculated by the first calculator, and to search for a new feature point on the corrected epipolar line based on the rotation matrix correction value and the translation vector correction value, wherein the new feature point corresponds to the correct feature point on the image plane of the depth camera that corresponds to the feature point on the image plane of the projector. - View Dependent Claims (7, 8)
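The third calculator's task, recovering the actual depth value and 3D coordinates of the corrected feature point, can be illustrated with standard linear (DLT) triangulation. Again, this is only a sketch under assumed conventions (projector at the origin, depth camera at pose [R|t]); the function name and all parameters are hypothetical, and the patent does not specify this particular method.

```python
import numpy as np

def triangulate_depth(x_proj, x_cam, K_proj, K_cam, R, t):
    """Recover the 3D point, and hence the actual depth value, from a
    projector feature point x_proj and its corrected correspondence
    x_cam on the depth-camera image plane (both as (u, v))."""
    P1 = K_proj @ np.hstack([np.eye(3), np.zeros((3, 1))])  # projector at origin
    P2 = K_cam @ np.hstack([R, t.reshape(3, 1)])            # depth camera [R|t]
    # Each image point contributes two linear constraints on the
    # homogeneous 3D point X: u * P[2] - P[0] and v * P[2] - P[1].
    A = np.vstack([
        x_proj[0] * P1[2] - P1[0],
        x_proj[1] * P1[2] - P1[1],
        x_cam[0] * P2[2] - P2[0],
        x_cam[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    X = X[:3] / X[3]              # 3D point in projector coordinates
    depth = (R @ X + t)[2]        # actual depth: Z in the depth camera's frame
    return X, depth
```

With a corrected correspondence from the second calculator, the returned depth is the value the claims refer to as the actual depth of the correct feature point.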
Specification