Method and apparatus for estimating three-dimensional position and orientation through sensor fusion
Abstract
An apparatus and method of estimating a three-dimensional (3D) position and orientation based on a sensor fusion process. The method of estimating the 3D position and orientation may include determining a position of a marker in a two-dimensional (2D) image, determining a depth of a position in a depth image corresponding to the position of the marker in the 2D image to be a depth of the marker, estimating a 3D position of the marker calculated based on the depth of the marker as a marker-based position of a remote apparatus, estimating an inertia-based position and an inertia-based orientation by receiving inertial information associated with the remote apparatus, estimating a fused position based on a weighted sum of the marker-based position and the inertia-based position, and outputting the fused position and the inertia-based orientation.
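The fusion step described in the abstract, a weighted sum of the marker-based and inertia-based positions, can be sketched as follows. The weight `w` and the sample coordinates are illustrative assumptions; the patent does not fix here how the weight is chosen.

```python
import numpy as np

def fuse_position(marker_pos, inertia_pos, w):
    """Weighted sum of the marker-based and inertia-based 3D positions.

    w is the weight given to the marker-based estimate; a constant is
    assumed here purely for illustration.
    """
    marker_pos = np.asarray(marker_pos, dtype=float)
    inertia_pos = np.asarray(inertia_pos, dtype=float)
    return w * marker_pos + (1.0 - w) * inertia_pos

# Hypothetical estimates for the same remote apparatus:
marker_based = [0.10, 0.20, 1.50]   # from the 2D image + depth image
inertia_based = [0.12, 0.18, 1.46]  # from integrated inertial information
fused = fuse_position(marker_based, inertia_based, w=0.7)
```

Per the abstract, the apparatus would then output this fused position together with the inertia-based orientation.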
26 Claims
1. An apparatus for estimating a position and an orientation, the apparatus comprising:

one or more processors configured to determine a position of a marker in a two-dimensional (2D) image;

determine, in a depth image, a position corresponding to the position of the marker in the 2D image, and to determine a depth of the corresponding position in the depth image to be a depth of the marker; and

estimate, based on the depth of the marker, a marker-based position indicating a three-dimensional (3D) position of the marker,

wherein the one or more processors are further configured to determine a 2D position value excluding the depth of the marker, based on the depth of the marker, a field of view at which the 2D image and the depth image are photographed, and a distance from a predetermined reference position to the marker.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
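The 2D-position computation recited in claim 1 (a value derived from the marker's depth, the field of view at which the images are photographed, and the offset from a reference position) resembles a pinhole-style back-projection. A minimal sketch, assuming the reference position is the image centre and a symmetric view frustum; this is one illustrative reading of the claim, not the patent's stated formula:

```python
import math

def marker_3d_position(px, py, depth, img_w, img_h, hfov_deg, vfov_deg):
    """Back-project a marker pixel to a 3D camera-frame position.

    (px, py) is the marker position in the 2D image. The "predetermined
    reference position" is assumed to be the image centre.
    """
    cx, cy = img_w / 2.0, img_h / 2.0
    # Half-extent of the view frustum at this depth, from the field of view.
    half_w = depth * math.tan(math.radians(hfov_deg) / 2.0)
    half_h = depth * math.tan(math.radians(vfov_deg) / 2.0)
    # Scale the pixel offset from the reference position into scene units.
    x = (px - cx) / cx * half_w
    y = (py - cy) / cy * half_h
    return (x, y, depth)

# A marker at the image centre back-projects onto the optical axis:
p = marker_3d_position(320, 240, 2.0, 640, 480, 60.0, 45.0)  # -> (0.0, 0.0, 2.0)
```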
13. A method of estimating a position and an orientation, the method comprising:

determining a position of a marker in a two-dimensional (2D) image;

determining a position corresponding to the position of the marker, in a depth image;

determining a depth of the corresponding position in the depth image to be a depth of the marker; and

estimating, based on the depth of the marker, a marker-based position of a remote apparatus, which indicates a three-dimensional (3D) position of the marker,

wherein the estimating comprises determining the 3D position of the marker by calculating a 2D position value, excluding the depth of the marker, based on the depth of the marker, a field of view at which the 2D image and the depth image are photographed, and a distance from a predetermined reference position to the marker.

Dependent claims: 14, 15, 16, 17, 18, 19
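The depth-determination step of claim 13, taking the depth-image value at the pixel matching the marker's 2D position, can be sketched as follows. The sketch assumes the 2D image and depth image are already pixel-aligned; real colour/depth sensor pairs typically need registration first.

```python
import numpy as np

def marker_depth(depth_image, marker_px, marker_py):
    """Take the depth at the depth-image pixel corresponding to the
    marker position found in the 2D image (assumed aligned views)."""
    return float(depth_image[marker_py, marker_px])  # row-major: [y, x]

# Toy 4x4 depth image (metres); the marker was found at pixel (x=2, y=1).
depth_img = np.full((4, 4), 1.0)
depth_img[1, 2] = 1.75
d = marker_depth(depth_img, marker_px=2, marker_py=1)  # -> 1.75
```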
20. A system for estimating a position and an orientation, the system comprising:

at least one marker generator to generate and output a marker;

one or more sensors configured to photograph a two-dimensional (2D) image and a depth image; and

one or more processors configured to determine a position of the marker in the 2D image, to determine, in the depth image, a position corresponding to the position of the marker in the 2D image, to determine a depth of the corresponding position in the depth image to be a depth of the marker, and to estimate, based on the depth of the marker, a marker-based position which indicates a three-dimensional (3D) position of the marker,

wherein the one or more processors are further configured to determine a 2D position value excluding the depth of the marker, based on the depth of the marker, a field of view at which the 2D image and the depth image are photographed, and a distance from a predetermined reference position to the marker.

Dependent claims: 21, 22, 23, 24, 25, 26
Specification