Real-time camera tracking using depth maps
Abstract
Real-time camera tracking using depth maps is described. In an embodiment, depth map frames are captured by a mobile depth camera at over 20 frames per second and used to dynamically update, in real time, a set of registration parameters which specify how the mobile depth camera has moved. In examples, the real-time camera tracking output is used for computer game applications and robotics. In an example, an iterative closest point process is used with projective data association and a point-to-plane error metric in order to compute the updated registration parameters. In an example, a graphics processing unit (GPU) implementation is used to optimize the error metric in real time. In some embodiments, a dense 3D model of the mobile camera environment is used.
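The point-to-plane error metric named in the abstract can be sketched as follows: each correspondence contributes the distance from the transformed source point to the tangent plane at its matched destination point. This is an illustrative NumPy sketch; the function and variable names are ours, not the patent's.

```python
import numpy as np

def point_to_plane_residuals(src_pts, dst_pts, dst_normals, T):
    # e_i = ((T p_i) - q_i) . n_i, evaluated for all correspondences at once.
    # src_pts, dst_pts: (N, 3); dst_normals: (N, 3) unit normals; T: 4x4 pose.
    p = np.c_[src_pts, np.ones(len(src_pts))] @ T.T   # apply rigid transform
    return np.einsum('ij,ij->i', p[:, :3] - dst_pts, dst_normals)
```

The sum of squared residuals is the error metric the iterative process minimizes; because every term is independent, the evaluation maps naturally onto per-point parallelism.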
222 Citations
20 Claims
1. A method of real-time camera tracking comprising:
receiving a sequence of depth map frames from a moving mobile depth camera, each of a plurality of the depth map frames comprising a plurality of depth values at a plurality of image elements, the plurality of depth values being related to a distance from the mobile depth camera to a surface in the scene captured by the mobile depth camera;
tracking the position and orientation of the mobile depth camera by computing registration parameters for the plurality of the depth map frames, the registration parameters being parameters of a transformation for aligning a first depth map frame and a preceding depth map frame;
wherein computing the registration parameters comprises using an iterative process to:
identify corresponding points in pairs of depth map frames without computing shapes depicted within the pairs of depth map frames and by using a parallel computing unit to optimize an error metric applied to the identified corresponding points such that the error metric is applied to a plurality of the identified corresponding points in parallel.
Dependent claims: 2-12.
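Finding correspondences "without computing shapes" is what the projective data association named in the abstract provides: rather than matching features or shapes, each source point is projected into the other frame's image and paired with the pixel it lands on. A minimal sketch with hypothetical names, assuming both frames are expressed in the destination camera's coordinates (a real implementation would then read the destination depth at the matched pixel):

```python
import numpy as np

def projective_associate(src_pts, dst_depth, K):
    # Project each source 3D point through intrinsics K and pair it with the
    # destination pixel it falls on; no feature or shape extraction involved.
    H, W = dst_depth.shape
    proj = src_pts @ K.T                            # perspective projection
    u = np.round(proj[:, 0] / proj[:, 2]).astype(int)
    v = np.round(proj[:, 1] / proj[:, 2]).astype(int)
    ok = (u >= 0) & (u < W) & (v >= 0) & (v < H) & (src_pts[:, 2] > 0)
    pairs = np.full(len(src_pts), -1)               # -1 marks "no match"
    pairs[ok] = v[ok] * W + u[ok]                   # flat index of matched pixel
    return pairs
```

Because each point is handled independently, this association step is also a natural fit for the parallel computing unit the claim recites.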
13. A real-time camera tracker comprising:
an input arranged to receive a sequence of depth map frames from a moving mobile depth camera, a plurality of depth map frames of the sequence comprising a plurality of depth values corresponding to a plurality of image elements, those depth values being related to a corresponding distance from the mobile depth camera to a corresponding surface in the scene captured by the mobile depth camera;
a frame alignment engine arranged to track the position and orientation of the mobile depth camera by computing registration parameters for the plurality of depth map frames, those registration parameters being parameters of a transformation for aligning a current depth map frame and a preceding depth map frame;
the frame alignment engine being arranged to compute the registration parameters using an iterative process to:
identify corresponding points in pairs of depth map frames without computing shapes depicted within the depth map frames;
the frame alignment engine comprising a parallel computing unit arranged to optimize an error metric applied to the identified corresponding points as part of the iterative process such that the error metric is applied to one or more of the identified corresponding points in parallel at the parallel computing unit.
Dependent claims: 14-16, 20.
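The parallel optimization recited here reduces, in practice, to a sum over per-correspondence terms: linearizing the point-to-plane metric for a small rigid motion gives one Jacobian row per pair, and the 6x6 normal equations are accumulated by a data-parallel reduction (the GPU's role in the abstract). A NumPy stand-in for that reduction, with illustrative names:

```python
import numpy as np

def icp_step(src_pts, dst_pts, dst_normals):
    # One Gauss-Newton step of point-to-plane ICP, linearized for a small
    # rotation (rx, ry, rz) and translation (tx, ty, tz). Each correspondence
    # contributes one row; the 6x6 system is a parallel-reducible sum.
    n = dst_normals
    J = np.c_[np.cross(src_pts, n), n]                # (N, 6) Jacobian rows
    r = np.einsum('ij,ij->i', dst_pts - src_pts, n)   # per-pair residuals
    A = J.T @ J                                       # reduction over pairs
    b = J.T @ r
    return np.linalg.solve(A, b)                      # [rx, ry, rz, tx, ty, tz]
```

On a GPU each correspondence's outer product `J_i^T J_i` is computed by its own thread and the tiny 6x6/6x1 sums are combined in a tree reduction; the NumPy matrix products above play the same role serially.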
17. A method of real-time camera tracking comprising:
receiving a sequence of depth map frames from a moving mobile depth camera, one or more depth map frames of the sequence comprising a depth value at one or more image elements, that depth value being related to a distance from the mobile depth camera to a corresponding surface in the scene captured by the mobile depth camera;
tracking the position and orientation of the mobile depth camera by computing registration parameters for at least one of the one or more depth map frames, those registration parameters being parameters of a transformation for aligning the at least one of the one or more depth map frames and a preceding depth map frame, the preceding depth map frame being estimated from a dense 3D model of the scene;
wherein computing the registration parameters comprises using an iterative process to:
identify corresponding points in pairs of depth map frames without computing shapes depicted within the depth map frames and by using a parallel computing unit to optimize an error metric applied to at least one of the identified corresponding points in parallel at the parallel computing unit.
Dependent claims: 18-19.
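The "preceding depth map frame ... estimated from a dense 3D model" can be produced by raycasting the model from the camera's last pose: march along each pixel's ray until the model surface is crossed and record the distance. A toy sphere-tracing sketch against an arbitrary signed-distance function (all names ours; a real implementation would march through a voxel TSDF and return per-pixel z-depth rather than ray length):

```python
import numpy as np

def raycast_depth(sdf, K, T, shape, t_max=10.0, eps=1e-3, iters=64):
    # Synthesize a depth frame by sphere-tracing every pixel ray through the
    # model; sdf maps (N, 3) world points to signed distances to the surface.
    H, W = shape
    v, u = np.mgrid[0:H, 0:W]
    rays = np.linalg.inv(K) @ np.stack([u.ravel(), v.ravel(), np.ones(H * W)])
    dirs = (T[:3, :3] @ rays).T
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    origin = T[:3, 3]
    t = np.zeros(H * W)
    for _ in range(iters):                            # fixed marching budget
        d = sdf(origin + t[:, None] * dirs)
        t = np.minimum(np.where(d > eps, t + d, t), t_max)
    hit = sdf(origin + t[:, None] * dirs) <= eps
    return np.where(hit, t, np.nan).reshape(H, W)     # ray length per pixel
```

Every pixel's ray is marched independently, so this prediction step parallelizes the same way the error-metric evaluation does.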
Specification