MOBILE DEVICE WITH THREE DIMENSIONAL AUGMENTED REALITY
First Claim
1. A method for determining an augmented reality scene by a mobile device comprising:
(a) said mobile device accessing intrinsic calibration parameters of a pair of imaging devices of said mobile device in a manner independent of a sensed scene of said augmented reality scene;
(b) said mobile device determining two dimensional disparity information of a pair of images from said mobile device based upon a stereo matching technique;
(c) said mobile device estimating extrinsic parameters of a sensed scene by said pair of imaging devices, including at least one of rotation and translation;
(d) said mobile device calculating a three dimensional image based upon a depth of different parts of said sensed scene based upon a triangulation technique;
(e) said mobile device incorporating a three dimensional virtual object in said three dimensional image to determine said augmented reality scene.
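Steps (b) and (d) of the claim can be illustrated with a minimal sketch: block-matching disparity along one rectified scanline, then depth by triangulation (Z = f · B / d). The images, window size, focal length, and baseline below are illustrative assumptions, not values prescribed by the patent.

```python
# Hypothetical sketch of claim steps (b) and (d). All names and numbers
# are illustrative assumptions; the patent does not prescribe them.
import numpy as np

def disparity_1d(left, right, max_disp, window=3):
    """Brute-force block matching (SAD cost) along one rectified scanline."""
    half = window // 2
    n = len(left)
    disp = np.zeros(n, dtype=int)
    for x in range(half, n - half):
        patch = left[x - half:x + half + 1]
        best, best_d = np.inf, 0
        for d in range(0, min(max_disp, x - half) + 1):
            cand = right[x - d - half:x - d + half + 1]
            cost = np.abs(patch - cand).sum()
            if cost < best:          # keep the lowest-cost shift
                best, best_d = cost, d
        disp[x] = best_d
    return disp

def depth_from_disparity(disp, focal_px, baseline_m):
    """Triangulation for a rectified pair: Z = f * B / d (for d > 0)."""
    z = np.full(disp.shape, np.inf)  # untextured pixels stay at infinity
    nz = disp > 0
    z[nz] = focal_px * baseline_m / disp[nz]
    return z

# Toy scanline pair: the textured patch appears 4 px farther left in the
# right view, so its disparity is 4.
left  = np.array([0, 0, 0, 0, 0, 0, 9, 5, 7, 1, 0, 0, 0, 0], dtype=float)
right = np.array([0, 0, 9, 5, 7, 1, 0, 0, 0, 0, 0, 0, 0, 0], dtype=float)
disp = disparity_1d(left, right, max_disp=6)
depth = depth_from_disparity(disp, focal_px=800.0, baseline_m=0.06)
```

With an assumed 800 px focal length and 6 cm baseline, the textured pixels (disparity 4) resolve to a depth of 12 m; the untextured zero-disparity pixels are left at infinity, a standard limitation of local stereo matching.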
Abstract
A method for determining an augmented reality scene by a mobile device includes estimating 3D geometry and lighting conditions of the sensed scene based on stereoscopic images captured by a pair of imaging devices. The device accesses intrinsic calibration parameters of the pair of imaging devices in a manner independent of a sensed scene of the augmented reality scene. The device determines two dimensional disparity information of a pair of images based upon a stereo matching technique. The device estimates extrinsic parameters of the sensed scene by the pair of imaging devices, including at least one of rotation and translation. The device calculates a three dimensional image based upon a depth of different parts of the sensed scene based upon a triangulation technique. The device incorporates a three dimensional virtual object in the three dimensional image to determine the augmented reality scene.
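The abstract's final step, incorporating a virtual object into the reconstructed 3D image, can be sketched as a per-pixel depth test: the virtual object is drawn only where it lies closer to the camera than the sensed surface. The arrays and distances below are illustrative assumptions; the patent does not mandate this particular representation.

```python
# Hypothetical sketch of depth-based compositing for the incorporation step.
# Scene/object buffers and distances are illustrative assumptions.
import numpy as np

def composite(scene_rgb, scene_depth, obj_rgb, obj_depth):
    """Per-pixel z-test: draw the virtual object only where it is closer
    to the camera than the sensed scene surface."""
    out = scene_rgb.copy()
    visible = obj_depth < scene_depth   # z-buffer comparison per pixel
    out[visible] = obj_rgb[visible]
    return out

# 2x2 toy frame: the sensed scene is 5 m away everywhere; the virtual
# object sits at 3 m except one pixel pushed behind the scene at 9 m.
scene_rgb = np.zeros((2, 2, 3), dtype=np.uint8)          # black scene
scene_depth = np.full((2, 2), 5.0)
obj_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)        # white object
obj_depth = np.array([[3.0, 3.0],
                      [3.0, 9.0]])
frame = composite(scene_rgb, scene_depth, obj_rgb, obj_depth)
```

The occluded pixel keeps the scene color while the rest show the object, which is why the claimed 3D reconstruction matters: without per-pixel depth, the virtual object could not be hidden behind real surfaces.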
22 Claims
1. A method for determining an augmented reality scene by a mobile device comprising:
(a) said mobile device accessing intrinsic calibration parameters of a pair of imaging devices of said mobile device in a manner independent of a sensed scene of said augmented reality scene;
(b) said mobile device determining two dimensional disparity information of a pair of images from said mobile device based upon a stereo matching technique;
(c) said mobile device estimating extrinsic parameters of a sensed scene by said pair of imaging devices, including at least one of rotation and translation;
(d) said mobile device calculating a three dimensional image based upon a depth of different parts of said sensed scene based upon a triangulation technique;
(e) said mobile device incorporating a three dimensional virtual object in said three dimensional image to determine said augmented reality scene.
Dependent claims: 2-22.
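Claim step (c), estimating extrinsic rotation and translation, can be sketched with a least-squares rigid alignment (the Kabsch/Procrustes method) between 3D point correspondences. This is one common way to recover extrinsics, chosen here for brevity; the patent does not mandate this algorithm, and all point data below is synthetic.

```python
# Hypothetical sketch of claim step (c): recovering rotation R and
# translation t via Kabsch alignment. Synthetic, illustrative data only.
import numpy as np

def estimate_extrinsics(src, dst):
    """Least-squares rigid transform so that dst ~= R @ src + t
    (points are rows of Nx3 arrays)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: rotate a random point cloud 90 degrees about Z, shift it,
# then recover the transform from the correspondences alone.
rng = np.random.default_rng(0)
src = rng.normal(size=(10, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
dst = src @ R_true.T + t_true
R, t = estimate_extrinsics(src, dst)
```

For noiseless correspondences the recovery is exact up to floating point; with noisy stereo-derived points the same formula gives the least-squares estimate, which is why it pairs naturally with the triangulated depth of step (d).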
Specification