REAL-TIME SELF-LOCALIZATION FROM PANORAMIC IMAGES
First Claim
1. A method comprising:
producing at least a portion of a panoramic cylindrical map of an environment with a camera;
extracting features from the at least the portion of the panoramic cylindrical map;
comparing the features from the at least the portion of the panoramic cylindrical map to model features from a pre-generated three-dimensional model of the environment to produce a set of corresponding features; and
using the set of corresponding features to determine a position and an orientation of the camera.
1 Assignment
0 Petitions
Abstract
Real-time localization is performed using at least a portion of a panoramic image captured by a camera on a mobile device. A panoramic cylindrical map is generated using images captured by the camera, e.g., as the camera rotates. Extracted features from the panoramic cylindrical map are compared to features from a pre-generated three-dimensional model of the environment. The resulting set of corresponding features may be used to determine the pose of the camera. For example, the set of corresponding features may be converted into rays between the panoramic cylindrical map and the three-dimensional model, where the intersection of the rays is used to determine the pose. The relative orientation of the camera may also be tracked by comparing features from each new image to the panoramic cylindrical map, and the tracked orientation may be fused with the pose.
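The abstract describes converting matched features on the panoramic cylindrical map into rays. As a minimal sketch of that conversion, the function below maps a pixel on a cylindrical panorama to a unit viewing ray in camera coordinates; the linear vertical mapping, the axis conventions, and the function name are illustrative assumptions, since the text here does not specify them.

```python
import math

def panorama_pixel_to_ray(u, v, width, height, vfov_rad):
    """Convert a pixel (u, v) on a cylindrical panorama to a unit ray.

    u in [0, width) maps to azimuth in [0, 2*pi); v maps linearly to
    vertical position on the cylinder according to the vertical field
    of view, with v = height/2 on the horizon.
    """
    theta = 2.0 * math.pi * u / width                     # azimuth around the cylinder axis
    y = math.tan(vfov_rad / 2.0) * (1.0 - 2.0 * v / height)  # height on the unit cylinder
    x, z = math.cos(theta), math.sin(theta)
    norm = math.sqrt(x * x + y * y + z * z)
    return (x / norm, y / norm, z / norm)
```

For example, a pixel on the horizon at the panorama's left edge (u = 0) maps to the ray (1, 0, 0), and moving half the panorama width rotates the ray by 180 degrees.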
122 Citations
28 Claims
1. A method comprising:
producing at least a portion of a panoramic cylindrical map of an environment with a camera;
extracting features from the at least the portion of the panoramic cylindrical map;
comparing the features from the at least the portion of the panoramic cylindrical map to model features from a pre-generated three-dimensional model of the environment to produce a set of corresponding features; and
using the set of corresponding features to determine a position and an orientation of the camera.
- View Dependent Claims (2, 3, 4, 5, 6, 7)
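The abstract states that the set of corresponding features may be converted into rays whose intersection determines the pose. One standard way to realize such an intersection, offered here as an assumption rather than the patent's actual solver, is a least-squares closest point to a set of 3D lines, one line per feature correspondence:

```python
import numpy as np

def intersect_rays(points, directions):
    """Least-squares point closest to a set of 3D lines.

    Line i passes through points[i] with direction directions[i].
    Solves sum_i (I - d_i d_i^T) (c - p_i) = 0 for the point c,
    which minimizes the sum of squared distances to the lines.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)  # projector orthogonal to the ray
        A += M
        b += M @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)
```

With noise-free rays that all pass through the true camera center, the solve recovers that center exactly; with noisy correspondences it returns the point minimizing the summed squared ray distances. At least two non-parallel rays are needed for the system to be well-posed.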
8. An apparatus comprising:
a camera capable of capturing images of an environment; and
a processor coupled to the camera, the processor configured to produce at least a portion of a panoramic cylindrical map of the environment using images captured by the camera, extract features from the at least the portion of the panoramic cylindrical map, compare the features from the at least the portion of the panoramic cylindrical map to features from a pre-generated three-dimensional model of the environment to produce a set of corresponding features, and use the set of corresponding features to determine a position and an orientation of the camera.
- View Dependent Claims (9, 10, 11, 12, 13, 14)
15. An apparatus comprising:
means for producing at least a portion of a panoramic cylindrical map of an environment with a camera;
means for extracting features from the at least the portion of the panoramic cylindrical map;
means for comparing the features from the at least the portion of the panoramic cylindrical map to features from a pre-generated three-dimensional model of the environment to produce a set of corresponding features; and
means for using the set of corresponding features to determine a position and an orientation of the camera.
- View Dependent Claims (16, 17, 18, 19, 20, 21)
22. A non-transitory computer-readable medium including program code stored thereon, comprising:
program code to produce at least a portion of a panoramic cylindrical map of an environment with images captured by a camera;
program code to extract features from the at least the portion of the panoramic cylindrical map;
program code to compare the features from the at least the portion of the panoramic cylindrical map to features from a pre-generated three-dimensional model of the environment to produce a set of corresponding features; and
program code to use the set of corresponding features to determine a position and an orientation of the camera.
- View Dependent Claims (23, 24, 25, 26, 27, 28)
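The abstract also notes that the camera's relative orientation, tracked against the panoramic map as new images arrive, may be fused with the absolute pose. One simple realization, assumed here for illustration (the patent does not specify the fusion method, and the rotation about the cylinder's vertical axis is an assumed tracking model), is to compose the absolute orientation with the tracked incremental rotations:

```python
import numpy as np

def rot_y(angle_rad):
    """Rotation about the vertical (cylinder) axis by angle_rad."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def fuse_orientation(R_abs, relative_yaws):
    """Chain the absolute orientation from localization with the
    incremental yaw rotations tracked against the panoramic map."""
    R = R_abs
    for yaw in relative_yaws:
        R = R @ rot_y(yaw)
    return R
```

Because rotations about a single axis compose additively, tracking increments of 0.1 rad and 0.2 rad from an identity absolute orientation yields the same result as one 0.3 rad rotation.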
Specification