Camera and sensor augmented reality techniques
First Claim
1. A method, comprising:
- obtaining an optical basis generated from image data obtained by a camera of a computing device, the optical basis describing a likely orientation or position of the camera in a physical environment;
- obtaining a sensor basis generated from sensor data obtained from one or more sensors of the computing device, the sensor basis describing a likely orientation or position of the one or more sensors in the physical environment; and
- comparing the optical basis and the sensor basis to verify the orientation or the position of the computing device in the physical environment, the comparing comprising determining whether or not the orientation or the position described by the optical basis and the orientation or the position described by the sensor basis are consistent.
Abstract
Camera and sensor augmented reality techniques are described. In one or more implementations, an optical basis is obtained that was generated from data obtained by a camera of a computing device and a sensor basis is obtained that was generated from data obtained from one or more sensors that are not a camera. The optical basis and the sensor basis describe a likely orientation or position of the camera and the one or more sensors, respectively, in a physical environment. The optical basis and the sensor basis are compared to verify the orientation or the position of the computing device in the physical environment.
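The comparison described in the abstract, determining whether two independently derived pose estimates agree, can be sketched as follows. This is a minimal illustration, not the patented implementation: the pose representation (a 3-vector position plus a yaw heading) and the tolerance values are assumptions chosen for the example.

```python
import math

def poses_consistent(optical_pose, sensor_pose,
                     max_position_delta=0.05, max_angle_delta_deg=5.0):
    """Return True if the two likely poses agree within tolerances.

    Each pose is (position, yaw_deg): a 3-vector in metres plus a
    heading angle in degrees.  Tolerances here are illustrative
    assumptions, not values from the patent.
    """
    position_a, yaw_a = optical_pose
    position_b, yaw_b = sensor_pose
    position_delta = math.dist(position_a, position_b)
    # Compare headings on the circle, so 359 deg and 1 deg differ by 2 deg.
    angle_delta = abs((yaw_a - yaw_b + 180.0) % 360.0 - 180.0)
    return (position_delta <= max_position_delta
            and angle_delta <= max_angle_delta_deg)
```

If the optical basis (e.g. from feature tracking) and the sensor basis (e.g. from an IMU) disagree beyond tolerance, a system could fall back to whichever source it trusts more, or reinitialize tracking.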
20 Claims
1. A method, comprising:
- obtaining an optical basis generated from image data obtained by a camera of a computing device, the optical basis describing a likely orientation or position of the camera in a physical environment;
- obtaining a sensor basis generated from sensor data obtained from one or more sensors of the computing device, the sensor basis describing a likely orientation or position of the one or more sensors in the physical environment; and
- comparing the optical basis and the sensor basis to verify the orientation or the position of the computing device in the physical environment, the comparing comprising determining whether or not the orientation or the position described by the optical basis and the orientation or the position described by the sensor basis are consistent.

Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
11. A method implemented by a computing device, the method comprising:
- comparing an optical basis and a sensor basis to verify an orientation or position of the computing device in a physical environment, the comparing comprising determining whether a likely orientation or position of a camera in a physical environment described by the optical basis and a likely orientation or position of one or more sensors in the physical environment described by the sensor basis are consistent; and
- responsive to a determination that the optical basis and the sensor basis are consistent as a result of the comparing, calculating a combined basis that describes the orientation or position of the computing device in the physical environment.

Dependent Claims (12, 13, 14)
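Claim 11 adds a step after a successful consistency check: computing a combined basis from the two estimates. One simple way to do this is a weighted blend, sketched below under stated assumptions; the fixed 70/30 weighting, the pose representation, and the function name are all illustrative, not taken from the patent (a real system might weight by per-source confidence, e.g. feature-track quality versus IMU drift).

```python
def combined_basis(optical_pose, sensor_pose, optical_weight=0.7):
    """Blend two consistent pose estimates into a single combined basis.

    Each pose is (position, yaw_deg): a 3-vector plus a heading in
    degrees.  The default weighting is an illustrative assumption.
    """
    w = optical_weight
    (ax, ay, az), yaw_a = optical_pose
    (bx, by, bz), yaw_b = sensor_pose
    position = (w * ax + (1 - w) * bx,
                w * ay + (1 - w) * by,
                w * az + (1 - w) * bz)
    # Blend headings along the shortest angular difference so that
    # averaging 359 deg and 1 deg yields 0 deg, not 180 deg.
    delta = (yaw_b - yaw_a + 180.0) % 360.0 - 180.0
    yaw = (yaw_a + (1 - w) * delta) % 360.0
    return position, yaw
```

With equal weights, two estimates at (0, 0, 0) heading 0 and (1, 0, 0) heading 10 combine to (0.5, 0, 0) heading 5.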
15. An apparatus, comprising:
- a camera configured to capture image data;
- an inertial measurement unit (IMU) configured to generate sensor data;
- one or more modules that are configured to:
  - compute an optical basis from the image data and a sensor basis generated from the sensor data; and
  - compare the optical basis and the sensor basis to verify an orientation or position of the apparatus in a physical environment, the comparing comprising determining whether or not the orientation or the position described by the optical basis and the orientation or the position described by the sensor basis are consistent.

Dependent Claims (16, 17, 18, 19, 20)
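Claim 15 names an IMU as the source of the sensor basis. To illustrate how raw IMU samples become a pose estimate at all, the sketch below integrates gyroscope yaw-rate samples into a heading by simple Euler integration. This is a deliberately reduced example (yaw only, no accelerometer fusion, no drift correction) and every name and parameter in it is an assumption for illustration, not part of the claimed apparatus.

```python
def sensor_basis_from_imu(yaw_rate_samples, dt, initial_yaw_deg=0.0):
    """Integrate gyroscope yaw-rate samples (deg/s) into a heading.

    A minimal sketch of turning raw IMU data into one component of a
    'sensor basis'.  Real IMU fusion (e.g. complementary or Kalman
    filtering of gyro + accelerometer) is more involved; this shows
    the idea only.
    """
    yaw = initial_yaw_deg
    for rate in yaw_rate_samples:
        # Euler step: heading advances by rate * timestep, wrapped to [0, 360).
        yaw = (yaw + rate * dt) % 360.0
    return yaw
```

The resulting heading drifts over time as gyro bias accumulates, which is exactly why the claims compare it against an independently computed optical basis.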
Specification