Automated frame of reference calibration for augmented reality
Abstract
One or more systems, methods, routines and/or techniques for automated frame of reference calibration for augmented reality are described. One or more systems, methods, routines and/or techniques may allow for calibration of an Augmented Reality (AR) system, for example, by automatically calibrating the frames of reference of virtual objects and/or a camera. One example calibration routine and/or technique may determine and/or calculate a mapping or transform from a frame of reference of a virtual object (e.g., a CAD model) to a coordinate frame associated with the tracking system. Another example calibration routine and/or technique may determine and/or calculate a mapping or transform from a camera lens frame of reference to a frame of reference of the whole camera as determined by a tracking system. These routines and/or techniques may calibrate an AR system to provide rapid, precise alignment between virtual content and a live camera view of a real scene.
20 Claims
1. A method for augmented reality executed by a data processing system having at least one processor, the method comprising:
receiving or establishing a tracking system coordinate frame associated with an object tracking system, wherein the tracking system coordinate frame is aligned with a real 3D space, and wherein the tracking system tracks the position and orientation in a real 3D space of a real object and of a camera;
receiving from the tracking system a first real object frame of reference for the real object, wherein the first real object frame of reference indicates a position and orientation of the real object relative to the tracking system coordinate frame;
determining a second real object frame of reference for the real object, wherein the second real object frame of reference indicates a position and orientation of the real object relative to the tracking system coordinate frame;
receiving a first virtual object frame of reference for a virtual object, wherein the virtual object is modeled after the real object, and wherein the first virtual object frame of reference is unrelated to the tracking system coordinate frame;
determining a real object origin by calculating a centroid of three or more real object non-collinear points;
determining a second virtual object frame of reference for the virtual object, wherein the second virtual object frame of reference indicates a position and orientation of the virtual object relative to the tracking system coordinate frame, and wherein the second real object frame of reference is aligned with the second virtual object frame of reference;
determining a virtual object origin by calculating a centroid of three or more virtual object non-collinear points;
determining a virtual object mapping between the first virtual object frame of reference and the tracking system coordinate frame, wherein the virtual object mapping includes a transform matrix to transform between the first virtual object frame of reference and the tracking system coordinate frame; and
displaying an augmented scene including a view of the real 3D space, a view of the real object and one or more overlaid virtual items, wherein the virtual object mapping is used to place the one or more overlaid virtual items in the augmented scene such that the one or more virtual items are aligned with the real object.
(Dependent claims 2-7 not shown.)
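The determining steps of claim 1 derive an origin from the centroid of three or more non-collinear points and express the alignment between frames as a transform matrix. The following is a minimal sketch of that idea, not an implementation from the patent; the helper names (`centroid`, `basis_from_points`, `align_frames`) are hypothetical, and numpy is assumed.

```python
import numpy as np

def centroid(points):
    """Origin of a derived frame: centroid of three or more non-collinear points."""
    pts = np.asarray(points, dtype=float)
    if pts.shape[0] < 3:
        raise ValueError("need at least three points")
    return pts.mean(axis=0)

def basis_from_points(p0, p1, p2):
    """Right-handed orthonormal basis (3x3 rotation) from three non-collinear points."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    x = p1 - p0
    x /= np.linalg.norm(x)                 # first axis along an edge
    z = np.cross(x, p2 - p0)
    z /= np.linalg.norm(z)                 # normal of the point plane
    y = np.cross(z, x)                     # completes the right-handed triad
    return np.column_stack([x, y, z])

def align_frames(src_origin, src_basis, dst_origin, dst_basis):
    """4x4 homogeneous transform carrying the source frame onto the destination frame."""
    R = dst_basis @ src_basis.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = dst_origin - R @ src_origin
    return T
```

Deriving the same style of frame on both the real object (tracked points) and the virtual object (model points) gives two frames whose alignment yields the transform matrix the claim calls the virtual object mapping.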
8. A method for augmented reality executed by a data processing system having at least one processor, the method comprising:
receiving or establishing a tracking system coordinate frame associated with an object tracking system, wherein the tracking system coordinate frame is aligned with a real 3D space, and wherein the tracking system tracks the position and orientation in a real 3D space of a camera that captures the real 3D space and a printed marker;
receiving from the tracking system a camera frame of reference for the camera, wherein the camera frame of reference indicates a position and orientation of the camera relative to the tracking system coordinate frame;
receiving or establishing a printed marker coordinate frame associated with the printed marker, wherein the printed marker coordinate frame is aligned with the real 3D space, and wherein the printed marker coordinate frame is aligned with the tracking system coordinate frame;
determining a printed marker origin by calculating a centroid of three or more printed marker non-collinear points;
determining a camera lens frame of reference for the lens of the camera, wherein the camera lens frame of reference indicates a position and orientation of the camera lens relative to the printed marker coordinate frame;
determining a camera lens mapping between the camera frame of reference and the camera lens frame of reference, wherein the camera lens mapping includes a transform matrix to transform between the camera frame of reference and the camera lens frame of reference; and
displaying an augmented scene including a view of the real 3D space and one or more virtual items, wherein the camera lens mapping is used to alter or distort the one or more virtual items in the augmented scene.
(Dependent claims 9-15 not shown.)
16. A system, comprising:
a camera that captures a view of a real 3D space including a real object;
a tracking system that tracks the position and orientation in a real 3D space of the real object and of the camera, wherein the tracking system is configured to establish a tracking system coordinate frame associated with the tracking system, wherein the tracking system coordinate frame is aligned with the real 3D space; and
a computer coupled to the camera and the tracking system, the computer having one or more memory units, the computer being configured with a virtual modeler,
wherein the virtual modeler is configured to receive from the tracking system a first real object frame of reference for the real object, wherein the first real object frame of reference indicates a position and orientation of the real object relative to the tracking system coordinate frame;
wherein the virtual modeler is further configured to compute a second real object frame of reference for the real object, wherein the second real object frame of reference indicates a position and orientation of the real object relative to the tracking system coordinate frame;
wherein the virtual modeler is further configured to compute the second real object frame of reference by:
  receiving or detecting three or more real object non-collinear points on the real object, wherein the locations of the three or more real object non-collinear points are defined relative to the tracking system coordinate frame;
  determining a real object origin by calculating a centroid of the three or more real object non-collinear points; and
  determining a real object orientation that is related to the orientation of the first real object frame of reference;
wherein the virtual modeler is further configured to receive from the one or more memory units a first virtual object frame of reference for a virtual object, wherein the virtual object is modeled after the real object, and wherein the first virtual object frame of reference is unrelated to the tracking system coordinate frame;
wherein the virtual modeler is further configured to compute a second virtual object frame of reference for the virtual object, wherein the second virtual object frame of reference indicates a position and orientation of the virtual object relative to the tracking system coordinate frame, and wherein the second real object frame of reference is aligned with the second virtual object frame of reference;
wherein the virtual modeler is further configured to compute the second virtual object frame of reference by:
  receiving or indicating three or more virtual object non-collinear points on the virtual object, wherein the locations of the three or more virtual object non-collinear points are defined relative to the tracking system coordinate frame;
  determining a virtual object origin by calculating a centroid of the three or more virtual object non-collinear points; and
  determining a virtual object orientation;
wherein the virtual modeler is further configured to compute a virtual object mapping between the first virtual object frame of reference and the tracking system coordinate frame, and wherein the virtual object mapping includes a transform matrix to transform between the first virtual object frame of reference and the tracking system coordinate frame; and
wherein the virtual modeler is further configured to generate and store in the one or more memory units an augmented scene including a view of the real 3D space, a view of the real object and one or more overlaid virtual items, wherein the virtual object mapping is used to place the one or more overlaid virtual items in the augmented scene such that the one or more virtual items are aligned with the real object.
(Dependent claims 17-19 not shown.)
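Claim 16's virtual modeler builds matching frames (centroid origin plus an orientation) from corresponding point triples on the real and virtual objects, then computes the virtual object mapping that aligns them. A minimal sketch of that end-to-end computation, with hypothetical names and numpy assumed (a least-squares method such as Kabsch alignment would be a more robust alternative for more than three points):

```python
import numpy as np

def derived_frame(p0, p1, p2):
    """Origin (centroid) and right-handed basis from three non-collinear points."""
    pts = np.asarray([p0, p1, p2], dtype=float)
    origin = pts.mean(axis=0)
    x = pts[1] - pts[0]
    x /= np.linalg.norm(x)
    z = np.cross(x, pts[2] - pts[0])
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return origin, np.column_stack([x, y, z])

def virtual_object_mapping(real_points, virtual_points):
    """Transform taking model (virtual) coordinates onto tracking coordinates.

    Built by aligning the frame derived from the virtual points with the frame
    derived from the corresponding tracked real points."""
    real_origin, real_basis = derived_frame(*real_points)
    virt_origin, virt_basis = derived_frame(*virtual_points)
    R = real_basis @ virt_basis.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = real_origin - R @ virt_origin
    return T
```

With this mapping, any vertex of the CAD model can be expressed in the tracking system coordinate frame, which is what lets overlaid virtual items stay registered with the real object.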
20. A data processing system, comprising:
one or more memory units that store computer code; and
one or more processor units coupled to the one or more memory units, wherein the one or more processor units execute the computer code stored in the one or more memory units to:
receive or establish a tracking system coordinate frame associated with an object tracking system, wherein the tracking system coordinate frame is aligned with a real 3D space, and wherein the tracking system tracks the position and orientation in a real 3D space of a camera that captures the real 3D space and a printed marker;
receive from the tracking system a camera frame of reference for the camera, wherein the camera frame of reference indicates a position and orientation of the camera relative to the tracking system coordinate frame;
receive or establish a printed marker coordinate frame associated with the printed marker, wherein the printed marker coordinate frame is aligned with the real 3D space, and wherein the printed marker coordinate frame is aligned with the tracking system coordinate frame, wherein receiving or establishing the printed marker coordinate frame includes:
  receiving or detecting three or more printed marker non-collinear points on the printed marker, wherein the locations of the three or more printed marker non-collinear points are defined relative to the tracking system coordinate frame;
  determining a printed marker origin by calculating a centroid of the three or more printed marker non-collinear points; and
  determining a printed marker orientation that is related to the orientation of the printed marker coordinate frame;
determine a camera lens frame of reference for the lens of the camera, wherein the camera lens frame of reference indicates a position and orientation of the camera lens relative to the printed marker coordinate frame;
determine a camera lens mapping between the camera frame of reference and the camera lens frame of reference, wherein the camera lens mapping includes a transform matrix to transform between the camera frame of reference and the camera lens frame of reference; and
display an augmented scene including a view of the real 3D space and one or more virtual items, wherein the camera lens mapping is used to alter or distort the one or more virtual items in the augmented scene.
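At display time, the calibrated lens mapping is what connects tracked camera poses to the viewpoint the renderer actually draws from. A hedged sketch of that last step, with a hypothetical helper name, poses as 4x4 homogeneous matrices, and numpy assumed:

```python
import numpy as np

def to_lens_frame(world_point, camera_pose, lens_mapping):
    """Express a world-space point in the camera-lens frame.

    camera_pose is the tracked camera-body pose in the tracking system
    coordinate frame; lens_mapping is the fixed body-to-lens transform found
    during calibration. The result is what a renderer needs to draw virtual
    items registered with the live camera view."""
    lens_pose = camera_pose @ lens_mapping              # lens pose in world coords
    p = np.append(np.asarray(world_point, dtype=float), 1.0)  # homogeneous point
    return (np.linalg.inv(lens_pose) @ p)[:3]
```

Feeding the renderer lens-frame coordinates (rather than raw tracking coordinates) is what lets the system alter or distort virtual items so they line up with the real scene.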
Specification