REAL-TIME SHARED AUGMENTED REALITY EXPERIENCE
First Claim
23. A system for providing a shared augmented reality experience, the system comprising:
one or more on-site devices for generating augmented reality representations of a real-world location; and
one or more off-site devices for generating virtual augmented reality representations of the real-world location;
wherein the augmented reality representations include content visualized and incorporated with live views of the real-world location;
wherein the virtual augmented reality representations include the content visualized and incorporated with live views in a virtual augmented reality world representing the real-world location; and
wherein the on-site devices synchronize the data of the augmented reality representations with the off-site devices such that the augmented reality representations and the virtual augmented reality representations are consistent with each other.
Abstract
A system is provided for enabling a shared augmented reality experience. The system comprises one or more on-site devices for generating augmented reality representations of a real-world location, and one or more off-site devices for generating virtual augmented reality representations of the real-world location. The augmented reality representations include data and/or content incorporated into live views of the real-world location. The virtual augmented reality representations of the AR scene incorporate images and data from the real-world location and include additional content used in the AR presentation. The on-site devices synchronize the content used to create the augmented reality experience with the off-site devices in real time such that the augmented reality representations and the virtual augmented reality representations are consistent with each other.
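The synchronization described in the abstract can be pictured as a minimal sketch: an on-site device places AR content and pushes each update to subscribed off-site devices so both scenes stay consistent. All names here (`ARContent`, `OnSiteDevice`, `OffSiteDevice`) are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass

@dataclass
class ARContent:
    content_id: str
    position: tuple   # (x, y, z) in the real-world location's frame
    payload: str      # the content visualized in the live view

class OffSiteDevice:
    """Renders a virtual AR world mirroring the real-world location."""
    def __init__(self):
        self.scene = {}

    def apply_update(self, content: ARContent) -> None:
        self.scene[content.content_id] = content

class OnSiteDevice:
    """Generates AR representations and pushes updates to off-site devices."""
    def __init__(self):
        self.scene = {}
        self.subscribers = []

    def place_content(self, content: ARContent) -> None:
        self.scene[content.content_id] = content
        for device in self.subscribers:   # synchronize in real time
            device.apply_update(content)

on_site = OnSiteDevice()
off_site = OffSiteDevice()
on_site.subscribers.append(off_site)
on_site.place_content(ARContent("note-1", (1.0, 2.0, 0.0), "Hello"))
assert on_site.scene == off_site.scene  # representations remain consistent
```

In a real deployment the push would go over a network interface rather than a direct method call, but the invariant is the same: every on-site scene change is replayed off-site.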
51 Claims
23. A system for providing a shared augmented reality experience, the system comprising:
one or more on-site devices for generating augmented reality representations of a real-world location; and
one or more off-site devices for generating virtual augmented reality representations of the real-world location;
wherein the augmented reality representations include content visualized and incorporated with live views of the real-world location;
wherein the virtual augmented reality representations include the content visualized and incorporated with live views in a virtual augmented reality world representing the real-world location; and
wherein the on-site devices synchronize the data of the augmented reality representations with the off-site devices such that the augmented reality representations and the virtual augmented reality representations are consistent with each other.
View Dependent Claims (24, 25, 26, 27, 28, 29)
30. A computer device for sharing augmented reality experiences, the computer device comprising:
a network interface configured to receive environmental, position, and geometry data of a real-world location from an on-site device in proximity to the real-world location;
the network interface further configured to receive augmented reality data or content from the on-site device;
an off-site virtual augmented reality engine configured to create a virtual representation of the real-world location based on the environmental data, including the position and geometry data, received from the on-site device; and
an engine configured to reproduce the augmented reality content in the virtual representation of reality such that the virtual representation of reality is consistent with the augmented reality representation of the real-world location (AR scene) created by the on-site device.
View Dependent Claims (1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 31, 32, 33, 34, 35, 45)
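The off-site engine of claim 30 can be sketched as two ingestion paths: one that rebuilds the real-world geometry as a virtual stand-in, and one that mirrors the on-site AR content into it. The class and field names below are hypothetical assumptions for illustration only.

```python
class VirtualARScene:
    """Off-site virtual representation of the real-world location."""
    def __init__(self):
        self.geometry = []   # surfaces reconstructed from on-site geometry data
        self.content = []    # AR content reproduced from the on-site device

    def ingest_environment(self, geometry_data):
        # Build the virtual stand-in from environmental/position/geometry data
        # received over the network interface.
        self.geometry.extend(geometry_data)

    def reproduce_content(self, ar_content):
        # Mirror the on-site AR scene so both representations stay consistent.
        self.content.append(ar_content)

scene = VirtualARScene()
scene.ingest_environment([{"plane": "floor", "extent": (4.0, 6.0)}])
scene.reproduce_content({"id": "arrow-7", "anchor": (0.5, 0.0, 1.2)})
```

Separating the two paths matches the claim's split between the virtual-world engine and the content-reproduction engine.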
36. A method for sharing augmented reality positional data and the relative time values of that positional data, the method comprising:
receiving, from at least one on-site device, positional data and the relative time values of that positional data, collected from the motion of the on-site device;
creating an augmented reality (AR) three-dimensional Vector based on the positional data and the relative time values of that positional data;
placing the augmented reality Vector at a location where the positional data was collected; and
visualizing a representation of the augmented reality Vector with a device.
View Dependent Claims (37, 38, 39, 40, 41, 42, 43, 44, 47, 48, 49, 50, 51)
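The method of claim 36 can be sketched as follows: time-stamped positional samples from the device's motion are reduced to a three-dimensional vector anchored at the point of collection. The function name, sample format, and returned fields are assumptions for illustration, not the patent's definitions.

```python
def build_ar_vector(samples):
    """samples: list of (t, (x, y, z)) positional readings collected
    from the on-site device's motion, with relative time values t.
    Returns an AR Vector anchored where the data was collected."""
    samples = sorted(samples, key=lambda s: s[0])  # order by relative time
    t0, p0 = samples[0]
    t1, p1 = samples[-1]
    # Displacement over the sampled motion, component by component.
    displacement = tuple(b - a for a, b in zip(p0, p1))
    return {"origin": p0, "vector": displacement, "duration": t1 - t0}

# Positional data and relative time values from the on-site device:
vec = build_ar_vector([(0.0, (0.0, 0.0, 0.0)),
                       (0.5, (0.3, 0.0, 0.1)),
                       (1.0, (0.6, 0.1, 0.2))])
# The AR Vector is placed at the location where the data was collected:
assert vec["origin"] == (0.0, 0.0, 0.0)
```

A visualizing device would then render this vector at `origin`, either in the live on-site view or in the off-site virtual world.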
45. The method of claim 35, wherein various types of input can be used to create or change the AR Vector's positional data vector, including, but not limited to: MIDI boards, styli, electric guitar output, motion capture, and pedestrian dead reckoning enabled devices.
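Claim 45's idea of heterogeneous inputs can be sketched as adapters that normalize each source into the same time-stamped positional stream. Both adapter functions, their signatures, and the stride constant are hypothetical assumptions.

```python
def stylus_to_samples(strokes):
    """Stylus strokes arrive as (t, x, y); lift into 3-D at z = 0."""
    return [(t, (x, y, 0.0)) for t, x, y in strokes]

def dead_reckoning_to_samples(steps, stride=0.7):
    """Pedestrian dead reckoning: (t, heading_x, heading_y) unit steps,
    integrated into positions with an assumed stride length in meters."""
    samples, x, y = [], 0.0, 0.0
    for t, hx, hy in steps:
        x, y = x + hx * stride, y + hy * stride
        samples.append((t, (x, y, 0.0)))
    return samples

# Either source yields samples usable as AR Vector positional data:
samples = stylus_to_samples([(0.0, 1.0, 2.0), (0.1, 1.5, 2.5)])
track = dead_reckoning_to_samples([(0.0, 1.0, 0.0)])
```

The common output shape is what lets any of the listed devices "create or change" a Vector's positional data.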
Specification