System and method for synchronizing, merging, and utilizing multiple data sets for augmented reality application
First Claim
1. An electronic system for synchronizing and merging multiple data sets for an augmented reality application, the electronic system comprising:
- a CPU and a memory unit configured to execute the augmented reality application, wherein the augmented reality application provides a user interface on a display screen for display of video and other information associated with a real-world environment;
- a first data set comprising previously-recorded video and/or audio information in a replay reality environment referenced to clock and timestamp information, wherein the first data set is loaded into the memory unit of the electronic system;
- a second data set comprising GPS information, map data information, and points of interest information, wherein the second data set is also loaded into the memory unit of the electronic system;
- a third data set created from the augmented reality application by synthesizing a graphical and audio metadata set that gathers the graphical and/or audio information portions of the first data set and the second data set, and a non-graphical and non-audio metadata set that gathers the non-graphical and non-audio information portions of the first data set and the second data set, wherein the third data set is a single savable file, with the graphical and audio metadata set and the non-graphical and non-audio metadata set both retroactively time-referenced to the clock and timestamp information of the first data set for correct time synchronization during replay of the third data set as the single savable file; and
- an external communication input and output interface configured to transmit the third data set as one or more data packets to another electronic system via a data network.
Abstract
Systems and methods for synchronizing, merging, and utilizing multiple data sets for an augmented reality application are disclosed. In one example, an electronic system receives and processes live-recorded video information, GPS information, map data information, and points of interest information to produce a data set comprising merged graphical and/or audio information and non-graphical and non-audio metadata, all referenced to the same clock and timestamp information. This data set can be stored in cloud network storage. By retaining the numerical and textual values of non-graphical and non-audio information (e.g., camera viewing angle information, GPS coordinates, accelerometer values, and compass coordinates) as metadata referenced to the same clock and timestamp information within the data set, an augmented reality application that replays information or augments information in real time can dynamically select or change how the data set is presented in augmented reality, based on dynamically-changeable user preferences.
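As an illustration (not code from the patent), the following minimal Python sketch shows the scheme the abstract describes: non-graphical values retained as metadata tied to the same timestamps as the recorded video, so that replay can re-decide presentation from user preferences. All names here (`Frame`, `select_overlays`, the metadata keys) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """One time slice: a pointer into the recorded video/audio plus
    non-graphical, non-audio metadata, all referenced to the same
    timestamp on a shared clock."""
    timestamp: float
    video_ref: str
    metadata: dict = field(default_factory=dict)

def select_overlays(frame: Frame, preferences: dict) -> dict:
    """At replay time, pick which retained metadata values to present
    as AR overlays, driven by dynamically-changeable user preferences."""
    return {k: v for k, v in frame.metadata.items() if preferences.get(k, False)}

frame = Frame(
    timestamp=12.5,
    video_ref="video.mp4#t=12.5",
    metadata={
        "camera_viewing_angle": 31.0,      # degrees
        "gps": (37.7749, -122.4194),       # latitude, longitude
        "accelerometer": (0.0, 0.1, 9.8),  # m/s^2
        "compass": 184.0,                  # degrees from north
    },
)

# A viewer who currently wants only GPS and compass overlays:
shown = select_overlays(frame, {"gps": True, "compass": True})
```

Because the numeric values survive as metadata rather than being burned into the video frames, a different preference dictionary at replay time yields a different presentation of the same saved data.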
17 Claims
1. (Set forth above as the First Claim.) Dependent claims 2, 3, 4, 5, 6, 7, 8, 9, 10, and 11 depend from claim 1.
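Claim 1's synthesis step — building a third data set whose graphical/audio and non-graphical/non-audio metadata sets are both retroactively time-referenced to the first data set's clock, inside a single savable file — might be sketched as follows. This is an illustration only, assuming a JSON container and a known clock offset between the two recordings; the function and field names are invented for the example.

```python
import json

# First data set: previously-recorded video/audio samples on their own clock.
first = [{"t": 0.0, "video": "frame0"}, {"t": 1.0, "video": "frame1"}]

# Second data set: GPS / map / points-of-interest samples on a different
# clock (here offset by a hypothetical +100 s from the first set's clock).
second = [{"t": 100.2, "gps": (37.7749, -122.4194), "poi": "bridge"},
          {"t": 101.1, "gps": (37.7800, -122.4100), "poi": "park"}]

def merge_to_single_file(first, second, clock_offset):
    """Synthesize a third data set: a graphical/audio portion and a
    non-graphical/non-audio portion, the latter retroactively
    re-referenced to the first data set's clock, in one savable container."""
    graphical = [{"t": e["t"], "video": e["video"]} for e in first]
    non_graphical = [{"t": e["t"] - clock_offset, "gps": e["gps"], "poi": e["poi"]}
                     for e in second]
    return json.dumps({"graphical_audio": graphical,
                       "non_graphical": non_graphical})

# The single savable file, ready to be sent as one or more data packets.
packet = merge_to_single_file(first, second, clock_offset=100.0)
```

With both portions sharing the first data set's time base, a receiver can replay the file and keep video and metadata in lockstep without consulting the original clocks.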
12. A method for synchronizing, merging, and utilizing multiple data sets for an augmented reality application executed on a CPU and a memory unit of an electronic system, the method comprising:
- displaying previously-recorded video footage in a replay reality environment via a user display panel;
- displaying map graphics and underlying non-graphical and non-audio data, including GPS coordinates, points of interest, and/or current timestamps; and
- when the augmented reality application is configured to merge the previously-recorded video footage with the map graphics and the underlying non-graphical and non-audio data as a single savable file:
- creating a merged graphical and/or audio information data set comprising the previously-recorded video footage and the map graphics;
- creating a metadata set for non-graphical and non-audio information that includes the underlying non-graphical and non-audio data, wherein the metadata set is time-synchronized with the merged graphical and/or audio information data set by retroactively referencing the same timestamps; and
- storing the merged graphical and/or audio information data set and the metadata set for non-graphical and non-audio information as the single savable file in local and/or cloud network storage.
Dependent claims 13, 14, 15, 16, and 17 depend from claim 12.
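One plausible reading of claim 12's synchronization step ("retroactively referencing the same timestamps") is nearest-timestamp alignment: each non-graphical sample is snapped to the closest video timestamp before the merged result is saved. The sketch below works under that assumption; all names are hypothetical, and a local temporary file stands in for the local and/or cloud network storage of the claim.

```python
import bisect
import json
import os
import tempfile

def synchronize(video_timestamps, metadata_samples):
    """Retroactively reference each non-graphical sample to the nearest
    video timestamp, so both sets share one time base on replay.
    Assumes video_timestamps is sorted ascending."""
    synced = []
    for sample in metadata_samples:
        t = sample["t"]
        i = bisect.bisect_left(video_timestamps, t)
        # Candidates: the video timestamps immediately before and after t.
        candidates = video_timestamps[max(0, i - 1):i + 1]
        nearest = min(candidates, key=lambda v: abs(v - t))
        synced.append(dict(sample, t=nearest))
    return synced

video_ts = [0.0, 0.5, 1.0, 1.5]
meta = [{"t": 0.48, "gps": (37.7749, -122.4194)},
        {"t": 1.02, "poi": "museum"}]
synced = synchronize(video_ts, meta)

# Store video timestamps and the synchronized metadata as a single
# savable file (local storage here; a cloud upload would go through
# whatever storage API the system uses).
path = os.path.join(tempfile.gettempdir(), "merged_ar.json")
with open(path, "w") as f:
    json.dump({"video_timestamps": video_ts, "metadata": synced}, f)
```

After alignment, replaying the file needs only one clock: every metadata value falls exactly on a video timestamp, so overlays and footage stay in step.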
Specification