System and Method for Synchronizing, Merging, and Utilizing Multiple Data Sets for Augmented Reality Application
Abstract
Systems and methods for synchronizing, merging, and utilizing multiple data sets for an augmented reality application are disclosed. In one example, an electronic system receives and processes live recorded video information, GPS information, map data information, and points of interest information to produce a data set comprising merged graphical and/or audio information and non-graphical and non-audio information metadata, both referenced to the same clock and timestamp information. This data set can be stored in cloud network storage. By retaining numerical and textual values of non-graphical and non-audio information (e.g., camera viewing angle information, GPS coordinates, accelerometer values, and compass coordinates) as metadata referenced to the same clock and timestamp information within the data set, an augmented reality application that replays information or augments information in real time can dynamically select or change how the data set is presented in augmented reality based on dynamically-changeable user preferences.
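The abstract's central idea is that graphical/audio frames and non-graphical metadata share one set of timestamps, so a replay application can decide at view time which metadata to overlay. A minimal sketch of that data structure, with all names and field choices being illustrative assumptions rather than anything specified in the patent:

```python
from dataclasses import dataclass, field

@dataclass
class MetadataEntry:
    # Non-graphical, non-audio values retained alongside the frames,
    # keyed to the same clock reference (illustrative fields only).
    timestamp: float
    camera_angle: float       # viewing angle in degrees
    gps: tuple                # (latitude, longitude)
    compass: float            # heading in degrees

@dataclass
class MergedDataSet:
    frames: dict = field(default_factory=dict)    # timestamp -> frame bytes
    metadata: dict = field(default_factory=dict)  # timestamp -> MetadataEntry

    def overlay_for(self, timestamp, preferences):
        """Return only the metadata fields the user's preferences select."""
        entry = self.metadata.get(timestamp)
        if entry is None:
            return {}
        return {k: getattr(entry, k) for k in preferences if hasattr(entry, k)}

ds = MergedDataSet()
ds.frames[10.0] = b"<frame>"
ds.metadata[10.0] = MetadataEntry(10.0, 45.0, (37.77, -122.42), 90.0)

# A viewer whose preferences select only GPS and compass gets just those:
print(ds.overlay_for(10.0, ["gps", "compass"]))
```

Because the metadata values stay numerical and textual rather than being burned into the video, the same stored data set can be rendered differently as preferences change.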
49 Citations
20 Claims
1. An electronic system for synchronizing and merging multiple data sets for an augmented reality application, the electronic system comprising:
a CPU and a memory unit configured to execute the augmented reality application, wherein the augmented reality application provides a user interface on a display screen for display of video and other information associated with a real-world environment;
a first data set comprising live recorded video and/or audio information referenced to clock and timestamp information, wherein the first data set is loaded to the memory unit of the electronic system;
a second data set comprising GPS information, map data information, and points of interest information, wherein the second data set is also loaded to the memory unit of the electronic system;
a third data set comprising merged graphical and/or audio information and non-graphical and non-audio information metadata, wherein the merged graphical and/or audio information and the non-graphical and non-audio information metadata are created from the first data set and the second data set using the augmented reality application executed on the CPU, and wherein the merged graphical and/or audio information and the non-graphical and non-audio information metadata of the third data set are both referenced to the clock and timestamp information of the first data set to enable correct time synchronization of the merged graphical and/or audio information and the non-graphical and non-audio information metadata during a full or selective replay; and
an external communication input and output interface configured to transmit the third data set as one or more data packets to another electronic system via a data network.
(Dependent claims 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 not shown.)
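Claim 1 describes three data sets: the first (timestamped audio/video), the second (GPS, map, and points-of-interest data), and a third built from the other two, with both of its halves re-referenced to the first set's clock. A hedged sketch of that merge step, using assumed record layouts purely for illustration:

```python
def merge_data_sets(first, second):
    """Build the 'third data set' of claim 1 from illustrative inputs.

    first:  list of {'ts': float, 'av': bytes} records (timestamped A/V).
    second: {'gps': ..., 'map': ..., 'poi': ...} (location/map data).
    Both halves of the result reuse the first set's timestamps, which is
    what enables synchronized full or selective replay.
    """
    third = {"merged": [], "metadata": []}
    for record in first:
        ts = record["ts"]  # clock reference taken from the first data set
        third["merged"].append(
            {"ts": ts, "av": record["av"], "map": second["map"]}
        )
        third["metadata"].append(
            {"ts": ts, "gps": second["gps"], "poi": second["poi"]}
        )
    return third

first = [{"ts": 0.0, "av": b"f0"}, {"ts": 0.033, "av": b"f1"}]
second = {"gps": (40.7, -74.0), "map": "tile_a", "poi": ["museum"]}
third = merge_data_sets(first, second)
```

The claim's external interface would then serialize `third` into one or more data packets for transmission; that step is omitted here.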
12. A method for synchronizing, merging, and utilizing multiple data sets for an augmented reality application executed on a CPU and a memory unit of an electronic system, the method comprising:
displaying a live video footage via a user display panel while the live video footage is being recorded into a local and/or a network-attached storage;
displaying map graphics and underlying non-graphical and non-audio data including GPS coordinates, points of interest, and/or current timestamps; and
if the augmented reality application is configured to merge the live video footage with the map graphics and the underlying non-graphical and non-audio data:
creating a merged graphical and/or audio information data set comprising the live video footage and the map graphics;
creating a metadata set for non-graphical and non-audio information which includes the underlying non-graphical and non-audio data, wherein the metadata set is time-synchronized with the merged graphical and/or audio information data set by referencing the same timestamps; and
storing the merged graphical and/or audio information data set and the metadata set for non-graphical and non-audio information as separate files or as a combined file in a local and/or a cloud network storage.
(Dependent claims 13, 14, 15, 16, 17 not shown.)
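Claim 12's final step allows the merged data set and its metadata set to be stored either as separate files or as one combined file. A minimal sketch of both storage options, assuming a JSON layout and file names that are illustrative only:

```python
import json
import os
import tempfile

# Illustrative stand-ins for the two time-synchronized data sets of
# claim 12: both reference the same timestamps.
merged = [{"ts": 1.0, "frame": "f0+map"}, {"ts": 2.0, "frame": "f1+map"}]
metadata = [{"ts": 1.0, "gps": [51.5, -0.1]}, {"ts": 2.0, "gps": [51.6, -0.1]}]

def store_separate(directory):
    """Write the merged data set and metadata set as two separate files."""
    with open(os.path.join(directory, "merged.json"), "w") as f:
        json.dump(merged, f)
    with open(os.path.join(directory, "metadata.json"), "w") as f:
        json.dump(metadata, f)

def store_combined(directory):
    """Write both data sets into a single combined file."""
    with open(os.path.join(directory, "combined.json"), "w") as f:
        json.dump({"merged": merged, "metadata": metadata}, f)

d = tempfile.mkdtemp()
store_separate(d)
store_combined(d)

with open(os.path.join(d, "combined.json")) as f:
    combined = json.load(f)
# Either way, the two halves stay aligned via their shared timestamps.
assert combined["merged"][0]["ts"] == combined["metadata"][0]["ts"]
```

Whether stored separately or combined, replay only needs the shared timestamps to re-synchronize the two halves.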
18. A method for providing a geographic location search-based augmented reality application executed on a CPU and a memory unit of an electronic system, the method comprising:
loading the geographic location search-based augmented reality application on the CPU and the memory unit of the electronic system, wherein the electronic system is an electronic goggle with an embedded display and an eye-movement tracking sensor for location-pointing, another wearable computer, or another electronic device, which is configured to retrieve previously-stored information associated with a particular geographic location;
selecting or pointing to the particular geographic location using the electronic system during an operation of the geographic location search-based augmented reality application;
checking whether a video file is associated with the particular geographic location in a cloud network storage or a local storage operatively connected to the electronic system; and
if the video file is found, checking whether the video file includes metadata for non-graphical and non-audio information;
if the metadata for non-graphical and non-audio information is included in the video file: extracting the metadata for the geographic location search-based augmented reality application using a user's general and/or specific preferences, and displaying dynamically-changeable geographic location-associated graphical information and non-graphical and non-audio information;
else, if the metadata for non-graphical and non-audio information is not included in the video file: replaying the video file, which does not have separate metadata for non-graphical and non-audio information.
(Dependent claims 19, 20 not shown.)
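Claim 18 is essentially a three-way branch: no video for the selected location, a video without metadata (plain replay), or a video with metadata (preference-filtered augmented replay). A sketch of that decision flow, where the storage layout, field names, and preference filter are all assumptions for illustration:

```python
# Hypothetical location-keyed storage: one entry carries metadata,
# the other does not (triggering the plain-replay branch of claim 18).
storage = {
    (48.8584, 2.2945): {
        "video": "eiffel.mp4",
        "metadata": [{"ts": 0.0, "gps": (48.8584, 2.2945), "poi": "Eiffel Tower"}],
    },
    (35.6586, 139.7454): {"video": "tokyo.mp4"},  # no metadata
}

def replay_for_location(location, preferences):
    """Resolve the claim-18 branches for a selected geographic location."""
    entry = storage.get(location)
    if entry is None:
        return None                               # no video file found
    metadata = entry.get("metadata")
    if metadata is None:
        # Video lacks separate non-graphical metadata: plain replay.
        return {"mode": "plain_replay", "video": entry["video"]}
    # Metadata present: extract only the fields the user's preferences select.
    selected = [{k: m[k] for k in preferences if k in m} for m in metadata]
    return {"mode": "augmented_replay", "video": entry["video"], "overlay": selected}

print(replay_for_location((48.8584, 2.2945), ["poi"]))
print(replay_for_location((35.6586, 139.7454), ["poi"]))
```

The preference filter is what makes the augmented replay dynamically changeable: re-running the extraction with different preferences changes the overlay without touching the stored file.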
Specification