System and method for sharing virtual and augmented reality scenes between users and viewers
Abstract
A preferred method for sharing user-generated virtual and augmented reality scenes can include receiving at a server a virtual and/or augmented reality (VAR) scene generated by a user mobile device. Preferably, the VAR scene includes visual data and orientation data, which includes a real orientation of the user mobile device relative to a projection matrix. The preferred method can also include compositing the visual data and the orientation data into a viewable VAR scene; locally storing the viewable VAR scene at the server; and in response to a request received at the server, distributing the processed VAR scene to a viewer mobile device.
107 Citations
20 Claims
1. An apparatus comprising:

a user interface having a display;

an orientation sensor configured to determine a first orientation of the user interface relative to a three-dimensional space;

an image capture subsystem configured to capture a plurality of images; and

a processor connected to the user interface, the orientation sensor, and the image capture subsystem, wherein the processor is configured to:

associate each image of the plurality of images with orientation data corresponding to the first orientation of the user interface;

correlate, at the user interface, the plurality of images based on the orientation data associated with each image of the plurality of images;

compress at least the correlated plurality of images at the user interface to generate a processed virtual or augmented reality (VAR) scene; and

transmit the processed VAR scene to a server.

(Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10)
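The capture-side pipeline in claim 1 (associate each image with the device's orientation, correlate the images by that orientation data, compress the result, transmit to a server) can be sketched as below. This is an illustrative reading only, not the patented implementation: the class, field, and function names are invented, "correlate" is rendered here as a simple sort by orientation, and the compressed JSON payload is a stand-in for whatever wire format a real client would use.

```python
import json
import zlib
from dataclasses import dataclass


@dataclass
class CapturedImage:
    """One captured image plus the device orientation recorded at capture
    time (yaw/pitch/roll in degrees). Field names are hypothetical."""
    pixels: bytes
    yaw: float
    pitch: float
    roll: float


def build_var_scene(images):
    """Correlate the images by their associated orientation data (here:
    sort by yaw, then pitch) and compress the correlated set into a
    single 'processed VAR scene' blob ready for transmission."""
    ordered = sorted(images, key=lambda im: (im.yaw, im.pitch))
    payload = json.dumps([
        {"orientation": {"yaw": im.yaw, "pitch": im.pitch, "roll": im.roll},
         "pixels": im.pixels.hex()}
        for im in ordered
    ]).encode()
    return zlib.compress(payload)


def transmit(scene_blob, upload):
    """Hand the processed scene to a transport callable (e.g. a function
    wrapping an HTTP POST to the server)."""
    return upload(scene_blob)
```

A viewer-agnostic blob like this keeps the orientation metadata attached to each image, which is what lets the server (claim 11) composite the images without re-deriving camera poses.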
11. A system comprising:

a server including a computer readable storage medium and a processor, in communication with one or more client devices; and

wherein the server is configured to:

receive a virtual or augmented reality (VAR) scene generated by a capture client device, wherein the VAR scene comprises a plurality of images captured by the capture client device, and wherein each image of the plurality of images is associated with orientation data corresponding to a first orientation of the capture client device relative to a three-dimensional space;

composite the plurality of images into a viewable VAR scene based on the orientation data associated with each image of the plurality of images;

store the viewable VAR scene; and

in response to a request, distribute the viewable VAR scene to a viewer client device.

(Dependent claims: 12, 13, 14, 20)
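The server behavior in claim 11 (receive, composite by orientation, store, distribute on request) can be sketched as follows. This is a minimal sketch under assumed conventions: orientation is reduced to (yaw, pitch) in degrees, "compositing" is modeled as binning images into a spherical index keyed by view direction, and the store is an in-memory dict; a real server would stitch pixels and persist durably.

```python
def composite_var_scene(images):
    """Composite images into a viewable scene index: each image is placed
    into a 30-degree (yaw, pitch) bin derived from its capture
    orientation, so imagery can later be looked up by view direction.
    The bin size and (yaw, pitch) representation are assumptions."""
    scene = {}
    for pixels, yaw, pitch in images:
        key = (round(yaw / 30) * 30 % 360, round(pitch / 30) * 30)
        scene.setdefault(key, []).append(pixels)
    return scene


class SceneStore:
    """Minimal store/distribute pair mirroring the claimed server steps."""

    def __init__(self):
        self._scenes = {}

    def store(self, scene_id, scene):
        """Locally store the viewable VAR scene under an identifier."""
        self._scenes[scene_id] = scene

    def distribute(self, scene_id):
        """In response to a request, return the stored viewable scene."""
        return self._scenes[scene_id]
```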
15. A system comprising:

a server including a computer readable storage medium and a processor, in communication with one or more client devices; and

a viewer client device including a user interface configured to display one or more virtual or augmented reality (VAR) scenes;

wherein the viewer client device is configured to:

receive a VAR scene from the server, wherein the VAR scene comprises a plurality of images captured by a capture client device, wherein each image of the plurality of images is associated with orientation data corresponding to an orientation of the capture client device relative to a three-dimensional frame of reference;

determine a first orientation of the viewer client device relative to a three-dimensional space;

determine a second orientation of the viewer client device relative to a nodal point;

using the orientation data corresponding to the orientation of the capture client device associated with each of the plurality of images in the VAR scene, orient the VAR scene displayable on the viewer client device to a viewer based on the first orientation and the second orientation of the viewer client device; and

display the VAR scene.

(Dependent claims: 16, 17, 18, 19)
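The orientation step in claim 15 combines three angles: the capture device's orientation baked into the scene, the viewer device's orientation relative to the world (first orientation), and its orientation relative to a nodal point (second orientation). A minimal sketch of one plausible composition rule is shown below; the additive yaw combination, degree units, and pitch clamp are all assumptions for illustration, not the claimed method.

```python
def orient_view(capture_yaw_offset, viewer_yaw, viewer_pitch, nodal_offset_deg=0.0):
    """Map the viewer device's first orientation (viewer_yaw, viewer_pitch,
    relative to three-dimensional space) and second orientation
    (nodal_offset_deg, relative to a nodal point) into a look direction
    within the VAR scene, correcting for the capture device's yaw offset.
    All angles are in degrees."""
    yaw = (viewer_yaw + nodal_offset_deg - capture_yaw_offset) % 360
    pitch = max(-90.0, min(90.0, viewer_pitch))  # clamp to valid elevation
    return yaw, pitch
```

The returned (yaw, pitch) pair is what a display loop would use to select the portion of the composited scene to render, so that turning the physical device pans the scene.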
Specification