VIRTUAL SPECTATOR EXPERIENCE WITH A PERSONAL AUDIO/VISUAL APPARATUS
Abstract
Technology is described for providing a virtual spectator experience for a user of a personal A/V apparatus including a near-eye, augmented reality (AR) display. A position volume of an event object participating in an event in a first 3D coordinate system for a first location is received and mapped to a second position volume in a second 3D coordinate system at a second location remote from where the event is occurring. A display field of view of the near-eye AR display at the second location is determined, and real-time 3D virtual data representing the one or more event objects which are positioned within the display field of view are displayed in the near-eye AR display. A user may select a viewing position from which to view the event. Additionally, virtual data of a second user may be displayed at a position relative to a first user.
332 Citations

Claims (20)
1. A method for providing a virtual spectator experience of an event for viewing with a near-eye, augmented reality display of a personal audiovisual (A/V) apparatus comprising:
receiving in real time one or more positions of one or more event objects participating in the event occurring at a first location remote from a second location;
mapping the one or more positions of the one or more event objects in a first 3D coordinate system for the first location to a second 3D coordinate system for the second location remote from the first location;
determining a display field of view of a near-eye, augmented reality display of a personal A/V apparatus being worn by a user at the second location; and
sending in real time 3D virtual data representing the one or more event objects which are within the display field of view to the personal A/V apparatus at the second location. - View Dependent Claims (2, 3, 4, 5, 6, 7)
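The mapping limitation above can be pictured as rescaling a position from the venue's coordinate system into the spectator-side coordinate system. This is a minimal sketch, not the patented method itself: the function name, the bounding-box representation of each "event space," and the linear normalize-and-rescale mapping are all illustrative assumptions.

```python
import numpy as np

def map_position(p_first, first_space, second_space):
    """Map a point from the event venue's 3D coordinate system into the
    spectator-side coordinate system by normalizing against the first
    event space's bounds and rescaling into the second event space.
    first_space / second_space are (min_corner, max_corner) bounds;
    both are illustrative stand-ins for the claimed coordinate systems."""
    f_min, f_max = (np.asarray(a, dtype=float) for a in first_space)
    s_min, s_max = (np.asarray(a, dtype=float) for a in second_space)
    t = (np.asarray(p_first, dtype=float) - f_min) / (f_max - f_min)  # normalized position
    return s_min + t * (s_max - s_min)

# A player at midfield of a 100 m x 60 m pitch maps to the center of a
# 2 m x 1.2 m tabletop model of the pitch at the second location.
venue = ((0.0, 0.0, 0.0), (100.0, 60.0, 10.0))
tabletop = ((0.0, 0.0, 0.0), (2.0, 1.2, 0.2))
result = map_position((50.0, 30.0, 0.0), venue, tabletop)  # result ≈ [1.0, 0.6, 0.0]
```

In a real system the map would also account for orientation and a user-chosen anchor point, but the core idea of expressing venue positions in the second location's coordinate system is the same.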
8. An apparatus for providing a virtual spectator experience of an event for viewing with a near-eye, augmented reality display of a personal audiovisual (A/V) apparatus comprising:
a first set of one or more processors having access to one or more memories storing a first three-dimensional (3D) coordinate system for a first location where an event is occurring in a first event space and storing a second 3D coordinate system for a second location having a second event space for hosting a same type of event as the event occurring at the first location;
the first set of one or more processors being communicatively coupled to a second set of one or more processors for receiving in real time 3D virtual data representing one or more event objects participating in the event at the first location and one or more positions of the one or more event objects in the first 3D coordinate system for the first location;
the first set of one or more processors mapping the 3D virtual data representing the one or more event objects participating in the event at the first location to one or more positions in the second 3D coordinate system for the second location; and
the first set of one or more processors being communicatively coupled to the personal A/V apparatus including the near-eye, augmented reality display located at the second location, and the first set of one or more processors sending the 3D virtual data representing the one or more event objects participating in the event with their respective one or more positions in the second 3D coordinate system for the second location to the personal A/V apparatus. - View Dependent Claims (13, 14, 15, 16, 17, 19)
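Claims 1 and 8 both send only the event objects that fall within the display field of view. One simple way to sketch that culling step is a view-cone test against the wearer's eye position and gaze direction; the cone model, function name, and parameters below are illustrative assumptions, not the patent's definition of a display field of view.

```python
import math

def in_display_field_of_view(obj_pos, eye_pos, forward, fov_deg=60.0, max_dist=50.0):
    """Rough test of whether an event object falls within the near-eye
    display's field of view, modeled here as a view cone of total angle
    fov_deg around the gaze direction, limited to max_dist meters."""
    v = [o - e for o, e in zip(obj_pos, eye_pos)]
    dist = math.sqrt(sum(c * c for c in v))
    if dist == 0.0 or dist > max_dist:
        return dist == 0.0  # coincident with the eye counts as visible
    f_norm = math.sqrt(sum(c * c for c in forward))
    cos_angle = sum(a * b for a, b in zip(v, forward)) / (dist * f_norm)
    return cos_angle >= math.cos(math.radians(fov_deg / 2.0))

# Only objects inside the cone would be sent to the personal A/V apparatus.
objects = {"player_7": (0.0, 0.0, 5.0), "player_9": (5.0, 0.0, -1.0)}
eye, forward = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)
visible = {k: p for k, p in objects.items() if in_display_field_of_view(p, eye, forward)}
```

A production renderer would use a full view frustum with near/far planes rather than a cone, but the filtering role is the same: limit real-time transmission to objects the user can currently see.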
10. The apparatus of claim 9 further comprising:
the 3D coordinate system for the first location and the 3D coordinate system for the second location are view independent coordinate systems;
the first set of one or more processors determining a display field of view for the near-eye, augmented reality display of the personal A/V apparatus in the second 3D coordinate system based on sensor data received from the personal A/V apparatus; and
the first set of one or more processors transforming the 3D virtual data representing the one or more event objects participating in the event from the view independent 3D coordinate system for the second location to a view dependent coordinate system for the near-eye, augmented reality display of the personal A/V apparatus and sending the transformed 3D virtual data to the personal A/V apparatus. - View Dependent Claims (11, 12)
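The view-independent to view-dependent transform in claim 10 is, in graphics terms, a world-to-view transform driven by the wearer's head pose. The sketch below assumes a yaw-only head rotation for brevity; a full implementation would use the complete 3D head orientation from the apparatus's sensors.

```python
import numpy as np

def world_to_view(p_world, head_pos, yaw_rad):
    """Transform a point from the view-independent coordinate system of the
    second location into a view-dependent coordinate system for the display
    by inverse-translating and inverse-rotating by the wearer's head pose
    (yaw about the vertical y axis only, as a simplifying assumption)."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    # Inverse (transpose) of a yaw rotation about the y axis.
    r_inv = np.array([[c, 0.0, -s],
                      [0.0, 1.0, 0.0],
                      [s, 0.0, c]])
    return r_inv @ (np.asarray(p_world, dtype=float) - np.asarray(head_pos, dtype=float))

# With no head rotation, a point 2 m in front of the head lands 2 m down
# the view-space z axis.
p_view = world_to_view((1.0, 0.0, 3.0), head_pos=(1.0, 0.0, 1.0), yaw_rad=0.0)
```

Doing this transform server-side, as the claim describes, lets the personal A/V apparatus receive data already expressed in its own display coordinates.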
18. One or more processor readable storage devices comprising instructions which cause one or more processors to execute a method for providing a virtual spectator experience of an event using a near-eye, augmented reality display of a first personal A/V apparatus, the method comprising:
receiving user input selecting a viewing position at a first location where the event is occurring;
requesting and receiving three-dimensional (3D) virtual data of the event including 3D virtual data for the viewing position at the first location over a communication network from one or more computer systems which generate the 3D virtual data of the event for the viewing position based on received image data and depth data from capture devices at the first location;
determining a head position and orientation of a first user wearing the near-eye, augmented reality display;
selecting 3D virtual data as current 3D virtual data for display in the near-eye, augmented reality display based on the head position and orientation; and
displaying the current 3D virtual data in the near-eye, augmented reality display. - View Dependent Claims (20)
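The "selecting 3D virtual data as current 3D virtual data ... based on the head position and orientation" step of claim 18 can be sketched as choosing, from prefetched renderings, the one whose viewpoint best matches the wearer's current head yaw. The cache structure, keys, and function name here are illustrative assumptions rather than the patent's data model.

```python
def select_current_virtual_data(view_cache, head_yaw_deg):
    """Pick which prefetched 3D virtual data to display by choosing the
    cached rendering whose viewpoint yaw (in degrees) is closest to the
    wearer's current head yaw, wrapping correctly around 360 degrees."""
    def angular_diff(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return min(view_cache, key=lambda yaw: angular_diff(yaw, head_yaw_deg))

# Hypothetical cache of renderings received for the selected viewing position.
cache = {0.0: "front_view_mesh", 90.0: "left_view_mesh", 270.0: "right_view_mesh"}
key = select_current_virtual_data(cache, 350.0)  # nearest cached yaw is 0.0
```

In practice the selection would consider full 6-DoF head pose and interpolate between viewpoints, but the principle, indexing received virtual data by head orientation, matches the claimed step.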
Specification