Systems and methods for a plurality of users to interact with augmented or virtual reality systems
First Claim
1. A method for a first user located in a first real geographic location and a second user located in a second real geographic location different from the first real geographic location to interact within a virtual world comprising virtual world data, the method comprising:
sensing a plurality of inanimate real world objects at the first real geographic location of a first user device, the first user device comprising at least a first head mounted display device having an environment-sensing system, the plurality of inanimate real world objects being sensed in one or more fields of view of the first user;
creating an avatar for a first user accessing a virtual world through the first user device at the first real geographic location, wherein the virtual world includes a virtual representation of the plurality of inanimate real world objects as the plurality of inanimate real world objects appear in the first real geographical location;
placing the avatar in the virtual world;
displaying the avatar in an augmented reality view of the virtual world to a second user accessing the virtual world through a second user device from a second real geographical location, the second user device comprising at least a second head mounted display device, wherein the avatar is animated based on at least input from the first head mounted display device;
identifying one or more inanimate real world objects of the plurality of real world objects for display by at least receiving one or more inputs from the first user to select one or more inanimate real world objects of the plurality of inanimate real world objects for display or receiving one or more inputs from the first user to remove one or more inanimate real world objects of the plurality of inanimate real world objects from display;
displaying one or more virtual representations of the one or more inanimate real world objects as the one or more inanimate real world objects appear in the first real geographical location to the second user through the second user device, wherein the one or more inanimate real world objects sensed at the first real geographic location by the first user device appear to be physically present in the augmented reality view of the virtual world displayed to the second user using the second user device, thereby allowing the second user to experience the first real geographical location through the second user device from the second real geographical location; and
facilitating interaction between the first user and second user in the virtual world through the first user device and the second user device.
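The method steps above can be sketched as a minimal data model: sensing populates a shared virtual world, first-user inputs select or remove objects for display, and the second user's device renders the avatar plus the selected objects. All class and method names here are illustrative assumptions; the patent specifies no API.

```python
from dataclasses import dataclass, field

# Hypothetical names throughout; the claim describes behavior, not an API.

@dataclass
class RealObject:
    """An inanimate real-world object sensed by the first user's headset."""
    name: str
    position: tuple  # position in the first user's real geographic location

@dataclass
class Avatar:
    user: str
    pose: str = "idle"  # animated from first head mounted display input

@dataclass
class VirtualWorld:
    """Shared virtual world data, with per-object display selection."""
    objects: dict = field(default_factory=dict)   # name -> RealObject
    displayed: set = field(default_factory=set)   # names selected for display
    avatars: list = field(default_factory=list)

    def sense(self, objs):
        # Step 1: sense inanimate objects in the first user's field of view.
        for o in objs:
            self.objects[o.name] = o

    def select_for_display(self, name):
        # First-user input selecting an object for display to the second user.
        if name in self.objects:
            self.displayed.add(name)

    def remove_from_display(self, name):
        # First-user input removing an object from display.
        self.displayed.discard(name)

    def remote_view(self):
        # What the second user's device renders: the first user's avatar plus
        # virtual representations of the selected objects, as they appear at
        # the first real geographic location.
        return {
            "avatars": [a.user for a in self.avatars],
            "objects": sorted(self.displayed),
        }

world = VirtualWorld()
world.sense([RealObject("desk", (0, 0, 0)), RealObject("lamp", (1, 0, 0))])
world.avatars.append(Avatar("first_user"))
world.select_for_display("desk")
world.select_for_display("lamp")
world.remove_from_display("lamp")
print(world.remote_view())  # {'avatars': ['first_user'], 'objects': ['desk']}
```

The sketch keeps selection state separate from the sensed objects, mirroring the claim's distinction between sensing a plurality of objects and identifying the subset for display.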
3 Assignments
0 Petitions
Abstract
One embodiment is directed to a system for enabling two or more users to interact within a virtual world comprising virtual world data, comprising a computer network comprising one or more computing devices, the one or more computing devices comprising memory, processing circuitry, and software stored at least in part in the memory and executable by the processing circuitry to process at least a portion of the virtual world data; wherein at least a first portion of the virtual world data originates from a first user virtual world local to a first user, and wherein the computer network is operable to transmit the first portion to a user device for presentation to a second user, such that the second user may experience the first portion from the location of the second user, such that aspects of the first user virtual world are effectively passed to the second user.
185 Citations
20 Claims
1. (Recited in full above as the First Claim.) - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
12. A system for implementing interaction between a first user located in a first real geographic location and a second user located in a second real geographic location different from the first real geographic location within a virtual world comprising virtual world data, the system comprising:
a first user device having a first processor; and a second user device having a second processor, the first and second user devices being operatively coupled; wherein the first processor is configured to:
sense a plurality of inanimate real world objects at the first real geographic location of a first user device, the first user device comprising at least a first head mounted display device having an environment-sensing system, the plurality of inanimate real world objects being sensed in one or more fields of view of the first user;
create an avatar for a first user accessing a virtual world through the first user device at the first real geographic location, wherein the virtual world includes a virtual representation of the plurality of inanimate real world objects as the plurality of inanimate real world objects appear in the first real geographical location;
place the avatar in the virtual world; and
identify one or more inanimate real world objects of the plurality of real world objects for display by at least receiving one or more inputs from the first user to select one or more inanimate real world objects of the plurality of inanimate real world objects for display or receiving one or more inputs from the first user to remove one or more inanimate real world objects of the plurality of inanimate real world objects from display;
wherein the second processor is configured to instruct the second user device to:
display the avatar in an augmented reality view of the virtual world to a second user accessing the virtual world through a second user device from a second real geographical location, the second user device comprising at least a second head mounted display device, wherein the avatar is animated based on at least input from the first head mounted display device; and
display one or more virtual representations of the one or more inanimate real world objects as the one or more inanimate real world objects appear in the first real geographical location to the second user through the second user device, wherein the one or more inanimate real world objects sensed at the first real geographic location by the first user device appear to be physically present in the augmented reality view of the virtual world displayed to the second user using the second user device, thereby allowing the second user to experience the first real geographical location through the second user device from the second real geographical location;
and wherein the first and second processors are configured to facilitate interaction between the first user and second user in the virtual world through the first user device and the second user device. - View Dependent Claims (13, 14, 15, 16)
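Claim 12's division of labor can be sketched as follows: the first user device senses and filters what is shared, while the second user device only renders the payload it receives over the operative coupling. Class and method names are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch of the two-processor split in claim 12. The first
# device senses and identifies objects for display; the second device's
# processor composites the received data into its augmented reality view.

class FirstUserDevice:
    def __init__(self):
        self.sensed = []          # objects from the headset's
        self.selected = []        # environment-sensing system
        self.avatar_pose = "idle"  # driven by head mounted display input

    def sense(self, objects):
        self.sensed = list(objects)

    def identify_for_display(self, keep):
        # Inputs from the first user select (or, equivalently, remove)
        # objects; only the kept subset is shared with the second user.
        self.selected = [o for o in self.sensed if o in keep]

    def payload(self):
        # Data transmitted over the operative coupling between devices.
        return {"avatar_pose": self.avatar_pose, "objects": self.selected}

class SecondUserDevice:
    def display(self, payload):
        # The second processor instructs the headset to composite the avatar
        # and object representations into the augmented reality view.
        return (f"avatar[{payload['avatar_pose']}] + "
                + ", ".join(payload["objects"]))

first = FirstUserDevice()
second = SecondUserDevice()
first.sense(["chair", "table", "plant"])
first.identify_for_display({"chair", "table"})
view = second.display(first.payload())  # "avatar[idle] + chair, table"
```

Note that the removed object ("plant") never leaves the first device, which matches the claim's placement of the identify step on the first processor.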
17. A system for implementing interaction between a first user located in a first real geographic location and a second user located in a second real geographic location different from the first real geographic location in a virtual world comprising virtual world data, the system comprising:
a server having a server processor and being operatively coupled to a first user device and a second user device, wherein the server operatively couples the first user device and the second user device, wherein the server processor is configured to:
sense a plurality of inanimate real world objects at the first real geographic location of a first user device, the first user device comprising at least a first head mounted display device having an environment-sensing system, the plurality of inanimate real world objects being sensed in one or more fields of view of the first user;
create an avatar for a first user accessing a virtual world through the first user device at the first real geographic location, wherein the virtual world includes a virtual representation of the plurality of inanimate real world objects as the plurality of inanimate real world objects appear in the first real geographical location;
place the avatar in the virtual world;
instruct the second user device to display the avatar in an augmented reality view of the virtual world to a second user accessing the virtual world through a second user device from a second real geographical location, the second user device comprising at least a second head mounted display device, wherein the avatar is animated based on at least input from the first head mounted display device;
identify one or more inanimate real world objects of the plurality of real world objects for display by at least receiving one or more inputs from the first user to select one or more inanimate real world objects of the plurality of inanimate real world objects for display or receiving one or more inputs from the first user to remove one or more inanimate real world objects of the plurality of inanimate real world objects from display;
instruct the second user device to display one or more virtual representations of the one or more inanimate real world objects as the one or more inanimate real world objects appear in the first real geographical location to the second user through the second user device, wherein the one or more inanimate real world objects sensed at the first real geographic location by the first user device appear to be physically present in the augmented reality view of the virtual world displayed to the second user using the second user device, thereby allowing the second user to experience the first real geographical location through the second user device from the second real geographical location; and
facilitate interaction between the first user and second user in the virtual world through the first user device and the second user device. - View Dependent Claims (18, 19, 20)
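In contrast to claim 12, claim 17 centralizes the logic: a server couples the two devices, holds the virtual world data, and issues display instructions to the second user device. The sketch below assumes a trivial in-memory server; all names are illustrative.

```python
# Hypothetical sketch of claim 17's server-mediated topology. The server
# ingests sensing data from the first user device, records display
# selections, and instructs the second user device what to render.

class Server:
    def __init__(self):
        # Virtual world data held at the server, keyed by originating user.
        self.world = {"objects": {}, "avatars": {}}

    def ingest(self, user, sensed_objects):
        # Sensing arrives from the first user device's headset; every
        # sensed object starts out flagged for display.
        self.world["objects"][user] = {o: True for o in sensed_objects}
        self.world["avatars"][user] = "idle"

    def set_display(self, user, obj, shown):
        # First-user input selects (shown=True) or removes (shown=False)
        # an object from display to the second user.
        if obj in self.world["objects"].get(user, {}):
            self.world["objects"][user][obj] = shown

    def instruct_display(self, source_user):
        # Instruction sent to the second user device: render the avatar
        # and only the objects still flagged for display.
        objs = self.world["objects"].get(source_user, {})
        return {
            "avatar": self.world["avatars"].get(source_user),
            "objects": sorted(o for o, shown in objs.items() if shown),
        }

server = Server()
server.ingest("first_user", ["whiteboard", "mug"])
server.set_display("first_user", "mug", False)
instruction = server.instruct_display("first_user")
# instruction -> {'avatar': 'idle', 'objects': ['whiteboard']}
```

Because the selection state lives on the server rather than on either device, the same instruction can be replayed to any number of additional second-user devices, which is the practical difference from the peer-coupled system of claim 12.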
Specification