Maintaining multiple views on a shared stable virtual space
Abstract
Methods for controlling a virtual-scene view in a portable device are presented. In one method, a signal is received and the device is synchronized to make the location of the device a reference point in a three-dimensional (3D) space. A virtual scene with virtual reality elements is generated around the reference point. The current position of the device in the 3D space, with respect to the reference point, is determined, and a view of the virtual scene is created. The view represents the virtual scene as seen from the current position of the device, with a viewing angle based on the position of the device. The created view is displayed on the device, and the view of the virtual scene changes as the device is moved within the 3D space. In another method, multiple players share the virtual reality and interact with each other in the virtual reality.
36 Claims
1. A method for controlling a view of a virtual scene with a portable device, the method comprising:

synchronizing the portable device to a reference point in a physical three-dimensional (3D) space, the reference point being a point in space occupied by the portable device when a signal to synchronize is received by the portable device;
capturing images of the physical 3D space with a camera of the portable device in response to the synchronizing;
tracking a current position of the portable device in the physical 3D space with respect to the reference point, the tracking utilizing image recognition of the captured images and inertial information obtained by inertial sensors in the portable device;
generating a virtual scene defined around the reference point, the virtual scene including virtual reality elements; and
displaying a view of the virtual scene in a display of the portable device based on the current position.

Dependent claims: 2-14.
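For illustration only (not claim language): the synchronize-then-track limitations amount to fixing an origin at the device's position at sync time and fusing camera-based and inertial position estimates against that origin. The sketch below assumes a simple weighted blend of the two estimates; the class, field names, and the 0.8 weighting are all invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pose:
    x: float
    y: float
    z: float

class ReferenceTracker:
    """Tracks device position relative to the reference point set at sync time."""

    def __init__(self, vision_weight: float = 0.8):
        self.reference: Optional[Pose] = None
        self.current = Pose(0.0, 0.0, 0.0)
        self.w = vision_weight

    def synchronize(self) -> None:
        # The point the device occupies right now becomes the origin.
        self.reference = Pose(0.0, 0.0, 0.0)
        self.current = Pose(0.0, 0.0, 0.0)

    def update(self, vision: Pose, inertial: Pose) -> Pose:
        # Blend the image-recognition estimate with inertial dead reckoning.
        if self.reference is None:
            raise RuntimeError("synchronize() must be called first")
        w = self.w
        self.current = Pose(
            w * vision.x + (1 - w) * inertial.x,
            w * vision.y + (1 - w) * inertial.y,
            w * vision.z + (1 - w) * inertial.z,
        )
        return self.current

tracker = ReferenceTracker()
tracker.synchronize()
pose = tracker.update(Pose(1.0, 0.0, 0.0), Pose(1.2, 0.0, 0.0))
```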
15. A method comprising:
synchronizing a portable device to a reference point in a physical three-dimensional (3D) space, the reference point being a point in space occupied by the portable device when a signal to synchronize is received by the portable device;
capturing images of the physical 3D space with a camera of the portable device in response to the synchronizing;
generating a virtual scene defined around the reference point, the virtual scene including virtual reality elements;
determining a current position in the physical 3D space of the portable device with respect to the reference point;
displaying a view of the virtual scene in the display of the portable device based on the current position;
tracking a position of a hand of a player in the physical 3D space based on image recognition of the hand in the captured images;
detecting when the position of the hand is where a first virtual element is located; and
enabling, after the detecting, an interaction of the hand with the first virtual element to simulate that the hand is touching the virtual element, wherein the hand is able to manipulate the first virtual element to change a position or a property of the first virtual element as if the first virtual element were a real object.

Dependent claims: 16, 17.
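For illustration only (not claim language): the hand-interaction limitations of claim 15 can be sketched as a hit test between the tracked hand position and a virtual element, after which the element follows the hand as if grabbed. The sphere-based hit test, the class, and all names below are assumptions.

```python
import math

class VirtualElement:
    """A virtual scene element the tracked hand can touch and manipulate."""

    def __init__(self, position, radius=0.1):
        self.position = list(position)  # meters, relative to the reference point
        self.radius = radius            # simple spherical touch volume
        self.grabbed = False

    def touches(self, hand_pos) -> bool:
        # "Detecting when the position of the hand is where the element is."
        return math.dist(self.position, hand_pos) <= self.radius

    def interact(self, hand_pos) -> None:
        # Once touching, the hand manipulates the element like a real object.
        if self.touches(hand_pos) or self.grabbed:
            self.grabbed = True
            self.position = list(hand_pos)

cube = VirtualElement(position=(0.0, 0.0, 0.5))
cube.interact((0.0, 0.0, 0.45))   # hand reaches the element: grab
cube.interact((0.2, 0.1, 0.45))   # element follows the hand
```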
18. A method for sharing a virtual scene among devices, the method comprising:
calculating, by a first device, a first location of the first device relative to a reference point in a physical three-dimensional (3D) space;
calculating, by the first device, a second location in the physical 3D space of a second device relative to the first location of the first device, wherein the first device and the second device are handheld devices;
sending information from the first device to the second device identifying the reference point in the physical 3D space, the information including the reference point, the first location, and the second location;
generating a virtual scene, augmenting the physical 3D space, situated around the reference point, the virtual scene being common to both devices and presented on respective displays of the first device and the second device, the virtual scene changing simultaneously in both devices in response to interaction from the first device or from the second device;
creating a first view of the virtual scene as seen from a current position of the first device;
displaying the first view in the display of the first device; and
changing the displayed first view of the virtual scene as the first device moves within the physical 3D space.

Dependent claims: 19-22.
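For illustration only (not claim language): the information-sharing step of claim 18 can be sketched as the first device composing a message carrying the reference point, its own location, and the second device's location, the latter derived by adding the measured device-to-device offset to the first location. The message layout and all names are assumptions.

```python
def make_sync_message(reference_point, first_location, second_relative):
    """Build the claim-18-style message the first device sends to the second.

    second_relative is the second device's position as measured from the
    first device, so its absolute location is first_location + offset.
    """
    second_location = tuple(f + o for f, o in zip(first_location, second_relative))
    return {
        "reference_point": tuple(reference_point),
        "first_location": tuple(first_location),
        "second_location": second_location,
    }

msg = make_sync_message(
    reference_point=(0.0, 0.0, 0.0),
    first_location=(1.0, 0.0, 0.0),
    second_relative=(0.5, 2.0, 0.0),  # second device as seen from the first
)
```

With this message, both handhelds can situate the same virtual scene around one common origin.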
23. A method for controlling a view of a virtual scene with a first device, the method comprising:
calculating a first location of a first device relative to a first reference point in a first physical three-dimensional (3D) space;
establishing a communications link between the first device and a second device, the second device being in a second physical 3D space outside the first physical 3D space, the second device having a second location relative to a second reference point in the second physical 3D space, wherein the first device and the second device are handheld devices;
sending from the first device a first image of a first user associated with the first device and receiving by the first device a second image of a second user associated with the second device, the first user and the second user being in different locations;
generating a common virtual scene that includes virtual reality elements, the common virtual scene being presented on a first display of the first device and a second display of the second device, the first device building the common virtual scene around the first reference point, the second device building the common virtual scene around the second reference point, both devices being able to interact with the virtual reality elements;
determining a current position in the first physical 3D space of the first device with respect to the first reference point;
creating a view of the common virtual scene, wherein the view represents the common virtual scene as seen from the current position of the first device;
blending the second image of the second user into the view of the common virtual scene to create a proximity effect that simulates that the second user is near the first user;
displaying the view of the common virtual scene in the display of the first device; and
changing the displayed view of the common virtual scene as the first device moves within the first physical 3D space.

Dependent claims: 24-29.
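For illustration only (not claim language): the blending step of claim 23 resembles alpha compositing of the remote user's image over the local view of the common scene. The per-pixel sketch below, with plain RGB tuples and a 0.5 alpha, is an assumption about one possible implementation; real code would operate on whole camera frames.

```python
def blend_pixel(view_px, user_px, alpha=0.5):
    """Composite one pixel of the remote user's image over the scene view.

    alpha controls how strongly the remote user appears "near" the local
    user (the proximity effect); 0.0 shows only the scene, 1.0 only the user.
    """
    return tuple(
        round(alpha * u + (1 - alpha) * v) for v, u in zip(view_px, user_px)
    )

# Scene pixel (10, 20, 30) blended with a pixel of the remote user's image.
blended = blend_pixel(view_px=(10, 20, 30), user_px=(200, 100, 0))
```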
30. A method for controlling a view of a virtual scene with a portable device, the method comprising:
synchronizing the portable device to a reference point in a physical three-dimensional (3D) space, the reference point being a point in space occupied by the portable device when a signal to synchronize is received by the portable device, the portable device including a front camera facing the front of the portable device and a rear camera facing the rear of the portable device, the portable device being a handheld device;
capturing images of the physical 3D space with a camera of the portable device in response to the synchronizing;
tracking a current position of the portable device in the physical 3D space with respect to the reference point, the tracking utilizing image recognition of the captured images and inertial information obtained by inertial sensors in the portable device;
generating a virtual scene defined around the reference point, the virtual scene including virtual reality elements;
creating a view of the virtual scene based on a current position of the portable device, the view capturing a representation of the virtual scene as seen from a current eye position in the physical 3D space of a player holding the portable device, the capturing corresponding to what the player would see through a window into the virtual scene, a window's position in the physical 3D space being equal to a position in the physical 3D space of a display in the portable device;
displaying the created view in the display; and
changing the displayed view of the virtual scene as the portable device or the player move within the physical 3D space, wherein a change in a position of an eye of the player holding the portable device while keeping the portable device stationary causes a change of the displayed view.

Dependent claims: 31, 32.
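For illustration only (not claim language): the "window" limitation of claim 30 means the view direction runs from the player's eye through the display's position in the room, so moving the eye while the device stays stationary changes the displayed view. The vector sketch below and its names are assumptions.

```python
def view_direction(eye_pos, display_pos):
    """Unit vector from the player's eye through the display ("window") plane."""
    d = [w - e for e, w in zip(eye_pos, display_pos)]
    norm = sum(c * c for c in d) ** 0.5
    return tuple(c / norm for c in d)

# The device (and hence the "window") stays put; only the eye moves,
# so the direction looked through the window, and thus the view, changes.
v1 = view_direction(eye_pos=(0.0, 0.0, -0.5), display_pos=(0.0, 0.0, 0.0))
v2 = view_direction(eye_pos=(0.2, 0.0, -0.5), display_pos=(0.0, 0.0, 0.0))
```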
33. A method for controlling a view of a scene with a portable device, the method comprising:
synchronizing the portable device to make a location where the portable device is located a reference point in a physical three-dimensional (3D) space, the reference point being a point in space occupied by the portable device when a signal to synchronize is received by the portable device;
generating a virtual scene in a display of the portable device when viewing the physical 3D space via the portable device, the virtual scene defined around the reference point, the virtual scene including virtual reality elements;
creating a view of the virtual scene as the portable device is moved away from the reference point, wherein the view represents the virtual scene as seen from a current position of the portable device, wherein the view created is independent of a position of an eye of a user holding the portable device;
displaying the created view in the portable device; and
changing the displayed view of the virtual scene as the portable device moves within the physical 3D space.

Dependent claims: 34, 35.
36. A computer program embedded in a non-transitory computer-readable storage medium, when executed by one or more processors, for sharing a virtual scene among devices, the computer program comprising:
program instructions for calculating, by a first device, a first location of a first device relative to a reference point in a physical three-dimensional (3D) space, the first device being a handheld device;
program instructions for calculating, by the first device, a second location in the physical 3D space of a second device relative to the first location of the first device;
program instructions for sending information from the first device to the second device identifying the reference point in the physical 3D space, the information including the reference point, the first location, and the second location;
program instructions for generating a virtual scene, augmenting the physical 3D space, situated around the reference point, the virtual scene being common to both devices and presented on respective displays of the first device and the second device, the virtual scene changing simultaneously in both devices in response to interaction from the first device or from the second device;
program instructions for creating a first view of the virtual scene as seen from a current position of the first device;
program instructions for displaying the first view in the display of the first device; and
program instructions for changing the displayed first view of the virtual scene as the first device moves within the physical 3D space.
Specification