Virtual reality and augmented reality functionality for mobile devices
Abstract
Systems and methods improve virtual reality and augmented reality functionality for mobile devices using radio frequency (RF) signals transmitted by a tracked device and received at four or more spatially separated antennae. These antennae are connected, wirelessly or through wired connections, to a base station. Through RF signal time of arrival information acquired at the antennae, the base station can continuously determine accurate position information of the tracked device, without lighting or line of sight limitations experienced by camera and other optical systems. As the position of the RF-transmitting tracked device is registered within a virtual environment produced by an interactive software program in communication with (or part of) the base station, the virtual viewpoint of the tracked device is controlled to reflect the relative position and orientation of the tracked device with respect to the virtual environment produced by the software program and displayed on a view screen.
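The abstract describes computing device position from RF time-of-arrival information at four or more antennae but does not fix an algorithm. As a non-authoritative sketch, assuming receiver clocks synchronized with the transmitter so that each time of arrival yields a range, the position can be recovered by linearized multilateration (the function and variable names here are illustrative, not from the patent):

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def locate(antennae, toas):
    """Estimate the 3-D position of an RF emitter from one-way
    time-of-arrival measurements at four or more known antennae.

    antennae: (N, 3) array of antenna coordinates, N >= 4
    toas: length-N array of propagation times in seconds
    """
    a = np.asarray(antennae, dtype=float)
    r = SPEED_OF_LIGHT * np.asarray(toas, dtype=float)  # ranges in meters
    # |x - a_i|^2 = r_i^2; subtracting the first equation eliminates |x|^2,
    # leaving the linear system 2(a_0 - a_i) . x = r_i^2 - r_0^2 - |a_i|^2 + |a_0|^2
    A = 2.0 * (a[0] - a[1:])
    b = r[1:] ** 2 - r[0] ** 2 - np.sum(a[1:] ** 2, axis=1) + np.sum(a[0] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With exactly four non-coplanar antennae the system is square and exact; extra antennae are absorbed by the least-squares solve, which is one reason the claims recite "four or more."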
20 Claims
1. A system for simulating a user in a digital world produced and displayed on a view screen by an interactive software program, the system comprising:
a mobile, tracked device attached to the user adapted to transmit an RF signal;
an inertial sensor attached to the tracked device adapted to sense inertia of the tracked device and to provide the sensed inertia information to the tracked device to be transmitted in the RF signal;
a network of four or more spatially separated radio frequency (RF) receiver antennae at different locations, each of the RF receiver antennae receiving RF signals transmitted from the tracked device; and
a base station in communication with the network of four or more spatially separated RF receiver antennae, the base station including one or more receiver channels for processing the RF signals acquired by the four or more spatially separated RF receiver antennae and a processor programmed to dynamically compute, as the tracked device moves within a physical environment, a position and orientation of the tracked device from information conveyed by the RF signals,
wherein, while the tracked device moves within the physical environment, the base station sends the computed position and orientation of the tracked device to the interactive software program, and the interactive software program uses the computed position and orientation to determine a virtual viewpoint of the tracked device and to adjust an image displayed on the view screen in accordance with the virtual viewpoint of the tracked device or another viewpoint selected by the user.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. A method for improving virtual reality and augmented reality interactions with a tracked device, the method comprising:
displaying, on a view screen, a software-generated environment from a three-dimensional virtual viewpoint selected by the user, which may be the viewpoint of the tracked device;
receiving, on at least four spatially separated receiver antennae, radio-frequency (RF) signals transmitted by the tracked device;
dynamically computing a three-dimensional physical position and orientation of the tracked device, based on information conveyed by the RF signals received by the receiver antennae, which includes inertial information from an inertial sensor, as the tracked device moves within a physical environment; and
while dynamically computing, in real time, the three-dimensional physical position and the orientation of the tracked device:
dynamically updating the three-dimensional virtual viewpoint of the tracked device based on the dynamically computed three-dimensional physical position and orientation of the tracked device; and
dynamically updating the display of the software-generated environment on the view screen in accordance with the dynamically updated three-dimensional virtual viewpoint of the tracked device, or other user-selected viewpoint.
- View Dependent Claims (11, 12, 13, 14, 15, 16, 17, 18)
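Claim 10's final steps map the computed physical pose to a virtual viewpoint. One conventional way to realize that mapping (a sketch only, not the patented method; names are illustrative) is to build a world-to-camera view matrix from the computed position and an orientation quaternion:

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

def view_matrix(position, orientation):
    """Build a 4x4 world-to-camera (view) matrix from the tracked
    device's computed position and orientation quaternion, so the
    rendered scene follows the device's virtual viewpoint."""
    R = quat_to_matrix(orientation)
    M = np.eye(4)
    M[:3, :3] = R.T                                    # inverse rotation
    M[:3, 3] = -R.T @ np.asarray(position, dtype=float)  # inverse translation
    return M
```

On each tracking update, the interactive software program would recompute this matrix and re-render, which is the "dynamically updating the display" step of the claim.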
19. A method for improving virtual reality or augmented reality interactions with a tracked device, the method comprising:
dynamically tracking a three-dimensional physical position and orientation of an RF-transmitting tracked device, as the tracked device moves within a physical environment, using radio-frequency (RF) and inertial sensing signals from an inertial sensor within the tracked device that are transmitted by the RF-transmitting tracked device; and
while dynamically tracking, in real time, the three-dimensional physical position and orientation of the RF-transmitting tracked device:
registering a virtual viewpoint of the RF-transmitting tracked device, determined from the physical position and orientation of the RF-transmitting tracked device, in an interactive software program; and
displaying a dynamic digital image on a display screen by the interactive software program, from either the virtual viewpoint of the RF-transmitting tracked device or from a user-selected viewpoint.
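Claim 19 tracks position using both RF and inertial signals but leaves the fusion method open. A minimal sketch of one common approach, a complementary blend of high-rate inertial dead reckoning with absolute RF fixes (the class name and weighting are hypothetical, not from the patent):

```python
import numpy as np

class ComplementaryFuser:
    """Blend smooth, high-rate inertial dead reckoning with absolute
    but lower-rate RF position fixes via a fixed-weight average."""

    def __init__(self, alpha=0.9):
        self.alpha = alpha            # weight on the inertial prediction
        self.position = None          # unknown until the first RF fix
        self.velocity = np.zeros(3)

    def predict(self, accel, dt):
        """Integrate inertial acceleration between RF fixes."""
        if self.position is not None:
            self.velocity += np.asarray(accel, dtype=float) * dt
            self.position += self.velocity * dt

    def correct(self, rf_position):
        """Pull the estimate toward the latest absolute RF fix."""
        rf = np.asarray(rf_position, dtype=float)
        if self.position is None:
            self.position = rf.copy()
        else:
            self.position = self.alpha * self.position + (1 - self.alpha) * rf
        return self.position
```

A production tracker would more likely use a Kalman filter, but the complementary form shows why both signal types appear in the claim: inertial data bridges the gaps between RF fixes, and RF fixes bound the inertial drift.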
20. A computer program product for improving virtual reality and augmented reality interactions with a tracked device, the computer program product comprising:
a computer readable non-transitory storage medium having computer readable program code embodied therewith, the computer readable program code comprising:
computer readable program code that, if executed, displays on a view screen a software-generated environment from either a three-dimensional virtual viewpoint of the tracked device, or a user-selected viewpoint;
computer readable program code that, if executed, monitors inertial sensors on the tracked device and creates inertial information from the monitoring;
computer readable program code that, if executed, receives, at four spatially separated receiver antennae, radio-frequency (RF) signals transmitted by the tracked device that include inertial information of the tracked device;
computer readable program code that, if executed, dynamically computes, based on information conveyed by the RF signals received by the receiver antennae, a three-dimensional physical position and orientation of the tracked device as the tracked device moves within a physical environment; and
while the computer readable program code that dynamically computes a three-dimensional physical position and orientation of the tracked device is executing:
computer readable program code that, if executed, dynamically updates the three-dimensional virtual viewpoint of the tracked device based on the dynamically computed three-dimensional physical position and orientation of the tracked device; and
computer readable program code that, if executed, updates the display of the software-generated environment on the view screen in accordance with the dynamically updated three-dimensional virtual viewpoint of the tracked device, or other user-selected viewpoint.
Specification