Autonomous drones for tactile feedback in immersive virtual reality
First Claim
1. A system, comprising:
a processor; and
a memory comprising instructions that, when executed by the processor, cause the processor to perform a method comprising:
generating a real-time rendering of a virtual environment;
receiving real-time tracking information from one or more sensors, the real-time tracking information relating to real motions and positions of a user immersed in the real-time rendering of the virtual environment, the real-time tracking information further relating to motions and positions of an autonomous mobile drone in a real-world environment around the user;
responsive to receipt of the real-time tracking information, directing the autonomous mobile drone to move relative to the user;
further responsive to receipt of the real-time tracking information, directing the autonomous mobile drone to automatically deposit a real-world object in a position within the real-world environment that enables one or more user body parts to contact that object based on real user motions while the user interacts with the virtual environment;
responsive to depositing the real-world object by the autonomous mobile drone, rendering a virtual representation of the real-world object into a corresponding position in the virtual environment; and
wherein contact between the real-world object and the one or more user body parts generates a physically tactile sensation for the virtual representation of the real-world object.
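The claimed method reads as a sensing-and-actuation loop: receive tracking, direct the drone, deposit the object where the user can reach it, then render the matching virtual representation. The patent specifies no implementation, so the sketch below is a rough illustration only; the names (`TrackingInfo`, `step`), the fixed-standoff drone policy, and the identity mapping between real and virtual coordinates are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class TrackingInfo:
    """Real-time tracking sample: user and drone poses in world coordinates."""
    user_position: tuple   # (x, y, z) of the tracked user
    hand_position: tuple   # (x, y, z) of the user's reaching hand
    drone_position: tuple  # (x, y, z) of the autonomous mobile drone

def step(tracking: TrackingInfo, standoff: float = 1.5):
    """One iteration of the claimed method (hypothetical policy):
    direct the drone relative to the user, pick a deposit position the
    user's hand can contact, and return the corresponding VR position."""
    ux, uy, uz = tracking.user_position
    # Direct the drone to hold a fixed standoff from the user (assumption).
    drone_target = (ux + standoff, uy, uz)
    # Deposit the real-world object where the tracked hand is heading.
    deposit_position = tracking.hand_position
    # Render the virtual representation at the corresponding position
    # (identity real-to-virtual mapping assumed here).
    virtual_position = deposit_position
    return drone_target, deposit_position, virtual_position
```

A single `step` call corresponds to one receipt of tracking information; in practice the loop would run at the tracker's update rate.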
1 Assignment
0 Petitions
Accused Products
Abstract
A “Tactile Autonomous Drone” (TAD) (e.g., flying drones, mobile robots, etc.) supplies real-time tactile feedback to users immersed in virtual reality (VR) environments. TADs are not rendered into the VR environment, and are therefore not visible to users immersed in the VR environment. In various implementations, one or more TADs track users as they move through a real-world space while immersed in the VR environment. One or more TADs apply tracking information to autonomously position themselves, or one or more physical surfaces or objects carried by the TADs, in a way that enables physical contact between those surfaces or objects and one or more portions of the user's body. Further, this positioning of surfaces or objects corresponds to some real-time virtual event, virtual object, virtual character, virtual avatar of another user, etc., in the VR environment to provide real-time tactile feedback to users immersed in the VR environment.
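The autonomous positioning the abstract describes implies that a TAD must anticipate where the user's body will be, not merely where it is, so the carried surface arrives before contact. One minimal, assumed approach is constant-velocity extrapolation of the tracked hand; the function name and the lead time below are illustrative, not taken from the patent.

```python
def predict_contact_point(hand_pos, hand_vel, lead_time: float = 0.5):
    """Linearly extrapolate the tracked hand position so a TAD can place
    a physical surface where contact is expected.

    Assumes a constant-velocity motion model over a short lead_time (s);
    a real tracker would refine this every frame as new samples arrive."""
    return tuple(p + v * lead_time for p, v in zip(hand_pos, hand_vel))
```

With per-frame re-prediction, the error of the constant-velocity assumption shrinks as the hand nears the surface.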
Citations
20 Claims
1. A system, comprising:
a processor; and
a memory comprising instructions that, when executed by the processor, cause the processor to perform a method comprising:
generating a real-time rendering of a virtual environment;
receiving real-time tracking information from one or more sensors, the real-time tracking information relating to real motions and positions of a user immersed in the real-time rendering of the virtual environment, the real-time tracking information further relating to motions and positions of an autonomous mobile drone in a real-world environment around the user;
responsive to receipt of the real-time tracking information, directing the autonomous mobile drone to move relative to the user;
further responsive to receipt of the real-time tracking information, directing the autonomous mobile drone to automatically deposit a real-world object in a position within the real-world environment that enables one or more user body parts to contact that object based on real user motions while the user interacts with the virtual environment;
responsive to depositing the real-world object by the autonomous mobile drone, rendering a virtual representation of the real-world object into a corresponding position in the virtual environment; and
wherein contact between the real-world object and the one or more user body parts generates a physically tactile sensation for the virtual representation of the real-world object.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
13. A method, comprising:
generating a real-time rendering of a virtual environment;
displaying the real-time rendering of the virtual environment via a head-worn virtual reality display device;
receiving real-time tracking information from one or more sensors, the real-time tracking information relating to real motions and positions of a user immersed in the real-time rendering of the virtual environment, the real-time tracking information further relating to motions and positions of an autonomous mobile drone in a real-world environment around the user;
responsive to receipt of the real-time tracking information, directing the autonomous mobile drone to move relative to the user;
further responsive to receipt of the real-time tracking information, directing the autonomous mobile drone to automatically deposit a real-world object in a position within the real-world environment that enables one or more user body parts to contact that object based on real user motions while the user interacts with the virtual environment;
responsive to depositing the real-world object by the autonomous mobile drone, rendering a virtual representation of the real-world object into a corresponding position in the virtual environment; and
wherein contact between the real-world object and the one or more user body parts generates a physically tactile sensation for the virtual representation of the real-world object.
Dependent claims: 14, 15, 16, 17, 18, 19
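Claim 13 repeats the correspondence requirement: the virtual representation must be rendered at the position in the VR environment that matches the real-world deposit position. Under the assumption of a rigid room-to-VR calibration (a yaw rotation plus a translation, a common choice that the claim itself does not specify), the mapping could be sketched as:

```python
import math

def real_to_virtual(point, yaw_deg: float = 0.0, offset=(0.0, 0.0, 0.0)):
    """Map a real-world deposit position (x, y, z; y is up) to the
    corresponding VR position. Assumes a rigid calibration: rotate by
    yaw_deg about the vertical axis, then translate by offset."""
    x, y, z = point
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    rx, rz = c * x - s * z, s * x + c * z  # rotate in the horizontal plane
    ox, oy, oz = offset
    return (rx + ox, y + oy, rz + oz)
```

The inverse of this same transform would let the system decide where in the room a drone must deposit an object so that it lines up with a given virtual object.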
20. A computer-readable storage device having computer-executable instructions stored therein for rendering a real-time virtual environment and directing one or more autonomous mobile drones, said instructions causing a computing device to execute a method comprising:
generating a real-time rendering of a virtual environment within a user-worn virtual reality display device;
receiving real-time tracking information from one or more sensors, the real-time tracking information relating to real motions and positions of a user immersed in the real-time rendering of the virtual environment, the real-time tracking information further relating to motions and positions of one or more autonomous mobile drones in a real-world environment around the user;
responsive to receipt of the real-time tracking information, directing one or more of the autonomous mobile drones to move relative to the user;
further responsive to receipt of the real-time tracking information, directing one or more of the autonomous mobile drones to automatically deposit one or more real-world objects in positions within the real-world environment that enable one or more user body parts to contact those objects based on real user motions while the user interacts with the virtual environment;
responsive to depositing the one or more real-world objects by the one or more autonomous mobile drones, rendering corresponding virtual representations of the real-world objects into corresponding positions in the virtual environment; and
wherein contact between any of the real-world objects and the one or more user body parts generates a physically tactile sensation for the virtual representation of the corresponding real-world object.
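Claim 20 extends the method to multiple drones and multiple objects, which raises a coordination question the claim leaves open: which drone deposits which object. One hypothetical policy is a greedy nearest-drone assignment; every name below is illustrative, and the sketch assumes at least as many free drones as deposit positions.

```python
def assign_drones(drone_positions, deposit_positions):
    """Greedily assign each requested deposit position to the nearest
    still-free drone (hypothetical coordination policy; assumes
    len(deposit_positions) <= len(drone_positions))."""
    def dist2(a, b):
        # Squared Euclidean distance; ordering is the same as true distance.
        return sum((p - q) ** 2 for p, q in zip(a, b))

    free = dict(enumerate(drone_positions))  # drone index -> position
    assignment = {}                          # deposit index -> drone index
    for j, target in enumerate(deposit_positions):
        i = min(free, key=lambda k: dist2(free[k], target))
        assignment[j] = i
        del free[i]  # each drone carries one object at a time
    return assignment
```

Greedy matching is not globally optimal (the Hungarian algorithm would be), but it is cheap enough to re-run every tracking update.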
Specification