Method, system and device for navigating in a virtual reality environment
First Claim
1. A method comprising:
projecting a synthetic 3D scene, into both eyes of a user, so as to provide a virtual reality view to the user;
identifying at least one gesture or posture carried out by at least one body part of the user;
deriving a vector that spatially represents the identified gesture or posture;
transferring the vector into a continuous movement or action of the user in a virtual reality environment, based on at least one of an angle of the vector and a length of the vector; and
modifying the virtual reality view so as to reflect the movement or action of the user in the virtual reality environment.
Abstract
A method, a system, and a device for navigating in a virtual reality scene, using body-part gestures and postures, are provided herein. The method may include: projecting a synthetic 3D scene, into both eyes of a user, via a near eye display, so as to provide a virtual reality view to the user; identifying at least one gesture or posture carried out by at least one body part of said user; measuring at least one metric of a vector associated with the detected gesture or posture; applying a movement or action of said user in a virtual reality environment, based on the measured metrics; and modifying the virtual reality view so as to reflect the movement or action of said user in the virtual reality environment.
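The abstract's pipeline (derive a vector from a gesture, measure its metrics, map them to movement) can be sketched in code. This is a minimal, hypothetical illustration, not the patent's implementation: the function names, the 2D ground-plane simplification, and the dead-zone and speed constants are all assumptions.

```python
import math

def derive_vector(start, end):
    """Derive the displacement vector (dx, dy) between the start and end
    positions of a tracked body part (e.g. a hand), simplified here to 2D."""
    return (end[0] - start[0], end[1] - start[1])

def vector_metrics(v):
    """Measure the two claimed metrics of the gesture vector:
    its angle (radians) and its length."""
    return (math.atan2(v[1], v[0]), math.hypot(v[0], v[1]))

def to_movement(angle, length, max_speed=2.0, dead_zone=0.05):
    """Map the vector's angle to a travel heading and its length to a
    speed. A vector shorter than the dead zone yields no movement.
    The constants are illustrative, not from the patent."""
    if length < dead_zone:
        return None  # gesture too small: no movement applied
    speed = min(length, 1.0) * max_speed
    return {"heading": angle, "speed": speed}

# Example: a hand moved 0.3 units along the x-axis.
v = derive_vector((0.0, 0.0), (0.3, 0.0))
angle, length = vector_metrics(v)
move = to_movement(angle, length)  # heading 0.0, speed 0.6
```

A renderer would then consume `move` each frame to update the virtual-reality view, which is the "modifying" step of the claim.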
22 Claims
1. A method comprising:
projecting a synthetic 3D scene, into both eyes of a user, so as to provide a virtual reality view to the user;
identifying at least one gesture or posture carried out by at least one body part of the user;
deriving a vector that spatially represents the identified gesture or posture;
transferring the vector into a continuous movement or action of the user in a virtual reality environment, based on at least one of an angle of the vector and a length of the vector; and
modifying the virtual reality view so as to reflect the movement or action of the user in the virtual reality environment.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11.
12. A system comprising:
a device configured to project a synthetic 3D scene, into both eyes of a user, so as to provide a virtual reality view to the user; and
a computer processor configured to:
identify at least one gesture or posture carried out by at least one body part of the user;
derive a vector that spatially represents the identified gesture or posture;
transfer the vector into a continuous movement or action of the user in a virtual reality environment, based on at least one of an angle of the vector and a length of the vector; and
modify the virtual reality view so as to reflect the movement or action of the user in the virtual reality environment.
Dependent claims: 13, 14, 15, 16, 17, 18, 19, 20, 21, 22.
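Claim 1's "transferring the vector into a continuous movement" implies that, while the posture persists, the same vector keeps driving motion frame after frame. A hypothetical per-frame sketch of that continuous update follows; the camera model, frame rate, and parameter names are assumptions for illustration only.

```python
import math

def step_camera(pos, yaw, heading, speed, dt):
    """Advance the virtual-reality camera one frame on the ground plane:
    the gesture vector's angle sets the travel direction, and its
    length-derived speed sets how far the user moves per second."""
    direction = yaw + heading
    x = pos[0] + math.cos(direction) * speed * dt
    z = pos[1] + math.sin(direction) * speed * dt
    return (x, z)

# While the posture is held, the same vector keeps driving movement each
# frame, producing the claimed continuous motion (here: 90 frames at 60 Hz,
# heading straight ahead at 2 units/s, i.e. 3 units in 1.5 s).
pos = (0.0, 0.0)
for _ in range(90):
    pos = step_camera(pos, yaw=0.0, heading=0.0, speed=2.0, dt=1 / 60)
```

Each new `pos` would then be used to re-render the scene for both eyes, corresponding to the claim's final "modify the virtual reality view" step.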
Specification