Indirect 3D scene positioning control
First Claim
1. A method for interacting with a virtual object from within a virtual scene projected to a virtual render plane, comprising:
determining, via an optical tracking device, a first position and orientation of a physical endpoint of a user interface device in a physical space, wherein the position and orientation is with respect to a display device rendering the virtual scene;
determining a perspective of the user interface device based on the determined position and orientation, wherein the perspective is with respect to the display device, and wherein the perspective is not normal to the virtual render plane;
identifying a virtual position and orientation of a virtual endpoint in a virtual scene, wherein the virtual position and orientation of the virtual endpoint in the virtual scene correspond to the position and orientation of the physical endpoint in the physical space;
identifying a segment in the virtual scene along a path from the virtual endpoint based on the perspective of the user interface device, wherein the segment indicates a path that is not normal to the virtual render plane; and
correlating an intersection between the virtual object and the segment, including receiving, via the user interface device, user input to select a virtual object via a selection shape, comprising:
receiving user input to select a first point in the virtual scene from the perspective in accordance with a first intersection;
receiving user input to select a second point in the virtual scene from a different perspective in accordance with a second intersection, wherein the second perspective is based on a second position and orientation of the user interface device in the physical space, and wherein the second position and orientation is determined via the optical tracking device; and
creating a rectangular volume between the first point and the second point, wherein planes of the rectangular volume are oblique in relation to the virtual render plane, wherein the rectangular volume forms the selection shape.
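As a rough illustration of the claimed steps, a segment can be cast from the tracked virtual endpoint along the device's pointing direction and tested for intersection with a virtual object. The sketch below is a minimal interpretation under stated assumptions: the function names, the fixed segment length, and the spherical object proxy are all hypothetical and are not drawn from the patent.

```python
import numpy as np

def cast_segment(endpoint, direction, length=10.0):
    """Finite segment from the virtual endpoint along the device's
    pointing direction (which need not be normal to the render plane)."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    return np.asarray(endpoint, dtype=float), d, length

def intersect_sphere(origin, direction, length, center, radius):
    """First intersection of the segment with a spherical proxy for a
    virtual object, or None if the segment misses it."""
    oc = origin - np.asarray(center, dtype=float)
    b = 2.0 * np.dot(direction, oc)       # direction is unit length, so a = 1
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                       # the segment's line misses the sphere
    t = (-b - np.sqrt(disc)) / 2.0        # nearest root along the segment
    if t < 0.0 or t > length:
        return None                       # hit lies outside the finite segment
    return origin + t * direction
```

Casting from the device's own perspective, rather than normal to the render plane, is what lets the same on-screen point resolve to different scene depths as the device moves.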
Abstract
Embodiments of the present invention generally relate to interacting with a virtual scene at a perspective that is independent from the perspective of the user. Methods and systems can include tracking and defining a perspective of the user based on the position and orientation of the user in the physical space, projecting a virtual scene for the user perspective to a virtual plane, tracking and defining a perspective of a freehand user input device based on the position and orientation of that device, identifying a mark in the virtual scene which corresponds to the position and orientation of the device in the physical space, creating a virtual segment from the mark, and interacting with virtual objects in the virtual scene at the end point of the virtual segment, as controlled using the device.
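The abstract's correspondence between the device's physical pose and its virtual endpoint can be modeled as a calibrated rigid transform plus a uniform scale. This is only a sketch of one plausible mapping; the rotation `R`, translation `t`, and scale are hypothetical calibration parameters, not values from the patent.

```python
import numpy as np

def physical_to_virtual(p_phys, R, t, scale=1.0):
    """Map a tracked physical endpoint into scene coordinates via a
    hypothetical calibration (rotation R, translation t, uniform scale)."""
    return scale * (R @ np.asarray(p_phys, dtype=float)) + np.asarray(t, dtype=float)

def device_perspective(forward_phys, R):
    """Device viewing direction in scene coordinates; rotation only, since
    directions are unaffected by translation and uniform scale."""
    d = R @ np.asarray(forward_phys, dtype=float)
    return d / np.linalg.norm(d)
```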
20 Claims
1. A method for interacting with a virtual object from within a virtual scene projected to a virtual render plane, as recited in full under First Claim above. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A non-transitory computer readable memory medium storing program instructions executable by a processor to:
determine, via an optical tracking system, a first position and orientation of a physical endpoint of a user interface device in a physical space, wherein the position and orientation is with respect to a display device rendering a virtual scene projected to a virtual render plane;
determine a perspective of the user interface device based on the determined position and orientation, wherein the perspective is with respect to the display device, and wherein the perspective is not normal to the virtual render plane;
identify a virtual position and orientation of a virtual endpoint in a virtual scene, wherein the virtual position and orientation of the virtual endpoint in the virtual scene correspond to the position and orientation of the physical endpoint in the physical space;
identify a segment in the virtual scene along a path from the virtual endpoint based on the perspective of the user interface device, wherein the segment indicates a path that is not normal to the virtual render plane; and
correlate an intersection between a virtual object and the segment, via a selection shape, wherein to correlate the intersection, the program instructions are further executable to:
receive user input to select a first point in the virtual scene from the perspective in accordance with a first intersection;
receive user input to select a second point in the virtual scene from a different perspective in accordance with a second intersection, wherein the second perspective is based on a second position and orientation of the user interface device in the physical space, and wherein the second position and orientation is determined via the optical tracking device; and
create a rectangular volume between the first point and the second point, wherein planes of the rectangular volume are oblique in relation to the virtual render plane, wherein the rectangular volume forms the selection shape. - View Dependent Claims (10, 11, 12, 13, 14)
15. A system comprising:
a display device; an optical tracking system; a memory; and a processor in communication with the memory, wherein the processor is configured to:
determine, via the optical tracking system, a first position and orientation of a physical endpoint of a user interface device in a physical space, wherein the position and orientation is with respect to the display device rendering a virtual scene projected to a virtual render plane;
determine a perspective of the user interface device based on the determined position and orientation, wherein the perspective is with respect to the display device, and wherein the perspective is not normal to the virtual render plane;
identify a virtual position and orientation of a virtual endpoint in a virtual scene, wherein the virtual position and orientation of the virtual endpoint in the virtual scene correspond to the position and orientation of the physical endpoint in the physical space;
identify a segment in the virtual scene along a path from the virtual endpoint based on the perspective of the user interface device, wherein the segment indicates a path that is not normal to the virtual render plane; and
correlate an intersection between a virtual object and the segment via a selection shape, wherein to correlate the intersection, the processor is further configured to:
receive user input to select a first point in the virtual scene from the perspective in accordance with a first intersection;
receive user input to select a second point in the virtual scene from a different perspective in accordance with a second intersection, wherein the different perspective is based on a second position and orientation of the user interface device in the physical space, and wherein the second position and orientation is determined via the optical tracking device; and
create a rectangular volume between the first point and the second point, wherein planes of the rectangular volume are oblique in relation to the virtual render plane, wherein the rectangular volume forms the selection shape. - View Dependent Claims (16, 17, 18, 19, 20)
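The two-point selection recited in the claims, a rectangular volume whose planes are oblique to the render plane, can be sketched as an oriented box: one axis follows the device's viewing direction, and the box spans the two selected points. Everything below (function names, the axis-seeding choice, the containment test) is an illustrative assumption, not the patented construction.

```python
import numpy as np

def oblique_box(p1, p2, view_dir):
    """Oriented box spanning two selected points, with one axis along the
    device's viewing direction so its faces are oblique to the render plane
    whenever that direction is not normal to the plane."""
    w = np.asarray(view_dir, dtype=float)
    w /= np.linalg.norm(w)
    # Any vector not parallel to w seeds the two remaining box axes.
    seed = np.array([1.0, 0.0, 0.0]) if abs(w[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(w, seed)
    u /= np.linalg.norm(u)
    v = np.cross(w, u)
    axes = np.stack([u, v, w])                     # rows are the box axes
    c1 = axes @ np.asarray(p1, dtype=float)        # points in box coordinates
    c2 = axes @ np.asarray(p2, dtype=float)
    return axes, np.minimum(c1, c2), np.maximum(c1, c2)

def contains(box, point):
    """True if the point lies inside the oriented selection volume."""
    axes, lo, hi = box
    q = axes @ np.asarray(point, dtype=float)
    return bool(np.all(q >= lo) and np.all(q <= hi))
```

Selecting objects by testing them against this volume, rather than against a screen-aligned box, is what allows the selection to follow the device's oblique perspective.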
Specification