System and method for presenting virtual and augmented reality scenes to a user
First Claim
1. A method comprising:
determining a first orientation of a device relative to a three-dimensional frame of reference;
determining a second orientation of the device relative to a nodal point, wherein the nodal point corresponds to a user using the device, and wherein the nodal point is at a first position relative to the three-dimensional frame of reference separate from the device;
displaying a scene including a plurality of displayable aspects on the device based on the first orientation and the second orientation;
responsive to a movement of the nodal point relative to the device, detecting a change in the second orientation of the device based on the movement of the nodal point from the first position to a second position relative to the three-dimensional frame of reference, wherein the movement of the nodal point relative to the device includes a change in linear distance between the nodal point and the device; and
adapting the scene displayed on the device based on the detected change in the position of the nodal point, including increasing a size of a displayable aspect of the plurality of displayable aspects of the scene in response to a reduction in the linear distance between the nodal point and the device and decreasing the size of the displayable aspect of the plurality of displayable aspects of the scene in response to an increase in the linear distance between the nodal point and the device.
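The adaptation step of claim 1 amounts to scaling a displayable aspect inversely with the linear distance between the nodal point (the user) and the device. A minimal sketch of that step, with all names illustrative rather than taken from the patent:

```python
def adapt_aspect_size(base_size: float, ref_distance: float,
                      current_distance: float) -> float:
    """Scale a displayable aspect inversely with the linear distance
    between the nodal point and the device: a shorter distance yields
    a larger aspect, a longer distance a smaller one.
    All names here are illustrative, not drawn from the patent."""
    if ref_distance <= 0 or current_distance <= 0:
        raise ValueError("distances must be positive")
    return base_size * (ref_distance / current_distance)

# Halving the distance (40 -> 20) doubles the aspect's size;
# doubling it (40 -> 80) halves the size.
print(adapt_aspect_size(100.0, 40.0, 20.0))  # 200.0
print(adapt_aspect_size(100.0, 40.0, 80.0))  # 50.0
```

The inverse-proportional form is one assumption; the claim only requires that size increase as distance decreases and vice versa, so any monotonically decreasing function of distance would satisfy it.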
Abstract
A method of presenting a scene to a user according to a preferred embodiment includes determining a real orientation of a device relative to a projection matrix and determining a user orientation of the device relative to a nodal point. The method of the preferred embodiment can further include orienting a scene displayable on the device to the user in response to the real orientation and the user orientation, and displaying the scene on the device. The method of the preferred embodiment can be performed by an apparatus and/or embodied in a computer program product including machine-readable code.
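The abstract's two-orientation scheme can be read as composing two rotations: the device's real orientation in the world (projection-matrix) frame and its orientation relative to the user's nodal point. A hedged sketch under assumed 3x3 rotation-matrix conventions; the helper names and the composition order are illustrative assumptions, not the patent's method:

```python
import math

def rot_z(theta: float) -> list:
    """3x3 rotation about the z-axis (illustrative helper)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul3(a: list, b: list) -> list:
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def scene_orientation(real: list, user: list) -> list:
    """Compose the 'real' (device-in-world) and 'user'
    (device-to-nodal-point) rotations into the rotation applied to
    the displayed scene. The order of composition is an assumption
    made here for illustration."""
    return matmul3(user, real)

# Two 45-degree rotations about the same axis compose to 90 degrees.
R = scene_orientation(rot_z(math.pi / 4), rot_z(math.pi / 4))
```

In a real renderer the composed rotation would be folded into the view or projection matrix each frame as both sensors update.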
84 Citations
20 Claims
1. A method comprising the steps set forth in full under "First Claim" above.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13)
14. An apparatus comprising:
a user interface comprising a display and a camera, the display and the camera substantially oriented in a first direction;
a first orientation sensor configured to determine a three-dimensional spatial orientation of the user interface relative to a three-dimensional frame of reference;
a second orientation sensor configured to determine a second orientation of the user interface relative to a nodal point, wherein the nodal point corresponds to a user using the device, and wherein the nodal point is at a first position relative to the three-dimensional frame of reference separate from the device; and
a processor connected to the user interface, the first orientation sensor, and the second orientation sensor, the processor configured to:
display a scene including a plurality of displayable aspects to the user on the display based on the first orientation and the second orientation;
responsive to a movement of the nodal point relative to the device, detect a change in the second orientation of the device based on the movement of the nodal point from the first position to a second position relative to the three-dimensional frame of reference, wherein the movement of the nodal point relative to the device includes a change in linear distance between the nodal point and the device; and
adapt the scene displayed on the device based on the detected change in the position of the nodal point, including increasing a size of a displayable aspect of the plurality of displayable aspects of the scene in response to a reduction in the linear distance between the nodal point and the device and decreasing the size of the displayable aspect of the plurality of displayable aspects of the scene in response to an increase in the linear distance between the nodal point and the device.
View Dependent Claims (15, 16, 17, 18, 19, 20)
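The processor limitations of claim 14 describe a reactive loop: a second sensor reports nodal-point movement, the processor detects the change in linear distance, and the scene is rescaled accordingly. A hypothetical wiring of that loop; the class and method names are illustrative, not from the patent:

```python
class SceneProcessor:
    """Illustrative processor for the claimed apparatus: reacts to
    nodal-point updates from a second orientation sensor (e.g. a
    front-facing camera estimating user-to-device distance)."""

    def __init__(self, ref_distance: float) -> None:
        self.ref_distance = ref_distance    # distance giving unit scale
        self.last_distance = ref_distance
        self.scale = 1.0

    def on_nodal_point_update(self, distance: float) -> bool:
        """Return True when the linear nodal-point distance changed,
        rescaling the scene inversely with that distance: closer user,
        larger scene; farther user, smaller scene."""
        if distance <= 0:
            raise ValueError("distance must be positive")
        changed = distance != self.last_distance
        if changed:
            self.scale = self.ref_distance / distance
            self.last_distance = distance
        return changed
```

In practice the first orientation sensor (e.g. an accelerometer/gyroscope pair) would feed a parallel update path for the device's world-frame orientation; only the distance-driven scaling path from the claim is sketched here.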
Specification