PHOTOREALISTIC SCENE GENERATION SYSTEM AND METHOD
Abstract
A photorealistic scene generation system includes a virtual reality (VR) headset, a VR controller, and a VR processor. The VR headset visually presents a user with a 3D virtual environment. The VR controller receives input from the user. The VR processor communicates with the VR headset and the VR controller. It instructs the VR headset to display the 3D virtual environment with an interactive vantage point that follows movement of the user, based on positional data received from the VR headset, and to display a source zone superimposed over the 3D virtual environment. The source zone provides visual representations of physical items previously selected for use in styling the 3D virtual environment. The VR processor is programmed to receive input from the VR controller to select one of the visual representations and to move the selected visual representation from the source zone into the 3D virtual environment.
28 Claims
1. A photorealistic scene generation system comprising:

a virtual reality (VR) headset configured to visually present a user with a three-dimensional (3D) virtual environment;

a VR controller separate from the VR headset and configured to receive input from the user during user interaction with the 3D virtual environment as viewed through the VR headset; and

a VR processor in communication with the VR headset and the VR controller, the VR processor being programmed to:

instruct the VR headset to display the 3D virtual environment having an interactive vantage point based on movement of the user based on positional data received from the VR headset,

instruct the VR headset to display a source zone superimposed over the 3D virtual environment to the user, wherein the source zone provides visual representations of physical items previously selected for use in styling the 3D virtual environment,

receive input from the VR controller to select and direct movement of one of the visual representations from the source zone and to move the selected one of the visual representations into the 3D virtual environment,

convert the selected one of the visual representations into a 3D model of an item corresponding to the selected one of the visual representations as the visual representation is moved from the source zone into the 3D virtual environment, and

in real time, adjust the orientation and placement of the 3D model within the 3D virtual environment to have an orientation and positional placement as directed by input from the VR controller.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10
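Claim 1 recites converting a 2D visual representation into a 3D model as it is moved from the source zone into the environment, but specifies no implementation. A minimal sketch of that conversion step, in which every class and method name (`Scene`, `place_from_source_zone`, and so on) is a hypothetical stand-in rather than anything recited in the patent:

```python
from dataclasses import dataclass, field

@dataclass
class VisualRepresentation:
    """2D thumbnail of a physical item shown in the source zone (hypothetical)."""
    item_id: str

@dataclass
class Model3D:
    """Placed 3D model with a position and a yaw orientation (hypothetical)."""
    item_id: str
    position: tuple = (0.0, 0.0, 0.0)
    yaw_degrees: float = 0.0

@dataclass
class Scene:
    """Minimal stand-in for the 3D virtual environment and its overlay."""
    source_zone: list = field(default_factory=list)
    placed_models: list = field(default_factory=list)

    def place_from_source_zone(self, item_id, position):
        # Select the representation, remove it from the source-zone overlay,
        # and convert it into a 3D model at the controller-directed position.
        rep = next(r for r in self.source_zone if r.item_id == item_id)
        self.source_zone.remove(rep)
        model = Model3D(item_id=rep.item_id, position=position)
        self.placed_models.append(model)
        return model
```

A production system would load real mesh assets here; the sketch only models the bookkeeping the claim describes (selection, removal from the overlay, and placement as a 3D model).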
11. A method of creating a photorealistic scene, the method comprising:
visually presenting a three-dimensional (3D) virtual environment to a user via a virtual reality (VR) headset as directed by a VR processor in communication with the VR headset, including presenting the 3D virtual environment having an interactive vantage point based on movement of a head of the user based on positional data received from the VR headset;

displaying a source zone to the user via the VR headset simultaneously with the 3D virtual environment as directed by the VR processor, wherein the source zone is displayed superimposed over the 3D virtual environment and provides visual representations of physical items previously selected for use in styling the 3D virtual environment;

receiving input from a VR controller to select and direct movement of one of the visual representations from the source zone and moving the selected one of the visual representations into the 3D virtual environment based on input received from the VR controller;

converting the selected one of the visual representations into a 3D model of an item corresponding to the selected one of the visual representations as the visual representation is moved from the source zone into the 3D virtual environment; and

in real time, adjusting the orientation and placement of the 3D model within the 3D virtual environment to have an orientation and positional placement as directed by input from the VR controller.

Dependent claims: 12, 13, 14, 15, 16, 17, 18, 19, 20
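The final step of the method, adjusting a placed model's orientation and position in real time under controller input, can be sketched as a per-frame pose update. The function name and the pose convention (a position triple plus yaw about the vertical axis) are illustrative assumptions, not part of the claim:

```python
def update_pose(position, yaw_degrees, move_delta, yaw_delta):
    """Apply one frame of VR-controller input to a placed 3D model:
    translate by move_delta and rotate about the vertical axis by
    yaw_delta, wrapping yaw into [0, 360)."""
    x, y, z = position
    dx, dy, dz = move_delta
    return (x + dx, y + dy, z + dz), (yaw_degrees + yaw_delta) % 360.0

# Example: one frame nudges the model 0.1 m along x and turns it 20 degrees.
pos, yaw = update_pose((0.0, 0.0, 0.0), 350.0, (0.1, 0.0, 0.0), 20.0)
```

Calling this once per rendered frame with the controller's latest deltas gives the "in real time" behavior the claim recites; the yaw wrap keeps repeated rotations bounded.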
21. A non-transitory memory storing instructions for creating a photorealistic scene, the instructions comprising:
code for visually presenting a three-dimensional (3D) virtual environment to a user via a virtual reality (VR) headset as directed by a VR processor in communication with the VR headset, including presenting the 3D virtual environment having an interactive vantage point based on movement of a head of the user based on positional data received from the VR headset;

code for displaying a source zone to the user via the VR headset simultaneously with the 3D virtual environment as directed by the VR processor, wherein the source zone is displayed superimposed over the 3D virtual environment and provides visual representations of physical items previously selected for use in styling the 3D virtual environment;

code for receiving input from a VR controller to select and direct movement of one of the visual representations from the source zone and moving the selected one of the visual representations into the 3D virtual environment based on input received from the VR controller;

code for converting the selected one of the visual representations into a 3D model of an item corresponding to the selected one of the visual representations as the visual representation is moved from the source zone into the 3D virtual environment; and

code for adjusting, in real time, the orientation and placement of the 3D model within the 3D virtual environment to have an orientation and positional placement as directed by input from the VR controller.

Dependent claims: 22, 23, 24, 25, 26, 27, 28
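The recited "interactive vantage point based on movement of a head of the user" implies deriving the displayed view from headset positional data each frame. One way to sketch that is to compute a gaze target from the head pose; the function name and the yaw convention (yaw 0 looks along +z, positive yaw turns toward +x) are assumptions for illustration only:

```python
import math

def vantage_target(head_position, head_yaw_degrees, distance=1.0):
    """Compute the point the vantage looks at from headset positional
    data, so the displayed view tracks head movement.
    Assumed convention: yaw 0 looks along +z; positive yaw turns toward +x."""
    x, y, z = head_position
    rad = math.radians(head_yaw_degrees)
    return (x + distance * math.sin(rad), y, z + distance * math.cos(rad))
```

Feeding the headset's per-frame position and yaw into this function, and rendering toward the returned point, yields a vantage that follows the user's head as the claim describes.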
Specification