Interactions with 3D virtual objects using poses and multiple-DOF controllers
First Claim
1. A system for interacting with objects for a wearable device, the system comprising:
a display system of a wearable device configured to present a three-dimensional (3D) view to a user and permit a user interaction with virtual objects in a field of regard (FOR) of a user, the FOR comprising a portion of the environment around the user that is capable of being perceived by the user via the display system;
a sensor configured to acquire data associated with a pose of the user;
a hardware processor in communication with the sensor and the display system, the hardware processor programmed to:
determine a pose of the user based on the data acquired by the sensor;
initiate a cone cast on a group of virtual objects in the FOR, the cone cast comprising casting a virtual cone with an aperture in a direction based at least partly on the pose of the user, wherein the aperture has a dynamically-adjustable size;
analyze contextual information associated with the group of virtual objects in the user's environment to calculate the dynamically-adjustable size of the aperture for the cone cast toward the group of virtual objects;
automatically update the dynamically-adjustable size of the aperture of the virtual cone based at least partly on the contextual information of the group of virtual objects; and
cause the display system to render a visual representation of the virtual cone for the cone cast.
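The claim above describes a cone cast whose aperture is recalculated from contextual information about the objects in the scene. A minimal Python sketch of that idea follows; the density heuristic, the aperture range, and every name (`VirtualObject`, `aperture_from_context`, `cone_cast`) are illustrative assumptions, not the patented method:

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class VirtualObject:
    name: str
    x: float
    y: float
    z: float

def aperture_from_context(objects: List[VirtualObject],
                          min_aperture: float = 0.05,
                          max_aperture: float = 0.60) -> float:
    """Pick a cone half-angle (radians) from scene density: sparse or
    empty scenes get a wide cone for easy targeting; tightly clustered
    objects get a narrow cone for finer selection."""
    if len(objects) < 2:
        return max_aperture
    # Mean pairwise distance as a crude density proxy.
    dists = [math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))
             for i, a in enumerate(objects)
             for b in objects[i + 1:]]
    mean_d = sum(dists) / len(dists)
    # Map mean spacing in [0, 5] meters onto [min, max] aperture.
    t = min(mean_d / 5.0, 1.0)
    return min_aperture + t * (max_aperture - min_aperture)

def cone_cast(origin: Tuple[float, float, float],
              direction: Tuple[float, float, float],
              aperture: float,
              objects: List[VirtualObject]) -> List[VirtualObject]:
    """Return objects whose bearing from `origin` lies within `aperture`
    radians of the cast direction (derived from the user's pose)."""
    dx, dy, dz = direction
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    hits = []
    for obj in objects:
        vx, vy, vz = obj.x - origin[0], obj.y - origin[1], obj.z - origin[2]
        vnorm = math.sqrt(vx * vx + vy * vy + vz * vz)
        if vnorm == 0.0:
            continue
        cos_angle = (vx * dx + vy * dy + vz * dz) / vnorm
        if math.acos(max(-1.0, min(1.0, cos_angle))) <= aperture:
            hits.append(obj)
    return hits
```

A caller would recompute `aperture_from_context` whenever the group of objects changes, then run `cone_cast` along the pose direction each frame.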
5 Assignments
0 Petitions
Abstract
A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
65 Citations
20 Claims
1. A system for interacting with objects for a wearable device, the system comprising:
a display system of a wearable device configured to present a three-dimensional (3D) view to a user and permit a user interaction with virtual objects in a field of regard (FOR) of a user, the FOR comprising a portion of the environment around the user that is capable of being perceived by the user via the display system;
a sensor configured to acquire data associated with a pose of the user;
a hardware processor in communication with the sensor and the display system, the hardware processor programmed to:
determine a pose of the user based on the data acquired by the sensor;
initiate a cone cast on a group of virtual objects in the FOR, the cone cast comprising casting a virtual cone with an aperture in a direction based at least partly on the pose of the user, wherein the aperture has a dynamically-adjustable size;
analyze contextual information associated with the group of virtual objects in the user's environment to calculate the dynamically-adjustable size of the aperture for the cone cast toward the group of virtual objects;
automatically update the dynamically-adjustable size of the aperture of the virtual cone based at least partly on the contextual information of the group of virtual objects; and
cause the display system to render a visual representation of the virtual cone for the cone cast.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A method for interacting with objects for a wearable device, the method comprising:
under control of a wearable device comprising a display system and a hardware processor:
receiving a selection of a target virtual object displayed to a user at a first position in a three-dimensional (3D) space;
receiving an indication to move the target virtual object from the first position, wherein the indication comprises a first amount of movement associated with the user in a first direction in the 3D space;
analyzing a distance between the target virtual object and the user in a second direction different from the first direction;
calculating a multiplier to map the first amount of movement associated with the user into a second amount of movement for the target virtual object, the multiplier calculated based at least partly on the distance between the target virtual object and the user in the second direction;
calculating the second amount of movement in the first direction for the target virtual object, the second amount of movement calculated based at least partly on the first amount of movement and the multiplier; and
causing the target virtual object to be displayed to the user at a second position, the second position based at least in part on the first position and the second amount of movement.
- View Dependent Claims (10, 11, 12, 13, 14)
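Claims 9 and 15 both turn on a multiplier that maps the user's movement into a larger movement for a distant object. A hedged Python sketch of one plausible mapping; the 1-meter threshold, the linear scaling, and the function names are assumptions for illustration, not the claimed formula:

```python
from typing import Tuple

def movement_multiplier(distance: float, threshold: float = 1.0) -> float:
    """Within arm's reach (distance <= threshold, in meters), map the
    user's motion 1:1; farther away, scale linearly with distance so a
    small hand movement carries a distant object proportionally farther."""
    if distance <= threshold:
        return 1.0
    return distance / threshold

def move_object(position: Tuple[float, float, float],
                user_movement: Tuple[float, float, float],
                distance: float) -> Tuple[float, float, float]:
    """Compute the object's second position from its first position,
    the user's movement, and the distance-derived multiplier."""
    m = movement_multiplier(distance)
    return tuple(p + m * d for p, d in zip(position, user_movement))
```

With a linear mapping like this, an object 4 m away moves four times as far as the user's hand, which keeps near-field manipulation precise while still letting the user reposition far-away objects without walking.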
15. A system for interacting with objects for a wearable device, the system comprising:
a display system of a wearable device configured to present a three-dimensional (3D) view to a user, the 3D view showing a target virtual object;
a hardware processor in communication with the display system, the hardware processor programmed to:
receive an indication to move the target virtual object from a first position, wherein the indication comprises a first amount of movement associated with a user in a target direction in the 3D space;
analyze a distance between the target virtual object and the user;
calculate a multiplier to map the first amount of movement associated with the user into a second amount of movement for the target virtual object, the multiplier calculated based at least partly on the distance between the target virtual object and the user;
calculate the second amount of movement in the target direction for the target virtual object, the second amount of movement calculated based at least partly on the first amount of movement and the multiplier; and
cause the display system to display the target virtual object at a second position, the second position based at least in part on the first position and the second amount of movement.
- View Dependent Claims (16, 17, 18, 19, 20)
Specification