MIXED REALITY INTERACTIONS
First Claim
1. A mixed reality interaction system for interacting with a physical object in a mixed reality environment, the mixed reality interaction system comprising:
a head-mounted display device operatively connected to a computing device, the head-mounted display device including a display system for presenting the mixed reality environment and a plurality of input sensors including a camera for capturing an image of the physical object; and
a mixed reality interaction program executed by a processor of the computing device, the mixed reality interaction program configured to:
identify the physical object based on the captured image;
determine an interaction context for the identified physical object based on one or more aspects of the mixed reality environment;
query a stored profile for the physical object to determine a plurality of interaction modes for the physical object;
programmatically select a selected interaction mode from the plurality of interaction modes based on the interaction context;
receive a user input directed at the physical object via one of the input sensors of the head-mounted display device;
interpret the user input to correspond to a virtual action based on the selected interaction mode;
execute the virtual action with respect to a virtual object associated with the physical object to thereby modify an appearance of the virtual object; and
display the virtual object via the head-mounted display device with the modified appearance.
3 Assignments
0 Petitions
Abstract
Embodiments that relate to interacting with a physical object in a mixed reality environment via a head-mounted display are disclosed. In one embodiment, a mixed reality interaction program identifies an object based on an image captured by the display. An interaction context for the object is determined based on an aspect of the mixed reality environment. A profile for the physical object is queried to determine interaction modes for the object. A selected interaction mode is programmatically selected based on the interaction context. A user input directed at the object is received via the display and interpreted to correspond to a virtual action based on the selected interaction mode. The virtual action is executed with respect to a virtual object associated with the physical object to modify an appearance of the virtual object. The modified virtual object is then displayed via the display.
228 Citations
20 Claims
1. A mixed reality interaction system for interacting with a physical object in a mixed reality environment, the mixed reality interaction system comprising:
a head-mounted display device operatively connected to a computing device, the head-mounted display device including a display system for presenting the mixed reality environment and a plurality of input sensors including a camera for capturing an image of the physical object; and
a mixed reality interaction program executed by a processor of the computing device, the mixed reality interaction program configured to:
identify the physical object based on the captured image;
determine an interaction context for the identified physical object based on one or more aspects of the mixed reality environment;
query a stored profile for the physical object to determine a plurality of interaction modes for the physical object;
programmatically select a selected interaction mode from the plurality of interaction modes based on the interaction context;
receive a user input directed at the physical object via one of the input sensors of the head-mounted display device;
interpret the user input to correspond to a virtual action based on the selected interaction mode;
execute the virtual action with respect to a virtual object associated with the physical object to thereby modify an appearance of the virtual object; and
display the virtual object via the head-mounted display device with the modified appearance.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
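The context-dependent mode selection recited in claim 1 can be sketched as follows. This is a minimal illustration only: the object labels, the profile layout, and the input-to-action mappings are assumptions made for the example, not taken from the patent.

```python
# Hypothetical sketch of the claimed pipeline: identify a physical object
# from a camera image, select an interaction mode from the object's stored
# profile based on the current interaction context, interpret a user input
# as a virtual action under that mode, and apply the action to the
# associated virtual object. All names and data are illustrative.
from dataclasses import dataclass

@dataclass
class VirtualObject:
    appearance: str

# Stored profile: one interaction mode per context, and per-mode
# input-to-action maps (each action modifies the virtual object's appearance).
PROFILES = {
    "mug": {
        "modes_by_context": {"kitchen": "fill", "office": "annotate"},
        "actions_by_mode": {
            "fill":     {"tap": lambda v: setattr(v, "appearance", "steaming")},
            "annotate": {"tap": lambda v: setattr(v, "appearance", "labelled")},
        },
    }
}

def identify(image):
    # Stand-in for image recognition: assume the image carries its own label.
    return image["label"]

def interact(image, context, user_input, virtual):
    obj = identify(image)                                  # identify the physical object
    profile = PROFILES[obj]                                # query stored profile
    mode = profile["modes_by_context"][context]            # programmatic mode selection
    action = profile["actions_by_mode"][mode][user_input]  # interpret input as action
    action(virtual)                                        # execute -> modify appearance
    return virtual                                         # display modified object

v = interact({"label": "mug"}, "kitchen", "tap", VirtualObject("plain"))
print(v.appearance)  # -> steaming
```

The same "tap" input yields a different virtual action when the context is "office" instead of "kitchen", which is the core of the claimed context-based interpretation.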
11. A method for interacting with a physical object in a mixed reality environment, comprising:
providing a head-mounted display device operatively connected to a computing device, the head-mounted display device including a display system for presenting the mixed reality environment and a plurality of input sensors including a camera for capturing an image of the physical object;
identifying the physical object based on the captured image;
determining an interaction context for the identified physical object based on one or more aspects of the mixed reality environment;
querying a stored profile for the physical object to determine a plurality of interaction modes for the physical object;
programmatically selecting a selected interaction mode from the plurality of interaction modes based on the interaction context;
receiving a user input directed at the physical object via one of the input sensors of the head-mounted display device;
interpreting the user input to correspond to a virtual action based on the selected interaction mode;
executing the virtual action with respect to a virtual object associated with the physical object to thereby modify an appearance of the virtual object; and
displaying the virtual object via the head-mounted display device with the modified appearance.
View Dependent Claims (12, 13, 14, 15, 16, 17, 18)
19. A method for interacting with a physical object in a mixed reality environment, comprising:
providing a head-mounted display device operatively connected to a computing device, the head-mounted display device including a display system for presenting the mixed reality environment and a plurality of input sensors including a camera for capturing an image of the physical object;
identifying the physical object based on the captured image;
determining an interaction context for the identified physical object based on one or more aspects of the mixed reality environment;
querying a stored profile for the physical object to determine a plurality of interaction modes for the physical object;
programmatically selecting a first selected interaction mode from the plurality of interaction modes based on the interaction context;
receiving a first user input directed at the physical object via one of the input sensors of the head-mounted display device;
interpreting the first user input to correspond to a first virtual action based on the first selected interaction mode;
executing the first virtual action with respect to a virtual object associated with the physical object to thereby modify an appearance of the virtual object to a first modified appearance;
displaying the virtual object via the head-mounted display device with the first modified appearance;
determining a change in the interaction context;
based on the change in the interaction context, programmatically selecting a second selected interaction mode from the plurality of interaction modes;
receiving a second user input directed at the physical object via one of the input sensors of the head-mounted display device;
interpreting the second user input to correspond to a second virtual action based on the second selected interaction mode;
executing the second virtual action with respect to the virtual object associated with the physical object to thereby modify an appearance of the virtual object to a second modified appearance; and
displaying the virtual object via the head-mounted display device with the second modified appearance.
View Dependent Claims (20)
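The distinguishing step of claim 19 is that a change in the interaction context triggers selection of a second interaction mode, so the same input maps to a different virtual action. A minimal sketch, with purely illustrative contexts, modes, and appearance strings:

```python
# Hypothetical illustration of claim 19's two-phase flow: the same user
# input produces a first modified appearance under the first mode, then a
# context change causes a second mode to be selected programmatically,
# yielding a second modified appearance. All mappings are assumptions.
MODES_BY_CONTEXT = {"game": "paint", "work": "measure"}
ACTIONS_BY_MODE = {
    "paint":   {"swipe": "recolored"},    # virtual action -> new appearance
    "measure": {"swipe": "dimensioned"},
}

def select_mode(context):
    # Programmatic mode selection based on the current interaction context.
    return MODES_BY_CONTEXT[context]

def apply_input(context, user_input):
    mode = select_mode(context)
    return ACTIONS_BY_MODE[mode][user_input]

first = apply_input("game", "swipe")   # first modified appearance
# ... the interaction context changes (e.g. the user switches to a work task) ...
second = apply_input("work", "swipe")  # second modified appearance
print(first, second)  # -> recolored dimensioned
```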
Specification