Using tracking to simulate direct tablet interaction in mixed reality
First Claim
1. A computer system comprising:
one or more processor(s); and
one or more computer-readable hardware storage media having stored thereon computer-executable instructions that are executable by the one or more processor(s) to cause the computer system to:
on a wearable display of the computer system, render an augmented-reality scene for a user who is wearing the wearable display;
within the augmented-reality scene, render an interactive virtual object;
detect a position of a part of an actual hand of the user, the position being detected relative to a portion of the interactive virtual object such that the part of the user's actual hand is identified as approaching the portion of the interactive virtual object;
anticipate a type of gesture the user will subsequently use to interact with the interactive virtual object, wherein the anticipated gesture type is anticipated based at least partially on an identified type of the interactive virtual object, where the identified type is based on which interactive abilities the interactive virtual object is identified as having;
in response to determining that the part of the user's actual hand is within a target threshold distance to the portion of the interactive virtual object, display a target visual cue on the portion of the interactive virtual object;
in response to determining that the part of the user's actual hand is within an input threshold distance to the portion of the interactive virtual object, display an input visual cue on the portion of the interactive virtual object, the input visual cue indicating that input is now receivable for the interactive virtual object, the input being received at a same location as where the input visual cue is being displayed; and
upon a condition in which the part of the user's actual hand is simultaneously positioned within (i) either the target threshold distance or the input threshold distance to the portion of the interactive virtual object and (ii) either a corresponding target threshold distance or a corresponding input threshold distance to a corresponding different portion of a different interactive virtual object:
determine which interactive virtual object the user is attempting to interact with;
in response to determining that the user is attempting to interact with the corresponding different portion of the different interactive virtual object, display either the corresponding target visual cue or the corresponding input visual cue on the corresponding different portion of the different interactive virtual object; and
based on a subsequent movement of the part of the user's actual hand while the part of the user's actual hand is within the corresponding input threshold distance, provide input to an application via the different interactive virtual object.
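The two-threshold cue logic recited in this claim can be sketched in a few lines of Python. This is an illustrative sketch only: the concrete distances, function name, and cue labels below are hypothetical and are not taken from the patent.

```python
# Hypothetical sketch of the claim's two-threshold visual-cue logic.
# The threshold values are illustrative placeholders, not from the patent.

TARGET_THRESHOLD = 0.30  # meters; hand is "approaching" the object
INPUT_THRESHOLD = 0.05   # meters; input is now receivable

def cue_state(hand_to_object_distance):
    """Map the hand-to-object distance to the visual cue to display.

    Returns "input" inside the (smaller) input threshold, "target" inside
    the (larger) target threshold, and None when the hand is farther away.
    """
    if hand_to_object_distance <= INPUT_THRESHOLD:
        return "input"   # input visual cue: input is receivable here
    if hand_to_object_distance <= TARGET_THRESHOLD:
        return "target"  # target visual cue: hand is approaching
    return None          # no cue displayed

print(cue_state(0.50))  # None
print(cue_state(0.20))  # target
print(cue_state(0.03))  # input
```

Because the input threshold lies inside the target threshold, the input cue always takes precedence once the hand is close enough to provide input.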
Abstract
Optimizations are provided for facilitating interactions with virtual objects included within an augmented-reality scene. Initially, an augmented-reality scene is rendered for a user. Within that scene, an interactive virtual object of an application is rendered. Then, the position of the user's actual hand is determined relative to the interactive virtual object. When the user's actual hand is within a target threshold distance to the interactive virtual object, a target visual cue is projected onto the interactive virtual object. When the user's actual hand is within an input threshold distance to the interactive virtual object, an input visual cue is projected onto the interactive virtual object. Once the user's hand is within the input threshold distance to the interactive virtual object, input may be provided to the application via the interactive virtual object.
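The independent claims also recite a disambiguation step: when the hand is simultaneously within a threshold distance of two interactive objects, the system must determine which object the user is attempting to interact with. The patent text here does not prescribe a selection criterion; choosing the nearest object is one simple illustrative heuristic, sketched below with hypothetical names.

```python
# Hypothetical sketch of the disambiguation step: when the hand is within a
# threshold of two interactive objects at once, pick one as the target.
# Nearest-object selection is an assumed heuristic, not the patent's method.

def choose_target(distances):
    """Pick the object the user is most plausibly reaching for.

    `distances` maps object id -> current hand-to-object distance (meters).
    Returns the id of the nearest object, or None if `distances` is empty.
    """
    if not distances:
        return None
    return min(distances, key=distances.get)

# Hand is 0.04 m from a virtual key and 0.12 m from a virtual slider:
print(choose_target({"key": 0.04, "slider": 0.12}))  # key
```

Once a target is chosen, the corresponding target or input visual cue is displayed only on that object's portion, which matches the claim's requirement that the cue follow the object the user is attempting to interact with.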
20 Claims
1. A computer system comprising:
one or more processor(s); and
one or more computer-readable hardware storage media having stored thereon computer-executable instructions that are executable by the one or more processor(s) to cause the computer system to:
on a wearable display of the computer system, render an augmented-reality scene for a user who is wearing the wearable display;
within the augmented-reality scene, render an interactive virtual object;
detect a position of a part of an actual hand of the user, the position being detected relative to a portion of the interactive virtual object such that the part of the user's actual hand is identified as approaching the portion of the interactive virtual object;
anticipate a type of gesture the user will subsequently use to interact with the interactive virtual object, wherein the anticipated gesture type is anticipated based at least partially on an identified type of the interactive virtual object, where the identified type is based on which interactive abilities the interactive virtual object is identified as having;
in response to determining that the part of the user's actual hand is within a target threshold distance to the portion of the interactive virtual object, display a target visual cue on the portion of the interactive virtual object;
in response to determining that the part of the user's actual hand is within an input threshold distance to the portion of the interactive virtual object, display an input visual cue on the portion of the interactive virtual object, the input visual cue indicating that input is now receivable for the interactive virtual object, the input being received at a same location as where the input visual cue is being displayed; and
upon a condition in which the part of the user's actual hand is simultaneously positioned within (i) either the target threshold distance or the input threshold distance to the portion of the interactive virtual object and (ii) either a corresponding target threshold distance or a corresponding input threshold distance to a corresponding different portion of a different interactive virtual object:
determine which interactive virtual object the user is attempting to interact with;
in response to determining that the user is attempting to interact with the corresponding different portion of the different interactive virtual object, display either the corresponding target visual cue or the corresponding input visual cue on the corresponding different portion of the different interactive virtual object; and
based on a subsequent movement of the part of the user's actual hand while the part of the user's actual hand is within the corresponding input threshold distance, provide input to an application via the different interactive virtual object. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
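Claim 1 additionally recites anticipating the gesture type from the identified type of the interactive virtual object, i.e. from the interactive abilities the object is identified as having. A minimal sketch of that mapping, assuming hypothetical object types and gesture labels (the patent does not enumerate specific pairs):

```python
# Hypothetical sketch of gesture-type anticipation by object type.
# The object types and gesture labels are illustrative assumptions.

EXPECTED_GESTURE = {
    "button": "press",   # a button affords a press gesture
    "slider": "drag",    # a slider affords a drag gesture
    "dial":   "rotate",  # a dial affords a rotate gesture
}

def anticipate_gesture(object_type):
    """Return the gesture type the user will likely use, or None."""
    return EXPECTED_GESTURE.get(object_type)

print(anticipate_gesture("slider"))  # drag
```

Anticipating the gesture before contact lets the system pre-select which hand part and which threshold geometry to track as the hand approaches.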
11. A method implemented by one or more processor(s) of a computer system, the method comprising:
on a wearable display of the computer system, rendering an augmented-reality scene for a user who is wearing the wearable display;
within the augmented-reality scene, rendering an interactive virtual object;
detecting a position of a part of an actual hand of the user, the position being detected relative to a portion of the interactive virtual object such that the part of the user's actual hand is identified as approaching the portion of the interactive virtual object;
anticipating a type of gesture the user will subsequently use to interact with the interactive virtual object, wherein the anticipated gesture type is anticipated based at least partially on an identified type of the interactive virtual object, where the identified type is based on which interactive abilities the interactive virtual object is identified as having;
in response to determining that the part of the user's actual hand is within a target threshold distance to the portion of the interactive virtual object, displaying a target visual cue on the portion of the interactive virtual object;
in response to determining that the part of the user's actual hand is within an input threshold distance to the portion of the interactive virtual object, displaying an input visual cue on the portion of the interactive virtual object, the input visual cue indicating that input is now receivable for the interactive virtual object, the input being received at a same location as where the input visual cue is being displayed; and
upon a condition in which the part of the user's actual hand is simultaneously positioned within (i) either the target threshold distance or the input threshold distance to the portion of the interactive virtual object and (ii) either a corresponding target threshold distance or a corresponding input threshold distance to a corresponding different portion of a different interactive virtual object:
determining which interactive virtual object the user is attempting to interact with;
in response to determining that the user is attempting to interact with the corresponding different portion of the different interactive virtual object, displaying either the corresponding target visual cue or the corresponding input visual cue on the corresponding different portion of the different interactive virtual object; and
based on a subsequent movement of the part of the user's actual hand while the part of the user's actual hand is within the corresponding input threshold distance, providing input to an application via the different interactive virtual object. - View Dependent Claims (12, 13, 14, 15)
16. One or more hardware storage devices having stored thereon computer-executable instructions that are executable by one or more processor(s) of a computer system to cause the computer system to:
on a wearable display of the computer system, render an augmented-reality scene for a user who is wearing the wearable display;
within the augmented-reality scene, render an interactive virtual object;
detect a position of a part of an actual hand of the user, the position being detected relative to a portion of the interactive virtual object such that the part of the user's actual hand is identified as approaching the portion of the interactive virtual object;
anticipate a type of gesture the user will subsequently use to interact with the interactive virtual object, wherein the anticipated gesture type is anticipated based at least partially on an identified type of the interactive virtual object, where the identified type is based on which interactive abilities the interactive virtual object is identified as having;
in response to determining that the part of the user's actual hand is within a target threshold distance to the portion of the interactive virtual object, display a target visual cue on the portion of the interactive virtual object;
in response to determining that the part of the user's actual hand is within an input threshold distance to the portion of the interactive virtual object, display an input visual cue on the portion of the interactive virtual object, the input visual cue indicating that input is now receivable for the interactive virtual object, the input being received at a same location as where the input visual cue is being displayed; and
upon a condition in which the part of the user's actual hand is simultaneously positioned within (i) either the target threshold distance or the input threshold distance to the portion of the interactive virtual object and (ii) either a corresponding target threshold distance or a corresponding input threshold distance to a corresponding different portion of a different interactive virtual object:
determine which interactive virtual object the user is attempting to interact with;
in response to determining that the user is attempting to interact with the corresponding different portion of the different interactive virtual object, display either the corresponding target visual cue or the corresponding input visual cue on the corresponding different portion of the different interactive virtual object; and
based on a subsequent movement of the part of the user's actual hand while the part of the user's actual hand is within the corresponding input threshold distance, provide input to an application via the different interactive virtual object. - View Dependent Claims (17, 18, 19)
20. A computer system comprising:
one or more processors; and
one or more computer-readable hardware storage media having stored thereon computer-executable instructions, the computer-executable instructions being executable by the one or more processors to cause the computer system to:
on a wearable display of the computer system, render an augmented-reality scene for a user who is wearing the wearable display;
within the augmented-reality scene, render an interactive virtual object;
detect a position of a part of an actual hand of the user, the position being detected relative to a portion of the interactive virtual object;
in response to determining that the part of the user's actual hand is within a target threshold distance to the portion of the interactive virtual object, display a target visual cue on the portion of the interactive virtual object;
in response to determining that the part of the user's actual hand is within an input threshold distance to the portion of the interactive virtual object, display an input visual cue on the portion of the interactive virtual object, the input visual cue indicating that input is now receivable for the interactive virtual object, the input being received at a same location as where the input visual cue is being displayed, wherein the part of the user's actual hand is simultaneously positioned within (i) either the target threshold distance or the input threshold distance to the portion of the interactive virtual object and (ii) either the target threshold distance or the input threshold distance to a different portion of a different interactive virtual object;
determine which interactive virtual object the user is attempting to interact with;
in response to determining that the user is attempting to interact with the different portion of the different interactive virtual object, display either the target visual cue or the input visual cue on the different portion of the different interactive virtual object; and
based on a subsequent movement of the part of the user's actual hand while the part of the user's actual hand is within the input threshold distance, provide input to an application via the interactive virtual object.
Specification