Providing user feedback in projection environments
First Claim
1. A system comprising:
one or more processors;
a projector, coupled to the one or more processors and configured to project, onto a surface and within an environment, a user interface that includes multiple selectable portions;
a camera, coupled to the one or more processors and configured to capture information for identifying a user within the environment interacting with the user interface; and
one or more computer-readable media storing computer-executable instructions that, when executed by the one or more processors, cause the one or more processors to perform acts comprising:
identifying, using the information captured by the camera, the user interacting with the user interface by moving a selection tool towards the surface;
determining which of the multiple selectable portions the user is in position to select based at least in part on the identifying;
altering the user interface projected by the projector to highlight the determined selectable portion; and
identifying, using the information captured by the camera, the user selecting the determined selectable portion, the identifying of the user selecting the determined selectable portion comprising at least one of:
(1) identifying the user selecting the determined selectable portion in response to the user moving the selection tool toward the surface and past a predefined selection plane without touching the surface, or (2) identifying the user selecting the determined selectable portion in response to the user moving the selection tool toward the surface, without touching the surface, and then back away from the surface.
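The two touch-free selection gestures recited above can be sketched as a simple check over the selection tool's per-frame distance from the projection surface. This is an illustrative sketch only: the function name, the distance units, and the three thresholds (`SELECTION_PLANE_MM`, `TOUCH_MM`, `RETREAT_MM`) are assumptions, not values from the patent.

```python
# Assumed thresholds, in millimeters above the projection surface.
SELECTION_PLANE_MM = 30   # the "predefined selection plane" of the claim
TOUCH_MM = 2              # closer than this counts as touching the surface
RETREAT_MM = 10           # minimum pull-back to register gesture (2)

def detect_selection(distances):
    """Return 1 or 2 for the matching claimed gesture, or None.

    distances: per-frame distance of the selection tool from the surface,
    as the camera would report it, oldest frame first.
    """
    closest = float("inf")
    for d in distances:
        if d <= TOUCH_MM:
            return None  # tool touched the surface: neither gesture applies
        closest = min(closest, d)
        # Gesture (1): tool moved past the selection plane without touching.
        if d < SELECTION_PLANE_MM:
            return 1
        # Gesture (2): tool approached, then moved back away from the surface.
        if d - closest >= RETREAT_MM:
            return 2
    return None
```

For example, a finger descending 50 → 40 → 25 mm crosses the assumed 30 mm plane and registers gesture (1), while 50 → 35 → 45 mm (approach, then retreat, never touching) registers gesture (2).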
1 Assignment
0 Petitions
Abstract
Systems and techniques for providing feedback to users within an environment who interact with user interfaces (UIs) that are projected within the environment. For instance, the systems and techniques may project a UI that includes one or more selectable portions, such as keys, icons, sliders, dials, or any other type of control. After projecting the UI, the systems and techniques may identify a user attempting to interact with the UI. In response, the systems and techniques may provide feedback to the user indicating that the user has engaged the UI. For instance, the systems and techniques may visually alter the projected UI, may output a sound via one or more speakers within the environment, or may provide the feedback in any other manner.
43 Citations
32 Claims
1. (Recited in full above as the first claim.) Dependent claims: 2-9.
10. One or more computer-readable storage media storing computer-executable instructions that, when executed by one or more processors, cause the one or more processors to perform acts comprising:
projecting, with a projector, a user interface onto a surface within an environment;
capturing, with a camera, images of a user within the environment moving a selection tool toward the surface;
identifying a trajectory of the selection tool toward the surface using the captured images;
highlighting a portion of multiple different portions of the user interface that corresponds to a position of the selection tool based at least in part on the identified trajectory; and
identifying, using the captured images, the user selecting the portion of the user interface that corresponds to the position of the selection tool by at least one of:
(1) identifying the user selecting the portion of the user interface that corresponds to the position of the selection tool in response to the user moving the selection tool toward the surface and past a predefined selection plane without touching the surface, or (2) identifying the user selecting the portion of the user interface that corresponds to the position of the selection tool in response to the user moving the selection tool toward the surface, without touching the surface, and then back away from the surface.
Dependent claims: 11-20.
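Claim 10's highlighting step, picking the UI portion "based at least in part on the identified trajectory", can be sketched as a straight-line extrapolation of recent tool positions down to the surface plane. The geometry below is an assumption for illustration: the surface is taken as the z = 0 plane, positions are (x, y, z) samples from the camera, and the function and portion names are invented.

```python
def landing_point(samples):
    """samples: recent (x, y, z) tool positions, newest last; z is height
    above the surface. Returns the (x, y) where the straight-line
    trajectory meets z = 0, or None if the tool is not descending."""
    (x0, y0, z0), (x1, y1, z1) = samples[-2], samples[-1]
    if z1 >= z0:
        return None            # moving away from (or parallel to) the surface
    t = z1 / (z0 - z1)         # extra steps until z reaches 0
    return (x1 + t * (x1 - x0), y1 + t * (y1 - y0))

def portion_to_highlight(samples, portions):
    """portions: dict of name -> (xmin, ymin, xmax, ymax) bounds on the
    surface. Returns the portion the trajectory would land in, if any."""
    hit = landing_point(samples)
    if hit is None:
        return None
    x, y = hit
    for name, (xmin, ymin, xmax, ymax) in portions.items():
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return name        # this portion would be visually highlighted
    return None
```

A tool moving from (0, 0, 20) to (1, 1, 10) is extrapolated to land at (2, 2), so a portion whose bounds contain that point would be highlighted before any touch or plane-crossing occurs.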
21. A method, comprising:
projecting, with a projector, a user interface that includes multiple selectable portions onto a surface within an environment;
identifying, with a camera, a user within the environment attempting to interact with the user interface projected onto the surface by identifying the user moving a selection tool towards the surface;
identifying a trajectory of the selection tool toward the surface using images captured by the camera;
visually highlighting a portion of multiple selectable portions of the user interface, the portion selected based at least in part on the trajectory of the selection tool; and
after visually highlighting the portion, identifying a selection of a key of the portion in response to the user continuing to move the selection tool towards the surface and past a selection plane without touching the surface.
Dependent claims: 22-32.
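Claim 21's ordered sequence (engage and highlight first, then select only by continuing past the selection plane without touching) can be sketched as a small state machine. This abstracts the trajectory-based highlight down to a simple proximity threshold; the state names, event strings, and thresholds are all illustrative assumptions.

```python
# Assumed thresholds (mm above the surface): engagement, selection plane, touch.
ENGAGE_MM, SELECT_MM, TOUCH_MM = 80, 30, 2

def run(distances):
    """distances: per-frame height of the selection tool above the surface.
    Returns the ordered feedback events the system would produce."""
    events, state = [], "idle"
    for d in distances:
        if d <= TOUCH_MM:
            return events + ["abort:touched-surface"]
        if state == "idle" and d < ENGAGE_MM:
            state = "highlighted"
            events.append("highlight-key")    # visual feedback on engagement
        elif state == "highlighted" and d < SELECT_MM:
            state = "selected"
            events.append("select-key")       # tool crossed the selection plane
    return events
```

Note the ordering the claim requires: `select-key` can only follow `highlight-key`, because the selection check runs only from the `highlighted` state.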
Specification