Gesture controlled adaptive projected information handling system input and output devices
First Claim
1. An information handling system comprising:
a housing;
processing components disposed in the housing and operable to cooperate to process information;
a projector interfaced with the processing components and operable to present the information as images at a projection surface, the images including at least one input device;
an input sensor operable to detect end user hand movements proximate the projection surface, the input sensor including at least a camera operable to capture an image of the end user hand; and
a gesture module stored in non-transitory memory and interfaced with the projector and the input sensor, the gesture module operable to adapt a projected input device position on the projection surface in response to an end user hand movement detected by the input sensor;
wherein the gesture module is further operable to determine a unique feature of the end user hand associated with an identification of the end user and in response to the identification to present a user interface associated with the end user.
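The claimed gesture module performs two functions: repositioning a projected input device in response to a detected hand movement, and recognizing a unique hand feature to load that user's interface. A minimal sketch of that decision logic, assuming hypothetical names (`GestureModule`, `ProjectedInputDevice`, `on_hand_movement`, `identify_user`) that are illustrative only and not from the patent:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class ProjectedInputDevice:
    name: str
    position: Tuple[int, int]  # (x, y) location on the projection surface

@dataclass
class GestureModule:
    devices: Dict[str, ProjectedInputDevice] = field(default_factory=dict)
    # Maps a unique hand feature (e.g., a hash of hand geometry) to a
    # user-interface configuration associated with that end user.
    user_profiles: Dict[str, str] = field(default_factory=dict)

    def on_hand_movement(self, device_name: str, hand_pos: Tuple[int, int]) -> None:
        """Adapt a projected input device's position to follow the detected hand."""
        device = self.devices.get(device_name)
        if device is not None:
            device.position = hand_pos

    def identify_user(self, hand_feature: str) -> Optional[str]:
        """Return the user interface associated with a recognized hand feature."""
        return self.user_profiles.get(hand_feature)

# Illustrative use: a projected keyboard follows the hand, and a known
# hand feature selects that user's interface.
gm = GestureModule(
    devices={"keyboard": ProjectedInputDevice("keyboard", (0, 0))},
    user_profiles={"feature-a": "user-a-interface"},
)
gm.on_hand_movement("keyboard", (120, 40))
```

The sketch omits the computer-vision layer entirely; it only illustrates how the two claimed behaviors could be dispatched once the input sensor has produced a hand position and a hand feature.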
Abstract
Projected input and output devices adapt to a desktop environment by sensing objects at the desktop environment and altering projected light in response to the sensed objects. For instance, projection of input and output devices is altered to limit illumination against an end user's hands or other objects disposed at a projection surface. End user hand positions and motions are detected to provide gesture support for adapting a projection work space, and configurations of projected devices are stored so that an end user can rapidly recreate a projected desktop. A projector scan adjusts to limit traces across inactive portions of the display surface and to increase traces at predetermined areas, such as video windows.
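The scan adjustment described above amounts to budgeting projector traces by region: inactive portions of the display surface receive few traces, while predetermined areas such as video windows receive more. A minimal sketch of that allocation, assuming hypothetical region names and weights that are illustrative only:

```python
def allocate_traces(regions, total_traces):
    """Distribute a projector's scan traces across display regions in
    proportion to each region's weight (higher weight = more traces)."""
    total_weight = sum(weight for _, weight in regions)
    return {
        name: round(total_traces * weight / total_weight)
        for name, weight in regions
    }

# Illustrative weighting: a video window gets most of the scan budget,
# a projected keyboard an intermediate share, and inactive area the least.
regions = [("video_window", 6), ("projected_keyboard", 3), ("inactive_area", 1)]
budget = allocate_traces(regions, 1000)
# → {'video_window': 600, 'projected_keyboard': 300, 'inactive_area': 100}
```

The patent's abstract does not specify the allocation scheme; proportional weighting is one simple way to realize "limit traces across inactive portions... and increase traces at predetermined areas."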
8 Claims
Claims 2–8 depend on claim 1.
Specification