Method and apparatus for gesture interaction with a photo-active painted surface
Abstract
A method and apparatus for gesture interaction with a photo-active painted surface is described. The method may include driving a spatial electromagnetic modulator to emit electromagnetic stimulation in the form of an image to cause photo-active paint to display the image. The method may also include capturing, with at least a camera of a painted surface display system, image data of the image displayed on the photo-active paint applied to a surface and a user motion performed relative to the image. The method may also include analyzing the captured image data to determine a sequence of one or more physical movements of the user relative to the image displayed on the photo-active paint. The method may also include determining, based on the analysis, that the user motion is indicative of a gesture, and driving the spatial electromagnetic modulator to update the image displayed on the photo-active paint in response to the gesture.
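The abstract describes a capture-analyze-update pipeline: display an image on the photo-active paint, capture the user's motion relative to it, classify that motion as a gesture, and re-drive the modulator in response. The following is a minimal, hypothetical sketch of that loop; the names (`SpatialEMModulator`, `analyze_motion`, `run_pipeline`, the swipe threshold) are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch only: class and function names are hypothetical,
# not drawn from the patent's specification.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class SpatialEMModulator:
    """Stand-in for the spatial electromagnetic modulator that stimulates
    the photo-active paint to display an image."""
    displayed_image: Optional[str] = None

    def emit(self, image: str) -> None:
        # Driving the modulator causes the paint to display `image`.
        self.displayed_image = image


def analyze_motion(hand_x_positions: List[float]) -> Optional[str]:
    """Reduce a sequence of tracked hand positions (the captured image
    data) to a gesture label, if the motion is large enough to count."""
    if len(hand_x_positions) < 2:
        return None
    dx = hand_x_positions[-1] - hand_x_positions[0]
    if dx > 0.2:
        return "swipe_right"
    if dx < -0.2:
        return "swipe_left"
    return None


def run_pipeline(modulator: SpatialEMModulator,
                 frames: List[float]) -> Optional[str]:
    # 1. Drive the modulator so the paint displays an image.
    modulator.emit("menu_page_1")
    # 2-3. Capture and analyze user motion relative to that image.
    gesture = analyze_motion(frames)
    # 4. Update the displayed image in response to the gesture.
    if gesture == "swipe_right":
        modulator.emit("menu_page_2")
    return gesture
```

For example, a rightward hand trajectory such as `[0.1, 0.2, 0.45]` would be classified as `"swipe_right"` and cause the modulator to re-stimulate the paint with the next page of the interface.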
47 Citations
20 Claims
1. A non-transitory computer readable storage medium including instructions that, when executed by a processor, cause the processor to:
- monitor a surface and an area in front of the surface;
- determine, based on the monitoring, when a user approaches the surface or is in the area in front of the surface;
- generate, based on the determination, an image of a control interface on the surface;
- capture image data of the image of the control interface displayed on the surface and user motion performed relative to the image of the control interface;
- analyze the captured image data to determine if the user motion is indicative of a gesture associated with the image of the control interface displayed on the surface; and
- control a connected system in response to the gesture, wherein the connected system is different than a system including the processor, and wherein the control interface is associated with the connected system.

View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)

11. A system, comprising:
- an audio-visual sensor coupled to monitor a surface and an area in front of the surface, and provide sensor data in response;
- an electromagnetic (EM) modulator coupled to stimulate generation of an image on the surface, wherein photo-active compounds disposed on the surface generate the image in response to the stimulation, and wherein the EM modulator stimulates image generation in response to one or more control signals; and
- a non-transitory computer readable storage medium including instructions that, when executed by a processor, cause the system to:
  - receive first sensor data from the audio-visual sensor;
  - determine, based on the first sensor data, a context for a user with respect to the surface;
  - provide the one or more control signals to the EM modulator in response to the determination of the context for the user, wherein the one or more control signals cause the EM modulator to stimulate generation of an image of a control interface, the control interface associated with a connected system different than the system;
  - receive second sensor data from the audio-visual sensor, the second sensor data including user motion;
  - determine, based on the second sensor data, that the user motion is indicative of a gesture associated with the image of the control interface displayed on the surface; and
  - control the connected system in response to the gesture.

View Dependent Claims (12, 13, 14, 15, 16, 17, 18, 19, 20)
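Claim 11 describes a sense-decide-act cycle: first sensor data establishes the user's context relative to the surface, the control interface is shown only in the appropriate context, and a recognized gesture then controls a connected system that is separate from the display system itself. A minimal sketch of that cycle follows; every name here (`classify_context`, `ConnectedSystem`, the 1.5 m threshold, the `"tap_power"` gesture) is a hypothetical illustration, not from the patent.

```python
# Illustrative sketch only: names and thresholds are hypothetical.
from typing import Optional, Tuple


def classify_context(distance_m: float, threshold_m: float = 1.5) -> str:
    """First sensor data: is the user close enough to the surface
    for the control interface to be displayed?"""
    return "user_present" if distance_m <= threshold_m else "idle"


class ConnectedSystem:
    """Stand-in for the separate system the control interface drives
    (e.g. a light or thermostat), distinct from the display system."""
    def __init__(self) -> None:
        self.state = "off"

    def handle(self, gesture: str) -> None:
        if gesture == "tap_power":
            self.state = "on" if self.state == "off" else "off"


def control_cycle(distance_m: float,
                  gesture: Optional[str],
                  connected: ConnectedSystem) -> Tuple[bool, str]:
    # Show the control interface only when context warrants it,
    # and forward recognized gestures to the connected system.
    interface_shown = classify_context(distance_m) == "user_present"
    if interface_shown and gesture is not None:
        connected.handle(gesture)
    return interface_shown, connected.state
```

In this sketch a `"tap_power"` gesture toggles the connected system only when the user is near enough that the interface is displayed; gestures made from farther away are ignored, mirroring the claim's context-gated control flow.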
Specification