Calibrating vision systems
First Claim
1. A method of calibration, comprising:
receiving, by a processor, an image of a human gesture;
calibrating by identifying the human gesture in the image;
determining a gesture interaction area framed by the human gesture;
computing an interaction boundary of the gesture interaction area;
computing a gesture area defined by the interaction boundary of the gesture interaction area;
mapping the gesture area defined by the interaction boundary of the gesture interaction area to pixels in a display device;
receiving another image of a different human gesture;
mapping the another image of the different human gesture to different regions within the gesture interaction area; and
interpreting the different regions within the gesture interaction area to a command;
wherein human gestures are calibrated to the pixels in the display device.
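The claimed calibration steps can be sketched in a few lines. The sketch below assumes the operator frames a rectangular interaction area with fingertips whose image coordinates have already been extracted by a hand-tracking stage (corner detection itself is outside the claim language); the axis-aligned bounding box, the normalization, and the display resolution are all illustrative assumptions, not details from the patent.

```python
# Illustrative sketch of the claimed flow: compute the interaction
# boundary framed by the gesture, compute the gesture area, and map
# points in that area to pixels in a display device.

def interaction_boundary(corners):
    """Axis-aligned boundary of the gesture interaction area
    (assumption: a bounding box over the detected fingertip points)."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    return min(xs), min(ys), max(xs), max(ys)

def gesture_area(boundary):
    """Area enclosed by the interaction boundary, in camera pixels."""
    x0, y0, x1, y1 = boundary
    return (x1 - x0) * (y1 - y0)

def map_to_display(point, boundary, display_w, display_h):
    """Map a point inside the gesture area onto display-device pixels."""
    x0, y0, x1, y1 = boundary
    u = (point[0] - x0) / (x1 - x0)   # normalize to [0, 1]
    v = (point[1] - y0) / (y1 - y0)
    return round(u * (display_w - 1)), round(v * (display_h - 1))

# Example: fingertips frame a 200x100-pixel region of the camera image.
corners = [(40, 60), (240, 60), (40, 160), (240, 160)]
b = interaction_boundary(corners)     # (40, 60, 240, 160)
area = gesture_area(b)                # 20000 camera pixels
cursor = map_to_display((140, 110), b, 1920, 1080)
```

Once this mapping is established, any later gesture located inside the boundary resolves directly to a display pixel, which is what the wherein clause ("human gestures are calibrated to the pixels in the display device") describes.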
Abstract
Methods, systems, and computer program products calibrate a vision system. An image of a human gesture that frames a display device is received. A boundary defined by the human gesture is computed, and a gesture area defined by the boundary is also computed. The gesture area is then mapped to pixels in the display device.
20 Claims
1. A method of calibration, comprising:
receiving, by a processor, an image of a human gesture;
calibrating by identifying the human gesture in the image;
determining a gesture interaction area framed by the human gesture;
computing an interaction boundary of the gesture interaction area;
computing a gesture area defined by the interaction boundary of the gesture interaction area;
mapping the gesture area defined by the interaction boundary of the gesture interaction area to pixels in a display device;
receiving another image of a different human gesture;
mapping the another image of the different human gesture to different regions within the gesture interaction area; and
interpreting the different regions within the gesture interaction area to a command;
wherein human gestures are calibrated to the pixels in the display device.

View Dependent Claims (2, 3, 4, 5, 6, 7)
8. A system, comprising:
a processor; and
a memory storing code that when executed causes the processor to perform operations, the operations comprising:
receiving an image of a human gesture performed by an operator of a vision system;
calibrating by recognizing the human gesture in the image;
determining a gesture interaction area framed by the human gesture;
computing an interaction boundary defined by the gesture interaction area;
computing a gesture area defined by the gesture interaction area;
mapping the gesture area defined by the interaction boundary of the gesture interaction area to pixels in a display device;
receiving another image of a different human gesture;
mapping the another image of the different human gesture to different regions within the gesture interaction area; and
interpreting the different regions within the gesture interaction area to a command.

View Dependent Claims (9, 10, 11, 12, 13, 14)
15. A memory storing processor executable instructions that when executed cause a processor to perform operations, the operations comprising:
receiving an image of a human gesture performed by an operator of a vision system;
calibrating by recognizing the human gesture in the image;
determining a gesture interaction area framed by the human gesture;
computing an interaction boundary defined by the gesture interaction area;
computing a gesture area defined by the gesture interaction area;
mapping the gesture area defined by the interaction boundary of the gesture interaction area to pixels of a display device;
receiving another image of a different human gesture;
mapping the another image of the different human gesture to different regions within the gesture interaction area; and
interpreting the different regions within the gesture interaction area to a command.

View Dependent Claims (16, 17, 18, 19, 20)
Specification