Systems and methods for the automated sensing of motion in a mobile robot using visual data
7 Assignments
0 Petitions
Abstract
The invention is related to methods and apparatus that detect motion by monitoring images from a video camera mounted on a mobile robot, such as an autonomously navigated mobile robot. Examples of such robots include automated vacuum floor sweepers. Advantageously, embodiments of the invention can automatically sense a robot's motional state in a relatively reliable and cost-efficient manner. Many robots are already configured to include at least one video camera, and embodiments of the invention permit an onboard video camera to determine the robot's motional state. This can advantageously be done at a fraction of the cost of additional sensors, such as laser, infrared, ultrasonic, or contact sensors.
259 Citations
52 Claims
1. A method of determining a motional state of a mobile robot, the method comprising:
retrieving pixel data for images taken at intervals from a camera that is coupled to the mobile robot;
comparing pixel data for a first image to pixel data for a second image to generate a measure of a difference between the two images, wherein comparing comprises:
filtering the first image pixel data with a gradient magnitude filter, where the gradient magnitude filter computes at least a spatial gradient;
comparing the gradient-magnitude filtered first image pixel data to a first threshold;
generating a binary map of the first image pixel data at least partly in response to the comparison of the gradient-magnitude filtered first image pixel data with the first threshold;
filtering the second image pixel data with the gradient magnitude filter;
comparing the gradient-magnitude filtered second image pixel data to a second threshold;
generating a binary map of the second image pixel data at least partly in response to the comparison of the gradient-magnitude filtered second image pixel data to the second threshold; and
comparing the binary map of the first image pixel data to the binary map of the second image pixel data to identify data for pixels that are different between the first image and the second image;
using the comparison of the pixel data to count the number of pixel data identified as changed;
comparing the count to a third predetermined threshold; and
determining the motional state of the mobile robot at least partly in response to the count. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18)
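The pipeline of claim 1 can be sketched in Python with NumPy. The gradient operator, the threshold values, and the image contents below are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def gradient_magnitude(img):
    # Spatial gradient via central differences; magnitude combines x and y.
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def binary_map(img, threshold):
    # True where the gradient magnitude exceeds the threshold.
    return gradient_magnitude(img) > threshold

def is_moving(first, second, t1=20.0, t2=20.0, count_threshold=500):
    # Count pixels whose binary-map value differs between the two images,
    # then compare that count to a third threshold.
    changed = binary_map(first, t1) != binary_map(second, t2)
    return int(changed.sum()) > count_threshold
```

Thresholding gradient magnitudes rather than raw intensities makes the comparison less sensitive to global lighting changes, since only edge locations, not absolute brightness, enter the binary maps.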
19. A method of determining a motional state of a mobile robot, the method comprising:
receiving pixel data for video images, where the video images are taken from a camera mounted to the mobile robot;
processing the pixel data for the video images to identify amounts of spatial gradient within a video image;
characterizing pixels of a video image into at least a first group and a second group, wherein the pixels of the first group correspond to a higher spatial gradient than the pixels of the second group; and
using the characterization of the pixels to compare a first video image to a second video image to detect the motional state of the mobile robot. - View Dependent Claims (20, 21, 22, 23, 24, 25, 26)
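One way to realize the two-group characterization of claim 19 is a median split on gradient magnitude; the median criterion and the change fraction below are illustrative choices, not ones mandated by the claim:

```python
import numpy as np

def characterize_pixels(img):
    # First group (True): pixels with higher spatial gradient than the
    # median; second group (False): the rest. The median split is an
    # illustrative grouping criterion.
    gy, gx = np.gradient(img.astype(float))
    magnitude = np.hypot(gx, gy)
    return magnitude > np.median(magnitude)

def motion_state(first, second, change_fraction=0.05):
    # Compare the group labels of two frames pixel by pixel; report
    # motion when enough labels differ.
    differing = characterize_pixels(first) != characterize_pixels(second)
    return "moving" if differing.mean() > change_fraction else "not moving"
```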
27. A method of controlling a behavior of a mobile robot based on a mismatch between an intended motional state and a perceived motional state in a mobile robot, the method comprising:
receiving an indication of the intended motional state, where the motional state is selected from the group including moving and not moving;
using visual data from a camera that is coupled to the mobile robot to perceive the motional state of the mobile robot, where the perceived motional state of the mobile robot is selected from the group including moving and not moving;
comparing the intended motional state to the perceived motional state to detect whether a mismatch exists between the intended motional state and the perceived motional state; and
changing the behavior of the mobile robot at least partly in response to a detected mismatch. - View Dependent Claims (28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44)
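The mismatch check of claim 27 is a small piece of control flow. In this sketch, `MotionState` and the `on_mismatch` callback are hypothetical names standing in for the robot's actual state representation and behavior change:

```python
from enum import Enum

class MotionState(Enum):
    MOVING = "moving"
    NOT_MOVING = "not moving"

def reconcile(intended, perceived, on_mismatch):
    # Detect a mismatch between the intended and perceived motional
    # states; invoke the behavior-changing callback only when the two
    # disagree. Returns True when a mismatch was detected.
    if intended != perceived:
        on_mismatch(intended, perceived)
        return True
    return False
```

Separating detection (`reconcile`) from response (`on_mismatch`) lets the same mismatch test drive different behaviors, such as an escape maneuver when the robot is stuck or an alert when it moves unexpectedly.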
45. A method of controlling the motion of a self-navigating mobile robot, the method comprising:
receiving an indication that the mobile robot is intended to be traveling in a forward direction;
determining from visual image data collected from a video camera coupled to the mobile robot that the mobile robot has ceased traveling in a forward direction;
discontinuing commands to propel the mobile robot in the forward direction;
commanding the mobile robot to travel in a reverse direction for at least a predetermined distance;
determining that the mobile robot has traveled in the reverse direction for at least about the predetermined distance;
discontinuing commands to propel the mobile robot in the reverse direction;
instructing the mobile robot to yaw by at least a first predetermined angle; and
commanding the mobile robot to resume forward motion. - View Dependent Claims (46, 47, 48)
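The recovery sequence of claim 45 can be sketched as a fixed series of drive commands. The `robot` object and its method names are hypothetical stand-ins for a drive interface, and the default distance and angle are illustrative:

```python
def escape_maneuver(robot, reverse_distance=0.2, yaw_angle=45.0):
    # Recovery sequence: stop forward motion, back up a predetermined
    # distance, turn by a predetermined angle, then resume forward
    # motion. Distance in meters and angle in degrees are assumptions.
    robot.stop_forward()
    robot.drive_reverse(reverse_distance)
    robot.stop_reverse()
    robot.yaw(yaw_angle)
    robot.drive_forward()
```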
49. A method of detecting that a mobile robot has been kidnapped, the method comprising:
receiving an indication that the mobile robot is not instructed to be moving;
receiving data for video images from a camera coupled to the mobile robot;
comparing data from different video images to determine whether or not the mobile robot is in motion; and
determining that the mobile robot has been kidnapped when the video images indicate that the mobile robot is in motion.
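The kidnap test of claim 49 reduces to a conjunction of a command state and a visual motion result. Here `detect_motion` is any visual motion test (e.g., a frame-difference check); the function and parameter names are illustrative:

```python
def is_kidnapped(commanded_to_move, frame_a, frame_b, detect_motion):
    # The robot is considered "kidnapped" (e.g., picked up and carried)
    # when the camera sees motion between two frames while no motion
    # was commanded.
    return (not commanded_to_move) and detect_motion(frame_a, frame_b)
```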
50. A circuit for a mobile robot that is configured to determine a motional state of the mobile robot, the circuit comprising:
a means for receiving pixel data for video images, where the video images are taken from a camera mounted to the mobile robot;
a means for processing the pixel data for the video images to identify amounts of spatial gradient within a video image;
a means for characterizing pixels of a video image into at least a first group and a second group, wherein the pixels of the first group correspond to a higher spatial gradient than the pixels of the second group; and
a means for using the characterization of the pixels to compare a first video image to a second video image to detect the motional state of the mobile robot.
51. A computer program embodied in a tangible medium for controlling a behavior of a mobile robot based on a mismatch between an intended motional state and a perceived motional state, the computer program comprising:
a module with instructions for receiving an indication of the intended motional state, where the motional state is selected from the group including moving and not moving;
a module with instructions for using visual data from a camera that is coupled to the mobile robot to perceive the motional state of the mobile robot, where the perceived motional state of the mobile robot is selected from the group including moving and not moving;
a module with instructions for comparing the intended motional state to the perceived motional state to detect whether a mismatch exists between the intended motional state and the perceived motional state; and
a module with instructions for changing the behavior of the mobile robot at least partly in response to a detected mismatch.
52. A circuit for control of a self-navigating mobile robot, the circuit comprising:
a circuit configured to receive an indication that the mobile robot is intended to be traveling in a forward direction;
a circuit configured to determine from visual image data collected from a video camera coupled to the mobile robot that the mobile robot has ceased traveling in a forward direction;
a circuit configured to discontinue commands to propel the mobile robot in the forward direction;
a circuit configured to command the mobile robot to travel in a reverse direction for at least a predetermined distance;
a circuit configured to determine that the mobile robot has traveled in the reverse direction for at least about the predetermined distance;
a circuit configured to discontinue commands to propel the mobile robot in the reverse direction;
a circuit configured to instruct the mobile robot to yaw by at least a first predetermined angle; and
a circuit configured to command the mobile robot to resume forward motion.
Specification