Method and apparatus to determine robot location using omni-directional image
First Claim
1. A method to locate a robot using an omni-directional image, the method comprising:
- acquiring the omni-directional image from a robot;
- extracting a predetermined current line from the acquired omni-directional image;
- calculating a correlation coefficient between the extracted current line of the robot and each pre-stored landmark line of nodes corresponding to locations in a robot-locatable area using a Fast Fourier Transform (FFT);
- selecting M nodes at which the calculated correlation coefficient is equal to or higher than a predetermined value;
- modifying a current line of the robot such that same spatial objects are located at a same location on a basis of the landmark lines of the selected M nodes to create a wrapped current line;
- calculating a correlation coefficient between the wrapped current line of the robot and the landmark lines of the M nodes;
- selecting N nodes at which the calculated correlation coefficient is equal to or higher than a predetermined correlation coefficient such that N<M; and
- recognizing a location of the robot on a basis of the selected N nodes,
wherein the pre-stored landmark lines of the nodes are lines pre-extracted from omni-directional images acquired when the robot is located at the nodes in the robot-locatable area.
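The claim's FFT-based correlation step can be sketched as below. This is a minimal illustration and not the patented implementation: the function name, the mean-subtracted normalization, and the use of 1-D grayscale lines are all assumptions; the claim only recites that the coefficient is computed "using a Fast Fourier Transform (FFT)".

```python
import numpy as np

def circular_correlation(current_line, landmark_line):
    """Maximum normalized circular cross-correlation between two 1-D
    intensity lines, computed via the FFT correlation theorem
    (hypothetical helper; the patent gives no exact formulas)."""
    a = current_line - current_line.mean()
    b = landmark_line - landmark_line.mean()
    # Correlation theorem: corr = IFFT( FFT(a) * conj(FFT(b)) )
    corr = np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return 0.0, 0
    corr /= denom
    shift = int(np.argmax(corr))  # rotation that best aligns the two lines
    return float(corr[shift]), shift
```

Because an omni-directional image wraps around 360 degrees, a pure rotation of the robot only circularly shifts the line, so the peak of the circular correlation both scores the match and recovers the relative heading.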
1 Assignment
0 Petitions
Abstract
A method to determine the location of a robot using an omni-directional image, the method including acquiring an omni-directional image from a robot, extracting a predetermined current line from the acquired omni-directional image, calculating a correlation coefficient between the extracted current line of the robot and each landmark line of pre-stored nodes using a Fast Fourier Transform (FFT), and performing a stochastic particle filtering process on the basis of the calculated correlation coefficient to recognize the location of the robot.
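The particle-filtering step mentioned in the abstract might look like the following sketch. Everything here is an assumption for illustration: the patent does not give a motion model, a likelihood, or any of these names; the sketch simply weights each particle by the correlation coefficient of its nearest node and resamples.

```python
import numpy as np

def particle_filter_step(particles, weights, node_xy, node_corr,
                         motion_noise=0.05, rng=None):
    """One hypothetical predict/update/resample cycle for robot
    localization, using node correlation coefficients as likelihoods."""
    rng = np.random.default_rng() if rng is None else rng
    # Predict: diffuse particles with simple Gaussian motion noise.
    particles = particles + rng.normal(0.0, motion_noise, particles.shape)
    # Update: a particle's likelihood is the correlation of its nearest node.
    d = np.linalg.norm(particles[:, None, :] - node_xy[None, :, :], axis=2)
    nearest = np.argmin(d, axis=1)
    weights = weights * np.clip(node_corr[nearest], 1e-12, None)
    weights = weights / weights.sum()
    # Resample (multinomial here; systematic resampling is also common).
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```

After a few such cycles the particle cloud concentrates around the nodes whose landmark lines correlate best with the current line, which is the "stochastic" location estimate the abstract refers to.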
11 Citations
12 Claims
1. A method to locate a robot using an omni-directional image, the method comprising:
- acquiring the omni-directional image from a robot;
- extracting a predetermined current line from the acquired omni-directional image;
- calculating a correlation coefficient between the extracted current line of the robot and each pre-stored landmark line of nodes corresponding to locations in a robot-locatable area using a Fast Fourier Transform (FFT);
- selecting M nodes at which the calculated correlation coefficient is equal to or higher than a predetermined value;
- modifying a current line of the robot such that same spatial objects are located at a same location on a basis of the landmark lines of the selected M nodes to create a wrapped current line;
- calculating a correlation coefficient between the wrapped current line of the robot and the landmark lines of the M nodes;
- selecting N nodes at which the calculated correlation coefficient is equal to or higher than a predetermined correlation coefficient such that N<M; and
- recognizing a location of the robot on a basis of the selected N nodes,
wherein the pre-stored landmark lines of the nodes are lines pre-extracted from omni-directional images acquired when the robot is located at the nodes in the robot-locatable area. - View Dependent Claims (2, 3, 4, 5)
6. A method to locate a robot with an omni-directional camera mounted thereon, the method comprising:
- setting a number of landmark lines for a predetermined number of nodes corresponding to locations within a robot-locatable area;
- acquiring an omni-directional image from the robot;
- extracting a current line from the acquired image;
- calculating a correlation coefficient between the current line and the landmark lines for each node;
- selecting a first number of nodes at which the calculated correlation coefficient is equal to or higher than a predetermined value;
- modifying the current line into a plurality of wrapped current lines such that spatial objects in each wrapped current line are at the same location as corresponding spatial objects in each of the landmark lines for the first number of nodes;
- calculating a second correlation coefficient between the wrapped current lines and the landmark lines of the first number of nodes;
- selecting a second number of nodes among the first number of nodes at which the second calculated correlation coefficient is equal to or higher than a second predetermined value;
- recognizing a location of the robot on the basis of the selected second number of nodes; and
- determining a location of the robot based on the calculated correlation coefficient,
wherein the landmark lines for the predetermined number of nodes are lines pre-extracted from omni-directional images acquired when the robot is located at the predetermined number of nodes in the robot-locatable area. - View Dependent Claims (7, 8, 9, 10)
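The "wrapped current line" of claim 6 can be approximated as a circular shift that brings the current line into alignment with a landmark line. Treating the warp as a pure rotation recovered from the FFT correlation peak is an assumption for illustration; the patent's actual alignment may be more involved.

```python
import numpy as np

def warp_to_landmark(current_line, landmark_line):
    """Create a 'wrapped' (circularly shifted) current line so that the
    same spatial objects fall at the same positions as in the landmark
    line (a simplified stand-in for the claimed warping step)."""
    a = current_line - current_line.mean()
    b = landmark_line - landmark_line.mean()
    # Peak of the circular cross-correlation gives the relative rotation.
    corr = np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real
    shift = int(np.argmax(corr))
    # Undo that rotation so objects line up with the landmark line.
    return np.roll(current_line, -shift)
```

One such wrapped line would be produced per surviving node, and the second correlation coefficient of the claim is then computed between each wrapped line and its node's landmark line.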
11. A robot locating apparatus, comprising:
- a robot body;
- an omni-directional camera mounted on the robot body to acquire an omni-directional image; and
- a controller to extract a predetermined current line from the acquired omni-directional image, to calculate a correlation coefficient between the extracted current line of the robot and each pre-stored landmark line of nodes corresponding to locations of a robot-locatable area using a Fast Fourier Transform (FFT), to select M nodes at which the calculated correlation coefficient is equal to or higher than a predetermined value, to modify the current line of the robot such that same spatial objects are located at a same location on both the current line and the landmark lines on a basis of the landmark lines of the selected M nodes to create a wrapped current line, to calculate a correlation coefficient between the wrapped current line of the robot and the landmark lines of the M nodes, to select N nodes at which the calculated correlation coefficient is equal to or higher than a predetermined correlation coefficient such that N<M, and to recognize a location of the robot based on the selected N nodes,
wherein the pre-stored landmark lines of the nodes are lines pre-extracted from omni-directional images acquired when the robot is located at the nodes in the robot-locatable area. - View Dependent Claims (12)
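The controller's coarse-to-fine selection (M nodes by raw correlation, then N < M nodes by correlation against the wrapped line) can be sketched as follows. The threshold values, helper names, and rotation-only warp are illustrative assumptions; the patent specifies only that each stage keeps nodes whose coefficient clears a predetermined value.

```python
import numpy as np

def select_nodes(current_line, landmark_lines, thresh1=0.6, thresh2=0.8):
    """Two-stage node selection sketch: stage 1 keeps the M nodes whose
    FFT correlation clears thresh1; stage 2 wraps the current line per
    surviving node and keeps the N nodes that clear thresh2."""
    def corr_and_shift(a, b):
        a = a - a.mean()
        b = b - b.mean()
        c = np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real
        c /= max(np.linalg.norm(a) * np.linalg.norm(b), 1e-12)
        k = int(np.argmax(c))
        return float(c[k]), k

    # Stage 1: coarse screen over all nodes, keeping the best shift too.
    m_nodes = []
    for i, lm in enumerate(landmark_lines):
        c, k = corr_and_shift(current_line, lm)
        if c >= thresh1:
            m_nodes.append((i, k))

    # Stage 2: re-score each survivor against its wrapped current line.
    n_nodes = []
    for i, k in m_nodes:
        wrapped = np.roll(current_line, -k)
        c, _ = corr_and_shift(wrapped, landmark_lines[i])
        if c >= thresh2:
            n_nodes.append(i)
    return n_nodes
```

The robot's location would then be recognized from the N surviving nodes, e.g. by taking the best-scoring node or feeding the scores to the particle filter described in the abstract.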
Specification