System for operating mobile robot based on complex map information and operating method thereof
Abstract
Disclosed are a system for operating a mobile robot based on cleaning area information and a method thereof. A mobile robot based on cleaning area information according to an exemplary embodiment of the present invention includes a memory which stores a plurality of pieces of cleaning area information, in each of which at least a part of a cleaning-available area is changed; and a controller which selects one piece of cleaning area information from among the plurality of stored pieces, recognizes a position on the cleaning area map that constitutes the selected cleaning area information, and performs cleaning on the cleaning-available area from the recognized position.
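The select-then-clean flow in the abstract can be sketched as a minimal controller that holds several stored area maps and cleans the selected one starting from the recognized position. All class and method names here are hypothetical; the patent does not prescribe an implementation:

```python
class CleaningController:
    """Minimal sketch of the abstract's controller: store several
    cleaning-area maps, select one, then clean from the recognized
    position (names are illustrative, not from the patent)."""

    def __init__(self, stored_areas):
        self.stored_areas = stored_areas   # name -> set of cleanable grid cells
        self.selected = None

    def select_area(self, name):
        """Select one piece of cleaning area information from memory."""
        self.selected = set(self.stored_areas[name])
        return self.selected

    def clean_from(self, position):
        """Clean every cell of the selected area, nearest-first from the
        recognized starting position; returns the cleaning order."""
        return sorted(
            self.selected,
            key=lambda c: abs(c[0] - position[0]) + abs(c[1] - position[1]),
        )


ctrl = CleaningController({"living_room": {(0, 0), (0, 1), (1, 1)}})
ctrl.select_area("living_room")
order = ctrl.clean_from((0, 0))
# (0, 0) is cleaned first: it is the recognized starting position
```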
11 Claims
1. A mobile robot comprising:
a memory; and
a controller,
wherein the controller selects cleaning area information stored in the memory, recognizes a position of the mobile robot within a cleaning area, and causes the mobile robot to clean the cleaning area based on the selected cleaning area information and the recognized position using feature point maps including feature point information for the cleaning area,
wherein the controller obtains images of the cleaning area while the mobile robot moves from a first position to a second position within the cleaning area, and extracts feature point information from the obtained images to configure a temporary feature point map,
wherein the controller calculates odometry information using inertia information obtained during the movement of the mobile robot from the first position to the second position, and predicts a movement point based on the calculated odometry information,
wherein the controller compares the temporary feature point map with feature point maps associated with the predicted movement point and a vicinity of the predicted movement point without comparing the temporary feature point map with all feature point maps which constitute a loaded composite of cleaning area information, and
wherein the controller determines, based on the comparison of the temporary feature point map with the feature point maps associated with the predicted movement point and a vicinity of the predicted movement point, a position having a highest matching rate as the second position.
Dependent claims: 2, 3, 4, 5, 6.
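The odometry limitation in claim 1 (integrating inertia information gathered between the first and second positions to predict a movement point) can be sketched as planar dead reckoning. The integration scheme and variable names below are assumptions; the claim does not specify them:

```python
import math

def predict_movement_point(x, y, heading, imu_samples, dt):
    """Dead-reckon a planar pose from inertial samples.

    imu_samples: list of (forward_accel, yaw_rate) pairs, sampled every
    dt seconds.  Velocity is integrated from acceleration, position from
    velocity, heading from yaw rate -- a minimal sketch of the claimed
    odometry calculation, not the patent's actual method.
    """
    v = 0.0
    for accel, yaw_rate in imu_samples:
        heading += yaw_rate * dt          # integrate yaw rate -> heading
        v += accel * dt                   # integrate acceleration -> speed
        x += v * math.cos(heading) * dt   # integrate speed -> position
        y += v * math.sin(heading) * dt
    return x, y, heading


# Robot accelerates straight ahead along the x-axis for one second.
samples = [(0.5, 0.0)] * 10               # 0.5 m/s^2, no turning, dt = 0.1 s
x, y, th = predict_movement_point(0.0, 0.0, 0.0, samples, 0.1)
# x ≈ 0.275 m, y == 0.0, heading unchanged
```

The predicted pose then bounds the search: only feature point maps near `(x, y)` need to be compared, which is the efficiency the claim's "without comparing ... with all feature point maps" language captures.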
7. A mobile robot system, the system comprising:
a mobile robot; and
a mobile application for use with a user equipment to cause the user equipment to transmit a control command to the mobile robot through a wireless router and receive information on an area where cleaning is completed and cleaning area information while the mobile robot performs cleaning based on previously stored cleaning area information,
wherein the mobile robot obtains images of the cleaning area while the mobile robot moves from a first position to a second position within the cleaning area, and extracts feature point information from the obtained images to determine a temporary feature point map,
wherein the mobile robot calculates odometry information using inertia information obtained during the movement of the mobile robot from the first position to the second position, and predicts a movement point based on the calculated odometry information,
wherein the mobile robot compares the temporary feature point map with feature point maps associated with the predicted movement point and a vicinity of the predicted movement point without comparing the temporary feature point map with all feature point maps which constitute a loaded composite of cleaning area information,
wherein the mobile robot determines, based on the comparison of the temporary feature point map with the feature point maps associated with the predicted movement point and a vicinity of the predicted movement point, a position having a highest matching rate as the second position, and
wherein the mobile application receives cleaning area information including the second position.
Dependent claims: 8, 9.
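The vicinity-restricted comparison in this claim can be sketched as follows. The "matching rate" is taken here as the fraction of shared feature descriptors, which is an assumption; the patent leaves the metric unspecified:

```python
import math

def localize(temp_features, stored_maps, predicted, radius):
    """Pick the stored position whose feature point map best matches the
    temporary map, considering only maps within `radius` of the predicted
    movement point -- never the full composite of stored maps.

    stored_maps: dict mapping (x, y) position -> set of feature descriptors.
    Returns (best_position, best_matching_rate).
    """
    best_pos, best_rate = None, -1.0
    for pos, features in stored_maps.items():
        if math.dist(pos, predicted) > radius:
            continue                      # skip maps outside the vicinity
        rate = len(temp_features & features) / max(len(temp_features), 1)
        if rate > best_rate:
            best_pos, best_rate = pos, rate
    return best_pos, best_rate


stored = {
    (0.0, 0.0): {"a", "b", "c"},
    (1.0, 0.0): {"a", "b", "d"},
    (9.0, 9.0): {"a", "b", "d", "e"},     # good match, but far from prediction
}
pos, rate = localize({"a", "b", "d"}, stored, predicted=(1.2, 0.1), radius=2.0)
# pos == (1.0, 0.0): the distant map is never compared at all
```

The position with the highest matching rate among the candidates is then taken as the second position, exactly as the claim's final "wherein" recites.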
10. A method performed by a mobile robot, the method comprising:
selecting one cleaning area information among a plurality of previously stored cleaning area information according to a control command;
recognizing a position on a cleaning area map which configures the selected cleaning area information; and
updating the cleaning area map by an area where the cleaning is completed as a result of performing cleaning on a cleaning available area in the cleaning area map from the recognized position using feature point maps including feature point information for a cleaning area,
wherein the selecting includes obtaining images of the cleaning area while the mobile robot moves from a first position to a second position within the cleaning area, and extracting feature point information from the obtained images to configure a temporary feature point map,
wherein the recognizing includes calculating odometry information using inertia information obtained during the movement of the mobile robot from the first position to the second position, and predicting a movement point based on the calculated odometry information,
wherein the recognizing includes the mobile robot comparing the temporary feature point map with feature point maps associated with the predicted movement point and a vicinity of the predicted movement point without comparing the temporary feature point map with all feature point maps which constitute a loaded composite of cleaning area information, and
wherein the recognizing includes the mobile robot determining, based on the comparison of the temporary feature point map with the feature point maps associated with the predicted movement point and a vicinity of the predicted movement point, a position having a highest matching rate as the second position.
Dependent claim: 11.
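The map-building step recited above (extracting feature points from images captured between the two positions to configure a temporary feature point map) can be sketched with a toy detector. A real robot would use a detector such as ORB or SIFT; the threshold test below is a deliberately simplified stand-in:

```python
def extract_feature_points(image, threshold):
    """Toy feature extractor: flag pixels whose intensity differs sharply
    from all four neighbours (a stand-in for a real corner detector).

    image: 2D list of grey-level intensities.
    Returns a list of (row, col) feature points.
    """
    points = []
    for r in range(1, len(image) - 1):
        for c in range(1, len(image[0]) - 1):
            centre = image[r][c]
            diffs = [abs(centre - image[r + dr][c + dc])
                     for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))]
            if min(diffs) > threshold:    # distinct from every neighbour
                points.append((r, c))
    return points


def build_temporary_map(images, threshold=50):
    """Accumulate feature points from the images captured while the robot
    moves from the first position to the second position."""
    return {(i, p) for i, img in enumerate(images)
                   for p in extract_feature_points(img, threshold)}


img = [[0, 0, 0],
       [0, 255, 0],
       [0, 0, 0]]
temp_map = build_temporary_map([img])
# temp_map == {(0, (1, 1))}: one bright corner in image 0
```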