Robotic floor-cleaning system manager
First Claim
1. A tangible, non-transitory computer-readable media storing instructions that, when executed by one or more processors, effectuate operations comprising:
obtaining, on a user computing device, via wireless network communication, a map including a room to be cleaned by a robot, wherein:
the map is based on data from a sensor of the robot,
the map includes two spatial dimensions of the room,
the map is obtained with simultaneous localization and mapping by the robot,
the map includes boundaries of a wall of the room and boundaries of furniture in the room sensed by the robot, and
at least some of the boundaries designate virtual barriers the robot is configured to avoid crossing when navigating;
presenting, with the user computing device, the map in a user interface;
receiving, with the user computing device, via the user interface, a specification of a boundary, the specified boundary being a selection of an existing boundary of the map or a definition of a boundary to be added to the map, wherein receiving the specification of the boundary comprises determining that the user selected a given boundary of, or to be added to, the map based on location on a touchscreen of the user computing device touched by a user;
receiving, with the user computing device, via the user interface, an adjustment to the map based on the selected boundary, wherein receiving the adjustment comprises receiving a user input indicating the given boundary is to be shifted in one of at least four candidate directions supported by the user interface, including:
up, down, left, and right;
after receiving the adjustment, sending, from the user computing device, via the wireless network, instructions that cause the robot to obtain a version of the map that incorporates the received adjustment;
wherein:
the version of the map that incorporates the received adjustment is used to at least partially navigate the robot and to instruct the robot to clean at least part of the room with a cleaning tool;
receiving, with the user computing device, via the user interface, a selection of an area of the room; and
presenting, after receiving the selection of the area of the room, a plurality of robotic operations to be executed in the selected area, the robotic operations comprising at least the following:
turning on a mopping tool;
turning off a mopping tool;
turning on an ultraviolet light tool;
turning off an ultraviolet light tool;
turning on a suction tool;
turning off a suction tool;
turning on an automatic shutoff timer;
turning off an automatic shutoff timer;
increasing a robot speed;
decreasing the robot speed;
driving the robot to a user-identified location;
turning the robot;
driving the robot forward or backward; and
commencing a series of movements of the robot in a pattern;
receiving a selection or selections of some of the robotic operations with the user interface;
executing, with the robot, the selected robotic operations responsive to both the selection of operations and the selection of the area;
receiving, via the user interface, while the robot is cleaning, a request for the robot to adapt a route; and
causing, in real-time, the robot to adapt the route in response to the request.
Abstract
A method for instructing operation of a robotic floor-cleaning device based on the position of the device within a two-dimensional map of its workspace. A two-dimensional map of the workspace is generated from inputs from sensors positioned on the robotic floor-cleaning device. The map is presented to a user on a user interface, through which the user may adjust the map's boundaries and select settings for map areas to control device operation in different areas of the workspace.
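The abstract's per-area settings could be sketched as a small registry mapping a selected map area to a chosen subset of the robot operations the claim enumerates. All names here (`AreaSettings`, the operation strings) are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch of per-area settings: the user selects an area of
# the map and a subset of supported operations; the robot applies that
# subset when working in the area.
SUPPORTED_OPERATIONS = {
    "mop_on", "mop_off",
    "uv_on", "uv_off",
    "suction_on", "suction_off",
}


class AreaSettings:
    def __init__(self):
        self._by_area = {}  # area id -> set of selected operations

    def select(self, area_id, operations):
        # Reject anything outside the enumerated operations.
        unknown = set(operations) - SUPPORTED_OPERATIONS
        if unknown:
            raise ValueError(f"unsupported operations: {sorted(unknown)}")
        self._by_area[area_id] = set(operations)

    def operations_for(self, area_id):
        # Operations to execute once the robot reaches the selected area;
        # areas with no selection get no special handling.
        return self._by_area.get(area_id, set())
```

A usage sketch: `settings.select("kitchen", ["mop_on", "uv_on"])`, then the robot queries `settings.operations_for("kitchen")` on entering that area.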
44 Citations
16 Claims
Specification