Interfacing with a mobile telepresence robot
Abstract
A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device.
44 Claims
1. A telepresence robot system local terminal comprising:

an electronic display;
a processor in communication with the electronic display; and
a memory in communication with the processor, the memory comprising instructions executable by the processor configured to cause the processor to:
receive at least a portion of a plan view map representative of robot-navigable areas of a robot operating surface;
receive at least one of a plurality of tags, each of the plurality of tags comprising tag coordinates and tag information, wherein the tag coordinates describe a relative location of the tag and wherein the tag information includes a robot action modifier;
receive a video feed from an imaging system of a remote telepresence robot;
receive positioning information associated with a current position of the remote telepresence robot;
display the video feed from the imaging system of the remote telepresence robot;
display a rendition of the tag information of the at least one tag on the video feed using the tag coordinates; and
transmit a command to the remote telepresence robot.
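The tag structure recited in claim 1, and the tag identification described in the abstract, can be illustrated with a small sketch. The type and field names below (Tag, tag_coordinates, action_modifier) are assumptions chosen for clarity, not terms fixed by the patent:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the claimed tag structure; field names are
# assumptions, not drawn from the patent text.
@dataclass
class Tag:
    tag_coordinates: tuple  # (x, y) relative location on the plan view map
    tag_information: dict = field(default_factory=dict)  # e.g. {"action_modifier": "slow"}

def tags_in_range(tags, position, max_range):
    """Identify tags within a predetermined range of the current position,
    as the abstract's tag identification system does."""
    px, py = position
    return [t for t in tags
            if (t.tag_coordinates[0] - px) ** 2 +
               (t.tag_coordinates[1] - py) ** 2 <= max_range ** 2]
```

A tag whose information carries an action modifier could then trigger the corresponding behavior when it falls within range of the robot's current position.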
10. The telepresence robot system of claim 9, wherein the sequence of coordinates forming the path is provided by the user input device.
11. The telepresence robot system of claim 9, wherein the sequence of coordinates forming the path is provided by the remote telepresence robot.
12. The telepresence robot system of claim 1, further comprising a communication system configured to facilitate communication between the telepresence robot system local terminal and the remote telepresence robot.
13. The telepresence robot system of claim 1, wherein the local terminal further comprises at least one user input device, and
wherein the user input device is configured to allow a user to provide an indication of a desired destination of the remote telepresence robot on at least one of the plan view map and the video feed from the imaging system of the remote telepresence robot, and wherein the command transmitted to the remote telepresence robot comprises the desired destination.
14. The telepresence robot system of claim 9, wherein the sequence of coordinates forming the robot path is based at least in part on tagging information associated with the at least one tag.
15. The telepresence robot system of claim 13, wherein the instructions executable by the processor are further configured to cause the processor to:
determine a sequence of coordinates relative to the plan view map to create a robot path between the current position of the remote telepresence robot and the desired destination of the remote telepresence robot, and wherein the command transmitted to the remote telepresence robot comprises the sequence of coordinates forming the robot path.
16. The telepresence robot system of claim 15, wherein the instructions executable by the processor are further configured to cause the processor to display the sequence of coordinates forming the robot path overlaid on the plan view map.
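Claims 15 and 16 leave the planner that produces the sequence of coordinates unspecified. A minimal sketch, assuming the plan view map is modeled as a 2D occupancy grid with 0 marking navigable cells:

```python
from collections import deque

def plan_robot_path(grid, start, goal):
    """Breadth-first search from start to goal over a plan view map
    modeled as a 2D occupancy grid (0 = navigable, 1 = blocked).
    Returns the sequence of (row, col) coordinates, or None if the
    goal is unreachable. A stand-in for whatever planner the claimed
    system actually uses."""
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]  # start -> goal order
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nxt not in came_from:
                came_from[nxt] = cell
                queue.append(nxt)
    return None
```

The returned sequence is exactly the kind of coordinate list that claim 16 overlays on the plan view map and that the command of claim 15 would carry to the robot.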
-
17. The telepresence robot system of claim 15, wherein the instructions executable by the processor are further configured to cause the processor to:
determine a distortion between the plan view map and the video feed received from the imaging system of the remote telepresence robot;
apply the distortion to the sequence of coordinates forming the robot path to determine corresponding video coordinates and perspective data describing a location and perspective of the sequence of coordinates relative to the video feed; and
display a three-dimensional rendition of the sequence of coordinates forming the robot path overlaid on the video feed.
18. The telepresence robot system of claim 17, wherein the three-dimensional rendition of the sequence of coordinates forming the robot path is overlaid on the video feed with respect to a floor detected in the video feed.
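The "distortion" of claims 17 and 18 that carries plan-view coordinates into the video frame could take many forms; the claims do not fix the transform. One simple model, assuming the path lies on a detected floor plane, is a 3x3 planar homography:

```python
def apply_map_to_video_distortion(H, path_coords):
    """Project plan-view map coordinates into video-feed pixel
    coordinates using a 3x3 homography H (nested lists). A planar
    homography is one illustrative model of the claimed map-to-video
    distortion for points on the floor plane, not the patent's method."""
    video_coords = []
    for x, y in path_coords:
        w = H[2][0] * x + H[2][1] * y + H[2][2]
        u = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
        v = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
        video_coords.append((u, v))
    return video_coords
```

The resulting video coordinates are where a renderer would draw the three-dimensional rendition of the path, anchored to the detected floor as claim 18 recites.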
19. The telepresence robot system of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to:
receive a sequence of coordinates relative to the plan view map from a navigation system of the remote telepresence robot, the sequence of coordinates forming a robot path between the current position of the remote telepresence robot and a desired destination of the remote telepresence robot; and
display the sequence of coordinates forming the robot path overlaid on the plan view map.
20. The telepresence robot system of claim 19, wherein the instructions executable by the processor are further configured to cause the processor to:
determine a distortion between the plan view map and the video feed received from the imaging system of the remote telepresence robot;
apply the distortion to the sequence of coordinates forming the robot path to determine corresponding video coordinates and perspective data describing the location and perspective of the sequence of coordinates relative to the video feed; and
display a three-dimensional rendition of the sequence of coordinates forming the robot path overlaid on the video feed.
21. The telepresence robot system of claim 20, wherein the three-dimensional rendition of the sequence of coordinates forming the robot path is overlaid on the video feed with respect to a floor detected in the video feed.
22. The telepresence robot system of claim 1, wherein the tag information relates to at least one of a position, a path, and a volume, and wherein the control system is configured to execute an action relative to the at least one of the position, the path, and the volume.
23. The telepresence robot system of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to receive coordinates on the plan view map of an obstacle detected by a sensor system of the remote telepresence robot.
24. The telepresence robot system of claim 1, wherein the plan view map and the plurality of tags are stored remotely from the local terminal.
25. The telepresence robot system of claim 23, wherein the plan view map and the plurality of tags are stored within the remote telepresence robot.
26. The telepresence robot system of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to:
determine a distortion between the plan view map and the video feed received from the imaging system of the remote telepresence robot; and
generate a hybrid map view comprising a blended view of the plan view map and the video feed from the imaging system of the remote telepresence robot.
27. The telepresence robot system of claim 26, wherein the hybrid map view comprises a three-dimensional representation of the plan view map overlaid on the video feed.
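The hybrid map view of claims 26 and 27 blends the two sources into one image. A toy per-pixel alpha blend over grayscale images shows the idea; the actual blending scheme is an assumption, since the claims only require a blended view:

```python
def hybrid_view(map_img, video_img, alpha=0.5):
    """Blend a rendered plan view map image with the video feed,
    pixel by pixel. Images are row-major nested lists of grayscale
    values; alpha weights the map layer. A minimal stand-in for the
    claimed hybrid map view."""
    return [[round(alpha * m + (1 - alpha) * v)
             for m, v in zip(map_row, video_row)]
            for map_row, video_row in zip(map_img, video_img)]
```

In the three-dimensional variant of claim 27, the map layer would first be warped into the video's perspective before blending.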
28. The telepresence robot system of claim 1, wherein the telepresence robot system local terminal further comprises at least one user input device, and
wherein the instructions executable by the processor are further configured to cause the processor to:
receive a request via the at least one input device for a rendered look ahead for a virtual location of the remote telepresence robot on the plan view map;
determine a distortion between the plan view map and the video feed received from the imaging system of the remote telepresence robot;
generate a virtual three-dimensional video feed based on a virtual location of the remote telepresence robot; and
display the virtual three-dimensional video feed based on the virtual location of the remote telepresence robot.
29. The telepresence robot system of claim 1, wherein the tag information of the at least one tag comprises a set of coordinates with respect to the plan view map defining a protected region, and
wherein the tag information of the at least one tag is configured to indicate the presence of the protected region.
30. The telepresence robot system of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to:
receive a request to create a new tag;
associate tag coordinates describing a relative location of the new tag and tag information with the new tag; and
display a rendition of the tag information of the new tag on the video feed using the tag coordinates.
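Claims 30 through 39 allow the tag-creation request to come from a user input device, from the robot itself, or from automatic object detection. A sketch of such a handler, with illustrative names that are not drawn from the patent:

```python
def create_tag(tag_store, coordinates, information, source="user"):
    """Handle a request to create a new tag: associate tag coordinates
    (a relative location) and tag information with it, record the
    request's origin, and return the tag so the terminal can display a
    rendition of it on the video feed. Field names are illustrative."""
    tag = {"tag_coordinates": coordinates,
           "tag_information": information,
           "source": source}  # e.g. "user", "robot", "object_detection"
    tag_store.append(tag)
    return tag
```

The request itself could be expressed against the video feed, the plan view map, or the robot's current position (claims 37 through 39); only the coordinates it resolves to differ.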
31. The telepresence robot system of claim 30, wherein the request to create the new tag is generated by the remote telepresence robot.
32. The telepresence robot system of claim 30, wherein the request to create the new tag is automatically generated based on a detected object in the video feed.
33. The telepresence robot system of claim 32, wherein the new tag is a temporary tag configured to expire once the detected object is no longer present in the video feed.
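The temporary tags of claim 33 expire once the detected object leaves the video feed. One way to sketch the expiry check, assuming each detection cycle reports the ids of currently visible objects (the dict keys here are illustrative):

```python
def prune_expired_tags(tags, visible_object_ids):
    """Drop temporary tags whose detected object is no longer present
    in the video feed; tags without an "object_id" key are treated as
    permanent and kept. A minimal stand-in for the claimed expiry."""
    return [t for t in tags
            if "object_id" not in t or t["object_id"] in visible_object_ids]
```

Run after each detection cycle, this keeps person-tags (claims 34 and 35) alive only while the person remains in view.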
34. The telepresence robot system of claim 32, wherein the object is a person and the tag information of the new tag comprises identification information associated with the person.
35. The telepresence robot system of claim 32, wherein the object is a person and the tag information of the new tag comprises potential actions the remote telepresence robot can execute with respect to the person.
36. The telepresence robot system of claim 30, wherein the request to create the new tag is generated by a user input device in communication with the telepresence robot system local terminal.
37. The telepresence robot system of claim 30, wherein the request to create the new tag is made with respect to the video feed.
38. The telepresence robot system of claim 30, wherein the request to create the new tag is made with respect to the plan view map.
39. The telepresence robot system of claim 30, wherein the request to create a new tag is made with respect to the current position of the remote telepresence robot.
40. The telepresence robot system of claim 30, wherein the tag information relates to at least one of a position, a path, and a volume, and wherein the control system is configured to execute an action relative to the at least one of the position, the path, and the volume.
41. The telepresence robot system of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to display the plan view map with an indication of the current position of the telepresence robot on the plan view map.
42. The telepresence robot system of claim 41, wherein the instructions executable by the processor are further configured to cause the processor to display a rendition of the tag information on the plan view map using the tag coordinates.
43. The telepresence robot system of claim 1, wherein the tag information comprises a tag annotation, and wherein displaying a rendition of the tag information comprises displaying a rendition of the tag annotation.
44. A method for controlling a remote telepresence robot comprising:
receiving at least a portion of a plan view map representative of robot-navigable areas of a robot operating surface;
receiving at least one of a plurality of tags, each of the plurality of tags comprising tag coordinates and tag information, wherein the tag coordinates describe a relative location of the tag and wherein the tag information includes a robot action modifier;
receiving a video feed from an imaging system of a remote telepresence robot;
receiving positioning information associated with a current position of the remote telepresence robot;
displaying, via an electronic display, the video feed from the imaging system of the remote telepresence robot;
displaying, via the electronic display, a rendition of the tag information of the at least one tag on the video feed using the tag coordinates; and
transmitting a command to the remote telepresence robot.
Specification