Affect-based robot communication methods and systems
Abstract
An affect-based method of communication between robots is provided by displaying a visual facial expression indicative of a simulated emotional state on a display device of a first robot, and viewing the visual facial expression using a camera on a second robot. The simulated emotional state may be one of happiness, anger, or sadness, for example. The second robot determines the simulated emotional state based upon the visual facial expression. The second robot processes the simulated emotional state to redefine its own simulated emotional state, and to display a visual facial expression indicative thereof. The visual facial expression allows a human observer to discern the simulated emotional state of the robot. Optionally, the robots further communicate affect using audio tones.
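The exchange the abstract describes can be sketched as a toy loop: the first robot maps its simulated emotional state to a facial expression, and the second robot inverts that mapping from its camera view and redefines its own state. The class, the three states, and the simple "adopt the observed state" update rule below are illustrative assumptions, not taken from the patent:

```python
# Toy sketch of the affect-based communication loop in the abstract.
# All names, the expression mapping, and the state-update rule are
# hypothetical; the patent does not specify any of them.

EXPRESSIONS = {"happiness": "smile", "anger": "frown", "sadness": "tears"}
STATES = {v: k for k, v in EXPRESSIONS.items()}  # inverse: expression -> state

class Robot:
    def __init__(self, state):
        self.state = state  # simulated emotional state

    def display(self):
        # Show the facial expression for the current internal state.
        return EXPRESSIONS[self.state]

    def observe(self, expression):
        # "Camera" step: recover the other robot's state from its visual
        # facial expression, then redefine our own state (here, by simply
        # adopting the observed state).
        observed = STATES[expression]
        self.state = observed
        return observed

first = Robot("happiness")
second = Robot("sadness")
second.observe(first.display())  # second views first's expression
print(second.display())          # second now displays "smile"
```

A real system would replace the dictionary lookup in `observe` with image classification of the camera frame, and could use a richer update rule than direct adoption; the patent also mentions audio tones as an optional parallel channel.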
70 Claims
1. An actual, autonomous robot comprising:
- a display device displaying a visual facial expression indicative of an internal state of the robot, the visual facial expression selected from a plurality of visual facial expressions.

- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16)
17. An actual robot comprising:
- a processor; and
- a camera in communication with the processor to view a visual facial expression from a like robot, the visual facial expression being indicative of an internal state of the like robot, wherein the processor determines the internal state of the like robot from the visual facial expression.

- View Dependent Claims (18, 19)
20. An actual, autonomous robot comprising:
- a processor which executes a sequence of program steps to define a first internal state of the robot; and
- a display device in communication with the processor, the display device displaying a first visual facial expression indicative of the first internal state of the robot, the first visual facial expression selected from a plurality of visual facial expressions.

- View Dependent Claims (21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35)
36. A method of operating an actual, autonomous robot having a display device, the method comprising:
displaying a visual facial expression on the display device indicative of an internal state of the robot, the visual facial expression selected from a plurality of visual facial expressions.

- View Dependent Claims (37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51)
52. A method of operating an actual robot having a processor and a camera, the method comprising:
- viewing a visual facial expression from a like robot using the camera, the visual facial expression being indicative of an internal state of the like robot; and
- determining the internal state of the like robot from the visual facial expression using the processor.

- View Dependent Claims (53, 54)
55. A method of operating an actual, autonomous robot having a processor and a display device, the method comprising:
- executing a sequence of program steps using the processor to define a first internal state of the robot; and
- displaying a first visual facial expression on the display device, the first visual facial expression indicative of the first internal state of the robot, the first visual facial expression selected from a plurality of visual facial expressions.

- View Dependent Claims (56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70)
Specification