Control system for virtual characters
Abstract
A control system provides an interface for virtual characters, or avatars, during live avatar-human interactions. A human interactor can select facial expressions, poses, and behaviors of the virtual character using an input device mapped to menus on a display device.
Claims (31)
1. A system for controlling a virtual character, comprising:
one or more processors and memory located at an interactor node, the memory including a stored library comprising a plurality of facial expressions, a plurality of poses, and a plurality of behaviors associated with a virtual character; and

an input device in communication with the one or more processors and comprising input elements mapped to the plurality of facial expressions, the plurality of poses, and the plurality of behaviors associated with the virtual character, the input elements selectable by a user to select a facial expression, a pose, and a behavior for the virtual character;

wherein the one or more processors are operable, via machine-readable instructions stored in the memory, to:

display a menu on a display device located at the interactor node, the menu listing selections from one of the plurality of facial expressions, the plurality of poses, and the plurality of behaviors associated with the virtual character,

receive from the input device at the interactor node one or more of a user-selected facial expression, a user-selected pose, and a user-selected behavior for the virtual character selected from the menu via the mapped input elements,

provide in real time an animation of movement of the virtual character from a prior position to a position exhibiting the one or more user-selected facial expression, user-selected pose, and user-selected behavior, and

display in real time, on the display device located at the interactor node and on a further display device located at an end user node located remotely from the interactor node, the movement of the virtual character from the prior position to the position exhibiting the one or more user-selected facial expression, user-selected pose, and user-selected behavior.

View Dependent Claims (2-22)
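The system of claim 1 can be sketched in code as a stored library of expressions, poses, and behaviors whose entries are bound to input elements. This is a minimal illustrative sketch only; the names `CharacterLibrary`, `InteractorNode`, `bind`, `menu`, and `select`, and the use of key strings as input elements, are assumptions and do not appear in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class CharacterLibrary:
    """Stored library of claim 1: expressions, poses, and behaviors."""
    facial_expressions: dict = field(default_factory=dict)
    poses: dict = field(default_factory=dict)
    behaviors: dict = field(default_factory=dict)

class InteractorNode:
    """Maps input elements (here, key names) to library entries and
    resolves a user selection to the asset that drives the animation."""
    def __init__(self, library):
        self.library = library
        self.key_map = {}  # input element -> (category, entry name)

    def bind(self, key, category, name):
        # bind an input element to a library entry (the claimed mapping)
        self.key_map[key] = (category, name)

    def menu(self, category):
        # menu listing selections from one of the three pluralities
        return sorted(getattr(self.library, category))

    def select(self, key):
        # resolve a mapped input element to the selected library entry
        category, name = self.key_map[key]
        return getattr(self.library, category)[name]

lib = CharacterLibrary(
    facial_expressions={"smile": "smile.anim", "frown": "frown.anim"},
    poses={"wave": "wave.anim"},
    behaviors={"nod": "nod.anim"},
)
node = InteractorNode(lib)
node.bind("F1", "facial_expressions", "smile")
```

In this sketch, displaying the menu and rendering the animation on both the interactor-node and end-user displays would be handled by whatever rendering layer consumes the selected asset.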
23. A method for controlling a virtual character, comprising:
displaying a virtual character on a display device located at an interactor node and on an end user display device located remotely from the interactor node;

displaying a menu on the interactor node display device, the menu listing selections from one of a plurality of facial expressions, a plurality of poses, and a plurality of behaviors associated with the virtual character;

at the interactor node including one or more processors and memory, receiving, from a user input device, one or more of a user-selected facial expression, a user-selected pose, and a user-selected behavior for the virtual character selected from the menu;

providing an animation of movement of the virtual character from a prior position to a position exhibiting the one or more user-selected facial expression, user-selected pose, and user-selected behavior; and

displaying, on the interactor node display device and the end user display device, the movement of the virtual character in real time from a prior position to a position exhibiting the one or more user-selected facial expression, user-selected pose, and user-selected behavior.

View Dependent Claims (24-26)
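The "animation of movement from a prior position" step of claim 23 could, for example, be realized by interpolating joint angles between the prior pose and the selected pose so the character transitions smoothly rather than snapping. The function name `blend_pose`, the joint-angle dictionaries, and linear interpolation are illustrative assumptions; the patent does not prescribe an interpolation scheme.

```python
def blend_pose(prior, target, steps):
    """Linearly interpolate joint angles (degrees) from the prior
    position to the user-selected pose over `steps` frames."""
    frames = []
    for i in range(1, steps + 1):
        t = i / steps  # blend factor, reaching 1.0 at the final frame
        frames.append({j: prior[j] + t * (target[j] - prior[j]) for j in target})
    return frames

# e.g. animate an arm from rest toward a selected "wave" pose
frames = blend_pose({"elbow": 0.0, "wrist": 10.0},
                    {"elbow": 90.0, "wrist": 0.0}, steps=3)
```

Each frame would then be rendered on both the interactor-node display and the remote end-user display, matching the real-time display step of the claim.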
27. A method of creating a virtual character for a system for virtual character interaction, comprising:
creating, at a computer comprising one or more processors and a non-transitory computer-readable memory, a virtual character with an emotional and cognitive profile, further comprising:

creating a set of facial expressions for the virtual character comprising capturing an image of each facial expression and storing each facial expression in a library of facial expressions in the non-transitory computer-readable memory;

creating a set of poses for the virtual character comprising creating an animation for each pose from a sequence of video images and storing each animation in a library of poses in the non-transitory computer-readable memory;

creating a set of behaviors for the virtual character comprising capturing each behavior with a motion capture system, separating each behavior into one or more motion clips, and storing each clip in a library of motions in the non-transitory computer-readable memory; and

categorizing the facial expressions, poses, and behaviors into response states based on the emotional and cognitive profile of the virtual character and storing the response states in the non-transitory computer-readable memory, wherein the libraries and the response states are stored in a configuration system of the computer in communication with an input system, and each facial expression, pose, and behavior is bound to a selection of the input system for selection by a user.

View Dependent Claims (28-31)
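Two steps of claim 27 lend themselves to a short sketch: separating a captured behavior into motion clips, and categorizing library entries into response states based on the character's emotional and cognitive profile. Everything here is an illustrative assumption, including the fixed-length clip split, the emotion tags, and the function names `split_into_clips` and `categorize`.

```python
def split_into_clips(frames, clip_len):
    """Separate a motion-captured behavior (a frame sequence) into
    motion clips of at most `clip_len` frames each."""
    return [frames[i:i + clip_len] for i in range(0, len(frames), clip_len)]

def categorize(entries, profile):
    """Group library entries into response states using the character's
    emotional and cognitive profile (emotion tag -> response state)."""
    states = {}
    for name, emotion in entries.items():
        state = profile.get(emotion, "neutral")
        states.setdefault(state, []).append(name)
    return states

# a 10-frame capture split into clips of up to 4 frames
clips = split_into_clips(list(range(10)), 4)

# expressions tagged by emotion, mapped into response states
states = categorize(
    {"smile": "joy", "frown": "sadness", "wave": "joy"},
    {"joy": "friendly", "sadness": "withdrawn"},
)
```

The resulting clip and response-state tables would then live in the configuration system, with each entry bound to a selection of the input system as the claim requires.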
Specification