Behavioral learning for a visual representation in a communication environment
First Claim
1. A method for adapting a behavior of an animated visual representation of a first user to the behavior of the first user, comprising:
receiving data from the first user intended for communication to a second user;
determining whether the received data contains a text string and a gesture command associated with the text string; and
if so, learning a behavioral rule for animating a visual representation of the first user based on the received data; and
animating the visual representation of the first user to the second user responsive to the learned behavioral rule.
1 Assignment
0 Petitions
Abstract
Utterances comprising text and behavioral movement commands entered by a user are processed to identify patterns of behavioral movements executed by the user's visual representation. Once identified, the patterns are used to generate behavioral movements responsive to new utterances received from the user, without requiring the user to explicitly alter the behavioral characteristics selected by the user. An application module parses an utterance generated by a user to determine the presence of gesture commands. If a gesture command is found in an utterance, the utterance is stored for behavioral learning processing. A stored utterance is analyzed with existing stored utterances to determine if the stored utterances provide the basis for creating a new behavioral rule. Newly stored utterances are first analyzed to generate different contexts associated with the behavioral movement. To determine if any of the contexts should be used as a basis for a new behavioral rule in an embodiment in which contexts are stored, the contexts of the existing utterances in the log are compared with the new contexts. If a context appears in the log at a frequency above a threshold, then the context is used as the basis for a new behavioral rule. The new behavioral rule is then used to modify existing rules, or to create more generally applicable rules. New general rules are not created unless the number of existing rules that could create a behavioral rule exceeds a threshold, which controls how persistent a user's behavior must be before a new rule is created.
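The learning loop the abstract describes — parse an utterance for gesture commands, log the text/gesture contexts, and promote contexts that recur above a threshold to behavioral rules — can be sketched roughly as follows. All concrete choices here (the `(gesture)` inline syntax, per-word contexts, the class and method names) are illustrative assumptions, not details taken from the patent.

```python
import re
from collections import Counter

# Assumed inline gesture-command syntax, e.g. "hello there (wave)".
GESTURE_RE = re.compile(r"\((\w+)\)")

class BehavioralLearner:
    """Sketch of the behavioral-learning loop described in the abstract."""

    def __init__(self, threshold=3):
        self.threshold = threshold   # how persistent behavior must be
        self.log = Counter()         # (word, gesture) context frequencies
        self.rules = {}              # word -> gesture behavioral rules

    def observe(self, utterance):
        """Parse an utterance; if it carries a gesture command, learn from it."""
        gestures = GESTURE_RE.findall(utterance)
        if not gestures:             # no gesture command: nothing to learn
            return
        words = GESTURE_RE.sub("", utterance).split()
        for word in words:
            for gesture in gestures:
                context = (word.lower(), gesture)
                self.log[context] += 1
                # Promote a context to a rule once it recurs often enough
                # and no existing rule already covers that text.
                if (self.log[context] >= self.threshold
                        and context[0] not in self.rules):
                    self.rules[context[0]] = gesture

    def animate(self, utterance):
        """Return gestures triggered by learned rules for a new utterance."""
        return [self.rules[w.lower()] for w in utterance.split()
                if w.lower() in self.rules]
```

With a threshold of 2, observing "hello (wave)" twice teaches the rule that "hello" triggers a wave, so a later plain "hello friend" is animated with the wave gesture even though the user typed no command.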
240 Citations
27 Claims
-
1. A method for adapting a behavior of an animated visual representation of a first user to the behavior of the first user, comprising:
-
receiving data from the first user intended for communication to a second user;
determining whether the received data contains a text string and a gesture command associated with the text string; and
if so, learning a behavioral rule for animating a visual representation of the first user based on the received data; and
animating the visual representation of the first user to the second user responsive to the learned behavioral rule. - View Dependent Claims (2, 5, 6, 7, 8, 9, 10, 11)
2. The method of claim 1, further comprising:
storing the received data in a set of received data strings; and
analyzing the set of received data strings to generate a new behavioral rule.
-
-
5. The method of claim 2 wherein analyzing the set of received data strings to generate a new behavioral rule comprises:
-
comparing the received text string and associated gesture command to stored text strings and associated gesture commands to determine a frequency of occurrence of the received text string and associated gesture command; and
responsive to the received text string and associated gesture command occurring at a frequency exceeding a predetermined threshold, generating a behavioral rule from the received text string and associated gesture command.
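The frequency test of claims 5 and 6 can be sketched as a small function. The data shapes are assumptions: stored data as a list of (text, gesture) pairs, and a rule as a dict linking the two.

```python
from collections import Counter

def maybe_generate_rule(received, stored_pairs, threshold):
    """Claim 5 sketch: count how often the received (text, gesture) pair
    occurs among the stored pairs; if it recurs above the threshold,
    generate a rule linking the text to the gesture (claim 6)."""
    counts = Counter(stored_pairs)
    counts[received] += 1            # include the newly received pair
    if counts[received] > threshold:
        text, gesture = received
        return {"text": text, "gesture": gesture}   # new behavioral rule
    return None
```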
-
-
6. The method of claim 5 wherein generating further comprises:
establishing a behavioral rule linking the text string to the associated gesture command.
-
7. The method of claim 5 wherein the generated behavioral rule contains a text string and an associated behavioral movement command, the method further comprising:
-
determining whether the text string of the generated behavioral rule is associated with an existing behavioral rule;
responsive to determining that the text string of the generated behavioral rule is associated with an existing behavioral rule, discarding the generated behavioral rule.
-
-
8. The method of claim 2 wherein analyzing the set of received data strings to generate a new behavioral rule comprises:
-
identifying a text category for the received text string;
identifying text strings belonging to the identified text category having associated gesture commands matching the associated gesture command of the received text string;
determining a number of text strings belonging to the identified text category having matching associated behavioral rules;
determining whether the identified text category is associated with a behavioral rule;
responsive to determining that the identified text category is not associated with a behavioral rule, determining whether the number of text strings belonging to the identified text category having matching associated behavioral rules exceeds a predetermined threshold; and
generating a new behavioral rule responsive to determining that the number of text strings belonging to the identified text category having matching associated behavioral rules exceeds a predetermined threshold.
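The category-generalization steps of claim 8 can be sketched as follows. The data structures are assumptions for illustration: `categories` maps a text string to its text category, `stored` is a list of (text, gesture) pairs, and `rules` maps a category to its gesture.

```python
def category_rule(received_text, received_gesture,
                  categories, stored, rules, threshold):
    """Claim 8 sketch: generalize to a whole text category when enough
    stored texts in that category share the received gesture."""
    category = categories.get(received_text)
    if category is None or category in rules:
        return None                  # no category, or rule already exists
    # Count stored texts in the same category with a matching gesture.
    matches = sum(1 for text, gesture in stored
                  if categories.get(text) == category
                  and gesture == received_gesture)
    if matches > threshold:
        # Claims 9-10: the gesture and the identified text category
        # fill the corresponding fields of the generated rule.
        return {"category": category, "gesture": received_gesture}
    return None
```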
-
-
9. The method of claim 8 wherein a behavioral rule has a behavioral movement field, and the behavioral movement of the selected text is used in the behavioral movement field of the generated behavioral rule.
-
10. The method of claim 8 wherein a behavioral rule has a text category field, and the identified text category is used as the text category for the text category field of the generated behavioral rule.
-
11. The method of claim 2 wherein the received data is parsed into contexts, wherein each context is a unique combination of text and a behavioral command, and wherein storing comprises storing contexts parsed from the received data, and analyzing further comprises analyzing the set of stored contexts.
-
3. A method of adapting a behavior of an animated visual representation of a first user to the behavior of the first user, comprising:
-
receiving data from the first user intended for communication to a second user;
learning a behavioral rule for animating a visual representation of the first user based on the received data, by:
determining whether the received data contains at least one text string;
determining whether the received data contains at least one gesture command;
responsive to the received data containing at least one text string and at least one gesture command, determining that the received data has content relevant to rule generation;
responsive to determining that the received data has content relevant to rule generation, storing the received data in a set of received data strings having content relevant to rule generation; and
analyzing the set of received data strings to generate a new behavioral rule; and
animating the visual representation of the first user to the second user responsive to the learned behavioral rule. - View Dependent Claims (4)
-
4. The method of claim 3, further comprising:
determining whether an existing behavioral rule contains the text string of the received data; and
responsive to an existing behavioral rule containing the text string of the received data, discarding the received data.
-
-
12. A method of generating behavioral rules for controlling animation of a visual representation of a user comprising:
-
receiving an utterance from a user, wherein an utterance comprises a text string and optionally a gesture command for controlling the animation of the visual representation;
parsing the received utterance to determine whether the utterance contains a gesture command;
responsive to the utterance containing a gesture command, storing the received utterance in a set of previously stored utterances containing gesture commands; and
analyzing the stored utterances to generate a new behavioral rule for controlling the animation of the user's visual representation. - View Dependent Claims (13, 14, 15, 16, 17)
-
13. The method of claim 12, further comprising:
generating contexts from the received utterance, wherein contexts comprise text in combination with a gesture command;
storing the generated contexts in a set of previously stored contexts; and
wherein analyzing further comprises:
analyzing the stored contexts to generate a new behavioral rule for controlling the animation of the user's visual representation.
-
-
14. The method of claim 13 wherein generating contexts further comprises:
-
parsing the utterance for unique combinations of text and at least one behavioral movement; and
designating each unique combination of text and at least one behavioral movement as a context.
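The context-generation steps of claims 13 and 14 can be sketched as follows; the `(gesture)` inline syntax and the pairing of every word with every gesture in the utterance are assumptions, not details specified by the claims.

```python
import re

# Assumed inline gesture-command syntax, e.g. "bye now (nod)".
GESTURE_RE = re.compile(r"\((\w+)\)")

def parse_contexts(utterance):
    """Claims 13-14 sketch: extract unique combinations of text and
    behavioral movement from an utterance, designating each as a context."""
    contexts = set()
    gestures = GESTURE_RE.findall(utterance)
    words = GESTURE_RE.sub("", utterance).split()
    for gesture in gestures:
        for word in words:
            contexts.add((word.lower(), gesture))
    return contexts
```

Returning a set makes each combination unique, so repeated words in one utterance yield a single context.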
-
-
15. The method of claim 13 wherein analyzing the stored contexts further comprises:
-
comparing the behavioral movements of the stored utterances to determine a set of contexts having matching behavioral movements;
comparing the text associated with each context to determine a set of contexts having matching text;
identifying a number of contexts belonging to an intersection of the sets of contexts having matching behavioral movements and matching text; and
responsive to the determined number exceeding a predefined threshold, generating a new behavioral rule responsive to the text and behavioral movement of the identified contexts.
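In claim 15, a context in the intersection of the matching-movement set and the matching-text set agrees with the selected context on both fields, so the intersection reduces to counting equal (text, movement) pairs. A minimal sketch, assuming contexts are stored as such pairs:

```python
from collections import Counter

def rules_from_contexts(stored_contexts, threshold):
    """Claim 15 sketch: emit a new rule for every context whose exact
    (text, movement) pair recurs above the predefined threshold."""
    counts = Counter(stored_contexts)
    return [{"text": text, "movement": movement}
            for (text, movement), n in counts.items() if n > threshold]
```

Claim 16 then adds a de-duplication pass: any generated rule that matches an existing rule for the user's visual representation is discarded.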
-
-
16. The method of claim 15 further comprising:
-
comparing the generated rule to existing behavioral rules associated with the user's visual representation; and
responsive to determining that an existing behavioral rule matches the generated rule, discarding the generated rule.
-
-
17. The method of claim 13 wherein analyzing the stored contexts further comprises:
-
selecting a context generated from the received utterance;
determining whether the text associated with the selected context is associated with a text category;
responsive to determining that the text associated with the selected context is associated with a text category, determining whether any stored context has text associated with the same text category and has a matching associated behavioral movement;
responsive to identifying at least one stored context having text associated with the same text category and having a matching behavioral movement, determining whether the text category is associated with a behavioral rule; and
responsive to determining that the text category is not associated with a behavioral rule, generating a behavioral rule having fields responsive to the behavioral movement associated with the selected context and the text category.
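Claim 17's category-level generalization over contexts can be sketched as follows. The data shapes are assumptions: contexts are (text, movement) pairs, `categories` maps text to its text category, and `rules` maps a category to a movement. Per the claim, a single stored context in the same category with a matching movement suffices.

```python
def generalize_context(selected, stored_contexts, categories, rules):
    """Claim 17 sketch: if the selected context's text belongs to a category,
    at least one other stored context in that category shares its movement,
    and no rule exists for the category yet, emit a category-level rule."""
    text, movement = selected
    category = categories.get(text)
    if category is None or category in rules:
        return None
    for other_text, other_movement in stored_contexts:
        if (other_text != text
                and categories.get(other_text) == category
                and other_movement == movement):
            # Rule fields come from the selected context's movement
            # and the shared text category.
            return {"category": category, "movement": movement}
    return None
```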
-
-
18. A method of learning a behavioral rule for use in animating a visual representation of a first user to a second user, comprising:
-
receiving data from the first user intended for communication to the second user;
determining whether the received data contains a text string and a gesture command;
responsive to the received data containing a text string and a gesture command, learning a behavioral rule for animating a visual representation of the first user based on the received data. - View Dependent Claims (19, 20, 21, 22, 23, 24, 25, 26, 27)
-
20. The method of claim 18, further comprising:
storing the received data in a set of received data strings; and
analyzing the set of received data strings to generate a new behavioral rule.
-
-
21. The method of claim 20, wherein analyzing includes:
-
comparing the received text string and gesture command to stored text strings and associated gesture commands to determine a frequency of occurrence of the received text string and gesture command; and
if the frequency of occurrence exceeds a predetermined threshold, generating the new behavioral rule based on the received data.
-
-
22. The method of claim 21, wherein the new behavioral rule links the received gesture command to at least a portion of the received text string.
-
23. The method of claim 21, wherein the new behavioral rule includes a text string and an associated gesture command, the method further comprising:
determining whether the text string of the new behavioral rule is associated with an existing behavioral rule, and if so, discarding the new behavioral rule.
-
24. The method of claim 20, wherein analyzing includes:
-
identifying a text category for the received text string;
identifying text strings belonging to the identified text category having associated gesture commands matching the received gesture command;
determining a number of the identified text strings having matching associated behavioral rules;
determining whether the identified text category is associated with a behavioral rule;
responsive to determining that the identified text category is not associated with a behavioral rule, determining whether the number of identified text strings exceeds a predetermined threshold; and
if the number exceeds the threshold, generating the new behavioral rule.
-
-
25. The method of claim 24, wherein a behavioral rule has a gesture field, and wherein the received gesture command is used in the gesture field of the generated behavioral rule.
-
26. The method of claim 24, wherein a behavioral rule has a text category field, and wherein the identified text category is used as the text category for the text category field of the generated behavioral rule.
-
27. The method of claim 20, wherein the received data is parsed into contexts, wherein each context is a unique combination of text and a gesture command, wherein storing comprises storing contexts parsed from the received data, and wherein analyzing comprises analyzing the set of stored contexts.
Specification