Systems and Methods for Enhancing Responsiveness to Utterances Having Detectable Emotion
First Claim
1. A method comprising:
receiving, via a user device, a natural utterance;
converting the natural utterance into text;
extracting a textual emotion feature from the text;
extracting a non-textual emotion feature from the natural utterance, the textual emotion feature and the non-textual emotion feature corresponding to conflicting first and second commands;
selecting either the non-textual emotion feature or the textual emotion feature, thereby providing a selected emotion feature;
mapping the selected emotion feature to an emotion; and
responding to the natural utterance by providing a verbalized acknowledgement of the natural utterance adapted to the emotion and performing an action corresponding to the command associated with the selected emotion feature.
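The claim recites a linear pipeline: speech-to-text conversion, extraction of a textual and a non-textual emotion feature, selection between them when they imply conflicting commands, mapping the selected feature to an emotion, and an emotion-adapted response. A minimal Python sketch of that flow follows; every function body, the confidence-based selection rule, and the emotion and command labels are illustrative assumptions, not details taken from the patent.

# A minimal sketch of the claimed pipeline. All model steps
# (speech_to_text, feature extraction, the emotion map) are
# hypothetical stand-ins, not APIs named in the patent.
from dataclasses import dataclass

@dataclass
class EmotionFeature:
    source: str        # "textual" or "non-textual"
    command: str       # command associated with the feature
    confidence: float  # extraction confidence

def speech_to_text(audio: bytes) -> str:
    """Placeholder STT step ("converting the natural utterance into text")."""
    return "skip this song"  # canned transcript for the sketch

def extract_textual_feature(text: str) -> EmotionFeature:
    # e.g., lexicon lookup on the transcript
    return EmotionFeature("textual", command="skip_track", confidence=0.6)

def extract_nontextual_feature(audio: bytes) -> EmotionFeature:
    # e.g., prosody (pitch, energy) analysis of the raw audio
    return EmotionFeature("non-textual", command="lower_volume", confidence=0.8)

def select_feature(textual: EmotionFeature, nontextual: EmotionFeature) -> EmotionFeature:
    # One plausible selection rule when the two features imply
    # conflicting commands: keep the higher-confidence feature.
    return max(textual, nontextual, key=lambda f: f.confidence)

def map_to_emotion(feature: EmotionFeature) -> str:
    # Toy mapping table; a real mapping would be model-driven.
    return {"skip_track": "impatient", "lower_volume": "irritated"}[feature.command]

def respond(audio: bytes) -> None:
    text = speech_to_text(audio)
    selected = select_feature(extract_textual_feature(text),
                              extract_nontextual_feature(audio))
    emotion = map_to_emotion(selected)
    print(f"Acknowledgement adapted to '{emotion}': Okay, right away.")
    print(f"Performing action: {selected.command}")

respond(b"\x00fake-audio")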
1 Assignment
0 Petitions
Abstract
Methods, systems, and related products that provide emotion-sensitive responses to users' commands and other utterances received at an utterance-based user interface. Acknowledgements of users' utterances are adapted to the user and/or the user device, and to emotions detected in the user's utterance that have been mapped from one or more emotion features extracted from the utterance. In some examples, extraction of a user's changing emotion during a sequence of interactions is used to generate a response to a user's uttered command. In some examples, emotion processing and command processing of natural utterances are performed asynchronously.
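The abstract's last sentence notes that emotion processing and command processing can run asynchronously. A minimal sketch of that arrangement, assuming Python's asyncio; the coroutine names and sleep times are invented stand-ins for model latency, not the patent's design.

# Emotion processing and command processing run concurrently
# on the same utterance, then the results are joined.
import asyncio

async def process_command(audio: bytes) -> str:
    await asyncio.sleep(0.05)   # stand-in for ASR + intent parsing
    return "skip_track"

async def process_emotion(audio: bytes) -> str:
    await asyncio.sleep(0.02)   # stand-in for a prosody/emotion model
    return "impatient"

async def handle_utterance(audio: bytes) -> None:
    command, emotion = await asyncio.gather(
        process_command(audio), process_emotion(audio)
    )
    print(f"[{emotion}] acknowledging, then performing: {command}")

asyncio.run(handle_utterance(b"\x00fake-audio"))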
2 Citations
17 Claims
1. A method comprising:
receiving, via a user device, a natural utterance;
converting the natural utterance into text;
extracting a textual emotion feature from the text;
extracting a non-textual emotion feature from the natural utterance, the textual emotion feature and the non-textual emotion feature corresponding to conflicting first and second commands;
selecting either the non-textual emotion feature or the textual emotion feature, thereby providing a selected emotion feature;
mapping the selected emotion feature to an emotion; and
responding to the natural utterance by providing a verbalized acknowledgement of the natural utterance adapted to the emotion and performing an action corresponding to the command associated with the selected emotion feature.
Dependent claims: 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
2. (canceled)
13-14. (canceled)
15. A non-transitory computer readable medium comprising:
an emotion processor having one or more sequences of emotion processor instructions that, when executed by one or more processors, cause the one or more processors to generate an output adapted to an emotion detected in a natural utterance; and
one or more sequences of instructions which, when executed by one or more processors, cause the one or more processors to:
receive the natural utterance;
convert the natural utterance into text;
extract a textual emotion feature from the text;
extract a non-textual emotion feature from the natural utterance, the textual emotion feature and the non-textual emotion feature corresponding to conflicting first and second commands;
select either the non-textual emotion feature or the textual emotion feature, thereby providing a selected emotion feature;
map the selected emotion feature to an emotion; and
respond to the natural utterance by providing a verbalized acknowledgement adapted to the emotion and performing an action corresponding to the command associated with the selected emotion feature.
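Claim 15's emotion processor generates "an output adapted to an emotion detected in a natural utterance." One way to picture that adaptation is template selection keyed by the detected emotion; the sketch below is a hypothetical illustration, and its emotion labels and phrase templates are not drawn from the patent.

# A minimal sketch of an emotion-adapted output: the wording of a
# spoken acknowledgement changes with the detected emotion.
# Labels and templates are assumptions for illustration only.
ACK_TEMPLATES = {
    "impatient": "On it -- {action} right now.",
    "irritated": "Sorry about that. {action} now.",
    "neutral":   "Okay, {action}.",
}

def emotion_adapted_ack(emotion: str, action: str) -> str:
    """Return a verbalized acknowledgement adapted to the emotion."""
    template = ACK_TEMPLATES.get(emotion, ACK_TEMPLATES["neutral"])
    return template.format(action=action)

print(emotion_adapted_ack("impatient", "skipping the track"))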
16. A method comprising:
receiving, via a user device, a natural utterance, the natural utterance including a command and at least one emotion feature;
extracting the command;
extracting the at least one emotion feature from the natural utterance;
mapping each of the at least one emotion feature to an emotion; and
responding to the natural utterance at least by providing a verbalized acknowledgement of the natural utterance adapted to the emotion, wherein a type of the verbalized acknowledgement is based on a type of the user device.
Dependent claims: 17
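Claim 16 adds that "a type of the verbalized acknowledgement is based on a type of the user device." A minimal sketch of that device-type dispatch follows; the device categories and acknowledgement types are invented for illustration.

# A smart speaker might get a full spoken reply, a phone a short
# chime plus on-screen text, a car a brief spoken confirmation.
from enum import Enum

class DeviceType(Enum):
    SMART_SPEAKER = "smart_speaker"   # audio-only device
    PHONE = "phone"                   # screen plus audio
    CAR = "car"                       # driver-distraction constraints

def ack_type_for_device(device: DeviceType) -> str:
    """Pick an acknowledgement type based on the user device type."""
    return {
        DeviceType.SMART_SPEAKER: "full_verbal",
        DeviceType.PHONE: "chime_plus_text",
        DeviceType.CAR: "brief_verbal",
    }[device]

print(ack_type_for_device(DeviceType.CAR))  # -> brief_verbal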
Specification