Voice Outputting Method, Voice Interaction Method and Electronic Device
Abstract
A voice outputting method, a voice interaction method and an electronic device are described. The method includes: acquiring a first content to be output; analyzing the first content to acquire first emotion information expressing the emotion carried by the first content to be output; acquiring first voice data to be output corresponding to the first content; processing the first voice data based on the first emotion information to generate second voice data to be output with second emotion information, wherein the second emotion information is used to express the emotion of the electronic device outputting the second voice data, to enable the user to acquire the emotion of the electronic device, and wherein the first and the second emotion information are matched to and/or correlated to each other; and outputting the second voice data.
Claims (17 claims; 43 citations)
1. A voice output method applied in an electronic device, characterized in that, the method comprises:

acquiring a first content to be output;

analyzing the first content to be output to acquire a first emotion information for expressing the emotion carried by the first content to be output;

acquiring a first voice data to be output corresponding to the first content to be output;

processing the first voice data to be output based on the first emotion information to generate a second voice data to be output with a second emotion information, wherein the second emotion information is used to express the emotion of the electronic device outputting the second voice data to be output to enable the user to acquire the emotion of the electronic device, and wherein the first emotion information and the second emotion information are matched to/correlated to each other;

outputting the second voice data to be output.

(Dependent claims: 2, 3, 4)
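The steps recited in claim 1 can be read as a pipeline: classify the emotion carried by the content, synthesize neutral voice data, then adjust that voice data so its delivery carries a matching emotion. A minimal sketch of that pipeline follows; the function names, the tiny keyword lexicon, and the prosody multipliers are all illustrative assumptions, not anything disclosed in the patent, and real systems would use a trained emotion classifier and an actual speech synthesizer.

```python
# Hypothetical sketch of the claimed voice-output pipeline.
# A tiny keyword lexicon stands in for a real emotion classifier.
EMOTION_LEXICON = {
    "congratulations": "happy",
    "sorry": "sad",
    "warning": "urgent",
}

# Illustrative prosody adjustments (speech-rate / pitch multipliers)
# chosen so the output emotion matches the emotion of the content.
EMOTION_PROSODY = {
    "happy":   {"rate": 1.10, "pitch": 1.20},
    "sad":     {"rate": 0.90, "pitch": 0.85},
    "urgent":  {"rate": 1.25, "pitch": 1.10},
    "neutral": {"rate": 1.00, "pitch": 1.00},
}

def analyze_emotion(content: str) -> str:
    """Derive the 'first emotion information' from the content."""
    for word in content.lower().split():
        key = word.strip("!.,?")
        if key in EMOTION_LEXICON:
            return EMOTION_LEXICON[key]
    return "neutral"

def synthesize(content: str) -> dict:
    """'First voice data to be output': content with neutral prosody."""
    return {"text": content, "rate": 1.0, "pitch": 1.0}

def apply_emotion(voice: dict, emotion: str) -> dict:
    """Generate the 'second voice data' carrying second emotion info."""
    prosody = EMOTION_PROSODY[emotion]
    return {**voice, **prosody, "emotion": emotion}

def output_voice(content: str) -> dict:
    first_emotion = analyze_emotion(content)   # first emotion information
    first_voice = synthesize(content)          # first voice data
    return apply_emotion(first_voice, first_emotion)  # second voice data

print(output_voice("Congratulations, you passed!"))
```

The key property the claim requires, that the first and second emotion information be matched/correlated, is captured here by using the classified content emotion directly to select the output prosody.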
5. A voice interaction method applied in an electronic device, characterized in that, the method comprises:

receiving a first voice data input by a user;

analyzing the first voice data to acquire a first emotion information, wherein the first emotion information is used to express the emotion of the user when the user inputs the first voice data;

acquiring a first response voice data with respect to the first voice data;

processing the first response voice data based on the first emotion information to generate a second response voice data with a second emotion information, wherein the second emotion information is used to express the emotion of the electronic device outputting the second response voice data to enable the user to acquire the emotion of the electronic device, and wherein the first emotion information and the second emotion information are matched to/correlated to each other;

outputting the second response voice data.

(Dependent claims: 6, 7, 8, 9)
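Claim 5 differs from claim 1 in where the emotion information comes from: it is derived from the user's input speech rather than from the content to be output. The sketch below illustrates that flow under simplifying assumptions; the acoustic features (energy, pitch), the thresholds, and every function name are hypothetical placeholders, since the patent does not specify how the user's emotion is detected.

```python
# Illustrative sketch of the claimed voice-interaction flow.
def analyze_user_emotion(voice: dict) -> str:
    """Derive the user's emotion ('first emotion information')
    from simple, assumed acoustic features of the input voice."""
    if voice["energy"] > 0.7 and voice["pitch_hz"] > 220:
        return "excited"
    if voice["energy"] < 0.3:
        return "sad"
    return "neutral"

def first_response(user_voice: dict) -> dict:
    """'First response voice data': content with neutral delivery."""
    return {"text": "Here is your answer.", "rate": 1.0, "pitch": 1.0}

# Delivery styles matched to the detected user emotion (assumed values).
RESPONSE_STYLE = {
    "excited": {"rate": 1.15, "pitch": 1.15},
    "sad":     {"rate": 0.90, "pitch": 0.90},
    "neutral": {"rate": 1.00, "pitch": 1.00},
}

def interact(user_voice: dict) -> dict:
    emotion = analyze_user_emotion(user_voice)  # first emotion information
    response = first_response(user_voice)       # first response voice data
    # Second response voice data: delivery styled so the device's
    # expressed emotion is matched/correlated to the user's emotion.
    return {**response, **RESPONSE_STYLE[emotion], "emotion": emotion}

print(interact({"energy": 0.8, "pitch_hz": 240}))
```

The design point the claim turns on is that the same emotion estimate extracted from the input drives the styling of the response, so the device's output emotion tracks the user's.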
10. An electronic device, characterized in that, the electronic device comprises:

a circuit board;

an acquiring unit electrically connected to the circuit board for acquiring a first content to be output;

a processing chip set on the circuit board for analyzing the first content to be output to acquire a first emotion information for expressing the emotion carried by the first content to be output; acquiring a first voice data to be output corresponding to the first content to be output; and processing the first voice data to be output based on the first emotion information to generate a second voice data to be output with a second emotion information, wherein the second emotion information is used to express the emotion of the electronic device outputting the second voice data to be output to enable the user to acquire the emotion of the electronic device, and wherein the first emotion information and the second emotion information are matched to/correlated to each other;

an output unit electrically connected to the processing chip for outputting the second voice data to be output.

(Dependent claims: 11, 12)
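Claim 10 recasts the method of claim 1 as hardware structure: an acquiring unit, a processing chip, and an output unit wired through a circuit board. One way to visualize that decomposition is as cooperating components, sketched below; the class names, the sample content, and the single-keyword emotion rule are illustrative assumptions, not the patent's implementation.

```python
# Structural sketch mapping the claimed units onto classes.
class AcquiringUnit:
    """Stands in for the claimed acquiring unit."""
    def acquire(self) -> str:
        return "Warning: battery low"  # assumed sample content

class ProcessingChip:
    """Stands in for the claimed processing chip: analyzes the
    content, builds first voice data, and emotionalizes it."""
    def process(self, content: str) -> dict:
        # First emotion information (toy one-keyword rule).
        emotion = "urgent" if content.lower().startswith("warning") else "neutral"
        first_voice = {"text": content, "rate": 1.0}  # first voice data
        # Second voice data: prosody matched to the content's emotion.
        rate = 1.25 if emotion == "urgent" else 1.0
        return {**first_voice, "rate": rate, "emotion": emotion}

class OutputUnit:
    """Stands in for the claimed output unit."""
    def output(self, voice: dict) -> str:
        return f"[{voice['emotion']}] {voice['text']} (rate={voice['rate']})"

class ElectronicDevice:
    """Circuit-board analogue: connects the three units together."""
    def __init__(self):
        self.acquiring = AcquiringUnit()
        self.chip = ProcessingChip()
        self.out = OutputUnit()

    def speak(self) -> str:
        content = self.acquiring.acquire()
        second_voice = self.chip.process(content)
        return self.out.output(second_voice)

print(ElectronicDevice().speak())
```

Separating acquisition, processing, and output into distinct units mirrors the claim's structure, where each unit is individually recited and electrically connected through the circuit board.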
13. An electronic device, characterized in that, the electronic device comprises:

a circuit board;

a voice receiving unit electrically connected to the circuit board for receiving a first voice data input by a user;

a processing chip set on the circuit board for analyzing the first voice data to acquire a first emotion information, wherein the first emotion information is used to express the emotion of the user when the user inputs the first voice data; acquiring a first response voice data with respect to the first voice data; and processing the first response voice data based on the first emotion information to generate a second response voice data with a second emotion information, wherein the second emotion information is used to express the emotion of the electronic device outputting the second response voice data to enable the user to acquire the emotion of the electronic device, and wherein the first emotion information and the second emotion information are matched to/correlated to each other;

an output unit electrically connected to the processing chip for outputting the second response voice data.

(Dependent claims: 14, 15, 16, 17)
Specification