Method and apparatus for generating meta data of content
First Claim
1. A method of generating emotional information in relation to content, the method comprising:
acquiring a user image while the content is reproduced;
extracting facial information from the user image;
generating the emotional information in relation to the content, by comparing the facial information with facial expression template information, wherein the emotional information comprises a quantitative measure of a strength of an emotion corresponding to an emotion type; and
transmitting the emotional information to a server which provides the content, wherein multiple user images of a same user in each scene are acquired, and the generating the emotional information comprises:
extracting a plurality of pieces of the facial information from the user images;
generating pieces of the emotional information corresponding to each piece of the facial information;
identifying the emotion type in each piece of the generated emotional information; and
generating a representative type of the emotion in each scene based on a number of times the emotion type is identified in corresponding pieces of the generated emotional information.
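As a concrete illustration of the comparison step in the claim, the sketch below matches extracted facial information against per-emotion expression templates and reports a quantitative strength. The template vectors, the feature format, and the inverse-distance similarity metric are all illustrative assumptions; the claim does not fix any of them.

```python
import math

# Hypothetical facial expression template information: one feature
# vector per emotion type (illustrative values, not from the patent).
EXPRESSION_TEMPLATES = {
    "joy": [0.9, 0.1, 0.8],
    "sadness": [0.1, 0.9, 0.2],
    "surprise": [0.5, 0.2, 0.9],
}

def generate_emotional_information(facial_information):
    """Compare facial information with each expression template and
    return (emotion_type, strength). Strength is a quantitative
    measure of how closely the face matches the winning template;
    here it is an inverse-distance similarity (an assumed metric)."""
    best_type, best_strength = None, -1.0
    for emotion_type, template in EXPRESSION_TEMPLATES.items():
        similarity = 1.0 / (1.0 + math.dist(facial_information, template))
        if similarity > best_strength:
            best_type, best_strength = emotion_type, similarity
    return best_type, best_strength
```

A feature vector equal to the `joy` template would yield `("joy", 1.0)`, the maximum strength under this metric.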
Abstract
A method and apparatus are provided for generating emotional information including a user's impressions in relation to multimedia content or meta data regarding the emotional information, and a computer readable recording medium storing the method. The meta data generating method includes receiving emotional information in relation to the content from at least one client system which receives and reproduces the content; generating meta data for an emotion using the emotional information; and coupling the meta data for the emotion to the content. Accordingly, it is possible to automatically acquire emotional information by using the facial expression of a user who is appreciating multimedia content, and use the emotional information as meta data.
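The server-side flow in the abstract (receive emotional information from client systems, generate meta data for an emotion, couple it to the content) might be sketched as follows. The report format, the per-scene averaging, and the dict-based coupling are assumptions for illustration only; the patent leaves these choices open.

```python
def generate_emotion_metadata(client_reports):
    """Aggregate emotional information received from client systems
    into meta data: average strength per (scene, emotion type).
    client_reports is a list of dicts such as
    {"scene": "s1", "emotion": "joy", "strength": 0.8} (assumed shape)."""
    sums = {}
    for report in client_reports:
        key = (report["scene"], report["emotion"])
        count, total = sums.get(key, (0, 0.0))
        sums[key] = (count + 1, total + report["strength"])
    return {key: total / count for key, (count, total) in sums.items()}

def couple_metadata_to_content(content, metadata):
    """Couple the emotion meta data to a content record (here, by
    embedding it in a dict; the coupling mechanism is an assumption)."""
    return {**content, "emotion_metadata": metadata}
```

Two clients reporting `joy` at strengths 0.5 and 1.0 for the same scene would yield meta data of 0.75 for that (scene, emotion) pair.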
13 Claims
1. A method of generating emotional information in relation to content, the method comprising:

acquiring a user image while the content is reproduced;
extracting facial information from the user image;
generating the emotional information in relation to the content, by comparing the facial information with facial expression template information, wherein the emotional information comprises a quantitative measure of a strength of an emotion corresponding to an emotion type; and
transmitting the emotional information to a server which provides the content, wherein multiple user images of a same user in each scene are acquired, and the generating the emotional information comprises:
extracting a plurality of pieces of the facial information from the user images;
generating pieces of the emotional information corresponding to each piece of the facial information;
identifying the emotion type in each piece of the generated emotional information; and
generating a representative type of the emotion in each scene based on a number of times the emotion type is identified in corresponding pieces of the generated emotional information.

View Dependent Claims (2, 3, 4, 5, 6)
7. A non-transitory computer readable recording medium storing a computer program for performing a method of generating emotional information in relation to content, the method comprising:
acquiring a user image while the content is reproduced;
extracting facial information from the user image;
generating the emotional information in relation to the content by comparing the facial information with facial expression template information, wherein the emotional information comprises a quantitative measure of a strength of an emotion corresponding to an emotion type; and
transmitting the emotional information to a server which provides the content, wherein multiple user images of a same user in each scene are acquired, and the generating the emotional information comprises:
extracting a plurality of pieces of the facial information from the user images;
generating pieces of the emotional information corresponding to each piece of the facial information;
identifying the emotion type in each piece of the generated emotional information; and
generating a representative type of the emotion in each scene based on a number of times the emotion type is identified in corresponding pieces of the generated emotional information.
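The per-scene aggregation recited in the method claims amounts to picking, for each scene, the emotion type identified the most times across that scene's pieces of emotional information. A minimal sketch, assuming a list-per-scene input format (the data shape is hypothetical):

```python
from collections import Counter

def representative_emotion_per_scene(scene_pieces):
    """scene_pieces maps a scene id to the list of (emotion_type,
    strength) pieces generated from that scene's user images
    (assumed shape). Returns the representative emotion type per
    scene, chosen by the number of times each type was identified."""
    representative = {}
    for scene_id, pieces in scene_pieces.items():
        # Count how many times each emotion type was identified.
        counts = Counter(emotion_type for emotion_type, _strength in pieces)
        representative[scene_id] = counts.most_common(1)[0][0]
    return representative
```

Three user images in a scene yielding `joy`, `joy`, `surprise` would give `joy` as that scene's representative type.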
8. An apparatus for generating emotional information in relation to content, the apparatus comprising:

an image input unit which acquires a user image while the content is reproduced;
a facial image extraction unit which extracts facial information from the user image;
an emotional information generating unit which compares the facial information with facial expression template information, and generates the emotional information in relation to the content based on a result of the comparison, wherein the emotional information comprises a quantitative measure of a strength of an emotion corresponding to an emotion type; and
a communication unit which transmits the emotional information to a server which provides the content, wherein multiple user images of a same user in each scene are acquired, and the emotional information generating unit:
extracts a plurality of pieces of the facial information from the user images;
generates pieces of the emotional information corresponding to each piece of the facial information;
identifies the emotion type in each piece of the generated emotional information; and
generates a representative type of the emotion in each scene based on a number of times the emotion type is identified in corresponding pieces of the generated emotional information.

View Dependent Claims (9, 10, 11, 12, 13)
Specification