Annotating stimulus based on determined emotional response
Abstract
A method of annotating audio-visual data is disclosed. The method includes detecting a plurality of facial expressions in an audience based on a stimulus, determining an emotional response to the stimulus based on the facial expressions and generating at least one annotation of the stimulus based on the determined emotional response.
16 Claims
1. A method of indexing a stimulus comprising:

- measuring a plurality of facial expressions in an audience of the stimulus;
- interpreting measurements of the facial expressions using a processor, wherein the processor interpreting the measurements produces a result estimating a mood for a time period of the stimulus, and wherein interpreting measurements of the facial expressions comprises:
  - classifying each of the facial expressions as corresponding to an emotion from a set of emotions; and
  - statistically analyzing the respective emotions corresponding to the plurality of facial expressions to estimate the mood;
- generating an annotation of the stimulus, wherein the annotation is generated based on the mood estimated;
- repeating the measuring, interpreting, and generating steps to produce a plurality of the annotations that respectively correspond to different time periods of the stimulus; and
- indexing the annotations according to the respective moods, wherein the indexing provides a two-way linkage between the annotations and the time periods of the stimulus.

Dependent claims: 2, 3, 4, 5, 6.
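The interpreting step above (classify each expression, then statistically analyze the resulting emotions) can be sketched as follows. This is a minimal illustration, not the patented method: the emotion labels are hypothetical, and the "statistical analysis" is assumed to be a simple majority vote, since the claim does not prescribe a particular statistic.

```python
from collections import Counter

def estimate_mood(expression_labels):
    """Estimate one mood for a time period from per-face emotion labels.

    Each label is the emotion an audience member's facial expression was
    classified as. A majority vote stands in for the claimed
    "statistically analyzing" step (an assumption, not the claim's text).
    """
    if not expression_labels:
        return None
    counts = Counter(expression_labels)
    mood, _ = counts.most_common(1)[0]
    return mood

# One time period: four audience members' classified expressions.
print(estimate_mood(["happy", "happy", "neutral", "sad"]))  # → happy
```

Any other aggregate (weighted scores, a distribution over emotions) would satisfy the claim language equally well; the vote is just the simplest concrete choice.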
7. A computer program product for annotating a stimulus, the computer program product comprising a non-transitory computer usable medium having computer readable program means for causing a computer to perform the steps of:

- detecting a plurality of facial expressions in an audience of the stimulus;
- interpreting the facial expressions to estimate a mood for a time period of the stimulus, wherein interpreting the facial expressions comprises:
  - classifying each of the facial expressions as corresponding to an emotion from a set of emotions; and
  - statistically analyzing the respective emotions corresponding to the plurality of facial expressions to estimate the mood;
- generating an annotation of the stimulus based on the mood estimated;
- repeating the detecting, interpreting, and generating steps to produce a plurality of the annotations that respectively correspond to different time periods of the stimulus; and
- indexing the annotations according to the respective moods, wherein the indexing provides a two-way linkage between the annotations and the time periods of the stimulus.

Dependent claims: 8, 9, 10.
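The final indexing step, with its "two-way linkage" between annotations and time periods, can be sketched as a pair of lookup tables. This is an assumed data-structure reading of the claim language: time periods are represented here as hypothetical (start, end) tuples, and an annotation is reduced to its mood label.

```python
def build_mood_index(annotations):
    """Index annotations by mood with a two-way linkage.

    `annotations` maps a time period (start, end) to its estimated mood.
    Returns (mood_to_periods, period_to_mood): from a mood one can reach
    the linked time periods, and from a time period the annotated mood.
    """
    mood_to_periods = {}
    period_to_mood = {}
    for period, mood in annotations.items():
        mood_to_periods.setdefault(mood, []).append(period)
        period_to_mood[period] = mood
    return mood_to_periods, period_to_mood

# Annotations for three consecutive 30-second periods of the stimulus.
annotations = {(0, 30): "happy", (30, 60): "sad", (60, 90): "happy"}
by_mood, by_period = build_mood_index(annotations)
print(by_mood["happy"])     # → [(0, 30), (60, 90)]
print(by_period[(30, 60)])  # → sad
```

The two dictionaries together give the claimed linkage in both directions: mood to time periods and time period to annotation.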
11. A system for annotating a stimulus comprising:

- an image capture device; and
- a computer coupled to the image capture device, wherein the computer executes an annotation generation module that interacts with the image capture device, and wherein the annotation generation module comprises logic for performing the steps of:
  - detecting a plurality of facial expressions in an audience of the stimulus;
  - interpreting the facial expressions to estimate a mood for a time period of the stimulus, wherein interpreting the facial expressions comprises:
    - classifying each of the facial expressions as corresponding to an emotion from a set of emotions; and
    - statistically analyzing the respective emotions corresponding to the plurality of facial expressions to estimate the mood;
  - generating an annotation of the stimulus based on the mood estimated;
  - repeating the detecting, interpreting, and generating steps to produce a plurality of the annotations that respectively correspond to different time periods of the stimulus; and
  - indexing the annotations according to the respective moods, wherein the indexing provides a two-way linkage between the annotations and the time periods of the stimulus.

Dependent claims: 12, 13, 14, 15.
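The annotation generation module recited in the system claim can be sketched as a small driver that repeats the detect-interpret-generate loop over time periods. Everything here is a hypothetical stand-in: `capture_expressions` substitutes for the image capture device plus expression detection, `interpret` for the mood-estimation logic, and the frame data is fabricated for illustration.

```python
class AnnotationGenerationModule:
    """Sketch of the claimed module: repeatedly detect expressions,
    interpret them into a mood, and record one annotation per period."""

    def __init__(self, capture_expressions, interpret):
        # capture_expressions(period) stands in for the image capture
        # device plus facial-expression detection; interpret() is the
        # mood-estimation step (e.g. a majority vote over emotions).
        self.capture_expressions = capture_expressions
        self.interpret = interpret
        self.annotations = {}

    def run(self, periods):
        for period in periods:
            expressions = self.capture_expressions(period)
            self.annotations[period] = self.interpret(expressions)
        return self.annotations

# Fabricated per-period detections in place of a real camera feed.
fake_frames = {(0, 30): ["happy", "happy", "sad"], (30, 60): ["sad", "sad"]}
module = AnnotationGenerationModule(
    fake_frames.get,
    lambda labels: max(set(labels), key=labels.count),  # majority vote
)
print(module.run([(0, 30), (30, 60)]))
```

A real system would replace `fake_frames.get` with frame capture and a facial-expression classifier; the loop structure is what the claim describes.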
16. A method of indexing a stimulus, comprising:

- measuring a plurality of facial expressions in an audience of the stimulus;
- interpreting measurements of the facial expressions using a processor, wherein the processor interpreting the measurements produces a result estimating a mood for a time period of the stimulus;
- generating an annotation of the stimulus, wherein the annotation is generated based on the mood estimated;
- repeating the measuring, interpreting, and generating steps to produce a plurality of the annotations that respectively correspond to different time periods of the stimulus;
- indexing the annotations according to the respective moods, wherein the indexing provides a two-way linkage between the annotations and the time periods of the stimulus;
- employing a query that indicates a mood; and
- using the indexing of the annotations to find in the stimulus a time period that is linked to the mood indicated in the query.
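The two query steps that distinguish this claim can be sketched as a lookup against the mood index. The index literal below is hypothetical example data; the claim only requires that the query's mood lead back to linked time periods.

```python
def find_periods_for_mood(mood_to_periods, query_mood):
    """Use the mood index to find time periods linked to the queried mood."""
    return mood_to_periods.get(query_mood, [])

# Hypothetical index from moods to annotated time periods of the stimulus.
index = {"happy": [(0, 30), (60, 90)], "sad": [(30, 60)]}
print(find_periods_for_mood(index, "sad"))  # → [(30, 60)]
```

A query for a mood with no annotations simply returns no time periods, which is one reasonable reading of "find in the stimulus a time period that is linked to the mood".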
Specification