Method and system for measuring human response to visual stimulus based on changes in facial expression
Abstract
The present invention is a method and system for measuring human emotional response to a visual stimulus, based on the person's facial expressions. Given a detected and tracked human face, the face is accurately localized so that the facial features are correctly identified and localized. Face and facial features are localized using geometrically specialized learning machines. Then the emotion-sensitive features, such as the shapes of the facial features or facial wrinkles, are extracted. The facial muscle actions are estimated using a learning machine trained on the emotion-sensitive features. The instantaneous facial muscle actions are projected to a point in affect space, using the relation between the facial muscle actions and the affective state (arousal, valence, and stance). The series of estimated emotional changes renders a trajectory in affect space, which is further analyzed in relation to the temporal changes in the visual stimulus, to determine the response.
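The projection described in the abstract, from instantaneous facial muscle actions to a point in (arousal, valence, stance) affect space, with a series of such points forming the emotion trajectory, could be sketched as follows. This is a minimal illustration, not the patented implementation: the number of muscle actions and the mapping matrix values are placeholder assumptions.

```python
import numpy as np

# Hypothetical linear relation between facial muscle actions (e.g.,
# FACS-like action units) and the three affect-space axes
# (arousal, valence, stance). The values are illustrative only.
NUM_ACTIONS = 6
ACTION_TO_AFFECT = np.array([
    [ 0.8,  0.1, -0.2],   # action 1 -> (arousal, valence, stance)
    [ 0.3, -0.7,  0.1],
    [-0.2,  0.6,  0.4],
    [ 0.5, -0.4, -0.6],
    [ 0.1,  0.8,  0.2],
    [-0.6, -0.3,  0.5],
])

def affect_coordinates(action_intensities):
    """Project instantaneous muscle-action intensities to a single
    point in (arousal, valence, stance) affect space."""
    a = np.asarray(action_intensities, dtype=float)
    return a @ ACTION_TO_AFFECT

def emotion_trajectory(per_frame_actions):
    """A series of per-frame projections renders the emotion
    trajectory in affect space."""
    return np.stack([affect_coordinates(f) for f in per_frame_actions])

# Example: three frames of estimated muscle-action intensities.
rng = np.random.default_rng(0)
frames = [rng.random(NUM_ACTIONS) for _ in range(3)]
traj = emotion_trajectory(frames)
print(traj.shape)  # (3, 3): three time steps, three affect axes
```

The trajectory array can then be analyzed against the timeline of the visual stimulus, e.g., by aligning trajectory segments with stimulus changes.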
26 Claims
1. A method for determining a person's emotional response to a visual stimulus, based on the person's facial expressions, comprising the following steps of:
a) detecting and tracking a face from input images of the person captured by at least a means for capturing images, and localizing the face and facial features,
b) deriving emotion-sensitive feature filters by generating emotion-sensitive candidate filters and determining the emotion-sensitive feature filters by choosing the filters that yield high responses to a predefined number of facial images from the emotion-sensitive candidate filters,
c) extracting emotion-sensitive features from the face by applying the emotion-sensitive feature filters to localized facial features and transient features,
d) determining facial muscle actions of the face based on the emotion-sensitive features,
e) calculating likelihoods of expressions belonging to each of emotional categories using the facial muscle actions,
f) finding the changes in affective state, called an emotion trajectory, of the person based on the facial muscle actions, and
g) determining the response of the person to the visual stimulus, by analyzing the emotion trajectory,
wherein the likelihoods determine the coordinates of instances of emotion in affect space, and wherein a series of estimations for the coordinates generates the emotion trajectory in the affect space.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13)
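The filter-derivation of step b), generating candidate filters and keeping those that yield high responses over a set of facial images, could be sketched as below. This is a hedged illustration under stated assumptions: the claim does not specify the filter form or response measure, so random zero-mean kernels and a maximum-correlation response are used as placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_candidate_filters(n_filters, size):
    """Generate candidate emotion-sensitive filters as small
    zero-mean kernels (an assumed filter form)."""
    filters = rng.standard_normal((n_filters, size, size))
    return filters - filters.mean(axis=(1, 2), keepdims=True)

def filter_response(filt, image):
    """Maximum absolute correlation of the filter with all
    patches of the image (an assumed response measure)."""
    k = filt.shape[0]
    h, w = image.shape
    best = 0.0
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            r = abs(np.sum(filt * image[i:i + k, j:j + k]))
            best = max(best, r)
    return best

def select_filters(candidates, images, keep):
    """Choose the `keep` candidates with the highest mean
    response over the given facial images."""
    mean_resp = np.array([
        np.mean([filter_response(f, img) for img in images])
        for f in candidates
    ])
    order = np.argsort(mean_resp)[::-1]
    return candidates[order[:keep]]

# Example: 20 candidate 5x5 filters, 4 stand-in "facial" images.
images = [rng.standard_normal((16, 16)) for _ in range(4)]
candidates = generate_candidate_filters(20, 5)
selected = select_filters(candidates, images, keep=5)
print(selected.shape)  # (5, 5, 5): five retained 5x5 filters
```

Applying the retained filters to the localized facial features then yields the emotion-sensitive features of step c).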
14. An apparatus for determining a person's emotional response to a visual stimulus, based on the person's facial expressions, comprising:
a) means for detecting and tracking a face from input images of the person captured by at least a means for capturing images, and localizing the face and facial features,
b) means for deriving emotion-sensitive feature filters by generating emotion-sensitive candidate filters and determining the emotion-sensitive feature filters by choosing the filters that yield high responses to a predefined number of facial images from the emotion-sensitive candidate filters,
c) means for extracting emotion-sensitive features from the face by applying the emotion-sensitive feature filters to localized facial features and transient features,
d) means for determining facial muscle actions of the face based on the emotion-sensitive features,
e) means for calculating likelihoods of expressions belonging to each of emotional categories using the facial muscle actions,
f) means for finding the changes in affective state, called an emotion trajectory, of the person based on the facial muscle actions, and
g) means for determining the response of the person to the visual stimulus, by analyzing the emotion trajectory,
wherein the likelihoods determine the coordinates of instances of emotion in affect space, and wherein a series of estimations for the coordinates generates the emotion trajectory in the affect space.
- View Dependent Claims (15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26)
Specification