Learning emotional states using personalized calibration tasks
Abstract
A method for determining an emotional state of a subject taking an assessment. The method includes eliciting predicted facial expressions from a subject administered questions, each intended to elicit a certain facial expression that conveys a baseline characteristic of the subject; receiving a video sequence capturing the subject answering the questions; determining an observable physical behavior experienced by the subject across a series of frames corresponding to each sample question; associating the observed behavior with the emotional state that corresponds with the facial expression; and training a classifier using the associations. The method further includes receiving a second video sequence capturing the subject during an assessment and applying features extracted from the second video sequence to the classifier to determine the emotional state of the subject in response to an assessment item administered during the assessment.
21 Claims
1. A method for determining an emotional state of a subject taking an assessment, the method comprising:

generating a calibration task to elicit predicted responses from an associated subject administered the task, wherein each portion of the task is intended to elicit a certain emotional response that conveys a baseline characteristic of the associated subject;

receiving video data capturing the associated subject performing the calibration task, wherein each frame of the video data is synchronized within the task to correspond to a portion of the task;

processing the video data for determining an observable physical behavior experienced by the associated subject across a series of frames during each portion of the task;

detecting an emotional response experienced by the associated subject across the series of frames corresponding to each portion of the task;

associating (or tagging) the observed behavior with one of multiple emotional categories (or labels), wherein each category (or label) corresponds with one of the emotional responses; and

training a classifier using features extracted from the video data, wherein each class is one of the categories associated with the observed behavior. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
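The calibrate-tag-train loop recited in claim 1 can be sketched in code. This is a minimal illustration only: the emotion labels, the feature extractor, and the nearest-centroid classifier are assumptions made for the sketch; the claim does not specify any particular feature representation or learning algorithm, and a real system would extract facial features from actual video frames.

```python
# Illustrative sketch of the claim-1 calibration pipeline (hypothetical:
# labels, features, and classifier choice are not taken from the patent).
from statistics import mean

# Each calibration-task portion is designed to elicit one emotional
# response; frames captured during that portion are tagged with it.
CALIBRATION_PLAN = {
    "portion_1": "calm",
    "portion_2": "frustrated",
    "portion_3": "engaged",
}

def extract_behavior_features(frames):
    """Stand-in for video processing: reduce a series of frames
    (here, plain number lists) to one observable-behavior vector."""
    return [mean(dim) for dim in zip(*frames)]

def build_training_set(video_by_portion):
    """Tag each portion's observed behavior with its planned emotion."""
    features, labels = [], []
    for portion, frames in video_by_portion.items():
        features.append(extract_behavior_features(frames))
        labels.append(CALIBRATION_PLAN[portion])
    return features, labels

class NearestCentroidClassifier:
    """Minimal classifier: one centroid per emotional category."""
    def fit(self, X, y):
        self.centroids = {label: x for x, label in zip(X, y)}
        return self

    def predict(self, x):
        def dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lab: dist(self.centroids[lab]))

# Simulated per-portion frame data (two feature dimensions per frame).
video = {
    "portion_1": [[0.1, 0.2], [0.2, 0.1]],  # relaxed expression
    "portion_2": [[0.9, 0.8], [0.8, 0.9]],  # visible tension
    "portion_3": [[0.5, 0.9], [0.4, 0.8]],  # attentive gaze
}
X, y = build_training_set(video)
clf = NearestCentroidClassifier().fit(X, y)
print(clf.predict([0.85, 0.85]))  # -> frustrated
```

Because every calibration portion carries a known intended emotion, the tagging step needs no human annotation, which is the point of the personalized calibration task.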
12. A method for determining an emotional state of a subject taking an assessment, the method comprising:

generating sample questions to elicit predicted facial expressions from an associated subject administered the questions, wherein each question is intended to elicit a certain facial expression that conveys a baseline characteristic of the associated subject;

receiving a video sequence capturing the associated subject answering the questions, wherein each frame of the video sequence is synchronized within a sample question;

determining an observable physical behavior experienced by the associated subject across a series of frames corresponding to the sample question;

detecting a facial expression conveyed by the associated subject across the series of frames corresponding to the question;

associating (or tagging) the observed behavior with the emotional state that corresponds with the facial expression;

training a classifier using the associations;

receiving a second video sequence capturing the associated subject during an assessment administered after the sample questions; and

applying features extracted from the second video sequence to the classifier for determining the emotional state of the associated subject in response to an assessment item administered during the assessment.
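The synchronization element of claim 12 (each frame synchronized within a sample question) amounts to mapping frame timestamps onto question intervals. A possible sketch, with question start times, question IDs, and the frame rate all invented for illustration:

```python
# Hypothetical frame/question synchronization for claim 12: each frame
# index is mapped to the sample question on screen when it was captured.
from bisect import bisect_right

# (start_time_seconds, question_id) for each administered sample question.
QUESTION_STARTS = [(0.0, "Q1"), (12.5, "Q2"), (30.0, "Q3")]

def question_for_frame(frame_index, fps=30.0):
    """Return the question a frame belongs to, given its index and the
    capture frame rate."""
    t = frame_index / fps
    starts = [s for s, _ in QUESTION_STARTS]
    return QUESTION_STARTS[bisect_right(starts, t) - 1][1]

def frames_by_question(num_frames, fps=30.0):
    """Group a whole video sequence's frame indices by question."""
    groups = {}
    for i in range(num_frames):
        groups.setdefault(question_for_frame(i, fps), []).append(i)
    return groups

groups = frames_by_question(1200)  # 40 seconds of video at 30 fps
print(sorted(groups))              # -> ['Q1', 'Q2', 'Q3']
print(len(groups["Q1"]))           # -> 375 (frames captured during Q1)
```

Grouping frames this way yields, per question, the "series of frames" over which the observable physical behavior is determined.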
13. A calibration system for determining an emotional state of a subject taking an assessment, the system comprising:

a processor, and a non-transitory computer readable memory storing instructions that are executable by the processor to perform the operations of:

generating a calibration task to elicit predicted responses from an associated subject administered the task, wherein each portion of the task is intended to elicit a certain emotional response that conveys a baseline characteristic of the associated subject;

receiving image data from an image capture device capturing the associated subject performing the calibration task, wherein each frame of the image data is synchronized to correspond to a portion of the task;

determining an observable physical behavior experienced by the associated subject across a series of frames during the portion of the task;

detecting an emotional response experienced by the associated subject across the series of frames corresponding to the portion of the task;

associating (or tagging) the observed behavior with one of multiple emotional categories (or labels), wherein each category (or label) corresponds with one of the emotional responses;

training a classifier using the associations;

receiving second image data capturing the associated subject during an assessment administered after the calibration task; and

applying features extracted from the second image data to the classifier for determining the emotional state of the associated subject in response to an assessment item administered during the assessment. - View Dependent Claims (14, 15, 16, 17, 18, 19, 20, 21)
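The final element of claim 13, applying features from the second image data to the calibrated classifier, is a per-item inference step. A minimal sketch, assuming the same hypothetical nearest-centroid model; the centroid values and feature layout are invented:

```python
# Hypothetical inference step for claim 13: features extracted from the
# assessment video are applied, per assessment item, to the classifier
# calibrated earlier. Centroids and features are illustrative only.
def classify(features, centroids):
    """Assign the emotional category whose calibration centroid is
    nearest to the observed feature vector."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Per-category centroids learned during the calibration task.
centroids = {
    "calm": [0.15, 0.15],
    "frustrated": [0.85, 0.85],
    "engaged": [0.45, 0.85],
}

# Per-item feature vectors extracted from the second image data.
assessment_features = {
    "item_1": [0.20, 0.10],
    "item_2": [0.80, 0.90],
}
states = {item: classify(f, centroids)
          for item, f in assessment_features.items()}
print(states)  # -> {'item_1': 'calm', 'item_2': 'frustrated'}
```

Because the classifier was fit on the subject's own calibration data, the per-item predictions are personalized baselines rather than population-level emotion estimates.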
Specification