ENGAGEMENT LEVEL DETERMINATION AND DISSEMINATION
First Claim
1. An apparatus for engagement dissemination, the apparatus comprising:
a face detector configured to detect a face in video data;
a context filterer configured to determine a student is on-platform based on log data and to determine a section type for the student based on the log data;
an appearance monitor configured to:
select a first emotional classifier specific to the section type;
classify, using the first emotional classifier, a first emotional component based on the detected face;
select a first behavioral classifier specific to the section type; and
classify, using the first behavioral classifier, a first behavioral component based on the detected face;
a context-performance monitor configured to:
select a second emotional classifier specific to the section type;
classify, using the second emotional classifier, a second emotional component based on the log data and the section type;
select a second behavioral classifier specific to the section type; and
classify, using the second behavioral classifier, a second behavioral component based on the log data and the section type;
a fuser configured to:
combine the classified first emotional component and the second emotional component into an emotional state of the student based on confidence values of the first emotional component and the second emotional component;
combine the classified first behavioral component and the second behavioral component into a behavioral state of the student based on confidence values of the first behavioral component and the second behavioral component; and
determine an engagement level of the student based on the emotional state of the student and the behavioral state of the student.
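The claim recites combining classifier outputs "based on confidence values" but does not specify the fusion rule. A minimal sketch of one plausible reading, in which the higher-confidence component wins and agreeing labels average their confidences; all names (`Component`, `fuse`, `engagement_level`, and the example labels) are illustrative assumptions, not drawn from the patent:

```python
from dataclasses import dataclass

@dataclass
class Component:
    """A classified component: a label plus the classifier's confidence."""
    label: str         # e.g. "positive" / "negative", "on-task" / "off-task"
    confidence: float  # classifier confidence in [0, 1]

def fuse(first: Component, second: Component) -> Component:
    """Combine two classified components into one state by confidence.

    If the labels agree, keep the label and average the confidences;
    otherwise keep the higher-confidence component.
    """
    if first.label == second.label:
        return Component(first.label, (first.confidence + second.confidence) / 2)
    return first if first.confidence >= second.confidence else second

def engagement_level(emotional: Component, behavioral: Component) -> str:
    """Map fused emotional and behavioral states to an engagement level."""
    if emotional.label == "positive" and behavioral.label == "on-task":
        return "high"
    if emotional.label == "positive" or behavioral.label == "on-task":
        return "medium"
    return "low"
```

Other fusion rules (e.g. confidence-weighted voting across more than two classifiers) would satisfy the claim language equally well; the sketch only shows the shape of the data flow from the two monitors into the fuser.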
1 Assignment
0 Petitions
Abstract
Various systems and methods for engagement dissemination. A face detector detects a face in video data. A context filterer determines a student is on-platform and a section type. An appearance monitor selects an emotional classifier and a behavioral classifier specific to the section type; emotional and behavioral components are classified based on the detected face. A context-performance monitor selects an emotional classifier and a behavioral classifier specific to the section type; emotional and behavioral components are classified based on the log data. A fuser combines the emotional components into an emotional state of the student based on confidence values of the emotional components. The fuser combines the behavioral components into a behavioral state of the student based on confidence values of the behavioral components. The fuser determines an engagement level of the student based on the emotional state and the behavioral state of the student.
25 Claims
1. An apparatus for engagement dissemination, as recited in full under First Claim above. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. A method for engagement dissemination, the method comprising operations performed using an electronic processor, the operations comprising:
detecting a face in video data;
determining a student is on-platform based on log data;
determining a section type for the student based on the log data;
selecting, using an appearance monitor (APM), a first emotional classifier specific to the section type;
classifying, using the first emotional classifier, a first emotional component based on the detected face;
selecting, using the APM, a first behavioral classifier specific to the section type;
classifying, using the first behavioral classifier, a first behavioral component based on the detected face;
selecting, using a context-performance monitor (CPM), a second emotional classifier specific to the section type;
classifying, using the second emotional classifier, a second emotional component based on the log data and the section type;
selecting, using the CPM, a second behavioral classifier specific to the section type;
classifying, using the second behavioral classifier, a second behavioral component based on the log data and the section type;
combining the classified first emotional component and the second emotional component into an emotional state of the student based on confidence values of the first emotional component and the second emotional component;
combining the classified first behavioral component and the second behavioral component into a behavioral state of the student based on confidence values of the first behavioral component and the second behavioral component; and
determining an engagement level of the student based on the emotional state of the student and the behavioral state of the student. - View Dependent Claims (11, 12, 13, 14, 15, 16, 17, 18)
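Both monitors in the method claim select classifiers "specific to the section type". One natural implementation is a registry keyed by section type; a minimal sketch, where the section types, labels, and `select_classifiers` name are hypothetical examples rather than terms from the patent:

```python
from typing import Callable, Dict, Tuple

# A classifier maps extracted features to a (label, confidence) pair.
Classifier = Callable[[dict], Tuple[str, float]]

# Hypothetical registry: each section type maps to a pair of
# (emotional classifier, behavioral classifier). Real classifiers
# would be trained models; stubs stand in for them here.
CLASSIFIERS: Dict[str, Tuple[Classifier, Classifier]] = {
    "lecture": (lambda feats: ("neutral", 0.7), lambda feats: ("on-task", 0.8)),
    "quiz":    (lambda feats: ("anxious", 0.6), lambda feats: ("on-task", 0.9)),
}

def select_classifiers(section_type: str) -> Tuple[Classifier, Classifier]:
    """Return the (emotional, behavioral) classifier pair for a section type."""
    return CLASSIFIERS[section_type]
```

Keying the selection on the section type determined from the log data is what lets the same pipeline apply different trained models to, say, lecture video versus quiz activity.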
19. At least one non-transitory computer-readable medium including instructions for engagement dissemination, the computer-readable medium comprising instructions that, when executed by a machine, cause the machine to perform operations comprising:
detecting a face in video data;
determining a student is on-platform based on log data;
determining a section type for the student based on the log data;
selecting, using an appearance monitor (APM), a first emotional classifier specific to the section type;
classifying, using the first emotional classifier, a first emotional component based on the detected face;
selecting, using the APM, a first behavioral classifier specific to the section type;
classifying, using the first behavioral classifier, a first behavioral component based on the detected face;
selecting, using a context-performance monitor (CPM), a second emotional classifier specific to the section type;
classifying, using the second emotional classifier, a second emotional component based on the log data and the section type;
selecting, using the CPM, a second behavioral classifier specific to the section type;
classifying, using the second behavioral classifier, a second behavioral component based on the log data and the section type;
combining the classified first emotional component and the second emotional component into an emotional state of the student based on confidence values of the first emotional component and the second emotional component;
combining the classified first behavioral component and the second behavioral component into a behavioral state of the student based on confidence values of the first behavioral component and the second behavioral component; and
determining an engagement level of the student based on the emotional state of the student and the behavioral state of the student. - View Dependent Claims (20, 21, 22, 23, 24, 25)
Specification