Emotion classification based on expression variations associated with same or similar emotions
First Claim
1. A system, comprising:
a memory that stores computer executable components;
a processor that executes the computer executable components stored in the memory, wherein the computer executable components comprise:
a clustering component that partitions a data set comprising facial expression data into different clusters of the facial expression data based on one or more distinguishing features respectively associated with the different clusters, wherein the facial expression data reflects facial expressions respectively expressed by people, and wherein the clustering component iteratively partitions the data set into the different clusters such that respective iterations result in an incrementally increased number of the different clusters; and
a multi-task learning component that determines a final number of the different clusters for the data set using a multi-task learning process that is dependent on an output of an emotion classification model that classifies emotion types respectively associated with the facial expressions, wherein the multi-task learning process comprises iteratively applying the emotion classification model to the different clusters generated at the respective iterations, and determining the final number of the different clusters based on a number of clusters associated with an iteration of the respective iterations associated with a drop in a classification rate by the emotion classification model.
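The iterative partitioning recited in the clustering component, where respective iterations yield an incrementally increased number of clusters, could be sketched roughly as follows. The plain k-means routine and the toy 2-D stand-ins for facial-expression feature vectors are illustrative assumptions; the claim does not name any particular clustering algorithm.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: returns a cluster index for each point."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        # assign each point to its nearest center
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        # move each non-empty cluster's center to the mean of its points
        for j in range(k):
            if np.any(assign == j):
                centers[j] = points[assign == j].mean(axis=0)
    return assign

# Toy stand-in for facial-expression feature vectors (three loose groups).
rng = np.random.default_rng(1)
features = np.vstack([rng.normal(c, 0.1, size=(20, 2))
                      for c in ([0.0, 0.0], [3.0, 0.0], [0.0, 3.0])])

# Respective iterations produce an incrementally increased cluster count.
partitions = {k: kmeans(features, k) for k in (1, 2, 3)}
```

Each entry of `partitions` is one iteration's assignment of every data point to one of `k` clusters, which is the input the claimed multi-task learning step would evaluate.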
Abstract
Techniques are described that facilitate automatically distinguishing between different expressions of a same or similar emotion. In one embodiment, a computer-implemented method is provided that comprises partitioning, by a device operatively coupled to a processor, a data set comprising facial expression data into different clusters of the facial expression data based on one or more distinguishing features respectively associated with the different clusters, wherein the facial expression data reflects facial expressions respectively expressed by people. The computer-implemented method can further comprise performing, by the device, a multi-task learning process to determine a final number of the different clusters for the data set, wherein the multi-task learning process is dependent on an output of an emotion classification model that classifies emotion types respectively associated with the facial expressions.
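The stopping criterion described above (grow the number of clusters until the emotion classification rate drops, then keep the count from the iteration just before the drop) can be sketched as below. Here `cluster_fn`, `eval_fn`, and the toy classification-rate table are hypothetical stand-ins for the clustering step and the emotion classification model, which the abstract does not tie to any particular implementation.

```python
def choose_cluster_count(data, labels, max_clusters, cluster_fn, eval_fn):
    """Grow the number of clusters one at a time; stop at the first
    iteration whose classification rate drops, and return the count
    from the preceding iteration."""
    best_k, prev_rate = 1, None
    for k in range(1, max_clusters + 1):
        assignments = cluster_fn(data, k)          # partition into k clusters
        rate = eval_fn(data, labels, assignments)  # emotion-classification rate
        if prev_rate is not None and rate < prev_rate:
            break                                  # drop detected: stop here
        best_k, prev_rate = k, rate
    return best_k

# Toy stand-ins: the "model" just looks up a fixed rate per cluster count.
toy_rates = {1: 0.60, 2: 0.70, 3: 0.75, 4: 0.72, 5: 0.68}
final_k = choose_cluster_count(
    data=None, labels=None, max_clusters=5,
    cluster_fn=lambda d, k: k,
    eval_fn=lambda d, l, assignments: toy_rates[assignments],
)
print(final_k)  # 3: the rate drops when moving from 3 to 4 clusters
```

With the toy rate table the rate rises through three clusters and falls at four, so the sketch returns three, mirroring the claimed selection of the final number of clusters at the iteration associated with the drop.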
20 Claims
1. A system, comprising:
a memory that stores computer executable components;
a processor that executes the computer executable components stored in the memory, wherein the computer executable components comprise:
a clustering component that partitions a data set comprising facial expression data into different clusters of the facial expression data based on one or more distinguishing features respectively associated with the different clusters, wherein the facial expression data reflects facial expressions respectively expressed by people, and wherein the clustering component iteratively partitions the data set into the different clusters such that respective iterations result in an incrementally increased number of the different clusters; and
a multi-task learning component that determines a final number of the different clusters for the data set using a multi-task learning process that is dependent on an output of an emotion classification model that classifies emotion types respectively associated with the facial expressions, wherein the multi-task learning process comprises iteratively applying the emotion classification model to the different clusters generated at the respective iterations, and determining the final number of the different clusters based on a number of clusters associated with an iteration of the respective iterations associated with a drop in a classification rate by the emotion classification model.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A computer implemented method, comprising:
partitioning, by a device operatively coupled to a processor, a data set comprising facial expression data into different clusters of the facial expression data based on one or more distinguishing features respectively associated with the different clusters, wherein the facial expression data reflects facial expressions respectively expressed by people, and wherein the clustering comprises iteratively partitioning the data set into the different clusters such that respective iterations result in an incrementally increased number of the different clusters; and
performing, by the device, a multi-task learning process to determine a final number of the different clusters for the data set, wherein the multi-task learning process is dependent on an output of an emotion classification model that classifies emotion types respectively associated with the facial expressions, and wherein the performing the multi-task learning process comprises iteratively applying the emotion classification model to the different clusters generated at the respective iterations, and determining the final number of the different clusters based on a number of clusters associated with an iteration of the respective iterations associated with a drop in a classification rate by the emotion classification model.
- View Dependent Claims (10, 11, 12, 13, 14)
15. A computer program product facilitating automatically distinguishing between different facial expressions of a same emotion type associated with different demographic profiles, the computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by a processing component to cause the processing component to:
cluster a data set comprising facial expression data into different clusters of the facial expression data based on one or more distinguishing features respectively associated with the different clusters, wherein the facial expression data reflects facial expressions respectively expressed by people, and wherein the clustering of the data set comprises iteratively partitioning the data set into the different clusters such that respective iterations result in an incrementally increased number of the different clusters; and
determine a final number of the different clusters for the data set and facial feature representations for the different clusters using a multi-task learning process that is dependent on an output of an emotion classification model that classifies emotion types respectively associated with the facial expressions, wherein the multi-task learning process comprises iteratively applying the emotion classification model to the different clusters generated at the respective iterations, and determining the final number of the different clusters based on a number of clusters associated with an iteration of the respective iterations associated with a drop in a classification rate by the emotion classification model.
- View Dependent Claims (16, 17, 18, 19, 20)
Specification