METHOD FOR DETECTING FACIAL EXPRESSIONS AND EMOTIONS OF USERS
First Claim
1. A method for detecting a facial expression of a user comprises:
during a sampling interval, recording a set of electromyograph signals through a set of sense electrodes arranged about a viewing window in a virtual reality headset worn by a user;
deducting a reference signal from each electromyograph signal in the set of electromyograph signals to generate a set of composite signals;
for each composite signal in the set of composite signals, transforming the composite signal into a spectrum of oscillating electromyograph components within a frequency range of interest;
for each facial action unit in a set of facial action units, calculating a score indicating presence of the facial action unit in the user's facial musculature during the sampling interval based on the spectrum of oscillating electromyograph components;
mapping scores for the set of facial action units to a facial expression of the user during the sampling interval;
transforming the facial expression of the user to an emotion of the user based on an emotion model; and
outputting an identifier of the emotion to a device.
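As a rough illustration only, not the patent's disclosed implementation, the claim-1 pipeline might be sketched as follows. The sampling rate, EMG frequency band, linear action-unit scoring, and expression table below are all assumed placeholders:

```python
import numpy as np

FS = 1000          # sampling rate in Hz (assumed; not specified in the claim)
BAND = (20, 450)   # assumed frequency range of interest for surface EMG

def composite_signals(emg, reference):
    """Deduct a shared reference signal from each sense-electrode channel."""
    return emg - reference          # (n_channels, n_samples) - (n_samples,)

def emg_spectrum(composite, fs=FS, band=BAND):
    """Transform each composite signal into a spectrum of oscillating
    EMG components restricted to the frequency range of interest."""
    freqs = np.fft.rfftfreq(composite.shape[-1], d=1.0 / fs)
    spectra = np.abs(np.fft.rfft(composite, axis=-1))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask], spectra[..., mask]

def action_unit_scores(spectra, au_weights):
    """Score each facial action unit from per-channel band power
    (a linear model stands in for whatever scoring the patent uses)."""
    band_power = spectra.sum(axis=-1)     # (n_channels,)
    return au_weights @ band_power        # (n_action_units,)

def map_to_expression(scores, expression_table):
    """Map the action-unit score vector to the best-matching expression."""
    return max(expression_table, key=lambda name: scores @ expression_table[name])
```

A final emotion lookup (the claim's "emotion model") could then be as simple as a dictionary keyed on the expression name; the claim does not constrain its form.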
Abstract
A method for detecting facial emotions includes: recording a set of electromyograph signals through a set of sense electrodes arranged about a viewing window in a virtual reality headset; deducting a reference signal from each electromyograph signal in the set of electromyograph signals to generate a set of composite signals; for each composite signal in the set of composite signals, transforming the composite signal into a spectrum of electromyograph components; for each facial action unit in a set of facial action units, calculating a score indicating presence of the facial action unit in the user's facial musculature during the sampling interval based on the spectrum of electromyograph components; mapping scores for the set of facial action units to a facial expression of the user during the sampling interval; and transforming the facial expression of the user to an emotion of the user based on an emotion model.
17 Citations
20 Claims
1. A method for detecting a facial expression of a user comprises:
during a sampling interval, recording a set of electromyograph signals through a set of sense electrodes arranged about a viewing window in a virtual reality headset worn by a user;
deducting a reference signal from each electromyograph signal in the set of electromyograph signals to generate a set of composite signals;
for each composite signal in the set of composite signals, transforming the composite signal into a spectrum of oscillating electromyograph components within a frequency range of interest;
for each facial action unit in a set of facial action units, calculating a score indicating presence of the facial action unit in the user's facial musculature during the sampling interval based on the spectrum of oscillating electromyograph components;
mapping scores for the set of facial action units to a facial expression of the user during the sampling interval;
transforming the facial expression of the user to an emotion of the user based on an emotion model; and
outputting an identifier of the emotion to a device.
(Dependent claims: 2, 3, 4, 5, 6, 7, 8)
9. A method for detecting a facial expression of a user comprises:
during a sampling interval, recording a set of electromyograph signals through a set of sense electrodes arranged about a viewing window in a virtual reality headset worn by a user;
deducting a reference signal from each electromyograph signal in the set of electromyograph signals to generate a set of composite signals;
for each composite signal in the set of composite signals, transforming the composite signal into a spectrum of oscillating electromyograph components within a frequency range of interest;
for each facial action unit in a set of facial action units:
calculating a confidence level associated with the facial action unit based on the spectrum of oscillating electromyograph components; and
in response to the confidence level associated with the facial action unit exceeding a confidence threshold, identifying the facial action unit as a component facial action unit in a set of component facial action units;
mapping the set of component facial action units to a facial expression of the user during the sampling interval; and
outputting an identifier of the facial expression of the user to a device.
(Dependent claims: 10, 11, 12, 13, 14)
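Where claim 1 scores every action unit, claim 9 gates each action unit on a confidence level and keeps only those exceeding a threshold as "component" facial action units. A minimal sketch of that gating step, with hypothetical AU names, threshold, and expression definitions (none of which appear in the claim itself):

```python
def component_action_units(confidences, threshold=0.5):
    """Keep only action units whose confidence exceeds the threshold,
    per the 'in response to ... exceeding a confidence threshold' step."""
    return {au for au, c in confidences.items() if c > threshold}

def map_components_to_expression(components, expression_table):
    """Map the set of component AUs to the expression whose defining
    AU set it best matches (a simple overlap score, assumed here)."""
    def match(name):
        required = expression_table[name]
        return len(components & required) - len(required - components)
    return max(expression_table, key=match)
```

For example, confidences of {"AU6": 0.8, "AU12": 0.9, "AU4": 0.2} with a 0.5 threshold would yield components {"AU6", "AU12"}, which an expression table defining "smile" as {AU6, AU12} would map to "smile".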
15. A method for detecting a facial expression of a user comprises:
during a sampling interval, recording a set of electromyograph signals through a set of sense electrodes arranged about a viewing window in a virtual reality headset worn by a user;
deducting a reference signal from each electromyograph signal in the set of electromyograph signals to generate a set of composite signals;
for each composite signal in the set of composite signals, transforming the composite signal into a spectrum of oscillating electromyograph components within a frequency range of interest; and
for each facial action unit in a set of facial action units:
calculating a confidence level associated with the facial action unit based on the spectrum of oscillating electromyograph components and an action unit model; and
in response to the confidence level associated with the facial action unit exceeding a confidence threshold, outputting an identifier associated with the facial action unit.
(Dependent claims: 16, 17, 18, 19, 20)
Specification