Systems and methods for determining an emotional environment from facial expressions
First Claim
1. A wearable apparatus for determining an emotional environment of a user of the wearable apparatus, the wearable apparatus comprising:
a wearable image sensor configured to capture one or more images from an environment of the user; and
at least one processor programmed to:
analyze the one or more images to identify a facial expression of a person in the environment of the user;
aggregate and weight features of the facial expression over a period of time to determine a mood of the person;
receive information indicative of a state of the user;
determine, based on the mood and the state of the user, an interaction factor between the user and the person in the environment of the user; and
transmit, to an external device, information associated with the facial expression and the interaction factor to cause the external device to determine a classification associated with the emotional environment based on the information associated with the facial expression and the interaction factor.
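Read as a data-processing pipeline, the claimed steps (aggregate and weight expression features over time to determine a mood, then combine the mood with the user's state into an interaction factor) could be sketched as follows. The claim does not specify any particular features, weighting scheme, or combination rule; the feature names, exponential-decay weighting, and averaging below are illustrative assumptions only:

```python
from dataclasses import dataclass

@dataclass
class ExpressionSample:
    """One facial-expression observation (hypothetical feature set)."""
    smile: float      # intensity in [0, 1]
    frown: float      # intensity in [0, 1]
    timestamp: float  # seconds

def estimate_mood(samples, half_life=30.0):
    """Aggregate and weight expression features over a period of time.

    More recent samples are weighted more heavily via exponential decay;
    this weighting scheme is an assumption, not specified by the claim.
    Returns a mood score in [-1, 1].
    """
    if not samples:
        return 0.0
    latest = max(s.timestamp for s in samples)
    num = den = 0.0
    for s in samples:
        w = 0.5 ** ((latest - s.timestamp) / half_life)
        num += w * (s.smile - s.frown)  # positive vs. negative valence
        den += w
    return num / den

def interaction_factor(mood, user_state):
    """Combine the person's mood with the user's state (hypothetical rule)."""
    return 0.5 * (mood + user_state)
```

For example, a recent frown outweighs an older smile: with one smiling sample 30 seconds before one frowning sample, the weighted mood comes out negative.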
Abstract
A wearable apparatus is provided for capturing and processing images from an environment of a user. In one implementation, the wearable apparatus may determine an emotional environment of the user of the wearable apparatus. The wearable apparatus may include an image sensor and may capture one or more images from an environment around the user. The wearable apparatus may also be configured to analyze the one or more images to identify facial expressions of a person in the images. In some embodiments, the wearable apparatus may also identify the person in the one or more images. The wearable apparatus may also transmit information associated with the facial expression and/or identity to an external device.
15 Citations
35 Claims
1. A wearable apparatus for determining an emotional environment of a user of the wearable apparatus, the wearable apparatus comprising:
a wearable image sensor configured to capture one or more images from an environment of the user; and
at least one processor programmed to:
analyze the one or more images to identify a facial expression of a person in the environment of the user;
aggregate and weight features of the facial expression over a period of time to determine a mood of the person;
receive information indicative of a state of the user;
determine, based on the mood and the state of the user, an interaction factor between the user and the person in the environment of the user; and
transmit, to an external device, information associated with the facial expression and the interaction factor to cause the external device to determine a classification associated with the emotional environment based on the information associated with the facial expression and the interaction factor.
View Dependent Claims (2-19)
20. A method for determining an emotional environment of a user of a wearable apparatus, the method comprising:
obtaining one or more images of at least a portion of an environment of the user;
analyzing the one or more images to identify a facial expression of a person in the environment of the user;
aggregating and weighting features of the facial expression over a period of time to determine a mood of the person;
receiving information indicative of a state of the user;
determining, based on the mood and the state of the user, an interaction factor between the user and the person in the environment of the user; and
transmitting, to an external device, information associated with the facial expression and the interaction factor to cause the external device to determine a classification associated with the emotional environment based on the information associated with the facial expression and the interaction factor.
View Dependent Claims (21-35)
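On the external-device side, the claimed classification step could be a simple rule over the transmitted facial-expression label and interaction factor. The claim leaves the classification method open, so the labels, score adjustments, and thresholds below are hypothetical:

```python
def classify_environment(info):
    """Classify the emotional environment from transmitted information.

    `info` holds the facial-expression label and interaction factor that
    the wearable apparatus transmits to the external device. The label
    set and thresholds here are assumptions for illustration only.
    """
    score = info["interaction_factor"]
    if info["expression"] in ("smile", "laugh"):
        score += 0.2
    elif info["expression"] in ("frown", "scowl"):
        score -= 0.2
    if score > 0.25:
        return "positive"
    if score < -0.25:
        return "negative"
    return "neutral"
```

A smile can tip a mildly positive interaction factor into a "positive" classification, while a scowl pushes a neutral one toward "negative".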
Specification