Personalized situation awareness using human emotions and incident properties
First Claim
1. A method comprising:
receiving an incident report from a user device, the incident report pertaining to an incident involving a target;
obtaining, from at least one sensor of the user device, emotional data pertaining to a reporting person using the user device, wherein the emotional data includes at least one of heat, heartbeat, conductivity, text, speech, image, and video data;
processing the emotional data to determine an emotional status of the reporting person, wherein processing the emotional data includes at least one of face detection, voice/speech analytics, and text analytics;
processing the incident report to determine an incident type, incident category, and incident characteristic; and
transmitting an initiation dialog sequence to the user device, the initiation dialog sequence configured to initiate on the user device a cognitive dialog module that prompts a further interaction with the reporting person, wherein the further interaction is based at least in part on the emotional status of the reporting person and at least in part on the incident characteristic.
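The claimed method amounts to a three-stage pipeline: infer an emotional status from sensor data, classify the incident report, and compose an initiation dialog sequence conditioned on both. A minimal sketch follows; all function names, thresholds, and keyword rules are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the claimed pipeline; names and thresholds
# are assumptions for illustration, not the patented implementation.

def classify_emotion(sensor_data: dict) -> str:
    """Infer an emotional status from device sensor readings
    (heartbeat, skin conductivity) collected per the claim."""
    if sensor_data.get("heartbeat_bpm", 70) > 110:
        return "distressed"
    if sensor_data.get("skin_conductivity", 0.0) > 0.8:
        return "anxious"
    return "calm"

def classify_incident(report: str) -> dict:
    """Derive incident type, category, and characteristic from the
    free-text report (keyword matching stands in for text analytics)."""
    text = report.lower()
    if "fire" in text:
        return {"type": "fire", "category": "emergency",
                "characteristic": "spreading"}
    return {"type": "other", "category": "non-emergency",
            "characteristic": "unknown"}

def build_initiation_dialog(emotion: str, incident: dict) -> list:
    """Compose the initiation dialog sequence sent to the user device,
    conditioned on both emotional status and incident characteristics."""
    prompts = []
    if emotion == "distressed":
        prompts.append("Help is on the way. Take a slow breath.")
    prompts.append(f"Can you describe the {incident['type']} in more detail?")
    return prompts
```

For example, `build_initiation_dialog(classify_emotion({"heartbeat_bpm": 120}), classify_incident("There is a fire"))` would prepend a calming prompt before asking for further incident details.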
Abstract
The disclosure provides systems and methods for generating and delivering situation-dependent incident responses. An incident response is generated from incident characterizations together with emotional data pertaining to the reporting person. The generated response may also include an interactive component that obtains further situational details from the reporting person and other sources, thereby further refining the response and recommendations. Persons reporting incidents are thus aided by suitable recommendations and by assistance from appropriate service providers.
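The interactive refinement the abstract describes can be pictured as a loop in which each dialog turn folds newly gathered details back into the incident record before re-deriving a recommendation. A minimal sketch, with the merge rule and recommendation labels assumed for illustration:

```python
# Illustrative refinement step (an assumption, not the patent's method):
# details obtained through the dialog are merged into the incident
# record, and the recommendation is re-derived from the updated record.

def refine_response(incident: dict, new_details: dict) -> dict:
    """Merge dialog-gathered details into the incident record and
    update the recommended response accordingly."""
    updated = {**incident, **new_details}
    if updated.get("injuries"):
        updated["recommendation"] = "dispatch medical services"
    else:
        updated["recommendation"] = "dispatch standard responder"
    return updated
```

Repeated application of such a step, once per dialog turn, yields the progressively refined response and recommendations described above.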
20 Claims
1. (The method claim is reproduced in full under First Claim above; claims 2–14 depend on claim 1.)
15. A user device comprising:
at least one sensor collecting emotional data from a reporting person, wherein the emotional data includes at least one of heat, heartbeat, text, speech, image, and video data;
a reasoning module implementing at least one of face detection, voice/speech analytics, and text analytics to infer an emotional status of the reporting person;
a communications module communicating the emotional data, the emotional status, and an incident report to a server; and
a cognitive dialog module implementing a metaphor generator to prompt a further interaction with the reporting person.
(Claims 16–20 depend on claim 15.)
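Claim 15 names four cooperating device-side modules: a sensor, a reasoning module, a communications module, and a cognitive dialog module. A hypothetical composition of those modules is sketched below; every class and method name is an assumption, and the "metaphor generator" is reduced to a softened prompt template.

```python
# Hypothetical device-side composition of the modules named in claim 15.
# Class and method names are illustrative assumptions only.

class ReasoningModule:
    def infer_status(self, emotional_data: dict) -> str:
        # Stand-in for face detection / voice-speech / text analytics.
        return "distressed" if emotional_data.get("heartbeat_bpm", 70) > 110 else "calm"

class CognitiveDialogModule:
    def prompt(self, status: str, incident_type: str) -> str:
        # A metaphor generator might soften prompts for a distressed user;
        # here that is approximated with an alternate phrasing.
        if status == "distressed":
            return (f"Let's take this one step at a time. "
                    f"What does the {incident_type} look like now?")
        return f"Please describe the {incident_type}."

class UserDevice:
    def __init__(self):
        self.reasoning = ReasoningModule()
        self.dialog = CognitiveDialogModule()

    def report(self, emotional_data: dict, incident_type: str) -> str:
        status = self.reasoning.infer_status(emotional_data)
        # A communications module would transmit the emotional data,
        # the inferred status, and the incident report to a server here.
        return self.dialog.prompt(status, incident_type)
```

In this sketch the reasoning module runs on the device, matching claim 15's device-side inference, while the server interaction of claim 1 is left as a comment.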
Specification