Method and apparatus for correlating and viewing disparate data
First Claim
1. A method for generating data to provide situational awareness or decision-making assistance to a user in relation to a physical environment, the method comprising, with a computer system:
- processing input data comprising at least data associated with the physical environment; and
- when a need for situational awareness or decision-making assistance is detected based on the input data, generating response data, the response data derived from multimodal data from a plurality of electronic data streams comprising audio, visual and textual information, the data streams received from a plurality of data sources, wherein generating the response data comprises:
determining a characteristic of the need for situational awareness or decision-making assistance;
extracting semantic information from the audio, visual and textual information;
correlating the extracted semantic information in accordance with the characteristic;
selecting a subset of the audio, visual and textual information based on the correlation of the extracted semantic information with the characteristic; and
outputting at least a portion of the selected subset as the response data.
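The claimed steps can be illustrated with a minimal sketch. All names below, and the simple "semantic tag overlap" correlation heuristic, are hypothetical stand-ins; the claim does not prescribe any particular implementation.

```python
# Hypothetical sketch of the claim-1 pipeline: detect a need, determine its
# characteristic, correlate extracted semantics against it, select a subset,
# and output a portion of that subset as the response data.
from dataclasses import dataclass

@dataclass
class StreamItem:
    modality: str        # "audio" | "visual" | "textual"
    semantics: set       # semantic tags extracted upstream (e.g., ASR, vision, NLP)
    payload: str

def detect_need(input_data):
    # Trivial trigger: any alert in the environment data signals a need,
    # and the alert label serves as the characteristic of that need.
    return input_data.get("alert")

def generate_response(input_data, streams):
    characteristic = detect_need(input_data)
    if characteristic is None:
        return []
    # Correlate each stream's extracted semantics with the characteristic,
    # then select the subset whose semantics mention it.
    selected = [s for s in streams if characteristic in s.semantics]
    return [s.payload for s in selected]

streams = [
    StreamItem("textual", {"fire", "evacuation"}, "Evacuation route via stairwell B"),
    StreamItem("audio", {"music"}, "Lobby background audio"),
    StreamItem("visual", {"fire", "smoke"}, "Camera 3: smoke on floor 2"),
]
print(generate_response({"alert": "fire"}, streams))
```

A production system would replace the tag-overlap test with a learned relevance model, but the control flow mirrors the claimed sequence of steps.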
Abstract
Methods and apparatuses of the present invention generally relate to generating actionable data based on multimodal data from unsynchronized data sources. In an exemplary embodiment, the method comprises receiving multimodal data from one or more unsynchronized data sources; extracting concepts from the multimodal data, the concepts comprising at least one of objects, actions, scenes and emotions; indexing the concepts for searchability; and generating actionable data based on the concepts.
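The abstract's extract-index-generate flow can be sketched with an inverted index. The concept labels and the notion of "actionable data" as the set of items matching a queried concept are illustrative assumptions, not the patent's definition.

```python
# Hypothetical sketch: index extracted concepts (objects, actions, scenes,
# emotions) from multimodal items, then retrieve the items correlated by a
# shared concept as the "actionable data".
from collections import defaultdict

def build_index(items):
    # items: iterable of (item_id, concepts) pairs, where concepts is a set
    # of labels detected in that piece of multimodal data.
    index = defaultdict(set)
    for item_id, concepts in items:
        for concept in concepts:
            index[concept].add(item_id)
    return index

def actionable(index, concept):
    return sorted(index.get(concept, set()))

idx = build_index([
    ("clip1", {"person", "running", "street"}),  # objects / action / scene
    ("clip2", {"person", "fear"}),               # object / emotion
    ("memo1", {"evacuation", "street"}),
])
print(actionable(idx, "street"))  # -> ['clip1', 'memo1']
```

Because the index keys on concepts rather than timestamps, the sources need not be synchronized, which matches the abstract's framing.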
20 Claims

1. A method for generating data to provide situational awareness or decision-making assistance to a user in relation to a physical environment (independent claim, set out in full above under "First Claim"). Dependent claims: 2-15.
16. A method for generating informational assistance for a user in relation to a physical environment, the method comprising, with a computer system:
- processing input data to identify a need for informational assistance relating to the physical environment;
- determining a characteristic of the physical environment, in addition to a geographical location, associated with the need for informational assistance;
- extracting semantic information from a plurality of electronic data streams comprising audio, visual and textual information, the data streams received from a plurality of data sources;
- correlating the extracted semantic information with the characteristic and the geographical location;
- selecting a subset of the audio, visual and textual information based on the correlation of the extracted semantic information with the physical characteristic and the geographical location; and
- generating the informational assistance comprising at least a portion of the selected subset; and outputting the informational assistance to the user.

Dependent claims: 17-20.
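Claim 16 differs from claim 1 chiefly in correlating against both a characteristic and a geographical location. A minimal sketch of that two-axis selection step, with an assumed planar-distance rule standing in for whatever correlation the claim covers:

```python
# Hypothetical sketch of claim 16's selection: keep items whose semantics
# match the environmental characteristic AND whose position is near the
# geographical location. The distance threshold is an illustrative choice.
def select_assistance(items, characteristic, location, max_dist=1.0):
    def near(a, b):
        # Crude planar distance; a real system would use geodesic distance.
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 <= max_dist
    return [text for text, tags, pos in items
            if characteristic in tags and near(pos, location)]

items = [
    ("Road flooded at 5th Ave", {"flooding"}, (0.2, 0.1)),
    ("Flood warning upstate", {"flooding"}, (50.0, 40.0)),  # too far away
    ("Concert tonight", {"music"}, (0.1, 0.0)),             # wrong topic
]
print(select_assistance(items, "flooding", (0.0, 0.0)))  # -> ['Road flooded at 5th Ave']
```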
Specification