MULTIMODAL NATURAL LANGUAGE INTERFACE FOR FACETED SEARCH
First Claim
1. A search interface comprising:
an electronic device comprising a plurality of interfaces, the interfaces collectively capable of receiving multimodal input including audio signals; and
a dialog interface module disposed within the electronic device and configured to:
obtain an audio signal representing audio data of a spoken utterance from the plurality of interfaces;
obtain a second signal representing a modality other than audio data from the plurality of interfaces;
map the audio signal and the second signal to a query interpretation by correlating the audio signal and the second signal to a plurality of data facets;
construct proposed search criteria representing the query interpretation and including alternative values for each data facet of the plurality of data facets;
present the proposed search criteria, including the alternative values, to a user via the electronic device;
receive at least one selected value from the presented alternative values;
identify a target search engine having an indexing system, based on the at least one selected value, the second signal, and the query interpretation;
translate the query interpretation to a targeted query as a function of the audio signal, the second signal, and the at least one selected value, the targeted query constructed according to the indexing system of the target search engine;
cause the targeted query to be submitted to the target search engine; and
enable the electronic device to present search results received from the target search engine in response to submission of the targeted query.
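The pipeline the claim recites (map multimodal input to data facets, propose alternative facet values, fold the user's selection back in, and emit a query in the target engine's index syntax) can be sketched in Python. Everything here is a hypothetical illustration, not the patented implementation: the facet vocabulary, the `key:value` query syntax, and all function names are invented for this example.

```python
from dataclasses import dataclass

# Hypothetical facet vocabulary; the claim does not name concrete facets.
FACETS = {
    "cuisine": ["italian", "thai", "mexican"],
    "price": ["cheap", "moderate", "expensive"],
}

@dataclass
class QueryInterpretation:
    # facet name -> value inferred from the multimodal input
    facets: dict

def map_to_interpretation(utterance: str, second_signal: dict) -> QueryInterpretation:
    """Correlate a spoken utterance and a second, non-audio signal
    (e.g. a touch selection or map location) to known data facets."""
    facets = {}
    tokens = utterance.lower().split()
    for facet, values in FACETS.items():
        for value in values:
            if value in tokens:
                facets[facet] = value
    # the second modality can also contribute facet values
    facets.update({k: v for k, v in second_signal.items() if k in FACETS})
    return QueryInterpretation(facets)

def propose_criteria(interp: QueryInterpretation) -> dict:
    """For each mapped facet, propose the alternative values a user may pick."""
    return {facet: FACETS[facet] for facet in interp.facets}

def translate_to_targeted_query(interp: QueryInterpretation, selected: dict) -> str:
    """Build a targeted query in a hypothetical engine's key:value index syntax,
    letting the user's selected values override the inferred ones."""
    merged = {**interp.facets, **selected}
    return " AND ".join(f"{k}:{v}" for k, v in sorted(merged.items()))

# Example round trip: utterance plus a touch-selected cuisine facet
interp = map_to_interpretation("find a cheap italian place", {"cuisine": "italian"})
proposed = propose_criteria(interp)   # alternatives presented to the user
query = translate_to_targeted_query(interp, {"price": "moderate"})
print(query)  # cuisine:italian AND price:moderate
```

Note how the user's selection (`price: moderate`) overrides the value inferred from the utterance (`cheap`), mirroring the claim's "receive at least one selected value" step before query translation.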
Abstract
Search interfaces, systems, and methods are presented. Contemplated search interfaces allow electronic devices to capture multimodal interaction data, including audio signals. A dialog interface processes the interaction data and communicates with the user to establish a desirable query interpretation. Further, the dialog interface can identify a target search engine for a corresponding query based on modalities of the interaction data beyond the data represented by the audio signal.
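The abstract's last point, selecting a target engine from the non-audio modality, could be as simple as a registry lookup keyed on the modality of the second signal. The modality names and engine identifiers below are invented for illustration; the patent does not enumerate them.

```python
# Hypothetical registry mapping a non-audio modality to an engine
# whose indexing system suits that modality.
ENGINE_BY_MODALITY = {
    "geolocation": "local_places_index",
    "image": "visual_search_index",
    "text": "general_web_index",
}

def identify_target_engine(second_signal_modality: str) -> str:
    """Pick an engine suited to the extra modality, falling back
    to a general-purpose index for unrecognized modalities."""
    return ENGINE_BY_MODALITY.get(second_signal_modality, "general_web_index")

print(identify_target_engine("geolocation"))  # local_places_index
```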
19 Claims
Dependent claims: 2–19.
Specification