Method for building a natural language understanding model for a spoken dialog system
Abstract
A method of generating a natural language model for use in a spoken dialog system is disclosed. The method comprises using sample utterances and creating a number of hand crafted rules for each call-type defined in a labeling guide. A first NLU model is generated and tested using the hand crafted rules and sample utterances. A second NLU model is built using the sample utterances as new training data and using the hand crafted rules. The second NLU model is tested for performance using a first batch of labeled data. A series of NLU models are built by adding a previous batch of labeled data to training data and using a new batch of labeling data as test data to generate the series of NLU models with training data that increases constantly. If not all the labeling data is received, the method comprises repeating the step of building a series of NLU models until all labeling data is received. After all the training data is received, at least once, the method comprises building a third NLU model using all the labeling data, wherein the third NLU model is used in generating the spoken dialog service.
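The bootstrapping stage described in the abstract (hand crafted rules created per call-type, each call-type pairing a category with a defined attribute) can be sketched as follows. This is a minimal illustration only; the call-types, patterns, and utterances are invented for the example and are not taken from the patent:

```python
import re

# Hypothetical call-types from a labeling guide: each pairs a call
# category with at least one defined attribute, per the claim language.
# Each call-type gets one hand crafted rule (here, a regex).
HAND_CRAFTED_RULES = {
    ("Request", "CallTransfer"): re.compile(r"\b(agent|representative|operator)\b", re.I),
    ("Ask", "BillingDate"): re.compile(r"\b(bill|billing)\b.*\b(date|due)\b", re.I),
    ("Report", "ServiceOutage"): re.compile(r"\b(outage|not working|down)\b", re.I),
}

def classify(utterance: str):
    """Return the first (category, attribute) call-type whose rule fires."""
    for call_type, pattern in HAND_CRAFTED_RULES.items():
        if pattern.search(utterance):
            return call_type
    return ("Other", "Unknown")

# Sample utterances as might appear in a labeling guide.
for u in ["I want to talk to an agent",
          "When is my bill due",
          "My internet is down"]:
    print(u, "->", classify(u))
```

A rule-based classifier like this serves as the first NLU model and, once its output is checked against the sample utterances, its labeled output can seed the training data for the statistical models that follow.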
17 Claims
1. A computing device for generating a natural language understanding (NLU) model for use in a spoken dialog service, the computing device comprising a processor and further comprising:
(a) a module configured to control the processor to use sample utterances to create at least one hand crafted rule for each call-type defined in a labeling guide, wherein each call-type comprises a call category with at least one defined attribute;
(b) a module configured to control the processor to generate and test a first NLU model using the at least one hand crafted rule and sample utterances;
(c) a module configured to control the processor to build a second NLU model using the sample utterances as new training data and using the at least one hand crafted rule;
(d) a module configured to control the processor to test the performance of the second NLU model using a first batch of labeled data;
(e) a module configured to control the processor to (1) build a series of NLU models by adding a previous batch of labeled data to training data and (2) use a new batch of labeling data as test data to generate the series of NLU models with training data that increases constantly;
(f) a module configured, if not all the labeling data is received, to control the processor to repeat steps performed by module (e) until all labeling data is received; and
(g) a module configured, after all the training data is received, at least once, to control the processor to build a third NLU model using all the labeling data, wherein the third NLU model is used in generating the spoken dialog service.
Dependent claims: 2, 3, 4
5. A system for generating a natural language understanding (NLU) model for use in a spoken dialog service, the system having a processor and comprising:
(a) a module configured to control the processor to build a first NLU model using sample utterances from a labeling guide, hand crafted rules and labeled utterances of available human/human dialogs or human/machine dialogs, if available;
(b) a module configured to control the processor to test the performance of the first NLU model using sample utterances in the labeling guide; and
(c) a module configured to control the processor to build a series of NLU models and evaluate the performance of the series of NLU models as labeled data becomes available by:
(i) adding a previous batch of labeled data to training data; and
(ii) using a new batch of labeling data as test data to generate the series of NLU models with training data that increases constantly.
Dependent claims: 6, 7, 8, 9
10. A tangible computer readable medium storing instructions for controlling a computing device to generate a natural language understanding (NLU) model for use in a spoken dialog system, the instructions comprising:
(a) controlling a processor to build a first NLU model using sample utterances from a labeling guide, hand crafted rules and labeled utterances of available human/human dialogs or human/machine dialogs, if available;
(b) controlling the processor to test the performance of the first NLU model using sample utterances in the labeling guide; and
(c) controlling the processor to build a series of NLU models and evaluate the performance of the series of NLU models as labeled data becomes available by:
(i) adding a previous batch of labeled data to training data; and
(ii) using a new batch of labeling data as test data to generate the series of NLU models with training data that increases constantly.
Dependent claims: 11, 12, 13, 14, 15, 16, 17
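Steps (c)(i) and (c)(ii), common to the independent claims, describe a rotating train/test scheme: each new batch of labeled data first serves as test data for the current model, then is folded into the ever-growing training set before the next model is built. A minimal sketch, using a hypothetical token-overlap classifier as a stand-in for a real NLU model and invented example data:

```python
def train(examples):
    """Hypothetical stand-in for NLU model training: memorize labeled utterances."""
    return list(examples)

def predict(model, utterance):
    """Label by best token overlap with a memorized training utterance."""
    tokens = set(utterance.lower().split())
    best = max(model, key=lambda ex: len(tokens & set(ex[0].lower().split())))
    return best[1]

def accuracy(model, batch):
    return sum(predict(model, u) == label for u, label in batch) / len(batch)

# Seed training data: sample utterances as from a labeling guide (illustrative).
training_data = [("talk to an agent", "Request(CallTransfer)"),
                 ("when is my bill due", "Ask(BillingDate)")]

# Labeled data arrives in batches over time (illustrative examples only).
batches = [
    [("i need an agent now", "Request(CallTransfer)")],
    [("what date is my bill due", "Ask(BillingDate)")],
]

model = train(training_data)
for batch in batches:                 # repeat until all labeling data is received
    score = accuracy(model, batch)    # (c)(ii): the new batch is the test data
    training_data.extend(batch)       # (c)(i): fold the batch into training data
    model = train(training_data)      # rebuild with constantly increasing training data
    print(f"train size {len(training_data)}, batch accuracy {score:.2f}")
```

Evaluating on each batch before absorbing it gives an unbiased performance estimate at every stage, which is the point of the claimed scheme; once the loop exhausts the batches, a final model built on all the labeled data corresponds to the "third NLU model" of claim 1.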
Specification