EFFICIENT WORD ENCODING FOR RECURRENT NEURAL NETWORK LANGUAGE MODELS
Abstract
Systems and processes for word encoding are provided. In accordance with one example, a method includes, at an electronic device with one or more processors and memory, receiving a user input, determining a first similarity between a representation of the user input and a first acoustic model of a plurality of acoustic models, and determining a second similarity between the representation of the user input and a second acoustic model of the plurality of acoustic models. The method further includes determining whether the first similarity is greater than the second similarity. In accordance with a determination that the first similarity is greater than the second similarity, the first acoustic model may be selected; and in accordance with a determination that the first similarity is not greater than the second similarity, the second acoustic model may be selected.
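The abstract's selection procedure compares two similarity scores and keeps the first acoustic model only when its similarity is strictly greater. A minimal sketch of that comparison, assuming vector representations and cosine similarity (the patent does not name a specific similarity measure; all function names here are hypothetical):

```python
from math import sqrt

def cosine_similarity(a, b):
    # One plausible similarity measure; the abstract leaves the metric unspecified.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def select_acoustic_model(user_repr, first_model, second_model):
    # Mirrors the abstract: select the first model only when its similarity
    # is strictly greater; the "not greater" branch selects the second model.
    first = cosine_similarity(user_repr, first_model)
    second = cosine_similarity(user_repr, second_model)
    return first_model if first > second else second_model
```

Note that ties fall to the second model, matching the abstract's "not greater" branch exactly.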
14 Claims
1. A non-transitory computer-readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to:

receive a representation of a current word of a word sequence, wherein the representation of the current word is indicative of a class of a plurality of classes and a word associated with the class;

determine a current word context based on the representation of the current word and a previous word context; and

provide a representation of a next word of the word sequence, wherein the representation of the next word of the word sequence is based on the current word context.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12.
13. A method comprising:

receiving a representation of a current word of a word sequence, wherein the representation of the current word is indicative of a class of a plurality of classes and a word associated with the class;

determining a current word context based on the representation of the current word and a previous word context; and

providing a representation of a next word of the word sequence, wherein the representation of the next word of the word sequence is based on the current word context.
14. An electronic device, comprising:

one or more processors;

a memory; and

one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for:

receiving a representation of a current word of a word sequence, wherein the representation of the current word is indicative of a class of a plurality of classes and a word associated with the class;

determining a current word context based on the representation of the current word and a previous word context; and

providing a representation of a next word of the word sequence, wherein the representation of the next word of the word sequence is based on the current word context.
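The three recited steps (a class-plus-word input representation, a context update from the representation and the previous context, and a next-word representation derived from that context) describe a class-factorized recurrent language model, where the output layer scores classes and words-within-class rather than a full vocabulary. A minimal sketch under assumed details not fixed by the claims (one-hot encodings, an Elman-style tanh recurrence, softmax outputs; all sizes and names hypothetical):

```python
import math
import random

def softmax(v):
    m = max(v)
    e = [math.exp(x - m) for x in v]
    s = sum(e)
    return [x / s for x in e]

class ClassFactoredRNNLM:
    """Hypothetical sketch: a word is encoded as a (class, word-within-class)
    pair, so the output needs n_classes + words_per_class units instead of
    one unit per vocabulary word."""

    def __init__(self, n_classes, words_per_class, hidden, seed=0):
        rng = random.Random(seed)
        self.n_classes = n_classes
        self.words_per_class = words_per_class
        self.hidden = hidden
        dim = n_classes + words_per_class  # class one-hot ++ word-in-class one-hot
        rnd = lambda r, c: [[rng.uniform(-0.1, 0.1) for _ in range(c)] for _ in range(r)]
        self.W_in = rnd(hidden, dim)            # input -> hidden
        self.W_rec = rnd(hidden, hidden)        # previous context -> hidden
        self.W_cls = rnd(n_classes, hidden)     # hidden -> class scores
        self.W_word = rnd(words_per_class, hidden)  # hidden -> word-in-class scores

    def step(self, class_idx, word_idx, prev_context):
        # "Representation of the current word": class and word-in-class one-hots.
        x = [0.0] * (self.n_classes + self.words_per_class)
        x[class_idx] = 1.0
        x[self.n_classes + word_idx] = 1.0
        # "Current word context" from the representation and the previous context.
        h = [math.tanh(sum(w * xi for w, xi in zip(self.W_in[j], x)) +
                       sum(w * hp for w, hp in zip(self.W_rec[j], prev_context)))
             for j in range(self.hidden)]
        # "Representation of the next word": class and word-in-class distributions.
        p_class = softmax([sum(w * hj for w, hj in zip(row, h)) for row in self.W_cls])
        p_word = softmax([sum(w * hj for w, hj in zip(row, h)) for row in self.W_word])
        return h, (p_class, p_word)
```

The factorization is what makes the encoding efficient: for a 10,000-word vocabulary split into 100 classes of 100 words, the output layer scores 200 units per step rather than 10,000.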
Specification