Efficient word encoding for recurrent neural network language models
First Claim
1. A non-transitory computer-readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to:
- receive a user input including a word sequence;
- provide a representation of a current word of the word sequence, wherein the representation of the current word is indicative of a class of a plurality of classes and a word associated with the class;
- determine a current word context based on a weighted representation of the current word and a weighted previous word context, wherein the current word context is a context at a first time and the previous word context is a context at a second time, and wherein the weighted representation of the current word is weighted with a first weight factor and the weighted previous word context is weighted with a second factor different than the first weight factor;
- provide a representation of a next word of the word sequence, wherein the representation of the next word of the word sequence is based on the current word context; and
- display, proximate to the user input, the next word of the word sequence.
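The claim's class-based representation can be illustrated with a short sketch. This is a hypothetical toy implementation, not the patented method itself: the vocabulary, the partition into classes, and all names (`VOCAB`, `CLASSES`, `encode`, `one_hot`) are illustrative assumptions. The point it demonstrates is that encoding a word as (class, word-within-class) yields a much shorter vector than a one-hot over the full vocabulary.

```python
# Hypothetical sketch of a class-based word representation: each word is
# identified by a class index plus an index within that class, rather than
# a single index into the whole vocabulary. All data here is illustrative.

VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]
# Assumed partition of the vocabulary into classes (e.g. by part of speech
# or frequency); real systems learn or derive such a partition.
CLASSES = {
    0: ["the", "on"],          # function words
    1: ["cat", "mat", "dog"],  # nouns
    2: ["sat", "ran"],         # verbs
}

def encode(word):
    """Return (class index, index of the word within its class)."""
    for class_id, members in CLASSES.items():
        if word in members:
            return class_id, members.index(word)
    raise KeyError(word)

def one_hot(index, size):
    vec = [0.0] * size
    vec[index] = 1.0
    return vec

# The representation concatenates a one-hot over classes with a one-hot
# over the words of that class: |classes| + |largest class| entries,
# instead of |vocabulary| entries for a flat one-hot encoding.
class_id, word_id = encode("cat")
max_class_size = max(len(m) for m in CLASSES.values())
representation = one_hot(class_id, len(CLASSES)) + one_hot(word_id, max_class_size)
```

For a realistic vocabulary of tens of thousands of words split into a few hundred classes, the same factorization shrinks the input vector by roughly two orders of magnitude, which is the efficiency the title refers to.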
Abstract
Systems and processes for efficient word encoding are provided. In accordance with one example, a method includes, at an electronic device with one or more processors and memory, receiving a user input including a word sequence, and providing a representation of a current word of the word sequence. The representation of the current word may be indicative of a class of a plurality of classes and a word associated with the class. The method further includes determining a current word context based on the representation of the current word and a previous word context, and providing a representation of a next word of the word sequence. The representation of the next word of the word sequence may be based on the current word context. The method further includes displaying, proximate to the user input, the next word of the word sequence.
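The context update described in the abstract, combining a weighted representation of the current word with a weighted previous context, can be sketched as follows. This is a minimal illustration under stated assumptions: real recurrent models use learned weight matrices rather than the scalar factors `w_input` and `w_context` used here, and the `tanh` nonlinearity and all values are illustrative.

```python
# Minimal sketch (assumed details) of the recurrent context update: the
# context at time t combines the current word's representation, weighted
# by one factor, with the context at time t-1, weighted by a different
# factor. Scalars stand in for the learned weight matrices of a real RNN.

import math

def update_context(current_word_vec, previous_context, w_input=0.7, w_context=0.3):
    """Context at a first time from the word at that time and the context
    at a second (earlier) time."""
    assert w_input != w_context  # the claim requires two distinct factors
    return [
        math.tanh(w_input * x + w_context * h)
        for x, h in zip(current_word_vec, previous_context)
    ]

context = [0.0, 0.0, 0.0]            # initial (empty) context
for word_vec in ([1.0, 0.0, 0.0],    # toy one-hot word representations
                 [0.0, 1.0, 0.0]):
    context = update_context(word_vec, context)
```

The representation of the next word is then derived from `context`, e.g. by scoring candidate classes and words against it, and the top candidate is displayed proximate to the user input.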
4276 Citations
39 Claims
1. A non-transitory computer-readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to:
- receive a user input including a word sequence;
- provide a representation of a current word of the word sequence, wherein the representation of the current word is indicative of a class of a plurality of classes and a word associated with the class;
- determine a current word context based on a weighted representation of the current word and a weighted previous word context, wherein the current word context is a context at a first time and the previous word context is a context at a second time, and wherein the weighted representation of the current word is weighted with a first weight factor and the weighted previous word context is weighted with a second factor different than the first weight factor;
- provide a representation of a next word of the word sequence, wherein the representation of the next word of the word sequence is based on the current word context; and
- display, proximate to the user input, the next word of the word sequence.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13)
14. A method comprising:
- receiving a user input including a word sequence;
- providing a representation of a current word of the word sequence, wherein the representation of the current word is indicative of a class of a plurality of classes and a word associated with the class;
- determining a current word context based on a weighted representation of the current word and a weighted previous word context, wherein the current word context is a context at a first time and the previous word context is a context at a second time, and wherein the weighted representation of the current word is weighted with a first weight factor and the weighted previous word context is weighted with a second factor different than the first weight factor;
- providing a representation of a next word of the word sequence, wherein the representation of the next word of the word sequence is based on the current word context; and
- displaying, proximate to the user input, the next word of the word sequence.
- View Dependent Claims (15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26)
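The efficiency of providing a next-word representation from a class-based encoding can be sketched as well. This is an illustrative assumption about how such a factored prediction typically works, not the patent's disclosed implementation: the scores, class structure, and names (`predict_next_word`, `softmax`) are hypothetical. Normalization runs over the classes plus the words of each class, rather than over the full vocabulary at once.

```python
# Hedged sketch of factored next-word prediction: P(word) is decomposed
# as P(class) * P(word | class), so each softmax is small. All scores and
# the class partition are illustrative toy values.

import math

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

CLASSES = {0: ["the", "on"], 1: ["cat", "mat", "dog"]}

def predict_next_word(class_scores, word_scores_per_class):
    """Return the word maximizing P(class) * P(word | class)."""
    class_probs = softmax(class_scores)
    best_word, best_prob = None, -1.0
    for class_id, members in CLASSES.items():
        word_probs = softmax(word_scores_per_class[class_id])
        for word, p in zip(members, word_probs):
            joint = class_probs[class_id] * p
            if joint > best_prob:
                best_word, best_prob = word, joint
    return best_word

# Toy scores, as might be produced from the current word context:
next_word = predict_next_word(
    class_scores=[0.2, 1.5],
    word_scores_per_class={0: [0.1, 0.0], 1: [2.0, 0.1, 0.3]},
)
```

The selected `next_word` is what the claims describe displaying proximate to the user input, e.g. as a typing suggestion.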
27. An electronic device, comprising:
- one or more processors;
- a memory; and
- one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for:
- receiving a user input including a word sequence;
- providing a representation of a current word of the word sequence, wherein the representation of the current word is indicative of a class of a plurality of classes and a word associated with the class;
- determining a current word context based on a weighted representation of the current word and a weighted previous word context, wherein the current word context is a context at a first time and the previous word context is a context at a second time, and wherein the weighted representation of the current word is weighted with a first weight factor and the weighted previous word context is weighted with a second factor different than the first weight factor;
- providing a representation of a next word of the word sequence, wherein the representation of the next word of the word sequence is based on the current word context; and
- displaying, proximate to the user input, the next word of the word sequence.
- View Dependent Claims (28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39)
Specification