Efficient word encoding for recurrent neural network language models

  • US 10,366,158 B2
  • Filed: 04/28/2016
  • Issued: 07/30/2019
  • Est. Priority Date: 09/29/2015
  • Status: Active Grant
  • ×
    • Pin Icon | RPX Insight
    • Pin
First Claim

1. A non-transitory computer-readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to:

  • receive a user input including a word sequence;

    provide a representation of a current word of the word sequence, wherein the representation of the current word is indicative of a class of a plurality of classes and a word associated with the class;

    determine a current word context based on a weighted representation of the current word and a weighted previous word context, wherein the current word context is a context at a first time and the previous word context is a context at a second time, and wherein the weighted representation of the current word is weighted with a first weight factor and the weighted previous word context is weighted with a second weight factor different than the first weight factor;

    provide a representation of a next word of the word sequence, wherein the representation of the next word of the word sequence is based on the current word context; and

    display, proximate to the user input, the next word of the word sequence.
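
The claimed context update can be illustrated with a minimal sketch: a class-factored input representation (class plus word-within-class), and a current context computed from the current-word representation under one weight factor and the previous context under a distinct second weight factor. This is an illustrative Elman-style reading of the claim, not the patented implementation; all names and sizes (`N_CLASSES`, `WORDS_PER_CLASS`, `HIDDEN`, the weight matrices) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not from the patent.
N_CLASSES, WORDS_PER_CLASS, HIDDEN = 4, 8, 16
VOCAB = N_CLASSES * WORDS_PER_CLASS

def encode(word_idx):
    """Representation indicative of a class and a word associated with that class."""
    cls, word = divmod(word_idx, WORDS_PER_CLASS)
    x = np.zeros(N_CLASSES + WORDS_PER_CLASS)
    x[cls] = 1.0                       # class one-hot
    x[N_CLASSES + word] = 1.0          # within-class one-hot
    return x

# Two distinct weight factors, as in the claim: one applied to the
# current-word representation, another to the previous word context.
W_in = rng.normal(scale=0.1, size=(HIDDEN, N_CLASSES + WORDS_PER_CLASS))
W_rec = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
W_out = rng.normal(scale=0.1, size=(VOCAB, HIDDEN))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(word_idx, prev_context):
    """Current context from the weighted current word plus weighted previous context;
    returns the context and a distribution over the next word."""
    context = sigmoid(W_in @ encode(word_idx) + W_rec @ prev_context)
    logits = W_out @ context
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return context, probs

context = np.zeros(HIDDEN)
for w in [3, 17, 9]:                   # a toy word sequence
    context, next_word_probs = step(w, context)
predicted_next_word = int(next_word_probs.argmax())
```

In a deployed system the argmax (or a sampled candidate) would be mapped back to a word string and shown proximate to the user's input as the suggested next word.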
