N-gram selection for practical-sized language models

  • US 8,655,647 B2
  • Filed: 03/11/2010
  • Issued: 02/18/2014
  • Est. Priority Date: 03/11/2010
  • Status: Active Grant
First Claim

1. In a computing environment, a computer-implemented method performed on at least one processor, comprising:

  • processing training data to train an N-gram model, including excluding a higher-order probability estimate for an N-gram in the model when a backoff probability estimate for the N-gram is within a maximum likelihood set determined by that N-gram and the N-gram's associated context.

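The selection criterion in the claim can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: it assumes the maximum likelihood set for an N-gram observed c times in a context observed N times is the interval [c/(N+1), (c+1)/(N+1)] (the range of probabilities under which c is the most likely count), and all function and variable names are hypothetical.

```python
def within_mls(count, context_total, backoff_prob):
    """Check whether a backoff probability falls inside the maximum
    likelihood set for an N-gram seen `count` times in a context seen
    `context_total` times.

    Assumed MLS interval: [c/(N+1), (c+1)/(N+1)] -- the probabilities
    for which `count` is the most likely outcome of N draws.
    """
    lower = count / (context_total + 1)
    upper = (count + 1) / (context_total + 1)
    return lower <= backoff_prob <= upper


def select_ngrams(ngram_counts, context_totals, backoff):
    """Keep only N-grams whose backoff estimate is NOT explained by the
    maximum likelihood set; for those, an explicit higher-order estimate
    is worth storing.  `backoff` maps an N-gram tuple to its backoff
    probability estimate (hypothetical interface).
    """
    kept = {}
    for ngram, count in ngram_counts.items():
        context = ngram[:-1]
        if not within_mls(count, context_totals[context], backoff(ngram)):
            kept[ngram] = count  # backoff estimate insufficient; keep explicit estimate
    return kept
```

Under this reading, an N-gram whose backoff estimate already lies in the MLS is statistically indistinguishable from its observed count, so excluding its higher-order estimate shrinks the model without measurable loss.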