
N-Gram Selection for Practical-Sized Language Models

  • US 20110224971A1
  • Filed: 03/11/2010
  • Published: 09/15/2011
  • Est. Priority Date: 03/11/2010
  • Status: Active Grant
First Claim

1. In a computing environment, a method performed on at least one processor, comprising, processing training data to train an N-gram model, including excluding a higher-order probability estimate for an N-gram in the model when a backoff probability estimate for the N-gram is within a maximum likelihood set determined by that N-gram and the N-gram's associated context.

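The selection criterion described in claim 1 can be sketched in code. This is a minimal illustration, not the patent's implementation: it assumes the "maximum likelihood set" for an N-gram observed c times in a context observed n times is the binomial interval [c/(n+1), (c+1)/(n+1)] (the probabilities under which c is the most likely count), and it uses a plain relative-frequency estimate for retained N-grams. All function and variable names are illustrative.

```python
def within_maximum_likelihood_set(backoff_prob, ngram_count, context_count):
    """Return True if the backoff probability estimate lies inside the
    maximum likelihood set determined by the N-gram's count and its
    context's count (binomial interpretation; an assumption, not taken
    from the patent text)."""
    lower = ngram_count / (context_count + 1)
    upper = (ngram_count + 1) / (context_count + 1)
    return lower <= backoff_prob <= upper


def select_ngrams(ngram_counts, context_counts, backoff_probs):
    """Keep an explicit higher-order probability estimate only when the
    backoff estimate falls outside the maximum likelihood set; otherwise
    exclude the N-gram from the higher-order model (the criterion of
    claim 1).

    ngram_counts:   dict mapping (context, word) -> count
    context_counts: dict mapping context -> total count of the context
    backoff_probs:  dict mapping (context, word) -> backoff probability
    """
    retained = {}
    for (context, word), count in ngram_counts.items():
        n = context_counts[context]
        p_backoff = backoff_probs[(context, word)]
        if not within_maximum_likelihood_set(p_backoff, count, n):
            # The backoff estimate is not adequate for this N-gram, so
            # store an explicit higher-order estimate (here a simple
            # relative frequency, chosen only for illustration).
            retained[(context, word)] = count / n
    return retained
```

In this sketch, N-grams whose backoff probability already falls inside the interval are simply dropped from the higher-order model, which is what lets the trained model stay at a practical size.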
  • 2 Assignments