Combined quantized and continuous feature vector HMM approach to speech recognition
First Claim
1. An HMM generator, comprising:
vector quantizing means for generating a model by quantizing vectors of a training pattern having a vector series, and converting said quantizing vectors into a label series of clusters to which they belong;
continuous probability distribution density HMM generating means for generating a continuous probability distribution density HMM from a quantized vector series corresponding to each label of said label series of clusters; and
label incidence calculating means for calculating the incidence of the labels in each state from said quantizing vectors of a training pattern classified in the same label series of clusters and the continuous probability distribution density HMM.
Abstract
A device that utilizes an HMM and is capable of achieving recognition at high accuracy with fewer calculations. The device has a vector quantizing circuit generating a model by quantizing vectors of a training pattern having a vector series, and converting the vectors into a label series of clusters to which they belong, a continuous probability distribution density HMM generating circuit for generating a continuous probability distribution density HMM from a quantized vector series corresponding to each label of the label series, and a label incidence calculating circuit for calculating the incidence of the labels in each state from the training vectors classified in the same clusters and the continuous probability distribution density HMM.
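As a concrete illustration of the vector-quantizing step described in the abstract, the sketch below assigns each feature vector to its nearest cluster centroid and emits the corresponding label series. The `quantize` helper and the sample vectors and centroids are hypothetical, not taken from the patent.

```python
import numpy as np

def quantize(vectors, centroids):
    """Convert a series of feature vectors into a label series by
    assigning each vector to the cluster whose centroid is nearest
    (the vector-quantizing step from the abstract)."""
    # dists[t, m] = squared distance from vector t to centroid m
    dists = ((vectors[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)  # label series of clusters

# Hypothetical 2-D feature vectors and M = 2 cluster centroids.
centroids = np.array([[0.0, 0.0], [10.0, 10.0]])
vectors = np.array([[0.5, -0.5], [9.0, 11.0], [0.1, 0.2]])
labels = quantize(vectors, centroids)
```

Each entry of `labels` is the index of the cluster the corresponding training vector falls into, which is exactly the "label series of clusters to which they belong" used by the later claims.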
13 Claims
1. An HMM generator, comprising:
vector quantizing means for generating a model by quantizing vectors of a training pattern having a vector series, and converting said quantizing vectors into a label series of clusters to which they belong;
continuous probability distribution density HMM generating means for generating a continuous probability distribution density HMM from a quantized vector series corresponding to each label of said label series of clusters; and
label incidence calculating means for calculating the incidence of the labels in each state from said quantizing vectors of a training pattern classified in the same label series of clusters and the continuous probability distribution density HMM. (Dependent claims: 2, 3, 4, 5, 6, 7, 8)
2. An HMM memory device, comprising:
state transition probability memory means for storing a state transition probability obtained by the HMM (Hidden Markov Model) generator according to claim 1, and label incidence memory means for storing a label incidence in each state.
3. A likelihood calculating device, comprising:
the vector quantizing means according to claim 1 for converting the quantized vector series to a label series of clusters by substituting labels for vectors of a feature vector series that constitute an input pattern for the vector quantizing means, and
likelihood calculating means for calculating, from a state transition probability and label incidence stored in an HMM memory device, the likelihood of the HMM described by parameters stored in the HMM memory device to the input pattern,
said HMM memory device comprising an HMM generator having vector quantizing means for generating a model by quantizing vectors of a training pattern having a vector series, and converting said quantizing vectors into a label series of clusters to which they belong, continuous probability distribution density HMM generating means for generating a continuous probability distribution density HMM from a quantized vector series corresponding to each label of said label series of clusters, and label incidence calculating means for calculating the incidence of the labels in each state from said quantizing vectors of a training pattern classified in the same label series of clusters and the continuous probability distribution density HMM.
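Once the input pattern has been converted to a label series, one standard way to realize the likelihood calculating means of claim 3 is the forward algorithm over the stored state transition probabilities and label incidences. The patent does not name a specific algorithm, and the model values below are hypothetical; this is a minimal sketch under those assumptions.

```python
import numpy as np

def forward_likelihood(label_seq, A, B, pi):
    """Likelihood of a label series under a discrete-observation HMM,
    computed with the forward algorithm from the state transition
    probabilities A[i, j] and label incidences B[i, m]."""
    alpha = pi * B[:, label_seq[0]]      # initialization
    for o in label_seq[1:]:
        alpha = (alpha @ A) * B[:, o]    # induction over the label series
    return alpha.sum()                   # termination

# Hypothetical 2-state model over M = 2 labels.
A = np.array([[0.7, 0.3], [0.4, 0.6]])   # state transition probabilities
B = np.array([[0.9, 0.1], [0.2, 0.8]])   # label incidences per state
pi = np.array([0.6, 0.4])                # initial state probabilities
p = forward_likelihood([0, 1, 0], A, B, pi)
```

For recognition as in claim 4, the same likelihood would be computed against one model per recognition unit and the unit with the highest likelihood chosen.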
4. A recognizing device comprising the likelihood calculating device according to claim 3 for each recognition unit, wherein:
the likelihood of the recognition models to the input signal is calculated, and the recognition unit to which the input signal corresponds is determined from the likelihood.
5. An HMM (Hidden Markov Model) generator according to claim 1, wherein
the label incidence calculating means provides a probability density of quantized vectors corresponding to the clusters from a probability density function of the continuous probability distribution density HMM in state i, and recognizes the probability density as incidence bim of Cm in the state i, where Cm (m = 1, . . . , M) is the cluster.
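Claim 5 reads the incidence bim of cluster Cm directly off the state's continuous density, evaluated at the quantized vector representing the cluster. A minimal one-dimensional sketch, with a hypothetical Gaussian standing in for the state-i probability density function:

```python
import math

def gaussian_pdf(x, mean, var):
    """1-D Gaussian density, a stand-in for the continuous probability
    distribution density of the HMM in state i."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical state-i density parameters and cluster centroids C_m.
mean_i, var_i = 0.0, 1.0
centroids = [-1.0, 0.0, 2.0]

# Incidence b_im of cluster C_m in state i: the state density evaluated
# at the quantized vector (centroid) representing the cluster.
b_i = [gaussian_pdf(c, mean_i, var_i) for c in centroids]
```

Clusters whose centroids lie near the mode of the state density receive the largest incidences, which is the intended combination of quantized labels with a continuous-density model.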
6. An HMM generator according to claim 5, wherein the label incidence calculating means includes an incidence normalizing means further calculating b̂im = bim/(bi1 + . . . + biM) from the bim and recognizes the normalized incidence b̂im as the incidence of Cm in the state i.
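The normalization in claim 6 simply divides each raw incidence by the sum over all M clusters, so that the incidences within each state form a probability distribution. The numbers here are hypothetical:

```python
def normalize_incidences(b_i):
    """Normalize raw incidences b_im by their sum b_i1 + ... + b_iM,
    so the normalized incidences in state i sum to one."""
    total = sum(b_i)
    return [b / total for b in b_i]

# Hypothetical raw incidences for one state over M = 3 clusters.
b_hat = normalize_incidences([0.24, 0.40, 0.05])
```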
7. An HMM generator according to claim 5, wherein the label incidence calculating means includes an incidence normalizing means further calculating b̂im = bim/(bi1 + . . . + biM) from the bim and recognizes the normalized incidence b̂im as the incidence of Cm in the state i.
8. An HMM generator according to claim 1, wherein
the label incidence calculating means calculates a probability distribution density of said quantizing vectors of a training pattern included in a cluster cm from a probability distribution density function of the continuous probability distribution density HMM in state i, obtains characteristic values such as the mean and median of the probability distribution density, and recognizes a characteristic value as an incidence bim of cm in the state i, where cm (m = 1, . . . , M) is the cluster.
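Unlike claim 5, which evaluates the state density at a single representative vector, claim 8 evaluates it at every training vector belonging to the cluster and takes a characteristic value (e.g. the mean or median) of those densities as the incidence. A hypothetical one-dimensional sketch:

```python
import math
import statistics

def gaussian_pdf(x, mean, var):
    """1-D stand-in for the state-i continuous probability
    distribution density of the HMM."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical training vectors that were quantized into cluster c_m,
# and hypothetical state-i density parameters.
cluster_members = [-0.5, 0.0, 0.4, 1.2]
mean_i, var_i = 0.0, 1.0

densities = [gaussian_pdf(x, mean_i, var_i) for x in cluster_members]
# Claim 8: take a characteristic value of these densities -- the mean
# or the median -- as the incidence b_im of c_m in state i.
b_im_mean = statistics.mean(densities)
b_im_median = statistics.median(densities)
```

Using all cluster members rather than one centroid makes the incidence reflect how the whole cluster sits under the state density.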
-
9. An HMM generator, comprising:
word pattern memory means for storing a training pattern and generating a series of feature vectors;
vector quantizing means connected to said word pattern memory means for quantizing vectors of a training pattern received from said word pattern memory means and converting said quantizing vectors into a label series of clusters to which they belong;
buffer memory means connected to said vector quantizing means for temporarily storing training word patterns of a word converted at said vector quantizing means;
parameter estimating means connected to said buffer memory means for generating a model corresponding to said word converted at said vector quantizing means;
parameter memory means connected to said parameter estimating means for storing re-estimated values of at least a transition probability for various states; and
label incidence calculating means connected to said parameter memory means for calculating the incidence of the labels in each state from said quantizing vectors of a training pattern classified in the same label series of clusters and the continuous probability distribution density HMM. (Dependent claims: 10, 11, 12, 13)
10. The HMM generator according to claim 9, further comprising:
clustering part means connected to said word pattern memory means for clustering said feature vectors as cluster members and generating a label of cluster members and its centroid.
13. The HMM generator according to claim 12, further comprising:
cluster vector memory means connected to said clustering part means, said vector quantizing means and said label incidence calculating means for storing respective vectors and centroids of clusters generated in said clustering part means.
Specification