Search results for: cdhmms
Number of results: 34
Based on the concept of multiple-stream prior evolution and posterior pooling, we propose a new incremental adaptive Bayesian learning framework for efficient on-line adaptation of the continuous density hidden Markov model (CDHMM) parameters. As a first step, we apply affine transformations to the mean vectors of CDHMMs to control the evolution of their prior distribution. This new stream of pri...
This paper addresses the problem of robust speech recognition in noisy conditions in the framework of hidden Markov models (HMMs) and missing feature techniques. It presents a new statistical approach to detection and estimation of unreliable features based on a probabilistic measure and Gaussian mixture model (GMM). In the estimation process, the GMM is compensated using parameters of the stat...
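The abstract above describes detecting unreliable feature components with a probabilistic measure under a Gaussian mixture model. The snippet below is a minimal sketch of that idea, not the paper's actual method: each feature dimension of a frame is marked reliable only if its best per-dimension log-likelihood under a clean-speech GMM clears a threshold. The GMM parameters and the threshold value are illustrative assumptions.

```python
import math

def gauss_logpdf(x, mean, var):
    """Log density of a one-dimensional Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def reliability_mask(frame, gmm, threshold=-4.0):
    """Mark each feature dimension reliable (True) when its best
    per-dimension log-likelihood under the clean-speech GMM exceeds
    a threshold; unreliable dimensions would then be re-estimated
    (imputed) rather than used directly in recognition."""
    mask = []
    for d, x in enumerate(frame):
        # best score over mixture components for this dimension
        best = max(gauss_logpdf(x, means[d], vars_[d])
                   for _, means, vars_ in gmm)
        mask.append(best > threshold)
    return mask

# toy clean-speech GMM: (weight, means, variances) per component
gmm = [(0.5, [0.0, 0.0], [1.0, 1.0]),
       (0.5, [3.0, 3.0], [1.0, 1.0])]

# second dimension lies far from both components, so it is flagged
print(reliability_mask([0.1, 9.0], gmm))
```

In a full missing-feature system the unreliable dimensions would be compensated, e.g. replaced by GMM-based estimates, before HMM scoring.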
We introduce a new adaptive Bayesian learning framework, called multiple-stream prior evolution and posterior pooling, for online adaptation of the continuous density hidden Markov model (CDHMM) parameters. Among three architectures we proposed for this framework, we study in detail a specific two-stream system where linear transformations are applied to the mean vectors of CDHMMs to control th...
We address the problem of robustness of auditory models as front ends for speech recognition. Auditory models have been referred to as superior front ends when speech is corrupted by noise or linear filtering, but there is not yet a deep understanding of their functioning. We analyze some commonly used auditory models and show that they present some interesting properties which are useful for robust...
In our previous works, a maximum likelihood training approach was developed based on the concept of stochastic vector mapping (SVM) that performs a frame-dependent bias removal to compensate for environmental variabilities in both training and recognition stages. Its effectiveness was confirmed by evaluation experiments on Aurora2 and Aurora3 databases. In this paper, we present an extended ML ...
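The abstract above describes stochastic vector mapping as a frame-dependent bias removal. As a rough illustration only (the functional form below is a common posterior-weighted-bias instantiation, not necessarily the paper's), each noisy frame is cleaned by subtracting a bias vector weighted by the posteriors of a diagonal-Gaussian mixture; the GMM and bias values here are made up.

```python
import math

def gmm_posteriors(y, gmm):
    """Posterior probability of each diagonal-Gaussian mixture
    component given frame y, computed in the log domain."""
    logs = []
    for w, mean, var in gmm:
        ll = math.log(w)
        for x, m, v in zip(y, mean, var):
            ll += -0.5 * (math.log(2 * math.pi * v) + (x - m) ** 2 / v)
        logs.append(ll)
    mx = max(logs)
    exps = [math.exp(l - mx) for l in logs]
    s = sum(exps)
    return [e / s for e in exps]

def svm_compensate(y, gmm, biases):
    """Frame-dependent bias removal: subtract, per dimension, the
    posterior-weighted environment bias of each mixture component."""
    post = gmm_posteriors(y, gmm)
    return [yi - sum(p * b[d] for p, b in zip(post, biases))
            for d, yi in enumerate(y)]
```

The same mapping is applied in both training and recognition, which is what lets the compensation and the acoustic models stay matched.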
This paper presents an automatic phrase boundary labeling method for speech synthesis database annotation using context-dependent hidden Markov models (CD-HMMs) and n-gram prior distributions. At the training stage, CD-HMMs are built to describe the conditional distribution of acoustic features given phonetic label and phrase boundary. In addition, n-gram models are estimated to represent the prior ...
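Combining a conditional acoustic model with an n-gram prior, as the abstract above describes, amounts to a Bayes decision rule: pick the boundary label maximizing acoustic log-likelihood plus prior log-probability. A minimal sketch, with invented toy scores standing in for the CD-HMM and bigram outputs:

```python
import math

def label_boundary(acoustic_loglik, ngram_logprob, history):
    """Bayes decision rule: choose the phrase-boundary label that
    maximizes acoustic log-likelihood (from CD-HMMs) plus the
    n-gram log prior given the preceding label history."""
    return max(acoustic_loglik,
               key=lambda lab: acoustic_loglik[lab]
                               + ngram_logprob[(history, lab)])

# toy scores: the acoustic model slightly prefers 'boundary', but the
# bigram prior after a preceding boundary strongly prefers 'none'
acoustic = {'boundary': -10.0, 'none': -10.5}
bigram = {('boundary', 'boundary'): math.log(0.1),
          ('boundary', 'none'): math.log(0.9)}
print(label_boundary(acoustic, bigram, 'boundary'))
```

In practice the decision would be made jointly over the whole label sequence (e.g. by Viterbi search) rather than frame by frame as shown here.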
In our previous works, a Switching Linear Gaussian Hidden Markov Model (SLGHMM) and its segmental derivative, SSLGHMM, were proposed to cast the problem of modelling a noisy speech utterance as a well-designed dynamic Bayesian network. We presented parameter learning procedures for both models under the maximum likelihood (ML) criterion. The effectiveness of such models was confirmed by evaluation e...
In this work the objective is to increase the accuracy of speaker-dependent phonetic transcription of spoken utterances using continuous density and semi-continuous HMMs. Experiments with LVQ-based corrective tuning indicate that the average recognition error rate can be made to decrease by about 5% to 10%. Experiments are also made to increase the efficiency of the Viterbi decoding by a discriminati...
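Since the abstract above concerns the efficiency of Viterbi decoding, here is a minimal log-domain Viterbi implementation over a generic HMM, as a baseline sketch of what such decoding optimizations start from (the paper's discriminative speed-up itself is not reproduced here):

```python
def viterbi(obs_loglik, log_trans, log_init):
    """Standard log-domain Viterbi decoding.
    obs_loglik[t][s]: log-likelihood of frame t under state s.
    log_trans[p][s]:  log transition probability from p to s.
    Returns the most likely state sequence."""
    n_states = len(log_init)
    delta = [log_init[s] + obs_loglik[0][s] for s in range(n_states)]
    back = []
    for t in range(1, len(obs_loglik)):
        new_delta, ptr = [], []
        for s in range(n_states):
            # best predecessor state for state s at time t
            best_prev = max(range(n_states),
                            key=lambda p: delta[p] + log_trans[p][s])
            ptr.append(best_prev)
            new_delta.append(delta[best_prev] + log_trans[best_prev][s]
                             + obs_loglik[t][s])
        delta = new_delta
        back.append(ptr)
    # backtrack from the best final state
    path = [max(range(n_states), key=lambda s: delta[s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```

Typical efficiency improvements prune this search, e.g. by discarding states whose `delta` score falls below the current best by more than a beam width.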
In this paper, we present a Hierarchical Correlation Compensation (HCC) scheme to reliably estimate full covariance matrices for Gaussian components in CDHMMs for speech recognition. First, we build a hierarchical tree in the covariance space, where each leaf node represents a Gaussian component in the CDHMM set. For all lower-level nodes in the tree, we estimate a diagonal covariance matrix as...
In this paper, we propose a model-based hierarchical clustering algorithm that automatically builds a regression class tree for the well-known speaker adaptation technique Maximum Likelihood Linear Regression (MLLR). When building a regression class tree, the mean vectors of the Gaussian components of the model set of a speaker independent CDHMM-based speech recognition system are collected as t...
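A regression class tree of the kind described above can be sketched by recursively splitting the pool of Gaussian mean vectors; the paper's clustering is model-based, whereas the toy version below uses plain Euclidean 2-means purely for illustration:

```python
def kmeans2(vecs, iters=10):
    """Split vectors into two clusters with plain 2-means
    (Euclidean), seeded with the two most distant points."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    c0, c1 = max(((a, b) for a in vecs for b in vecs),
                 key=lambda p: dist(*p))
    for _ in range(iters):
        groups = ([], [])
        for v in vecs:
            # index 0 (False) if v is closer to c0, else 1 (True)
            groups[dist(v, c0) > dist(v, c1)].append(v)
        c0, c1 = [tuple(sum(x) / len(g) for x in zip(*g)) if g else c
                  for g, c in zip(groups, (c0, c1))]
    return groups

def build_regression_tree(means, min_size=2):
    """Recursively split the pool of Gaussian mean vectors; each
    node of the resulting binary tree is a candidate MLLR
    regression class sharing one linear transform."""
    if len(means) <= min_size:
        return means
    left, right = kmeans2(means)
    if not left or not right:
        return means
    return (build_regression_tree(left, min_size),
            build_regression_tree(right, min_size))
```

At adaptation time, MLLR would estimate one affine transform per node chosen according to how much adaptation data falls under it.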