Search results for: cdhmms
Number of results: 34
This paper presents a Hidden Markov Model (HMM)-based variable speech rate Mandarin Chinese text-to-speech (TTS) system. In this system, parameters of spectrum, fundamental frequency and state duration are generated by a context-dependent HMM (CDHMM) whose model parameters are linearly interpolated from those of three CDHMMs trained on corpora at three different speech rates (SRs), i.e. fast, med...
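The interpolation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the choice of state means as the interpolated parameter, and the weight values are all assumptions for the example.

```python
import numpy as np

def interpolate_params(params_fast, params_medium, params_slow, weights):
    """Linearly interpolate corresponding CDHMM parameter vectors
    (e.g. state means) from models trained at three speech rates.

    weights: (w_fast, w_medium, w_slow), non-negative, summing to 1,
    assumed to be derived from the target speech rate.
    """
    w = np.asarray(weights, dtype=float)
    assert np.isclose(w.sum(), 1.0) and np.all(w >= 0)
    stacked = np.stack([params_fast, params_medium, params_slow])
    return np.tensordot(w, stacked, axes=1)

# Example: a state-mean vector weighted toward the fast-rate model.
mu = interpolate_params(np.array([1.0, 2.0]),
                        np.array([2.0, 3.0]),
                        np.array([3.0, 4.0]),
                        (0.5, 0.3, 0.2))
```

The same weighted sum applies element-wise to any parameter set that is aligned across the three rate-specific models.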
This work studies a Bayesian (or Maximum A Posteriori, MAP) approach to the adaptation of Continuous Density Hidden Markov Models (CDHMMs) to a specific condition of a speech recognition application. To improve model robustness, CDHMMs initially trained on laboratory data are adapted using context-dependent field utterances. Two specific problems have to be faced when using th...
The Switching Linear Gaussian (SLG) model was proposed recently for time series data with nonlinear dynamics. In this paper, we present a new modelling approach, called SLGHMM, that uses a hybrid Dynamic Bayesian Network of SLG models and Continuous Density HMMs (CDHMMs) to compensate for the nonstationary distortion that may exist in a speech utterance to be recognized. With this representation...
The Bayesian Learning approach (MAP, Maximum A Posteriori) can be used for the incremental training of Continuous Density Hidden Markov Models (CDHMMs), performed on speech data collected in real applications. The effectiveness of MAP depends heavily on the correct balance between the a priori knowledge and the field training data. In this paper we propose and evaluate several optimiz...
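The prior/data balance mentioned above is commonly controlled by a single hyperparameter in the standard MAP re-estimation formula for a Gaussian mean. The following is a generic sketch of that formula, not the optimizations proposed in the paper; `tau` and the function name are illustrative.

```python
import numpy as np

def map_update_mean(mu_prior, tau, data, posteriors):
    """MAP re-estimate of a Gaussian mean component.

    tau      : prior weight; larger values keep the mean closer to the
               a priori model, smaller values trust the field data more.
    data     : observed feature frames, shape (T, D).
    posteriors: per-frame occupation probabilities, shape (T,).
    """
    gamma = np.asarray(posteriors, dtype=float)
    x = np.asarray(data, dtype=float)
    numerator = tau * np.asarray(mu_prior, dtype=float) + gamma @ x
    return numerator / (tau + gamma.sum())

# With tau equal to the accumulated occupancy, the result sits halfway
# between the prior mean and the data mean.
m = map_update_mean([0.0], 2.0, [[1.0], [3.0]], [1.0, 1.0])
```

As the amount of adaptation data grows, `gamma.sum()` dominates `tau` and the estimate converges to the maximum-likelihood mean, which is what makes MAP suitable for incremental training.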
We present an efficient maximum likelihood (ML) training procedure for Gaussian mixture continuous density hidden Markov model (CDHMM) parameters. The procedure is based on the concept of approximate prior evolution, posterior intervention and feedback (PEPIF). In a series of experiments on training CDHMMs for a continuous Mandarin Chinese speech recognition task, the new PEPIF procedur...
Abstract. Recently, we proposed a new derivative of conventional continuous density hidden Markov modeling (CDHMM) that we call “subspace distribution clustering hidden Markov modeling” (SDCHMM). SDCHMMs can be created by tying low-dimensional subspace Gaussians in CDHMMs. In the tasks we tried, usually only 32–256 subspace Gaussian prototypes were needed in an SDCHMM-based system to maintain recognit...
Statistical speech recognition using continuous-density hidden Markov models (CDHMMs) has yielded many practical applications. However, in general, mismatches between the training data and input data significantly degrade recognition accuracy. Various acoustic model adaptation techniques using a few input utterances have been employed to overcome this problem. In this article, we survey these ad...
Despite what is generally believed, we have recently shown that discrete-distribution HMMs can outperform continuous-density HMMs at significantly faster decoding speeds. Recognition performance and decoding speed of the discrete HMMs are improved by using product-code Vector Quantization (VQ) and mixtures of discrete distributions. In this paper, we present efficient training and decoding algor...
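Product-code VQ, as referenced above, splits the feature vector into subvectors and quantizes each with its own small codebook, so the effective codebook size is the product of the subcodebook sizes. A minimal sketch, assuming nearest-neighbour search under Euclidean distance (the function name and codebook shapes are illustrative):

```python
import numpy as np

def product_code_quantize(x, codebooks):
    """Quantize each subvector of x against its own codebook.

    codebooks: list of (K_i, d_i) centroid arrays whose d_i sum to len(x).
    Returns one code index per subvector.
    """
    codes = []
    start = 0
    for cb in codebooks:
        d = cb.shape[1]
        sub = x[start:start + d]
        # Nearest centroid for this subvector.
        codes.append(int(np.argmin(np.linalg.norm(cb - sub, axis=1))))
        start += d
    return codes

# Two codebooks of sizes 2 and 2 give 2 * 2 = 4 effective codewords
# while only 4 centroids are stored and searched.
codes = product_code_quantize(
    np.array([0.9, 0.1, 0.2]),
    [np.array([[0.0], [1.0]]),
     np.array([[0.0, 0.0], [1.0, 1.0]])])
```

Because each subcodebook is searched independently, both storage and quantization cost grow with the sum of the subcodebook sizes rather than their product.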
This paper extends the evaluation of Hidden Markov Models with quantized parameters (qHMMs) presented in [5] to the case of speaker adaptive training. In speaker-independent speech recognition tasks, qHMMs were found to provide performance similar to that of the original continuous density HMMs (CDHMMs) with substantially reduced memory requirements. In this paper, we propose a Bayesian type of adaptat...