Search results for: gaussian mixture model
Number of results: 2,218,239
Mixture of Gaussian processes models extend a single Gaussian process with the ability to model multi-modal data and to reduce training complexity. Previous inference algorithms for these models are mostly based on Gibbs sampling, which can be very slow, particularly for large-scale data sets. We present a new generative mixture of experts model. Each expert is still a Gaussian process but ...
Knowledge of the noise probability density function (PDF) is central in signal detection problems, not only for optimum receiver structures but also for processing procedures such as power normalization. Unfortunately, the statistical knowledge must be acquired since the classical assumption of a Gaussian noise PDF is often not valid in underwater acoustics. In this report, we study statistical...
This report describes an implementation of the standard i-vector-PLDA framework for the Kaldi speech recognition toolkit. The existing speaker recognition system implementation is based on the Subspace Gaussian Mixture Model (SGMM) technique, although it shares many similarities with the standard implementation. In our implementation, we modified the code so that it mimics the standard algo...
The mixture of Gaussian distributions, a soft version of k-means [2], is considered a state-of-the-art clustering algorithm. It is widely used in computer vision for selecting classes, e.g., color [4, 1, 5], texture [1, 9], shapes [12, 10]. In this algorithm, each class is described by a Gaussian distribution, defined by its mean and covariance. The data is described by a weighted sum of these G...
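The soft clustering described above can be sketched with scikit-learn's `GaussianMixture`; the two synthetic clusters below are illustrative stand-ins for image classes, and are an assumption not taken from the abstract.

```python
# A minimal sketch of GMM-based soft clustering, assuming scikit-learn is
# available. Each component is a Gaussian defined by its mean and covariance,
# and the data is modeled as a weighted sum of these Gaussians.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two well-separated synthetic "classes" (illustrative data, not from the paper).
data = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[5.0, 5.0], scale=0.5, size=(200, 2)),
])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(data)

# Unlike hard k-means, each point receives a soft responsibility per component.
resp = gmm.predict_proba(data)
```

The key contrast with k-means is `predict_proba`: instead of a single label per point, each point carries a posterior weight for every Gaussian component.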
The bottleneck (BN) feature, particularly based on deep structures, has gained significant success in automatic speech recognition (ASR). However, applying the BN feature to small/medium-scale tasks is nontrivial. An obvious reason is that the limited training data prevent the training of a complicated deep network; another reason, which is more subtle, is that the BN feature tends to possess hig...
In this paper, we explore new high-level features for language identification. The recently introduced Subspace Gaussian Mixture Models (SGMM) provide an elegant and efficient way for GMM acoustic modelling, with mean supervectors represented in a low-dimensional representative subspace. SGMMs also provide an efficient way of speaker adaptation by means of low-dimensional vectors. In our framewo...
The HMM (Hidden Markov Model) is a probabilistic model of the joint probability of a collection of random variables with both observations and states. The GMM (Gaussian Mixture Model) is a finite mixture probability distribution model. Although the two models have a close relationship, they are always discussed independently and separately. The EM (Expectation-Maximization) algorithm is a general me...
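To make the EM connection concrete, here is a hand-rolled sketch of EM for a one-dimensional two-component GMM, assuming NumPy only; the data, initialization, and variable names (`mu`, `var`, `pi`) are illustrative choices, not taken from the abstract.

```python
# Minimal EM sketch for a 1-D GMM with two components (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data: two Gaussians centered at -2 and 3.
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 300)])

mu = np.array([-1.0, 1.0])   # initial means
var = np.array([1.0, 1.0])   # initial variances
pi = np.array([0.5, 0.5])    # initial mixing weights

def gauss(x, mu, var):
    """Gaussian density, broadcasting over components."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: posterior responsibility of each component for each point.
    w = pi * gauss(x[:, None], mu, var)          # shape (n, 2)
    r = w / w.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from responsibility-weighted data.
    nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    pi = nk / len(x)
```

After a few dozen iterations the estimated means approach the true component centers; the same E-step/M-step alternation, with responsibilities replaced by forward-backward posteriors, is what trains an HMM (the Baum-Welch algorithm).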
Under the Bayesian Ying-Yang (BYY) harmony learning theory, a harmony function has been developed for the Gaussian mixture model with the important feature that, via its maximization through a gradient learning rule, model selection can be made automatically during parameter learning on a set of sample data from a Gaussian mixture. This paper proposes two further gradient learning rules, called conj...
We introduce the mixture of Gaussian processes (MGP) model, which is useful for applications in which the optimal bandwidth of a map is input dependent. The MGP is derived from the mixture of experts model and can also be used for modeling general conditional probability densities. We discuss how Gaussian processes, in particular in the form of Gaussian process classification, the support vector mac...