Application of GMM in the Framework of the GMM-VSM System for Spoken Language Identification

Authors

  • Ghasemian, Fahimeh (Faculty of Computer Engineering and Information Technology, Amirkabir University of Technology)
  • Homayounpour, Mohammad Mehdi (Faculty of Computer Engineering and Information Technology, Amirkabir University of Technology)
Abstract:

GMM is one of the most successful models in the field of automatic language identification. In this paper we propose a new model, named the adapted-weight GMM (AW-GMM). The model is structurally similar to a GMM, but its weights are determined by the GMM-VSM LID system according to the power of each component in discriminating one language from the others. In addition, given the computational complexity of GMM-VSM, we propose a technique for constructing bigram component sequences that can be extended to higher-order sequences while reducing complexity. Experiments on four languages of the OGI corpus, namely English, Farsi, French, and German, demonstrate the effectiveness of the proposed techniques.
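To make the baseline concrete, the following is a minimal sketch of GMM-based language identification scoring, not the paper's AW-GMM method: each language is modeled by a diagonal-covariance GMM, a test utterance is scored by its average per-frame log-likelihood under each model, and the highest-scoring language wins. All model parameters and names here are invented for illustration.

```python
import numpy as np

def gmm_loglik(frames, weights, means, variances):
    """Average per-frame log-likelihood under a diagonal-covariance GMM.

    frames:    (T, D) feature vectors (e.g. MFCCs)
    weights:   (K,) mixture weights summing to 1
    means:     (K, D) component means
    variances: (K, D) diagonal covariances
    """
    # log N(x | mu_k, Sigma_k) for every frame/component pair -> (T, K)
    diff = frames[:, None, :] - means[None, :, :]           # (T, K, D)
    log_comp = -0.5 * (np.sum(diff**2 / variances, axis=2)
                       + np.sum(np.log(2 * np.pi * variances), axis=1))
    # log-sum-exp over components, weighted by the mixture weights
    log_joint = log_comp + np.log(weights)                  # (T, K)
    m = log_joint.max(axis=1, keepdims=True)
    per_frame = m[:, 0] + np.log(np.exp(log_joint - m).sum(axis=1))
    return per_frame.mean()

def identify(frames, language_models):
    """Return the language whose GMM scores the utterance highest."""
    scores = {lang: gmm_loglik(frames, *params)
              for lang, params in language_models.items()}
    return max(scores, key=scores.get)

# Toy 1-D example with two invented "language" models
models = {
    "english": (np.array([0.5, 0.5]), np.array([[0.0], [4.0]]),
                np.array([[1.0], [1.0]])),
    "farsi":   (np.array([0.5, 0.5]), np.array([[10.0], [14.0]]),
                np.array([[1.0], [1.0]])),
}
utterance = np.full((20, 1), 10.5)   # frames near the "farsi" means
print(identify(utterance, models))   # -> farsi
```

In the paper's AW-GMM, the mixture weights above would not come from maximum-likelihood training but would instead be set according to each component's discriminative power as measured by the GMM-VSM system.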


Similar resources

Comparison PCA Train GMM Feature Reduction Classify GMM Threshold

A new approach to face verification from 3D data is presented. The method uses 3D registration techniques designed to work with resolution levels typical of the irregular point cloud representations provided by Structured Light scanning. Preprocessing using a priori information of the human face and the Iterative Closest Point algorithm are employed to establish correspondence between test and ...


Gaussian Multipole Model (GMM).

An electrostatic model based on charge density is proposed as a model for future force fields. The model is composed of a nucleus and a single Slater-type contracted Gaussian multipole charge density on each atom. The Gaussian multipoles are fit to the electrostatic potential (ESP) calculated at the B3LYP/6-31G* and HF/aug-cc-pVTZ levels of theory and tested by comparing electrostatic dimer ene...


Single-tree GMM training

In this short document, we derive a tree-independent single-tree algorithm for Gaussian mixture model training, based on a technique proposed by Moore [8]. Here, the solution we provide is tree-independent and thus will work with any type of tree and any type of traversal; this is more general than Moore’s original formulation, which was limited to mrkd-trees. This allows us to develop a flexib...


Tunable GMM Kernels

The recently proposed “generalized min-max” (GMM) kernel [9] can be efficiently linearized, with direct applications in large-scale statistical learning and fast near neighbor search. The linearized GMM kernel was extensively compared in [9] with linearized radial basis function (RBF) kernel. On a large number of classification tasks, the tuning-free GMM kernel performs (surprisingly) well comp...
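For context, the "generalized min-max" (GMM) kernel mentioned in this entry has a simple closed form: each vector is split into its positive and negative parts so all coordinates are non-negative, and the kernel is the ratio of coordinate-wise minima to coordinate-wise maxima. The sketch below is a hedged illustration of that definition; the function names are invented, not taken from any reference implementation.

```python
import numpy as np

def transform(x):
    """Map x in R^D to a non-negative vector in R^{2D}:
    the positive part followed by the (negated) negative part."""
    x = np.asarray(x, dtype=float)
    return np.concatenate([np.maximum(x, 0.0), np.maximum(-x, 0.0)])

def gmm_kernel(x, y):
    """Generalized min-max kernel: sum(min) / sum(max) on transformed vectors."""
    u, v = transform(x), transform(y)
    return np.sum(np.minimum(u, v)) / np.sum(np.maximum(u, v))

print(gmm_kernel([1.0, -2.0], [1.0, -2.0]))  # identical vectors -> 1.0
print(gmm_kernel([1.0, 0.0], [0.0, 1.0]))    # disjoint supports -> 0.0
```

The kernel is tuning-free in this basic form, which is what the abstract's comparison against the RBF kernel refers to.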


GMM-free DNN Training

While deep neural networks (DNNs) have become the dominant acoustic model (AM) for speech recognition systems, they are still dependent on Gaussian mixture models (GMMs) for alignments both for supervised training and for context dependent (CD) tree building. Here we explore bootstrapping DNN AM training without GMM AMs and show that CD trees can be built with DNN alignments which are better ma...


Implementing the Infinite GMM

Rasmussen [2000] describes a hierarchical Bayesian model for a mixture of Gaussians with a possibly infinite number of components. I have implemented his model for univariate data, along with the Adaptive Rejection Sampling method of Gilks and Wild [1992]. In this paper I explain some of the difficulties in implementing Rasmussen’s model and clarify some of the points he leaves vague in his pap...




Volume 2, Issue 5

Pages 1-8

Publication date: 2011-01


Keywords

No Keywords
