Manifold Constrained Finite Gaussian Mixtures

Authors

  • Cédric Archambeau
  • Michel Verleysen
Abstract

In many practical applications, the data is organized along a manifold of lower dimension than that of the embedding space. This additional information can be exploited when learning the model parameters of Gaussian mixtures. Based on a mismatch measure between the Euclidean and the geodesic distance, manifold constrained responsibilities are introduced. Experiments in density estimation show that manifold Gaussian mixtures outperform ordinary Gaussian mixtures.
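The abstract states the idea only at a high level. As a rough illustration of how such a constraint could be wired into an EM procedure, here is a minimal Python sketch; the k-nearest-neighbour graph approximation of the geodesic distance, the exponential penalty on the Euclidean/geodesic mismatch, and all function and parameter names (geodesic_distances, constrained_e_step, lam, n_neighbors) are assumptions made for this example, not the authors' exact formulation.

```python
# Minimal sketch (assumed, not the paper's exact algorithm): one E-step of a
# Gaussian mixture whose responsibilities are penalised when the Euclidean
# distance to a component disagrees with the graph-approximated geodesic one.
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist
from scipy.stats import multivariate_normal


def geodesic_distances(X, n_neighbors=8):
    """Approximate pairwise geodesic distances by shortest paths on a k-NN graph."""
    d_euc = cdist(X, X)
    graph = np.zeros_like(d_euc)
    for i in range(len(X)):
        nn = np.argsort(d_euc[i])[1:n_neighbors + 1]   # skip the point itself
        graph[i, nn] = d_euc[i, nn]
    # Zero entries are treated as missing edges; disconnected pairs come back as inf.
    return shortest_path(graph, method="D", directed=False)


def constrained_e_step(X, means, covs, weights, d_geo, lam=1.0):
    """E-step with responsibilities down-weighted by the Euclidean/geodesic mismatch."""
    n, K = len(X), len(weights)
    resp = np.empty((n, K))
    # Route the geodesic distance to each mean through the data point nearest to it.
    anchors = [np.argmin(np.linalg.norm(X - means[k], axis=1)) for k in range(K)]
    for k in range(K):
        lik = weights[k] * multivariate_normal.pdf(X, means[k], covs[k])
        d_euc = np.linalg.norm(X - means[k], axis=1)
        mismatch = np.maximum(d_geo[:, anchors[k]] - d_euc, 0.0)
        resp[:, k] = lik * np.exp(-lam * mismatch)      # penalise off-manifold shortcuts
    return resp / resp.sum(axis=1, keepdims=True)
```

The M-step would then re-estimate the means, covariances and mixing weights from these re-weighted responsibilities exactly as in a standard Gaussian mixture, so only the E-step differs from ordinary EM.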


Similar articles

Manifold Constrained Variational Mixtures

In many data mining applications, the data manifold is of lower dimension than the dimension of the input space. In this paper, it is proposed to take advantage of this additional information within the framework of variational mixtures. The responsibilities computed in the VBE step are constrained according to a discrepancy measure between the Euclidean and the geodesic distance. The methodology is ap...


Internally Constrained Mixtures of Elastic Continua

A treatment of internally constrained mixtures of elastic continua at a common temperature is developed. Internal constraints involving the deformation gradient tensors and the common mixture temperature are represented by a constraint manifold, and an internally constrained mixture of elastic continua is associated with each unique equivalence class of unconstrained mixtures. The example of in...


On w-mixtures: Finite convex combinations of prescribed component distributions

We consider the space of w-mixtures that are the set of finite statistical mixtures sharing the same prescribed component distributions. The geometry induced by the Kullback-Leibler (KL) divergence on this family of w-mixtures is a dually flat space in information geometry called the mixture family manifold. It follows that the KL divergence between two w-mixtures is equivalent to a Bregman Div...
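The truncated sentence above refers to a standard information-geometric identity; stated in full under the usual conventions (and assuming, as in the snippet, that all mixtures share the same fixed components p_1, ..., p_K and that w ranges over the probability simplex), it reads:

```latex
% KL divergence between two w-mixtures as a Bregman divergence,
% with the negative Shannon entropy as the convex generator.
m(x; w) = \sum_{i=1}^{K} w_i\, p_i(x),
\qquad
\mathrm{KL}\!\left(m(\cdot;w)\,\Vert\, m(\cdot;w')\right) = B_F(w : w'),
\qquad
F(w) = -h\!\left(m(\cdot;w)\right),
\\[4pt]
B_F(w : w') = F(w) - F(w') - \langle w - w',\, \nabla F(w')\rangle .
```

Here h denotes the Shannon (differential) entropy; F generally has no closed form, so in practice the Bregman divergence is evaluated numerically.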


Mixture autoregressive hidden Markov models for speech signals

In this paper a signal modeling technique based upon finite mixture autoregressive probabilistic functions of Markov chains is developed and applied to the problem of speech recognition, particularly speaker-independent recognition of isolated digits. Two types of mixture probability densities are investigated: finite mixtures of Gaussian autoregressive densities (GAM) and nearest-neighbor part...


Classifying with Gaussian Mixtures and Clusters

In this paper, we derive classifiers which are winner-take-all (WTA) approximations to a Bayes classifier with Gaussian mixtures for the class-conditional densities. The derived classifiers include clustering-based algorithms such as LVQ and k-Means. We propose a constrained-rank Gaussian mixture model and derive a WTA algorithm for it. Our experiments with two speech classification tasks indicate th...
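To make the winner-take-all idea concrete, here is a small illustrative sketch (an assumption of how such a classifier could be written, not the derivation from the cited paper): the sum over mixture components in each class-conditional density is replaced by its single best component.

```python
# Illustrative sketch (assumed names and setup): winner-take-all approximation
# of a Bayes classifier whose class-conditional densities are Gaussian mixtures.
import numpy as np


def wta_classify(x, priors, means, log_weights, inv_covs, log_dets):
    """Pick the class maximising prior * max_k w_k N(x; mu_k, Sigma_k)."""
    scores = []
    for c, prior in enumerate(priors):
        diffs = means[c] - x                                    # (K, d)
        mahal = np.einsum("kd,kde,ke->k", diffs, inv_covs[c], diffs)
        # Per-component log-density, dropping the constant d*log(2*pi)/2
        # shared by every class and component.
        log_comp = log_weights[c] - 0.5 * (mahal + log_dets[c])
        # Winner-take-all: max over components instead of log-sum-exp.
        scores.append(np.log(prior) + log_comp.max())
    return int(np.argmax(scores))
```

With equal component weights and a shared spherical covariance, the score reduces to the negative squared distance to the nearest prototype, which is the link to k-Means and LVQ mentioned in the abstract.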



Journal title:

Volume   Issue

Pages  -

Publication date: 2005