Topological mixture estimation

Author

  • Steve Huntsman
Abstract

Density functions that represent sample data are often multimodal, i.e. they exhibit more than one maximum. Typically this behavior is taken to indicate that the underlying data deserves a more detailed representation as a mixture of densities with individually simpler structure. The usual specification of a component density is quite restrictive, with log-concave the most general case considered in the literature, and Gaussian the overwhelmingly typical case. It is also necessary to determine the number of mixture components a priori, and much art is devoted to this. Here, we introduce topological mixture estimation, a completely nonparametric and computationally efficient solution to the one-dimensional problem where mixture components need only be unimodal. We repeatedly perturb the unimodal decomposition of Baryshnikov and Ghrist to produce a topologically and information-theoretically optimal unimodal mixture. We also detail a smoothing process that optimally exploits topological persistence of the unimodal category in a natural way when working directly with sample data. Finally, we illustrate these techniques through examples.
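
The one-dimensional sweep construction that underlies the unimodal decomposition of Baryshnikov and Ghrist lends itself to a short illustration. The sketch below is not the author's implementation: it assumes the density has been discretized to a nonnegative numpy array, and the function name sweep_unimodal_decomposition and its tolerance parameter are inventions for this example. Each pass follows the density up to its first local maximum, descends as slowly as possible via a running minimum, and then repeats on the residual until nothing remains.

import numpy as np

def sweep_unimodal_decomposition(f, tol=1e-12):
    # Greedily peel maximal unimodal pieces off a nonnegative 1-D array f
    # (a discretized density), sweeping from left to right. In one dimension
    # the number of pieces produced this way matches the unimodal category.
    f = np.asarray(f, dtype=float).copy()
    components = []
    while f.max() > tol:
        start = np.nonzero(f > tol)[0][0]   # left edge of the remaining support
        i = start
        while i + 1 < len(f) and f[i + 1] >= f[i]:
            i += 1                          # climb to the first local maximum
        u = np.zeros_like(f)
        u[start:i + 1] = f[start:i + 1]     # agree with f up to the peak
        level = f[i]
        for j in range(i + 1, len(f)):
            level = min(level, f[j])        # descend as slowly as f allows
            u[j] = level
            if level <= tol:
                break
        components.append(u)
        f = f - u                           # nonnegative residual
        f[f < tol] = 0.0
    return components

For example, sweep_unimodal_decomposition([1, 2, 1, 2, 1]) returns two components, [1, 2, 1, 1, 1] and [0, 0, 0, 1, 0], reflecting the fact that this bimodal profile has unimodal category two.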

Related articles

Density Estimation by Mixture Models with Smoothing Priors

In the statistical approach to self-organizing maps (SOMs), learning is regarded as an estimation algorithm for a Gaussian mixture model with a Gaussian smoothing prior on the centroid parameters. The values of the hyperparameters and the topological structure are selected on the basis of a statistical principle. However, since the component selection probabilities are fixed to a common value,...

Using the self-organizing map to speed up the probability density estimation for speech recognition with mixture density HMMs

This paper presents methods to improve the probability density estimation in hidden Markov models for phoneme recognition by exploiting the Self-Organizing Map (SOM) algorithm. The advantage of using the SOM lies in the approximate topology it creates between the mixture densities by training the Gaussian mean vectors, used as the kernel centers, with the SOM algorithm. The topology makes the ne...

Error estimation for nonlinear pseudoparabolic equations with nonlocal boundary conditions in reproducing kernel space

In this paper we discuss nonlinear pseudoparabolic equations with nonlocal boundary conditions and related results. However, an effective error estimation for this method has not yet been discussed. The aim of this paper is to fill this gap.

Speech Enhancement Using Gaussian Mixture Models, Explicit Bayesian Estimation and Wiener Filtering

Gaussian Mixture Models (GMMs) of power spectral densities of speech and noise are used with explicit Bayesian estimations in Wiener filtering of noisy speech. No assumption is made on the nature or stationarity of the noise. No voice activity detection (VAD) or any other means is employed to estimate the input SNR. The GMM mean vectors are used to form sets of over-determined systems of equatio...

Statistical Wavelet-based Image Denoising using Scale Mixture of Normal Distributions with Adaptive Parameter Estimation

Removing noise from images is a challenging problem in digital image processing. This paper presents an image denoising method based on a maximum a posteriori (MAP) density function estimator, which is implemented in the wavelet domain because of its energy compaction property. The performance of the MAP estimator depends on the proposed model for noise-free wavelet coefficients. Thus in the wa...

Journal:
  • CoRR

Volume: abs/1712.04487

Publication year: 2017