Dialect Distance Assessment Based on 2-Dimensional Pitch Slope Features and Kullback-Leibler Divergences

Author

  • Mahnoosh Mehrabani
Abstract

Dialect variations of a language have a severe impact on the performance of speech systems. Therefore, knowing how close or separated dialects are in a given language space provides useful information for predicting, or improving, system performance when there is a mismatch between training and test data. Distance measures have been used in several applications of speech processing, including speech recognition, speech coding, and speech synthesis. However, apart from phonetic measures, little if any work has been done on dialect distance measurement. This study explores pitch movement differences among dialects. A method of assessing dialect separation based on modeling 2-D pitch slope patterns within dialects is proposed, and the Kullback-Leibler divergence is employed to compare the resulting statistical models. The presented scheme is evaluated on a corpus of Arabic dialects. The sensitivity of the proposed measure to changes in the input data is quantified, and a perceptual evaluation shows that the presented objective measure of dialect distance correlates well with subjective distances.
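To make the described pipeline concrete, the following Python sketch implements one plausible reading of it: each voiced pitch segment is reduced to a 2-D slope vector (least-squares slopes of the first and second portions of the contour), each dialect's slope vectors are modeled with a single multivariate Gaussian, and dialects are compared with the symmetrized Kullback-Leibler divergence, which has a closed form for Gaussians. The feature definition and the single-Gaussian model are assumptions for illustration; the paper's exact segmentation and statistical models may differ.

```python
import numpy as np

def pitch_slope_features(f0, split=0.5):
    """Reduce one voiced pitch segment (an array of F0 values) to a 2-D
    slope vector: least-squares slopes of the first and second portions
    of the contour. Hypothetical feature definition for illustration."""
    n = len(f0)
    k = min(max(int(n * split), 2), n - 2)   # need >= 2 points per fit
    t = np.arange(n, dtype=float)
    s1 = np.polyfit(t[:k], f0[:k], 1)[0]     # slope of the first part
    s2 = np.polyfit(t[k:], f0[k:], 1)[0]     # slope of the second part
    return np.array([s1, s2])

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence D(N0 || N1) for multivariate Gaussians."""
    d = len(mu0)
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def dialect_distance(slopes_a, slopes_b):
    """Symmetrized KL divergence between per-dialect Gaussian models of
    the 2-D slope vectors (one vector per row)."""
    mu_a, cov_a = slopes_a.mean(axis=0), np.cov(slopes_a.T)
    mu_b, cov_b = slopes_b.mean(axis=0), np.cov(slopes_b.T)
    return (gaussian_kl(mu_a, cov_a, mu_b, cov_b)
            + gaussian_kl(mu_b, cov_b, mu_a, cov_a))
```

If each dialect were instead modeled with a Gaussian mixture, the KL divergence no longer has a closed form and is typically estimated by Monte Carlo: sample from one model and average the log-likelihood ratio.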


Similar Resources

Using Kullback-Leibler distance for performance evaluation of search designs

This paper considers the search problem introduced by Srivastava [Sr], which is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...


Model Confidence Set Based on Kullback-Leibler Divergence Distance

Consider the problem of estimating the true density h(·) based upon a random sample X1, …, Xn. In general, h(·) is approximated using an appropriate (in some sense; see below) model fθ(x). This article uses Vuong's (1989) test along with a collection of k (> 2) non-nested models to construct a set of appropriate models, a model confidence set, for the unknown model h(·). Application of such confide...
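As a rough illustration of the machinery this abstract describes, the sketch below computes Vuong's (1989) statistic from per-observation log-likelihoods of two fitted non-nested models, then applies a simple elimination rule to form a model confidence set. The selection rule is a hypothetical simplification, since the article's exact construction is truncated above.

```python
import numpy as np
from scipy import stats

def vuong_statistic(loglik_f, loglik_g):
    """Vuong's (1989) non-nested test from per-observation log-likelihoods.
    Under H0 the two fitted models are equally close, in KL divergence,
    to the true density; the statistic is asymptotically N(0, 1)."""
    m = np.asarray(loglik_f) - np.asarray(loglik_g)
    z = np.sqrt(len(m)) * m.mean() / m.std(ddof=1)
    return z, 2 * stats.norm.sf(abs(z))      # statistic, two-sided p-value

def model_confidence_set(per_obs_logliks, alpha=0.05):
    """Keep every candidate model that no other candidate beats at level
    alpha (a hypothetical selection rule, for illustration only)."""
    crit = stats.norm.ppf(1 - alpha)
    keep = []
    for i, li in enumerate(per_obs_logliks):
        beaten = any(vuong_statistic(lj, li)[0] > crit
                     for j, lj in enumerate(per_obs_logliks) if j != i)
        if not beaten:
            keep.append(i)
    return keep
```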


Topological Data Analysis with Bregman Divergences

Given a finite set in a metric space, the topological analysis generalizes hierarchical clustering using a 1-parameter family of homology groups to quantify connectivity in all dimensions. Going beyond Euclidean distance and really beyond metrics, we show that the tools of topological data analysis also apply when we measure distance with Bregman divergences. While these divergences violate two...


Metrics Defined by Bregman Divergences

Bregman divergences are generalizations of the well-known Kullback-Leibler divergence. They are based on convex functions and have recently received great attention. We present a class of “squared root metrics” based on Bregman divergences. They can be regarded as a natural generalization of Euclidean distance. We provide necessary and sufficient conditions for a convex function so that the squar...
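A minimal numerical check of the relationships stated here, under the standard definition D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>: the generator F(x) = ||x||^2 yields the squared Euclidean distance, so its square root is the ordinary Euclidean metric, while the negative-entropy generator recovers the Kullback-Leibler divergence.

```python
import numpy as np

def bregman(F, grad_F, x, y):
    """Bregman divergence D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    return F(x) - F(y) - np.dot(grad_F(y), x - y)

# F(x) = ||x||^2 gives the squared Euclidean distance, so sqrt(D_F)
# recovers the ordinary Euclidean metric.
sq_norm = lambda x: np.dot(x, x)
sq_norm_grad = lambda x: 2 * x
x, y = np.array([1.0, 2.0]), np.array([3.0, 1.0])
assert np.isclose(bregman(sq_norm, sq_norm_grad, x, y), np.sum((x - y) ** 2))

# F(p) = sum p log p (negative entropy) gives the Kullback-Leibler
# divergence between probability vectors.
neg_ent = lambda p: np.sum(p * np.log(p))
neg_ent_grad = lambda p: np.log(p) + 1
p, q = np.array([0.2, 0.8]), np.array([0.5, 0.5])
assert np.isclose(bregman(neg_ent, neg_ent_grad, p, q), np.sum(p * np.log(p / q)))
```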


Kullback-Leibler Boosting

In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting has the following properties. First, classification is based on the sum of histogram divergences along corresponding global and discriminating linear features. Second, these linear features, called KL features, are iteratively learnt by maximizing the projected Kullback-Leibler d...
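The core quantity in this description, the KL divergence between class-conditional histograms of data projected onto a linear feature, can be sketched as below. The greedy selection step is illustrative only; the full KLBoosting algorithm involves more than this single step, and the details are cut off in the truncated abstract above.

```python
import numpy as np

def projected_kl(X_pos, X_neg, w, bins=32):
    """KL divergence between class-conditional histograms of the two
    classes projected onto the linear feature w (the quantity that
    KL features are described as maximizing)."""
    proj_p, proj_n = X_pos @ w, X_neg @ w
    lo = min(proj_p.min(), proj_n.min())
    hi = max(proj_p.max(), proj_n.max())
    hp, _ = np.histogram(proj_p, bins=bins, range=(lo, hi))
    hn, _ = np.histogram(proj_n, bins=bins, range=(lo, hi))
    p = (hp + 1) / (hp.sum() + bins)   # Laplace smoothing keeps KL finite
    q = (hn + 1) / (hn.sum() + bins)
    return float(np.sum(p * np.log(p / q)))

def pick_kl_feature(X_pos, X_neg, candidates):
    """One greedy selection step: return the candidate linear feature
    with the largest projected KL divergence."""
    return max(candidates, key=lambda w: projected_kl(X_pos, X_neg, w))
```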



Journal:

Volume:   Issue:

Pages: -

Publication date: 2009