Search results for: kullback criterion

Number of results: 34836

2002
Febe de Wet, Johan de Veth, Louis Boves

In this paper, the accumulated Kullback divergence (AKD) is used to analyze ASR performance deterioration due to the presence of background noise. The AKD represents a distance between the feature value distribution observed during training and the distribution of the observations in the noisy test condition for each individual feature vector component. In our experiments the AKD summed over al...
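As a hedged sketch of the kind of quantity involved, assuming each feature component is modeled as a univariate Gaussian (the Gaussian model and all function and variable names are illustrative, not the authors' implementation):

```python
import numpy as np

def kl_gaussian(mu_p, var_p, mu_q, var_q):
    """Closed-form KL(p || q) for univariate Gaussians p and q."""
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def accumulated_kl(train_feats, test_feats):
    """Sum of per-component KL divergences between the training-condition and
    noisy-test-condition feature distributions (rows = frames, cols = components)."""
    mu_tr, var_tr = train_feats.mean(axis=0), train_feats.var(axis=0)
    mu_te, var_te = test_feats.mean(axis=0), test_feats.var(axis=0)
    return float(np.sum(kl_gaussian(mu_tr, var_tr, mu_te, var_te)))
```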

Journal: :CoRR 2015
Lan Yang, Jingbin Wang, Yujin Tu, Prarthana Mahapatra, Nelson Cardoso

This paper proposes a new method for vector quantization that minimizes the Kullback-Leibler divergence between the class label distributions over the quantization inputs (the original vectors) and over the output (the quantization subsets of the vector set). In this way, the vector quantization output retains as much class label information as possible. An objective function is...
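The exact objective is truncated above, so the following is only a hedged proxy for the idea, assuming hard cell assignments and one-hot labels (all function and variable names are illustrative):

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Discrete KL divergence KL(p || q), smoothed to avoid log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def cell_label_kl(labels, cells, n_classes):
    """Average KL between each vector's one-hot label distribution and the
    empirical label distribution of its quantization cell; small values mean
    the quantizer preserves the class information of its inputs."""
    total = 0.0
    for c in np.unique(cells):
        members = labels[cells == c]
        p_cell = np.bincount(members, minlength=n_classes) / len(members)
        for y in members:
            onehot = np.zeros(n_classes)
            onehot[y] = 1.0
            total += kl(onehot, p_cell)
    return total / len(labels)

labels = np.array([0, 0, 1, 1])
cells = np.array([0, 0, 0, 1])
print(cell_label_kl(labels, cells, n_classes=2))
```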

Journal: :IEEE Trans. Automat. Contr. 2008
Augusto Ferrante, Michele Pavon, Federico Ramponi

In this paper, we study a matricial version of a generalized moment problem with degree constraint. We introduce a new metric on multivariable spectral densities induced by the family of their spectral factors, which, in the scalar case, reduces to the Hellinger distance. We solve the corresponding constrained optimization problem via duality theory. A highly nontrivial existence theorem for th...
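For reference, a hedged statement of the scalar-case metric mentioned, i.e. the Hellinger distance between spectral densities Φ and Ψ on the unit circle (the normalization convention is an assumption; the paper's multivariable definition via spectral factors is not reproduced here):

\[
d_H(\Phi,\Psi) = \left( \frac{1}{2\pi} \int_{-\pi}^{\pi} \left( \sqrt{\Phi(e^{i\theta})} - \sqrt{\Psi(e^{i\theta})} \right)^{2} d\theta \right)^{1/2}
\]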

Journal: :IEEE Trans. Information Theory 2003
Tryphon T. Georgiou, Anders Lindquist

We introduce a Kullback-Leibler type distance between spectral density functions of stationary stochastic processes and solve the problem of optimal approximation of a given spectral density Ψ by one that is consistent with prescribed second-order statistics. In general, such statistics are expressed as the state covariance of a linear filter driven by a stochastic process whose spectral densit...
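A hedged rendering of a Kullback-Leibler type distance between spectral densities of this kind (notation assumed, not copied from the paper):

\[
\mathbb{D}(\Psi \,\|\, \Phi) = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Psi(e^{i\theta}) \, \log \frac{\Psi(e^{i\theta})}{\Phi(e^{i\theta})} \, d\theta
\]

which reduces to the ordinary Kullback-Leibler divergence when both densities are normalized to unit integral.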

Journal: : 2022

The main aim of this study is to provide a framework for evaluating the performance of the Mashhad Urban Railway Operation Company based on the balanced scorecard and a multi-criteria decision-making technique (fuzzy best-worst method). The study is applied in purpose and quantitative and documentary in method; qualitative data were collected from 15 experts over the period from 1396 to 1398 (Iranian calendar). Nine criteria for the financial perspective, 16 for the customer perspective, 8 for internal processes, and 14 for growth, development, and learning were identified through documents and prior research...

Journal: :Pattern Recognition 2017
Moacir Ponti, Josef Kittler, Mateus Riva, Teófilo Emídio de Campos, Cemre Zor

In decision-making systems involving multiple classifiers, there is a need to assess classifier (in)congruence, that is, to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback-Leibler (KL) divergence. We propose a variant of the KL divergence, named decision cognizant Kullback-Leibler divergence (DC-KL), to reduce the contribution of the...
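The DC-KL variant itself is truncated above, so the sketch below only shows the baseline it modifies: the KL divergence between two classifiers' output distributions (function names are illustrative):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete classifier output distributions,
    smoothed so that empty classes do not produce log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Posteriors of two classifiers over the same three classes:
print(kl_divergence([0.7, 0.2, 0.1], [0.6, 0.3, 0.1]))
```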

2003
Sumiyoshi Abe

The nonadditive (nonextensive) generalization of the quantum Kullback-Leibler divergence, termed the quantum q-divergence, is shown, in an elementary manner, not to increase under projective measurements.
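A hedged statement in the Tsallis form commonly used for the quantum q-divergence (the paper's exact conventions may differ): for density operators ρ and σ,

\[
D_q(\rho \,\|\, \sigma) = \frac{1 - \mathrm{Tr}\!\left(\rho^{q}\,\sigma^{1-q}\right)}{1 - q},
\]

which recovers the quantum Kullback-Leibler (Umegaki) relative entropy \( \mathrm{Tr}\,\rho(\ln\rho - \ln\sigma) \) in the limit \( q \to 1 \).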

Journal: :Journal of radio electronics 2022

Estimating the feature space in the analysis of radar signals (from airplanes, ships, navigation stations, etc.) is an important element of machine learning. From the point of view of queuing theory, a mathematical model of a complex detected signal can be represented as an ordinary flow of events described by a Poisson distribution for the randomly varying parameters of the signal. The paper demonstrates the orthogonality characteristic of sour...
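For reference, the Poisson law invoked here (symbols are generic, not the paper's notation): the probability of observing k events in an interval with mean count λ is

\[
P(k) = \frac{\lambda^{k} e^{-\lambda}}{k!}, \qquad k = 0, 1, 2, \ldots
\]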

Journal: : 2022

The objective is supplier selection and order allocation based on sustainability dimensions and sustainability risk, under uncertainty in some parameters; the model also includes mitigation-strategy constraints for risk management. Risk criteria scores are computed using fuzzy TOPSIS and failure mode and effects analysis; a multi-stage stochastic program is then built around the conditional value-at-risk measure for sourcing in a multi-period planning space, to achieve flexible plann...
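A hedged statement of the conditional value-at-risk measure mentioned, in the standard Rockafellar-Uryasev form (notation assumed; the paper's multi-stage formulation is not reproduced): for a loss X at confidence level α,

\[
\mathrm{CVaR}_{\alpha}(X) = \min_{\eta \in \mathbb{R}} \left\{ \eta + \frac{1}{1-\alpha}\, \mathbb{E}\big[(X - \eta)^{+}\big] \right\},
\]

a form that linearizes conveniently inside stochastic programs.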

2011
Zhirong Yang, He Zhang, Zhijian Yuan, Erkki Oja

The I-divergence, or unnormalized generalization of the Kullback-Leibler (KL) divergence, is commonly used in Nonnegative Matrix Factorization (NMF). This divergence has the drawback that its gradients with respect to the factorizing matrices depend heavily on the scales of the matrices, and learning the scales in gradient-descent optimization may require many iterations. This is often handled by expl...
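A minimal sketch of the divergence in question, assuming the standard NMF setup V ≈ WH (variable and function names are illustrative):

```python
import numpy as np

def i_divergence(V, W, H, eps=1e-12):
    """Unnormalized (generalized) KL divergence D(V || WH) used in NMF:
    the elementwise sum of V * log(V / WH) - V + WH."""
    WH = W @ H + eps
    return float(np.sum(V * np.log((V + eps) / WH) - V + WH))
```

As the abstract notes, gradients of this objective depend on the scales of W and H, which is the drawback the proposed method addresses.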

[Chart: number of search results per year]