Search results for: Kullback criterion
Number of results: 34,836
In this paper, the accumulated Kullback divergence (AKD) is used to analyze ASR performance deterioration due to the presence of background noise. The AKD represents a distance between the feature value distribution observed during training and the distribution of the observations in the noisy test condition for each individual feature vector component. In our experiments the AKD summed over al...
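The accumulated divergence described above can be illustrated with a minimal sketch. Assuming each feature component is modeled as a univariate Gaussian (an assumption made here for illustration; the paper's exact AKD formulation may differ), the Kullback-Leibler divergence between the training and noisy-test statistics of each component can be summed over all components:

```python
import math

def gaussian_kl(mu_p, var_p, mu_q, var_q):
    """KL divergence KL(p || q) between two univariate Gaussians."""
    return 0.5 * (math.log(var_q / var_p)
                  + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def accumulated_kl(train_stats, test_stats):
    """Sum per-component Gaussian KL divergences over all feature dimensions.
    Each stats list holds (mean, variance) pairs, one per component."""
    return sum(gaussian_kl(mp, vp, mq, vq)
               for (mp, vp), (mq, vq) in zip(train_stats, test_stats))

# Identical statistics give zero divergence; a noise-induced shift in the
# test-condition statistics gives a strictly positive accumulated score.
clean = [(0.0, 1.0), (1.0, 2.0)]
noisy = [(0.5, 1.5), (1.0, 2.0)]
print(accumulated_kl(clean, clean))      # 0.0
print(accumulated_kl(clean, noisy) > 0)  # True
```

A larger accumulated score for a given component flags it as the one whose distribution drifted most under noise, which is the diagnostic use the abstract describes.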
This paper proposes a new method for vector quantization by minimizing the Kullback-Leibler divergence between the class-label distributions over the quantization inputs (the original vectors) and the outputs (the quantization subsets of the vector set). In this way, the vector quantization output retains as much class-label information as possible. An objective function is...
In this paper, we study a matricial version of a generalized moment problem with degree constraint. We introduce a new metric on multivariable spectral densities induced by the family of their spectral factors, which, in the scalar case, reduces to the Hellinger distance. We solve the corresponding constrained optimization problem via duality theory. A highly nontrivial existence theorem for th...
We introduce a Kullback-Leibler type distance between spectral density functions of stationary stochastic processes and solve the problem of optimal approximation of a given spectral density Ψ by one that is consistent with prescribed second-order statistics. In general, such statistics are expressed as the state covariance of a linear filter driven by a stochastic process whose spectral densit...
The main aim of this research is to present a framework for evaluating the performance of the Mashhad Urban Railway Operation Company based on the balanced scorecard and a multi-criteria decision-making technique (fuzzy best-worst). The present study is applied in purpose and quantitative-documentary in method; qualitative data were gathered from 15 experts over the period 1396 to 1398 (2017-2019). Nine criteria for the financial perspective, 16 for the customer perspective, 8 for internal processes, and 14 for learning and growth were obtained through documents and prior res...
In decision making systems involving multiple classifiers there is the need to assess classifier (in)congruence, that is to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback-Leibler (KL) divergence. We propose a variant of the KL divergence, named decision cognizant Kullback-Leibler divergence (DC-KL), to reduce the contribution of the...
Nonadditive (nonextensive) generalization of the quantum Kullback-Leibler divergence, termed the quantum q-divergence, is shown not to increase by projective measurements in an elementary manner.
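One common Tsallis-type definition of the quantum q-divergence is D_q(ρ‖σ) = (1 − Tr[ρ^q σ^{1−q}]) / (1 − q), which recovers the von Neumann relative entropy as q → 1. A minimal numerical sketch of that quantity (an illustration of the definition only, not the paper's monotonicity proof) is:

```python
import numpy as np

def mat_power(rho, p):
    """Matrix power of a Hermitian positive matrix via eigendecomposition."""
    w, v = np.linalg.eigh(rho)
    return (v * w**p) @ v.conj().T

def q_divergence(rho, sigma, q):
    """Tsallis-type quantum q-divergence (1 - Tr[rho^q sigma^(1-q)]) / (1 - q)."""
    val = np.trace(mat_power(rho, q) @ mat_power(sigma, 1 - q)).real
    return (1.0 - val) / (1.0 - q)

# Two qubit density matrices (diagonal here for simplicity).
rho = np.diag([0.7, 0.3])
sigma = np.diag([0.5, 0.5])
print(q_divergence(rho, rho, 0.5))    # ~0: vanishes for identical states
print(q_divergence(rho, sigma, 0.5))  # positive for distinct states
```

The paper's result states that this quantity does not increase under projective measurements, i.e. it behaves as a proper distinguishability measure in the nonextensive setting.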
The estimation of the feature space in the analysis of radar signals (airplanes, ships, navigation stations, etc.) is an important element of machine learning. From the point of view of queuing theory, a mathematical model of a complex detected signal can be represented as an ordinary flow of events described by a Poisson distribution with randomly varying signal parameters. The paper demonstrates the orthogonality characteristic of sour...
The aim is supplier selection and order allocation based on the dimensions of sustainability and sustainability risk, under uncertainty in some parameters; the model also incorporates mitigation-strategy constraints for risk management. Scores for the risk criteria were computed using fuzzy TOPSIS and failure mode and effects analysis; a multi-stage stochastic program was then built around the conditional value-at-risk criterion for sourcing in a multi-period planning space, to achieve flexible plann...
The I-divergence, an unnormalized generalization of the Kullback-Leibler (KL) divergence, is commonly used in Nonnegative Matrix Factorization (NMF). This divergence has the drawback that its gradients with respect to the factorizing matrices depend heavily on the scales of the matrices, and learning the scales by gradient-descent optimization may require many iterations. This is often handled by expl...
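The objective the abstract refers to can be sketched with the classic Lee-Seung multiplicative updates for KL/I-divergence NMF (a standard baseline, not this paper's proposed method; the per-iteration normalizers are the column and row sums of W and H, which is where the scale dependence enters):

```python
import numpy as np

def nmf_kl(V, rank, iters=500, eps=1e-9, seed=0):
    """NMF under the (unnormalized) KL / I-divergence objective,
    using the classic Lee-Seung multiplicative update rules."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(iters):
        R = V / (W @ H + eps)                                 # ratio V / (WH)
        H *= (W.T @ R) / (W.sum(axis=0)[:, None] + eps)       # update H
        R = V / (W @ H + eps)
        W *= (R @ H.T) / (H.sum(axis=1)[None, :] + eps)       # update W
    return W, H

# A small nonnegative matrix of exact rank 2 (row 2 = 2 * row 1)
# is reconstructed closely by a rank-2 factorization.
V = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 1.0]])
W, H = nmf_kl(V, rank=2)
print(np.abs(V - W @ H).max())  # small reconstruction error
```

The multiplicative form keeps W and H nonnegative throughout, but, as the abstract notes, its behavior is sensitive to the overall scale of the factors.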