Search results for: kullback leibler
Number of results: 7228
In this article, a method based on a non-parametric estimation of the Kullback–Leibler divergence using a local feature space is proposed for synthetic aperture radar (SAR) image change detection. First, local features based on a set of Gabor filters are extracted from both pre- and post-event images. The distribution of these local features from a local neighbourhood is considered as a statistic...
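A minimal sketch of the kind of comparison this abstract describes, assuming the local feature responses in a window are summarized as histograms and compared with a symmetrized Kullback-Leibler divergence; the window handling, binning, and the absence of an explicit Gabor filter bank are simplifications, not the paper's actual pipeline.

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # Discrete KL divergence D(p || q) between two (unnormalized) histograms.
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)))

def local_change_score(pre_feats, post_feats, bins=32):
    # Compare the distributions of local feature responses in corresponding
    # pre- and post-event windows; a large score suggests a change.
    lo = min(pre_feats.min(), post_feats.min())
    hi = max(pre_feats.max(), post_feats.max())
    p, _ = np.histogram(pre_feats, bins=bins, range=(lo, hi))
    q, _ = np.histogram(post_feats, bins=bins, range=(lo, hi))
    # Symmetrize so the score does not depend on which image comes first.
    return kl_divergence(p, q) + kl_divergence(q, p)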
We show that the Kullback-Leibler distance is a good measure of the statistical uncertainty of correlation matrices estimated by using a finite set of data. For correlation matrices of multivariate Gaussian variables, we analytically determine the expected values of the Kullback-Leibler distance of a sample correlation matrix from a reference model, and we show that the expected values are known ...
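For reference, the Kullback-Leibler distance between two zero-mean multivariate Gaussians with covariance (or correlation) matrices C1 and C2 has a closed form, which is presumably the quantity being averaged over sample matrices here; the sketch below only evaluates that formula and does not reproduce the paper's analytical expectations. The identity reference model in the example is an assumption for illustration.

import numpy as np

def kl_gaussian(C1, C2):
    # D_KL( N(0, C1) || N(0, C2) ) for positive-definite n x n matrices:
    # 0.5 * ( tr(C2^{-1} C1) - n + ln det(C2) - ln det(C1) )
    n = C1.shape[0]
    _, logdet1 = np.linalg.slogdet(C1)
    _, logdet2 = np.linalg.slogdet(C2)
    return 0.5 * (np.trace(np.linalg.solve(C2, C1)) - n + logdet2 - logdet1)

# Example: sample correlation matrix of T observations vs. an identity reference.
T, n = 500, 10
rng = np.random.default_rng(0)
X = rng.standard_normal((T, n))
C_sample = np.corrcoef(X, rowvar=False)
print(kl_gaussian(C_sample, np.eye(n)))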
We propose a method to measure the irreversibility of real-valued time series which combines two different tools: the horizontal visibility algorithm and the Kullback-Leibler divergence. This method maps a time series to a directed network according to a geometric criterion. The degree of irreversibility of the series is then estimated by the Kullback-Leibler divergence (i.e. the distinguishability) b...
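A rough sketch of the pipeline as described: build the directed horizontal visibility graph of the series, then take the Kullback-Leibler divergence between the out-degree (forward-in-time) and in-degree (backward-in-time) distributions as the irreversibility score. The quadratic-time construction and the choice to compare at the degree-distribution level are simplifying assumptions, not the authors' exact implementation.

import numpy as np

def hvg_degrees(x):
    # Forward (out) and backward (in) degrees of the directed horizontal
    # visibility graph: i sees j > i if every value strictly between them
    # is lower than min(x[i], x[j]).
    n = len(x)
    out_deg = np.zeros(n, dtype=int)
    in_deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                out_deg[i] += 1
                in_deg[j] += 1
            if x[j] >= x[i]:
                # The horizon from i is blocked; no later node is visible.
                break
    return out_deg, in_deg

def irreversibility(x, eps=1e-12):
    out_deg, in_deg = hvg_degrees(np.asarray(x, dtype=float))
    kmax = max(out_deg.max(), in_deg.max()) + 1
    p_out = np.bincount(out_deg, minlength=kmax) + eps
    p_in = np.bincount(in_deg, minlength=kmax) + eps
    p_out, p_in = p_out / p_out.sum(), p_in / p_in.sum()
    # KL divergence between forward and backward degree distributions.
    return float(np.sum(p_out * np.log(p_out / p_in)))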
Estimating the Kullback-Leibler information is a crucial part of deriving a statistical model selection procedure based on the likelihood principle, such as AIC. To discriminate nested models, we have to estimate it up to a constant-order term, while the Kullback-Leibler information itself is of the order of the number of observations. A correction term employed in AIC is an example to...
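For context, the correction term alluded to is the one familiar from AIC, where the penalty 2k corrects the maximized-log-likelihood plug-in estimate of the expected Kullback-Leibler information; the small-sample variant AICc is included for comparison. The function names below are illustrative, not from the paper.

def aic(log_likelihood, k):
    # Akaike information criterion for a model with k estimated parameters.
    return -2.0 * log_likelihood + 2.0 * k

def aicc(log_likelihood, k, n):
    # Small-sample corrected AIC for sample size n (requires n > k + 1).
    return aic(log_likelihood, k) + 2.0 * k * (k + 1) / (n - k - 1)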
We provide background information to allow a heuristic understanding of two types of criteria used in selecting a model for making inferences from ringing data. The first type of criteria (e.g., AIC, AICc, QAIC and TIC) are estimates of (relative) Kullback-Leibler information or distance and attempt to select a good approximating model for inference, based on the Principle of Parsimony. The seco...
The Wasserstein probability metric has received much attention from the machine learning community. Unlike the Kullback-Leibler divergence, which strictly measures change in probability, the Wasserstein metric reflects the underlying geometry between outcomes. The value of being sensitive to this geometry has been demonstrated in, among other settings, ordinal regression and generative modelling. In th...
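A small illustration, on made-up one-hot histograms over a 1-D grid, of the contrast drawn here: the Kullback-Leibler divergence gives the same (essentially infinite) value whether the mass has moved to a neighbouring bin or to the far end of the grid, while the Wasserstein-1 distance grows with the ground distance.

import numpy as np

def kl(p, q, eps=1e-12):
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def wasserstein1(p, q, grid):
    # W1 between histograms p, q supported on an evenly spaced 1-D grid,
    # computed from the absolute difference of the cumulative distributions.
    p, q = p / p.sum(), q / q.sum()
    cdf_gap = np.abs(np.cumsum(p) - np.cumsum(q))
    return float(np.sum(cdf_gap) * (grid[1] - grid[0]))

grid = np.linspace(0.0, 9.0, 10)
p = np.eye(10)[0]          # all mass at position 0
q_near = np.eye(10)[1]     # all mass at position 1
q_far = np.eye(10)[9]      # all mass at position 9
print(kl(p, q_near), kl(p, q_far))                           # same large value
print(wasserstein1(p, q_near, grid), wasserstein1(p, q_far, grid))  # 1.0 vs 9.0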
One of the key tasks in time series data mining is to cluster time series. However, traditional clustering methods focus on the similarity of time series patterns in past time periods. In many cases, such as retail sales, we would prefer to cluster based on future forecast values. In this paper, we show an approach to cluster forecasts or forecast time series patterns based on the Kullback-L...
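A hedged sketch of forecast-based clustering, under the assumption that each series' forecast is summarized as a one-dimensional Gaussian predictive density; a symmetrized Kullback-Leibler divergence between those densities feeds a standard hierarchical clustering, which may differ from the paper's actual procedure.

import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

def kl_gauss_1d(m1, s1, m2, s2):
    # D_KL( N(m1, s1^2) || N(m2, s2^2) )
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def cluster_forecasts(means, stds, n_clusters=2):
    # Pairwise symmetrized KL distances between the Gaussian forecast densities.
    n = len(means)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d[i, j] = d[j, i] = (kl_gauss_1d(means[i], stds[i], means[j], stds[j])
                                 + kl_gauss_1d(means[j], stds[j], means[i], stds[i]))
    z = linkage(squareform(d), method="average")
    return fcluster(z, t=n_clusters, criterion="maxclust")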
BACKGROUND To select a proper diagnostic test, it is recommended that the most specific test be used to confirm (rule in) a diagnosis, and the most sensitive test be used to establish that a disease is unlikely (rule out). These rule-in and rule-out concepts can also be characterized by the likelihood ratio (LR). However, previous papers discussed only the case of binary tests and assumed test ...
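For the binary-test case mentioned here, the likelihood-ratio bookkeeping can be written out directly: LR+ quantifies rule-in strength, LR- quantifies rule-out strength, and post-test odds are pre-test odds multiplied by the relevant LR. The helper names below are illustrative.

def likelihood_ratios(sensitivity, specificity):
    lr_pos = sensitivity / (1.0 - specificity)   # for a positive test result
    lr_neg = (1.0 - sensitivity) / specificity   # for a negative test result
    return lr_pos, lr_neg

def post_test_probability(pre_test_prob, lr):
    # Convert probability to odds, apply the likelihood ratio, convert back.
    odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)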
Content-based video retrieval systems have shown great potential in supporting decision making in clinical activities, teaching, and biological research. In content-based video retrieval, feature combination plays a key role. As a result, content-based retrieval across different types of video data turns out to be a challenging and actively studied problem. This paper presents an effective content-based vid...