Local Parametric Modeling via U-Divergence
Abstract
This paper discusses local parametric modeling using U-divergence in statistical pattern recognition. The class of U-divergence measures admits empirical loss functions of a simple common form and includes the Kullback-Leibler divergence, the power divergence, and the mean squared error. We propose a minimization algorithm for parametric models of sequentially increasing dimension by incorporating kernel localization into the loss. This is a boosting algorithm with spatial information. The objective of this paper is to accommodate local and global fitting simultaneously for statistical pattern recognition, and the approach extends to non-parametric estimation of density and regression functions.
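To illustrate the idea of kernel-localized divergence minimization, the following is a minimal sketch, not the paper's actual boosting algorithm: it fits a Gaussian working model by kernel-weighted maximum likelihood, which amounts to minimizing a localized KL divergence (one member of the U-divergence class). The kernel, bandwidth, and Gaussian model are illustrative assumptions; the sketch also omits the normalization correction used in full local-likelihood density estimation. A small bandwidth gives a local fit, a large bandwidth recovers the global parametric fit.

```python
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel, used both for localization weights
    and for evaluating the fitted Gaussian model."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def local_gaussian_density(x0, data, bandwidth):
    """Estimate the density at x0 by fitting a Gaussian model to
    kernel-weighted data, i.e. minimizing a kernel-localized KL
    divergence (a member of the U-divergence family)."""
    # Localization weights: points near x0 dominate the loss.
    w = gaussian_kernel((data - x0) / bandwidth)
    w /= w.sum()
    # Weighted maximum likelihood = minimizer of the kernel-weighted
    # empirical KL loss for the Gaussian model.
    mu = np.sum(w * data)
    var = np.sum(w * (data - mu) ** 2)
    # Evaluate the locally fitted Gaussian at x0.
    return gaussian_kernel((x0 - mu) / np.sqrt(var)) / np.sqrt(var)

rng = np.random.default_rng(0)
sample = rng.standard_normal(2000)

# Small bandwidth: local fit; large bandwidth: close to the global MLE.
est_local = local_gaussian_density(0.0, sample, bandwidth=0.5)
est_global = local_gaussian_density(0.0, sample, bandwidth=10.0)
```

With a very large bandwidth the weights become nearly uniform, so `est_global` approaches the density of the globally fitted N(mu, var) at 0, which for standard normal data is close to 0.4.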
Similar Articles
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper, we will verify measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -Divergence, … and so on. Properties and results related to distance between probability d...
Estimation of Kullback–Leibler divergence by local likelihood
Motivated from the bandwidth selection problem in local likelihood density estimation and from the problem of assessing a final model chosen by a certain model selection procedure, we consider estimation of the Kullback–Leibler divergence. It is known that the best bandwidth choice for the local likelihood density estimator depends on the distance between the true density and the ‘vehicle’ para...
Modeling and thermal parametric analysis of U-pipe evacuated tube solar collector for four different climates in Iran
In this study, thermal performance of the collector with analytic and quasi-dynamic method is evaluated based on energy balance equations for each part of the U-pipe evacuated tube solar collector. Using this approach, effect of different parameters such as tube size, overall heat loss coefficient, absorber tube absorptivity, mass flow rate and air layer thermal resistance on thermal performanc...
Feature-based non-parametric estimation of Kullback–Leibler divergence for SAR image change detection
In this article, a method based on a non-parametric estimation of the Kullback–Leibler divergence using a local feature space is proposed for synthetic aperture radar (SAR) image change detection. First, local features based on a set of Gabor filters are extracted from both pre- and post-event images. The distribution of these local features from a local neighbourhood is considered as a statistic...
On Unified Generalizations of Relative Jensen–Shannon and Arithmetic–Geometric Divergence Measures, and Their Properties (Pranesh Kumar and Inder Jeet Taneja)
Abstract. In this paper we shall consider one parametric generalization of some nonsymmetric divergence measures. The non-symmetric divergence measures are such as: Kullback-Leibler relative information, χ2−divergence, relative J – divergence, relative Jensen – Shannon divergence and relative Arithmetic – Geometric divergence. All the generalizations considered can be written as particular case...