Class Probability Estimation via Differential Geometric Regularization

Authors

  • Qinxun Bai
  • Steven Rosenberg
  • Zheng Wu
  • Stan Sclaroff
Abstract

We use differential geometry techniques to estimate the class probability P(y = l|x) for learning both binary and multiclass plug-in classifiers. We propose a geometric regularization technique to find the optimal submanifold corresponding to the estimator of P(y = l|x). The regularization term measures the volume of this submanifold, based on the intuition that overfitting produces fast oscillations and hence large volume of the estimator. We use gradient flow methods to move from an initial estimator towards a minimizer of a penalty function that penalizes both the deviation of the submanifold from the training data and large volume. We establish Bayes consistency for our algorithm under mild initialization assumptions. In experiments for both binary and multiclass classification, our implementation compares favorably to several widely used classification methods.
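The gradient-flow idea described above can be sketched in a toy one-dimensional form: represent the estimator f(x) ≈ P(y = 1|x) by its values on a grid and descend on a penalty combining data fit with the volume (arc length) of the graph of f, which penalizes fast oscillation. This is only an illustrative sketch, not the authors' implementation; the grid representation, step size, and the weight `lam` are assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary training data: y is more likely 1 for larger x.
n = 200
x_train = rng.uniform(0.0, 1.0, n)
y_train = (rng.uniform(size=n) < x_train).astype(float)

# Grid representation of the estimator f(x) ~ P(y = 1 | x).
m = 101
grid = np.linspace(0.0, 1.0, m)
h = grid[1] - grid[0]
f = np.full(m, 0.5)                       # flat initialization
idx = np.clip(np.searchsorted(grid, x_train), 0, m - 1)

lam, lr = 1e-3, 0.2
for _ in range(2000):
    # Data-fit term: sum_i (f(x_i) - y_i)^2, gradient accumulated
    # at the nearest grid node of each training point.
    g_fit = np.zeros(m)
    np.add.at(g_fit, idx, f[idx] - y_train)
    g_fit *= 2.0 / n

    # Volume term: sum_j sqrt(1 + f'(x_j)^2) * h (arc length of the
    # graph of f), differentiated via finite differences.
    df = np.diff(f) / h
    w = df / np.sqrt(1.0 + df**2)         # d/d(df) of sqrt(1 + df^2)
    g_vol = np.zeros(m)
    g_vol[:-1] -= w
    g_vol[1:] += w

    # Discrete gradient-flow step on the combined penalty.
    f -= lr * (g_fit + lam * g_vol)
    f = np.clip(f, 0.0, 1.0)              # keep values valid probabilities

# The estimator should roughly increase with x on this data.
print(round(float(f[10]), 2), round(float(f[90]), 2))
```

On this synthetic data the estimator ends up low near x = 0 and high near x = 1, with the volume term smoothing out node-to-node noise; the plug-in classifier would then threshold f at 1/2.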


Similar articles

Differential Geometric Regularization for Supervised Learning of Classifiers

We study the problem of supervised learning for both binary and multiclass classification from a unified geometric perspective. In particular, we propose a geometric regularization technique to find the submanifold corresponding to an estimator of the class probability P(y|x). The regularization term measures the volume of this submanifold, based on the intuition that overfitting produces rapi...


Spectral Regularization for Support Estimation

In this paper we consider the problem of learning from data the support of a probability distribution when the distribution does not have a density (with respect to some reference measure). We propose a new class of regularized spectral estimators based on a new notion of reproducing kernel Hilbert space, which we call “completely regular”. Completely regular kernels allow to capture the relevant...


Image alignment via kernelized feature learning

Machine learning is an application of artificial intelligence that is able to automatically learn and improve from experience without being explicitly programmed. The primary assumption for most machine learning algorithms is that the training set (source domain) and the test set (target domain) follow the same probability distribution. However, in most of the real-world application...


Regression on Manifolds: Estimation of the Exterior Derivative

Collinearity and near-collinearity of predictors cause difficulties when doing regression. In these cases, variable selection becomes untenable because of mathematical issues concerning the existence and numerical stability of the regression coefficients, and interpretation of the coefficients is ambiguous because gradients are not defined. Using a differential geometric interpretation, in whic...



Journal:

Volume   Issue

Pages  -

Publication date: 2015