A New Meta-Criterion for Regularized Subspace Information Criterion

Authors

  • Yasushi Hidaka
  • Masashi Sugiyama
Abstract

In order to obtain better generalization performance in supervised learning, model parameters should be determined appropriately, i.e., they should be determined so that the generalization error is minimized. However, since the generalization error is inaccessible in practice, the model parameters are usually determined so that an estimator of the generalization error is minimized. The regularized subspace information criterion (RSIC) is such a generalization error estimator for model selection. RSIC includes an additional regularization parameter and it should be determined appropriately for better model selection. A meta-criterion for determining the regularization parameter has also been proposed and shown to be useful in practice. In this paper, we show that there are several drawbacks in the existing meta-criterion and give an alternative meta-criterion that can solve the problems. Through simulations, we show that the use of the new meta-criterion further improves the model selection performance.
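To make the idea concrete, here is a minimal illustrative sketch (not the authors' RSIC or their meta-criterion): since the true generalization error is inaccessible, a model parameter — here a ridge regularization parameter on toy data — is chosen by minimizing an *estimate* of the generalization error. A simple hold-out estimate stands in for the criterion; all data, feature choices, and the candidate grid are assumptions for illustration only.

```python
import numpy as np

# Toy regression data: y = sin(x) + noise (illustrative assumption).
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=100)
X = np.vander(x, N=8, increasing=True)      # polynomial features
y = np.sin(x) + 0.3 * rng.standard_normal(100)

X_tr, y_tr = X[:70], y[:70]                 # training split
X_va, y_va = X[70:], y[70:]                 # hold-out split


def ridge_fit(X, y, lam):
    """Ridge solution w = (X^T X + lam * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)


# A finite set of model candidates: here, ridge parameters.
lambdas = [10.0 ** k for k in range(-6, 3)]

# For each candidate, estimate the generalization error on held-out
# data (a stand-in for a criterion such as RSIC), then pick the
# candidate whose estimated error is smallest.
est_errors = []
for lam in lambdas:
    w = ridge_fit(X_tr, y_tr, lam)
    est_errors.append(np.mean((X_va @ w - y_va) ** 2))

best_lambda = lambdas[int(np.argmin(est_errors))]
print("selected lambda:", best_lambda)
```

The same selection loop applies whatever estimator is plugged in; the paper's point is that the quality of the estimator (and of any regularization parameter inside it) directly determines the quality of the selected model.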

Similar resources

Analytic Optimization of Adaptive Ridge Parameters Based on Regularized Subspace Information Criterion

In order to obtain better learning results in supervised learning, it is important to choose model parameters appropriately. Model selection is usually carried out by preparing a finite set of model candidates, estimating a generalization error for each candidate, and choosing the best one from the candidates. If the number of candidates is increased in this procedure, the optimization quality ...

Full text

A new information criterion for the selection of subspace models

The problem of model selection is considerably important for acquiring higher levels of generalization capability in supervised learning. In this paper, we propose a new criterion for model selection named the subspace information criterion (SIC). Computer simulations show that SIC works well even when the number of training examples is small.

Full text

About Subspace-Frequently Hypercyclic Operators

In this paper, we introduce subspace-frequently hypercyclic operators. We show that these operators are subspace-hypercyclic and that there are subspace-hypercyclic operators that are not subspace-frequently hypercyclic. There is a criterion, similar to the subspace-hypercyclicity criterion, that implies subspace-frequent hypercyclicity, and if an operator $T$ satisfies this criterion, then $T\oplus T$ is sub...

Full text

Functional Analytic Approach to Model Selection — Subspace Information Criterion

The problem of model selection is considerably important for acquiring higher levels of generalization capability in supervised learning. In this paper, we propose a new criterion for model selection called the subspace information criterion (SIC). Computer simulations show that SIC works well even when the number of training examples is small.

Full text

A Fast Signal Subspace Tracking Algorithm Based on a Subspace Information Criterion

In this paper, we present a new algorithm for tracking the signal subspace recursively. It is based on a new interpretation of the signal subspace. We introduce a novel information criterion for signal subspace estimation, and we show that the solution of the proposed constrained optimization problem yields the signal subspace. In addition, we introduce three adaptive algorithms which can be used ...

Full text

Journal:
  • IEICE Transactions

Volume 90-D  Issue 

Pages  -

Publication date: 2007