Search results for: selection criterion

Number of results: 386454

2000
Loo-Nin Teow Kia-Fock Loe

The selection of kernel parameters is an open problem in the training of nonlinear support vector machines. The usual selection criterion is the quotient of the radius of the smallest sphere enclosing the training features and the margin width. Empirical studies on real-world data using Gaussian and polynomial kernels show that the test error due to this criterion is often much larger than the ...
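The radius-margin quotient mentioned in this abstract can be sketched in a few lines. The helper below is a hypothetical illustration, not the paper's implementation: it assumes an already-trained SVM whose signed dual coefficients `alpha` are supplied, and it approximates the centre of the enclosing sphere by the feature-space mean rather than solving for the true smallest enclosing sphere.

```python
import math

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) kernel between two points
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def radius_margin_quotient(X, y, alpha, gamma=1.0):
    """Radius-margin criterion R^2 * ||w||^2 for a trained SVM (sketch).

    `alpha` are assumed to be the dual coefficients of an already-trained
    SVM; the enclosing sphere's centre is approximated by the mean of the
    mapped training features, so R^2 here is an approximation.
    """
    n = len(X)
    K = [[rbf(X[i], X[j], gamma) for j in range(n)] for i in range(n)]
    # ||w||^2 = sum_ij alpha_i alpha_j y_i y_j K_ij; margin width = 2/||w||
    w2 = sum(alpha[i] * alpha[j] * y[i] * y[j] * K[i][j]
             for i in range(n) for j in range(n))
    # R^2 approximated as the largest squared feature-space distance
    # from a training point to the feature-space mean
    mean_term = sum(K[j][k] for j in range(n) for k in range(n)) / n**2
    r2 = max(K[i][i] - 2.0 * sum(K[i][j] for j in range(n)) / n + mean_term
             for i in range(n))
    return r2 * w2
```

A smaller quotient corresponds to a tighter radius-margin bound, which is the quantity the criterion minimises over kernel parameters.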

Journal: Pattern Recognition 1996
André F. Kohn Luis G. M. Nakano Miguel Oliveira e Silva

This paper presents a new class discriminability measure based on an adaptive partitioning of the feature space according to the available class samples. It is intended to be used as a criterion in a classifier-independent feature selection procedure. The partitioning is performed according to a binary splitting rule and appropriate stopping criteria. Results from several tests with Gaussian a...

2006
Zhang Zhang Jun Li Xiao-Qian Zhao Jun Wang Gane Ka-Shu Wong Jun Yu

KaKs_Calculator is a software package that calculates nonsynonymous (Ka) and synonymous (Ks) substitution rates through model selection and model averaging. Since existing methods for this estimation adopt their specific mutation (substitution) models that consider different evolutionary features, leading to diverse estimates, KaKs_Calculator implements a set of candidate models in a maximum li...

2003
Mário A. T. Figueiredo Anil K. Jain Martin H. C. Law

We propose a feature selection approach for clustering which extends Koller and Sahami's mutual-information-based criterion to the unsupervised case. This is achieved with the help of a mixture-based model and the corresponding expectation-maximization algorithm. The result is a backward search scheme, able to sort the features by order of relevance. Finally, an MDL criterion is used to prune t...

2000
Masashi Sugiyama Hidemitsu Ogawa

Model selection is an important problem for obtaining high generalization capability in supervised learning. As a model selection criterion, the Akaike information criterion (AIC) has been widely used so far. However, since AIC relies on an asymptotic approximation with respect to the training data, it cannot select a model appropriately when the number of training samples is small. In this paper, we therefore propose a new model selection criterion called the subspace information criterion (SIC), ...

Journal: CoRR 2017
Mahdi Zarei

In this paper we introduce a new feature selection algorithm to remove irrelevant or redundant features from data sets. In this algorithm, the importance of a feature is based on its fit to the Catastrophe model. The Akaike information criterion value is used for ranking the features in the data set. The proposed algorithm is compared with the well-known RELIEF feature selection algorithm. Bre...

2000
PRASAD A. NAIK CHIH-LING TSAI

We derive a new model selection criterion for single-index models, AICC, by minimizing the expected Kullback-Leibler distance between the true and candidate models. The proposed criterion selects not only the relevant variables but also the smoothing parameter for an unknown link function. Thus, it is a general selection criterion that provides a unified approach to model selection across both para...
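The corrected criterion can be illustrated with the generic small-sample AICc formula for Gaussian-error models; this is a hedged sketch, not the single-index-specific form derived in the paper.

```python
import math

def aicc(rss, n, k):
    """Small-sample corrected AIC (generic Gaussian-error sketch).

    rss : residual sum of squares of the fitted model
    n   : number of observations
    k   : number of estimated parameters
    """
    # -2 * maximised Gaussian log-likelihood, up to an additive constant
    neg2_loglik = n * math.log(rss / n)
    # AIC penalty (2k) plus the finite-sample correction term
    return neg2_loglik + 2 * k + 2 * k * (k + 1) / (n - k - 1)
```

As n grows the correction term vanishes, so AICc approaches ordinary AIC; the correction matters when n is small relative to k.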

2000
Rudy Moddemeijer

A correctly derived autoregressive (AR) model cannot always optimize the intended approximation. An optimal model should balance bias, caused by under-fitting, against additional variance, caused by over-fitting. The selection of this optimal AR model is a combination of AR-order estimation and the reduction of the number of coefficients by pruning. We leave the classical approach of AR-order esti...
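The classical AR-order estimation that this abstract takes as its starting point can be sketched as follows. This is a minimal, hypothetical illustration: it runs the Levinson-Durbin recursion on the biased sample autocorrelation to get the residual variance at each order, then scores orders with AIC; it does not implement the paper's pruning step.

```python
import math

def ar_order_by_aic(x, max_order):
    """Pick an AR order by AIC via Levinson-Durbin (classical sketch)."""
    n = len(x)
    mean = sum(x) / n
    xc = [v - mean for v in x]
    # Biased sample autocorrelation r[0..max_order]
    r = [sum(xc[t] * xc[t + lag] for t in range(n - lag)) / n
         for lag in range(max_order + 1)]
    err = r[0]                              # prediction error at order 0
    a = []                                  # AR coefficients so far
    best_order, best_aic = 0, n * math.log(err)
    for m in range(1, max_order + 1):
        # Reflection coefficient for order m
        k = (r[m] - sum(a[j] * r[m - 1 - j] for j in range(m - 1))) / err
        # Levinson-Durbin coefficient update
        a = [a[j] - k * a[m - 2 - j] for j in range(m - 1)] + [k]
        err *= (1.0 - k * k)                # updated residual variance
        aic = n * math.log(err) + 2 * m     # fit term plus order penalty
        if aic < best_aic:
            best_order, best_aic = m, aic
    return best_order
```

For a strongly autocorrelated series the drop in the fit term at the true order dwarfs the 2m penalty, which is why AIC-style order estimation works well when n is large.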

2008
Junfeng Shang

In the mixed modeling framework, Monte Carlo simulation and cross validation are employed to develop an “improved” Akaike information criterion, AICi, and the predictive divergence criterion, PDC, respectively, for model selection. The selection and estimation performance of the criteria are investigated in a simulation study. Our simulation results demonstrate that PDC outperforms AIC and A...

2012
William D. Penny

In neuroimaging it is now becoming standard practice to fit multiple models to data and compare them using a model selection criterion. This is especially prevalent in the analysis of brain connectivity. This paper describes a simulation study which compares the relative merits of three model selection criteria (i) Akaike's Information Criterion (AIC), (ii) the Bayesian Information Criterion (B...
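The first two criteria being compared trade off fit against complexity with different penalties; a minimal sketch of the generic formulas (not the paper's neuroimaging-specific setup):

```python
import math

def aic(loglik, k):
    # Akaike's Information Criterion: flat penalty of 2 per parameter
    return -2.0 * loglik + 2 * k

def bic(loglik, k, n):
    # Bayesian Information Criterion: penalty grows with sample size n
    return -2.0 * loglik + k * math.log(n)
```

For n > e^2 (about 7.4 observations) the BIC penalty per parameter exceeds AIC's, so BIC tends to favour smaller models on large datasets, which is the main source of the disagreements such comparison studies document.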

[Chart: number of search results per year]