Search results for: rbf kernel function

Number of results: 1254130

2008
Meng-Dar Shieh

In this paper, Support Vector Regression (SVR) training models are constructed using three different kernels (polynomial, Radial Basis Function (RBF), and mixed kernels) to demonstrate the training performance on unarranged data obtained from 32 virtual 3-D computer models. The 32 samples used as input data for training the three SVR models are represented by the coordinate value sets of poin...
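
A minimal sketch of the setup described above, assuming scikit-learn's SVR and synthetic data in place of the 32 virtual 3-D model samples; the mixing weight, gamma, and degree of the combined kernel are illustrative assumptions, not values from the paper.

```python
# Sketch: SVR with polynomial, RBF, and a mixed (weighted-sum) kernel.
# Synthetic data stands in for the paper's 32 virtual 3-D model samples.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 5))                     # 32 samples, 5 features (illustrative)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=32)

def mixed_kernel(A, B, w=0.5, degree=3, gamma=0.5):
    # Convex combination of polynomial and RBF kernels; the weight w is assumed.
    return w * polynomial_kernel(A, B, degree=degree, gamma=gamma) \
        + (1.0 - w) * rbf_kernel(A, B, gamma=gamma)

models = {
    "polynomial": SVR(kernel="poly", degree=3, gamma=0.5),
    "rbf": SVR(kernel="rbf", gamma=0.5),
    "mixed": SVR(kernel=mixed_kernel),
}
for name, model in models.items():
    model.fit(X, y)
    print(name, "training R^2:", round(model.score(X, y), 3))
```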

2012
Lluís A. Belanche Muñoz Jerónimo Hernández

A two-layer neural network is developed in which the neuron model computes a user-defined similarity function between inputs and weights. The neuron model is formed by the composition of an adapted logistic function with the mean of the partial input-weight similarities. The model is capable of dealing directly with variables of potentially different nature (continuous, ordinal, categorical); t...
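
A rough numpy sketch of the neuron model as described: partial input-weight similarities are averaged and passed through a logistic function. The Gaussian similarity and its width are placeholder assumptions for continuous variables; the paper's handling of ordinal and categorical types is not shown.

```python
# Sketch of a similarity-based neuron: logistic(mean of partial input-weight similarities).
# A Gaussian similarity for continuous inputs is an assumed placeholder.
import numpy as np

def logistic(z, k=4.0):
    # Adapted logistic squashing; the steepness k is an illustrative choice.
    return 1.0 / (1.0 + np.exp(-k * (z - 0.5)))

def similarity_neuron(x, w, widths):
    # Partial similarity per variable, here Gaussian in the (x_i - w_i) difference.
    partial = np.exp(-((x - w) / widths) ** 2)
    return logistic(partial.mean())

x = np.array([0.2, 1.5, -0.3])
w = np.array([0.1, 1.0, 0.0])    # the neuron weights play the role of a prototype
print(similarity_neuron(x, w, widths=np.ones(3)))
```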

2007
Edwirde Luiz Silva

This paper is intended to be a simple example illustrating some of the capabilities of radial basis functions pruned with QLP decomposition. We examine the applicability of the radial basis function (RBF) type of artificial neural network (ANN) approach for re-estimating the Box, Triangle, Epanechnikov and Normal densities. We propose an application of the QLP decomposition model to reduce to the ...
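
A simplified sketch of pruning an over-complete RBF design matrix with a rank-revealing factorization; SciPy's column-pivoted QR is used here as a stand-in for the full QLP decomposition, and the Gaussian basis, centers, and tolerance are all assumptions.

```python
# Sketch: rank-based pruning of RBF centers. Column-pivoted QR is used as a
# simplified stand-in for the pivoted QLP decomposition named in the abstract.
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-3, 3, size=200))
centers = np.linspace(-3, 3, 40)                 # deliberately over-complete
width = 0.5                                      # assumed Gaussian width

# RBF design matrix: Phi[i, j] = exp(-(x_i - c_j)^2 / (2 * width^2))
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

# Column pivoting ranks centers by contribution; small |diag(R)| -> prunable.
Q, R, piv = qr(Phi, mode="economic", pivoting=True)
diag = np.abs(np.diag(R))
keep = piv[diag > 1e-3 * diag[0]]                # tolerance is an illustrative choice
print("kept", len(keep), "of", len(centers), "centers")
```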

2008
Pedro Antonio Gutiérrez César Hervás-Martínez Mariano Carbonero-Ruz Juan Carlos Fernández

This paper proposes a hybrid neural network model using a possible combination of different transfer projection functions (sigmoidal unit, SU, product unit, PU) and kernel functions (radial basis function, RBF) in the hidden layer of a feed-forward neural network. An evolutionary algorithm is adapted to this model and applied for learning the architecture, weights and node typology. Three diffe...
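
A toy numpy sketch of the three hidden-unit types the hybrid model combines (sigmoidal, product, and RBF units); the output combination weights are arbitrary, and the evolutionary learning of architecture, weights, and node typology is not shown.

```python
# Sketch of the three hidden-unit types combined in the hybrid model:
# sigmoidal units (SU), product units (PU), and radial basis function units (RBF).
import numpy as np

def sigmoidal_unit(x, w, b):
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

def product_unit(x, w):
    # Product unit: prod_i x_i^{w_i} (inputs assumed positive here).
    return np.prod(np.power(x, w))

def rbf_unit(x, c, r):
    return np.exp(-np.sum((x - c) ** 2) / r ** 2)

x = np.array([0.5, 1.2])
hidden = np.array([
    sigmoidal_unit(x, np.array([0.3, -0.7]), 0.1),
    product_unit(x, np.array([1.5, -0.5])),
    rbf_unit(x, np.array([0.4, 1.0]), 0.8),
])
# A linear output layer over the mixed hidden activations (weights are arbitrary).
print(hidden @ np.array([0.2, 0.5, -0.3]))
```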

2006
Yuya Kamada Shigeo Abe

In our previous work we have shown that Mahalanobis kernels are useful for support vector classifiers in terms of both generalization ability and model selection speed. In this paper we propose using Mahalanobis kernels for function approximation. We determine the covariance matrix for the Mahalanobis kernel using all the training data. Model selection is done by line search. Namely, first the margin ...
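
A sketch of the idea in this abstract, assuming a Gaussian-form Mahalanobis kernel whose covariance is estimated from all training data and plugged into scikit-learn's SVR as a callable; the scaling parameter is an assumed hyperparameter and the line-search model selection is omitted.

```python
# Sketch: Mahalanobis kernel for function approximation with SVR.
# K(x, x') = exp(-delta * (x - x')^T S^{-1} (x - x')), with S the covariance
# of all training data; the scaling delta is an assumed hyperparameter.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3)) @ np.array([[1.0, 0.5, 0.0],
                                          [0.0, 1.0, 0.3],
                                          [0.0, 0.0, 1.0]])   # correlated features
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)

S_inv = np.linalg.inv(np.cov(X, rowvar=False))

def mahalanobis_kernel(A, B, delta=0.5):
    diff = A[:, None, :] - B[None, :, :]
    d2 = np.einsum("ijk,kl,ijl->ij", diff, S_inv, diff)
    return np.exp(-delta * d2)

model = SVR(kernel=mahalanobis_kernel).fit(X, y)
print("training R^2:", round(model.score(X, y), 3))
```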

M. Araghi, M. Khatibinia

The flow number of asphalt–aggregate mixtures has been proposed as an explanatory factor for assessing the rutting potential of asphalt mixtures. This study proposes a multiple-kernel support vector machine (MK-SVM) approach for modeling the flow number of asphalt mixtures. The MK-SVM approach consists of a weighted least squares support vector machine (WLS-SVM) integrating two kernel funct...
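
A simplified sketch of the multiple-kernel idea: a weighted sum of RBF and polynomial kernel matrices passed to a support vector regressor. Standard epsilon-SVR from scikit-learn stands in for WLS-SVM here, and the kernel mixing weight is a fixed assumed value rather than one learned as in the paper.

```python
# Sketch: multiple-kernel regression via a weighted sum of two kernel matrices,
# fed to sklearn's SVR with a precomputed kernel as a stand-in for WLS-SVM.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

rng = np.random.default_rng(3)
X = rng.uniform(size=(80, 4))                  # stand-in for mixture properties
y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + 0.05 * rng.normal(size=80)

def multi_kernel(A, B, mu=0.6, gamma=1.0, degree=2):
    # mu is the kernel mixing weight; fixed here, optimized in the paper's approach.
    return mu * rbf_kernel(A, B, gamma=gamma) + (1 - mu) * polynomial_kernel(A, B, degree=degree)

K_train = multi_kernel(X, X)
model = SVR(kernel="precomputed").fit(K_train, y)
print("training R^2:", round(model.score(K_train, y), 3))
```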

2011
Sutao Song Zhichao Zhan Zhiying Long Jiacai Zhang Li Yao

BACKGROUND Support vector machine (SVM) has been widely used as an accurate and reliable method to decipher brain patterns from functional MRI (fMRI) data. Previous studies have not found a clear benefit for non-linear (polynomial kernel) SVM over the linear one. Here, a more effective non-linear SVM using a radial basis function (RBF) kernel is compared with linear SVM. Different from traditional stu...
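
A minimal comparison in the spirit of this abstract, using synthetic data in place of fMRI activation patterns; scikit-learn's SVC and 5-fold cross-validation are tooling assumptions, not the study's actual pipeline.

```python
# Sketch: comparing linear-kernel and RBF-kernel SVM classifiers by cross-validation.
# Synthetic data stands in for the fMRI patterns used in the study.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=50, n_informative=10, random_state=0)

for kernel in ("linear", "rbf"):
    scores = cross_val_score(SVC(kernel=kernel, C=1.0), X, y, cv=5)
    print(kernel, "mean CV accuracy:", round(scores.mean(), 3))
```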

2013
Rahul Samant Srikantha Rao

This paper investigates the ability of several Support Vector Machine (SVM) models with alternative kernel functions to predict the probability of occurrence of Essential Hypertension (HT) in a mixed patient population. To do this, an SVM was trained with 13 inputs (symptoms) from the medical dataset. Different kernel functions, such as Linear, Quadratic, Polyorder (order three), Multi Layer P...
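
A sketch of the kernel comparison described above, run on synthetic data instead of the 13-symptom medical dataset; scikit-learn's sigmoid kernel is used as a stand-in for the multilayer-perceptron kernel, and "quadratic" is expressed as a degree-2 polynomial kernel.

```python
# Sketch: evaluating SVM classifiers with several kernel functions on a binary
# prediction task. Synthetic data replaces the 13-symptom hypertension dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=13, n_informative=8, random_state=1)

kernels = {
    "linear": SVC(kernel="linear"),
    "quadratic": SVC(kernel="poly", degree=2),
    "poly (order 3)": SVC(kernel="poly", degree=3),
    "mlp (sigmoid stand-in)": SVC(kernel="sigmoid"),
    "rbf": SVC(kernel="rbf"),
}
for name, clf in kernels.items():
    print(name, round(cross_val_score(clf, X, y, cv=5).mean(), 3))
```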

Journal: CoRR 2011
Julio Enrique Castrillón-Candás Jun Li Victor Eijkhout

In this paper we develop a discrete Hierarchical Basis (HB) to efficiently solve the Radial Basis Function (RBF) interpolation problem with variable polynomial order. The HB forms an orthogonal set and is adapted to the kernel seed function and the placement of the interpolation nodes. Moreover, this basis is orthogonal to a set of polynomials up to a given order defined on the interpolating no...
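
A plain (non-hierarchical) sketch of RBF interpolation with an appended polynomial term, i.e. the augmented linear system that the hierarchical basis in the paper is designed to handle more efficiently; the multiquadric kernel, 1-D nodes, and polynomial order are illustrative choices.

```python
# Sketch: RBF interpolation with a low-order polynomial term, solving the
# augmented system [[A, P], [P^T, 0]] [c; d] = [f; 0] directly.
import numpy as np

def multiquadric(r, eps=1.0):
    return np.sqrt(r ** 2 + eps ** 2)

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(-1, 1, size=25))        # interpolation nodes
f = np.sin(3 * x)                               # data to interpolate

A = multiquadric(np.abs(x[:, None] - x[None, :]))
P = np.column_stack([np.ones_like(x), x])       # polynomial part up to order 1
n, m = len(x), P.shape[1]

K = np.block([[A, P], [P.T, np.zeros((m, m))]])
coef = np.linalg.solve(K, np.concatenate([f, np.zeros(m)]))
c, d = coef[:n], coef[n:]

# Evaluate the interpolant at new points and report the error.
xe = np.linspace(-1, 1, 5)
s = multiquadric(np.abs(xe[:, None] - x[None, :])) @ c \
    + np.column_stack([np.ones_like(xe), xe]) @ d
print("max error:", np.max(np.abs(s - np.sin(3 * xe))))
```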

Journal: اندیشه آماری (Statistical Thinking) 2020
, ,

The Area Under the ROC Curve (AUC) is a common index for evaluating the ability of biomarkers to classify. In practice, a single biomarker has limited classification ability, so to improve classification performance we are interested in combining biomarkers linearly and nonlinearly. In this study, while introducing various types of loss functions, the Ramp AUC method and some of...
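
A toy sketch of the underlying idea: two synthetic biomarkers are combined linearly and the combination is scored by AUC. A coarse grid search over the combination angle stands in for the Ramp AUC optimization introduced in the paper, and the data are entirely synthetic.

```python
# Sketch: linearly combining two biomarkers and evaluating the combination by AUC.
# A grid search over the combination angle replaces the Ramp AUC method here.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 200
y = rng.integers(0, 2, size=n)
# Two weakly informative synthetic biomarkers.
b1 = y + rng.normal(scale=1.5, size=n)
b2 = 0.5 * y + rng.normal(scale=1.0, size=n)

grid = np.linspace(0, np.pi, 181)
aucs = [roc_auc_score(y, np.cos(t) * b1 + np.sin(t) * b2) for t in grid]
best = int(np.argmax(aucs))

print("single-biomarker AUCs:", round(roc_auc_score(y, b1), 3), round(roc_auc_score(y, b2), 3))
print("best combined AUC:", round(aucs[best], 3), "at angle", round(grid[best], 3))
```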
