Search results for: nearest neighbors knn algorithm four artificial neural network models and two hammerstein

Number of results: 17,360,759

2012
Shiqing Zhang Xiaoming Zhao Bicheng Lei

Facial expression recognition is an interesting and challenging subject in signal processing and artificial intelligence. In this paper, a new method of facial expression recognition based on the sparse representation classifier (SRC) is presented. Two typical facial appearance features, i.e., local binary patterns (LBP) and Gabor wavelet representations, are extracted to evaluate the performan...
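As a rough illustration of the LBP appearance feature named in the abstract above, here is a minimal sketch (not the authors' implementation) of basic 3x3 local binary patterns with NumPy; the 8-neighbour window, the 256-bin histogram and the `lbp_histogram` helper are assumptions for illustration only.

    import numpy as np

    def lbp_histogram(image, bins=256):
        """Basic 3x3 local binary patterns over a grayscale image,
        returned as a normalized histogram (a common appearance feature)."""
        img = np.asarray(image, dtype=float)
        center = img[1:-1, 1:-1]
        # Offsets of the 8 neighbours, ordered clockwise from the top-left.
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        codes = np.zeros_like(center, dtype=int)
        for bit, (dy, dx) in enumerate(offsets):
            neighbor = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
            codes += (neighbor >= center).astype(int) << bit
        hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
        return hist / hist.sum()

    # Example: a random 64x64 "face patch" stands in for a real image.
    print(lbp_histogram(np.random.rand(64, 64)).shape)  # (256,)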

2013
Zaixiang Huang Zhongmei Zhou Tianzhong He

Associative classification usually generates a large set of rules. Therefore, it is inevitable that an instance matches several rules whose classes conflict. In this paper, a new framework called Associative Classification with KNN (AC-KNN) is proposed, which uses an improved KNN algorithm to address rule conflicts. Traditional K-Nearest Neighbor (KNN) is inefficient due to its calculat...
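To make the kNN component concrete, the sketch below shows a plain majority-vote kNN in Python; the abstract's improved, rule-conflict-aware variant is not reproduced, and `knn_predict` with Euclidean distance is an assumed baseline only.

    import numpy as np
    from collections import Counter

    def knn_predict(X_train, y_train, x, k=5):
        """Plain majority-vote kNN; per the abstract, AC-KNN would apply a
        refined variant only to instances whose matched rules conflict."""
        dists = np.linalg.norm(X_train - x, axis=1)
        nearest = np.argsort(dists)[:k]
        return Counter(y_train[nearest]).most_common(1)[0][0]

    X = np.random.rand(100, 4)
    y = np.random.randint(0, 3, size=100)
    print(knn_predict(X, y, np.random.rand(4)))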

2000
Marco Muselli

The definition of the nearest-neighbor probability p(C) is introduced to characterize classification problems with binary inputs. It measures the likelihood that two patterns which are close according to the Hamming distance are assigned to the same class. It is shown that the generalization ability g_NN(C) of neural networks that resemble the nearest-neighbor algorithm can be expressed as a funct...
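One plausible empirical reading of p(C) (the paper's formal definition may differ) is the fraction of binary patterns whose Hamming-nearest neighbour carries the same class label; the sketch below estimates that quantity on toy data, with `nn_probability` an assumed helper name.

    import numpy as np

    def nn_probability(X, y):
        """Fraction of binary patterns whose Hamming-nearest neighbour
        (self excluded, ties broken arbitrarily) has the same class label."""
        n = len(X)
        same = 0
        for i in range(n):
            d = np.sum(X != X[i], axis=1)    # Hamming distances to pattern i
            d[i] = X.shape[1] + 1            # exclude the pattern itself
            same += int(y[np.argmin(d)] == y[i])
        return same / n

    X = np.random.randint(0, 2, size=(200, 16))
    y = (X.sum(axis=1) > 8).astype(int)      # a toy binary-input problem
    print(nn_probability(X, y))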

Journal: Financial Economics 2017

The aim of the present study is to forecast the Tehran Stock Exchange price index using a hybrid neural network model based on the genetic algorithm and harmony search. The most relevant technical indicators are selected as input variables, and the optimal number of neurons in the hidden layer of the artificial neural network is obtained, using the genetic and harmony-search metaheuristic algorithms. Daily values of the Tehran Stock Exchange price index from 1/10/91 to 30/9/94 (Iranian calendar) are used for ...
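As a hedged sketch of the model-selection idea described above, the snippet below scans candidate hidden-layer sizes for a one-hidden-layer network with scikit-learn; the genetic and harmony-search optimisers of the paper are replaced by a plain exhaustive scan, and the toy data and parameter ranges are assumptions.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import cross_val_score

    # Toy stand-ins for the technical-indicator inputs and the index series.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 6))
    y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=300)

    def cv_score(hidden):
        """Cross-validated fit quality of a one-hidden-layer network."""
        model = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=500, random_state=0)
        return cross_val_score(model, X, y, cv=3, scoring="neg_mean_squared_error").mean()

    best = max(range(2, 11, 2), key=cv_score)
    print("selected hidden-layer size:", best)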

2012
M. Kozak K. Stapor

The k-nearest neighbors (kNN) algorithm is a simple but effective classification method. In this paper we present an extended version of this technique for chemical compounds used in High Throughput Screening, in which the distances to the nearest neighbors can be taken into account. Our algorithm uses kernel weight functions as guidance for the process of defining activity in screening data. Proposed ke...
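A minimal instance of distance-weighted kNN with a kernel weight function (here an assumed Gaussian kernel, not necessarily the kernels proposed in the paper) might look like this:

    import numpy as np

    def weighted_knn(X_train, y_train, x, k=5, bandwidth=1.0):
        """Distance-weighted kNN: neighbours vote with a Gaussian kernel weight."""
        d = np.linalg.norm(X_train - x, axis=1)
        idx = np.argsort(d)[:k]
        w = np.exp(-(d[idx] / bandwidth) ** 2)
        scores = {}
        for label, weight in zip(y_train[idx], w):
            scores[label] = scores.get(label, 0.0) + weight
        return max(scores, key=scores.get)

    X = np.random.rand(150, 8)              # e.g. compound descriptors (assumed)
    y = np.random.randint(0, 2, size=150)   # active / inactive
    print(weighted_knn(X, y, np.random.rand(8)))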

2010
Zygmunt Hasiewicz Grzegorz Mzyk Przemyslaw Sliwinski

In the paper we recover a Hammerstein system nonlinearity. Hammerstein systems, incorporating nonlinearity and dynamics, play an important role in various applications, and effective algorithms determining their characteristics are not only of theoretical but also of practical interest. The proposed algorithm is quasi-parametric, that is, there are several parametric model candidates and we assu...
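For context, a Hammerstein system is a static nonlinearity followed by a linear dynamic block; the sketch below simulates one such system to generate the kind of input-output data a nonlinearity-recovery algorithm would work from. The coefficients and the `tanh` nonlinearity are illustrative assumptions, not the paper's setup.

    import numpy as np

    def hammerstein(u, a=(0.6,), b=(1.0, 0.5), nonlinearity=np.tanh):
        """Simulate a Hammerstein system: static nonlinearity v = f(u) followed by
        a linear ARX-type block  y[t] = sum_i a_i*y[t-1-i] + sum_j b_j*v[t-j]."""
        v = nonlinearity(np.asarray(u, dtype=float))
        y = np.zeros_like(v)
        for t in range(len(v)):
            y[t] = sum(a[i] * y[t - 1 - i] for i in range(len(a)) if t - 1 - i >= 0)
            y[t] += sum(b[j] * v[t - j] for j in range(len(b)) if t - j >= 0)
        return y

    u = np.random.uniform(-2, 2, size=1000)   # persistently exciting input
    y = hammerstein(u)                        # data an identification method would fit
    print(y[:5])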

2009
Igor Santos Javier Nieves Yoseba K. Penya Pablo García Bringas

Microshrinkages are probably the most difficult defects to avoid in high-precision foundry. The presence of this failure renders the casting invalid, with the consequent cost increase. Modelling the foundry process as an expert knowledge cloud allows properly trained machine learning algorithms to foresee the value of a certain variable, in this case the probability that a microshrink...

2010
Víctor Laguna Alneu de Andrade Lopes

Semi-supervised learning is a machine learning paradigm in which the induced hypothesis is improved by taking advantage of unlabeled data. It is particularly useful when labeled data is scarce. Co-training is a widely adopted semi-supervised approach that assumes the availability of two views of the training data, a restrictive assumption for most real-world tasks. In this paper, we propose a one-vie...
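Since the proposed one-view method is not described in full here, the sketch below shows only a textbook single-view self-training baseline with a kNN classifier, to make the semi-supervised setting concrete; `self_train` and its parameters are assumptions, not the authors' algorithm.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def self_train(X_lab, y_lab, X_unlab, rounds=5, per_round=10):
        """Single-view self-training: repeatedly fit a classifier on the labelled
        pool and promote its most confident predictions on unlabelled points."""
        X_lab, y_lab, X_unlab = X_lab.copy(), y_lab.copy(), X_unlab.copy()
        for _ in range(rounds):
            if len(X_unlab) == 0:
                break
            clf = KNeighborsClassifier(n_neighbors=3).fit(X_lab, y_lab)
            proba = clf.predict_proba(X_unlab)
            take = np.argsort(-proba.max(axis=1))[:per_round]   # most confident points
            X_lab = np.vstack([X_lab, X_unlab[take]])
            y_lab = np.concatenate([y_lab, clf.classes_[proba[take].argmax(axis=1)]])
            X_unlab = np.delete(X_unlab, take, axis=0)
        return KNeighborsClassifier(n_neighbors=3).fit(X_lab, y_lab)

    X = np.random.rand(200, 5)
    y = (X[:, 0] > 0.5).astype(int)
    model = self_train(X[:20], y[:20], X[20:])    # 20 labelled, 180 unlabelled
    print(model.predict(X[:5]))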

Artificial neural networks, learning algorithms and mathematical models that mimic the information-processing ability of the human brain, can be applied to non-linear and complex data. The aim of this study was to predict breeding values for the milk production trait in Iranian Holstein cows using artificial neural networks. Data on 35167 Iranian Holstein cows recorded between 1998 and 2009 were ...

Journal: Adv. Artificial Neural Systems 2010
Qi Yu Yoan Miché Antti Sorjamaa Alberto Guillén Amaury Lendasse Eric Séverin

This paper presents a methodology named Optimally Pruned K-Nearest Neighbors (OP-KNNs) which has the advantage of competing with state-of-the-art methods while remaining fast. It builds a one hidden-layer feedforward neural network using K-Nearest Neighbors as kernels to perform regression. Multiresponse Sparse Regression (MRSR) is used in order to rank each kth nearest neighbor and finally Lea...
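A hedged sketch of the KNN-as-kernel idea above: approximate the target as a least-squares combination of the k nearest neighbours' outputs. The MRSR ranking and leave-one-out pruning steps of OP-KNN are omitted, and `fit_knn_regression_weights` is an assumed illustration, not the authors' code.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def fit_knn_regression_weights(X, y, k=5):
        """Approximate y(x) as a linear combination of the targets of the k
        nearest training neighbours, with the weights fitted by least squares."""
        nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
        # Drop the first neighbour of each training point (the point itself).
        idx = nn.kneighbors(X, return_distance=False)[:, 1:]
        H = y[idx]                                   # n x k matrix of neighbour targets
        w, *_ = np.linalg.lstsq(H, y, rcond=None)
        return nn, w

    X = np.random.rand(200, 3)
    y = np.sin(6 * X[:, 0]) + 0.05 * np.random.randn(200)
    nn, w = fit_knn_regression_weights(X, y)
    x_new = np.random.rand(1, 3)
    neighbours = nn.kneighbors(x_new, return_distance=False)[0, :5]
    print(float(y[neighbours] @ w))                  # prediction for x_new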
