Komparasi Fungsi Aktivasi Relu Dan Tanh Pada Multilayer Perceptron

Authors

Abstract

Neural networks are a popular method in machine learning research, and activation functions, especially ReLU and Tanh, play a very important role in neural networks by minimizing the error between the output layer and the target class. By varying the number of hidden layers and the number of neurons in each layer, this study analyzes 8 models for classifying the Titanic Survivor dataset. The results show that the ReLU function performs better than the Tanh function, as seen from its higher average accuracy and precision. Adding more layers does not improve the classification results and can even lower them when 3 or 4 layers are used. The highest accuracy was obtained by the model using 50 neurons rather than 100.
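As a rough illustration of the comparison described in the abstract, the following Python sketch trains MLP classifiers with ReLU and Tanh activations over several hidden-layer layouts and reports accuracy and precision. The file name titanic.csv, the selected feature columns, and the specific layer/neuron counts are assumptions made for illustration, not the configuration used in the paper.

```python
# Hedged sketch: comparing ReLU and Tanh activations on MLP models
# with different hidden-layer layouts, in the spirit of the study.
# Dataset path, feature columns, and layouts below are assumptions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, precision_score

# Load the Titanic Survivor dataset (Kaggle-style columns assumed).
df = pd.read_csv("titanic.csv")
X = pd.get_dummies(df[["Pclass", "Sex", "Age", "Fare"]].fillna(0))
y = df["Survived"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Eight models: four hidden-layer layouts x two activation functions.
layouts = [(50,), (100,), (50, 50, 50), (100, 100, 100, 100)]
for activation in ["relu", "tanh"]:
    for hidden in layouts:
        clf = MLPClassifier(hidden_layer_sizes=hidden,
                            activation=activation,
                            max_iter=500,
                            random_state=42).fit(X_train, y_train)
        pred = clf.predict(X_test)
        print(activation, hidden,
              "accuracy=%.3f" % accuracy_score(y_test, pred),
              "precision=%.3f" % precision_score(y_test, pred))
```

Averaging the printed accuracy and precision per activation function over the layouts gives the kind of ReLU-versus-Tanh comparison the abstract reports; comparing layouts with more layers against the shallow ones mirrors the observation that deeper models do not necessarily improve results.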


Similar Articles

Watermarking pada Video: Robustness, Impercetibility dan Pendekatan untuk Domain Terkompresi

ABSTRACT The growing use of digital documents, especially multimedia (image, audio, video), and the ease of transmitting data over the Internet increase the need to protect data against copyright infringement. Watermarking is a widely used approach and is part of Digital Rights Management (DRM), created to meet this need. Specifically for vid...


Multilayer Perceptron Training

In this contribution we present an algorithm for using possibly inaccurate knowledge of model derivatives as part of the training data for a multilayer perceptron network (MLP). In many practical process control problems there are many well-known rules about the effect of control variables on the target variables. With the presented algorithm the basically data-driven neural network model can ...


Multilayer Perceptron Algebra

Artificial Neural Networks (ANN) have been phenomenally successful on various pattern recognition tasks. However, the design of neural networks relies heavily on the experience and intuitions of individual developers. In this article, the author introduces a mathematical structure called MLP algebra on the set of all Multilayer Perceptron Neural Networks (MLP), which can serve as a guiding principle...


Auto-kernel using multilayer perceptron

This work presents a constructive method to train the multilayer perceptron layer by layer and to build the kernel used in the support vector machine. Data in different classes are trained to map to distant points in each layer, which eases the mapping of the next layer. A perfect mapping kernel can be accomplished successively. Those distant mapped points can be dis...


Multilayer Perceptron for Label Ranking

Label Ranking problems are receiving increasing attention in machine learning. The goal is to predict not just a single value from a finite set of labels, but rather the permutation of that set that applies to a new example (e.g., the ranking of a set of financial analysts in terms of the quality of their recommendations). In this paper, we adapt a multilayer perceptron algorithm for label rank...



Journal

Journal title: JIKO (Jurnal Informatika dan Komputer)

Year: 2022

ISSN: 2656-1948, 2614-8897

DOI: https://doi.org/10.26798/jiko.v6i2.600