JIKO (Jurnal Informatika dan Komputer)
Vol 6, No 2 (2022): ReBorn -- September 2022

Comparison of the ReLU and Tanh Activation Functions in a Multilayer Perceptron

Ichsan Firmansyah (Universitas Potensi Utama)
B. Herawan Hayadi



Article Info

Publish Date
19 Sep 2022

Abstract

Neural networks are a popular method in machine learning research, and activation functions, especially ReLU and Tanh, play a critical role in them: they shape how well the network minimizes the error between the output layer and the target class. Varying the number of hidden layers and the number of neurons in each hidden layer, this study analyzes eight models for classifying the Titanic survivor dataset. The results show that the ReLU function performs better than the Tanh function, with higher average accuracy and precision than the Tanh activation function. Adding more hidden layers did not improve classification performance, as shown by the drop in average accuracy and precision between the models with 3 hidden layers and those with 4 hidden layers. The highest accuracy was obtained by the model using the ReLU activation function with 4 hidden layers and 50 neurons per hidden layer, while the highest precision was obtained by the model using the ReLU activation function with 4 hidden layers and 100 neurons per hidden layer.
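For reference, ReLU is defined as relu(x) = max(0, x), and Tanh as tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x)). The paper does not publish code; the sketch below is only an illustration of the experiment design described in the abstract, assuming scikit-learn's MLPClassifier as the multilayer perceptron and a synthetic binary dataset as a stand-in for the Titanic survivor data. It reproduces two of the eight configurations: 4 hidden layers with 50 neurons each, under each activation function.

# Illustrative sketch, not the authors' implementation.
# Assumptions: scikit-learn's MLPClassifier, and a synthetic
# binary-classification dataset standing in for the Titanic data;
# the authors' preprocessing and hyperparameters are not specified here.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score
from sklearn.neural_network import MLPClassifier

# Stand-in for the Titanic survivor dataset (binary target).
X, y = make_classification(n_samples=891, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Compare ReLU and Tanh with 4 hidden layers of 50 neurons each.
for activation in ("relu", "tanh"):
    mlp = MLPClassifier(hidden_layer_sizes=(50, 50, 50, 50),
                        activation=activation,
                        max_iter=500,
                        random_state=0)
    mlp.fit(X_train, y_train)
    y_pred = mlp.predict(X_test)
    print(activation,
          "accuracy:", accuracy_score(y_test, y_pred),
          "precision:", precision_score(y_test, y_pred))

The hyperparameters here (max_iter, random_state, default train/test split) are placeholders; the authors' actual training setup is not given in the abstract.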

Copyrights © 2022






Journal Info

Abbrev

jiko

Publisher

Universitas Teknologi Digital Indonesia (formerly STMIK AKAKOM)

Subject

Computer Science & IT

Description

JIKO (Jurnal Informatika dan Komputer) is a scientific journal published by the Lembaga Penelitian dan Pengabdian Masyarakat of Universitas Teknologi Digital Indonesia (formerly STMIK AKAKOM), Yogyakarta, Indonesia. First published in 2016 in both print and online versions. We receive original research ...