Ichsan Firmansyah
Universitas Potensi Utama

Published: 2 Documents
Found 1 Document
Journal: JIKO (Jurnal Informatika dan Komputer)

Komparasi Fungsi Aktivasi Relu Dan Tanh Pada Multilayer Perceptron (Comparison of the ReLU and Tanh Activation Functions in a Multilayer Perceptron)
Ichsan Firmansyah; B. Herawan Hayadi
JURNAL INFORMATIKA DAN KOMPUTER Vol 6, No 2 (2022): ReBorn -- September 2022
Publisher : Lembaga Penelitian dan Pengabdian Masyarakat - Universitas Teknologi Digital Indonesia

Full PDF (272.079 KB) | DOI: 10.26798/jiko.v6i2.600

Abstract

Neural networks are a popular method in machine learning research, and activation functions, especially ReLU and Tanh, play an important role in them, helping to minimize the error between the output layer and the target class. Varying the number of hidden layers and the number of neurons in each hidden layer, this study analyzes 8 models that classify the Titanic survivor dataset. The results show that the ReLU function performs better than the Tanh function, as its average accuracy and precision are higher than those of the Tanh activation function. Adding more hidden layers does not improve classification performance: average accuracy and precision decrease between the models using 3 hidden layers and those using 4 hidden layers. The highest accuracy was obtained by the model using the ReLU activation function with 4 hidden layers and 50 neurons per hidden layer, while the highest precision was obtained by the model using ReLU with 4 hidden layers and 100 neurons per hidden layer.
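The two activation functions compared in the abstract differ mainly in their output range and saturation behavior: ReLU passes positive pre-activations through unchanged and zeroes out negative ones, while Tanh squashes every pre-activation into (-1, 1) and saturates for large magnitudes. A minimal sketch of both, applied to a single fully connected neuron, is shown below; the tiny weights and inputs are illustrative values, not taken from the study, and this is not the authors' actual experimental code.

```python
import math

def relu(x):
    # ReLU: max(0, x); identity for positive inputs, zero otherwise
    return max(0.0, x)

def tanh(x):
    # Tanh: squashes input into (-1, 1); saturates for large |x|
    return math.tanh(x)

def dense_neuron(inputs, weights, bias, activation):
    # One fully connected neuron: weighted sum plus bias, then activation
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

# Compare the two activations on the same pre-activation values
for z in (-2.0, 0.0, 3.0):
    print(f"z={z:+.1f}  relu={relu(z):.4f}  tanh={tanh(z):.4f}")
```

For z = -2.0, ReLU outputs 0 while Tanh outputs roughly -0.96; for z = 3.0, ReLU returns 3.0 while Tanh is already close to its ceiling of 1. This saturation is one common explanation for why ReLU-based MLPs often train more effectively, consistent with the averages reported in the study.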