Journal : Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi)

Investigating the Impact of ReLU and Sigmoid Activation Functions on Animal Classification Using CNN Models
M Mesran; Sitti Rachmawati Yahya; Fifto Nugroho; Agus Perdana Windarto
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 8 No 1 (2024): February 2024
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

DOI: 10.29207/resti.v8i1.5367

Abstract

VGG16 is a convolutional neural network model used for image recognition. It is distinctive in that it has only 16 weighted layers, rather than relying on a large number of hyperparameters, and it is considered one of the best vision model architectures. However, several aspects need to be improved to increase the accuracy of image recognition. In this context, this work proposes and investigates two ensemble CNNs using transfer learning and compares them with state-of-the-art CNN architectures. The study compares the performance of the rectified linear unit (ReLU) and sigmoid activation functions on CNN models for animal classification. To choose which model to use, we tested two state-of-the-art CNN architectures: the default VGG16 and VGG16 with the proposed method. The dataset, collected from Kaggle, consists of 2,000 images of five animal classes (cats, cows, elephants, horses, and sheep) and is split into training and test sets at an 80:20 ratio. The CNN model has two convolutional layers and two fully connected layers, and the ReLU and sigmoid activation functions are tested with different learning rates. Evaluation metrics include accuracy, precision, recall, F1 score, and test cost. The results show that ReLU achieves higher classification accuracy than sigmoid: the model with ReLU in the convolutional and fully connected layers achieved the highest precision, 97.56%, on the test dataset, and ReLU outperforms sigmoid on accuracy, precision, recall, and F1 score. The research aims to find better activation functions and to identify factors that influence model performance. This study emphasizes the importance of choosing the right activation function for better classification accuracy; ReLU is identified as effective in mitigating the vanishing-gradient problem. These findings can guide future research on improving CNN models for animal classification.
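The abstract's point about ReLU mitigating the vanishing-gradient problem can be illustrated with a minimal sketch (hypothetical, not the paper's code): sigmoid's derivative never exceeds 0.25, so a gradient backpropagated through many sigmoid layers shrinks geometrically, while ReLU's derivative is exactly 1 for positive inputs and passes the gradient through unchanged.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def d_sigmoid(x):
    # Derivative of sigmoid: s * (1 - s); peaks at 0.25 when x == 0.
    s = sigmoid(x)
    return s * (1.0 - s)

def d_relu(x):
    # Derivative of ReLU: 1 for positive inputs, 0 otherwise.
    return 1.0 if x > 0 else 0.0

# Best-case gradient surviving 10 stacked activations (x = 0 for sigmoid,
# any positive x for ReLU). Sigmoid attenuates by at most 0.25 per layer;
# ReLU leaves the gradient intact.
depth = 10
sigmoid_grad = d_sigmoid(0.0) ** depth   # 0.25**10, roughly 9.5e-07
relu_grad = d_relu(1.0) ** depth         # stays at 1.0
print(sigmoid_grad, relu_grad)
```

This toy calculation only shows the per-layer attenuation factors; the paper's actual evidence is the empirical accuracy gap between the two activations on the animal dataset.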