Muhammad Rizqi Assabil
Sriwijaya University

Published : 1 Document


Text Generation using Long Short Term Memory to Generate a LinkedIn Post Muhammad Rizqi Assabil; Novi Yusliani; Annisa Darmawahyuni
Sriwijaya Journal of Informatics and Applications Vol 4, No 2 (2023)
Publisher : Fakultas Ilmu Komputer Universitas Sriwijaya

DOI: 10.36706/sjia.v4i2.64


LinkedIn is one of the most popular sites for advertising oneself to potential employers. This study aims to build a text generation model good enough that its output reads as if it were written by someone posting on LinkedIn. The study uses a neural network layer called Long Short-Term Memory (LSTM) as its main algorithm, and the training data consists of actual posts made by LinkedIn users. LSTM is an architecture designed to reduce the vanishing and exploding gradient problems in recurrent neural networks. In the results, final accuracy and loss vary. Increasing the learning rate from its default value of 0.001 to 0.01, or even 0.1, produces worse models. Meanwhile, increasing the LSTM dimension sometimes increases or decreases training time without meaningfully improving model performance. The models ultimately chosen reach around 97% accuracy. This study concludes that it is possible to use LSTM to build a text generation model, although the results may not be fully satisfying. For future work, a newer architecture such as the Transformer model is recommended.
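The abstract does not include the study's code, but the LSTM cell it relies on can be sketched to show why the additive cell-state update eases the vanishing-gradient problem it mentions. Below is a minimal NumPy sketch of a single LSTM step; all dimension sizes, weight initializations, and function names here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.
    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias,
    stacked in gate order [input, forget, output, candidate]."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # pre-activations for all four gates at once
    i = sigmoid(z[0:H])               # input gate: how much new content to write
    f = sigmoid(z[H:2 * H])           # forget gate: how much old cell state to keep
    o = sigmoid(z[2 * H:3 * H])       # output gate: how much of the cell to expose
    g = np.tanh(z[3 * H:4 * H])       # candidate cell content
    c = f * c_prev + i * g            # additive update: gradients flow through f * c_prev
    h = o * np.tanh(c)                # hidden state passed to the next step / output layer
    return h, c

# Illustrative run over a short random "token" sequence
rng = np.random.default_rng(0)
D, H = 8, 4                           # embedding size and hidden size (arbitrary)
W = rng.normal(0.0, 0.1, (4 * H, D))
U = rng.normal(0.0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for _ in range(5):
    x = rng.normal(size=D)
    h, c = lstm_step(x, h, c, W, U, b)
```

In a full text-generation setup, `x` would be an embedded token, and `h` would feed a softmax layer over the vocabulary to predict the next token; sampling from that distribution repeatedly produces the generated post.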