Data growth is a sizeable challenge. The goal of data compression is to reduce the size of the data needed to represent useful information. Data compression can increase the efficiency of data storage, transmission, and protection. Lossless algorithms can reconstruct the original data exactly from the compressed data, so lossless compression is often used for data that must be stored or transmitted accurately. Lossless compression methods and algorithms include the Lempel–Ziv–Markov chain algorithm (LZMA), Prediction by Partial Matching (PPM), the Burrows–Wheeler block-sorting text compression algorithm combined with Huffman coding (BZip2), and Deflate. Even though these compression systems are built on the same principles, they still differ in performance, so a general guide is needed to help determine the most appropriate data compression algorithm to use. This study aims to determine which data compression algorithm performs best, based on a comparison of Compression Ratio and Space Saving values. The research proceeds from selecting the compression algorithms, through data preparation and performance testing, to discussion and conclusions. The results show that the achievable compression ratio and space saving depend on the specific data used. Although the range of average compression performance values is not large, LZMA2 generally shows the best results, with a compression ratio of 1.457 and a space saving of 15.00%. These results can serve as an overview to help in choosing a lossless data compression algorithm.
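As a minimal sketch of the two metrics used in the comparison, the snippet below computes Compression Ratio (original size divided by compressed size) and Space Saving (one minus compressed size over original size) using Python's standard-library codecs as stand-ins for the algorithm families mentioned above: `lzma` for LZMA, `bz2` for BZip2, and `zlib` for Deflate (PPM has no standard-library implementation). The sample input is hypothetical and is not one of the study's datasets.

```python
import bz2
import lzma
import zlib


def compression_ratio(original: bytes, compressed: bytes) -> float:
    """Compression Ratio = original size / compressed size (higher is better)."""
    return len(original) / len(compressed)


def space_saving(original: bytes, compressed: bytes) -> float:
    """Space Saving = 1 - (compressed size / original size), as a fraction."""
    return 1 - len(compressed) / len(original)


# Hypothetical sample input; the study's actual measurements use its own datasets.
data = b"the quick brown fox jumps over the lazy dog " * 200

# Standard-library codecs standing in for the compared algorithm families.
codecs = {
    "lzma": lzma.compress,
    "bzip2": bz2.compress,
    "deflate": zlib.compress,
}

for name, compress in codecs.items():
    packed = compress(data)
    print(f"{name}: ratio={compression_ratio(data, packed):.3f}, "
          f"saving={space_saving(data, packed):.2%}")
```

Actual ratios will vary with the input, which is exactly the data dependence the results describe.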