Articles

Journal: Eksponensial

Optimasi Algoritma Naïve Bayes Menggunakan Algoritma Genetika Untuk Memprediksi Kelulusan
Elisa Feronica; Yuki Novia Nasution; Ika Purnamasari
EKSPONENSIAL Vol 13 No 2 (2022)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman

DOI: 10.30872/eksponensial.v13i2.1057

Abstract

The Naïve Bayes algorithm is a classification method that uses the principle of probability to create predictive models. Naïve Bayes is based on the assumption that all of its attributes are independent, and it can be optimized with genetic algorithms. A genetic algorithm is an optimization technique that works by imitating the evaluation and change of the genetic structure of living creatures. In this study, the Naïve Bayes algorithm was optimized using a genetic algorithm to predict student graduation from the attributes gender, regional origin, admission path, and employment status. The data used are students of the Mathematics Department, Faculty of Mathematics and Natural Sciences, Mulawarman University who graduated between March 2018 and December 2020. The results indicate that the accuracy produced by Naïve Bayes, 50%, increased by 16.67% to 66.67% after the attributes were optimized with the genetic algorithm, with 3 selected attributes: regional origin, admission path, and employment status.
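Pairing Naïve Bayes with a genetic algorithm amounts to wrapping the classifier in a search over attribute subsets. The sketch below is only an illustration of that idea on synthetic data: the encoded attributes, labels, GA settings, and fitness function are assumptions, not the paper's actual data or configuration.

```python
# Illustrative sketch: Naive Bayes with genetic-algorithm feature selection,
# on synthetic data standing in for the (unavailable) student dataset.
import numpy as np
from sklearn.naive_bayes import CategoricalNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical encoded attributes: gender, regional origin, admission path, employment status
X = rng.integers(0, 3, size=(120, 4))
y = rng.integers(0, 2, size=120)          # assumed binary graduation label

def fitness(mask):
    """Cross-validated accuracy of Naive Bayes on the selected attributes."""
    if not mask.any():
        return 0.0
    return cross_val_score(CategoricalNB(min_categories=3), X[:, mask], y, cv=5).mean()

# A minimal genetic algorithm over binary feature masks.
pop = rng.integers(0, 2, size=(10, 4)).astype(bool)
for generation in range(20):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-4:]]               # selection: keep the best 4
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(0, 4, size=2)]
        cut = rng.integers(1, 4)
        child = np.concatenate([a[:cut], b[cut:]])        # one-point crossover
        flip = rng.random(4) < 0.1                        # mutation
        children.append(np.logical_xor(child, flip))
    pop = np.array(children)

best = max(pop, key=fitness)
print("selected attribute mask:", best, "cross-validated accuracy:", round(fitness(best), 3))
```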
Peramalan Menggunakan Time Invariant Fuzzy Time Series
Siti Rahmah Binaiya; Memi Nor Hayati; Ika Purnamasari
EKSPONENSIAL Vol 10 No 2 (2019)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman


Abstract

Forecasting is a technique for estimating a value in the future based on past and current data. Fuzzy Time Series is a forecasting method built on fuzzy principles, in which the forecasting process uses the concept of fuzzy sets. This study applies the Time Invariant Fuzzy Time Series method developed by Sah and Degtiarev to forecast the Consumer Price Index (CPI) of East Kalimantan Province for May 2018. The method uses a frequency distribution to determine the interval length, and 13 fuzzy sets are used in the forecasting process. Using CPI data for East Kalimantan Province from September 2016 to April 2018, the forecast for May 2018 is 135.977, with a forecasting error, measured by the Mean Absolute Percentage Error (MAPE), of 0.0949%, well under 10%.
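A time-invariant fuzzy time series model follows a fixed recipe: partition the universe of discourse into intervals, fuzzify each observation, collect the fuzzy logical relationships, and defuzzify by averaging interval midpoints. The snippet below is a minimal sketch of that recipe; the toy series, the seven intervals, and the containment-based fuzzification are illustrative assumptions and do not reproduce the paper's 13 fuzzy sets or its frequency-distribution partitioning.

```python
# Minimal time-invariant fuzzy time series forecast on a toy CPI-like series.
import numpy as np

series = np.array([126.3, 126.9, 127.4, 128.0, 128.6, 129.1, 129.9, 130.4,
                   131.0, 131.8, 132.5, 133.1, 133.9, 134.6, 135.2])

n_intervals = 7
lo, hi = series.min() - 0.5, series.max() + 0.5         # universe of discourse with margins
edges = np.linspace(lo, hi, n_intervals + 1)
mids = (edges[:-1] + edges[1:]) / 2

def fuzzify(x):
    """Index of the fuzzy set (interval) with the highest membership for x."""
    return int(np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_intervals - 1))

labels = [fuzzify(x) for x in series]

# Time-invariant fuzzy logical relationship groups: A_i -> {A_j, ...}
groups = {}
for prev, nxt in zip(labels[:-1], labels[1:]):
    groups.setdefault(prev, set()).add(nxt)

def forecast(last_value):
    """Forecast = average of midpoints of the consequent fuzzy sets of the last state."""
    state = fuzzify(last_value)
    consequents = groups.get(state, {state})
    return float(np.mean([mids[j] for j in consequents]))

print("one-step-ahead forecast:", round(forecast(series[-1]), 3))
```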
Peramalan Regarima Pada Data Time Series
Yudha Muhammad Faishol; Ika Purnamasari; Rito Goejantoro
EKSPONENSIAL Vol 8 No 1 (2017)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman


Abstract

The RegARIMA method is a modelling technique that combines an ARIMA model with a regression model that uses dummy variables called regressors. The purpose of this study was to determine a calendar variation model and to apply it to predict plane ticket sales for January 2016 to December 2017. The data analysis shows that ticket sales have a seasonal pattern, namely an increase in ticket sales around Idul Fitri (Eid). The regressor is therefore constructed for the single feast day that affects sales, Eid. A regression model is then fitted with the volume of plane ticket sales as the dependent variable (Y) and the regressor as the independent variable (X), giving Ŷt = 1029 + 1335 Xt. All regression parameters are significant, but the goodness-of-fit tests show that although the residuals are normally distributed they do not satisfy the white-noise assumption, which means they still contain autocorrelation. ARIMA modelling is therefore performed on the regression residuals. The residuals are stationary, and ARIMA estimation yields an ARIMA(0,0,1) model whose parameters are all significant and whose residuals pass the diagnostic checks, satisfying the white-noise assumption and normality. The calendar variation model obtained by the RegARIMA method is Yt = 1029.5 + 1337.3 Dt + 0.28712 at-1 + at. Based on this model, plane ticket sales for January 2016 to December 2017 can be predicted.
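In practice a RegARIMA fit of this form can be carried out as a regression with ARMA errors. The sketch below is a rough illustration in statsmodels, assuming a made-up monthly sales series and an assumed Eid dummy; the coefficients quoted above come from the paper's real data, not from this snippet.

```python
# Sketch of the RegARIMA idea: regression on an Eid dummy with MA(1) errors,
# fitted in one step via statsmodels' ARIMA with an exogenous regressor.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
idx = pd.date_range("2011-01-01", periods=60, freq="MS")
eid_dummy = pd.Series(0.0, index=idx)
eid_dummy[idx.month == 7] = 1.0                          # assumed Eid months for the toy data
sales = pd.Series(1000 + 1300 * eid_dummy + rng.normal(0, 50, 60), index=idx)

# Regression with ARIMA(0,0,1) errors, mirroring the form Y_t = b0 + b1*D_t + MA(1) noise.
model = ARIMA(sales, exog=eid_dummy, order=(0, 0, 1)).fit()
print(model.params)

# Forecast the next 12 months, supplying the future dummy values.
future_idx = pd.date_range(idx[-1] + pd.offsets.MonthBegin(), periods=12, freq="MS")
future_dummy = pd.Series((future_idx.month == 7).astype(float), index=future_idx)
print(model.forecast(steps=12, exog=future_dummy))
```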
Bootstrap Aggregating Multivariate Adaptive Regression Splines
Marisa Nanda Rahmaniah; Yuki Novia Nasution; Ika Purnamasari
EKSPONENSIAL Vol 7 No 2 (2016)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman


Abstract

MARS is a classification method that handles high-dimensional and discontinuous data. The accuracy of MARS can be improved with bagging (Bootstrap Aggregating), a method used to improve the stability, accuracy, and predictive power of a model. This study applies bagging MARS to school accreditation, where the accreditation level of a school is predicted from its identifier components; the study identifies these components and uses them to build a classification model. The data used are the 2015 primary school accreditation data for East Kalimantan Province issued by the Provincial School Accreditation Board (BAP-S/M) of East Kalimantan. The study finds six components that affect the determination of accreditation at the primary school level; these are the variables that contribute to the classification: the content standard component (X1), the process standard component (X2), the graduate standard component (X3), the teacher and staff standard component (X4), the infrastructure standard component (X5), and the financing standard component (X7). The classification accuracy of the MARS method, evaluated using the Apparent Error Rate (APER), is 78.87%, while the classification accuracy of the best bagging MARS model is 89.44%. This means that bagging MARS gives better classification accuracy than MARS alone.
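Bagging works the same way regardless of the base learner: draw bootstrap samples, fit one model per sample, and combine predictions by majority vote. The sketch below illustrates that loop and an APER-style accuracy comparison on synthetic data; a decision tree stands in for MARS so the code runs without the third-party py-earth package, and the data, labels, and number of bags are all assumptions.

```python
# Bootstrap aggregating around a stand-in base learner, with APER-based accuracy.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
# Synthetic stand-ins for the standard components; label 1 = higher accreditation (assumed).
X = rng.normal(size=(300, 8))
y = (X[:, [0, 1, 2, 3, 4, 6]].sum(axis=1) + rng.normal(0, 0.5, 300) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def bagging_predict(X_train, y_train, X_test, n_bags=50):
    """Fit one base learner per bootstrap sample and combine by majority vote."""
    votes = np.zeros(len(X_test))
    for _ in range(n_bags):
        idx = rng.integers(0, len(X_train), len(X_train))    # bootstrap resample
        base = DecisionTreeRegressor(max_depth=3).fit(X_train[idx], y_train[idx])
        votes += (base.predict(X_test) > 0.5).astype(int)
    return (votes >= n_bags / 2).astype(int)

single = (DecisionTreeRegressor(max_depth=3).fit(X_tr, y_tr).predict(X_te) > 0.5).astype(int)
bagged = bagging_predict(X_tr, y_tr, X_te)

aper_single = np.mean(single != y_te)    # Apparent Error Rate = misclassification proportion
aper_bagged = np.mean(bagged != y_te)
print(f"accuracy single: {1 - aper_single:.3f}, accuracy bagged: {1 - aper_bagged:.3f}")
```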
Perbandingan Hasil Analisis Cluster Dengan Menggunakan Metode Average Linkage Dan Metode Ward
Imasdiani Imasdiani; Ika Purnamasari; Fidia Deny Tisna Amijaya
EKSPONENSIAL Vol 13 No 1 (2022)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman

DOI: 10.30872/eksponensial.v13i1.875

Abstract

Hierarchical cluster analysis groups data based on their characteristics; the average linkage method and the Ward method are two hierarchical clustering methods. Data can be grouped from various aspects, one of which is poverty, and this study uses poverty indicator data for East Kalimantan in 2018. The average linkage method is based on the average distance between clusters, while the Ward method merges clusters by minimizing the within-cluster sum of squares. The purpose of this study was to determine the best method based on the ratio of standard deviations. Both the average linkage method and the Ward method produce two clusters: with average linkage, the first cluster consists of 7 districts/cities and the second of 3, while with the Ward method the first cluster consists of 6 districts/cities and the second of 4. Based on the ratio of the within-group standard deviation (Sw) to the between-group standard deviation (Sb), the ratio for the Ward method is smaller than that for the average linkage method, namely 2.681, which indicates that the average linkage method is the best method.
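Both linkages are available off the shelf in SciPy, so the comparison can be scripted directly. The sketch below uses random data in place of the ten districts/cities and a simple version of the Sw/Sb ratio; the exact ratio formula in the paper may differ, so treat this only as an outline of the comparison.

```python
# Average linkage vs Ward clustering with a simple Sw/Sb ratio criterion.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
X = rng.normal(size=(10, 5))           # 10 regions, 5 standardized poverty indicators (assumed)

def sw_sb_ratio(X, labels):
    """Ratio of average within-cluster std dev (Sw) to between-cluster std dev (Sb)."""
    centroids = np.array([X[labels == k].mean(axis=0) for k in np.unique(labels)])
    sw = np.mean([X[labels == k].std(axis=0).mean() for k in np.unique(labels)])
    sb = centroids.std(axis=0).mean()
    return sw / sb

for method in ("average", "ward"):
    Z = linkage(X, method=method)
    labels = fcluster(Z, t=2, criterion="maxclust")    # cut the dendrogram into 2 clusters
    print(method, "cluster sizes:", np.bincount(labels)[1:],
          "Sw/Sb ratio:", round(sw_sb_ratio(X, labels), 3))
```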
Analisis Pengendalian Kualitas Produk Amplang Menggunakan Peta Kendali Kernel
Rahmad Fahreza Adiyasa; Desi Yuniarti; Ika Purnamasari
EKSPONENSIAL Vol 10 No 1 (2019)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman


Abstract

Quality control is the use of techniques and activities to maintain and improve the quality of products or services. One quality control tool is the Epanechnikov kernel control chart, a control chart used to evaluate product quality characteristics nonparametrically, because it does not require particular distributional assumptions. The purpose of this research is to find out whether the 1 kg packaged amplang product of UD. H. Icam Samarinda is within the control limits and which factors can cause the product weight to go out of control. The results show that no sample point lies outside the control limits of the chart built from the kernel density estimate, so it can be concluded that the product weight is in a controlled condition. The factors that can cause the product to go out of control are environmental, human, machine, and material factors.
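One common way to build such a nonparametric chart is to estimate the density of the quality characteristic with an Epanechnikov kernel and take extreme quantiles of that estimate as control limits. The sketch below follows that idea on simulated weights; the bandwidth, the 3-sigma-equivalent quantiles, and the data are assumptions rather than the paper's actual construction.

```python
# Control limits from an Epanechnikov kernel density estimate of package weights.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(4)
weights = rng.normal(1.00, 0.02, 100)                  # toy 1 kg package weights

kde = KernelDensity(kernel="epanechnikov", bandwidth=0.01).fit(weights.reshape(-1, 1))
grid = np.linspace(weights.min() - 0.05, weights.max() + 0.05, 2000)
density = np.exp(kde.score_samples(grid.reshape(-1, 1)))
cdf = np.cumsum(density)
cdf /= cdf[-1]                                         # numerical CDF on the grid

lcl = grid[np.searchsorted(cdf, 0.00135)]              # lower control limit (3-sigma coverage)
ucl = grid[np.searchsorted(cdf, 0.99865)]              # upper control limit
out_of_control = (weights < lcl) | (weights > ucl)
print(f"LCL={lcl:.4f}, UCL={ucl:.4f}, points out of control: {out_of_control.sum()}")
```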
Proses Optimasi Masalah Penugasan One-Objective dan Two-Objective Menggunakan Metode Hungarian
Diang Dewi Tamimi; Ika Purnamasari; Wasono Wasono
EKSPONENSIAL Vol 8 No 1 (2017)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman


Abstract

The assignment problem is a situation in which m workers are assigned to complete n tasks/jobs so as to minimize cost and time or maximize profit and quality by giving the proper task to each worker. Much research has focused on solving the assignment problem, but most of it considers only one objective, such as minimizing operation cost. The two-objective assignment problem optimizes two of the resources of each worker for every task/job at once, in this case cost and time. This research uses primary data drawn from interviews with rattan furniture craftsmen at the Rotan Sejati store, Samarinda, and optimizes the one-objective and two-objective assignment problems using the Hungarian method. The analysis shows that the one-objective assignment considering only operation cost gives Rp2,950,000 with a total time of 63 days; considering only operation time gives Rp3,290,000 with a total time of 52 days; and considering only quality gives Rp3,550,000 with a total time of 59 days. The two-objective assignment considering operation cost and operation time gives Rp3,170,000 with a total time of 52 days; considering operation cost and quality gives Rp3,380,000 with a total time of 61 days; and considering operation time and quality gives Rp3,350,000 with a total time of 59 days.
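The Hungarian algorithm is available directly in SciPy as linear_sum_assignment. The sketch below shows a one-objective solve on a cost matrix and one simple way to fold two objectives into a single matrix via an equal-weight sum of normalized costs and times; the matrices and the weighting scheme are illustrative assumptions, not the data or combination rule of the paper.

```python
# One-objective and weighted two-objective assignment via the Hungarian algorithm.
import numpy as np
from scipy.optimize import linear_sum_assignment

cost = np.array([[950, 800, 700],        # cost (thousands of Rp) of worker i doing job j
                 [850, 900, 750],
                 [700, 950, 820]])
time = np.array([[20, 18, 25],           # time (days) of worker i doing job j
                 [22, 17, 19],
                 [15, 24, 21]])

# One-objective: minimize cost only.
rows, cols = linear_sum_assignment(cost)
print("cost-only assignment:", list(zip(rows, cols)),
      "total cost:", cost[rows, cols].sum(), "total time:", time[rows, cols].sum())

# Two-objective: minimize an equal-weight sum of min-max normalized cost and time.
norm = lambda m: (m - m.min()) / (m.max() - m.min())
combined = 0.5 * norm(cost) + 0.5 * norm(time)
rows, cols = linear_sum_assignment(combined)
print("cost+time assignment:", list(zip(rows, cols)),
      "total cost:", cost[rows, cols].sum(), "total time:", time[rows, cols].sum())
```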
Penerapan Metode Full-Profile Dalam Pengumpulan Data Untuk Analisis Konjoin
Roy Tridoni Situmorang; Desi Yuniarti; Ika Purnamasari
EKSPONENSIAL Vol 9 No 2 (2018)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman


Abstract

Conjoint analysis is an analytical technique used to examine the impact of the attributes of goods or services. It can be applied to find the attributes that drive the choices of Mulawarman University students when selecting a GSM prepaid card product, where the attributes used are SMS tariff, call tariff, internet package, signal, and bonuses. The purpose of this study is to determine the combination of attribute levels that most interests students and the relative importance of each attribute. The combination of GSM prepaid card attributes that students are most interested in consists of an SMS package tariff with a utility value of 1.445, a per-minute call tariff with a utility value of 0.525, a full 4G internet package with a utility value of 2.51, a strong signal with a utility value of 1.895, and an SMS bonus with a utility value of 1.42. The attribute that most drives students' choice of a GSM prepaid card is the internet package, with a relative importance value of 0.352.
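In the full-profile approach, respondents rate complete product profiles, the attribute levels are dummy-coded, part-worth utilities are estimated by least squares, and relative importance is each attribute's utility range divided by the sum of all ranges. The sketch below walks through those steps on fabricated profiles and ratings; the attribute set is trimmed and the numbers are invented, so the utilities it prints have no relation to the study's results.

```python
# Full-profile conjoint analysis via dummy coding and least squares.
import numpy as np
import pandas as pd

profiles = pd.DataFrame({
    "internet": ["4G", "3G", "4G", "3G", "4G", "3G", "4G", "3G"],
    "signal":   ["strong", "strong", "weak", "weak", "strong", "strong", "weak", "weak"],
    "bonus":    ["sms", "data", "data", "sms", "data", "sms", "sms", "data"],
})
ratings = np.array([9, 5, 6, 3, 8, 6, 7, 2])           # hypothetical respondent ratings

X = pd.get_dummies(profiles, drop_first=True).astype(float)
X.insert(0, "intercept", 1.0)
coef, *_ = np.linalg.lstsq(X.values, ratings, rcond=None)
partworths = dict(zip(X.columns, coef))
print("part-worth utilities:", {k: round(v, 3) for k, v in partworths.items()})

# Relative importance: range of utilities within an attribute / total of all ranges.
ranges = {}
for attr in profiles.columns:
    levels = [partworths[c] for c in X.columns if c.startswith(attr + "_")] + [0.0]
    ranges[attr] = max(levels) - min(levels)            # the dropped level has utility 0
total = sum(ranges.values())
print("relative importance:", {a: round(r / total, 3) for a, r in ranges.items()})
```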
Penyelesaian Assignment Problem Dengan Menggunakan Metode Program Dinamis
Franklin Peter Anton Karundeng; Ika Purnamasari; Desi Yuniarti
EKSPONENSIAL Vol 12 No 2 (2021)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman

DOI: 10.30872/eksponensial.v12i2.806

Abstract

The assignment problem maximizes profit or minimizes time, distance, and cost by placing workers on the tasks that suit their abilities. The assignment problem can be solved with the dynamic programming method. To apply dynamic programming, the number of sources assigned must equal the number of tasks to be completed, and each source is assigned to exactly one task. The purpose of this study is to determine the minimum total work completion time and to find out whether the assignment of employees is optimal. The data used are the completion times of employees at the workshop of the CV. Sinar Utama showroom in Samarinda. The analysis using the dynamic programming method gives a minimum total completion time of 85 minutes; comparing the situation before and after applying the method, the total employee assignment time is 257 minutes with the dynamic programming method and 530 minutes without it. It can be concluded that the minimum total work completion time is 85 minutes and, based on this comparison, that the assignment of employees is optimal.
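For an equal number of workers and tasks, the dynamic programming formulation assigns workers one at a time while tracking which tasks are already taken, so the state is simply the set of used tasks. The sketch below is a minimal bitmask version of that recursion on a made-up 4x4 time matrix; it is not the paper's data or its exact tabulation scheme.

```python
# Assignment by dynamic programming over subsets of tasks (bitmask recursion).
from functools import lru_cache

times = [[14, 20, 17, 25],     # times[i][j]: minutes for worker i to finish task j (assumed)
         [18, 15, 22, 19],
         [21, 18, 16, 23],
         [17, 24, 20, 15]]
n = len(times)

@lru_cache(maxsize=None)
def best(worker, mask):
    """Minimum total time assigning workers 'worker'..n-1 to the tasks not yet in 'mask'."""
    if worker == n:
        return 0
    return min(times[worker][task] + best(worker + 1, mask | (1 << task))
               for task in range(n) if not mask & (1 << task))

print("minimum total completion time:", best(0, 0), "minutes")
```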
Penentuan Besaran Premi Asuransi Jiwa Berjangka dengan Model True Fractional Premiums
Muhammad Al-Firdaus Erdian; Ika Purnamasari; Wenny Kristina
EKSPONENSIAL Vol 9 No 1 (2018)
Publisher : Program Studi Statistika FMIPA Universitas Mulawarman

DOI: 10.30872/eksponensial.v9i1.271

Abstract

A life insurance premium payment model in which the premium is paid more than once a year is called a fractional premium. The model consists of two types, namely true fractional premiums and apportionable premiums. The true fractional premium is further divided into two benefit-payment models, a discrete payment model and a continuous payment model. This study aims to compare 20-year term life insurance premiums under the true fractional premiums model, based on gender and on the number of payments made in a year, for both benefit-payment models. The data used in this research are simulated data. The results show that the life insurance premium under the discrete benefit-payment model is cheaper than under the continuous payment model. By gender, the premium for males is more expensive than for females. By the number of payments made in one year, monthly payments are more expensive than quarterly and semi-annual payments.
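For a net premium, the true fractional premium P^(m) of an n-year term insurance solves P^(m) · ä^(m) = A, where A is the term insurance value and ä^(m) is the m-thly temporary annuity-due. The sketch below evaluates this with a constant force of mortality as a stand-in for a real life table, together with the usual ä^(m) and continuous-benefit approximations; all parameter values are assumptions, so the numbers only illustrate the qualitative pattern (a continuous benefit is dearer than a discrete one, and more frequent payment is dearer in total).

```python
# Annualized true fractional net premium for an n-year term insurance of 1.
import math

def annual_true_fractional_premium(mu=0.01, i=0.05, n=20, m=12, continuous_benefit=False):
    """P^(m) under a constant force of mortality mu and annual interest rate i."""
    v = 1.0 / (1.0 + i)
    p = [math.exp(-mu * k) for k in range(n + 1)]             # k-year survival probabilities
    A = sum(v ** (k + 1) * (p[k] - p[k + 1]) for k in range(n))   # benefit at end of year of death
    if continuous_benefit:
        A *= i / math.log(1.0 + i)                            # UDD adjustment to a continuous benefit
    a_due = sum(v ** k * p[k] for k in range(n))              # annual temporary annuity-due
    nEx = v ** n * p[n]                                       # pure endowment factor
    a_due_m = a_due - (m - 1) / (2 * m) * (1 - nEx)           # m-thly annuity-due approximation
    return A / a_due_m                                        # each installment is this divided by m

for m in (2, 4, 12):
    print(f"m={m:2d}  discrete benefit: {annual_true_fractional_premium(m=m):.6f}  "
          f"continuous benefit: {annual_true_fractional_premium(m=m, continuous_benefit=True):.6f}")
```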