This author has published in the following journal:
Jurnal EECCIS
Panca Mudji Raharjo
Department of Electrical Engineering, Faculty of Engineering, Universitas Brawijaya

Published: 2 documents

Pengenalan Ekspresi Wajah berbasis Filter Gabor dan Backpropagation Neural Network
Panca Mudji Raharjo
Jurnal EECCIS Vol 4, No 1 (2010)
Publisher: Fakultas Teknik, Universitas Brawijaya


Abstract

An algorithm based on Gabor filters and a Backpropagation (BPP) Neural Network is proposed for facial expression recognition. First, the emotional features of the facial expression are represented with Gabor filters. The features are then used to train a neural network with the Backpropagation training algorithm. Finally, the facial expression is classified by the neural network. Using this algorithm, a high recognition rate is obtained. Keywords—facial expression recognition, Gabor filter, Backpropagation network.
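The pipeline described in this abstract (Gabor-filter feature extraction followed by a backpropagation-trained neural network classifier) can be sketched as follows. This is a minimal illustration only: the filter-bank parameters, the mean/variance feature summary, and the use of scikit-image and scikit-learn are assumptions, not the authors' implementation.

```python
# Minimal sketch of a Gabor-filter + backpropagation expression classifier.
# Filter parameters and libraries are assumptions for illustration only.
import numpy as np
from skimage.filters import gabor
from sklearn.neural_network import MLPClassifier

def gabor_features(face_img):
    """Extract expression features by filtering a grayscale face image with a
    small Gabor filter bank (4 orientations x 2 frequencies, assumed values)."""
    feats = []
    for theta in np.arange(0, np.pi, np.pi / 4):
        for frequency in (0.1, 0.3):
            real, _ = gabor(face_img, frequency=frequency, theta=theta)
            # Summarize each filter response map by its mean and variance.
            feats.extend([real.mean(), real.var()])
    return np.array(feats)

def train_expression_classifier(face_images, labels):
    """Train a small MLP on the Gabor features; weights are learned by
    backpropagation (stochastic gradient descent)."""
    X = np.stack([gabor_features(img) for img in face_images])
    clf = MLPClassifier(hidden_layer_sizes=(10,), solver="sgd", max_iter=2000)
    clf.fit(X, labels)
    return clf
```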
The Implementation of Feedforward Backpropagation Algorithm for Digit Handwritten Recognition in a Xilinx Spartan-3
Panca Mudji Raharjo; Mochammad Rif'an; Nanang Sulistyanto
Jurnal EECCIS Vol 4, No 2 (2010)
Publisher: Fakultas Teknik, Universitas Brawijaya


Abstract

This research aims to implement the feedforward backpropagation algorithm for handwritten digit recognition on an FPGA, the Xilinx Spartan-3. The research is expected to contribute a feedforward algorithm design in FPGA-based VLSI technology, a practice module for the Xilinx Spartan-3 development board, and a basis for further research on artificial neural networks and FPGAs in the Electronics Laboratory. The feedforward backpropagation algorithm is used to recognize 10 objects. The network consists of two layers: 36 input units carrying the feature vector of the object, 10 hidden neurons, and 10 output units. The first-layer activation function is tansig and the second-layer activation function is purelin. The multipliers use 18 bits. The proposed design fits into the smallest Xilinx Spartan-3 FPGA. Index Terms—feedforward backpropagation network, handwritten digit recognition, FPGA, Spartan-3.
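The two-layer feedforward pass described in this abstract (36 inputs, 10 tansig hidden neurons, 10 purelin outputs) can be sketched as below. The fixed-point scaling chosen here is an assumption for illustration; the paper only states that the multipliers are 18 bits wide, and the actual design runs in FPGA hardware rather than software.

```python
# Minimal sketch of the 36-10-10 feedforward pass with 18-bit weight quantization.
# The Q-format (number of fractional bits) is an assumption, not from the paper.
import numpy as np

FRAC_BITS = 12  # assumed fractional bits of the 18-bit fixed-point format

def to_fixed(x, frac_bits=FRAC_BITS):
    """Quantize to signed 18-bit fixed point, as an FPGA multiplier operand."""
    q = np.round(x * (1 << frac_bits)).astype(np.int64)
    return np.clip(q, -(1 << 17), (1 << 17) - 1)

def from_fixed(q, frac_bits=FRAC_BITS):
    return q.astype(np.float64) / (1 << frac_bits)

def feedforward(x, W1, b1, W2, b2):
    """x: 36-element feature vector; returns 10 scores, argmax = recognized digit."""
    h = np.tanh(from_fixed(to_fixed(W1)) @ x + b1)  # layer 1: tansig activation
    y = from_fixed(to_fixed(W2)) @ h + b2           # layer 2: purelin (linear)
    return y

# Shape check with random weights (training itself is assumed to be done off-chip).
W1, b1 = np.random.randn(10, 36), np.random.randn(10)
W2, b2 = np.random.randn(10, 10), np.random.randn(10)
digit = int(np.argmax(feedforward(np.random.rand(36), W1, b1, W2, b2)))
```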