Title

Back Propagation Algorithm: The Best Algorithm Among the Multi-layer Perceptron Algorithm

Author

Mutasem Khalil Sari Alsmadi, Khairuddin Bin Omar, Shahrul Azman Noah

Citation

Vol. 9, No. 4, pp. 378-383

Abstract

A multilayer perceptron (MLP) is a feedforward artificial neural network model that maps sets of input data onto a set of appropriate outputs. It is a modification of the standard linear perceptron in that it uses three or more layers of neurons (nodes) with nonlinear activation functions, and it is more powerful than the perceptron in that it can distinguish data that are not linearly separable, i.e., not separable by a hyperplane. MLP networks are general-purpose, flexible, nonlinear models consisting of a number of units organized into multiple layers. The complexity of an MLP network can be changed by varying the number of layers and the number of units in each layer. Given enough hidden units and enough data, it has been shown that MLPs can approximate virtually any function to any desired accuracy. This study presents a performance comparison among multi-layer perceptron training algorithms (back propagation, delta rule, and perceptron). The perceptron is a steepest-descent-type algorithm that normally has a slow convergence rate, and its search for the global minimum often becomes trapped at poor local minima. The current study investigates the performance of these three algorithms in training MLP networks. It was found that the back propagation algorithm performs much better than the other algorithms.
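
To make the abstract's central claims concrete, the sketch below (illustrative only, not the paper's implementation) trains a one-hidden-layer MLP with back propagation on XOR, a data set that is not linearly separable and therefore unsolvable by a single perceptron. The network shape, learning rate, and epoch count are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Parameters: 2 inputs -> 3 hidden units -> 1 output (assumed shape)
W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)
lr = 0.5  # learning rate (illustrative value)

for _ in range(10000):
    # Forward pass through the nonlinear hidden layer
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: squared-error gradient, propagated layer by layer
    d_out = (out - y) * out * (1 - out)   # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # delta back-propagated to hidden

    # Gradient-descent parameter updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 3))  # approaches [0, 1, 1, 0]
```

With enough iterations the outputs move toward the XOR targets, illustrating the abstract's point that an MLP with nonlinear hidden units can separate data that no single hyperplane can.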

Keywords

Back propagation, perceptron, delta rule learning, classification

URL

http://paper.ijcsns.org/07_book/200904/20090451.pdf