Sigmoid Neural Network Incremental Construction Using a Boosting Method

B.J.B. Cannon, D.J. Brown, and X. Wang (UK)

Keywords

ANN, Simulation, Learning Algorithms, Nonlinear

Abstract

The Sigmoid Neural Network (SNN) approximates a function by adjusting its translation, scale, and weight parameters using the Gradient algorithm. A different method, based on the boosting technique of selection and replacement, has been proposed to approximate a function; we refer to this as the Weighted Searching (WS) algorithm. This paper presents the results obtained with the Gradient algorithm and the WS algorithm, and compares two different cost functions used with the WS algorithm. The results indicate that a cost function incorporating an error band yields the best performance, and on this basis a multi-neuron SNN has been constructed in which individual neurons represent different sections of the training data.
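
The abstract does not give the paper's formulation, but a minimal sketch may help fix ideas: an SNN of the form y(x) = Σᵢ wᵢ σ(sᵢ x + tᵢ), with weight (w), scale (s), and translation (t) parameters updated by batch gradient descent, plus a hedged guess at an "error band" cost in which residuals inside a tolerance band contribute nothing. The parameter names, target function, learning rate, and band width below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def snn(x, w, s, t):
    # x: (N,) inputs; w, s, t: (M,) parameters for M sigmoid neurons.
    z = np.outer(x, s) + t            # (N, M) pre-activations
    return sigmoid(z) @ w             # (N,) network output

def gradient_step(x, y, w, s, t, lr=0.05):
    # One batch gradient-descent step on mean squared error,
    # updating weight, scale, and translation parameters together.
    z = np.outer(x, s) + t
    a = sigmoid(z)                    # (N, M) neuron outputs
    err = snn(x, w, s, t) - y         # (N,) residuals
    d = a * (1.0 - a)                 # sigmoid derivative at z
    grad_w = a.T @ err / len(x)
    grad_s = (d * w).T @ (err * x) / len(x)
    grad_t = (d * w).T @ err / len(x)
    return w - lr * grad_w, s - lr * grad_s, t - lr * grad_t

def error_band_cost(pred, y, band=0.05):
    # Assumed form of an error-band cost: residuals within +/- band
    # are free; only the excess beyond the band is penalised.
    excess = np.maximum(np.abs(pred - y) - band, 0.0)
    return np.mean(excess ** 2)

# Toy usage: fit a smooth 1-D target with a 5-neuron SNN.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 100)
y = np.tanh(x)                        # hypothetical target function
w, s, t = rng.normal(size=5), rng.normal(size=5), rng.normal(size=5)
for _ in range(5000):
    w, s, t = gradient_step(x, y, w, s, t)
print("MSE:", np.mean((snn(x, w, s, t) - y) ** 2))
print("error-band cost:", error_band_cost(snn(x, w, s, t), y))
```

The WS algorithm itself is not sketched here, since the abstract specifies only that it selects and replaces neurons boosting-style; under the cost above, a candidate neuron that leaves every residual inside the band would incur zero cost and so would never be replaced.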
