Abstract: The generalized simulated annealing algorithm can drive a nonlinear objective function with multiple extrema to converge to its global extremum, and can therefore replace the error backpropagation algorithm, which is based on the principle of gradient descent, in neural network learning. In the method described here, the objective function, constructed as the sum of squared differences between the learning outputs and the expected outputs of the neural network, is treated as the total energy of a system; the metal annealing process is simulated to adjust the connection weight values in the network so that the system energy converges to its global minimum. This method improves on the BP algorithm: it requires no gradient computation, accommodates output responses produced by nondifferentiable activation functions, and performs no error backpropagation computation, so locally feedback-connected network structures can be introduced into neural network learning. The method offers a new approach to neural network learning.
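The idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: simulated annealing perturbs the connection weights of a tiny network and accepts or rejects each perturbation by the Metropolis criterion, so that the sum-of-squared-error "energy" tends toward its global minimum. The network size, cooling schedule, training data, and all function names below are illustrative assumptions; a nondifferentiable step activation is used deliberately, since gradient-based BP could not handle it.

```python
import math
import random

def step(x):
    # nondifferentiable activation: gradient-based BP cannot be applied here
    return 1.0 if x >= 0.0 else 0.0

def forward(w, x):
    # single neuron: weighted sum plus bias, then step activation
    return step(w[0] * x[0] + w[1] * x[1] + w[2])

def energy(w, data):
    # "system energy": sum of squared differences between learning
    # outputs and expected outputs
    return sum((forward(w, x) - y) ** 2 for x, y in data)

def anneal(data, steps=5000, t0=1.0, cooling=0.999, seed=0):
    rng = random.Random(seed)
    w = [rng.uniform(-1.0, 1.0) for _ in range(3)]
    e = energy(w, data)
    t = t0
    for _ in range(steps):
        # perturb one randomly chosen weight
        cand = list(w)
        cand[rng.randrange(3)] += rng.gauss(0.0, 0.5)
        e_new = energy(cand, data)
        # Metropolis criterion: always accept improvements; accept a
        # worse state with probability exp(-dE / T), which lets the
        # search escape local minima while T is still high
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            w, e = cand, e_new
        t *= cooling  # cool the system
    return w, e

# illustrative task: learn the AND function (energy 0 is reachable)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, e = anneal(data)
print(w, e)
```

Note that no gradient of the energy is ever computed and no error is propagated backward through the network; the same loop would work unchanged for a network with local feedback connections, since `forward` can be any black-box evaluation of the network's response.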