Published 2000
| Version v1
Publication
Evaluation of gradient descent learning algorithms with an adaptive local rate technique for hierarchical feed forward architectures
Contributors
Description
Gradient descent learning algorithms (namely backpropagation and weight perturbation) can significantly improve their classification performance by adopting a local and adaptive learning-rate management approach. In this paper, we present a comparison of the classification performance of the two algorithms in a demanding application: quality control analysis in the steel industry. The feedforward network is hierarchically organized (i.e. a tree of multilayer perceptrons). The comparison was performed under the same operating conditions (i.e. network topology, stopping criterion, etc.); the results show that the probability of correct classification is significantly better for the weight perturbation algorithm.
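The two ingredients named above can be sketched in a few lines: each weight keeps its own learning rate, grown when successive gradient signs agree and shrunk when they flip, while weight perturbation replaces the analytic (backpropagation) gradient with a finite-difference estimate obtained by perturbing one weight at a time. The toy quadratic loss, the rate-update constants, and the perturbation size below are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

# Toy quadratic loss so the sketch stays self-contained; the paper's
# actual networks are trees of multilayer perceptrons.
target = np.array([1.0, -2.0, 0.5])

def loss(w):
    return 0.5 * np.sum((w - target) ** 2)

def analytic_gradient(w):
    # Exact gradient of the toy loss, standing in for backpropagation.
    return w - target

def wp_gradient(w, eps=1e-4):
    # Weight perturbation: nudge one weight at a time and use the
    # finite-difference change in the loss as the gradient estimate.
    base = loss(w)
    g = np.zeros_like(w)
    for i in range(w.size):
        wp = w.copy()
        wp[i] += eps
        g[i] = (loss(wp) - base) / eps
    return g

def train(grad_fn, steps=200, up=1.2, down=0.5):
    # Local adaptive rates: one rate per weight, multiplied by `up` when
    # that weight's gradient sign repeats and by `down` when it flips
    # (a delta-bar-delta-like rule; an assumption, not the paper's exact scheme).
    w = np.zeros_like(target)
    rates = np.full_like(w, 0.01)
    prev_g = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        rates = np.where(g * prev_g > 0, rates * up, rates * down)
        w = w - rates * g
        prev_g = g
    return w
```

Under these assumptions both gradient sources drive the same per-weight rate rule; the paper's comparison concerns which source classifies better on real steel-inspection data, not convergence on a toy quadratic.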
Additional details
Identifiers
- URL
- http://hdl.handle.net/11567/847660
- URN
- urn:oai:iris.unige.it:11567/847660
Origin repository
- UNIGE