The performance of backpropagation networks which use gradient descent on sigmoidal steepness
- Creator:
- Leung, Wing Kai
- Format:
- unmediated and volume
- Type:
- model:article and TEXT
- Subject:
- neural networks, backpropagation, sigmoidal steepness, neural metrics, and algorithmic complexity
- Language:
- English
- Description:
- Backpropagation that uses gradient descent on the steepness of the sigmoid function (BPSA) has been widely studied (e.g. Kruschke et al. [1]). However, most of these studies analysed the BPSA only empirically, without adequate measurements of the network's quality characteristics (e.g. efficiency and complexity). This paper attempts to show that the BPSA is more efficient than the standard BPA by quantitatively comparing the convergence performance of both algorithms on several benchmark application problems. Convergence performance is measured by the values of the neural metrics [2] evaluated during the training process. (A minimal illustrative sketch of the steepness-adaptation idea follows this record.)
- Rights:
- http://creativecommons.org/publicdomain/mark/1.0/ and policy:public
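The sketch below is not the paper's code; it is a minimal Python illustration, under assumed learning rates and a toy task, of the idea the abstract describes: training a sigmoid unit by gradient descent on both its weights and the steepness parameter of the sigmoid, as in steepness-adaptation variants of backpropagation.

```python
import numpy as np

def sigmoid(z, lam):
    """Sigmoid with steepness parameter lam: f(z) = 1 / (1 + exp(-lam * z))."""
    return 1.0 / (1.0 + np.exp(-lam * z))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                  # toy inputs (assumed task)
t = (X[:, 0] + X[:, 1] > 0).astype(float)      # toy targets

w = rng.normal(scale=0.1, size=2)              # weights
b = 0.0                                        # bias
lam = 1.0                                      # sigmoid steepness (trainable)
eta_w, eta_lam = 0.1, 0.01                     # learning rates (assumed values)

for epoch in range(100):
    z = X @ w + b
    y = sigmoid(z, lam)
    err = y - t                                # dE/dy for squared error (up to a constant)
    dy = y * (1.0 - y)                         # shared factor; dy/dz = lam * y * (1 - y)
    grad_w = X.T @ (err * dy * lam) / len(t)
    grad_b = np.mean(err * dy * lam)
    grad_lam = np.mean(err * dy * z)           # dy/dlam = z * y * (1 - y)
    w -= eta_w * grad_w
    b -= eta_w * grad_b
    lam -= eta_lam * grad_lam                  # gradient descent on the steepness itself

mse = np.mean((sigmoid(X @ w + b, lam) - t) ** 2)
print(f"final MSE: {mse:.4f}, learned steepness: {lam:.3f}")
```

In this sketch the steepness receives its own learning rate, since it typically needs a smaller step size than the weights; the paper's actual algorithm, metrics, and benchmarks may differ.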