Normalized data barrier amplifier for feed-forward neural network
- Title:
- Normalized data barrier amplifier for feed-forward neural network
- Creator:
- Fuangkhon, P.
- Identifier:
- https://cdk.lib.cas.cz/client/handle/uuid:8824c3ea-2326-45ed-ab0d-ae19ce2088e5
- uuid:8824c3ea-2326-45ed-ab0d-ae19ce2088e5
- doi:10.14311/NNW.2021.31.007
- Subject:
- data mining, data normalization, data reduction, instance selection, and neural network
- Type:
- model:article and TEXT
- Format:
- unmediated and volume
- Description:
- A boundary vector generator is a data barrier amplifier that improves the distribution model of the samples to increase the classification accuracy of the feed-forward neural network. It generates new forms of samples, one for amplifying the barrier of their own class (fundamental multi-class outpost vectors) and the other for increasing the barrier of the nearest class (additional multi-class outpost vectors). However, these sets of boundary vectors are enormous. The reduced boundary vector generators introduced three boundary vector reduction techniques that scale down the fundamental multi-class outpost vectors and the additional multi-class outpost vectors. Nevertheless, these techniques do not consider the interval of the attributes, causing some attributes to dominate the others in the Euclidean distance calculation. The motivation of this study is to explore whether six normalization techniques (min-max, Z-score, mean and mean absolute deviation, median and median absolute deviation, modified hyperbolic tangent, and hyperbolic tangent estimator) can improve the classification performance of the boundary vector generator and the reduced boundary vector generators for maximizing the class boundary. Each normalization technique pre-processes the original training set before the boundary vector generator or each of the three reduced boundary vector generators begins. The experimental results on real-world datasets generally confirm that (1) the final training set having only FF-AA reduced boundary vectors can be integrated with one of the normalization techniques effectively when accuracy and precision are prioritized, (2) the final training set having only the boundary vectors can be integrated with one of the normalization techniques effectively when recall and F1-score are prioritized, (3) Z-score normalization can generally improve the accuracy and precision of all types of training sets, (4) modified hyperbolic tangent normalization can generally improve the recall of all types of training sets, (5) min-max normalization can generally improve the accuracy and F1-score of all types of training sets, and (6) the selection of the normalization technique and the training set type depends on the key performance measure for the dataset.
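- For context, the sketch below illustrates, under assumed standard formulations, how the six column-wise normalization techniques named in the abstract could pre-process a training set before a boundary vector generator runs. The function name, the method labels, and the constants (notably the modified hyperbolic tangent form and the 0.01 factor in the tanh estimator) are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def normalize(X, method="min-max", eps=1e-12):
    """Column-wise normalization of a training set X (n_samples x n_attributes).

    Hypothetical illustration of the six techniques named in the abstract;
    the exact formulations used in the paper may differ.
    """
    X = np.asarray(X, dtype=float)
    mu, sigma = X.mean(axis=0), X.std(axis=0)

    if method == "min-max":
        lo, hi = X.min(axis=0), X.max(axis=0)
        return (X - lo) / (hi - lo + eps)                  # rescale each attribute to [0, 1]
    if method == "z-score":
        return (X - mu) / (sigma + eps)                    # zero mean, unit variance
    if method == "mean-mad":
        mad = np.mean(np.abs(X - mu), axis=0)              # mean absolute deviation
        return (X - mu) / (mad + eps)
    if method == "median-mad":
        med = np.median(X, axis=0)
        mad = np.median(np.abs(X - med), axis=0)           # median absolute deviation
        return (X - med) / (mad + eps)
    if method == "modified-tanh":
        # assumed form: squash standardized values into (-1, 1)
        return np.tanh((X - mu) / (sigma + eps))
    if method == "tanh-estimator":
        # Hampel-style tanh estimator with an assumed 0.01 constant; maps to (0, 1)
        return 0.5 * (np.tanh(0.01 * (X - mu) / (sigma + eps)) + 1.0)
    raise ValueError(f"unknown method: {method}")

# Example: pre-process the original training set before generating boundary vectors.
X_train = np.array([[1.0, 200.0], [2.0, 400.0], [3.0, 800.0]])
X_scaled = normalize(X_train, method="z-score")
```

- Because every technique rescales each attribute separately, no single wide-ranged attribute can dominate the Euclidean distances used by the boundary vector reduction step, which is the issue the abstract identifies.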
- Language:
- English
- Rights:
- http://creativecommons.org/licenses/by-nc-sa/4.0/
- policy:public
- Coverage:
- 125-157
- Source:
- Neural network world: international journal on neural and mass-parallel computing and information systems | 2021 Volume:31 | Number:2
- Harvested from:
- CDK
- Metadata only:
- false