The GMDH MIA algorithm uses linear regression for adaptation. We show that the Gauss-Markov conditions are not met here, and that the estimates of the network parameters are therefore biased. To eliminate this, we propose a GMDH network with genetic selection and cloning of neuron parameters (GMC GMDH) that can outperform other powerful methods. This is demonstrated on tasks from the Machine Learning Repository.
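The adaptation step the abstract refers to can be sketched as follows: each GMDH MIA neuron fits a quadratic (Ivakhnenko) polynomial to a pair of inputs by ordinary least squares, and it is this regression whose Gauss-Markov assumptions are argued to be violated. A minimal sketch under that standard reading of GMDH MIA (the function names are illustrative, not from the paper):

```python
import numpy as np

# One GMDH MIA neuron fits the quadratic Ivakhnenko polynomial
#   y = a0 + a1*xi + a2*xj + a3*xi*xj + a4*xi^2 + a5*xj^2
# to a pair of input columns by ordinary least squares.

def design_matrix(xi, xj):
    """Regressors of the quadratic Ivakhnenko polynomial."""
    return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi**2, xj**2])

def fit_mia_neuron(xi, xj, y):
    """Least-squares estimate of the six polynomial coefficients."""
    coef, *_ = np.linalg.lstsq(design_matrix(xi, xj), y, rcond=None)
    return coef

def neuron_output(coef, xi, xj):
    """Neuron response for the fitted coefficients."""
    return design_matrix(xi, xj) @ coef
```

The GMC GMDH modification replaces this purely least-squares adaptation with genetic selection and cloning of the neuron parameters, which is not shown here.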
This article introduces an improved growing hyperspheres (GHS) neural classifier that is based on a proper distribution of hyperspheres over patterns so as to cover all the patterns of a given class. The union of these hyperspheres then forms a discrimination surface among the classes. The article describes a complete general algorithm together with all up-to-date modifications and shows its abilities on an economic problem. A comparison with results obtained by the multilayer perceptron (MLP) neural network is presented. The problem consists in detecting peaks (steep time changes) in a time sequence of the total factor productivity - a residual factor in the production function. The peaks can be interpreted - at least in the Real Business Cycles (RBC) Theory - as shocks caused by sudden technological innovations. The results from the GHS and MLP neural networks are compared with results obtained by means of empirical rules compiled by an economic expert.
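The covering idea can be sketched roughly as follows, under one plausible reading of the abstract (all names and the greedy reduction are assumptions, not the paper's algorithm): center candidate spheres on patterns of one class, limit each radius by the nearest pattern of any other class, and keep the largest spheres until every pattern of the class is covered.

```python
import numpy as np

def grow_hyperspheres(X, y, cls, shrink=0.99):
    """Cover the patterns of class `cls` with hyperspheres that avoid other classes."""
    own, other = X[y == cls], X[y != cls]
    # Largest safe radius per candidate center: just short of the nearest
    # pattern of any other class.
    radii = shrink * np.min(
        np.linalg.norm(own[:, None] - other[None, :], axis=2), axis=1)
    covered = np.zeros(len(own), dtype=bool)
    spheres = []
    for i in np.argsort(-radii):          # try the largest spheres first
        inside = np.linalg.norm(own - own[i], axis=1) <= radii[i]
        if np.any(inside & ~covered):     # keep spheres that add new patterns
            spheres.append((own[i], radii[i]))
            covered |= inside
        if covered.all():
            break
    return spheres

def in_union(spheres, p):
    """Membership in the union of hyperspheres (the discrimination surface)."""
    return any(np.linalg.norm(p - c) <= r for c, r in spheres)
```

By construction no sphere contains a pattern of another class, so the union discriminates the covered class on the training set.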
In this paper we introduce a new approach to the preprocessing (initial setting) of weight vectors, and thus a speed-up, of the well-known SOM (Kohonen's, SOFM) neural network. The idea of the method (we call it Prep throughout this paper) consists in spreading a small lattice over the pattern space and subsequently completing its inner meshes and boundaries to obtain a larger lattice. This large lattice is then fine-tuned by a short period of training. To justify the speed-up of the Prep method we give a detailed time analysis. We demonstrate the suggested method on several representative examples.
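The lattice-enlargement step can be sketched as follows, assuming the "completing of inner meshes" amounts to interpolating the trained small lattice onto a finer grid (the function name and bilinear scheme are my assumptions, not the paper's): the weights of a small m-by-m lattice are spread onto a larger M-by-M lattice, which is then briefly fine-tuned by ordinary SOM training (not shown).

```python
import numpy as np

def enlarge_lattice(W, M):
    """Interpolate an (m, m, d) SOM weight lattice up to (M, M, d)."""
    m = W.shape[0]
    coords = np.linspace(0.0, m - 1, M)    # large-grid positions on the small grid
    i0 = np.floor(coords).astype(int)      # lower neighbor index per position
    i1 = np.minimum(i0 + 1, m - 1)         # upper neighbor, clamped at the edge
    t = coords - i0                        # fractional offset between neighbors
    # Bilinear interpolation: first along rows, then along columns.
    rows = W[i0] * (1 - t)[:, None, None] + W[i1] * t[:, None, None]
    return rows[:, i0] * (1 - t)[None, :, None] + rows[:, i1] * t[None, :, None]
```

Because the large lattice starts already spread over the pattern space, only a short tuning phase remains, which is where the claimed speed-up comes from.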