Several algorithms have been developed for time series forecasting. In this paper, we develop an algorithm that uses numerical methods to optimize an objective function, namely the Kullback-Leibler divergence between the joint probability density function of a time series X_1, X_2, ..., X_n and the product of its marginal distributions. The Gram-Charlier expansion is used to estimate these distributions.
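To make the objective concrete, the following is a minimal sketch of this divergence criterion, under several assumptions not stated above: the Gram-Charlier (Type A) expansion is truncated after the fourth Hermite term, the joint density is restricted to the bivariate pair (X_t, X_{t+1}) for illustration and modelled as a bivariate Gaussian, and the divergence is approximated by numerical integration on a grid. Function names such as gram_charlier_pdf and kl_joint_vs_marginals are illustrative, not from the paper.

import numpy as np
from scipy.stats import norm, skew, kurtosis, multivariate_normal

def gram_charlier_pdf(x, mu, sigma, g1, g2):
    """Univariate Gram-Charlier Type A density truncated at the 4th term.
    g1: sample skewness, g2: sample excess kurtosis."""
    z = (x - mu) / sigma
    he3 = z**3 - 3 * z                      # Hermite polynomial He_3
    he4 = z**4 - 6 * z**2 + 3               # Hermite polynomial He_4
    correction = 1 + (g1 / 6.0) * he3 + (g2 / 24.0) * he4
    return norm.pdf(z) / sigma * np.clip(correction, 1e-12, None)

def kl_joint_vs_marginals(series, grid_size=200):
    """Approximate KL( joint(X_t, X_{t+1}) || marginal(X_t) * marginal(X_{t+1}) ).
    The bivariate-Gaussian joint is an assumption made for this sketch;
    the marginals use the Gram-Charlier expansion."""
    x, y = series[:-1], series[1:]
    mu, sigma = series.mean(), series.std(ddof=1)
    g1, g2 = skew(series), kurtosis(series)          # excess kurtosis by default

    grid = np.linspace(series.min() - sigma, series.max() + sigma, grid_size)
    gx, gy = np.meshgrid(grid, grid)
    dx = grid[1] - grid[0]

    joint = multivariate_normal(mean=[x.mean(), y.mean()],
                                cov=np.cov(np.vstack([x, y])))
    p = joint.pdf(np.dstack([gx, gy]))               # joint density on the grid
    q = gram_charlier_pdf(gx, mu, sigma, g1, g2) * \
        gram_charlier_pdf(gy, mu, sigma, g1, g2)     # product of marginals

    mask = (p > 1e-12) & (q > 1e-12)
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx * dx)

# toy usage on a simulated AR(1) series
rng = np.random.default_rng(0)
ar1 = np.empty(500)
ar1[0] = rng.normal()
for t in range(1, 500):
    ar1[t] = 0.7 * ar1[t - 1] + rng.normal()
print("KL(joint || product of marginals) ~", kl_joint_vs_marginals(ar1))

Maximizing this quantity corresponds to selecting the representation under which the series exhibits the strongest dependence between observations, which is what the numerical optimization described above targets.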
Using the weights obtained by the neural network, and adding to them the Kullback-Leibler divergence of these weights, we obtain new weights that are used to forecast the new value X_{n+k}.
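The sketch below illustrates one possible reading of this weight update, under assumptions that go beyond the text: the predictor is a simple linear autoregressive model whose trained weights stand in for the neural network's output-layer weights, the KL divergence of the weights is taken against a uniform reference distribution, and the divergence is added to the weights as a scalar shift. The names kl_of_weights and forecast, and the uniform reference, are illustrative assumptions, not the paper's specification.

import numpy as np

def kl_of_weights(w, eps=1e-12):
    """KL divergence between the normalised weight vector and a uniform
    reference distribution (an assumption; the paper's reference is not given)."""
    p = np.abs(w) / (np.abs(w).sum() + eps)
    q = np.full_like(p, 1.0 / len(p))
    return float(np.sum(p * np.log((p + eps) / q)))

def forecast(series, w, k=1):
    """Forecast X_{n+k} by iterating a linear autoregressive predictor
    whose weights have been shifted by the scalar KL term."""
    w_new = w + kl_of_weights(w)            # "new weights" = trained weights + KL term
    window = list(series[-len(w):])
    for _ in range(k):
        x_next = float(np.dot(w_new, window))
        window = window[1:] + [x_next]      # slide the lag window forward
    return x_next

# toy usage: weights that would normally come from a trained neural network
rng = np.random.default_rng(1)
series = np.sin(np.linspace(0, 20, 200)) + 0.1 * rng.normal(size=200)
trained_w = np.array([0.1, -0.2, 0.4, 0.6])   # hypothetical trained weights
print("forecast of X_{n+k} for k=3:", forecast(series, trained_w, k=3))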