This article introduces an algorithm for an implicit High Dimensional Model Representation (HDMR) of the Bellman equation. This approximation technique considerably reduces the memory demands of solving the equation. Moreover, we show that HDMR enables fast approximate minimization, which is essential for evaluating the Bellman function. In each time step, the parametrized HDMR minimization problem is relaxed into a set of trust region problems that all share the same matrix. By computing its eigenvalue decomposition once, we efficiently obtain estimates of all the minima. HDMR avoids a full-domain representation of these minima, and the same approach is then applied recursively in the next time step. An illustrative example on the N-armed bandit problem is included. We believe that the newly established connection between approximate HDMR minimization and the trust region problem can also be beneficial to many other applications.
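The computational core described above, reusing a single eigenvalue decomposition across many trust region subproblems that share one matrix, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes the standard subproblem of minimizing 0.5 xᵀAx + bᵀx subject to ||x|| <= delta, it ignores the degenerate "hard case" of that subproblem, and the function name `solve_shared_trust_regions` and its parameters are hypothetical.

```python
import numpy as np

def solve_shared_trust_regions(A, bs, delta):
    """Solve a batch of trust region problems
        min_x  0.5 * x.T @ A @ x + b.T @ x   subject to  ||x|| <= delta,
    all sharing the same symmetric matrix A (only b varies), by reusing
    a single eigenvalue decomposition of A.  Sketch only: the degenerate
    "hard case" of the trust region subproblem is not handled.
    """
    lam, Q = np.linalg.eigh(A)              # A = Q diag(lam) Q.T, computed once
    solutions = []
    for b in bs:
        bt = Q.T @ b                        # linear term in the eigenbasis
        norm_at = lambda mu: np.linalg.norm(bt / (lam + mu))
        if lam[0] > 0 and norm_at(0.0) <= delta:
            mu = 0.0                        # interior minimizer is already feasible
        else:
            # Bisect for mu > max(0, -lam_min) with ||x(mu)|| = delta;
            # ||x(mu)|| is decreasing in mu on this interval.
            lo = max(0.0, -lam[0]) + 1e-12
            hi = lo + 1.0
            while norm_at(hi) > delta:      # grow hi until the root is bracketed
                hi *= 2.0
            for _ in range(100):
                mu = 0.5 * (lo + hi)
                if norm_at(mu) > delta:
                    lo = mu
                else:
                    hi = mu
        solutions.append(-Q @ (bt / (lam + mu)))
    return solutions
```

In the setting of the abstract, the matrix stays fixed within a time step while the linear terms vary with the HDMR parametrization, so the cubic-cost decomposition is paid only once per step.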
This paper suggests a new data compression algorithm based on Boolean minimization of binary data. On the compression side, the input bit stream is divided into 16-bit chunks, and a "sum of products" function is found for each chunk using the Quine-McCluskey algorithm. The minimized "sum of products" functions are stored in a file, to which Huffman coding is then applied. The resulting Huffman code is used to convert the original file into a compressed one. On the decompression side, the Huffman tree is used to recover the original file. Experimental results show that the proposed algorithm achieves an average saving ratio of around 50%. In addition, the worst case was investigated and a remedy for it was suggested. The proposed technique can be applied to various file formats, including images and videos.
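For concreteness, the Huffman stage of such a pipeline can be sketched as below. This is a hedged illustration rather than the paper's implementation: the Quine-McCluskey step is omitted, the input is assumed to be the already-serialized "sum of products" terms as bytes, and the names `huffman_code` and `compress` are hypothetical.

```python
import heapq
from collections import Counter

def huffman_code(data: bytes) -> dict:
    """Build a Huffman code (byte value -> bit string) for `data`."""
    freq = Counter(data)
    # Heap entries are (frequency, tie_breaker, tree); the unique tie_breaker
    # keeps the heap from ever comparing two trees directly.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate single-symbol input
        return {heap[0][2]: "0"}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tie, (left, right)))
        tie += 1
    code = {}
    def walk(tree, prefix=""):
        if isinstance(tree, tuple):          # internal node: recurse both ways
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                                # leaf: record the symbol's code
            code[tree] = prefix
    walk(heap[0][2])
    return code

def compress(data: bytes) -> str:
    """Encode `data` as a string of '0'/'1' characters (illustrative only)."""
    code = huffman_code(data)
    return "".join(code[b] for b in data)

# Hypothetical usage: `sop_bytes` stands in for the minimized, serialized
# sum-of-products terms produced by the (omitted) Quine-McCluskey stage.
sop_bytes = b"AB'C + A'BC + ABC'"
print(len(sop_bytes) * 8, "->", len(compress(sop_bytes)), "bits")
```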
This paper solves the problem of minimizing the Kullback divergence between a partially known and a completely known probability distribution. It considers two probability distributions of a random vector $(u_1, x_1, \ldots, u_T, x_T)$ on a sample space of dimension $2T$. One of the distributions is completely known, while the other is known only partially: only the conditional probability distributions of $x_\tau$ given $u_1, x_1, \ldots, u_{\tau-1}, x_{\tau-1}, u_\tau$ are known for $\tau = 1, \ldots, T$. Our objective is to determine the remaining conditional probability distributions of $u_\tau$ given $u_1, x_1, \ldots, u_{\tau-1}, x_{\tau-1}$ such that the Kullback divergence of the partially known distribution with respect to the completely known one is minimized. An explicit solution of this problem was found previously for Markovian systems by Kárný \cite{Karny:96a}. The general solution is given in this paper.
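For orientation, the minimizer in problems of this type is known from the fully probabilistic design literature to take a backward-recursive form. The display below is a sketch under assumed notation, with $d(\tau) = (u_1, x_1, \ldots, u_\tau, x_\tau)$ denoting the data history, $f$ the partially known and $g$ the completely known distribution; it is not a verbatim statement of the paper's result.

```latex
% Backward-recursion sketch: f^* minimizes the Kullback divergence of f from g.
\begin{align*}
  f^{*}(u_\tau \mid d(\tau-1))
    &= \frac{g(u_\tau \mid d(\tau-1))\,
             \exp\!\bigl[-\omega(u_\tau, d(\tau-1))\bigr]}
            {\gamma(d(\tau-1))}, \\
  \omega(u_\tau, d(\tau-1))
    &= \int f(x_\tau \mid u_\tau, d(\tau-1))
        \ln \frac{f(x_\tau \mid u_\tau, d(\tau-1))}
                 {g(x_\tau \mid u_\tau, d(\tau-1))\,\gamma(d(\tau))}
        \,\mathrm{d}x_\tau, \\
  \gamma(d(\tau-1))
    &= \int g(u_\tau \mid d(\tau-1))
        \exp\!\bigl[-\omega(u_\tau, d(\tau-1))\bigr]\,\mathrm{d}u_\tau,
    \qquad \gamma(d(T)) \equiv 1 .
\end{align*}
```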