This paper considers a fuzzy perceptron that has the same topological structure as the conventional linear perceptron. A learning algorithm based on a fuzzy δ rule is proposed for this fuzzy perceptron. The inner operations involved in the working process of this fuzzy perceptron are based on max-min logical operations rather than the conventional multiplication and summation. The initial values of the network weights are all fixed at 1. It is shown that each network weight is non-increasing during training and remains unchanged once it falls below 0.5. It is proved that the learning algorithm converges in a finite number of steps if the training patterns are fuzzily separable, which generalizes the corresponding classical convergence result for conventional linear perceptrons. Numerical experiments are provided to support the theoretical findings.
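The abstract does not spell out the fuzzy δ rule, so the following is only a rough sketch of the working process it describes: a thresholded max-min output, weights initialized at 1, and a hypothetical shrink-on-error update under which weights can only decrease and are frozen once they drop below 0.5. The error rule and the learning rate `eta` are assumptions, not the paper's actual algorithm.

```python
import numpy as np

def fuzzy_forward(w, x):
    # Max-min composition: output is max over i of min(w_i, x_i),
    # replacing the usual weighted sum of a linear perceptron.
    return np.max(np.minimum(w, x))

def train_fuzzy_perceptron(patterns, labels, eta=0.1, max_epochs=1000):
    # Weights start at 1, as stated in the paper. This hypothetical
    # delta-style rule only ever decreases weights on a false-positive
    # error, and skips weights already below 0.5 (so they stay frozen).
    w = np.ones(len(patterns[0]))
    for _ in range(max_epochs):
        updated = False
        for x, t in zip(patterns, labels):
            y = 1.0 if fuzzy_forward(w, x) >= 0.5 else 0.0
            if y > t:
                # Shrink only the weights that drove the output above
                # threshold and are still trainable (>= 0.5).
                active = (w >= 0.5) & (np.minimum(w, x) >= 0.5)
                w[active] -= eta
                updated = True
        if not updated:  # every pattern classified consistently
            break
    return w
```

On the toy pair `[0.9, 0.2]` (label 1) and `[0.3, 0.8]` (label 0), this sketch leaves the first weight at 1 and drives the second monotonically below 0.5, illustrating the non-increasing weight behaviour claimed in the paper.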
Intuitionistic fuzzy sets (IFSs) generalize fuzzy sets by adding an additional attribute, the non-membership degree. In this paper, a max-min intuitionistic fuzzy Hopfield neural network (IFHNN) is proposed by combining IFSs with Hopfield neural networks, and its stability is investigated. It is shown that for any given weight matrix and any given initial intuitionistic fuzzy pattern, the iteration process of the IFHNN converges to a limit cycle; under suitable additional conditions, it converges to a stable point within finitely many iterations. Finally, a kind of Lyapunov stability of the stable points of the IFHNN is proved: if the initial state of the network is close enough to a stable point, then the network states remain in a small neighborhood of that stable point. These stability results establish the convergence of the memory process of the IFHNN. A numerical example is provided to illustrate the Lyapunov stability of the IFHNN.
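One plausible reading of the max-min iteration can be sketched as follows: membership degrees are updated by max-min composition with a membership weight matrix, and non-membership degrees by the dual min-max composition (the exact operators and the pair of weight matrices are assumptions, not taken from the paper). Because each update can only produce values already present in the weights or the initial pattern, the state space visited by the orbit is finite, so the iteration must enter a limit cycle, which the sketch detects by remembering visited states.

```python
import numpy as np

def maxmin_step(W_mu, W_nu, mu, nu):
    # One synchronous update of a hypothetical max-min IFHNN:
    # membership degrees via max-min composition with W_mu,
    # non-membership degrees via the dual min-max composition with W_nu.
    mu_next = np.max(np.minimum(W_mu, mu[None, :]), axis=1)
    nu_next = np.min(np.maximum(W_nu, nu[None, :]), axis=1)
    return mu_next, nu_next

def iterate_to_cycle(W_mu, W_nu, mu, nu, max_iter=1000):
    # Max-min/min-max updates never create new values, so the orbit lives
    # in a finite state space and must revisit a state; the revisit marks
    # the entry point and length of the limit cycle (length 1 = stable point).
    seen = {}
    for t in range(max_iter):
        key = (tuple(mu), tuple(nu))
        if key in seen:
            return seen[key], t - seen[key]  # (entry time, cycle length)
        seen[key] = t
        mu, nu = maxmin_step(W_mu, W_nu, mu, nu)
    raise RuntimeError("no cycle detected within max_iter")
```

For the small example in the test below, the initial intuitionistic fuzzy pattern is already a fixed point of the composition, so the detected cycle has length 1, matching the stable-point case of the convergence result.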