The Artificial Immune Recognition System (AIRS) algorithm, inspired by the natural immune system, makes use of the training data to generate memory cells (or prototypes). These memory cells are used in the test phase to classify unseen data with the K-nearest neighbor (K-NN) algorithm. The performance of the AIRS algorithm, like that of other distance-based classifiers, is highly dependent on the distance function used to classify a test instance. In this paper, we present a new version of the AIRS algorithm, named Adaptive Distance AIRS (AD-AIRS), that uses an adaptive distance metric to improve the generalization accuracy of the basic AIRS algorithm. The adaptive distance metric is based on assigning weights to the evolved memory cells; these weights are then used in the test phase to classify test instances. In addition, the AD-AIRS algorithm uses clustering to modify the way memory cells are generated, so that each memory cell represents a group of similar instances (or antigens). A subset of the UCI datasets is used to evaluate the effectiveness of the proposed AD-AIRS algorithm in comparison with the basic AIRS. Experimental results show that AD-AIRS achieves higher accuracy with fewer memory cells than the basic AIRS algorithm.
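A minimal sketch of the test-phase idea described above, assuming each memory cell carries a scalar weight that rescales a Euclidean distance; the weighting form and all names here (`classify`, `cells`, `weights`) are illustrative, not the paper's exact formulation:

```python
import numpy as np

def classify(x, cells, labels, weights, k=3):
    """Weighted K-NN over evolved memory cells.

    Each memory cell represents a cluster of similar training
    instances (antigens); its weight rescales the distance so that
    heavier cells attract test instances more easily (an assumed
    form of the adaptive distance metric).
    """
    d = np.linalg.norm(cells - x, axis=1) / weights
    nearest = np.argsort(d)[:k]           # indices of the k closest cells
    votes = labels[nearest]
    return np.bincount(votes).argmax()    # majority vote among neighbors

# toy usage: three memory cells in 2-D with class labels and weights
cells = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
labels = np.array([0, 1, 1])
weights = np.array([1.0, 2.0, 0.5])
print(classify(np.array([0.9, 0.9]), cells, labels, weights, k=1))  # -> 1
```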
Estimation in a truncated parameter space is one of the most important problems in statistical inference, because the frequently used criterion of unbiasedness is useless here: in general, no unbiased estimator exists. Other optimality criteria, such as admissibility and minimaxity, therefore have to be sought. In this paper we consider a subclass of the exponential families of distributions. The Bayes estimator of a lower-bounded scale parameter under the squared-log error loss function, with a sequence of boundary-supported priors, is obtained. An admissible estimator of a lower-bounded scale parameter, which is the limiting Bayes estimator, is given. Another class of estimators of a lower-bounded scale parameter, called the truncated linear estimators, is also considered, and several interesting properties of the estimators in this class are studied. Some comparisons of the estimators in this class with an admissible estimator of a lower-bounded scale parameter are presented.
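For reference, the squared-log error loss mentioned above is commonly written as follows (a standard form; the abstract itself does not display it):
$$L(\theta,\delta)=\left(\log\delta-\log\theta\right)^{2}=\left(\log\frac{\delta}{\theta}\right)^{2},\qquad \theta,\delta>0,$$
which penalizes relative rather than absolute estimation error and is therefore natural for a scale parameter.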
In this paper, sandstones from three Czech localities were subjected to mechanical fracture tests in order to determine their properties. The Carboniferous sandstone from the Staříč site differed from the two Cretaceous sandstones from the Podhorní Újezd and Javorka localities primarily in the type of grain contact, as well as in the mineralogical composition of the rock matrix and cement. These differences were reflected mainly in different rock porosities. An advanced assessment of the fracture response of chevron-notched specimens made of these sandstones and subjected to the three-point bending test was carried out by means of the GTDiPS program, proposed for processing the loading diagrams. The bending Young's modulus, mode I fracture toughness, and fracture energy were subsequently calculated for all tested sandstone samples. The obtained results show that the sandstone from the Staříč mine exhibits several times higher values of the investigated properties than the Podhorní Újezd and Javorka sandstones. This is a result of a higher degree of rock compaction, siliceous rock cement and, therefore, a relatively low total porosity. The internal rock texture and the mineralogical composition of the matrix or cement are thus among the most important factors influencing the values of the mechanical fracture parameters of sandstones.
We say that a binary operation $*$ is associated with a (finite undirected) graph $G$ (without loops and multiple edges) if $*$ is defined on $V(G)$ and $uv\in E(G)$ if and only if $u\ne v$, $u*v=v$ and $v*u=u$ for any $u$, $v\in V(G)$. In the paper it is proved that a connected graph $G$ is geodetic if and only if there exists a binary operation associated with $G$ which fulfils a certain set of four axioms. (This characterization is obtained as an immediate consequence of a stronger result proved in the paper.)
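As an illustration of the defining condition only (not of the four axioms, which the abstract does not state), a small sketch that checks whether a given operation table is associated with a graph in the above sense; the representation of edges and the operation table is an assumption made for the example:

```python
from itertools import product

def is_associated(vertices, edges, op):
    """Check: uv is an edge  iff  u != v, op(u,v) == v and op(v,u) == u.

    `edges` is a set of frozensets {u, v}; `op` is the binary
    operation on the vertex set, given as a dict keyed by ordered pairs.
    """
    for u, v in product(vertices, repeat=2):
        edge = frozenset((u, v)) in edges
        cond = u != v and op[(u, v)] == v and op[(v, u)] == u
        if edge != cond:
            return False
    return True

# toy usage: the path graph a-b-c with a hypothetical operation table
# in which adjacent vertices "absorb" each other
V = ["a", "b", "c"]
E = {frozenset(("a", "b")), frozenset(("b", "c"))}
op = {
    ("a", "a"): "a", ("b", "b"): "b", ("c", "c"): "c",
    ("a", "b"): "b", ("b", "a"): "a",
    ("b", "c"): "c", ("c", "b"): "b",
    ("a", "c"): "a", ("c", "a"): "c",
}
print(is_associated(V, E, op))  # True for this table
```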
The Cantor-Bernstein-Schröder theorem of set theory was generalized by Sikorski and Tarski to $\sigma $-complete Boolean algebras, and recently by several authors to other algebraic structures. In this paper we present an abstract version which is applicable to algebras with an underlying lattice structure such that the central elements of this lattice determine a direct decomposition of the algebra. Necessary and sufficient conditions for the validity of the Cantor-Bernstein-Schröder theorem for these algebras are given. These results are applied to obtain versions of the Cantor-Bernstein-Schröder theorem for $\sigma $-complete orthomodular lattices, Stone algebras, $BL$-algebras, $MV$-algebras, pseudo $MV$-algebras, and Łukasiewicz and Post algebras of order $n$.
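For orientation, the classical set-theoretic statement being generalized is: if there exist injections $f\colon A\to B$ and $g\colon B\to A$, then there exists a bijection between $A$ and $B$.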
We present an algorithm, given in closed form, to generate a smooth curve interpolating a set of data on an n-dimensional ellipsoid. It is inspired by an algorithm based on a rolling and wrapping technique, described in \cite{fatima-knut-rolling} for data on a general manifold embedded in Euclidean space. Since the ellipsoid can be embedded in a Euclidean space, that algorithm can be implemented, at least theoretically. However, one of its basic steps consists in rolling the ellipsoid, over its affine tangent space at a point, along a curve. This would make it possible to project data from the ellipsoid to a space where interpolation problems are easily solved. Yet even if one chooses to roll along a geodesic, the fact that explicit forms for Euclidean geodesics on the ellipsoid are not known is a major obstacle to implementing the rolling part of the algorithm. To overcome this problem and achieve our goal, we embed the ellipsoid and its affine tangent space in $\mathbb{R}^{n+1}$ equipped with an appropriate Riemannian metric, so that geodesics are given in explicit form and, consequently, the kinematics of the rolling motion are easy to solve. By doing so, we can rewrite the algorithm so that it generates a smooth interpolating curve on the ellipsoid in closed form.
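One concrete way to obtain explicit geodesics, consistent with the idea above though not necessarily the paper's exact construction: pull back the round metric of the unit sphere through the linear map sending the ellipsoid to the sphere, so that geodesics on the ellipsoid are images of great circles. A sketch under that assumption (all names are illustrative):

```python
import numpy as np

def geodesic(p, q, a, t):
    """Point at parameter t on the geodesic from p to q on the
    ellipsoid sum((x_i/a_i)^2) = 1, in the metric pulled back from
    the round sphere via x -> x / a (an assumed, illustrative metric
    under which geodesics have a closed form)."""
    u, v = p / a, q / a                        # map both points to the unit sphere
    u /= np.linalg.norm(u)
    v /= np.linalg.norm(v)
    w = np.arccos(np.clip(u @ v, -1.0, 1.0))   # angle between the spherical images
    # slerp: the great-circle arc between u and v, then map back to the ellipsoid
    s = (np.sin((1 - t) * w) * u + np.sin(t * w) * v) / np.sin(w)
    return a * s

a = np.array([3.0, 2.0, 1.0])       # semi-axes of the ellipsoid
p = np.array([3.0, 0.0, 0.0])       # two points on the ellipsoid
q = np.array([0.0, 2.0, 0.0])
print(geodesic(p, q, a, 0.5))       # midpoint of the geodesic arc
```

The returned midpoint satisfies the ellipsoid equation exactly, since slerp stays on the unit sphere and the linear map sends the sphere back onto the ellipsoid.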
Standard Bidirectional Associative Memory (BAM) stores the sum of the correlation matrices of the pairs of patterns. When a pattern of an encoded pair is presented, the other is expected to be recalled. It has been shown that the standard BAM cannot correctly recall a pattern pair if it is not at a local minimum of the energy function. To overcome this problem, novel methods for encoding have been proposed. The efficient novel-encoding methods require knowledge of the interference noise in the standard BAM. In this paper, we propose an algorithm for computing the exact amount of interference noise in the standard encoding of BAM. The computational complexity of the algorithm is the same as that of computing the correlation matrix for the standard BAM.
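For context, a minimal sketch of standard BAM encoding and one recall step (the paper's interference-noise algorithm itself is not shown; patterns are assumed bipolar):

```python
import numpy as np

def encode(X, Y):
    """Standard BAM: sum of the correlation (outer-product) matrices
    of the bipolar pattern pairs (x_i, y_i)."""
    return sum(np.outer(x, y) for x, y in zip(X, Y))

def recall(W, x):
    """One forward pass: present x, recall its paired pattern."""
    return np.sign(x @ W)

# toy usage with two bipolar pattern pairs
X = [np.array([1, -1, 1, -1]), np.array([1, 1, -1, -1])]
Y = [np.array([1, -1, 1]), np.array([-1, 1, 1])]
W = encode(X, Y)
print(recall(W, X[0]))  # reproduces Y[0] here; in general, recall is
                        # corrupted by interference from the other pairs
```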
In this paper, a hybrid-regularizer model for Poissonian image restoration is introduced. We study the existence and uniqueness of the minimizer of this model. To solve the resulting minimization problem, we employ the alternating minimization method with a rigorous convergence guarantee. Numerical results demonstrate the efficiency and stability of the proposed method for suppressing Poisson noise.
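The abstract does not display the model; as orientation only, Poissonian restoration problems of this type typically combine the Poisson (Kullback-Leibler) data-fidelity term with two regularizers $R_1$, $R_2$:
$$\min_{u>0}\;\int_{\Omega}\bigl(u-f\log u\bigr)\,dx\;+\;\lambda_{1}R_{1}(u)\;+\;\lambda_{2}R_{2}(u),$$
where $f$ is the observed Poisson-corrupted image and $\lambda_1,\lambda_2>0$ balance fidelity against the two regularizers.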
Web applications have become a critical component of the global information infrastructure, and it is important that they be validated to ensure their reliability. Exploiting user session data is a promising approach to testing Web applications. However, the effectiveness of user-session testing depends on the set of collected user session data: the larger this set, the greater the capability of the approach to detect failures, but also the greater the cost of collecting, analyzing, and storing the data. In this paper, a technique for reducing a set of user sessions to an equivalent smaller one is implemented. This technique reduces a large set of user sessions to an equivalent reduced set of user sessions and pages that is sufficient to test a Web application effectively. The reduction is carried out for several Web applications, such as the TCENet Web application, a portal application, a social networking application, an online shopping application, and an online library, in order to validate the proposed technique; and our technique is compared with the HGS, Random Reduction, and Concept Lattice techniques to evaluate its efficiency.
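A minimal sketch of the kind of reduction involved, using a greedy coverage heuristic in the spirit of HGS-style test-suite reduction; the abstract does not specify the paper's exact reduction criterion, so page coverage is an assumption made for this example:

```python
def reduce_sessions(sessions):
    """Greedily pick a subset of user sessions whose visited pages
    cover the same set of pages as the full session set.

    `sessions` maps a session id to the set of pages it requests.
    """
    uncovered = set().union(*sessions.values())  # all pages still to cover
    reduced = []
    while uncovered:
        # pick the session covering the most still-uncovered pages
        best = max(sessions, key=lambda s: len(sessions[s] & uncovered))
        reduced.append(best)
        uncovered -= sessions[best]
    return reduced

# toy usage: four recorded sessions over five pages
sessions = {
    "s1": {"/home", "/login", "/cart"},
    "s2": {"/home", "/search"},
    "s3": {"/search", "/checkout"},
    "s4": {"/home"},
}
print(reduce_sessions(sessions))  # ['s1', 's3'] covers all five pages
```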