Cardiovascular dynamic and variability data are commonly used in experimental protocols involving cognitive challenge. The analysis is usually based on a single specific time resolution, more or less well motivated, ranging from a few seconds to several minutes. The present paper investigated in detail the impact of different time resolutions of the cardiovascular data on the interpretation of effects. We compared three template tasks involving different types of challenge in order to provide a case study of specific effects, and combinations of effects, over different time frames and at different time resolutions. Averaged values of hemodynamic variables across an entire protocol confirmed typical findings regarding the effects of mental challenge and social observation. However, the hemodynamic response also incorporates transient variations in these variables that reflect important features of the control-system response. The fine-grained analysis of the transient behavior of hemodynamic variables demonstrates that information important for interpreting effects may be lost when only average values over the entire protocol are used to represent the system response. The study provides useful indications of how cardiovascular measures may be fruitfully used in experiments involving cognitive demands, allowing inferences about the physiological processes underlying the responses. H. K. Lackner, J. J. Batzel, A. Rössler, H. Hinghofer-Szalkay, I. Papousek.
Total correlation (TC) and dual total correlation (DTC) are two classical ways to quantify the correlation among an n-tuple of random variables. Both reduce to mutual information when n = 2. The first part of this paper sets up the theory of TC and DTC for general random variables, not necessarily finite-valued; this generality has not previously been set out in the literature. The second part considers the structural implications when a joint distribution μ has small TC or DTC. If TC(μ) = o(n), then μ is close to a product measure in a suitable transportation metric: this follows directly from Marton's classical transportation-entropy inequality. If DTC(μ) = o(n), the structural consequence is more complicated: μ is a mixture of a controlled number of terms, most of them close to product measures in the transportation metric. This is the main new result of the paper.
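For finite-valued random variables the two quantities have simple entropy formulas: TC = Σ_i H(X_i) − H(X_1,…,X_n) and DTC = H(X_1,…,X_n) − Σ_i H(X_i | X_{≠i}). As an illustration of this finite special case only (not the paper's general construction), a minimal NumPy sketch:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a possibly multi-dimensional pmf."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def total_correlation(joint):
    """TC = sum_i H(X_i) - H(X_1,...,X_n)."""
    n = joint.ndim
    marg_sum = sum(entropy(joint.sum(axis=tuple(j for j in range(n) if j != i)))
                   for i in range(n))
    return marg_sum - entropy(joint)

def dual_total_correlation(joint):
    """DTC = H(X_1,...,X_n) - sum_i H(X_i | X_rest)
           = sum_i H(X_rest of i) - (n - 1) * H(X_1,...,X_n)."""
    n = joint.ndim
    rest_sum = sum(entropy(joint.sum(axis=i)) for i in range(n))
    return rest_sum - (n - 1) * entropy(joint)

# For n = 2 both quantities equal the mutual information I(X1; X2).
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
tc = total_correlation(joint)
dtc = dual_total_correlation(joint)
```

For this 2x2 joint distribution, TC and DTC coincide, as the abstract notes they must for n = 2.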
This paper studies a new model of social opinion dynamics in multi-agent systems by taking two important factors into account: individual susceptibility and the anchoring effect. Unlike many existing models that focus on only one factor, this model can exhibit not only agreement phenomena but also disagreement phenomena, such as clustering and fluctuation, during opinion evolution. We then provide several conditions showing how individual susceptibility and the anchoring effect shape steady-state behaviors in some specific situations, with rigorous mathematical analysis. Finally, we investigate the model in general situations via simulations.
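The abstract does not reproduce the model's update rule, so the sketch below uses a generic illustrative form, which is an assumption and not the paper's equations: each agent blends its current opinion with the population mean according to its susceptibility and is pulled back toward its initial opinion (the anchor) according to its anchoring strength. All parameter names and ranges are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps = 50, 200
x0 = rng.uniform(0, 1, n)                    # initial opinions, also the anchors
x = x0.copy()
susceptibility = rng.uniform(0.2, 0.8, n)    # hypothetical per-agent openness
anchoring = rng.uniform(0.0, 0.3, n)         # hypothetical pull toward x0

for _ in range(steps):
    social = x.mean()                        # fully mixed population for simplicity
    # Blend current opinion with the social signal, then pull toward the anchor.
    pulled = (1 - susceptibility) * x + susceptibility * social
    x = anchoring * x0 + (1 - anchoring) * pulled

spread = x.max() - x.min()                   # residual disagreement
```

Because every update is a convex combination of values in [0, 1], opinions stay in [0, 1]; with nonzero anchoring, full consensus is prevented and a residual spread of opinions persists, a simple instance of the disagreement phenomena the abstract mentions.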
The purpose of this article is to introduce multi-agent modelling as an area of research that has developed rapidly in sociology over the last fifteen years. The article starts by outlining some characteristics of multi-agent modelling and then covers the history of the sociological component of complexity science. In the following part, the fundamental concepts used in multi-agent modelling, such as model, agent, environment and emergence, are defined. Thereafter, the article focuses on the application of multi-agent modelling in sociology and identifies specific areas where it might be used productively. An illustrative example of a multi-agent model called ‘Slumulation’, which explores how slums emerge in cities, is described. Finally, the advantages and limits of this approach are summarized. Anna Krčková.
The article presents the construction of an agent-based model of segregation step by step and is intended as a tutorial for the reader’s first steps with agent-based modeling. The model is programmed in the NetLogo software and provided in two versions: first as an online executable version, for first-impression purposes, and second as NetLogo code, for serious experiments and further model improvements by the reader. The article describes the user interface and source code of the model in close detail; most of it is dedicated to a careful, in-depth explanation of the NetLogo code. The model aims to answer Schelling’s classical question: "Is it possible to obtain an ethnically segregated structure of a town with relatively tolerant inhabitants?" It also asks: "Does the size of the recognized neighbourhood suppress the tendency to segregation?" Analysis of the data produced by the model shows that the tendency to segregation decreased with a larger recognized neighbourhood: the larger the neighbourhood, the lower the number of inhabitants living in an ethnically homogeneous neighbourhood. However, the size of the recognized neighbourhood did not moderate the relationship between intolerance and the tendency to segregation; the slope of that relationship remained the same (or was even steeper for larger neighbourhoods). František Kalvas.
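The article's model is written in NetLogo; the following is a rough Python re-implementation of the same Schelling dynamic, with `intolerance` and `radius` (the size of the recognized neighbourhood) as the two parameters discussed above. Grid size, thresholds and the relocation rule are illustrative choices, not the article's exact settings:

```python
import numpy as np

rng = np.random.default_rng(1)

def schelling(size=30, empty_frac=0.1, intolerance=0.35, radius=1, steps=60):
    """Agents of two groups (1, 2) relocate to a random empty cell (0)
    while the share of like neighbours within `radius` is below `intolerance`."""
    cells = size * size
    n_agents = cells - int(cells * empty_frac)
    grid = np.zeros(cells, dtype=int)
    grid[:n_agents // 2] = 1
    grid[n_agents // 2:n_agents] = 2
    rng.shuffle(grid)
    grid = grid.reshape(size, size)

    def like_share(r, c):
        r0, r1 = max(0, r - radius), min(size, r + radius + 1)
        c0, c1 = max(0, c - radius), min(size, c + radius + 1)
        hood = grid[r0:r1, c0:c1]
        occupied = np.count_nonzero(hood) - 1          # exclude self
        if occupied <= 0:
            return 1.0
        same = np.count_nonzero(hood == grid[r, c]) - 1
        return same / occupied

    for _ in range(steps):
        moved = False
        for r in range(size):
            for c in range(size):
                if grid[r, c] and like_share(r, c) < intolerance:
                    empties = np.argwhere(grid == 0)
                    er, ec = empties[rng.integers(len(empties))]
                    grid[er, ec], grid[r, c] = grid[r, c], 0
                    moved = True
        if not moved:
            break

    # Segregation index: mean share of like neighbours over all agents.
    shares = [like_share(r, c) for r in range(size)
              for c in range(size) if grid[r, c]]
    return float(np.mean(shares))

seg_small = schelling(radius=1)   # small recognized neighbourhood
seg_large = schelling(radius=3)   # larger recognized neighbourhood
```

Even with the modest 35% intolerance used here, the small-neighbourhood run ends well above the roughly 50% like-neighbour share of a random mix, which is Schelling's classical point.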
Blur is a common problem that limits the effective resolution of many imaging systems. In this article, we give a general overview of methods that can be used to reduce blur, covering the classical multi-channel deconvolution problem as well as challenging extensions to spatially varying blur. The proposed methods are formulated as energy minimization problems with specific regularization terms on images and blurs. Experiments on real data illustrate the very good and stable performance of the methods.
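As a toy instance of the energy-minimization formulation, using the simplest quadratic regularizer on the image rather than the article's specific regularization terms, single-channel deconvolution with a known, spatially invariant blur can be solved in closed form in the Fourier domain:

```python
import numpy as np

def regularized_deconvolve(blurred, psf, lam=1e-3):
    """Minimize ||h * x - y||^2 + lam * ||x||^2 assuming periodic boundaries.
    The Fourier-domain minimizer is X = conj(H) Y / (|H|^2 + lam)."""
    H = np.fft.fft2(psf, s=blurred.shape)
    Y = np.fft.fft2(blurred)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft2(X))

# Synthetic check: blur a random image with a 5x5 box kernel, then restore.
rng = np.random.default_rng(2)
img = rng.uniform(size=(64, 64))
psf = np.ones((5, 5)) / 25.0
H = np.fft.fft2(psf, s=img.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
restored = regularized_deconvolve(blurred, psf, lam=1e-3)

err_blur = np.linalg.norm(blurred - img)
err_rest = np.linalg.norm(restored - img)
```

The regularization weight `lam` keeps the division stable at frequencies where the box kernel's transfer function is nearly zero; without it the inverse filter would amplify those frequencies without bound.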
In social insects, the high variability in the number of queens per colony raises fundamental questions about the evolution of altruism. It is hypothesized, for instance, that nestmate recognition should be less efficient in polygynous than in monogynous colonies because the presence of several breeders increases the diversity of genetically determined recognition cues, leading to a less specific colonial signature. Recent studies, however, have shown that the link between the number of queens in a colony and the recognition abilities of its members is more complex than previously suggested. Here, we studied intraspecific aggression, diversity of potential recognition cues and genetic structure of colonies in the highly polygynous ant Crematogaster pygmaea. Our results reveal that workers of this species are clearly aggressive towards non-nestmates in field experiments but not in more artificial bioassays conducted in Petri dishes, underscoring the importance of context-dependent aspects of the assessment of nestmate recognition. Behavioural, genetic and chemical data show that C. pygmaea is a multicolonial species, forming spatially restricted and well-defined entities. Therefore, the postulated negative correlation between recognition ability of workers and queen number in a colony is not supported by the results of this study. Rachid Hamidi ... [et al.].
Fluorescence images of leaves of sugar beet plants (Beta vulgaris L. cv. Patricia) grown on an experimental field with different fertilisation doses of nitrogen [0, 3, 6, 9, 12, 15 g(N) m-2] were taken, applying a new multicolour flash-lamp fluorescence imaging system (FL-FIS). Fluorescence was excited by the UV-range (280-400 nm, λmax = 340 nm) of a pulsed Xenon lamp. The images were acquired successively in the four fluorescence bands of leaves near 440, 520, 690, and 740 nm (F440, F520, F690, F740) by means of a CCD camera. Parallel measurements were performed to characterise the physiological state of the leaves (nitrogen content, invert sugars, chlorophylls and carotenoids, as well as chlorophyll fluorescence induction kinetics and beet yield). The fluorescence images indicated a differential local patchiness across the leaf blade for the four fluorescence bands. The blue (F440) and green fluorescence (F520) were high in the leaf veins, whereas the red (F690) and far-red (F740) chlorophyll (Chl) fluorescences were more pronounced in the intercostal leaf areas. Sugar beet plants with high N supply could be distinguished from beet plants with low N supply by lower values of F440/F690 and F440/F740. Both the blue-green fluorescence and the Chl fluorescence rose with higher N application; this increase was more pronounced for the Chl fluorescence than for the blue-green one. The results demonstrate that fluorescence ratio imaging of leaves can be applied for non-destructive monitoring of differences in nitrogen supply. The FL-FIS is a valuable diagnostic tool for screening site-specific differences in N-availability, which is required for precision farming. G. Langsdorf ... [et al.].
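The ratio images F440/F690 and F440/F740 mentioned above are simple pixel-wise quotients of the band images. A minimal sketch on synthetic data (all intensity ranges below are invented for illustration; only the direction of the effect, a lower F440/F690 ratio at higher Chl fluorescence, follows the abstract):

```python
import numpy as np

def fluorescence_ratios(f440, f690, f740, eps=1e-6):
    """Pixel-wise ratio images F440/F690 and F440/F740.
    `eps` guards against division by zero in dark background pixels."""
    return f440 / (f690 + eps), f440 / (f740 + eps)

# Two synthetic 8x8 "leaves": the high-N leaf has stronger chlorophyll
# (red) fluorescence, so its F440/F690 ratio comes out lower.
rng = np.random.default_rng(4)
f440 = rng.uniform(0.8, 1.0, (8, 8))
f690_low_n = rng.uniform(0.5, 0.7, (8, 8))    # weak Chl fluorescence
f690_high_n = rng.uniform(1.2, 1.5, (8, 8))   # strong Chl fluorescence
r_low, _ = fluorescence_ratios(f440, f690_low_n, f690_low_n)
r_high, _ = fluorescence_ratios(f440, f690_high_n, f690_high_n)
```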
Image de-noising is a practical application of image processing. Both linear and nonlinear filters are used for noise reduction. The filters realizable in Łukasiewicz algebra with square root were analyzed first and then used for 2D image de-noising. A set of quality measures is recommended for the evaluation of de-noising quality; given various quality measures, the best filter can be found, and the Pareto optimality principle and the AIA technique were used for this purpose. The procedures were demonstrated on a set of MRI biomedical images.
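A minimal sketch of the evaluation idea, assuming nothing about the Łukasiewicz-algebra filters themselves: compare a linear (mean) and a nonlinear (median) 3x3 filter under two quality measures and keep the Pareto-optimal candidates. The test image, noise model and measures are illustrative stand-ins:

```python
import numpy as np

rng = np.random.default_rng(3)

def shifts3x3(img):
    """Stack the 9 one-pixel shifts of `img` (edge-padded), one per 3x3 window position."""
    p = np.pad(img, 1, mode="edge")
    return np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                     for i in range(3) for j in range(3)])

def mean_filter(img):                       # linear filter
    return shifts3x3(img).mean(axis=0)

def median_filter(img):                     # nonlinear filter
    return np.median(shifts3x3(img), axis=0)

# Synthetic test image: a smooth ramp corrupted by 5% impulse ("salt") noise.
clean = np.linspace(0, 1, 64)[None, :] * np.ones((64, 1))
noisy = clean.copy()
noisy[rng.uniform(size=clean.shape) < 0.05] = 1.0

def mse(a, b): return float(np.mean((a - b) ** 2))
def mae(a, b): return float(np.mean(np.abs(a - b)))

candidates = {"identity": noisy,
              "mean": mean_filter(noisy),
              "median": median_filter(noisy)}
scores = {k: (mse(v, clean), mae(v, clean)) for k, v in candidates.items()}

# A filter is Pareto-optimal if no other filter is at least as good on
# both quality measures and strictly better on at least one.
pareto = [k for k, s in scores.items()
          if not any(all(o <= m for o, m in zip(scores[j], s)) and scores[j] != s
                     for j in scores if j != k)]
```

On impulse noise the nonlinear median filter dominates here, which is why evaluating several quality measures jointly, rather than a single score, is needed before declaring a filter "best".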