Updating probabilities with information about only one hypothesis, thereby ignoring alternative hypotheses, is not only biased but also leads to progressively imprecise conclusions. In psychology this phenomenon was studied in experiments with the ``pseudodiagnosticity task''. In probability logic, the phenomenon that additional premises increase the imprecision of a conclusion is known as ``degradation''. The present contribution investigates degradation in the context of second-order probability distributions. It uses beta distributions as marginals and copulae together with C-vines to represent dependence structures. It demonstrates that, in Bayes' theorem, the posterior distributions of the lower and upper probabilities approach 0 and 1 as more and more likelihoods belonging to only one hypothesis are included in the analysis.
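As a rough numerical illustration of this degradation effect, the following Python sketch replaces the second-order beta/C-vine machinery with simple interval bounds: the likelihoods under the focal hypothesis are taken as known, while the unreported likelihoods under the alternative hypothesis are only assumed to lie in a hypothetical interval. All numbers, bounds, and function names below are illustrative assumptions, not values or notation from the study.

```python
# A minimal numerical sketch of degradation under one-sided (pseudodiagnostic)
# updating. It simplifies the second-order beta/C-vine analysis to interval
# bounds: P(D_i | H) is assumed known, while P(D_i | not-H) is only assumed to
# lie in a hypothetical interval [alt_lo, alt_hi]. All values are invented.
import numpy as np

def posterior_interval(prior, lik_h, alt_lo=0.1, alt_hi=0.9):
    """Lower/upper posterior of H when the alternative likelihoods are unknown."""
    n = len(lik_h)
    joint_h = prior * np.prod(lik_h)
    lower = joint_h / (joint_h + (1 - prior) * alt_hi ** n)  # alternative as strong as allowed
    upper = joint_h / (joint_h + (1 - prior) * alt_lo ** n)  # alternative as weak as allowed
    return lower, upper

rng = np.random.default_rng(0)
lik_h = rng.uniform(0.5, 0.9, size=10)            # hypothetical P(D_i | H) values

for n in range(1, len(lik_h) + 1):
    lo, hi = posterior_interval(prior=0.5, lik_h=lik_h[:n])
    print(f"n = {n:2d}   posterior interval = [{lo:.3f}, {hi:.3f}]")
```

With these toy numbers the posterior interval stretches toward [0, 1] as more one-sided likelihoods are conditioned on, which is the interval-valued analogue of the effect described above for the posterior distributions of the lower and upper probabilities.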
In contrast to most other approaches to representing multidimensional probability distributions, which are based on graphical Markov modelling (i.e. the dependence structure of a distribution is represented by a graph), the method described here is procedural. We describe a process by which a multidimensional distribution can be composed from a “generating sequence”: a sequence of low-dimensional distributions. The main advantage of this approach is that the same apparatus, based on operators of composition, can be applied to the description of both probabilistic and possibilistic models.
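As a sketch of what one composition step might look like for discrete probabilities, the code below assumes the usual definition of the operator of composition, (p1 ▷ p2)(x) = p1(x_{K1}) · p2(x_{K2}) / p2(x_{K1∩K2}), and applies it to two invented two-dimensional tables; the `Dist` container, variable names, and numbers are assumptions made for illustration only, not the paper's notation.

```python
# A sketch of the operator of composition for discrete distributions, assuming
# the definition (p1 |> p2)(x) = p1(x_K1) * p2(x_K2) / p2(x_{K1 n K2}),
# which is defined whenever p2's marginal on the shared variables dominates
# p1's. The Dist container and the example tables are illustrative only.

class Dist:
    """A discrete distribution: a list of variable names plus a probability table."""
    def __init__(self, variables, table):
        self.variables = list(variables)   # e.g. ["X", "Y"]
        self.table = dict(table)           # {(x_value, y_value): probability, ...}

    def marginal(self, subset):
        idx = [self.variables.index(v) for v in subset]
        out = {}
        for values, p in self.table.items():
            key = tuple(values[i] for i in idx)
            out[key] = out.get(key, 0.0) + p
        return Dist(subset, out)

def compose(p1, p2):
    """Return p1 |> p2: a distribution over the union of variables that keeps
    p1 exactly and borrows from p2 the behaviour of its extra variables."""
    shared = [v for v in p1.variables if v in p2.variables]
    m2 = p2.marginal(shared)
    new_vars = p1.variables + [v for v in p2.variables if v not in p1.variables]
    table = {}
    for vals1, q1 in p1.table.items():
        a1 = dict(zip(p1.variables, vals1))
        denom = m2.table.get(tuple(a1[v] for v in shared), 0.0)
        if q1 > 0.0 and denom == 0.0:
            raise ValueError("composition undefined: dominance condition violated")
        for vals2, q2 in p2.table.items():
            a2 = dict(zip(p2.variables, vals2))
            if any(a1[v] != a2[v] for v in shared):
                continue                   # assignments must agree on shared variables
            full = {**a2, **a1}
            key = tuple(full[v] for v in new_vars)
            if denom > 0.0:
                table[key] = table.get(key, 0.0) + q1 * q2 / denom
    return Dist(new_vars, table)

# First step of assembling a joint model from a generating sequence:
# compose p1(X, Y) with p2(Y, Z) to obtain a distribution over (X, Y, Z).
p1 = Dist(["X", "Y"], {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4})
p2 = Dist(["Y", "Z"], {(0, 0): 0.2, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.3})
joint = compose(p1, p2)
print(round(sum(joint.table.values()), 10))   # 1.0: the result is a proper distribution
```

Iterating such a composition step along a generating sequence of low-dimensional tables is the procedural construction referred to above.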