I.I.1 Definitions of probability
First we want to define the concept of probability. This is quite
difficult, since there is considerable disagreement on the matter.
Several definitions of probability may be found in the literature
(ZELLNER in GRILICHES, ZVI and INTRILIGATOR, MICHAEL D. (eds.)):
One may think of a set of different possible events as a
probability space. This space represents all possible events, so at
least one event in the space must occur. This can also be expressed
by saying that the probability of the space itself equals one.
On the other hand, the probability of one specific event occurring
always lies between 0 and 1. We denote this as

0 ≤ P(Event1) ≤ 1

(where P is used as a symbol for probability, and Event1 is a
specific event in our space). Likewise, the probability of
nonoccurrence (i.e. failure) is denoted by

P(~Event1) = 1 - P(Event1).

Thus, the sum of the probabilities of success and failure of a
specific event equals one:

P(Event1) + P(~Event1) = 1.
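As an illustrative sketch (the die, the event, and the helper `prob` are assumptions introduced here, not part of the text), the complement rule above can be checked numerically:

```python
# Numeric check of the complement rule P(E) + P(~E) = 1,
# using a fair six-sided die as an illustrative probability space.
space = {1, 2, 3, 4, 5, 6}          # all possible outcomes
event = {2, 4, 6}                   # Event1: "the roll is even"

def prob(e, omega):
    """Probability of event e under a uniform distribution on omega."""
    return len(e & omega) / len(omega)

p_success = prob(event, space)              # P(Event1)
p_failure = prob(space - event, space)      # P(~Event1)

assert 0 <= p_success <= 1                  # probability lies in [0, 1]
assert p_success + p_failure == 1           # success + failure sum to one
print(p_success, p_failure)                 # 0.5 0.5
```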
The following definitions
are commonly used and facilitate nomenclature:

negation: ~E_{1} denotes the proposition that E_{1} is false   (I.I.11)

intersection or conjunction: E_{1}E_{2} denotes the proposition that
E_{1} and E_{2} are both true   (I.I.12)

union or disjunction: E_{1} + E_{2} denotes the proposition that
E_{1} or E_{2} (or both) is true   (I.I.13)

multiple intersection or conjunction: E_{1}E_{2}...E_{n}   (I.I.14)
(all are true)

multiple union or disjunction: E_{1} + E_{2} + ... + E_{n}   (I.I.15)
(at least one is true)
exclusiveness: propositions E_{i} for i = 1, 2, 3, ..., n are
exclusive on data D if at most one of these propositions can be true
given data D (i.e. only one of them is true given D, or none of
them is true given D)

exhaustiveness: propositions E_{i} for i = 1, 2, 3, ..., n are
exhaustive on data D if at least one of these propositions is true
given D

if exactly one proposition E_{i} (for i = 1, 2, 3, ..., n) is
true given D, these propositions are said to be exclusive and
exhaustive
Furthermore, it is very important to keep in mind that
the probability of the union of several mutually exclusive events of
the space equals the sum of the individual probabilities of these
events. This can symbolically be written as:

P(E_{1} + E_{2} + ... + E_{n}) = P(E_{1}) + P(E_{2}) + ... + P(E_{n})   (I.I.16)
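A small numeric sketch of this sum rule (the die and the three pairwise exclusive events are illustrative assumptions, not from the text):

```python
# Sum rule (I.I.16): for mutually exclusive events, the probability of
# the union equals the sum of the individual probabilities.
# Fair six-sided die; the three events below share no outcomes.
space = {1, 2, 3, 4, 5, 6}
e1, e2, e3 = {1}, {2, 3}, {4}       # pairwise exclusive events

def prob(e, omega):
    """Probability of event e under a uniform distribution on omega."""
    return len(e & omega) / len(omega)

lhs = prob(e1 | e2 | e3, space)                          # P(E1 + E2 + E3)
rhs = prob(e1, space) + prob(e2, space) + prob(e3, space)
assert abs(lhs - rhs) < 1e-12                            # 4/6 on both sides
```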
As in the description of exclusiveness and
exhaustiveness, probabilities may exist in a conditional form. Let C
and X denote two events; then P(C|X) denotes the probability of C
given that X is true. If the validity of X has no influence on C,
then C is independent of X. This is denoted as:

P(C|X) = P(C)   (I.I.17)
In case X has an influence on the
occurrence of C (dependence), we write

P(C|X) = P(C X) / P(X)   (I.I.18)

where P(C X) is the probability of C and X occurring at
the same time.
This property is so important that the
reader is asked to bear it in mind! It is also very useful in
proving Bayes' theorem.
In the special case where P(C|X) = P(C),
eq. (I.I.18) can be rewritten as P(C X) =
P(C) P(X) (if C and X are independent events).
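The definition of conditional probability can be sketched on the same illustrative die space (the choice of events C and X here is an assumption for demonstration only):

```python
# Conditional probability (I.I.18): P(C|X) = P(C X) / P(X),
# with C = "roll is even" and X = "roll is greater than 3" on a fair die.
space = {1, 2, 3, 4, 5, 6}
C = {2, 4, 6}
X = {4, 5, 6}

def prob(e, omega):
    """Probability of event e under a uniform distribution on omega."""
    return len(e & omega) / len(omega)

p_joint = prob(C & X, space)          # P(C X) = 2/6
p_cond = p_joint / prob(X, space)     # P(C|X) = (2/6)/(3/6) = 2/3

# P(C|X) = 2/3 differs from P(C) = 1/2, so here C depends on X;
# for independent events the joint would factor: P(C X) = P(C) P(X).
assert abs(p_cond - 2 / 3) < 1e-12
```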
The generalized form of eq.
(I.I.18)
is

P(E_{1}E_{2}...E_{n}) = P(E_{1}) P(E_{2}|E_{1}) P(E_{3}|E_{1}E_{2}) ... P(E_{n}|E_{1}E_{2}...E_{n-1})   (I.I.19)

and in the special case where all events are independent

P(E_{1}E_{2}...E_{n}) = P(E_{1}) P(E_{2}) ... P(E_{n})   (I.I.110)
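The chain rule (I.I.19) can be verified numerically for n = 3 on the same kind of illustrative space (the nested events E1, E2, E3 are assumptions chosen for the example):

```python
# Chain rule (I.I.19): P(E1 E2 E3) = P(E1) P(E2|E1) P(E3|E1 E2),
# checked on a fair six-sided die with illustrative events.
space = {1, 2, 3, 4, 5, 6}
E1, E2, E3 = {1, 2, 3, 4}, {2, 3, 4, 5}, {3, 4}

def prob(e, omega):
    """Probability of event e under a uniform distribution on omega."""
    return len(e & omega) / len(omega)

def cond(a, b, omega):
    """Conditional probability P(a|b) = P(a b) / P(b)."""
    return prob(a & b, omega) / prob(b, omega)

lhs = prob(E1 & E2 & E3, space)       # P(E1 E2 E3) = |{3, 4}| / 6 = 2/6
rhs = prob(E1, space) * cond(E2, E1, space) * cond(E3, E1 & E2, space)
assert abs(lhs - rhs) < 1e-12         # 4/6 * 3/4 * 2/3 = 2/6
```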
