Introduction to Econometrics - Definitions and Properties of Random Variables


Many statistical models may be represented by

    (I.III-1)    M = \{ S, P_\theta, \Theta \}

with the three elements: the sample space S of possible observations, a probability measure

    (I.III-2)    P_\theta

defined on S and indexed by the parameter \theta, and the parameter space \Theta of admissible parameter values.

Sometimes it is possible to describe the probability measure in (I.III-2) by the probability density function (pdf)

    f(x; \theta)

If a sample of observations x_1, \dots, x_n is independent and identically distributed (iid), then the joint pdf can be written as the product

    f(x_1, \dots, x_n; \theta) = \prod_{i=1}^{n} f(x_i; \theta)

In Bayesian statistics an additional element is available. A prior pdf

    p(\theta)

is introduced into the model, so that the pdf of the observations is combined with the prior as

    f(x_1, \dots, x_n; \theta)\, p(\theta)

The prior pdf represents prior information about the possible parameter values (available before, and without using, the observations).



The aim of much statistical research is to provide adequate theories and procedures for drawing conclusions from such models.

A variable is called a random (or stochastic) variable if its possible values occur with different probabilities. A random variable therefore always has a probability density function (pdf) and a probability distribution.

The relationship between distributions and probabilities can be defined as

    P(a \le X \le b) = \int_{a}^{b} f(X)\, dX

where f(X) is the probability density function.

A cumulative probability distribution is defined as follows

    F(x) = P(X \le x) = \sum_{X_i \le x} f(X_i)

for discrete distributions, or as

    F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\, dt

for continuous distributions.
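As an illustrative sketch of the discrete case (the fair six-sided die and the helper `cdf` below are assumed examples, not part of the original text), the cumulative distribution is simply the running sum of the pmf:

```python
# Cumulative distribution of a fair six-sided die, built from its pmf.
# F(x) = P(X <= x) = sum of f(k) over all support points k <= x.
pmf = {k: 1 / 6 for k in range(1, 7)}   # f(k): each face equally likely

def cdf(x):
    """P(X <= x) for the die."""
    return sum(prob for k, prob in pmf.items() if k <= x)

# cdf(0) is 0 (below the support), cdf(3) is 1/2, cdf(6) is 1,
# illustrating the three properties of ogives listed below.
```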

There are three properties of cumulative probability distributions (also called ogives):

  1. F(x) is non-decreasing in x

  2. F(x) tends to 0 as x tends to minus infinity

  3. F(x) tends to 1 as x tends to plus infinity

Furthermore there is the very important notion of expectation, which we will now define as

    (I.III-12)    E(X) = \sum_{i} X_i f(X_i)

or, for a function g of X,

    (I.III-13)    E(g(X)) = \sum_{i} g(X_i) f(X_i)

according to Jeffreys' definition. Both expressions (I.III-12) and (I.III-13) are formulated for discrete variables.

The analogous definitions for continuous variables are

    (I.III-14)    E(X) = \int_{-\infty}^{+\infty} X f(X)\, dX

and

    (I.III-15)    E(g(X)) = \int_{-\infty}^{+\infty} g(X) f(X)\, dX

according to Jeffreys.

Some very interesting properties of expectations have to be considered in order to provide us with a sound mathematical basis for several proofs and theorems to be discussed in the following chapters:

  1. E(a X + b) = a E(X) + b, where E(b) = b

  2. E(g(X) + h(X)) = E(g(X)) + E(h(X))

  3. E(X + Y) = E(X) + E(Y)

  4. E(X Y) = E(X) E(Y) if X and Y are independent (the converse does not hold in general)

(a and b are real numbers).

The mathematical expectation can be thought of as the long-run average value of a variable (its center of gravity); it is not necessarily its most probable value.
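These properties can be checked numerically. The following sketch (with arbitrarily chosen constants and simulated samples; all names are illustrative) approximates each expectation by a sample mean:

```python
import random

random.seed(0)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(5, 2) for _ in range(n)]   # drawn independently of xs

mean = lambda v: sum(v) / len(v)
a, b = 3.0, 2.0

# Property 1: E(aX + b) = a E(X) + b
lhs1, rhs1 = mean([a * x + b for x in xs]), a * mean(xs) + b

# Property 3: E(X + Y) = E(X) + E(Y)  -- holds with no independence needed
lhs3, rhs3 = mean([x + y for x, y in zip(xs, ys)]), mean(xs) + mean(ys)

# Property 4: E(XY) = E(X) E(Y)  -- relies on xs and ys being independent
lhs4, rhs4 = mean([x * y for x, y in zip(xs, ys)]), mean(xs) * mean(ys)
```

Properties 1 and 3 hold exactly for sample means; property 4 holds only approximately here, since the sample covariance of two independent draws is close to, but not exactly, zero.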

The variance of a random variable is another very important property of probability distributions: it is a measure of the spread of a stochastic variable. The easiest way to define the variance is by means of mathematical expectations, as

    (I.III-16)    V(X) = E[(X - E(X))^2]

or equivalently

    (I.III-17)    V(X) = E(X^2) - (E(X))^2

Equation (I.III-17) can be proved using (I.III-16) and the four properties of expectations as follows:

    V(X) = E[X^2 - 2 X E(X) + (E(X))^2] = E(X^2) - 2 E(X) E(X) + (E(X))^2 = E(X^2) - (E(X))^2

Of course variances have properties similar to the properties of expectations:

    V(a X + b) = a^2 V(X)

    V(X + Y) = V(X) + V(Y) \quad \text{for independent } X \text{ and } Y
It can be shown that centered moments can quite easily be computed from uncentered moments:

    (I.III-20)    \mu_r = E[(X - \mu_1')^r] = \sum_{j=0}^{r} \binom{r}{j} (-\mu_1')^j\, \mu_{r-j}'

where \mu_r' = E(X^r) denotes the r-th uncentered moment (see also eq. (I.III-17), which is the case r = 2).
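The simplest instance of this conversion is the variance shortcut (I.III-17), together with the scaling property V(aX + b) = a^2 V(X). A numerical sketch with simulated data (all constants illustrative):

```python
import random

random.seed(1)
xs = [random.gauss(10, 3) for _ in range(100_000)]
mean = lambda v: sum(v) / len(v)

# Definition (I.III-16): V(X) = E[(X - E(X))^2]
m = mean(xs)
v_def = mean([(x - m) ** 2 for x in xs])

# Shortcut (I.III-17): V(X) = E(X^2) - (E(X))^2
v_short = mean([x * x for x in xs]) - m ** 2

# Scaling property: V(aX + b) = a^2 V(X); the shift b drops out entirely
a, b = 2.0, 7.0
v_scaled = mean([(a * x + b - (a * m + b)) ** 2 for x in xs])
```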

A probability distribution function can be characterized by its centered and uncentered moments (e.g., mean, variance, ...). Therefore a general method of deriving uncentered moments would be a very nice thing to have (the centered moments would then be computed from the conversion formulae in (I.III-20)). This is not wishful thinking, since the moment generating function of a discrete stochastic variable,

    (I.III-21)    m(t) = E(e^{tX}) = \sum_{i} e^{t X_i} f(X_i)

generates the uncentered moments through differentiation:

    (I.III-22)    \frac{d^r m(t)}{dt^r} \bigg|_{t=0} = E(X^r)

Eq. (I.III-22) can be proved by expanding the exponential in (I.III-21) as a power series:

    m(t) = E\left(1 + tX + \frac{(tX)^2}{2!} + \dots\right) = 1 + t\, E(X) + \frac{t^2}{2!} E(X^2) + \dots

so that the r-th derivative, evaluated at t = 0, equals E(X^r).
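As a symbolic sketch (assuming the SymPy library is available; the Bernoulli example is an assumption, chosen for simplicity), the mgf of a Bernoulli(p) variable, m(t) = (1 - p) + p e^t, yields its uncentered moments by differentiation at t = 0:

```python
import sympy as sp

t, p = sp.symbols('t p')
# mgf of a Bernoulli(p) variable: m(t) = E(e^{tX}) = (1 - p) + p e^t
m = (1 - p) + p * sp.exp(t)

EX  = sp.diff(m, t, 1).subs(t, 0)   # first uncentered moment:  E(X)   = p
EX2 = sp.diff(m, t, 2).subs(t, 0)   # second uncentered moment: E(X^2) = p
VX  = sp.expand(EX2 - EX**2)        # centered via (I.III-17):  p - p^2
```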

The covariance between two random variables can be defined easily using mathematical expectations

    \mathrm{Cov}(X, Y) = E[(X - E(X)) (Y - E(Y))]

or equivalently

    \mathrm{Cov}(X, Y) = E(X Y) - E(X) E(Y)

whereas the correlation is derived from the covariance as

    (I.III-26)    \rho_{XY} = \frac{\mathrm{Cov}(X, Y)}{\sqrt{V(X)\, V(Y)}}

The correlation between X and Y, as defined in (I.III-26), lies between -1 and 1.

Intuitively, the covariance and the correlation can be thought of as measures of the collinearity of the points in a scatter plot (a plot of variable Y against the corresponding values of X). The only difference between them is that the correlation has been standardized and is therefore independent of the units of measurement of both variables.

Beyond this, it is important to understand the following relationship

    (I.III-27)    X, Y \text{ independent} \;\Longrightarrow\; \mathrm{Cov}(X, Y) = 0, \text{ but not conversely}

which clearly states that the implication is valid in one direction only: the covariance and the correlation are by definition measures of linear relationships, while the dependence of two random variables can be of any kind (linear or nonlinear)!
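A standard counterexample, sketched here with simulated data: take Y = X^2 for a symmetric X. Then Y is completely determined by X, yet the covariance is (approximately) zero:

```python
import random

random.seed(2)
xs = [random.gauss(0, 1) for _ in range(200_000)]
ys = [x * x for x in xs]           # Y is a deterministic function of X

mean = lambda v: sum(v) / len(v)
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
# cov estimates E(X^3) - E(X) E(X^2), which is 0 for a symmetric X,
# even though X and Y are as dependent as two variables can be
```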

So far nothing has been said about the probability distributions of the random variables. Theoretically, of course, there is an infinite number of possible probability density functions, but only a few of them are worth discussing briefly here because of their immense importance in econometrics and statistics.

The binomial distribution can be defined as

    (I.III-28)    P(X = x) = \binom{n}{x} p^x q^{n-x}, \qquad x = 0, 1, \dots, n

where

    p = probability of a success
    q = probability of a failure
    p + q = 1
    n = number of independent draws
    X = number of successes.

Furthermore E(X) = n p and V(X) = n p q. This can be proved by taking the moment generating function (I.III-21) of (I.III-28) and applying (I.III-22) to it.

The binomial distribution is of huge importance in quality control and experimental design.
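The moments E(X) = np and V(X) = npq can also be verified directly from the pmf by summation; a sketch with illustrative values n = 10, p = 0.3:

```python
from math import comb

n, p = 10, 0.3
q = 1 - p

# pmf of (I.III-28): P(X = x) = C(n, x) p^x q^(n - x)
pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

EX  = sum(x * f for x, f in enumerate(pmf))        # should equal n p   = 3.0
EX2 = sum(x * x * f for x, f in enumerate(pmf))
VX  = EX2 - EX**2                                  # should equal n p q = 2.1
```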

The most important probability distribution in statistics, though, is the normal distribution. This function is defined as follows

    f(x) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left( -\frac{(x - \mu)^2}{2\sigma^2} \right), \qquad -\infty < x < +\infty

The mean and variance can be derived from the moment generating function. Moreover, any normal distribution function is perfectly identified by its mean and variance.

The fact that a random variable is normally distributed can be denoted as

    X \sim N(\mu, \sigma^2)

Having introduced this notation for normal distributions, it is quite easy to describe the so-called additivity property. If

    X \sim N(\mu_X, \sigma_X^2) \quad \text{and} \quad Y \sim N(\mu_Y, \sigma_Y^2) \quad \text{(independent)}

then it follows that

    a X + b Y \sim N(a \mu_X + b \mu_Y,\; a^2 \sigma_X^2 + b^2 \sigma_Y^2)

In other words, a linear function of normally distributed random variables is itself normally distributed!
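The property can be illustrated by simulation (a sketch; the constants a = 2, b = -1 and the two component distributions are arbitrary choices):

```python
import random
import statistics

random.seed(3)
a, b = 2.0, -1.0
mu_x, sd_x = 1.0, 2.0
mu_y, sd_y = 4.0, 3.0

zs = [a * random.gauss(mu_x, sd_x) + b * random.gauss(mu_y, sd_y)
      for _ in range(200_000)]

# predicted by the additivity property:
mu_z  = a * mu_x + b * mu_y                 # = -2.0
var_z = a**2 * sd_x**2 + b**2 * sd_y**2     # = 25.0

m_emp = statistics.fmean(zs)
v_emp = statistics.pvariance(zs)
```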

Proof of the generalized additivity property for independent variables is straightforward. For

    S = \sum_{i=1}^{n} a_i X_i, \qquad X_i \sim N(\mu_i, \sigma_i^2)

(for independent X_i variables) we obtain, on using the moment generating function,

    m_S(t) = E(e^{tS}) = \prod_{i=1}^{n} E(e^{t a_i X_i}) = \prod_{i=1}^{n} m_{X_i}(a_i t)

According to the explicit expression of the moment generating function of normal stochastic variables,

    m_{X_i}(t) = \exp\left( \mu_i t + \tfrac{1}{2} \sigma_i^2 t^2 \right)

this yields finally

    m_S(t) = \exp\left( t \sum_{i=1}^{n} a_i \mu_i + \tfrac{1}{2} t^2 \sum_{i=1}^{n} a_i^2 \sigma_i^2 \right)

which is the moment generating function of a normal variable with mean \sum_i a_i \mu_i and variance \sum_i a_i^2 \sigma_i^2, and thus proves the property (Q.E.D.).

Analogously, a generalized additivity property for dependent variables can be obtained (see eqs. (I.III-33) and (I.III-34)). This is however not necessary for our further discussion and thus beyond the scope of this work.

Derived from the normal distribution is the log-normal distribution. A random variable Y is log-normally distributed if, and only if,

    \ln Y \sim N(\mu, \sigma^2)

and if (by definition of ln)

    Y > 0

The expectation and variance of this probability density function are given by

    E(Y) = e^{\mu + \sigma^2 / 2}, \qquad V(Y) = \left( e^{\sigma^2} - 1 \right) e^{2\mu + \sigma^2}

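The log-normal moment formulas E(Y) = exp(mu + sigma^2/2) and V(Y) = (exp(sigma^2) - 1) exp(2 mu + sigma^2) can be checked by simulating Y = e^X (a sketch with arbitrarily chosen mu and sigma):

```python
import math
import random
import statistics

random.seed(4)
mu, sigma = 0.5, 0.4

# Y = e^X with X ~ N(mu, sigma^2) is log-normal by definition
ys = [math.exp(random.gauss(mu, sigma)) for _ in range(300_000)]

m_theory = math.exp(mu + sigma**2 / 2)
v_theory = (math.exp(sigma**2) - 1) * math.exp(2 * mu + sigma**2)

m_emp = statistics.fmean(ys)
v_emp = statistics.pvariance(ys)
```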
As previously stated, a probability distribution function of a random variable can be formulated using probability theory, e.g., in (I.III-9) and (I.III-11). It is however imperative to formalize another link between probability and distribution theory: the Bienaymé-Chebyshev theorem (or inequality). For any random variable X with mean \mu and variance \sigma^2, and any constant k > 0,

    (I.III-45)    \sigma^2 = \int_{-\infty}^{+\infty} (X - \mu)^2 f(X)\, dX \ge \int_{|X - \mu| \ge k\sigma} (X - \mu)^2 f(X)\, dX \ge k^2 \sigma^2 \int_{|X - \mu| \ge k\sigma} f(X)\, dX

Since the last integral in (I.III-45) is by definition a probability,

    (I.III-46)    \int_{|X - \mu| \ge k\sigma} f(X)\, dX = P(|X - \mu| \ge k\sigma)

it is obvious that, on combining (I.III-45) and (I.III-46), the following holds

    (I.III-47)    P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}

which is the Bienaymé-Chebyshev inequality.
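The inequality holds for any distribution with finite variance; a sketch checking it empirically for an exponential variable (an assumed example, with mean and standard deviation both equal to 1):

```python
import random

random.seed(5)
# exponential with rate 1: mean = 1, standard deviation = 1
mu, sigma = 1.0, 1.0
xs = [random.expovariate(1.0) for _ in range(200_000)]

checks = {}
for k in (1.5, 2.0, 3.0):
    frac = sum(abs(x - mu) >= k * sigma for x in xs) / len(xs)
    checks[k] = (frac, 1 / k**2)   # empirical tail mass vs. Chebyshev bound
```

For this skewed distribution the bound 1/k^2 is far from tight: the actual tail masses are much smaller, which illustrates that (I.III-47) is a worst-case guarantee, not an approximation.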


© 2000-2022 All rights reserved.
Contributions and Scientific Research: Prof. Dr. E. Borghers, Prof. Dr. P. Wessa
Please, cite this website when used in publications: Xycoon (or Authors), Statistics - Econometrics - Forecasting (Title), Office for Research Development and Education (Publisher), http://www.xycoon.com/ (URL), (access or printout date).
