Chapter 2. Fundamentals of Stochastic Modeling

This chapter is devoted to the most important distributions derived from the exponential distribution. The lifetimes of series and parallel systems, which play a crucial role in reliability theory, are investigated. It is shown how to generate random numbers having a given distribution. Finally, random sums, which occur in many practical situations, are treated.

The material is based mainly on the following books: Allen [2], Gnyedenko, Beljajev, Szolovjev [29], Kleinrock [48], Ovcharov [60], Ravichandran [64], Ross [67], Stewart [74], Tijms [91], Trivedi [94].

2.1. Distributions Related to the Exponential Distribution

Theorem 2.1. (Memoryless or Markov property) If $X \sim \mathrm{Exp}(\lambda)$, then it satisfies the following, so-called memoryless, or Markov property: for all $x, y \ge 0$

$P(X > x + y \mid X > y) = P(X > x), \qquad P(X < x + y \mid X > y) = P(X < x).$

Proof.

$P(X > x + y \mid X > y) = \frac{P(X > x + y,\, X > y)}{P(X > y)} = \frac{P(X > x + y)}{P(X > y)} = \frac{e^{-\lambda(x+y)}}{e^{-\lambda y}} = e^{-\lambda x} = P(X > x).$

The proof of the second formula can be carried out in the same way.
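The memoryless property can also be checked numerically. The following sketch (with arbitrarily chosen parameter values) estimates both sides of the first formula by simulation.

```python
import math
import random

# Monte Carlo sketch of the memoryless property:
# P(X > x + y | X > y) should equal P(X > x) = e^(-lambda * x).
# The parameter values below are arbitrary illustration choices.
random.seed(1)
lam, x, y, n = 1.5, 0.4, 0.7, 200_000

samples = [random.expovariate(lam) for _ in range(n)]

p_uncond = sum(s > x for s in samples) / n        # estimates P(X > x)
survivors = [s for s in samples if s > y]         # condition on X > y
p_cond = sum(s > x + y for s in survivors) / len(survivors)

print(p_uncond, p_cond, math.exp(-lam * x))
```

The two estimated probabilities agree with each other and with $e^{-\lambda x}$ up to Monte Carlo error.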

Theorem 2.2. If $X \sim \mathrm{Exp}(\lambda)$, then $P(X < h) = \lambda h + o(h)$, where $o(h)$ (small ordo $h$) is defined by $\lim_{h \to 0} \frac{o(h)}{h} = 0$.

Proof. As can be seen, the statement is equivalent to

$\lim_{h \to 0} \frac{1 - e^{-\lambda h}}{h} = \lambda,$

which can be proved by applying L'Hospital's rule. That is,

Theorem 2.3. If $F(x)$ is the distribution function of a random variable $X$ for which $F(0) = 0$, $F(x) < 1$ for $x > 0$, and

$P(X > x + y \mid X > y) = P(X > x) \quad \text{for all } x, y \ge 0,$

then $F(x) = 1 - e^{-\lambda x}$, if $x \ge 0$, for some $\lambda > 0$.

Proof. It can be seen from the conditions that the survival function $G(x) = 1 - F(x)$ satisfies

$G(x + y) = G(x)\,G(y),$

therefore, assuming differentiability,

$G'(x) = G'(0)\,G(x), \quad \text{so} \quad G(x) = e^{G'(0)x}.$

According to the initial condition $G(0) = 1$; thus, writing $\lambda = -G'(0) > 0$, we have $G(x) = e^{-\lambda x}$, consequently

$F(x) = 1 - e^{-\lambda x}, \qquad x \ge 0.$

In many practical problems it is important to determine the distribution of the minimum of independent random variables.

Theorem 2.4. (Distribution of the lifetime of a series system) If $X \sim \mathrm{Exp}(\lambda)$ and $Y \sim \mathrm{Exp}(\mu)$ are independent random variables, then

$Z = \min(X, Y)$

is also exponentially distributed with parameter $\lambda + \mu$.

Proof. By using the properties of probability and the independence of the events we have

$P(Z > x) = P(\min(X, Y) > x) = P(X > x,\, Y > x) = P(X > x)\,P(Y > x) = e^{-\lambda x} e^{-\mu x} = e^{-(\lambda + \mu)x}.$
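The theorem can be illustrated by simulation. The sketch below (parameters chosen arbitrarily) checks that the sample mean and standard deviation of $\min(X, Y)$ both match $1/(\lambda + \mu)$, as they must for an exponential distribution.

```python
import random
import statistics

# Sketch: min(X, Y) for independent X ~ Exp(lam), Y ~ Exp(mu) should be
# Exp(lam + mu), hence its mean and standard deviation are both
# 1 / (lam + mu). Parameter values are arbitrary.
random.seed(2)
lam, mu, n = 2.0, 3.0, 100_000
mins = [min(random.expovariate(lam), random.expovariate(mu))
        for _ in range(n)]

target = 1 / (lam + mu)   # theoretical value: 0.2
print(statistics.mean(mins), statistics.stdev(mins), target)
```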

Example 2.1. Let $X$, $Y$ be independent exponentially distributed random variables with parameters $\lambda$, $\mu$, respectively. Find the probability that $X < Y$.

Solution:

$X < Y$ if and only if $\min(X, Y) = X$. By the theorem of total probability we have

$P(X < Y) = \int_0^\infty P(Y > x)\, \lambda e^{-\lambda x}\, dx = \int_0^\infty e^{-\mu x}\, \lambda e^{-\lambda x}\, dx = \frac{\lambda}{\lambda + \mu}.$
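A short Monte Carlo check of this result (with arbitrary parameter values):

```python
import random

# Sketch of Example 2.1: for independent X ~ Exp(lam), Y ~ Exp(mu),
# P(X < Y) = lam / (lam + mu). Parameter values are arbitrary.
random.seed(3)
lam, mu, n = 1.0, 4.0, 100_000
hits = sum(random.expovariate(lam) < random.expovariate(mu)
           for _ in range(n))
estimate = hits / n
print(estimate, lam / (lam + mu))   # theoretical value: 0.2
```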

Example 2.2. (Distribution of the lifetime of a parallel system) Let $X_1, \ldots, X_n$ be independent random variables and $Y = \max(X_1, \ldots, X_n)$. Find the distribution of $Y$.

Solution:

If $F_i(x) = P(X_i \le x)$, $i = 1, \ldots, n$, then

$P(Y \le x) = P(X_1 \le x, \ldots, X_n \le x) = \prod_{i=1}^n F_i(x).$

In addition, if $X_i \sim \mathrm{Exp}(\lambda_i)$, then $P(Y \le x) = \prod_{i=1}^n \left(1 - e^{-\lambda_i x}\right)$.

Example 2.3. Find the mean lifetime of a parallel system with two independent and exponentially distributed components, with parameters $\lambda$ and $\mu$.

Solution:

Let us solve the problem first according to the definition of the mean. Let $X_1 \sim \mathrm{Exp}(\lambda)$, $X_2 \sim \mathrm{Exp}(\mu)$ and $Y = \max(X_1, X_2)$.

In this case

$P(Y \le x) = \left(1 - e^{-\lambda x}\right)\left(1 - e^{-\mu x}\right), \qquad f_Y(x) = \lambda e^{-\lambda x} + \mu e^{-\mu x} - (\lambda + \mu) e^{-(\lambda + \mu)x}.$

Thus

$E(Y) = \int_0^\infty x f_Y(x)\, dx = \frac{1}{\lambda} + \frac{1}{\mu} - \frac{1}{\lambda + \mu}.$

This can be expressed as

$E(Y) = \frac{1}{\lambda + \mu} + \frac{\mu}{\lambda + \mu}\,\frac{1}{\lambda} + \frac{\lambda}{\lambda + \mu}\,\frac{1}{\mu}.$

Now, let us show how this problem can be solved by probabilistic reasoning.

At the beginning both components are operating; the time to the first failure is $\min(X_1, X_2) \sim \mathrm{Exp}(\lambda + \mu)$, thus the mean time to the first failure is $\frac{1}{\lambda + \mu}$.

The second failure happens when the remaining component fails, too. We have two cases, depending on which component failed first. It is easy to see that, by the memoryless property of the exponential distribution, the distribution of the residual lifetime of the remaining component is the same as it was at the beginning. Then, by using the theorem of total expectation, for the mean residual lifetime after the first failure we have

$\frac{\lambda}{\lambda + \mu}\,\frac{1}{\mu} + \frac{\mu}{\lambda + \mu}\,\frac{1}{\lambda},$

since the component with parameter $\lambda$ fails first with probability $\frac{\lambda}{\lambda + \mu}$.

Hence the mean operating time of a parallel system is

$E(Y) = \frac{1}{\lambda + \mu} + \frac{\lambda}{\lambda + \mu}\,\frac{1}{\mu} + \frac{\mu}{\lambda + \mu}\,\frac{1}{\lambda}.$

In the homogeneous case ($\lambda = \mu$) it reduces to $\frac{1}{2\lambda} + \frac{1}{\lambda} = \frac{3}{2\lambda}$, as we will see in the next problem.
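Both derivations give the same value, which can be checked by simulation. The sketch below (arbitrary parameter values) compares the sample mean of $\max(X_1, X_2)$ with $\frac{1}{\lambda} + \frac{1}{\mu} - \frac{1}{\lambda + \mu}$.

```python
import random
import statistics

# Sketch of Example 2.3: the mean lifetime of a two-component parallel
# system with independent Exp(lam) and Exp(mu) components is
# 1/lam + 1/mu - 1/(lam + mu). Parameter values are arbitrary.
random.seed(4)
lam, mu, n = 1.0, 2.0, 100_000
lifetimes = [max(random.expovariate(lam), random.expovariate(mu))
             for _ in range(n)]
theory = 1/lam + 1/mu - 1/(lam + mu)   # 7/6
print(statistics.mean(lifetimes), theory)
```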

It is easy to see that the second moment of the lifetime could be calculated in the same way, by using either the definition or the theorem of total expectation for second moments, and thus the variance can be obtained. Of course, these are much more complicated formulas, but in the homogeneous case they can be simplified, as we see in the next example.

Example 2.4. Find the mean and variance of a parallel system with $n$ homogeneous, independent and exponentially distributed components, that is $X_i \sim \mathrm{Exp}(\lambda)$, $i = 1, \ldots, n$.

Solution:

As is well known, if $X \ge 0$, then

$E(X) = \int_0^\infty P(X > x)\, dx, \quad \text{so} \quad E(Y) = \int_0^\infty \left(1 - \left(1 - e^{-\lambda x}\right)^n\right) dx.$

Using the substitution $u = 1 - e^{-\lambda x}$ we get

$E(Y) = \frac{1}{\lambda} \int_0^1 \frac{1 - u^n}{1 - u}\, du = \frac{1}{\lambda} \int_0^1 \left(1 + u + \cdots + u^{n-1}\right) du = \frac{1}{\lambda} \sum_{k=1}^n \frac{1}{k}.$

Due to the memoryless property of the exponential distribution, it is easy to see that the time differences between consecutive failures are exponentially distributed. More precisely, the time between the $k$th and $(k+1)$th failures is exponentially distributed with parameter $(n-k)\lambda$, $k = 0, 1, \ldots, n-1$. Moreover, they are independent of each other. This fact can be used to get the mean and variance of the time of the $n$th failure. After these arguments it is clear that

$E(Y) = \sum_{k=1}^n \frac{1}{k\lambda} = \frac{1}{\lambda} \sum_{k=1}^n \frac{1}{k}.$

In particular, the variance of the lifetime of a parallel system is

$\mathrm{Var}(Y) = \sum_{k=1}^n \frac{1}{(k\lambda)^2} = \frac{1}{\lambda^2} \sum_{k=1}^n \frac{1}{k^2}.$
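Both harmonic-sum formulas can be checked empirically. The following sketch (arbitrary parameter values) simulates the lifetime of a three-component homogeneous parallel system.

```python
import random
import statistics

# Sketch of Example 2.4: for n homogeneous Exp(lam) components in
# parallel, E(Y) = (1/lam) * (1 + 1/2 + ... + 1/n) and
# Var(Y) = (1/lam^2) * (1 + 1/4 + ... + 1/n^2). Values are arbitrary.
random.seed(5)
lam, n_comp, n = 1.0, 3, 200_000
ys = [max(random.expovariate(lam) for _ in range(n_comp))
      for _ in range(n)]

mean_theory = sum(1/k for k in range(1, n_comp + 1)) / lam       # 11/6
var_theory = sum(1/k**2 for k in range(1, n_comp + 1)) / lam**2  # 49/36
print(statistics.mean(ys), mean_theory)
print(statistics.variance(ys), var_theory)
```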

Definition 2.5. Let $X$ and $Y$ be independent random variables with density functions $f_X$ and $f_Y$, respectively. Then the density function of $X + Y$ can be obtained as

$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx,$

which is said to be the convolution of $f_X$ and $f_Y$.

In addition, if $X \ge 0$ and $Y \ge 0$, then

$f_{X+Y}(z) = \int_0^z f_X(x)\, f_Y(z - x)\, dx.$

Example 2.5. Let $X$ and $Y$ be independent and exponentially distributed random variables with parameter $\lambda$. Find their convolution.

Solution:

After substitution we have

$f_{X+Y}(z) = \int_0^z \lambda e^{-\lambda x}\, \lambda e^{-\lambda(z-x)}\, dx = \lambda^2 z\, e^{-\lambda z},$

which shows that the sum of independent exponentially distributed random variables is not exponentially distributed.
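One simple way to see this numerically is through the squared coefficient of variation: an exponential variable has $C^2 = 1$, while the sum of two has $C^2 = \frac{1}{2}$. A sketch with an arbitrary parameter value:

```python
import random
import statistics

# Sketch of Example 2.5: the sum of two independent Exp(lam) variables
# is Erlang-2, with squared coefficient of variation 1/2 -- an
# exponential variable would have 1. Parameter value is arbitrary.
random.seed(6)
lam, n = 2.0, 200_000
sums = [random.expovariate(lam) + random.expovariate(lam)
        for _ in range(n)]

m = statistics.mean(sums)                 # theory: 2/lam = 1.0
cv2 = statistics.variance(sums) / m**2    # theory: 0.5
print(m, cv2)
```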

Example 2.6. Let $X_1, \ldots, X_n$ be independent and exponentially distributed random variables with the same parameter $\lambda$. Show that the density function of $X_1 + \cdots + X_n$ is

$f_n(x) = \frac{\lambda(\lambda x)^{n-1}}{(n-1)!}\, e^{-\lambda x}, \qquad x \ge 0.$

Solution:

To prove this we shall use induction. As we have seen, the statement is true for $n = 1$ and $n = 2$. Let us assume it is valid for $n$ and let us see what happens for $n + 1$:

$f_{n+1}(x) = \int_0^x f_n(y)\, \lambda e^{-\lambda(x-y)}\, dy = \int_0^x \frac{\lambda(\lambda y)^{n-1}}{(n-1)!}\, e^{-\lambda y}\, \lambda e^{-\lambda(x-y)}\, dy = \frac{\lambda(\lambda x)^n}{n!}\, e^{-\lambda x},$

which is exactly the density function of an Erlang distribution with parameters $(n+1, \lambda)$.

This representation of the Erlang distribution helps us to compute its mean and variance in a very simple way, without using its density function: $E(X) = \frac{n}{\lambda}$ and $\mathrm{Var}(X) = \frac{n}{\lambda^2}$.
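The sum representation also gives a direct way to generate Erlang random variates. The sketch below (arbitrary parameters) verifies the mean $n/\lambda$ and variance $n/\lambda^2$ empirically.

```python
import random
import statistics

# Sketch: an Erlang(shape, lam) variable is the sum of `shape`
# independent Exp(lam) variables, so E(X) = shape/lam and
# Var(X) = shape/lam^2. Parameter values are arbitrary.
random.seed(9)
lam, shape, n = 2.0, 4, 200_000
xs = [sum(random.expovariate(lam) for _ in range(shape))
      for _ in range(n)]

print(statistics.mean(xs), shape / lam)         # theory: 2.0
print(statistics.variance(xs), shape / lam**2)  # theory: 1.0
```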

The Erlang distribution is very useful to approximate the distribution of a random variable $X$ for which the squared coefficient of variation satisfies $C_X^2 = \frac{\mathrm{Var}(X)}{(E(X))^2} \le 1$. In other words, if the first two moments of $X$ are given, then

$f(x) = p\, \frac{\mu(\mu x)^{k-2}}{(k-2)!}\, e^{-\mu x} + (1 - p)\, \frac{\mu(\mu x)^{k-1}}{(k-1)!}\, e^{-\mu x}$

is the mixture of two Erlang distributions with parameters $(k-1, \mu)$ and $(k, \mu)$, where $k$ is chosen such that $\frac{1}{k} \le C_X^2 \le \frac{1}{k-1}$, and

$p = \frac{1}{1 + C_X^2}\left[k C_X^2 - \sqrt{k\left(1 + C_X^2\right) - k^2 C_X^2}\,\right], \qquad \mu = \frac{k - p}{E(X)},$

with the property that

$\frac{p}{\mu}(k-1) + \frac{1-p}{\mu}\,k = \frac{k - p}{\mu} = E(X).$

Such a distribution of $X$ is denoted by $E_{k-1,k}$ and it matches the first two moments.

Hypoexponential distribution

Let $X_i \sim \mathrm{Exp}(\lambda_i)$, $i = 1, \ldots, n$ ($\lambda_i \ne \lambda_j$ for $i \ne j$), be independent exponentially distributed random variables. The random variable $X = X_1 + \cdots + X_n$ is said to have a hypoexponential distribution.

It can be shown that its density function is given by

$f(x) = \sum_{i=1}^n a_i \lambda_i e^{-\lambda_i x}, \qquad a_i = \prod_{j \ne i} \frac{\lambda_j}{\lambda_j - \lambda_i}.$

It is easy to see that

$E(X) = \sum_{i=1}^n \frac{1}{\lambda_i}, \qquad \mathrm{Var}(X) = \sum_{i=1}^n \frac{1}{\lambda_i^2}.$

Thus for the squared coefficient of variation we have

$C_X^2 = \frac{\sum_{i=1}^n 1/\lambda_i^2}{\left(\sum_{i=1}^n 1/\lambda_i\right)^2} \le 1.$
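A brief simulation sketch of these moment formulas (rates chosen arbitrarily):

```python
import random
import statistics

# Sketch: a hypoexponential variable is the sum of independent
# exponentials with distinct rates. Its mean is sum(1/lam_i), its
# variance is sum(1/lam_i^2), so C^2 <= 1. Rates below are arbitrary.
random.seed(7)
rates = [1.0, 2.0, 5.0]
n = 200_000
xs = [sum(random.expovariate(r) for r in rates) for _ in range(n)]

mean_theory = sum(1/r for r in rates)    # 1.7
var_theory = sum(1/r**2 for r in rates)  # 1.29
cv2 = var_theory / mean_theory**2        # about 0.446, indeed <= 1
print(statistics.mean(xs), mean_theory, cv2)
```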

Hyperexponential distribution

Let $X_i \sim \mathrm{Exp}(\lambda_i)$ ($i = 1, \ldots, n$) and let $(p_1, \ldots, p_n)$ be a discrete probability distribution. A random variable $X$ is said to have a hyperexponential distribution if its density function is given by

$f(x) = \sum_{i=1}^n p_i \lambda_i e^{-\lambda_i x}.$

Its distribution function is

$F(x) = \sum_{i=1}^n p_i \left(1 - e^{-\lambda_i x}\right).$

It is easy to see that

$E(X) = \sum_{i=1}^n \frac{p_i}{\lambda_i}, \qquad E(X^2) = \sum_{i=1}^n \frac{2 p_i}{\lambda_i^2}.$

It can be shown that

$C_X^2 \ge 1.$

In the case when $C_X^2 > 1$ for a random variable $X$, the following two-moment fit is suggested:

$f(x) = p_1 \lambda_1 e^{-\lambda_1 x} + p_2 \lambda_2 e^{-\lambda_2 x}, \qquad p_1 + p_2 = 1,$

that is, $X$ is fitted by a $2$-phase hyperexponentially distributed random variable. Since the density function contains $3$ free parameters and the fit is based on the first two moments, the distribution is not uniquely determined.

The most commonly used procedure is the balanced mean method, that is

$\frac{p_1}{\lambda_1} = \frac{p_2}{\lambda_2}.$

In this case the moment-matching equations become

$\frac{p_1}{\lambda_1} + \frac{p_2}{\lambda_2} = E(X), \qquad \frac{2 p_1}{\lambda_1^2} + \frac{2 p_2}{\lambda_2^2} = E(X^2).$

The solution is

$p_{1,2} = \frac{1}{2}\left(1 \pm \sqrt{\frac{C_X^2 - 1}{C_X^2 + 1}}\,\right), \qquad \lambda_i = \frac{2 p_i}{E(X)}, \quad i = 1, 2.$
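The balanced mean method can be sketched as a small function; the formulas below follow Tijms' balanced-means two-moment fit [91] (the function name and example values are mine), and the check confirms that the fitted $H_2$ distribution reproduces the prescribed first two moments.

```python
import math

# Sketch of the balanced mean method for a two-phase hyperexponential
# fit: given mean m1 and squared coefficient of variation c2 > 1,
# choose p1, p2, lam1, lam2 with p1/lam1 = p2/lam2 so that the first
# two moments are matched. Function name and example values are mine.
def h2_balanced_fit(m1, c2):
    if c2 <= 1:
        raise ValueError("hyperexponential fit requires C^2 > 1")
    root = math.sqrt((c2 - 1) / (c2 + 1))
    p1, p2 = (1 + root) / 2, (1 - root) / 2
    lam1, lam2 = 2 * p1 / m1, 2 * p2 / m1
    return p1, p2, lam1, lam2

p1, p2, lam1, lam2 = h2_balanced_fit(m1=1.0, c2=4.0)
mean = p1 / lam1 + p2 / lam2                  # should equal m1 = 1.0
second = 2 * p1 / lam1**2 + 2 * p2 / lam2**2  # should equal m1^2*(c2+1) = 5.0
print(mean, second)
```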

If the fit is based on the first $3$ moments $m_1$, $m_2$, $m_3$, then the condition $m_3 m_1 \ge \frac{3}{2}\, m_2^2$ is needed, and it gives a unique solution. It can be shown that the gamma and lognormal distributions satisfy this condition, and the parameters of the resulting unique hyperexponential distribution can be computed explicitly from $m_1$, $m_2$, $m_3$.

Mixture of Distributions

Definition 2.6. Let $X_1, \ldots, X_n$ be random variables with distribution functions $F_1, \ldots, F_n$ and let $(p_1, \ldots, p_n)$ be a discrete probability distribution. The distribution function

$F(x) = \sum_{i=1}^n p_i F_i(x)$

is called the mixture of the distributions $F_1, \ldots, F_n$ with weights $p_1, \ldots, p_n$.

Similarly, the density function

$f(x) = \sum_{i=1}^n p_i f_i(x)$

is called the mixture of the density functions $f_1, \ldots, f_n$ with weights $p_1, \ldots, p_n$.

It is easy to see that $F$ and $f$ are indeed distribution and density functions, respectively.

Using this terminology we can say that the hyperexponential distribution is the mixture of exponential distributions.
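Sampling from a mixture follows the definition directly: first pick a component according to the weights, then sample from that component. The sketch below (weights and rates chosen arbitrarily) does this with exponential components, i.e., it generates hyperexponential variates.

```python
import random

# Sketch: sampling from a mixture -- pick a component index according
# to the weights, then sample from that component. With exponential
# components this yields a hyperexponential variable. Values arbitrary.
random.seed(8)
weights = [0.3, 0.7]
rates = [1.0, 5.0]

def sample_mixture():
    i = random.choices(range(len(weights)), weights=weights)[0]
    return random.expovariate(rates[i])

n = 200_000
mean = sum(sample_mixture() for _ in range(n)) / n
theory = sum(p / r for p, r in zip(weights, rates))   # 0.3 + 0.14 = 0.44
print(mean, theory)
```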