EE571
PART 3
Random Processes
Huseyin Bilgekul
EENG571 Probability and Stochastic Processes
Department of Electrical and Electronic Engineering
Eastern Mediterranean University
• A RANDOM VARIABLE X, is a rule for
assigning to every outcome, w, of an
experiment a number X(w).
– Note: X denotes a random variable and X(w) denotes
a particular value.
• A RANDOM PROCESS X(t) is a rule for
assigning to every w, a function X(t,w).
– Note: for notational simplicity we often omit the
dependence on w.
Random Processes
The set of all possible functions is called the
ENSEMBLE.
Ensemble of Sample Functions
• A general Random or Stochastic
Process can be described as:
– Collection of time functions
(signals) corresponding to various
outcomes of random experiments.
– Collection of random variables
observed at different times.
• Examples of random processes
in communications:
– Channel noise,
– Information generated by a source,
– Interference.
Random Processes
Let ξ denote the random outcome of an experiment. To every such
outcome suppose a waveform X(t, ξ) is assigned.
The collection of such waveforms forms a stochastic process. The
set {ξk} and the time index t can be continuous or discrete
(countably infinite or finite) as well.
For fixed ξi ∈ S (the set of all experimental outcomes), X(t, ξi)
is a specific time function. For fixed t = t1, X(t1, ξ) is a random
variable. The ensemble of all such realizations X(t, ξ) over time
represents the stochastic process X(t).
[Figure: sample functions X(t, ξ1), X(t, ξ2), …, X(t, ξn) plotted
against t, with the random variables X(t1, ξ) and X(t2, ξ) marked
at times t1 and t2.]
Random Processes
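The ensemble view can be sketched numerically. A minimal illustration, assuming a toy random-phase sinusoid X(t, ξ) = cos(2πt + Θ(ξ)) of our own choosing (not from the slides): each row of the array is one sample function, each column is one random variable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy random process: X(t, xi) = cos(2*pi*t + Theta(xi)),
# where Theta is a uniform random phase drawn once per outcome xi.
t = np.linspace(0.0, 2.0, 200)           # time axis
n_outcomes = 5                           # number of outcomes xi
theta = rng.uniform(0.0, 2 * np.pi, size=n_outcomes)
ensemble = np.cos(2 * np.pi * t + theta[:, None])

# Fixed xi (a row): a specific time function.
sample_function = ensemble[0]
# Fixed t (a column): a random variable across outcomes.
random_variable_at_t1 = ensemble[:, 50]

print(ensemble.shape)                # (5, 200): 5 sample functions
print(random_variable_at_t1.shape)   # (5,): one value per outcome
```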
Introduction
• A random process is a process (i.e., a variation in
time or one-dimensional space) whose behavior
is not completely predictable but can be
characterized by statistical laws.
• Examples of random processes
– Daily stream flow
– Hourly rainfall of storm events
– Stock index
Random Variable
• A random variable is a mapping function which assigns outcomes of a
random experiment to real numbers. Occurrence of the outcome
follows certain probability distribution. Therefore, a random variable
is completely characterized by its probability density function (PDF).
STOCHASTIC PROCESS
• The term “stochastic process” appears mostly in
statistical textbooks, while the term “random
process” is frequently used in books on
engineering applications.
DENSITY OF STOCHASTIC PROCESSES
• First-order densities of a random process
A stochastic process is defined to be completely or
totally characterized if the joint densities for the
random variables X(t1), X(t2), …, X(tn) are known for all
times t1, t2, …, tn and all n.
In general, a complete characterization is practically
impossible, except in rare cases. As a result, it is
desirable to define and work with various partial
characterizations. Depending on the objectives of the
application, a partial characterization often suffices to
ensure the desired outputs.
• For a specific t, X(t) is a random variable with
distribution

    F(x, t) = P[X(t) ≤ x]

• The function F(x, t) is defined as the first-order
distribution of the random variable X(t). Its
derivative with respect to x,

    f(x, t) = ∂F(x, t)/∂x,

is the first-order density of X(t).
DENSITY OF STOCHASTIC PROCESSES
• If the first-order densities defined for all time t, i.e. f(x, t),
are all the same, then f(x, t) does not depend on t and we call
the resulting density the first-order density of the random
process X(t); otherwise, we have a family of first-order
densities.
• The first-order densities (or distributions) are only a partial
characterization of the random process, as they do not
contain the information that specifies the joint densities of the
random variables defined at two or more different times.
DENSITY OF STOCHASTIC PROCESSES
• Mean and variance of a random process
The first-order density of a random process, f(x, t), gives the
probability density of the random variables X(t) defined for all time
t. The mean of a random process, mX(t), is thus a function of time
specified by

    mX(t) = E[X(t)] = E[Xt] = ∫ x f(x, t) dx

• For the case where the mean of X(t) does not depend on t, we have

    mX(t) = E[X(t)] = mX (a constant).

• The variance of a random process, also a function of time, is defined
by

    σX²(t) = E[(X(t) − mX(t))²] = E[Xt²] − [mX(t)]²

MEAN AND VARIANCE OF RP
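As a quick numerical sketch of these definitions, assuming an illustrative process X(t) = t + Gaussian noise of our own choosing (so mX(t) = t and σX²(t) = 1), the ensemble mean and variance can be estimated at each t:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative process (an assumption, not from the slides):
# X(t) = t + N(0, 1) noise, so m_X(t) = t and sigma_X^2(t) = 1.
t = np.linspace(0.0, 5.0, 50)
n_realizations = 20000
ensemble = t + rng.normal(0.0, 1.0, size=(n_realizations, t.size))

# Ensemble estimates of the mean and variance, one value per time t.
m_hat = ensemble.mean(axis=0)         # estimates m_X(t) = E[X(t)]
var_hat = ensemble.var(axis=0)        # estimates E[X(t)^2] - m_X(t)^2

print(np.max(np.abs(m_hat - t)))      # close to 0
print(np.max(np.abs(var_hat - 1.0)))  # close to 0
```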
• Second-order densities of a random process
For any pair of two random variables X(t1) and X(t2),
we define the second-order density of a random
process as f(x1, x2; t1, t2) or f(x1, x2).
• Nth-order densities of a random process
The nth-order density functions for X(t) at times
t1, t2, …, tn are given by

    f(x1, x2, …, xn; t1, t2, …, tn) or f(x1, x2, …, xn).
HIGHER ORDER DENSITY OF RP
• Given two random variables X(t1) and X(t2), a
measure of the linear relationship between them is
specified by E[X(t1)X(t2)]. For a random process,
t1 and t2 go through all possible values, and
therefore E[X(t1)X(t2)] can change and is a
function of t1 and t2. The autocorrelation function
of a random process is thus defined by

    R(t1, t2) = E[X(t1)X(t2)] = R(t2, t1)
Autocorrelation function of RP
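A small sketch of estimating R(t1, t2) from an ensemble, assuming a toy process X(t) = A·t with A ~ N(0, 1) of our own choosing, for which R(t1, t2) = E[A²]·t1·t2 = t1·t2:

```python
import numpy as np

rng = np.random.default_rng(2)

# Estimate R(t1, t2) = E[X(t1) X(t2)] across an ensemble of realizations.
# Toy process (an assumption): X(t) = A * t with A ~ N(0, 1),
# for which R(t1, t2) = t1 * t2.
t = np.array([0.5, 1.0, 2.0])
a = rng.normal(0.0, 1.0, size=100000)
ensemble = a[:, None] * t             # one realization per row

# R_hat[i, j] estimates E[X(t_i) X(t_j)]; note the symmetry R(t1, t2) = R(t2, t1).
R_hat = ensemble.T @ ensemble / ensemble.shape[0]

print(R_hat)                          # approximately the outer product t_i * t_j
```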
• A random process is strict-sense stationary if its joint
densities are invariant to a shift of the time origin, i.e.,

    f(x1, x2, …, xn; t1 + τ, t2 + τ, …, tn + τ) = f(x1, x2, …, xn; t1, t2, …, tn)

for all τ and all n.
• Strict-sense stationarity seldom holds for random
processes, except for some Gaussian processes.
Therefore, weaker forms of stationarity are needed.
Stationarity of Random Processes
• A random process X(t) is wide-sense stationary (WSS) if

    E[X(t)] = m (a constant) for all t,

and

    R(t1, t2) = R(t1 + τ, t2 + τ) for all t1, t2, and τ,

so that the autocorrelation depends only on the time
difference, i.e., R(t1, t2) = R(t2 − t1).
Wide Sense Stationarity (WSS) of Random Processes
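A minimal numerical check of the two WSS conditions, assuming the classical random-phase sinusoid example (our choice, not the slides'): X(t) = cos(2πt + Θ) with Θ ~ Uniform(0, 2π) has E[X(t)] = 0 and R(t1, t2) = cos(2π(t2 − t1))/2, which depends only on the lag.

```python
import numpy as np

rng = np.random.default_rng(3)

# Random-phase sinusoid: a standard WSS example (this choice is ours).
n = 200000
theta = rng.uniform(0.0, 2 * np.pi, size=n)

def X(t):
    """Sample X(t) = cos(2*pi*t + Theta) across the whole ensemble."""
    return np.cos(2 * np.pi * t + theta)

# The mean is (approximately) the same constant at different times.
m1, m2 = X(0.3).mean(), X(1.7).mean()

# R(t1, t2) at two pairs with the same lag t2 - t1 = 0.1.
r_a = np.mean(X(0.0) * X(0.1))
r_b = np.mean(X(1.0) * X(1.1))

print(m1, m2)     # both near 0
print(r_a, r_b)   # both near cos(2*pi*0.1)/2
```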
• Equality
• Note that “x(t, wi) = y(t, wi) for every wi” is not
the same as “x(t, wi) = y(t, wi) with probability
1”.
Equality and Continuity of RP
• A random sequence or a discrete-time random
process is a sequence of random variables
{X1(w), X2(w), …, Xn(w), …} = {Xn(w)}, w ∈ S.
• For a specific w, {Xn(w)} is a sequence of
numbers that might or might not converge. The
notion of convergence of a random sequence
can be given several interpretations.
Stochastic Convergence
• The sequence of random variables {Xn(w)}
converges surely to the random variable X(w) if the
sequence of functions Xn(w) converges to X(w) as
n → ∞ for all w ∈ S, i.e.,
Xn(w) → X(w) as n → ∞ for all w ∈ S.
Sure Convergence (Convergence Everywhere)
• Convergence with probability one applies to the
individual realizations of the random process.
Convergence in probability does not.
• The weak law of large numbers is an example of
convergence in probability.
• The strong law of large numbers is an example of
convergence with probability 1.
• The central limit theorem is an example of convergence
in distribution.
Remarks
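The weak-law remark can be illustrated with a toy Monte Carlo experiment (our own construction): the probability that the sample mean of n fair coin flips misses 1/2 by more than ε shrinks as n grows, which is exactly convergence in probability.

```python
import numpy as np

rng = np.random.default_rng(4)

# Weak law of large numbers, empirically: for iid flips with P(1) = 1/2,
# P(|mean_n - 0.5| > eps) shrinks as n grows.
eps = 0.05

def miss_probability(n, trials=2000):
    """Fraction of independent runs whose sample mean misses 0.5 by > eps."""
    flips = rng.integers(0, 2, size=(trials, n))
    return np.mean(np.abs(flips.mean(axis=1) - 0.5) > eps)

p_small = miss_probability(20)
p_large = miss_probability(2000)
print(p_small, p_large)   # the miss probability shrinks with n
```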
The above theorem shows that one can expect
a sample average to converge to a constant in
the mean-square sense if and only if the average
of the means converges and the memory dies
out asymptotically, that is, if the covariance
decreases as the lag increases.
The Mean-Square Ergodic Theorem
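A sketch of this idea, assuming an AR(1) process of our own choosing: its covariance a^|k| dies out with lag, so the time average of one long realization approaches the ensemble mean.

```python
import numpy as np

rng = np.random.default_rng(9)

# AR(1): X_n = a X_{n-1} + W_n with |a| < 1; the stationary covariance
# at lag k is approximately a^|k|, which decays, so memory dies out.
a = 0.9
n = 200000
w = rng.normal(0.0, np.sqrt(1 - a * a), size=n)  # stationary variance ~ 1
x = np.empty(n)
x[0] = w[0]
for i in range(1, n):
    x[i] = a * x[i - 1] + w[i]

# Time average over a single realization converges to the ensemble mean 0.
time_average = x.mean()
print(time_average)   # near 0
```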
Examples of Stochastic Processes
• iid random process
A discrete time random process {X(t), t = 1, 2,
…} is said to be independent and identically
distributed (iid) if any finite number, say k, of
random variables X(t1), X(t2), …, X(tk) are
mutually independent and have a common
cumulative distribution function FX(x).
• The joint cdf for X(t1), X(t2), …, X(tk) is given by

    FX1,X2,…,Xk(x1, x2, …, xk) = P(X1 ≤ x1, X2 ≤ x2, …, Xk ≤ xk)
                               = FX(x1) FX(x2) ⋯ FX(xk)

• It also yields

    pX1,X2,…,Xk(x1, x2, …, xk) = p(x1) p(x2) ⋯ p(xk)

where p(x) represents the common probability
mass function.
iid Random Stochastic Processes
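A quick empirical check of this factorization, assuming an iid Bernoulli(0.3) sequence as a toy example: the joint pmf of two samples should equal the product of the common marginals.

```python
import numpy as np

rng = np.random.default_rng(5)

# For an iid process the joint pmf factors: p(x1, x2) = p(x1) * p(x2).
# Toy check (our own construction) with iid Bernoulli(0.3) draws.
n = 200000
x = rng.random(size=(n, 2)) < 0.3     # two iid Bernoulli draws per realization

# Empirical joint pmf of (X1, X2) vs the product of the marginals.
p_joint_11 = np.mean(x[:, 0] & x[:, 1])   # P(X1 = 1, X2 = 1)
p_marg_1 = np.mean(x[:, 0])               # common marginal P(X = 1)
print(p_joint_11, p_marg_1 ** 2)          # both near 0.3 * 0.3 = 0.09
```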
• Let p0 denote the probability mass function of
X0. The joint probability of X0, X1, …, Xn is

    P(X0 = x0, X1 = x1, …, Xn = xn)
        = P(X0 = x0) P(X1 = x1 | X0 = x0) ⋯ P(Xn = xn | Xn−1 = xn−1, …, X0 = x0)
        = P(X0 = x0) P(X1 = x1 | X0 = x0) ⋯ P(Xn = xn | Xn−1 = xn−1)
        = p0(x0) f(x1 − x0) ⋯ f(xn − xn−1)

where f is the common probability mass function of the
independent steps.
Random walk process
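A minimal simulation of a random walk with iid ±1 steps (a sketch; the step distribution and parameters are our own choices). For this walk E[Xn] = 0 and Var[Xn] = n.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simple random walk: X_n = Z_1 + ... + Z_n with X_0 = 0 and
# iid steps Z_k in {-1, +1}, each with probability 1/2.
n_steps = 1000
n_walks = 5000
steps = rng.choice([-1, 1], size=(n_walks, n_steps))
walks = np.cumsum(steps, axis=1)      # walks[:, k-1] holds X_k

# E[X_n] = 0 and Var[X_n] = n for this walk.
print(walks[:, -1].mean())            # near 0
print(walks[:, -1].var())             # near n_steps
```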
The property

    P(Xn+1 = xn+1 | Xn = xn, …, X1 = x1, X0 = x0) = P(Xn+1 = xn+1 | Xn = xn)

is known as the Markov property.
A special case of the random walk: the Brownian
motion.
Random walk process
Gaussian process
• A random process {X(t)} is said to be a
Gaussian random process if all finite
collections of the random process, X1=X(t1),
X2=X(t2), …, Xk=X(tk), are jointly Gaussian
random variables for all k, and all choices of t1,
t2, …, tk.
• Joint pdf of jointly Gaussian random variables
X1, X2, …, Xk, with mean vector m and covariance matrix Σ:

    f(x1, x2, …, xk) = (2π)^(−k/2) |Σ|^(−1/2) exp(−(1/2) (x − m)ᵀ Σ⁻¹ (x − m))
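The joint pdf can be implemented directly from the formula above; as a sanity check of our own, with zero mean and identity covariance it must reduce to a product of univariate standard normal pdfs.

```python
import numpy as np

def gaussian_joint_pdf(x, m, cov):
    """Joint pdf of jointly Gaussian X1..Xk:
    f(x) = exp(-0.5 (x-m)^T cov^{-1} (x-m)) / ((2*pi)^{k/2} |cov|^{1/2})."""
    k = len(m)
    d = x - m
    norm = np.sqrt((2 * np.pi) ** k * np.linalg.det(cov))
    return float(np.exp(-0.5 * d @ np.linalg.solve(cov, d)) / norm)

# Sanity check: with identity covariance, the joint pdf is the product
# of univariate standard normal pdfs.
x = np.array([0.5, -1.0])
val = gaussian_joint_pdf(x, np.zeros(2), np.eye(2))
phi = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
print(val, phi.prod())   # equal
```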
The Brownian motion
(one-dimensional, also known as random walk)
• Consider a particle that moves randomly on the real line.
• Suppose that at small time intervals δ the particle jumps a
small distance Δ, randomly and equally likely to the left or
to the right.
• Let X(t) be the position of the particle on the real line at
time t.
• Assume the initial position of the particle is at the
origin, i.e., X(0) = 0.
• The position of the particle at time t can be expressed as

    X(t) = Δ (Y1 + Y2 + ⋯ + Y[t/δ])

where Y1, Y2, … are independent random variables, each
having probability 1/2 of equaling +1 and probability 1/2
of equaling −1.
([t/δ] represents the largest integer not exceeding t/δ.)
The Brownian motion
Distribution of X(t)
• Let the step length Δ equal √δ; then

    X(t) = √δ (Y1 + Y2 + ⋯ + Y[t/δ])

• For fixed t, if δ is small then the distribution of X(t)
is approximately normal with mean 0 and variance t,
i.e., X(t) ~ N(0, t).
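A sketch of this limit, simulating X(t) = √δ (Y1 + ⋯ + Y[t/δ]) for a small δ of our own choosing and checking that the mean is near 0 and the variance near t:

```python
import numpy as np

rng = np.random.default_rng(7)

# Approximate X(t) = sqrt(delta) * (Y_1 + ... + Y_[t/delta]), Y_i = +/-1.
delta = 0.01
t = 2.0
n = int(t / delta)                    # [t/delta]
n_paths = 20000
y = rng.choice([-1.0, 1.0], size=(n_paths, n))
x_t = np.sqrt(delta) * y.sum(axis=1)  # one X(t) sample per path

# For small delta, X(t) is approximately N(0, t).
print(x_t.mean())    # near 0
print(x_t.var())     # near t = 2.0
```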
• If t and h are fixed and δ is sufficiently small,
then

    X(t + h) − X(t) = √δ (Y1 + ⋯ + Y[(t+h)/δ]) − √δ (Y1 + ⋯ + Y[t/δ])
                    = √δ (Y[t/δ]+1 + Y[t/δ]+2 + ⋯ + Y[(t+h)/δ])
Graphical distribution of the displacement X(t + h) − X(t)
• The random variable X(t + h) − X(t) is normally
distributed with mean 0 and variance h, i.e.,

    P(X(t + h) − X(t) ≤ x) = ∫_{−∞}^{x} (1/√(2πh)) exp(−u²/(2h)) du
• The variance of X(t) depends on t, while the variance
of X(t + h) − X(t) does not.
• If 0 ≤ t1 < t2 ≤ t3 < t4 ≤ ⋯ ≤ t2m−1 < t2m, then the increments

    X(t2) − X(t1), X(t4) − X(t3), …, X(t2m) − X(t2m−1)

are independent random variables.
Covariance and correlation functions of X(t)

    Cov[X(t), X(t + h)] = E[X(t) X(t + h)]
        = E[ δ (Y1 + ⋯ + Y[t/δ]) (Y1 + ⋯ + Y[(t+h)/δ]) ]
        = δ E[ (Y1 + ⋯ + Y[t/δ])² + (Y1 + ⋯ + Y[t/δ])(Y[t/δ]+1 + ⋯ + Y[(t+h)/δ]) ]
        = δ [t/δ] ≈ t        for h > 0

    Correl[X(t), X(t + h)] = Cov[X(t), X(t + h)] / √(Var[X(t)] Var[X(t + h)])
        = t / √( t (t + h) ) = √( t / (t + h) )
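These two results can be checked by simulation of the scaled random walk (δ and the sample sizes are our own choices): the empirical covariance should be near t, and the empirical correlation near √(t/(t + h)).

```python
import numpy as np

rng = np.random.default_rng(8)

# Empirical check: Cov(X(t), X(t+h)) ~ t and
# Correl(X(t), X(t+h)) ~ sqrt(t / (t + h)) for the scaled walk.
delta = 0.01
t, h = 1.0, 1.0
n_t = int(t / delta)
n_th = int((t + h) / delta)
n_paths = 20000
y = rng.choice([-1.0, 1.0], size=(n_paths, n_th))
x_t = np.sqrt(delta) * y[:, :n_t].sum(axis=1)   # X(t): shared first steps
x_th = np.sqrt(delta) * y.sum(axis=1)           # X(t + h)

cov = np.mean(x_t * x_th)                       # both means are ~0
corr = cov / np.sqrt(x_t.var() * x_th.var())
print(cov)    # near t = 1.0
print(corr)   # near sqrt(t / (t + h)) = sqrt(0.5)
```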