Return Interval Distribution of Extreme Events in Long Memory Time Series with Two Scaling Exponents
Smrati Kumar Katiyar
Department of Physics
IISER, Pune
May 3, 2011
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 1 / 27
1 Explanation for the title
2 Statistical test for long memory
3 Foundation stone for our work
4 Our work
Analytical approach
Numerical approach to the problem
Comparison of analytical and numerical results
5 Long memory probability process with two scaling exponents
6 Conclusion
7 Future direction
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 2 / 27
What are the key terms?
Return interval distribution of extreme events in long memory time series
with two scaling exponents.
1 Return interval and extreme events
2 Long memory time series
3 Scaling exponents
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 3 / 27
Return interval and extreme events
Given a time series X(t)
Figure: Return intervals and extreme events (x(t) vs. t, with a threshold line and return intervals r1, r2, r3)
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 4 / 27
Aim of our project work
Example: let's say we are given a time series X(t) and there are a total of 11
time instants at which the value of X exceeds the threshold q.
Those time instants are
t = 0, 1, 3, 5, 6, 7, 10, 11, 12, 14, 16
So the return intervals are:
return intervals = 1, 2, 2, 1, 1, 3, 1, 1, 2, 2
Out of these 10 return intervals we have
5 return intervals of length 1
4 return intervals of length 2
and 1 return interval of length 3
So the probability of occurrence of a return interval of length 1 is P(1) = 5/10;
similarly P(2) = 4/10 and P(3) = 1/10
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 5 / 27
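As an illustration (not part of the thesis), a minimal Python sketch that reproduces this counting for the example exceedance times above:

```python
import numpy as np

# time instants at which X(t) exceeds the threshold q (example from the slide)
t = np.array([0, 1, 3, 5, 6, 7, 10, 11, 12, 14, 16])

r = np.diff(t)                     # return intervals between successive extreme events
lengths, counts = np.unique(r, return_counts=True)
P = counts / counts.sum()          # empirical return interval distribution

for length, p in zip(lengths, P):
    print(f"P({length}) = {p:.1f}")   # P(1) = 0.5, P(2) = 0.4, P(3) = 0.1
```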
Long memory time series
A plot of the sample autocorrelation function (ACF) ρ_k against lag k is one of
the most useful tools for analysing a given time series.
\rho_k = \frac{\sum_{t=k+1}^{n} (x_t - \bar{x})(x_{t-k} - \bar{x})}{\sum_{t=1}^{n} (x_t - \bar{x})^2}
For long memory processes
\rho_k \to C_\rho k^{-\gamma} \quad \text{as } k \to \infty
where C_ρ > 0 and γ ∈ (0, 1)
Figure: ACF vs. lag for series a
A long memory process is trend reinforcing, which means the direction of the
next value (up or down compared to the last value) is more likely to be the
same as that of the current value.
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 6 / 27
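A minimal sketch (illustration only, not the thesis code) of the sample ACF estimator above; the test series is assumed white noise:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation rho_k for k = 0..max_lag (estimator from the slide)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    denom = np.sum(xc**2)
    return np.array([np.sum(xc[k:] * xc[:n - k]) / denom for k in range(max_lag + 1)])

# example: the ACF of white noise drops to ~0 immediately,
# while a long memory series decays slowly as a power law
rng = np.random.default_rng(1)
rho = sample_acf(rng.normal(size=10_000), max_lag=60)
print(rho[:5])
```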
Scaling exponents
The most common power laws relate two variables and have the form
f(x) \propto x^{\alpha}
Here α is called the scaling exponent, where the word "scaling" denotes
the fact that a power-law function satisfies
f(cx) = c^{\alpha} f(x) \propto f(x)
Here c is a constant.
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 7 / 27
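A tiny numerical illustration (not from the slides) of the scale-invariance property f(cx) = c^α f(x), with assumed values for α, c and x:

```python
# check that rescaling the argument only rescales the function value
alpha, c, x = 0.7, 3.0, 5.0
f = lambda x: x**alpha
print(f(c * x), c**alpha * f(x))   # both print the same value
```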
Statistical test for long memory
How to find whether a given time series x(t) has long memory or not?
Detrended fluctuation analysis
x(t) is the time series (t = 1, 2, 3, ..., N_max).
y(k) = \sum_{i=1}^{k} (x_i - \langle x \rangle) \quad \text{(cumulative sum or profile)}
Divide y(k) into time windows of length n samples. In each box of length
n, we fit y(k) using a polynomial function of order l, which represents the
trend in that box. The y coordinate of the fit line in each box is denoted
by y_n(k). Since we use a polynomial fit of order l, we denote the algorithm
as DFA-l.
Figure: profile y(k) vs. k with piecewise local trends
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 8 / 27
The integrated signal y(k) is detrended by subtracting the local trend
y_n(k) in each box of length n.
For a given box size n, the root-mean-square (rms) fluctuation for this
integrated and detrended signal is calculated:
F(n) = \sqrt{\frac{1}{N_{max}} \sum_{k=1}^{N_{max}} \left[ y(k) - y_n(k) \right]^2}
The above computation is repeated for a broad range of scales (box size
n) to provide a relationship between F(n) and the box size n.
A power-law relation between the average root-mean-square fluctuation
function F(n) and the box size n indicates the presence of scaling:
F(n) \propto n^{\alpha}
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 9 / 27
We can fit the log-log plot with a straight line and the slope of that line
will be the scaling exponent.
Figure: log-log plot of F(n) vs. n
When the slope of the line is in the range (1/2, 1), the time series displays
long memory.
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 10 / 27
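A minimal DFA-1 sketch (not the thesis code) following the steps on the previous slides; the box sizes and the white-noise test series are assumptions for illustration, so the fitted slope should come out near 0.5:

```python
import numpy as np

def dfa(x, box_sizes, order=1):
    """DFA-l: returns F(n) for each box size n (l = polynomial order)."""
    y = np.cumsum(x - np.mean(x))              # profile y(k)
    F = []
    for n in box_sizes:
        n_boxes = len(y) // n
        resid_sq = 0.0
        for b in range(n_boxes):
            seg = y[b * n:(b + 1) * n]
            k = np.arange(n)
            trend = np.polyval(np.polyfit(k, seg, order), k)   # local trend y_n(k)
            resid_sq += np.sum((seg - trend) ** 2)
        F.append(np.sqrt(resid_sq / (n_boxes * n)))
    return np.array(F)

# scaling exponent alpha = slope of log F(n) vs. log n
rng = np.random.default_rng(2)
x = rng.normal(size=2**14)                     # white noise: expect alpha near 0.5
ns = np.unique(np.logspace(1, 3, 20).astype(int))
Fn = dfa(x, ns)
alpha = np.polyfit(np.log(ns), np.log(Fn), 1)[0]
print("DFA exponent:", round(alpha, 2))
```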
Foundation stone for our work
For a long memory time series with one scaling exponent, the probability
distribution of return intervals is known (Santhanam et al., 2008):
P(R) = a R^{-(1-\gamma)} e^{-(a/\gamma) R^{\gamma}}
Here R is the scaled return interval, R = r/⟨r⟩, where r are the actual return
intervals, r = 1, 2, 3, 4, ...
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 11 / 27
Our work
What about time series with two scaling exponents?
How do we calculate their return interval distributions?
Examples of this kind of time series are:
high-frequency financial data,
network traffic of a web server, etc.
Figure: log-log plot of F(n) vs. n with two scaling regimes (Podobnik et al., Physica A, 2002)
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 12 / 27
Our approach to solve the problem
Analytical approach: we consider a probability model for a stationary
process with long memory. Given an extreme event at time t = 0, the
probability of finding an extreme event at time t = r is given by
P_{ex}(r) = \begin{cases} a_1 r^{-(2\alpha_1-1)} = a_1 r^{-(1-\gamma_1)} & \text{for } 0 < r < n_x \\ a_2 r^{-(2\alpha_2-1)} = a_2 r^{-(1-\gamma_2)} & \text{for } n_x < r < \infty \end{cases}
where 0.5 < α_1, α_2 < 1 are DFA exponents, 0 < γ_1, γ_2 < 1 are
autocorrelation exponents, and n_x is the crossover scale.
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 13 / 27
After lengthy algebra we find the return interval distribution
P(r) = \begin{cases} a_1 r^{-(1-\gamma_1)} e^{-(a_1/\gamma_1) r^{\gamma_1}} & \text{for } 0 < r < n_x \\ C a_2 r^{-(1-\gamma_2)} e^{-(a_2/\gamma_2) r^{\gamma_2}} & \text{for } n_x < r < \infty \end{cases}
How to find the three unknowns a_1, a_2 and C? We need three equations:
normalization: \int_0^{\infty} P(r)\,dr = 1
normalizing the mean return interval to unity: \int_0^{\infty} r\,P(r)\,dr = 1
continuity, a_1 r^{-(1-\gamma_1)} = a_2 r^{-(1-\gamma_2)} at r = n_x: \quad a_1 n_x^{-(1-\gamma_1)} = a_2 n_x^{-(1-\gamma_2)}
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 14 / 27
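A minimal numerical sketch (not from the thesis) that solves these three conditions for a_1, a_2 and C, assuming example values for γ_1, γ_2 and the crossover scale n_x:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import fsolve

g1, g2, nx = 0.3, 0.7, 10.0   # assumed example exponents and crossover scale

def P(r, a1, a2, C):
    # piecewise return interval distribution from the slide
    if r < nx:
        return a1 * r**(-(1.0 - g1)) * np.exp(-(a1 / g1) * r**g1)
    return C * a2 * r**(-(1.0 - g2)) * np.exp(-(a2 / g2) * r**g2)

def conditions(p):
    a1, a2, C = p
    norm = quad(P, 0, nx, args=(a1, a2, C))[0] + \
           quad(P, nx, np.inf, args=(a1, a2, C))[0]
    mean = quad(lambda r: r * P(r, a1, a2, C), 0, nx)[0] + \
           quad(lambda r: r * P(r, a1, a2, C), nx, np.inf)[0]
    cont = a1 * nx**(-(1.0 - g1)) - a2 * nx**(-(1.0 - g2))
    return [norm - 1.0, mean - 1.0, cont]

a1, a2, C = fsolve(conditions, x0=[0.5, 0.5, 1.0])
print("a1 =", a1, "a2 =", a2, "C =", C)
```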
Final equations for a_1, a_2 and C
a_1 n_x^{-(1-\gamma_1)} = a_2 n_x^{-(1-\gamma_2)}
C e^{-(a_2/\gamma_2) n_x^{\gamma_2}} = e^{-(a_1/\gamma_1) n_x^{\gamma_1}}
C \left(\frac{\gamma_2}{a_2}\right)^{1/\gamma_2} \frac{n_x E_{\frac{\gamma_2-1}{\gamma_2}}\!\left(n_x^{\gamma_2}\right)}{\gamma_2} - \left(\frac{\gamma_1}{a_1}\right)^{1/\gamma_1} \frac{n_x E_{\frac{\gamma_1-1}{\gamma_1}}\!\left(n_x^{\gamma_1}\right)}{\gamma_1} = 1
Here E_n(x) = \int_1^{\infty} \frac{e^{-xt}}{t^n}\,dt = \int_0^1 e^{-x/\eta}\,\eta^{n-2}\,d\eta
E_n(x) is known as the exponential integral function.
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 15 / 27
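Since the order (γ − 1)/γ is not an integer, scipy.special.expn (which only accepts integer order) does not apply directly; a small sketch (our assumption, not from the thesis) evaluates the generalized E_ν(x) through the standard identity E_ν(x) = x^{ν−1} Γ(1−ν, x):

```python
import numpy as np
from scipy.special import gamma, gammaincc

def E(nu, x):
    """Generalized exponential integral E_nu(x) for real order nu < 1 and x > 0."""
    # E_nu(x) = x**(nu-1) * Gamma(1-nu, x); gammaincc is the regularized upper gamma
    return x**(nu - 1.0) * gammaincc(1.0 - nu, x) * gamma(1.0 - nu)

g2 = 0.7                         # assumed example value of gamma_2
print(E((g2 - 1.0) / g2, 2.0))   # E_{(gamma_2-1)/gamma_2} evaluated at x = 2
```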
Numerical approach to the problem
The first and only challenge with this approach is to obtain a long memory
time series that contains two different scaling exponents.
The model
Step 1:
set the length of the time series, say, l = 10^5.
Step 2:
generate a series of random numbers y_i, i = 0, ..., (l − 1), which follow a
Gaussian distribution with mean 0 and variance 1
Step 3:
generate a series of coefficients defined as:
C_i^{\alpha} = \frac{\Gamma(i - \alpha)}{\Gamma(-\alpha)\,\Gamma(i + 1)} = -\frac{\alpha}{\Gamma(1 - \alpha)} \frac{\Gamma(i - \alpha)}{\Gamma(i + 1)}
\alpha = \begin{cases} \alpha_1 & \text{for } 0 < i < n_x \\ \alpha_2 & \text{for } n_x < i < \infty \end{cases}
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 16 / 27
Both α_1 and α_2 belong to the interval (−0.5, 0).
The asymptotic behaviour of C_i^{\alpha} for large i can be written as
C_i^{\alpha} \simeq -\frac{\alpha}{\Gamma(1 - \alpha)}\, i^{-(1+\alpha)} \quad \text{for } i \gg 1
Step 4:
Now, get a series y_i^{\alpha} using y_i and C_i^{\alpha} according to the relation
y_i^{\alpha} = \sum_{j=0}^{i} y_{i-j}\, C_j^{\alpha}, \quad i = 0, ..., (l - 1) \qquad (1)
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 17 / 27
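A minimal sketch (not the thesis code) of Steps 1–4 with assumed values for n_x, α_1 and α_2; the convolution in Eq. (1) is evaluated with an FFT for speed:

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.special import gammaln, gamma

l, nx = 10**5, 300                 # series length and crossover scale (nx assumed)
alpha1, alpha2 = -0.3, -0.1        # assumed example exponents in (-0.5, 0)

rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, l)        # Step 2: Gaussian noise, mean 0, variance 1

# Step 3: coefficients C_i^alpha, with alpha switching at the crossover i = nx
i = np.arange(l)
alpha = np.where(i < nx, alpha1, alpha2)
c = np.exp(gammaln(i - alpha) - gammaln(i + 1.0)) / gamma(-alpha)

# Step 4: y^alpha_i = sum_{j=0}^{i} y_{i-j} C_j^alpha (truncated convolution)
y_alpha = fftconvolve(c, y)[:l]
```

Applying the DFA sketch shown earlier to y_alpha should reveal a crossover in log F(n) vs. log n, as on the next slide.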
DFA of time series generated using the previous model
Figure: DFA analysis of the generated time series, log F(n) vs. log n, with a crossover region
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 18 / 27
Comparison of analytical and numerical results
We try to fit P(r) = a r^{-(1-\gamma)} e^{-(c/\gamma) r^{\gamma}} to each segment, according to its
corresponding γ value.
Figure: ln P(R) vs. ln(R) for the generated series, showing segment 1, segment 2 and a break point
The discrepancy is due to the threshold dependence of the constants and to
long memory in the return intervals.
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 19 / 27
Figure: return interval distribution of segment 1 and of segment 2 (ln P(R) vs. ln(R))
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 20 / 27
Long memory probability process with two scaling
exponents
To remove the discrepancy due to long memory in the return intervals, we
generate return intervals such that they have no dependence on each other.
First determine the constants a_1 and a_2 by normalizing P_{ex}(r) in the region
between k_min = 1 and k_max:
\int_1^{k_{max}} P_{ex}(r)\,dr = \int_1^{n_x} a_1 r^{-(1-\gamma_1)}\,dr + \int_{n_x}^{k_{max}} a_2 r^{-(1-\gamma_2)}\,dr = 1
Use the continuity condition as well and solve for a_1 and a_2:
a_1 = \frac{1}{\left[ \frac{n_x^{\gamma_1}}{\gamma_1} - \frac{1}{\gamma_1} + \frac{k_{max}^{\gamma_2} n_x^{\gamma_1-\gamma_2}}{\gamma_2} - \frac{n_x^{\gamma_1}}{\gamma_2} \right]}
a_2 = \frac{1}{\left[ \frac{n_x^{\gamma_2}}{\gamma_1} - \frac{n_x^{\gamma_2-\gamma_1}}{\gamma_1} + \frac{k_{max}^{\gamma_2}}{\gamma_2} - \frac{n_x^{\gamma_2}}{\gamma_2} \right]}
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 21 / 27
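A small sketch (with assumed example values for γ_1, γ_2, n_x and k_max) that evaluates these closed-form expressions and checks the normalization numerically:

```python
import numpy as np
from scipy.integrate import quad

g1, g2, nx, kmax = 0.3, 0.7, 10.0, 1000.0   # assumed example values

a1 = 1.0 / (nx**g1 / g1 - 1.0 / g1 + kmax**g2 * nx**(g1 - g2) / g2 - nx**g1 / g2)
a2 = 1.0 / (nx**g2 / g1 - nx**(g2 - g1) / g1 + kmax**g2 / g2 - nx**g2 / g2)

def Pex(r):
    # piecewise probability of an extreme event a distance r after another one
    return a1 * r**(-(1 - g1)) if r < nx else a2 * r**(-(1 - g2))

total = quad(Pex, 1, nx)[0] + quad(Pex, nx, kmax)[0]
print(a1, a2, total)   # total should be close to 1
```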
Now generate a random number ξ_r from a uniform distribution at every r
and compare it with the value of P_{ex}(r). The time r is accepted as an
extreme event if ξ_r < P_{ex}(r); if ξ_r ≥ P_{ex}(r), then it is not an extreme
event. Using this procedure we can generate a series of extreme events.
Figure: ln P(R) vs. ln(R) for the independently generated return intervals, showing segment 1, segment 2 and a break point
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 22 / 27
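A minimal sketch of a literal reading of this acceptance procedure (not the thesis code; γ_1, γ_2, n_x and k_max are assumed example values, and a_1, a_2 come from the normalization above):

```python
import numpy as np

g1, g2, nx, kmax = 0.3, 0.7, 10.0, 1000.0   # assumed example values
a1 = 1.0 / (nx**g1 / g1 - 1.0 / g1 + kmax**g2 * nx**(g1 - g2) / g2 - nx**g1 / g2)
a2 = a1 * nx**(g1 - g2)                      # from the continuity condition

rng = np.random.default_rng(3)
r = np.arange(1, int(kmax) + 1)
Pex = np.where(r < nx, a1 * r**(-(1.0 - g1)), a2 * r**(-(1.0 - g2)))

xi = rng.uniform(size=r.size)                # xi_r ~ U(0, 1) at every r
extreme = xi < Pex                           # accept r as an extreme event if xi_r < Pex(r)

events = r[extreme]                          # times of the generated extreme events
intervals = np.diff(events)                  # return intervals with no mutual dependence
print(events[:10], intervals[:10])
```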
Figure: return interval distribution of segment 1 and of segment 2 (ln P(R) vs. ln(R))
P(r) = \begin{cases} a_1 r^{-(1-\gamma_1)} e^{-(a_1/\gamma_1) r^{\gamma_1}} & \text{for } 0 < r < n_x \\ C a_2 r^{-(1-\gamma_2)} e^{-(a_2/\gamma_2) r^{\gamma_2}} & \text{for } n_x < r < \infty \end{cases}
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 23 / 27
In the previous slide, for segment 1 we have two variables a and b in place
of a_1. Why do we have two different variables? The possible reason is that
the normalization integrals use a lower limit of 0, whereas in reality the
minimum size of a return interval is 1. So even after scaling, the minimum
value of the lower limit is 1/⟨r⟩.
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 24 / 27
Conclusion
For a long memory time series with two different scaling exponents:
There is a break point in the return interval distribution graph.
For each scaling exponent there will be a different segment in the return
interval distribution.
Each segment still follows a distribution of the form of a product of a
power law and a stretched exponential.
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 25 / 27
Future direction
The model that we have used to generate time series with more than one
scaling exponent needs fine tuning so that we can test our analytical
results more accurately.
We should also consider the effects of long memory in the return intervals.
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 26 / 27
Thank you
Questions?
Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 27 / 27
More Related Content

What's hot

T coffee algorithm dissection
T coffee algorithm dissectionT coffee algorithm dissection
T coffee algorithm dissectionGui Chen
 
Clustering techniques
Clustering techniquesClustering techniques
Clustering techniquestalktoharry
 
The Fourth Largest Estrada Indices for Trees
The Fourth Largest Estrada Indices for TreesThe Fourth Largest Estrada Indices for Trees
The Fourth Largest Estrada Indices for Treesinventionjournals
 
Exact Solutions of the Klein-Gordon Equation for the Q-Deformed Morse Potenti...
Exact Solutions of the Klein-Gordon Equation for the Q-Deformed Morse Potenti...Exact Solutions of the Klein-Gordon Equation for the Q-Deformed Morse Potenti...
Exact Solutions of the Klein-Gordon Equation for the Q-Deformed Morse Potenti...ijrap
 
Introduction to Reinforcement Learning for Molecular Design
Introduction to Reinforcement Learning for Molecular Design Introduction to Reinforcement Learning for Molecular Design
Introduction to Reinforcement Learning for Molecular Design Dan Elton
 
On a Deterministic Property of the Category of k-almost Primes: A Determinist...
On a Deterministic Property of the Category of k-almost Primes: A Determinist...On a Deterministic Property of the Category of k-almost Primes: A Determinist...
On a Deterministic Property of the Category of k-almost Primes: A Determinist...Ramin (A.) Zahedi
 
Algorithm Design and Complexity - Course 11
Algorithm Design and Complexity - Course 11Algorithm Design and Complexity - Course 11
Algorithm Design and Complexity - Course 11Traian Rebedea
 
Intro to MATLAB and K-mean algorithm
Intro to MATLAB and K-mean algorithmIntro to MATLAB and K-mean algorithm
Intro to MATLAB and K-mean algorithmkhalid Shah
 
K means clustering
K means clusteringK means clustering
K means clusteringThomas K T
 
Artificial intelligence ai choice mechanism hypothesis of a mathematical method
Artificial intelligence ai choice mechanism hypothesis of a mathematical methodArtificial intelligence ai choice mechanism hypothesis of a mathematical method
Artificial intelligence ai choice mechanism hypothesis of a mathematical methodAlexander Decker
 
Connected Total Dominating Sets and Connected Total Domination Polynomials of...
Connected Total Dominating Sets and Connected Total Domination Polynomials of...Connected Total Dominating Sets and Connected Total Domination Polynomials of...
Connected Total Dominating Sets and Connected Total Domination Polynomials of...iosrjce
 
On selection of periodic kernels parameters in time series prediction
On selection of periodic kernels parameters in time series predictionOn selection of periodic kernels parameters in time series prediction
On selection of periodic kernels parameters in time series predictioncsandit
 
K means clustering | K Means ++
K means clustering | K Means ++K means clustering | K Means ++
K means clustering | K Means ++sabbirantor
 
19. algorithms and-complexity
19. algorithms and-complexity19. algorithms and-complexity
19. algorithms and-complexityashishtinku
 
Graph Based Clustering
Graph Based ClusteringGraph Based Clustering
Graph Based ClusteringSSA KPI
 

What's hot (19)

T coffee algorithm dissection
T coffee algorithm dissectionT coffee algorithm dissection
T coffee algorithm dissection
 
Clustering techniques
Clustering techniquesClustering techniques
Clustering techniques
 
The Fourth Largest Estrada Indices for Trees
The Fourth Largest Estrada Indices for TreesThe Fourth Largest Estrada Indices for Trees
The Fourth Largest Estrada Indices for Trees
 
Exact Solutions of the Klein-Gordon Equation for the Q-Deformed Morse Potenti...
Exact Solutions of the Klein-Gordon Equation for the Q-Deformed Morse Potenti...Exact Solutions of the Klein-Gordon Equation for the Q-Deformed Morse Potenti...
Exact Solutions of the Klein-Gordon Equation for the Q-Deformed Morse Potenti...
 
Introduction to Reinforcement Learning for Molecular Design
Introduction to Reinforcement Learning for Molecular Design Introduction to Reinforcement Learning for Molecular Design
Introduction to Reinforcement Learning for Molecular Design
 
Data structures
Data structuresData structures
Data structures
 
Ijetcas14 439
Ijetcas14 439Ijetcas14 439
Ijetcas14 439
 
On a Deterministic Property of the Category of k-almost Primes: A Determinist...
On a Deterministic Property of the Category of k-almost Primes: A Determinist...On a Deterministic Property of the Category of k-almost Primes: A Determinist...
On a Deterministic Property of the Category of k-almost Primes: A Determinist...
 
E021201032037
E021201032037E021201032037
E021201032037
 
Algorithm Design and Complexity - Course 11
Algorithm Design and Complexity - Course 11Algorithm Design and Complexity - Course 11
Algorithm Design and Complexity - Course 11
 
Intro to MATLAB and K-mean algorithm
Intro to MATLAB and K-mean algorithmIntro to MATLAB and K-mean algorithm
Intro to MATLAB and K-mean algorithm
 
K means clustering
K means clusteringK means clustering
K means clustering
 
Artificial intelligence ai choice mechanism hypothesis of a mathematical method
Artificial intelligence ai choice mechanism hypothesis of a mathematical methodArtificial intelligence ai choice mechanism hypothesis of a mathematical method
Artificial intelligence ai choice mechanism hypothesis of a mathematical method
 
Connected Total Dominating Sets and Connected Total Domination Polynomials of...
Connected Total Dominating Sets and Connected Total Domination Polynomials of...Connected Total Dominating Sets and Connected Total Domination Polynomials of...
Connected Total Dominating Sets and Connected Total Domination Polynomials of...
 
On selection of periodic kernels parameters in time series prediction
On selection of periodic kernels parameters in time series predictionOn selection of periodic kernels parameters in time series prediction
On selection of periodic kernels parameters in time series prediction
 
K means clustering | K Means ++
K means clustering | K Means ++K means clustering | K Means ++
K means clustering | K Means ++
 
19. algorithms and-complexity
19. algorithms and-complexity19. algorithms and-complexity
19. algorithms and-complexity
 
Graph Based Clustering
Graph Based ClusteringGraph Based Clustering
Graph Based Clustering
 
Beamerpresentation
BeamerpresentationBeamerpresentation
Beamerpresentation
 

Similar to Master's Thesis defence presentation

Sensor Fusion Study - Ch3. Least Square Estimation [강소라, Stella, Hayden]
Sensor Fusion Study - Ch3. Least Square Estimation [강소라, Stella, Hayden]Sensor Fusion Study - Ch3. Least Square Estimation [강소라, Stella, Hayden]
Sensor Fusion Study - Ch3. Least Square Estimation [강소라, Stella, Hayden]AI Robotics KR
 
EH1 - Reduced-order modelling for vibration energy harvesting
EH1 - Reduced-order modelling for vibration energy harvestingEH1 - Reduced-order modelling for vibration energy harvesting
EH1 - Reduced-order modelling for vibration energy harvestingUniversity of Glasgow
 
Efficient Identification of Improving Moves in a Ball for Pseudo-Boolean Prob...
Efficient Identification of Improving Moves in a Ball for Pseudo-Boolean Prob...Efficient Identification of Improving Moves in a Ball for Pseudo-Boolean Prob...
Efficient Identification of Improving Moves in a Ball for Pseudo-Boolean Prob...jfrchicanog
 
OBSERVER-BASED REDUCED ORDER CONTROLLER DESIGN FOR THE STABILIZATION OF LARGE...
OBSERVER-BASED REDUCED ORDER CONTROLLER DESIGN FOR THE STABILIZATION OF LARGE...OBSERVER-BASED REDUCED ORDER CONTROLLER DESIGN FOR THE STABILIZATION OF LARGE...
OBSERVER-BASED REDUCED ORDER CONTROLLER DESIGN FOR THE STABILIZATION OF LARGE...ijcseit
 
Lesson 14: Derivatives of Exponential and Logarithmic Functions (Section 021 ...
Lesson 14: Derivatives of Exponential and Logarithmic Functions (Section 021 ...Lesson 14: Derivatives of Exponential and Logarithmic Functions (Section 021 ...
Lesson 14: Derivatives of Exponential and Logarithmic Functions (Section 021 ...Matthew Leingang
 
Exploring the feature space of large collections of time series
Exploring the feature space of large collections of time seriesExploring the feature space of large collections of time series
Exploring the feature space of large collections of time seriesRob Hyndman
 
A common fixed point theorem for two random operators using random mann itera...
A common fixed point theorem for two random operators using random mann itera...A common fixed point theorem for two random operators using random mann itera...
A common fixed point theorem for two random operators using random mann itera...Alexander Decker
 
Instrumentation Engineering : Signals & systems, THE GATE ACADEMY
Instrumentation Engineering : Signals & systems, THE GATE ACADEMYInstrumentation Engineering : Signals & systems, THE GATE ACADEMY
Instrumentation Engineering : Signals & systems, THE GATE ACADEMYklirantga
 
Eh3 analysis of nonlinear energy harvesters
Eh3   analysis of nonlinear energy harvestersEh3   analysis of nonlinear energy harvesters
Eh3 analysis of nonlinear energy harvestersUniversity of Glasgow
 
signals and systems chapter3-part3_signals and systems chapter3-part3.pdf
signals and systems chapter3-part3_signals and systems chapter3-part3.pdfsignals and systems chapter3-part3_signals and systems chapter3-part3.pdf
signals and systems chapter3-part3_signals and systems chapter3-part3.pdfislamsharawneh
 
Wavelet neural network conjunction model in flow forecasting of subhimalayan ...
Wavelet neural network conjunction model in flow forecasting of subhimalayan ...Wavelet neural network conjunction model in flow forecasting of subhimalayan ...
Wavelet neural network conjunction model in flow forecasting of subhimalayan ...iaemedu
 
Exploring temporal graph data with Python: 
a study on tensor decomposition o...
Exploring temporal graph data with Python: 
a study on tensor decomposition o...Exploring temporal graph data with Python: 
a study on tensor decomposition o...
Exploring temporal graph data with Python: 
a study on tensor decomposition o...André Panisson
 
Adaptive relevance feedback in information retrieval
Adaptive relevance feedback in information retrievalAdaptive relevance feedback in information retrieval
Adaptive relevance feedback in information retrievalYI-JHEN LIN
 
Low Power Adaptive FIR Filter Based on Distributed Arithmetic
Low Power Adaptive FIR Filter Based on Distributed ArithmeticLow Power Adaptive FIR Filter Based on Distributed Arithmetic
Low Power Adaptive FIR Filter Based on Distributed ArithmeticIJERA Editor
 

Similar to Master's Thesis defence presentation (20)

Master's thesis
Master's thesisMaster's thesis
Master's thesis
 
Sensor Fusion Study - Ch3. Least Square Estimation [강소라, Stella, Hayden]
Sensor Fusion Study - Ch3. Least Square Estimation [강소라, Stella, Hayden]Sensor Fusion Study - Ch3. Least Square Estimation [강소라, Stella, Hayden]
Sensor Fusion Study - Ch3. Least Square Estimation [강소라, Stella, Hayden]
 
EH1 - Reduced-order modelling for vibration energy harvesting
EH1 - Reduced-order modelling for vibration energy harvestingEH1 - Reduced-order modelling for vibration energy harvesting
EH1 - Reduced-order modelling for vibration energy harvesting
 
D143136
D143136D143136
D143136
 
Efficient Identification of Improving Moves in a Ball for Pseudo-Boolean Prob...
Efficient Identification of Improving Moves in a Ball for Pseudo-Boolean Prob...Efficient Identification of Improving Moves in a Ball for Pseudo-Boolean Prob...
Efficient Identification of Improving Moves in a Ball for Pseudo-Boolean Prob...
 
OBSERVER-BASED REDUCED ORDER CONTROLLER DESIGN FOR THE STABILIZATION OF LARGE...
OBSERVER-BASED REDUCED ORDER CONTROLLER DESIGN FOR THE STABILIZATION OF LARGE...OBSERVER-BASED REDUCED ORDER CONTROLLER DESIGN FOR THE STABILIZATION OF LARGE...
OBSERVER-BASED REDUCED ORDER CONTROLLER DESIGN FOR THE STABILIZATION OF LARGE...
 
Lesson 14: Derivatives of Exponential and Logarithmic Functions (Section 021 ...
Lesson 14: Derivatives of Exponential and Logarithmic Functions (Section 021 ...Lesson 14: Derivatives of Exponential and Logarithmic Functions (Section 021 ...
Lesson 14: Derivatives of Exponential and Logarithmic Functions (Section 021 ...
 
Ck4201578592
Ck4201578592Ck4201578592
Ck4201578592
 
Exploring the feature space of large collections of time series
Exploring the feature space of large collections of time seriesExploring the feature space of large collections of time series
Exploring the feature space of large collections of time series
 
multiscale_tutorial.pdf
multiscale_tutorial.pdfmultiscale_tutorial.pdf
multiscale_tutorial.pdf
 
DIMENSIONAL ANALYSIS (Lecture notes 08)
DIMENSIONAL ANALYSIS (Lecture notes 08)DIMENSIONAL ANALYSIS (Lecture notes 08)
DIMENSIONAL ANALYSIS (Lecture notes 08)
 
A common fixed point theorem for two random operators using random mann itera...
A common fixed point theorem for two random operators using random mann itera...A common fixed point theorem for two random operators using random mann itera...
A common fixed point theorem for two random operators using random mann itera...
 
Instrumentation Engineering : Signals & systems, THE GATE ACADEMY
Instrumentation Engineering : Signals & systems, THE GATE ACADEMYInstrumentation Engineering : Signals & systems, THE GATE ACADEMY
Instrumentation Engineering : Signals & systems, THE GATE ACADEMY
 
Eh3 analysis of nonlinear energy harvesters
Eh3   analysis of nonlinear energy harvestersEh3   analysis of nonlinear energy harvesters
Eh3 analysis of nonlinear energy harvesters
 
signals and systems chapter3-part3_signals and systems chapter3-part3.pdf
signals and systems chapter3-part3_signals and systems chapter3-part3.pdfsignals and systems chapter3-part3_signals and systems chapter3-part3.pdf
signals and systems chapter3-part3_signals and systems chapter3-part3.pdf
 
Wavelet neural network conjunction model in flow forecasting of subhimalayan ...
Wavelet neural network conjunction model in flow forecasting of subhimalayan ...Wavelet neural network conjunction model in flow forecasting of subhimalayan ...
Wavelet neural network conjunction model in flow forecasting of subhimalayan ...
 
Exploring temporal graph data with Python: 
a study on tensor decomposition o...
Exploring temporal graph data with Python: 
a study on tensor decomposition o...Exploring temporal graph data with Python: 
a study on tensor decomposition o...
Exploring temporal graph data with Python: 
a study on tensor decomposition o...
 
Adaptive relevance feedback in information retrieval
Adaptive relevance feedback in information retrievalAdaptive relevance feedback in information retrieval
Adaptive relevance feedback in information retrieval
 
Low Power Adaptive FIR Filter Based on Distributed Arithmetic
Low Power Adaptive FIR Filter Based on Distributed ArithmeticLow Power Adaptive FIR Filter Based on Distributed Arithmetic
Low Power Adaptive FIR Filter Based on Distributed Arithmetic
 
Aq4201280292
Aq4201280292Aq4201280292
Aq4201280292
 

Recently uploaded

STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCESTERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCEPRINCE C P
 
Call Girls in Munirka Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Munirka Delhi 💯Call Us 🔝9953322196🔝 💯Escort.Call Girls in Munirka Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Munirka Delhi 💯Call Us 🔝9953322196🔝 💯Escort.aasikanpl
 
Cultivation of KODO MILLET . made by Ghanshyam pptx
Cultivation of KODO MILLET . made by Ghanshyam pptxCultivation of KODO MILLET . made by Ghanshyam pptx
Cultivation of KODO MILLET . made by Ghanshyam pptxpradhanghanshyam7136
 
Physiochemical properties of nanomaterials and its nanotoxicity.pptx
Physiochemical properties of nanomaterials and its nanotoxicity.pptxPhysiochemical properties of nanomaterials and its nanotoxicity.pptx
Physiochemical properties of nanomaterials and its nanotoxicity.pptxAArockiyaNisha
 
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...Sérgio Sacani
 
Disentangling the origin of chemical differences using GHOST
Disentangling the origin of chemical differences using GHOSTDisentangling the origin of chemical differences using GHOST
Disentangling the origin of chemical differences using GHOSTSérgio Sacani
 
A relative description on Sonoporation.pdf
A relative description on Sonoporation.pdfA relative description on Sonoporation.pdf
A relative description on Sonoporation.pdfnehabiju2046
 
Analytical Profile of Coleus Forskohlii | Forskolin .pdf
Analytical Profile of Coleus Forskohlii | Forskolin .pdfAnalytical Profile of Coleus Forskohlii | Forskolin .pdf
Analytical Profile of Coleus Forskohlii | Forskolin .pdfSwapnil Therkar
 
Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)PraveenaKalaiselvan1
 
Bentham & Hooker's Classification. along with the merits and demerits of the ...
Bentham & Hooker's Classification. along with the merits and demerits of the ...Bentham & Hooker's Classification. along with the merits and demerits of the ...
Bentham & Hooker's Classification. along with the merits and demerits of the ...Nistarini College, Purulia (W.B) India
 
Is RISC-V ready for HPC workload? Maybe?
Is RISC-V ready for HPC workload? Maybe?Is RISC-V ready for HPC workload? Maybe?
Is RISC-V ready for HPC workload? Maybe?Patrick Diehl
 
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCRStunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCRDelhi Call girls
 
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptxSOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptxkessiyaTpeter
 
Animal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptxAnimal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptxUmerFayaz5
 
Orientation, design and principles of polyhouse
Orientation, design and principles of polyhouseOrientation, design and principles of polyhouse
Orientation, design and principles of polyhousejana861314
 
Isotopic evidence of long-lived volcanism on Io
Isotopic evidence of long-lived volcanism on IoIsotopic evidence of long-lived volcanism on Io
Isotopic evidence of long-lived volcanism on IoSérgio Sacani
 
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.aasikanpl
 
Spermiogenesis or Spermateleosis or metamorphosis of spermatid
Spermiogenesis or Spermateleosis or metamorphosis of spermatidSpermiogenesis or Spermateleosis or metamorphosis of spermatid
Spermiogenesis or Spermateleosis or metamorphosis of spermatidSarthak Sekhar Mondal
 
Grafana in space: Monitoring Japan's SLIM moon lander in real time
Grafana in space: Monitoring Japan's SLIM moon lander  in real timeGrafana in space: Monitoring Japan's SLIM moon lander  in real time
Grafana in space: Monitoring Japan's SLIM moon lander in real timeSatoshi NAKAHIRA
 
Analytical Profile of Coleus Forskohlii | Forskolin .pptx
Analytical Profile of Coleus Forskohlii | Forskolin .pptxAnalytical Profile of Coleus Forskohlii | Forskolin .pptx
Analytical Profile of Coleus Forskohlii | Forskolin .pptxSwapnil Therkar
 

Recently uploaded (20)

STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCESTERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
 
Call Girls in Munirka Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Munirka Delhi 💯Call Us 🔝9953322196🔝 💯Escort.Call Girls in Munirka Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Munirka Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
 
Cultivation of KODO MILLET . made by Ghanshyam pptx
Cultivation of KODO MILLET . made by Ghanshyam pptxCultivation of KODO MILLET . made by Ghanshyam pptx
Cultivation of KODO MILLET . made by Ghanshyam pptx
 
Physiochemical properties of nanomaterials and its nanotoxicity.pptx
Physiochemical properties of nanomaterials and its nanotoxicity.pptxPhysiochemical properties of nanomaterials and its nanotoxicity.pptx
Physiochemical properties of nanomaterials and its nanotoxicity.pptx
 
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
 
Disentangling the origin of chemical differences using GHOST
Disentangling the origin of chemical differences using GHOSTDisentangling the origin of chemical differences using GHOST
Disentangling the origin of chemical differences using GHOST
 
A relative description on Sonoporation.pdf
A relative description on Sonoporation.pdfA relative description on Sonoporation.pdf
A relative description on Sonoporation.pdf
 
Analytical Profile of Coleus Forskohlii | Forskolin .pdf
Analytical Profile of Coleus Forskohlii | Forskolin .pdfAnalytical Profile of Coleus Forskohlii | Forskolin .pdf
Analytical Profile of Coleus Forskohlii | Forskolin .pdf
 
Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)
 
Bentham & Hooker's Classification. along with the merits and demerits of the ...
Bentham & Hooker's Classification. along with the merits and demerits of the ...Bentham & Hooker's Classification. along with the merits and demerits of the ...
Bentham & Hooker's Classification. along with the merits and demerits of the ...
 
Is RISC-V ready for HPC workload? Maybe?
Is RISC-V ready for HPC workload? Maybe?Is RISC-V ready for HPC workload? Maybe?
Is RISC-V ready for HPC workload? Maybe?
 
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCRStunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
 
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptxSOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
 
Animal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptxAnimal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptx
 
Orientation, design and principles of polyhouse
Orientation, design and principles of polyhouseOrientation, design and principles of polyhouse
Orientation, design and principles of polyhouse
 
Isotopic evidence of long-lived volcanism on Io
Isotopic evidence of long-lived volcanism on IoIsotopic evidence of long-lived volcanism on Io
Isotopic evidence of long-lived volcanism on Io
 
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
 
Spermiogenesis or Spermateleosis or metamorphosis of spermatid
Spermiogenesis or Spermateleosis or metamorphosis of spermatidSpermiogenesis or Spermateleosis or metamorphosis of spermatid
Spermiogenesis or Spermateleosis or metamorphosis of spermatid
 
Grafana in space: Monitoring Japan's SLIM moon lander in real time
Grafana in space: Monitoring Japan's SLIM moon lander  in real timeGrafana in space: Monitoring Japan's SLIM moon lander  in real time
Grafana in space: Monitoring Japan's SLIM moon lander in real time
 
Analytical Profile of Coleus Forskohlii | Forskolin .pptx
Analytical Profile of Coleus Forskohlii | Forskolin .pptxAnalytical Profile of Coleus Forskohlii | Forskolin .pptx
Analytical Profile of Coleus Forskohlii | Forskolin .pptx
 

Master's Thesis defence presentation

  • 1. Return Interval Distribution of Extreme Events in Long memory Time Series With Two Scaling Exponents Smrati Kumar Katiyar Department of Physics IISER, Pune May 3, 2011 Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 1 / 27
  • 2. 1 Explanation for the title 2 Statistical test for long memory 3 Foundation stone for our work 4 our work Analytical approach Numerical approach to the problem Comparison of analytical and numerical results 5 Long memory probability process with two scaling exponents 6 conclusion 7 future direction Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 2 / 27
  • 3. What are the key terms? Return interval distribution of extreme events in long memory time series with two scaling exponents. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 3 / 27
  • 4. What are the key terms? Return interval distribution of extreme events in long memory time series with two scaling exponents. 1 Return interval and extreme events Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 3 / 27
  • 5. What are the key terms? Return interval distribution of extreme events in long memory time series with two scaling exponents. 1 Return interval and extreme events 2 Long memory time series Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 3 / 27
  • 6. What are the key terms? Return interval distribution of extreme events in long memory time series with two scaling exponents. 1 Return interval and extreme events 2 Long memory time series 3 Scaling exponents Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 3 / 27
  • 7. Return interval and extreme events Given a time series X(t) Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 4 / 27
  • 8. Return interval and extreme events Given a time series X(t) 0 20 40 60 80 100 t -4 -3 -2 -1 0 1 2 3 x(t) threshold r1 r2 r3 Figure: Return intervals and extreme events Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 4 / 27
  • 9. Aim of our project work Example : let say we are given a time series X(t) and there are total 11 time instants at which the value of X is more than the threshold(q). Those time instants are, t = 0, 1, 3, 5, 6, 7, 10, 11, 12, 14, 16 Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 5 / 27
  • 10. Aim of our project work Example : let say we are given a time series X(t) and there are total 11 time instants at which the value of X is more than the threshold(q). Those time instants are, t = 0, 1, 3, 5, 6, 7, 10, 11, 12, 14, 16 So the return intervals will be : return intervals = 1, 2, 2, 1, 1, 3, 1, 1, 2, 2 out of these 10 return intervals we have 5 return intervals of length 1 4 return intervals of length 2 and 1 return interval of length 3 Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 5 / 27
  • 11. Aim of our project work Example : let say we are given a time series X(t) and there are total 11 time instants at which the value of X is more than the threshold(q). Those time instants are, t = 0, 1, 3, 5, 6, 7, 10, 11, 12, 14, 16 So the return intervals will be : return intervals = 1, 2, 2, 1, 1, 3, 1, 1, 2, 2 out of these 10 return intervals we have 5 return intervals of length 1 4 return intervals of length 2 and 1 return interval of length 3 so the probability of occurance of return interval of length 1 will be P(1) = 5 10, similarly P(2) = 4 10 and P(3) = 1 10 Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 5 / 27
  • 12. Long memory time series Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 6 / 27
  • 13. Long memory time series Plot of sample autocorrelation function (ACF) ρk against lag k is one of the most useful tool to analyse a given time series. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 6 / 27
  • 14. Long memory time series Plot of sample autocorrelation function (ACF) ρk against lag k is one of the most useful tool to analyse a given time series. ρk = n t=k+1(xt −x)(xt−k −x) n t=1(xt −x)2 Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 6 / 27
  • 15. Long memory time series Plot of sample autocorrelation function (ACF) ρk against lag k is one of the most useful tool to analyse a given time series. ρk = n t=k+1(xt −x)(xt−k −x) n t=1(xt −x)2 For long memory processes ρk → Cρk−γ as k → ∞ where Cρ > 0 and γ ∈ (0, 1) Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 6 / 27
  • 16. Long memory time series Plot of sample autocorrelation function (ACF) ρk against lag k is one of the most useful tool to analyse a given time series. ρk = n t=k+1(xt −x)(xt−k −x) n t=1(xt −x)2 For long memory processes ρk → Cρk−γ as k → ∞ where Cρ > 0 and γ ∈ (0, 1) 0 10 20 30 40 50 60 0.00.20.40.60.81.0 Lag ACF Series a Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 6 / 27
  • 17. Long memory time series Plot of sample autocorrelation function (ACF) ρk against lag k is one of the most useful tool to analyse a given time series. ρk = n t=k+1(xt −x)(xt−k −x) n t=1(xt −x)2 For long memory processes ρk → Cρk−γ as k → ∞ where Cρ > 0 and γ ∈ (0, 1) 0 10 20 30 40 50 60 0.00.20.40.60.81.0 Lag ACF Series a A Long memory process is trend reinforcing, which means the direction (up or down compared to the last value) of the next value is more likely the same as current value. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 6 / 27
  • 18. Scaling exponents Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 7 / 27
  • 19. Scaling exponents The most common power laws relate two variables and have the form f (x) ∝ xα Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 7 / 27
  • 20. Scaling exponents The most common power laws relate two variables and have the form f (x) ∝ xα Here α is called the scaling exponent. where the word ”scaling” denotes the fact that a power-law function satisfies f (cx) = cαf (x) ∝ f (x) Here c is a constant. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 7 / 27
  • 21. Statistical test for long memory How to find whether a given time series x(t) has long memory or not? Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 8 / 27
  • 22. Statistical test for long memory How to find whether a given time series x(t) has long memory or not? Detrended fluctuation analysis Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 8 / 27
  • 23. Statistical test for long memory How to find whether a given time series x(t) has long memory or not? Detrended fluctuation analysis x(t) is the time series. (t = 1, 2, 3, .......Nmax ) Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 8 / 27
  • 24. Statistical test for long memory How to find whether a given time series x(t) has long memory or not? Detrended fluctuation analysis x(t) is the time series. (t = 1, 2, 3, .......Nmax ) y(k) = k i=1(xi − x ) cumulative sum or profile Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 8 / 27
  • 25. Statistical test for long memory How to find whether a given time series x(t) has long memory or not? Detrended fluctuation analysis x(t) is the time series. (t = 1, 2, 3, .......Nmax ) y(k) = k i=1(xi − x ) cumulative sum or profile Divide y(k) into time window of length n samples. In each box of length n, we fit y(k), using a polynomial function of order l, which represents the trend in that box. The y coordinate of the fit line in each box is denoted by yn(k). Since we use a polynomial fit of order l, we denote the algorithm as DFA-l. 0 100 200 300 400 500 600 700 800 900 1000 0 50 100 150 200 250 300 350 K Yk Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 8 / 27
  • 26. The integrated signal y(k) is detrended by subtracting the local trend yn(k) in each box of length n. For a given box size n, the root-mean-square (rms) fluctuation for this integrated and detrended signal is calculated: F(n) = 1 Nmax Nmax k=1 [y(k) − yn(k)]2 Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 9 / 27
  • 27. The integrated signal y(k) is detrended by subtracting the local trend yn(k) in each box of length n. For a given box size n, the root-mean-square (rms) fluctuation for this integrated and detrended signal is calculated: F(n) = 1 Nmax Nmax k=1 [y(k) − yn(k)]2 The above computation is repeated for a broad range of scales (box size n) to provide a relationship between F(n) and the box size n. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 9 / 27
  • 28. The integrated signal y(k) is detrended by subtracting the local trend yn(k) in each box of length n. For a given box size n, the root-mean-square (rms) fluctuation for this integrated and detrended signal is calculated: F(n) = 1 Nmax Nmax k=1 [y(k) − yn(k)]2 The above computation is repeated for a broad range of scales (box size n) to provide a relationship between F(n) and the box size n. A power-law relation between the average root-meansquare fluctuation function F(n) and the box size n indicates the presence of scaling F(n) ∝ nα Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 9 / 27
  • 29. We can fit the log-log plot with a straight line and the slope of that line will be the scaling exponent. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 10 / 27
  • 30. We can fit the log-log plot with a straight line and the slope of that line will be the scaling exponent. 0 0.2 0.4 0.6 0.8 1 ln n 0 0.2 0.4 0.6 0.8 1 lnF(n) Figure: log-log plot of F(n) Vs n Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 10 / 27
  • 31. We can fit the log-log plot with a straight line and the slope of that line will be the scaling exponent. 0 0.2 0.4 0.6 0.8 1 ln n 0 0.2 0.4 0.6 0.8 1 lnF(n) Figure: log-log plot of F(n) Vs n When slope of line is in range (1/2,1) ,The time series displays long memory. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 10 / 27
  • 32. Foundation stone for our work for a long memory time series with one scaling exponent, the probability distribution of return intervals are known (Santhanam et. al, 2008) Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 11 / 27
  • 33. Foundation stone for our work for a long memory time series with one scaling exponent, the probability distribution of return intervals are known (Santhanam et. al, 2008) P(R) = a R−(1−γ) e −( a γ )Rγ Here R is the scaled return interval R = r r ,where r are the actual return intervals r = 1, 2, 3, 4, ...... Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 11 / 27
  • 34. our work What about time series with two scaling exponents? How do we calculate their return interval distributions? Examples of such time series are: Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 12 / 27
  • 35. our work What about time series with two scaling exponents? How do we calculate their return interval distributions? Examples of such time series are: high frequency financial data, Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 12 / 27
  • 36. our work What about time series with two scaling exponents? How do we calculate their return interval distributions? Examples of such time series are: high frequency financial data, network traffic of a web server, etc. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 12 / 27
  • 37. our work What about time series with two scaling exponents? How do we calculate their return interval distributions? Examples of such time series are: high frequency financial data, network traffic of a web server, etc. Figure: ln F(n) vs ln n. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 12 / 27
  • 38. our work What about time series with two scaling exponents? How do we calculate their return interval distributions? Examples of such time series are: high frequency financial data, network traffic of a web server, etc. Figure: ln F(n) vs ln n (Podobnik et al., Physica A, 2002). Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 12 / 27
  • 39. our approach to solve the problem analytical approach Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 13 / 27
  • 40. our approach to solve the problem analytical approach We consider a probability model for a stationary process with long memory. Given an extreme event at time t = 0, the probability of finding an extreme event at time t = r is given by Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 13 / 27
  • 41. our approach to solve the problem analytical approach We consider a probability model for a stationary process with long memory. Given an extreme event at time t = 0, the probability of finding an extreme event at time t = r is given by $P_{ex}(r) = a_1 r^{-(2\alpha_1 - 1)} = a_1 r^{-(1-\gamma_1)}$ for $0 < r < n_x$, and $P_{ex}(r) = a_2 r^{-(2\alpha_2 - 1)} = a_2 r^{-(1-\gamma_2)}$ for $n_x < r < \infty$, where $0.5 < \alpha_1, \alpha_2 < 1$ are DFA exponents, $0 < \gamma_1, \gamma_2 < 1$ are autocorrelation exponents, and $n_x$ is the crossover scale. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 13 / 27
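A direct transcription of this piecewise model into Python might look as follows; the parameter values are arbitrary placeholders, and how $a_1$ and $a_2$ are actually fixed is discussed later.

```python
import numpy as np

def p_ex(r, a1, a2, gamma1, gamma2, n_x):
    """Probability of an extreme event a distance r after one at r = 0,
    with a crossover from exponent gamma1 to gamma2 at r = n_x."""
    r = np.asarray(r, dtype=float)
    return np.where(r < n_x,
                    a1 * r ** (-(1.0 - gamma1)),
                    a2 * r ** (-(1.0 - gamma2)))

# Matching the two forms of the exponent above gives gamma_i = 2 - 2*alpha_i.
print(p_ex([1, 5, 50], a1=0.3, a2=0.2, gamma1=0.4, gamma2=0.8, n_x=10))
```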
  • 42. After lengthy algebra we find the return interval distribution: $P(r) = a_1 r^{-(1-\gamma_1)} e^{-(a_1/\gamma_1) r^{\gamma_1}}$ for $0 < r < n_x$, and $P(r) = C a_2 r^{-(1-\gamma_2)} e^{-(a_2/\gamma_2) r^{\gamma_2}}$ for $n_x < r < \infty$. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 14 / 27
  • 43. After lengthy algebra we find the return interval distribution: $P(r) = a_1 r^{-(1-\gamma_1)} e^{-(a_1/\gamma_1) r^{\gamma_1}}$ for $0 < r < n_x$, and $P(r) = C a_2 r^{-(1-\gamma_2)} e^{-(a_2/\gamma_2) r^{\gamma_2}}$ for $n_x < r < \infty$. How do we find the three unknowns $a_1$, $a_2$ and C? Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 14 / 27
  • 44. After lengthy algebra we find the return interval distribution: $P(r) = a_1 r^{-(1-\gamma_1)} e^{-(a_1/\gamma_1) r^{\gamma_1}}$ for $0 < r < n_x$, and $P(r) = C a_2 r^{-(1-\gamma_2)} e^{-(a_2/\gamma_2) r^{\gamma_2}}$ for $n_x < r < \infty$. How do we find the three unknowns $a_1$, $a_2$ and C? We need three equations: Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 14 / 27
  • 45. After lengthy algebra we find the return interval distribution: $P(r) = a_1 r^{-(1-\gamma_1)} e^{-(a_1/\gamma_1) r^{\gamma_1}}$ for $0 < r < n_x$, and $P(r) = C a_2 r^{-(1-\gamma_2)} e^{-(a_2/\gamma_2) r^{\gamma_2}}$ for $n_x < r < \infty$. How do we find the three unknowns $a_1$, $a_2$ and C? We need three equations: the normalization condition $\int_0^{\infty} P(r)\,dr = 1$, Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 14 / 27
  • 46. After lengthy algebra we find the return interval distribution: $P(r) = a_1 r^{-(1-\gamma_1)} e^{-(a_1/\gamma_1) r^{\gamma_1}}$ for $0 < r < n_x$, and $P(r) = C a_2 r^{-(1-\gamma_2)} e^{-(a_2/\gamma_2) r^{\gamma_2}}$ for $n_x < r < \infty$. How do we find the three unknowns $a_1$, $a_2$ and C? We need three equations: the normalization condition $\int_0^{\infty} P(r)\,dr = 1$, normalizing the mean return interval $\langle r \rangle$ to unity, $\int_0^{\infty} r\,P(r)\,dr = 1$, Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 14 / 27
  • 47. After lengthy algebra we find the return interval distribution: $P(r) = a_1 r^{-(1-\gamma_1)} e^{-(a_1/\gamma_1) r^{\gamma_1}}$ for $0 < r < n_x$, and $P(r) = C a_2 r^{-(1-\gamma_2)} e^{-(a_2/\gamma_2) r^{\gamma_2}}$ for $n_x < r < \infty$. How do we find the three unknowns $a_1$, $a_2$ and C? We need three equations: the normalization condition $\int_0^{\infty} P(r)\,dr = 1$, normalizing the mean return interval $\langle r \rangle$ to unity, $\int_0^{\infty} r\,P(r)\,dr = 1$, and the continuity condition $a_1 r^{-(1-\gamma_1)} = a_2 r^{-(1-\gamma_2)}$ at $r = n_x$, i.e. $a_1 n_x^{-(1-\gamma_1)} = a_2 n_x^{-(1-\gamma_2)}$. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 14 / 27
  • 48. final equations for $a_1$, $a_2$ and C $a_1 n_x^{-(1-\gamma_1)} = a_2 n_x^{-(1-\gamma_2)}$ Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 15 / 27
  • 49. final equations for $a_1$, $a_2$ and C $a_1 n_x^{-(1-\gamma_1)} = a_2 n_x^{-(1-\gamma_2)}$ $C e^{-(a_2/\gamma_2) n_x^{\gamma_2}} = e^{-(a_1/\gamma_1) n_x^{\gamma_1}}$ Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 15 / 27
  • 50. final equations for $a_1$, $a_2$ and C $a_1 n_x^{-(1-\gamma_1)} = a_2 n_x^{-(1-\gamma_2)}$ $C e^{-(a_2/\gamma_2) n_x^{\gamma_2}} = e^{-(a_1/\gamma_1) n_x^{\gamma_1}}$ $\frac{C (\gamma_2/a_2)^{1/\gamma_2}\, n_x\, E_{\frac{\gamma_2-1}{\gamma_2}}(n_x^{\gamma_2})}{\gamma_2} - \frac{(\gamma_1/a_1)^{1/\gamma_1}\, n_x\, E_{\frac{\gamma_1-1}{\gamma_1}}(n_x^{\gamma_1})}{\gamma_1} = 1$ Here $E_n(x) = \int_1^{\infty} \frac{e^{-xt}}{t^n}\,dt = \int_0^1 e^{-x/\eta}\,\eta^{n-2}\,d\eta$; $E_n(x)$ is known as the exponential integral function. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 15 / 27
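One way to solve this nonlinear system numerically is sketched below, assuming SciPy is available. Here $E_n(x)$ is evaluated from the second integral form given above, and the exponent values, crossover scale and initial guess are arbitrary placeholders.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import fsolve

def E(nu, x):
    """Exponential integral E_nu(x) = int_0^1 exp(-x/eta) eta^(nu-2) d eta."""
    val, _ = quad(lambda eta: np.exp(-x / eta) * eta ** (nu - 2.0), 0.0, 1.0)
    return val

def system(unknowns, gamma1, gamma2, n_x):
    a1, a2, C = unknowns
    eq1 = a1 * n_x ** (-(1 - gamma1)) - a2 * n_x ** (-(1 - gamma2))
    eq2 = (C * np.exp(-(a2 / gamma2) * n_x ** gamma2)
           - np.exp(-(a1 / gamma1) * n_x ** gamma1))
    eq3 = (C * (gamma2 / a2) ** (1 / gamma2) * n_x
           * E((gamma2 - 1) / gamma2, n_x ** gamma2) / gamma2
           - (gamma1 / a1) ** (1 / gamma1) * n_x
           * E((gamma1 - 1) / gamma1, n_x ** gamma1) / gamma1
           - 1.0)
    return [eq1, eq2, eq3]

a1, a2, C = fsolve(system, x0=[0.5, 0.5, 1.0], args=(0.4, 0.8, 2.0))
```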
  • 51. numerical approach to the problem The first and only challenge with this approach: obtaining a long memory time series that contains two different scaling exponents. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 16 / 27
  • 52. numerical approach to the problem The first and only challenge with this approach: obtaining a long memory time series that contains two different scaling exponents. The model Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 16 / 27
  • 53. numerical approach to the problem The first and only challenge with this approach: obtaining a long memory time series that contains two different scaling exponents. The model Step 1: set the length of the time series, say, $l = 10^5$. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 16 / 27
  • 54. numerical approach to the problem The first and only challenge with this approach: obtaining a long memory time series that contains two different scaling exponents. The model Step 1: set the length of the time series, say, $l = 10^5$. Step 2: generate a series of random numbers $y_i$, $i = 0, \ldots, l-1$, which follow a Gaussian distribution with mean 0 and variance 1. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 16 / 27
  • 55. numerical approach to the problem The first and only challenge with this approach: obtaining a long memory time series that contains two different scaling exponents. The model Step 1: set the length of the time series, say, $l = 10^5$. Step 2: generate a series of random numbers $y_i$, $i = 0, \ldots, l-1$, which follow a Gaussian distribution with mean 0 and variance 1. Step 3: generate a series of coefficients defined as $C_i^{\alpha} = \frac{\Gamma(i-\alpha)}{\Gamma(-\alpha)\,\Gamma(i+1)} = -\frac{\alpha}{\Gamma(1-\alpha)}\,\frac{\Gamma(i-\alpha)}{\Gamma(i+1)}$, with $\alpha = \alpha_1$ for $0 < i < n_x$ and $\alpha = \alpha_2$ for $n_x < i < \infty$. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 16 / 27
  • 56. Both $\alpha_1$ and $\alpha_2$ belong to the interval $(-0.5, 0)$. The asymptotic behaviour of $C_i^{\alpha}$ for large i can be written as $C_i^{\alpha} \simeq -\frac{\alpha}{\Gamma(1-\alpha)}\, i^{-(1+\alpha)}$ for $i \gg 1$. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 17 / 27
  • 57. Both $\alpha_1$ and $\alpha_2$ belong to the interval $(-0.5, 0)$. The asymptotic behaviour of $C_i^{\alpha}$ for large i can be written as $C_i^{\alpha} \simeq -\frac{\alpha}{\Gamma(1-\alpha)}\, i^{-(1+\alpha)}$ for $i \gg 1$. Step 4: now get a series $y_i^{\alpha}$ using $y_i$ and $C_i^{\alpha}$ according to the relation $y_i^{\alpha} = \sum_{j=0}^{i} y_{i-j}\, C_j^{\alpha}$, $i = 0, \ldots, l-1$. (1) Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 17 / 27
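A minimal Python sketch of Steps 1-4; the function names, the FFT-based convolution, and the parameter values are our choices (the log-gamma form is used only to avoid overflow at large i).

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.special import gamma, gammaln

def coefficients(l, alpha1, alpha2, n_x):
    """C_i^alpha with alpha = alpha1 for i < n_x and alpha = alpha2 beyond n_x."""
    i = np.arange(l)
    alpha = np.where(i < n_x, alpha1, alpha2)
    C = np.empty(l)
    C[0] = 1.0                                   # Gamma(-alpha) / (Gamma(-alpha) Gamma(1)) = 1
    C[1:] = (-alpha[1:] / gamma(1.0 - alpha[1:])
             * np.exp(gammaln(i[1:] - alpha[1:]) - gammaln(i[1:] + 1.0)))
    return C

def two_exponent_series(l=10**5, alpha1=-0.3, alpha2=-0.1, n_x=100, seed=0):
    rng = np.random.default_rng(seed)
    y = rng.standard_normal(l)                   # Step 2: Gaussian noise, mean 0, variance 1
    C = coefficients(l, alpha1, alpha2, n_x)     # Step 3
    return fftconvolve(y, C)[:l]                 # Step 4: y^alpha_i = sum_j y_{i-j} C_j

x = two_exponent_series()
```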
  • 58. DFA of time series generated using previous model Figure: DFA analysis of the time series, log F(n) vs log n, showing the crossover region. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 18 / 27
  • 59. comparison of analytical and numerical results We try to fit $P(r) = a\, r^{-(1-\gamma)}\, e^{-(c/\gamma) r^{\gamma}}$ to each segment, according to its corresponding $\gamma$ value. Figure: ln P(R) vs ln R, with segment 1, segment 2 and the break point marked. The discrepancy arises because of the threshold dependence of the constants and long memory in the return intervals. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 19 / 27
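The per-segment fit described above can be done, for example, with scipy.optimize.curve_fit. In this sketch the binned data, the break point, and the free constants a and c are assumptions; γ is held fixed at the value expected for each segment.

```python
import numpy as np
from scipy.optimize import curve_fit

def segment_form(R, a, c, gam):
    """Product of a power law and a stretched exponential."""
    return a * R ** (-(1.0 - gam)) * np.exp(-(c / gam) * R ** gam)

def fit_segment(R_vals, P_vals, gam):
    """Fit a and c for one segment, keeping gamma fixed."""
    popt, _ = curve_fit(lambda R, a, c: segment_form(R, a, c, gam),
                        R_vals, P_vals, p0=[1.0, 1.0])
    return popt

# Split the binned distribution at the scaled break point R_break and fit each side:
# a1_fit, c1_fit = fit_segment(R[R < R_break], P[R < R_break], gam=gamma1)
# a2_fit, c2_fit = fit_segment(R[R >= R_break], P[R >= R_break], gam=gamma2)
```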
  • 60. Figure: return interval distribution of segment 1 and of segment 2 (ln P(R) vs ln R). Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 20 / 27
  • 61. Long memory probability process with two scaling exponents To remove the discrepancy due to long memory in the return intervals, we generate return intervals that have no dependence on each other. First determine the constants $a_1$ and $a_2$ by normalizing $P_{ex}(r)$ over the region between $k_{min} = 1$ and $k_{max}$: $\int_1^{k_{max}} P_{ex}(r)\,dr = \int_1^{n_x} a_1 r^{-(1-\gamma_1)}\,dr + \int_{n_x}^{k_{max}} a_2 r^{-(1-\gamma_2)}\,dr = 1$. Use the continuity condition as well and solve for $a_1$ and $a_2$: $a_1 = \left[\frac{n_x^{\gamma_1}}{\gamma_1} - \frac{1}{\gamma_1} + \frac{k_{max}^{\gamma_2}\, n_x^{\gamma_1-\gamma_2}}{\gamma_2} - \frac{n_x^{\gamma_1}}{\gamma_2}\right]^{-1}$, $a_2 = \left[\frac{n_x^{\gamma_2}}{\gamma_1} - \frac{n_x^{\gamma_2-\gamma_1}}{\gamma_1} + \frac{k_{max}^{\gamma_2}}{\gamma_2} - \frac{n_x^{\gamma_2}}{\gamma_2}\right]^{-1}$. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 21 / 27
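These closed-form expressions translate directly into code; a small sketch, where the function name is ours and $a_2$ is obtained through the continuity relation (equivalent to the second expression above).

```python
def normalization_constants(gamma1, gamma2, n_x, k_max):
    """a1 and a2 from normalizing P_ex over [1, k_max] plus the continuity condition."""
    a1 = 1.0 / (n_x ** gamma1 / gamma1 - 1.0 / gamma1
                + k_max ** gamma2 * n_x ** (gamma1 - gamma2) / gamma2
                - n_x ** gamma1 / gamma2)
    a2 = a1 * n_x ** (gamma1 - gamma2)   # continuity at n_x gives the same a2 as above
    return a1, a2

print(normalization_constants(gamma1=0.4, gamma2=0.8, n_x=10.0, k_max=1000.0))
```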
  • 62. Now generate a random number $\xi_r$ from a uniform distribution at every r and compare it with the value of $P_{ex}(r)$. The instant is accepted as an extreme event if $\xi_r < P_{ex}(r)$ at that value of r; if $\xi_r \geq P_{ex}(r)$, then it is not an extreme event. Using this procedure we can generate a series of extreme events. Figure: ln P(R) vs ln R for the generated return intervals, with segment 1, segment 2 and the break point marked. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 22 / 27
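A sketch of this acceptance step, reusing normalization_constants from the previous sketch. We read r here as the distance from the most recent extreme event, which is our interpretation of the procedure; probabilities are clipped at 1 as a safeguard, and the series length and parameters are placeholders.

```python
import numpy as np

def generate_events(length, a1, a2, gamma1, gamma2, n_x, seed=0):
    """Accept time instant t as an extreme event when xi_r < P_ex(r)."""
    rng = np.random.default_rng(seed)
    events = [0]                          # seed the process with an event at t = 0
    r = 1                                 # distance from the most recent event
    for t in range(1, length):
        p = a1 * r ** (-(1 - gamma1)) if r < n_x else a2 * r ** (-(1 - gamma2))
        if rng.uniform() < min(p, 1.0):   # xi_r < P_ex(r): extreme event
            events.append(t)
            r = 1
        else:
            r += 1
    return np.array(events)

a1, a2 = normalization_constants(gamma1=0.4, gamma2=0.8, n_x=10.0, k_max=1000.0)
intervals = np.diff(generate_events(10**6, a1, a2, gamma1=0.4, gamma2=0.8, n_x=10.0))
```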
  • 63. Figure: return interval distribution of segment 1 and of segment 2 (ln P(R) vs ln R), together with the analytical form $P(r) = a_1 r^{-(1-\gamma_1)} e^{-(a_1/\gamma_1) r^{\gamma_1}}$ for $0 < r < n_x$ and $P(r) = C a_2 r^{-(1-\gamma_2)} e^{-(a_2/\gamma_2) r^{\gamma_2}}$ for $n_x < r < \infty$. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 23 / 27
  • 64. In the previous slide, for segment 1, in place of $a_1$ we have two variables, a and b. Why do we have two different variables? The possible reason is that the normalization integrals have a lower limit of 0, but in reality the minimum size of a return interval is 1. So even after scaling, the minimum value of the lower limit is $1/\langle r \rangle$. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 24 / 27
  • 65. conclusion For a long memory time series with two different scaling exponents: Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 25 / 27
  • 66. conclusion For a long memory time series with two different scaling exponents: There is a break point in the return interval distribution graph. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 25 / 27
  • 67. conclusion For a long memory time series with two different scaling exponents: There is a break point in the return interval distribution graph. For each scaling exponent there will be a different segment in the return interval distribution. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 25 / 27
  • 68. conclusion For a long memory time series with two different scaling exponents: There is a break point in the return interval distribution graph. For each scaling exponent there will be a different segment in the return interval distribution. Each segment still follows a distribution of the form that is a product of a power law and a stretched exponential. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 25 / 27
  • 69. future direction Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 26 / 27
  • 70. future direction The model that we have used to generate time series with more than one scaling exponent needs fine tuning so that we can test our analytical results more accurately. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 26 / 27
  • 71. future direction The model that we have used to generate time series with more than one scaling exponent needs fine tuning so that we can test our analytical results more accurately. We should also consider the effects of long memory in the return intervals. Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 26 / 27
  • 72. Thank you Questions???? Katiyar S K (IISER Pune) Thesis Presentation May 3, 2011 27 / 27