A comparison of three learning methods to
predict N2O fluxes and N leaching
Nathalie Villa-Vialaneix
http://www.nathalievilla.org
Institut de Mathématiques de Toulouse &
IUT de Carcassonne (Université de Perpignan)
March 18th, 2010
UC - Climate Change Unit, Ispra
1 / 25
Comparison of metamodels
Outline
1 Basics about machine learning for regression problems
2 Presentation of three learning methods
3 Methodology and results [Villa-Vialaneix et al., 2010]
2 / 25
Comparison of metamodels
Basics about machine learning for regression problems
Outline
1 Basics about machine learning for regression problems
2 Presentation of three learning methods
3 Methodology and results [Villa-Vialaneix et al., 2010]
3 / 25
Comparison of metamodels
Basics about machine learning for regression problems
Regression
Consider the problem where:
we want to predict Y ∈ R from X ∈ R^d;
a learning set of N i.i.d. observations of (X, Y) is already available:
(x1, y1), . . . , (xN, yN);
new values of X are available from which we expect to evaluate the
corresponding Y: xN+1, . . . , xM.
Example: Predict N2O fluxes from PH, climate, concentration of N
in rain, fertilization for a large number of HSMU . . .
4 / 25
Comparison of metamodels
Basics about machine learning for regression problems
Two different approaches
1 Linear model: a linear relation between X and Y is assumed:
yi = β^T xi + ε, i = 1, . . . , N.
β can be estimated from the learning set by β̂_N = (X^T X)^(−1) X^T Y,
where X = (x1 . . . xN)^T and Y = (y1 . . . yN)^T.
2 Nonparametric model: no particular form is assumed for the relation
between X and Y:
yi = Φ(xi) + ε, i = 1, . . . , N,
where Φ is unknown.
From (xi, yi)_i, we intend to define a machine, i.e., a function
Φ̂_N : R^d → R that approximates Φ.
5 / 25
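Illustration (not from the original slides): the closed-form estimate β̂_N above in a few lines of R, checked against lm(); the data are simulated and all names are illustrative.

# Toy data: N observations with d explanatory variables
set.seed(1)
N <- 200; d <- 3
X <- matrix(rnorm(N * d), ncol = d)        # rows are x_1, ..., x_N
beta <- c(1, -2, 0.5)
Y <- X %*% beta + rnorm(N, sd = 0.1)       # y_i = beta^T x_i + noise

# Closed-form estimate: beta_hat = (X^T X)^{-1} X^T Y
beta_hat <- solve(t(X) %*% X, t(X) %*% Y)

# The same fit with lm() (no intercept, to match the formula above)
coef(lm(Y ~ X - 1))                        # agrees with beta_hat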
Comparison of metamodels
Basics about machine learning for regression problems
Desirable properties for a machine
The adequacy of Φ̂_N is measured through an error function, L.
Example: mean squared error
L(y, ŷ) = (y − ŷ)^2
Hence, Φ̂_N should be:
accurate on the learning data set: Σ_{i=1}^{N} L(yi, Φ̂_N(xi)) has to be
small;
able to predict new data well (generalization ability):
Σ_{i=N+1}^{M} L(yi, Φ̂_N(xi)) has to be small (but here, the yi are unknown!).
These two objectives are somehow contradictory and a tradeoff
has to be found (see [Vapnik, 1995] for further details).
6 / 25
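As a side note (a sketch, not from the original slides), the two quantities above are simply empirical means of the loss L over the learning set and over new data; in R, with phi_hat denoting any fitted prediction function:

L <- function(y, y_hat) (y - y_hat)^2                        # squared-error loss

empirical_error <- function(phi_hat, x, y) mean(L(y, phi_hat(x)))

# learning error      : empirical_error(phi_hat, x_learn, y_learn)
# generalization error: empirical_error(phi_hat, x_new, y_new)   # y_new unknown in practice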
Comparison of metamodels
Basics about machine learning for regression problems
Purpose of the work
We focus on methods that are universally consistent. These
methods lead to the definition of machines Φ̂_N such that
E[L(Φ̂_N(X), Y)] −→ L* = inf_{Φ: R^d → R} E[L(Φ(X), Y)] as N → +∞,
for any random pair (X, Y).
1 multi-layer perceptrons (neural networks): [Bishop, 1995]
2 Support Vector Machines (SVM): [Boser et al., 1992]
3 random forests: [Breiman, 2001] (universal consistency is not
proven in this case)
7 / 25
Comparison of metamodels
Presentation of three learning methods
Outline
1 Basics about machine learning for regression problems
2 Presentation of three learning methods
3 Methodology and results [Villa-Vialaneix et al., 2010]
8 / 25
Comparison of metamodels
Presentation of three learning methods
Multilayer perceptrons (MLP)
A “one-hidden-layer perceptron” is a function of the form:
Φ_w : x ∈ R^d → Σ_{i=1}^{Q} w_i^(2) G(x^T w_i^(1) + w_i^(0)) + w_0^(2)
where:
the w are the weights of the MLP that have to be learned from the
learning set;
G is a given activation function: typically, G(z) = (1 − e^(−z)) / (1 + e^(−z));
Q is the number of neurons on the hidden layer. It controls the
flexibility of the machine. Q is a hyper-parameter that is usually
tuned during the learning process.
9 / 25
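For concreteness, a direct R transcription of Φ_w above (a sketch; the weight names follow the slide, not any particular package):

# Activation from the slide: G(z) = (1 - exp(-z)) / (1 + exp(-z)), i.e. tanh(z/2)
G <- function(z) (1 - exp(-z)) / (1 + exp(-z))

# One-hidden-layer perceptron with Q hidden neurons, for an input x in R^d
# w1: d x Q matrix of input weights, w0: length-Q hidden biases,
# w2: length-Q output weights, w2_0: output bias
phi_w <- function(x, w1, w0, w2, w2_0) {
  hidden <- G(as.vector(t(w1) %*% x) + w0)   # activations of the Q hidden neurons
  sum(w2 * hidden) + w2_0
}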
Comparison of metamodels
Presentation of three learning methods
Symbolic representation of MLP
[Diagram: the inputs x1, x2, . . . , xd feed the Q hidden neurons through the
weights w^(1) (and biases w^(0)); the hidden neurons feed the output φ_w(x)
through the weights w^(2).]
10 / 25
Comparison of metamodels
Presentation of three learning methods
Learning MLP
Learning the weights: the weights w are learned by a mean squared error
minimization scheme penalized by a weight decay to avoid
overfitting (i.e., to ensure a better generalization ability):
w* = arg min_w Σ_{i=1}^{N} L(yi, Φ_w(xi)) + C‖w‖^2.
Problem: the MSE is not quadratic in w, so the optimization cannot be
solved exactly; the solution found may be a local minimum.
Tuning of the hyper-parameters C and Q: simple validation has
been used to tune C and Q.
11 / 25
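The slides do not say which software was used for the MLP; as an assumption, the penalized fit above is typically obtained in R with the nnet package, where size plays the role of Q and decay the role of C (data frame and column names are illustrative):

library(nnet)

set.seed(1)
mlp <- nnet(N2O ~ ., data = train,
            size   = 5,      # Q: number of hidden neurons (tuned by simple validation)
            decay  = 0.01,   # C: weight-decay penalty (tuned by simple validation)
            linout = TRUE,   # linear output unit, required for regression
            maxit  = 500)

pred <- predict(mlp, newdata = test)

Because of the local minima mentioned above, it is common to restart the optimization from several random initial weight vectors and keep the best fit.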
Comparison of metamodels
Presentation of three learning methods
SVM
SVM also relies on the minimization of a penalized error loss:
1 Basic linear SVM for regression: Φ_(w,b) is of the form
x → w^T x + b, with (w, b) solution of
arg min_(w,b) Σ_{i=1}^{N} L_ε(yi, Φ_(w,b)(xi)) + λ‖w‖^2
where
λ is a regularization (hyper-)parameter (to be tuned during the
learning process);
L_ε(y, ŷ) = max{|y − ŷ| − ε, 0} is an ε-insensitive loss function
(see the ε-insensitive loss function in the appendix).
12 / 25
Comparison of metamodels
Presentation of three learning methods
SVM
SVM also relies on the minimization of a penalized error loss:
1 Basic linear SVM for regression
2 Non-linear SVM for regression is the same except that a (fixed)
non-linear transformation of the inputs is applied first:
ϕ(x) ∈ H is used instead of x.
Kernel trick: in fact, ϕ is never made explicit but is used through a kernel,
K : R^d × R^d → R, with K(xi, xj) = ϕ(xi)^T ϕ(xj).
Common kernel: the Gaussian kernel
K_γ(u, v) = e^(−γ‖u − v‖^2)
is known to have good theoretical properties both for accuracy and
generalization.
12 / 25
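For reference (a sketch, not from the slides), the Gaussian kernel and the kernel matrix it induces can be written in R as:

# Gaussian (RBF) kernel: K_gamma(u, v) = exp(-gamma * ||u - v||^2)
K_gauss <- function(u, v, gamma) exp(-gamma * sum((u - v)^2))

# Kernel matrix K(x_i, x_j) for a sample stored as the rows of a matrix X
kernel_matrix <- function(X, gamma) {
  N <- nrow(X)
  outer(1:N, 1:N, Vectorize(function(i, j) K_gauss(X[i, ], X[j, ], gamma)))
}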
Comparison of metamodels
Presentation of three learning methods
Learning SVM
Learning (w, b): w = Σ_{i=1}^{N} αi K(xi, ·) and b are computed by an
exact optimization scheme (quadratic programming). The only step
that can be time consuming is the computation of the kernel matrix
K(xi, xj) for i, j = 1, . . . , N.
The resulting Φ̂_N is known to be of the form
Φ̂_N(x) = Σ_{i=1}^{N} αi K(xi, x) + b,
where only a few αi are non-zero. The corresponding xi are called
support vectors.
Tuning of the hyper-parameters C = 1/λ and γ: simple
validation has been used. To save computation time, ε has not been
tuned in our experiments but was set to the default value (1), which
ensured at most 0.5N support vectors.
13 / 25
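Assuming the e1071 interface to LIBSVM (the slides do not name the implementation), the fit and its support vectors can be obtained as follows; cost corresponds to C = 1/λ, gamma to γ, and the data names are illustrative:

library(e1071)

svr <- svm(N2O ~ ., data = train,
           type    = "eps-regression",
           kernel  = "radial",   # Gaussian kernel
           cost    = 10,         # C = 1/lambda, tuned by simple validation
           gamma   = 0.1,        # gamma of the Gaussian kernel, tuned by simple validation
           epsilon = 1)          # epsilon-insensitive tube, kept fixed here

length(svr$index)                # number of support vectors
pred <- predict(svr, newdata = test)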
Comparison of metamodels
Presentation of three learning methods
From regression tree to random forest
Example of a regression tree
[Tree diagram: successive binary splits on SOCt, PH, FR and clay (e.g.
SOCt < 0.095, PH < 7.815, FR < 130.45, . . .) lead to terminal nodes whose
predicted values range from about 2.7 to 59.3.]
Each split is made such that the two induced subsets are as homogeneous
as possible.
The prediction of a terminal node is the mean of the Y values of the
observations belonging to this node.
14 / 25
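A single regression tree like the one sketched above can be grown in R with, for instance, the rpart package (an assumption; the slides do not name the software). Each split maximizes the homogeneity of the two child nodes and each leaf predicts the mean of its observations:

library(rpart)

tree <- rpart(N2O ~ ., data = train, method = "anova")  # "anova" = regression tree
pred <- predict(tree, newdata = test)                    # mean of the matching leaf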
Comparison of metamodels
Presentation of three learning methods
Random forest
Basic principle: combination of a large number of individually weak
("under-efficient") regression trees (the prediction is the mean prediction of all trees).
For each tree, two simplifications of the original method are
performed:
1 A given number of observations is randomly chosen from the
training set: this subset of the training data is called the in-bag
sample, whereas the other observations are called out-of-bag and
are used to control the error of the tree;
2 For each node of the tree, a given number of variables are
randomly chosen among all possible explanatory variables.
The best split is then calculated on the basis of these variables and
of the chosen observations. The chosen observations are the
same for a given tree whereas the variables taken into account
change for each split.
15 / 25
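In R, the two randomizations just described (a bootstrap-like sample of observations per tree, a random subset of variables per split) are the defaults of the randomForest package; a hedged sketch with assumed data names:

library(randomForest)

set.seed(1)
rf <- randomForest(N2O ~ ., data = train,
                   ntree = 500,          # number of trees to combine
                   importance = TRUE)    # keep permutation importance for later
# mtry (variables tried at each split) and sampsize (observations drawn per tree)
# are left at their default values, as in the slides.

pred <- predict(rf, newdata = test)      # mean prediction over the trees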
Comparison of metamodels
Presentation of three learning methods
Learning a random forest
Random forests are not very sensitive to their hyper-parameters
(number of observations for each tree, number of variables for
each split): the default values have been used.
The number of trees should be large enough for the mean squared
error computed on the out-of-bag observations to stabilize:
[Plot: out-of-bag (training) and test errors as functions of the number of
trees, from 0 to 500.]
16 / 25
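With randomForest, a curve like the one above can be reproduced directly: for regression, the fitted object stores the out-of-bag MSE after 1, 2, . . . , ntree trees (rf and test are the assumed objects from the previous sketch):

plot(rf$mse, type = "l",
     xlab = "Number of trees", ylab = "Out-of-bag MSE")

mean((test$N2O - predict(rf, newdata = test))^2)   # test MSE, for comparison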
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Outline
1 Basics about machine learning for regression problems
2 Presentation of three learning methods
3 Methodology and results [Villa-Vialaneix et al., 2010]
17 / 25
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Data
5 data sets corresponding to different scenarios coming from the
geobiological simulator DNDC-EUROPE.
S1: Baseline scenario (conventional corn cropping system): 18 794
HSMU
S2: like S1 without tillage: 18 830 HSMU
S3: like S1 with a limit in N input through manure spreading at 170
kg/ha y: 18 800 HSMU
S4: rotation between corn (2y) and catch crop (3y): 40 536 HSMU
S5: like S1 with an additional application of fertilizer in winter:
18 658 HSMU
18 / 25
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Data
11 input variables
N FR (N input through fertilization; kg/ha y);
N MR (N input through manure spreading; kg/ha y);
Nfix (N input from biological fixation; kg/ha y);
Nres (N input from root residue; kg/ha y);
BD (Bulk Density; g/cm^3);
SOC (Soil organic carbon in topsoil; mass fraction);
PH (Soil PH);
Clay (Ratio of soil clay content);
Rain (Annual precipitation; mm/y);
Tmean (Annual mean temperature; °C);
Nr (Concentration of N in rain; ppm).
18 / 25
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Data
2 outputs to estimate from the inputs:
N2O fluxes;
N leaching.
18 / 25
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Methodology
For every data set, every output and every method,
1 The data set has been split into a training set and a test set (on an
80%/20% basis);
2 The machine has been learned from the training set (with a full
validation process for the hyperparameter tuning);
3 The performances have been calculated on the basis of the test
set: for the test set, predictions have been made from the inputs
and compared to the true outputs.
19 / 25
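A compact R sketch of this protocol (names are assumptions; test R² is used only as one possible performance measure, since the exact criterion is not restated here):

set.seed(1)
n <- nrow(dataset)                              # one scenario, one output
idx <- sample(n, size = round(0.8 * n))         # 80% training / 20% test split
train <- dataset[idx, ]
test  <- dataset[-idx, ]

rf   <- randomForest::randomForest(N2O ~ ., data = train, ntree = 500)
pred <- predict(rf, newdata = test)

# Proportion of variance explained on the test set (R^2)
1 - sum((test$N2O - pred)^2) / sum((test$N2O - mean(test$N2O))^2)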
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Numerical results
MLP RF SVM LM
N2O Scenario 1 90.7% 92.3% 91.0% 77.1%
Scenario 2 80.3% 85.0% 82.3% 62.2%
Scenario 3 85.1% 88.0% 86.7% 74.6%
Scenario 4 88.6% 90.5% 86.3% 78.3%
Scenario 5 80.6% 84.9% 82.7% 42.5%
N leaching Scenario 1 89.7% 93.5% 96.6% 70.3%
Scenario 2 91.4% 93.1% 97.0% 67.7%
Scenario 3 90.6% 90.4% 96.0% 68.6%
Scenario 4 80.5% 86.9% 90.6% 42.5%
Scenario 5 86.5% 92.9% 94.5% 71.1%
20 / 25
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Graphical results
[Scatter plots of predicted versus true values: random forest (S1, N2O) on the
training and test sets, and linear model (S1, N2O) on the training and test sets.]
21 / 25
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Further comparisons
        Time for training          Time for using, once the model is trained
RF      + (∼ 15 minutes)           = (∼ 5 seconds)
SVM     = (sensitive to N)         − (∼ 20 seconds)
MLP     − (sensitive to d)         + (∼ 1 second)
Times correspond to the prediction for about 20 000 HSMU on a
Dell Latitude D830 (Intel Core 2 Duo 2.2 GHz, 2 GB RAM, Ubuntu
Linux 9.10) with R (and, for RF, to the training on about 15 000
HSMU).
Time for DNDC: around 200 hours on a desktop computer and
around 2 days on a cluster!
22 / 25
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Understanding which inputs are important
Importance: a measure of the importance of the input variables can
be defined as follows:
for a given input variable, randomly permute its values and
compute the predictions from these randomly permuted inputs;
compare the accuracy of these predictions to the accuracy of the
predictions obtained with the true inputs: the increase in mean
squared error is called the importance.
This comparison is made on the basis of data that are not used to
define the machine, either the validation set or the out-of-bag
observations.
23 / 25
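The permutation procedure described above, written out in R for one input variable (a sketch; model, data and column names are assumptions — with the randomForest package, importance(rf) reports the same idea as %IncMSE computed on the out-of-bag observations):

# Increase in MSE when one input column is randomly permuted,
# computed here on a held-out validation set 'valid'
permutation_importance <- function(model, valid, y_name, var_name) {
  mse <- function(y, y_hat) mean((y - y_hat)^2)
  base_mse <- mse(valid[[y_name]], predict(model, newdata = valid))

  shuffled <- valid
  shuffled[[var_name]] <- sample(shuffled[[var_name]])   # break the link with the output
  perm_mse <- mse(valid[[y_name]], predict(model, newdata = shuffled))

  perm_mse - base_mse                                    # the "importance"
}

permutation_importance(rf, valid, "N2O", "SOC")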
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Understanding which inputs are important
Example (S1, N2O, RF):
[Dot plot: importance (mean decrease in MSE) of the 11 input variables by
rank; from most to least important: SOC, PH, Nr, N_MR, Nfix, N_FR, clay,
Nres, Tmean, BD, rain.]
The variables SOC and PH are the most important for accurate
predictions. 23 / 25
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
Understanding the relation between a
given input and the output
Evolution of N2O as a function of the value of SOC
[Plot: predicted N2O (roughly 5 to 25) as a function of SOC (0.00 to 0.25).]
24 / 25
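Curves of this kind can be produced with randomForest's partialPlot, which averages the forest's predictions over the training data while a single input is varied (a sketch, assuming the rf object and train data frame from the earlier examples):

library(randomForest)

partialPlot(rf, pred.data = train, x.var = "SOC",
            xlab = "SOC", ylab = "Predicted N2O")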
Comparison of metamodels
Bishop, C. (1995).
Neural Networks for Pattern Recognition.
Oxford University Press, New York.
Boser, B., Guyon, I., and Vapnik, V. (1992).
A training algorithm for optimal margin classifiers.
In 5th annual ACM Workshop on COLT, pages 144–152. D. Haussler Editor, ACM Press.
Breiman, L. (2001).
Random forests.
Machine Learning, 45(1):5–32.
Vapnik, V. (1995).
The Nature of Statistical Learning Theory.
Springer Verlag, New York.
Villa-Vialaneix, N., Follador, M., and Leip, A. (2010).
A comparison of three learning methods to predict N2O fluxes and N leaching.
Submitted to MASHS 2010 http://mashs2010.free.fr.
25 / 25
Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010]
ε-insensitive loss function
25 / 25
Comparison of metamodels

A comparison of three learning methods to predict N20 fluxes and N leaching

  • 1.
    A comparison ofthree learning methods to predict N2O fluxes and N leaching Nathalie Villa-Vialaneix http://www.nathalievilla.org Institut de Mathématiques de Toulouse & IUT de Carcassonne (Université de Perpignan) March 18th, 2010 UC - Climate Change Unit, Ispra 1 / 25 Comparison of metamodels
  • 2.
    Sommaire 1 Basics aboutmachine learning for regression problems 2 Presentation of three learning methods 3 Methodology and results [Villa-Vialaneix et al., 2010] 2 / 25 Comparison of metamodels
  • 3.
    Basics about machinelearning for regression problems Sommaire 1 Basics about machine learning for regression problems 2 Presentation of three learning methods 3 Methodology and results [Villa-Vialaneix et al., 2010] 3 / 25 Comparison of metamodels
  • 4.
    Basics about machinelearning for regression problems Regression Consider the problem where: we want to predict Y ∈ R from X ∈ Rd ; a learning set of N i.i.d. observations of (X, Y) is already available: (x1, y1), . . . , (xN, yN); new values of X are available from which we expect to evaluate the corresponding Y: xN+1, . . . , xM. 4 / 25 Comparison of metamodels
  • 5.
    Basics about machinelearning for regression problems Regression Consider the problem where: we want to predict Y ∈ R from X ∈ Rd ; a learning set of N i.i.d. observations of (X, Y) is already available: (x1, y1), . . . , (xN, yN); new values of X are available from which we expect to evaluate the corresponding Y: xN+1, . . . , xM. Example: Predict N2O fluxes from PH, climate, concentration of N in rain, fertilization for a large number of HSMU . . . 4 / 25 Comparison of metamodels
  • 6.
    Basics about machinelearning for regression problems Two different approaches 1 Linear model: A linear relation between X and Y is supposed: yi = βT xi + , i = 1, . . . , N 5 / 25 Comparison of metamodels
  • 7.
    Basics about machinelearning for regression problems Two different approaches 1 Linear model: A linear relation between X and Y is supposed: yi = βT xi + , i = 1, . . . , N β can be estimated from the learning set by ˆβN = (XT X)−1 XT Y where X = (x1 . . . xN)T and Y = (y1 . . . yN)T . 5 / 25 Comparison of metamodels
  • 8.
    Basics about machinelearning for regression problems Two different approaches 1 Linear model: A linear relation between X and Y is supposed: yi = βT xi + , i = 1, . . . , N β can be estimated from the learning set by ˆβN = (XT X)−1 XT Y where X = (x1 . . . xN)T and Y = (y1 . . . yN)T . 2 Non parametric model: No particular relation is supposed between X and Y: yi = Φ(xi) + , i = 1, . . . , N where Φ is unknown. 5 / 25 Comparison of metamodels
  • 9.
    Basics about machinelearning for regression problems Two different approaches 1 Linear model: A linear relation between X and Y is supposed: yi = βT xi + , i = 1, . . . , N β can be estimated from the learning set by ˆβN = (XT X)−1 XT Y where X = (x1 . . . xN)T and Y = (y1 . . . yN)T . 2 Non parametric model: No particular relation is supposed between X and Y: yi = Φ(xi) + , i = 1, . . . , N where Φ is unknown. From (xi, yi)i, we intend to define a machine, i.e., a function ˆΦN : Rd → R that approximates Φ. 5 / 25 Comparison of metamodels
  • 10.
    Basics about machinelearning for regression problems Desirable properties for a machine The adequacy of ˆΦN is measure through an error function, L. Example: Mean squared error L(y, ˆy) = (y − ˆy)2 6 / 25 Comparison of metamodels
  • 11.
    Basics about machinelearning for regression problems Desirable properties for a machine The adequacy of ˆΦN is measure through an error function, L. Example: Mean squared error L(y, ˆy) = (y − ˆy)2 Hence, ΦN should be: accurate for the learning data set: N i=1 L(yi, ˆΦN (xi)) has to be small; 6 / 25 Comparison of metamodels
Basics about machine learning for regression problems Desirable properties for a machine The adequacy of Φ̂N is measured through an error function L. Example: mean squared error L(y, ŷ) = (y − ŷ)2 Hence, Φ̂N should be: accurate on the learning data set: Σ_{i=1}^{N} L(yi, Φ̂N(xi)) has to be small; able to predict new data well (generalization ability): Σ_{i=N+1}^{M} L(yi, Φ̂N(xi)) has to be small (but here the yi are unknown!). 6 / 25 Comparison of metamodels
Basics about machine learning for regression problems Desirable properties for a machine The adequacy of Φ̂N is measured through an error function L. Example: mean squared error L(y, ŷ) = (y − ŷ)2 Hence, Φ̂N should be: accurate on the learning data set: Σ_{i=1}^{N} L(yi, Φ̂N(xi)) has to be small; able to predict new data well (generalization ability): Σ_{i=N+1}^{M} L(yi, Φ̂N(xi)) has to be small (but here the yi are unknown!). These two objectives are somewhat contradictory and a tradeoff has to be found (see [Vapnik, 1995] for further details). 6 / 25 Comparison of metamodels
Basics about machine learning for regression problems Purpose of the work We focus on methods that are universally consistent. These methods lead to the definition of machines Φ̂N such that E[L(Φ̂N(X), Y)] → L∗ = inf_{Φ: Rd→R} E[L(Φ(X), Y)] as N → +∞, for any random pair (X, Y). 7 / 25 Comparison of metamodels
Basics about machine learning for regression problems Purpose of the work We focus on methods that are universally consistent. These methods lead to the definition of machines Φ̂N such that E[L(Φ̂N(X), Y)] → L∗ = inf_{Φ: Rd→R} E[L(Φ(X), Y)] as N → +∞, for any random pair (X, Y). 1 multilayer perceptrons (neural networks): [Bishop, 1995] 2 Support Vector Machines (SVM): [Boser et al., 1992] 3 random forests: [Breiman, 2001] (universal consistency is not proven in this case) 7 / 25 Comparison of metamodels
Presentation of three learning methods Outline 1 Basics about machine learning for regression problems 2 Presentation of three learning methods 3 Methodology and results [Villa-Vialaneix et al., 2010] 8 / 25 Comparison of metamodels
Presentation of three learning methods Multilayer perceptrons (MLP) A "one-hidden-layer perceptron" is a function of the form: Φw(x) = Σ_{i=1}^{Q} w_i^(2) G(xT w_i^(1) + w_i^(0)) + w_0^(2), x ∈ Rd, where: the w are the weights of the MLP, which have to be learned from the learning set; G is a given activation function, typically G(z) = (1 − e−z)/(1 + e−z); Q is the number of neurons on the hidden layer; it controls the flexibility of the machine. Q is a hyper-parameter that is usually tuned during the learning process. 9 / 25 Comparison of metamodels
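To make the formula concrete, here is a small R sketch that evaluates such a one-hidden-layer perceptron for arbitrary weights (the weight values and dimensions are hypothetical, chosen only for illustration):
    # Forward pass of a one-hidden-layer perceptron:
    # Phi_w(x) = sum_{i=1..Q} w2[i] * G(x^T W1[, i] + w0[i]) + w2_0
    G <- function(z) (1 - exp(-z)) / (1 + exp(-z))   # activation from the slide (equals tanh(z/2))
    mlp_forward <- function(x, W1, w0, w2, w2_0) {
      hidden <- G(as.vector(t(W1) %*% x) + w0)       # outputs of the Q hidden neurons
      sum(w2 * hidden) + w2_0                        # linear combination on the output layer
    }
    # Made-up weights with d = 2 inputs and Q = 3 hidden neurons:
    W1 <- matrix(rnorm(6), nrow = 2); w0 <- rnorm(3); w2 <- rnorm(3); w2_0 <- 0.1
    mlp_forward(c(0.5, -1), W1, w0, w2, w2_0)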
Presentation of three learning methods Symbolic representation of MLP [Diagram: the inputs x1, x2, . . . , xd are connected to the Q hidden neurons through the first-layer weights w(1); the hidden neurons are then combined through the second-layer weights w(2), plus a bias w(0), to produce the output Φw(x).] 10 / 25 Comparison of metamodels
Presentation of three learning methods Learning MLP Learning the weights: the weights w are learned by a mean squared error minimization scheme: w∗ = arg min_w Σ_{i=1}^{N} L(yi, Φw(xi)). 11 / 25 Comparison of metamodels
Presentation of three learning methods Learning MLP Learning the weights: the weights w are learned by a mean squared error minimization scheme penalized by a weight decay, to avoid overfitting (and ensure a better generalization ability): w∗ = arg min_w Σ_{i=1}^{N} L(yi, Φw(xi)) + C‖w‖2. 11 / 25 Comparison of metamodels
Presentation of three learning methods Learning MLP Learning the weights: the weights w are learned by a mean squared error minimization scheme penalized by a weight decay, to avoid overfitting (and ensure a better generalization ability): w∗ = arg min_w Σ_{i=1}^{N} L(yi, Φw(xi)) + C‖w‖2. Problem: the MSE is not quadratic in w, so the optimization cannot be solved exactly; some solutions may only be local minima. 11 / 25 Comparison of metamodels
Presentation of three learning methods Learning MLP Learning the weights: the weights w are learned by a mean squared error minimization scheme penalized by a weight decay, to avoid overfitting (and ensure a better generalization ability): w∗ = arg min_w Σ_{i=1}^{N} L(yi, Φw(xi)) + C‖w‖2. Problem: the MSE is not quadratic in w, so the optimization cannot be solved exactly; some solutions may only be local minima. Tuning of the hyper-parameters C and Q: simple validation has been used to tune first C and then Q. 11 / 25 Comparison of metamodels
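A possible R sketch of this training and tuning step uses the nnet package; the data objects (x.train, y.train, x.val, y.val) and the candidate grids are hypothetical, since the slides do not give the actual values:
    library(nnet)
    Q_grid <- c(3, 5, 10)              # hypothetical candidate sizes of the hidden layer
    C_grid <- c(0.001, 0.01, 0.1)      # hypothetical candidate weight-decay values
    grid <- expand.grid(Q = Q_grid, C = C_grid)
    grid$mse <- apply(grid, 1, function(p) {
      fit <- nnet(x.train, y.train, size = p["Q"], decay = p["C"],
                  linout = TRUE, maxit = 500, trace = FALSE)   # linout = TRUE: regression output
      mean((y.val - predict(fit, x.val))^2)                    # simple-validation error
    })
    grid[which.min(grid$mse), ]        # retained (Q, C)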
Presentation of three learning methods SVM SVM also minimizes a penalized error loss: 1 Basic linear SVM for regression: Φ(w,b) is of the form x → wT x + b with (w, b) solution of arg min_(w,b) Σ_{i=1}^{N} Lε(yi, Φ(w,b)(xi)) + λ‖w‖2 where λ is a regularization (hyper-)parameter (to be tuned during the learning process); Lε(y, ŷ) = max{|y − ŷ| − ε, 0} is an ε-insensitive loss function (see the appendix slide on the ε-insensitive loss function). 12 / 25 Comparison of metamodels
Presentation of three learning methods SVM SVM also minimizes a penalized error loss: 1 Basic linear SVM for regression 2 Non-linear SVM for regression is the same, except that a non-linear (fixed) transformation of the inputs is applied first: ϕ(x) ∈ H is used instead of x. 12 / 25 Comparison of metamodels
Presentation of three learning methods SVM SVM also minimizes a penalized error loss: 1 Basic linear SVM for regression 2 Non-linear SVM for regression is the same, except that a non-linear (fixed) transformation of the inputs is applied first: ϕ(x) ∈ H is used instead of x. Kernel trick: in fact, ϕ is never made explicit; it is used only through a kernel K : Rd × Rd → R such that K(xi, xj) = ϕ(xi)T ϕ(xj). 12 / 25 Comparison of metamodels
Presentation of three learning methods SVM SVM also minimizes a penalized error loss: 1 Basic linear SVM for regression 2 Non-linear SVM for regression is the same, except that a non-linear (fixed) transformation of the inputs is applied first: ϕ(x) ∈ H is used instead of x. Kernel trick: in fact, ϕ is never made explicit; it is used only through a kernel K : Rd × Rd → R such that K(xi, xj) = ϕ(xi)T ϕ(xj). Common kernel: the Gaussian kernel Kγ(u, v) = exp(−γ‖u − v‖2) is known to have good theoretical properties, both for accuracy and for generalization. 12 / 25 Comparison of metamodels
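For illustration, a small R sketch of the Gaussian kernel matrix on hypothetical data (γ and the input matrix are made up):
    # Gaussian kernel matrix K(xi, xj) = exp(-gamma * ||xi - xj||^2) on the rows of x.
    gaussian_kernel <- function(x, gamma) {
      d2 <- as.matrix(dist(x))^2       # squared Euclidean distances between observations
      exp(-gamma * d2)
    }
    x <- matrix(rnorm(20), nrow = 10)  # 10 hypothetical observations in R^2
    K <- gaussian_kernel(x, gamma = 0.5)
    dim(K)                             # 10 x 10, symmetric, with ones on the diagonal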
Presentation of three learning methods Learning SVM Learning (w, b): w = Σ_{i=1}^{N} αi K(xi, ·) and b are calculated by an exact optimization scheme (quadratic programming). The only step that can be time consuming is the computation of the kernel matrix K(xi, xj) for i, j = 1, . . . , N. 13 / 25 Comparison of metamodels
Presentation of three learning methods Learning SVM Learning (w, b): w = Σ_{i=1}^{N} αi K(xi, ·) and b are calculated by an exact optimization scheme (quadratic programming). The only step that can be time consuming is the computation of the kernel matrix K(xi, xj) for i, j = 1, . . . , N. The resulting Φ̂N is known to be of the form Φ̂N(x) = Σ_{i=1}^{N} αi K(xi, x) + b where only a few αi are non-zero. The corresponding xi are called support vectors. 13 / 25 Comparison of metamodels
Presentation of three learning methods Learning SVM Learning (w, b): w = Σ_{i=1}^{N} αi K(xi, ·) and b are calculated by an exact optimization scheme (quadratic programming). The only step that can be time consuming is the computation of the kernel matrix K(xi, xj) for i, j = 1, . . . , N. The resulting Φ̂N is known to be of the form Φ̂N(x) = Σ_{i=1}^{N} αi K(xi, x) + b where only a few αi are non-zero. The corresponding xi are called support vectors. Tuning of the hyper-parameters C = 1/λ and γ: simple validation has been used. To save computation time, ε has not been tuned in our experiments but set to the default value (1), which ensured at most 0.5N support vectors. 13 / 25 Comparison of metamodels
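A minimal R sketch of such a fit with the e1071 package; the data objects and the chosen cost/gamma values are hypothetical, and ε is left at its default as on the slide:
    library(e1071)
    # Gaussian (radial) kernel SVM for regression; cost = C = 1/lambda, gamma as in K_gamma.
    fit <- svm(x.train, y.train, type = "eps-regression", kernel = "radial",
               cost = 10, gamma = 0.1)           # values to be chosen by simple validation
    length(fit$index)                            # number of support vectors
    mean((y.val - predict(fit, x.val))^2)        # validation error used for the tuning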
Presentation of three learning methods From regression tree to random forest Example of a regression tree [Tree diagram: successive binary splits on SOCt, PH, FR and clay (e.g., SOCt < 0.095, PH < 7.815, FR < 130.45, clay < 0.185), with leaf predictions ranging from about 2.7 to 59.3.] Each split is made such that the two induced subsets are as homogeneous as possible. The prediction of a terminal node is the mean of the Y values of the observations belonging to this node. 14 / 25 Comparison of metamodels
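Such a regression tree can be grown in R with the rpart package; a minimal sketch on a hypothetical data frame train.data with response N2O:
    library(rpart)
    # Regression tree with squared-error splits; each leaf predicts the mean of its observations.
    tree <- rpart(N2O ~ ., data = train.data, method = "anova")
    plot(tree); text(tree, use.n = TRUE)         # display the fitted splits
    predict(tree, newdata = test.data)           # leaf means for new observations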
Presentation of three learning methods Random forest Basic principle: combination of a large number of under-efficient regression trees (the prediction is the mean of the predictions of all trees). 15 / 25 Comparison of metamodels
Presentation of three learning methods Random forest Basic principle: combination of a large number of under-efficient regression trees (the prediction is the mean of the predictions of all trees). For each tree, two simplifications of the original method are performed: 1 A given number of observations is randomly chosen from the training set: this subset of the training data is called the in-bag sample, whereas the remaining observations are called out-of-bag and are used to control the error of the tree; 2 For each node of the tree, a given number of variables is randomly chosen among all possible explanatory variables, and the best split is then computed on the basis of these variables and of the chosen observations. The chosen observations are the same for a given tree, whereas the variables taken into account change for each split. 15 / 25 Comparison of metamodels
Presentation of three learning methods Learning a random forest Random forests are not very sensitive to the hyper-parameters (number of observations for each tree, number of variables for each split): the default values have been used. 16 / 25 Comparison of metamodels
Presentation of three learning methods Learning a random forest Random forests are not very sensitive to the hyper-parameters (number of observations for each tree, number of variables for each split): the default values have been used. The number of trees should be large enough for the mean squared error based on the out-of-bag observations to stabilize: [Figure: error as a function of the number of trees (0 to 500); both the out-of-bag (training) error and the test error stabilize as the number of trees grows.] 16 / 25 Comparison of metamodels
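In R this corresponds to the randomForest package; a minimal sketch on the hypothetical train.data / test.data used above, keeping the default mtry and sample size:
    library(randomForest)
    # Default hyper-parameters; 500 trees is usually enough for the OOB error to stabilize.
    rf <- randomForest(N2O ~ ., data = train.data, ntree = 500)
    plot(rf)                                           # out-of-bag MSE as a function of the number of trees
    mean((test.data$N2O - predict(rf, test.data))^2)   # test error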
Methodology and results [Villa-Vialaneix et al., 2010] Outline 1 Basics about machine learning for regression problems 2 Presentation of three learning methods 3 Methodology and results [Villa-Vialaneix et al., 2010] 17 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Data 5 data sets corresponding to different scenarios coming from the biogeochemical simulator DNDC-EUROPE:
S1: baseline scenario (conventional corn cropping system): 18 794 HSMU
S2: like S1, without tillage: 18 830 HSMU
S3: like S1, with N input through manure spreading limited to 170 kg/(ha·y): 18 800 HSMU
S4: rotation between corn (2 y) and catch crop (3 y): 40 536 HSMU
S5: like S1, with an additional application of fertilizer in winter: 18 658 HSMU
18 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Data 11 input variables:
N FR (N input through fertilization; kg/(ha·y));
N MR (N input through manure spreading; kg/(ha·y));
Nfix (N input from biological fixation; kg/(ha·y));
Nres (N input from root residue; kg/(ha·y));
BD (bulk density; g/cm3);
SOC (soil organic carbon in topsoil; mass fraction);
PH (soil PH);
Clay (ratio of soil clay content);
Rain (annual precipitation; mm/y);
Tmean (annual mean temperature; °C);
Nr (concentration of N in rain; ppm).
18 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Data 2 outputs to estimate from the inputs: N2O fluxes; N leaching. 18 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Methodology For every data set, every output and every method, 1 The data set has been split into a training set and a test set (on an 80%/20% basis); 19 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Methodology For every data set, every output and every method, 1 The data set has been split into a training set and a test set (on an 80%/20% basis); 2 The machine has been learned from the training set (with a full validation process for the hyper-parameter tuning); 19 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Methodology For every data set, every output and every method, 1 The data set has been split into a training set and a test set (on an 80%/20% basis); 2 The machine has been learned from the training set (with a full validation process for the hyper-parameter tuning); 3 The performance has been computed on the test set: for the test set, predictions have been made from the inputs and compared to the true outputs. 19 / 25 Comparison of metamodels
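A minimal R sketch of this protocol; the data frame, the response name and the use of R² as the accuracy measure reported on the next slide are assumptions, not statements from the slides:
    library(randomForest)
    set.seed(42)
    # Hypothetical 80% / 20% split of a data frame 'dataset' with response N2O.
    n <- nrow(dataset)
    idx <- sample(n, size = round(0.8 * n))
    train.data <- dataset[idx, ]; test.data <- dataset[-idx, ]
    model <- randomForest(N2O ~ ., data = train.data)   # any of the three learners could be used here
    pred <- predict(model, test.data)
    # Test-set accuracy, computed here as R^2:
    1 - sum((test.data$N2O - pred)^2) / sum((test.data$N2O - mean(test.data$N2O))^2)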
Methodology and results [Villa-Vialaneix et al., 2010] Numerical results
                 MLP      RF       SVM      LM
N2O
  Scenario 1     90.7%    92.3%    91.0%    77.1%
  Scenario 2     80.3%    85.0%    82.3%    62.2%
  Scenario 3     85.1%    88.0%    86.7%    74.6%
  Scenario 4     88.6%    90.5%    86.3%    78.3%
  Scenario 5     80.6%    84.9%    82.7%    42.5%
N leaching
  Scenario 1     89.7%    93.5%    96.6%    70.3%
  Scenario 2     91.4%    93.1%    97.0%    67.7%
  Scenario 3     90.6%    90.4%    96.0%    68.6%
  Scenario 4     80.5%    86.9%    90.6%    42.5%
  Scenario 5     86.5%    92.9%    94.5%    71.1%
20 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Graphical results [Figure: predicted value versus true value on the training set and on the test set, for the random forest (S1, N2O) and for the linear model (S1, N2O); axes "True value (train/test)" against "Predicted value (train/test)".] 21 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Further comparisons
        Time for training        Time for use once the model is trained
RF      + (∼ 15 minutes)         = (∼ 5 seconds)
SVM     =                        − (∼ 20 seconds), sensitive to N
MLP     −                        + (∼ 1 second), sensitive to d
Times correspond to the prediction for about 20 000 HSMU on a Dell Latitude D830 (Intel Core 2 Duo 2.2 GHz, 2 GB RAM, Ubuntu Linux 9.10) with R (and, for RF, to the training on about 15 000 HSMU). Time for DNDC: around 200 hours by using a desktop computer and around 2 days by using cluster 7! 22 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Understanding which inputs are important Importance: a measure of the importance of the input variables can be defined as follows: for a given input variable, randomly permute its values and compute the predictions from these randomly permuted inputs; compare the accuracy of these predictions to the accuracy of the predictions obtained with the true inputs: the increase in mean squared error is called the importance. 23 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Understanding which inputs are important Importance: a measure of the importance of the input variables can be defined as follows: for a given input variable, randomly permute its values and compute the predictions from these randomly permuted inputs; compare the accuracy of these predictions to the accuracy of the predictions obtained with the true inputs: the increase in mean squared error is called the importance. This comparison is made on data that are not used to define the machine: either the validation set or the out-of-bag observations. 23 / 25 Comparison of metamodels
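For random forests this measure is directly available in R; a minimal sketch on the hypothetical objects used above:
    library(randomForest)
    # importance = TRUE makes randomForest compute the permutation importance
    # (mean decrease of MSE on the out-of-bag observations) for each input variable.
    rf <- randomForest(N2O ~ ., data = train.data, importance = TRUE)
    importance(rf, type = 1)     # %IncMSE: permutation importance per variable
    varImpPlot(rf, type = 1)     # ranked importances, as on the next slide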
Methodology and results [Villa-Vialaneix et al., 2010] Understanding which inputs are important Example (S1, N2O, RF): [Figure: permutation importance (mean decrease in MSE) of the 11 inputs, ranked; SOC and PH come first, followed by Nr, N_MR, Nfix, N_FR, clay, Nres, Tmean, BD and rain.] The variables SOC and PH are the most important for accurate predictions. 23 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] Understanding the relation between a given input and the output [Figure: evolution of the predicted N2O as a function of the value of SOC (SOC from 0 to 0.25).] 24 / 25 Comparison of metamodels
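With a random forest, such a curve can be obtained as a partial dependence plot in R; a minimal sketch on the hypothetical objects used above:
    library(randomForest)
    # Partial dependence of the predicted N2O on SOC: the forest's predictions are
    # averaged over the other inputs of the training data.
    partialPlot(rf, pred.data = train.data, x.var = "SOC", ylab = "N2O predicted")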
Bishop, C. (1995). Neural Networks for Pattern Recognition. Oxford University Press, New York.
Boser, B., Guyon, I., and Vapnik, V. (1992). A training algorithm for optimal margin classifiers. In 5th Annual ACM Workshop on COLT, pages 144-152. D. Haussler, editor, ACM Press.
Breiman, L. (2001). Random forests. Machine Learning, 45(1):5-32.
Vapnik, V. (1995). The Nature of Statistical Learning Theory. Springer Verlag, New York.
Villa-Vialaneix, N., Follador, M., and Leip, A. (2010). A comparison of three learning methods to predict N2O fluxes and N leaching. Submitted to MASHS 2010, http://mashs2010.free.fr.
25 / 25 Comparison of metamodels
Methodology and results [Villa-Vialaneix et al., 2010] The ε-insensitive loss function [Figure: the ε-insensitive loss Lε(y, ŷ) = max{|y − ŷ| − ε, 0}, equal to zero for errors smaller than ε and linear beyond.] 25 / 25 Comparison of metamodels