Estimation of the Latent Signals for Consensus
Across Multiple Ranked Lists using Convex
Optimisation
Luca Vitale, Michael G. Schimek
Department of Economics and Statistics, Università degli Studi di Salerno, Italy
and
Institute for Medical Informatics, Statistics and Documentation, Medical University
of Graz, Austria
SimStat
Salzburg, 2-6 September, 2019
The data-analytic problem
We have access only to the rankings of the objects, not to the
underlying data that informed the assessors' decisions and led
to those rankings
p = 5 (# of objects) n = 4 (# of assessors)
What do we aim at?
Let us have a set of p distinct objects
Let us assume n independent assessments assigning
rank positions to the same set of objects
These objects are ranked between 1 and p, without ties
Our aim is estimation of those signal parameters
which determine the realized rank assignments
Practical Problem: Probabilistic models, especially of
Bayesian type, require computationally highly demanding
stochastic optimisation techniques
Consequence: Only rather small sets of objects can be
handled, and most models do not allow one to solve
p ≫ n problems
Please note: rank aggregation is a different task, carried out under the
assumption that the observed rankings are 'correct'
Necessary assumptions for the proposed method
overcoming current limitations
We assume

X_{ij} = \theta_i^{\mathrm{true}} + Z_{ij},

where the real-valued parameters θ_i^true are to be estimated, and the Z_ij are
arbitrary random variables (the object-specific noise of each assessor)
The parameters θ_i^true represent the (normalised) 'true' consensus signals
underlying the assessments
The random variables X_{1,j}, . . . , X_{p,j} are observed by the j-th assessor
These random variables are ordered X_{π_1,j} > . . . > X_{π_p,j}
and define the ranked list produced by the j-th assessor
Proposed method for signal estimation
The goal of consensus across the assessors (rankers) is
achieved by reducing the global, rank-order-induced noise Z
The minimum-noise result is used for indirect inference to
estimate the unobserved signals θ_i informing the
consensus ranks
The solution is obtained by convex optimisation
techniques
Two types of penalisation are applied: linear and
quadratic, where b denotes the penalty parameter
We consider two different approaches for handling the
necessary constraints: a full method and a reduced
method, the latter for higher computational efficiency
Convex optimisation
\min_{x} \; c^{T}x \quad \text{s.t.} \quad Ax \le b, \;\; x \ge 0 \qquad (1)
where:
c is a real t-dimensional vector, where t equals the number
of variables
A is an m × t real matrix, where m is the
number of constraints
b is an m-dimensional real vector and represents the
penalisation for each constraint
(a minimal numerical sketch of this standard form follows below)
Software used for convex optimisation
Gurobi¹ is a powerful mathematical programming solver available
for Linear Programming (LP) and Quadratic Programming (QP)
problems
To solve the optimisation problem, the simplex method is used.
¹ Gurobi Optimizer Reference Manual
Full method - example for construction of constraints
Constraints for one item swap:
assessor 1
θ3 + z(3,1) − θ2 − z(2,1) ≥ b
θ3 + z(3,1) − θ1 − z(1,1) ≥ b
θ2 + z(2,1) − θ1 − z(1,1) ≥ b
assessor 2
θ3 + z(3,2) − θ1 − z(1,2) ≥ b
θ3 + z(3,2) − θ2 − z(2,2) ≥ b
θ1 + z(1,2) − θ2 − z(2,2) ≥ b
# of variables: n × p + p    # of constraints: n × p(p − 1)/2 (see the sketch below)
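As a cross-check of the counts above, the following sketch enumerates the full-method pairwise constraints for this toy example; the assumed orderings (assessor 1: objects 3, 2, 1 and assessor 2: objects 3, 1, 2, best to worst) are inferred from the constraints listed on this slide.

```python
# Enumerate the full-method pairwise constraints for the toy example above.
from itertools import combinations

orderings = {1: [3, 2, 1], 2: [3, 1, 2]}      # assumed rankings, best to worst
p, n = 3, len(orderings)

constraints = []
for j, order in orderings.items():
    for hi, lo in combinations(order, 2):     # hi is ranked above lo by assessor j
        constraints.append(f"theta_{hi} + z({hi},{j}) - theta_{lo} - z({lo},{j}) >= b")

print("\n".join(constraints))
print("# of variables  :", n * p + p)             # n*p noise terms plus p signals
print("# of constraints:", n * p * (p - 1) // 2, "=", len(constraints))
```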
The objective function
We consider two kinds of minimisation:
Linear optimisation (LP), based on the sum of the support variables z_{(i,j)}:

\sum_{i=1}^{p} \sum_{j=1}^{n} z_{(i,j)}

Quadratic optimisation (QP), based on the sum of the squared support variables z_{(i,j)}:

\sum_{i=1}^{p} \sum_{j=1}^{n} z_{(i,j)}^{2}

The minimisation of the objective function permits an automatic adaptation
of the individual signals θ towards their consensus signal θ̂, which represents
the observed individual rankings in an optimal way
Full method - general formulation (linear case)
\min_{\theta,\,z} \;\; \sum_{i=1}^{p} \sum_{j=1}^{n} z_{(i,j)}

subject to, for every assessor j = 1, …, n and all rank positions k < l:

\theta_{\pi(k,j)} + z_{(\pi(k,j),j)} - \theta_{\pi(l,j)} - z_{(\pi(l,j),j)} \ge b

\theta_i \ge 0, \quad i = 1, …, p
z_{(i,j)} \ge 0, \quad i = 1, …, p, \;\; j = 1, …, n
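A minimal sketch of this full-method linear programme, written with the open-source modelling package CVXPY rather than Gurobi; the toy rankings and the penalty value b = 0.1 are illustrative assumptions, not values from the talk.

```python
# Sketch of the full-method LP with CVXPY (the talk used Gurobi).
# pi[:, j] lists the objects ranked by assessor j, best to worst (toy values).
import numpy as np
import cvxpy as cp

pi = np.array([[3, 3],
               [2, 1],
               [1, 2]]) - 1               # 0-based object indices
p, n = pi.shape
b = 0.1                                    # penalty parameter (illustrative)

theta = cp.Variable(p, nonneg=True)        # consensus signals
z = cp.Variable((p, n), nonneg=True)       # one noise term per object and assessor

cons = []
for j in range(n):
    for k in range(p - 1):
        for l in range(k + 1, p):
            hi, lo = int(pi[k, j]), int(pi[l, j])
            cons.append(theta[hi] + z[hi, j] - theta[lo] - z[lo, j] >= b)

prob = cp.Problem(cp.Minimize(cp.sum(z)), cons)
prob.solve()
print(np.round(theta.value, 3))
```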
Reduced method - example for construction of
constraints
Constraints for one item swap:
assessor 1
θ3 + z(1,1) − θ2 ≥ b
θ2 + z(2,1) − θ1 ≥ b
assessor 2
θ3 + z(1,2) − θ1 ≥ b
θ1 + z(2,2) − θ2 ≥ b
# of variables: n × (p − 1) + p # of constraints: n × (p − 1)
Reduced method - general formulation (linear case)
\min_{\theta,\,z} \;\; \sum_{i=1}^{p-1} \sum_{j=1}^{n} z_{(i,j)}

subject to, for every assessor j = 1, …, n:

\theta_{\pi(1,j)} + z_{(1,j)} - \theta_{\pi(2,j)} \ge b
\theta_{\pi(2,j)} + z_{(2,j)} - \theta_{\pi(3,j)} \ge b
\;\;\vdots
\theta_{\pi(p-1,j)} + z_{(p-1,j)} - \theta_{\pi(p,j)} \ge b

\theta_i \ge 0, \quad i = 1, …, p
z_{(i,j)} \ge 0, \quad i = 1, …, p − 1, \;\; j = 1, …, n
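A corresponding sketch of the reduced method, again with CVXPY under the same illustrative assumptions; replacing cp.sum with cp.sum_squares switches from the linear to the quadratic objective.

```python
# Sketch of the reduced method: only adjacent rank positions are constrained,
# with one noise term per position (not per object) and assessor.
import numpy as np
import cvxpy as cp

pi = np.array([[3, 3],
               [2, 1],
               [1, 2]]) - 1                        # toy rankings, best to worst
p, n = pi.shape
b = 0.1                                             # illustrative penalty value

theta = cp.Variable(p, nonneg=True)
z = cp.Variable((p - 1, n), nonneg=True)

cons = [theta[int(pi[k, j])] + z[k, j] - theta[int(pi[k + 1, j])] >= b
        for j in range(n) for k in range(p - 1)]

cp.Problem(cp.Minimize(cp.sum(z)), cons).solve()           # linear objective (LP)
theta_lin = theta.value.copy()
cp.Problem(cp.Minimize(cp.sum_squares(z)), cons).solve()   # quadratic objective (QP)
print(np.round(theta_lin, 3), np.round(theta.value, 3))
```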
Full versus reduced method - linear versus quadratic
optimisation
Number of variables: n × p + p (full) compared to n × (p − 1) + p (reduced)
Number of constraints: n × p(p − 1)/2 (full) compared to n × (p − 1) (reduced); see the sketch after this list
As a consequence, the reduced method offers a
substantial gain in numerical efficiency, especially for a
large or huge number of objects p
Linear optimisation estimates only a discrete
approximation to the real-valued signals ⇒ an additional
bootstrap step is needed
Quadratic optimisation estimates the real-valued
signals directly ⇒ no additional computational step is
needed (unless standard errors are required)
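A quick illustration of the scaling argument above: the full method's constraint count grows quadratically in p, the reduced method's only linearly (the helper names below are mine).

```python
# Variable and constraint counts of the full versus the reduced formulation.
def full_counts(p, n):
    return n * p + p, n * p * (p - 1) // 2        # (variables, constraints)

def reduced_counts(p, n):
    return n * (p - 1) + p, n * (p - 1)

for p in (20, 100, 1000, 10000):
    print(p, "full:", full_counts(p, 20), "reduced:", reduced_counts(p, 20))
```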
Estimation of θ̂ and its standard error se(θ̂) via bootstrap
Let us select B independent bootstrap samples, with replacement, from the
columns of the input ranking matrix
For these bootstrap replicates we estimate the corresponding parameters θ̂*(b)
Then we can estimate the standard error se(θ̂) by the standard deviation
of the B replications (a small sketch follows below):

se_B = \left\{ \sum_{b=1}^{B} \bigl[\hat{\theta}^{*}(b) - \bar{\theta}^{*}\bigr]^{2} / (B - 1) \right\}^{1/2},
\quad \text{where} \quad \bar{\theta}^{*} = \sum_{b=1}^{B} \hat{\theta}^{*}(b) / B
The algorithmic workflow
Simulation study
simulate p object values θ_i ∼ N(0, 1);
simulate n assessor-specific sigma values σ_j ∼ |N(0, 0.4²)|;
For each assessor j:
simulate p noise values Z_{i,j} ∼ N(0, σ_j²)
set X_{i,j} = θ_i + Z_{i,j}
(a small sketch of this design follows below)
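A sketch of this simulation design in NumPy; reading the flattened "0.42" of the slide as a variance of 0.4² (standard deviation 0.4) is an assumption.

```python
# Sketch of the simulation design described above.
import numpy as np

rng = np.random.default_rng(2019)
p, n = 20, 20

theta = rng.normal(0.0, 1.0, size=p)              # theta_i ~ N(0, 1)
sigma = np.abs(rng.normal(0.0, 0.4, size=n))      # sigma_j ~ |N(0, 0.4^2)| (assumed)
Z = rng.normal(0.0, sigma, size=(p, n))           # Z_ij ~ N(0, sigma_j^2)
X = theta[:, None] + Z                            # X_ij = theta_i + Z_ij

ranks = (-X).argsort(axis=0).argsort(axis=0) + 1  # rank 1 = largest X per assessor
print(ranks[:, 0])
```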
Simulation results
Computer used
4-core Intel Core i5-3470 CPU @ 3.2 GHz, Windows 7 (64 bit)
16 GB RAM, 2133 MHz
Evaluation of full and reduced method, linear and
quadratic, with B = 500 bootstrap replications
[Figure: boxplots of the Pearson, Kendall and Spearman metric values (roughly 0.4 to 1.0), in panels P = 20 / N = 20, P = 20 / N = 100 and P = 100 / N = 20, for the methods FullLinear, FullQuadratic, ReducedLinear and ReducedQuadratic]
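The three metrics shown in the figure can be computed with scipy.stats; the sketch below compares a true signal vector with a perturbed stand-in estimate (made-up data), presumably mirroring how estimated and true signals are compared in the simulation.

```python
# The three agreement metrics shown in the figure, computed with scipy.stats
# on made-up vectors standing in for the true and the estimated signals.
import numpy as np
from scipy.stats import pearsonr, kendalltau, spearmanr

rng = np.random.default_rng(0)
theta_true = rng.normal(size=20)
theta_hat = theta_true + rng.normal(scale=0.3, size=20)   # perturbed stand-in

r_p, _ = pearsonr(theta_true, theta_hat)
r_k, _ = kendalltau(theta_true, theta_hat)
r_s, _ = spearmanr(theta_true, theta_hat)
print(f"Pearson {r_p:.3f}  Kendall {r_k:.3f}  Spearman {r_s:.3f}")
```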
Execution time of all methods
[Figure: mean execution time (up to about 3 hours) for the (P,N) combinations 20.20, 100.20 and 20.100, for FullLinear, FullQuadratic, ReducedLinear and ReducedQuadratic; mean time with 500 bootstrap replications (200 MC runs)]
Evaluation of reduced method, linear and quadratic,
with B = 500 bootstrap replications, for n ≫ p data
[Figure: boxplots of the Pearson, Kendall and Spearman metric values (roughly 0.4 to 1.0), in panels P = 20 with N = 20, 100, 500 and 1000, for ReducedLinear and ReducedQuadratic]
Evaluation of reduced method, linear and quadratic,
with B = 500 bootstrap replications, for p ≫ n data
[Figure: boxplots of the Pearson, Kendall and Spearman metric values (roughly 0.4 to 1.0), in panels N = 20 with P = 20, 100, 500 and 1000, for ReducedLinear and ReducedQuadratic]
Signal estimation of reduced linear and reduced
quadratic method for p=20 and n=20 data
[Figure: true signals (theta.true) and estimated signals (signal.estimate) for objects 1 to 20, signal values between −2 and 2; panels Reduced Quadratic and Reduced Linear]
Signal estimation of full linear and full quadratic
method for p=20 and n=20 data
[Figure: true signals (theta.true) and estimated signals (signal.estimate) for objects 1 to 20, signal values between −2 and 2; panels Full Quadratic and Full Linear]
Execution time for various combinations of n and p
[Figure: mean execution time (up to about 50 minutes) for the (N,P) combinations 20.20, 100.20, 500.20, 1000.20, 20.100, 20.500 and 20.1000, for ReducedLinear and ReducedQuadratic; mean time with 500 bootstrap replications (1000 MC runs)]
Execution time increasing p and n = 20
[Figure: execution time (up to about 6 hours) of a single estimation for n = 20 and p increasing from 100 up to roughly 500 000, for reducedQuadratic and reducedLinear]
Summary and conclusions
A convex optimisation approach for estimation of the
consensus signals underlying ranked lists was proposed
It is computationally more efficient than any stochastic
optimisation approach, e.g. MCMC
The quality of estimation is very promising, even for p ≫ n
The reduced method is substantially faster and can handle
large or even huge data
Linear
Pros: faster execution
Cons: discrete approximate estimates; need for a bootstrap step
Quadratic
Pros: real-valued, precise estimates; no need for a bootstrap step
Cons: slower execution
Further work
Development of an R package for CRAN
Implementation of other Bootstrap concepts and
comparison with the current one
Generalisation of the method for missing ranks and tied
ranks
Detection of irregular assessments