Design of Multi-Criteria Decision Making
Algorithms for Cloud Computing
Submitted By: Munmun Saha (Reg. No. 1810040005)
Guided By: Dr. Suvasini Panigrahi
Co-Guide: Dr. Sanjaya Kumar Panda
Department of Computer Science and Engineering
Veer Surendra Sai University of Technology
Burla, Odisha, India
1
CONTENT
 Introduction to Cloud Computing
 Overview of Multi-Criteria Decision Making
 Motivation and Objectives
 Previous Work
 Problem Statement
 Proposed Methodology and Case Study
 Road Map
 Conclusions
 References
2
Definition of Cloud Computing
Some definitions of cloud computing:
• Cloud computing is a general term for distributed computing over the Internet, i.e., delivering computing services over the Internet.
• The practice of using a network of remote servers hosted on the Internet to store, manage and process data, rather than a local server or a personal computer, is known as cloud computing.
• The National Institute of Standards and Technology (NIST) defines cloud computing as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.
3
Services of Cloud Computing
4
Figure 1 Services of Cloud Computing
Deployment Models of Cloud Computing
5
Figure 2 Deployment Models of Cloud Computing
Multi-Criteria Decision Making
Multi-Criteria Decision Making (MCDM) refers to making decisions in the presence of multiple, usually conflicting, criteria.
6
Figure 3 Multi-Criteria Decision Making
Real-Life Example of Multi-Criteria Decision Making
Suppose we want to buy a car.
We have different alternatives: BMW, Ford, Honda, Toyota, etc.
Goal: to select the best car.
Criteria: price, MPG, style and riding comfort.
7
Figure 4 Real-life example of Multi-Criteria Decision Making
Motivation and Objectives
In spite of its huge significance, scant attention has been paid to MCDM in cloud computing that considers all the performance parameters, including both beneficial and non-beneficial attributes. The primary objective of MCDM is to select the best cloud service provider without compromising any SLA index, while considering both objective and subjective criteria.
Moreover, the algorithms implemented in the existing works are limited in many applications, especially in minimizing the non-beneficial attribute values and maximizing the beneficial attribute values. Following these ideas and motivated by previous works, the objectives of the MCDM algorithm are as follows:
8
Motivation and Objectives
• To select the best CSP among various homogeneous alternatives.
• To evaluate different QoS parameters of the cloud service providers.
• Not to compromise the key performance indices mentioned in the SLAs.
• To categorize the QoS parameters in the B-O-C-R model (Benefits, Opportunities, Cost, Risk), where Benefits and Opportunities are beneficial attributes and Cost and Risk are non-beneficial attributes.
• To select the CSP having the maximum beneficial and minimum non-beneficial criteria values.
• To select the best criteria of each cloud provider, such that a composite service can be provided through collaboration among the CSPs.
9
Elements of Multi-Criteria Decision Making
An MCDM problem has five elements:
• A Goal
• At least two Alternatives
• Two or more criteria
• Criteria weights
• Decision Makers
10
Types of Multi-Criteria Decision Making
MCDM is further classified into:
1. Multi-Objective Decision Making (MODM)
2. Multi-Attribute Decision Making (MADM)
MODM is applied to a continuous search space, whereas MADM is applied to a finite number of alternatives, i.e., a discrete search space.
11
Figure 5(a) Multi-Criteria Decision Making, Figure 5(b) Multi-Criteria Decision Making
Some existing algorithms of MCDM
MCDM methods are broadly grouped into the MODM and MADM branches; some existing algorithms are:
• Weighted Sum Method (WSM)
• Weighted Product Method (WPM)
• Analytic Hierarchy Process (AHP)
• Technique for Order Preference by Similarity to Ideal Solution (TOPSIS)
• Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE)
• Vise Kriterijumska Optimizacija I Kompromisno Resenje (VIKOR)
• Multi-Objective Optimization using Ratio Analysis (MOORA)
• Graph Theory Matrix Approach (GTMA)
12
Literature Review (Comparison Table)
Sl. No. | Author's Name | Year | MCDM Method | Objective | Approach | Service | Tool | Criteria and Alternatives
1 | Manish Godse, Shrikant Mulik [27] | 2008 | AHP | To select an appropriate SAAS product. | Weights of the parameters and the final scores of the alternatives are calculated by AHP to avoid subjective opinion. | SAAS | - | 5 criteria, 16 sub-criteria and 3 alternatives.
2 | Vuong Xuan Tran et al. [32] | 2009 | AHP | To rank web services. | QoS-based ranking algorithm combining AHP to rank web services. | - | - | 8 QoS properties and 5 web services.
3 | Chen-Tung Chen, Kuan-Hung Lin [28] | 2010 | FAHP | For evaluating cloud services. | Interval-valued fuzzy sets combined with AHP are used for evaluation of cloud services. | Overall services | - | 3 criteria, 9 sub-criteria and 3 alternatives.
13
Table 1 Comparison Table
Literature Review (Comparison Table)
Sl. No. | Author's Name | Year | MCDM Method | Objective | Approach | Service | Tool | Criteria and Alternatives
4 | Saurabh Kumar Garg, Steve Versteeg, Rajkumar Buyya [20] | 2013 | AHP | Framework to rank cloud computing services. | AHP for both assigning weights to criteria and ranking the alternatives. | Overall services | - | 6 criteria, 9 sub-criteria and 3 alternatives.
5 | Hong-Kyu Kwon, Kwang Kyu Seo [23] | 2013 | Fuzzy AHP | To select a suitable IAAS provider. | Fuzzy AHP for assigning weights to the criteria and AHP for ranking the alternatives. | IAAS | Expert Choice | 3 criteria, 8 sub-criteria and 5 IAAS providers.
6 | Mingzhe Wang, Yu Liu [24] | 2013 | ANP | Evaluation of QoS requirements in Cloud Service Architecture (CSA). | Control hierarchy is constructed using ANP; relative superiority is calculated from the super-matrix. | IAAS, PAAS, SAAS | CPN+ and CloudSim | 35 criteria and 5 alternatives.
7 | Gultekin Atas, Vehbi Cagri Gungor [21] | 2014 | AHP, LSP | To evaluate the performance of the PAAS providers. | AHP for decomposing performance variables and LSP for logical scoring. | PAAS | - | 3 criteria, 19 sub-criteria and 3 alternatives.
14
Literature Review (Comparison Table)
Sl. No. | Author's Name | Year | MCDM Method | Objective | Approach | Service | Tool | Criteria and Alternatives
8 | Ramachandran N. et al. [22] | 2014 | AHP | To deploy an appropriate model of cloud computing in an academic institution. | AHP for comparing the criteria and ranking the cloud models. | Overall service | Super Decision | 6 main factors, 28 sub-factors and 4 cloud deployment models.
9 | Mohamed AbdelBasset et al. [25] | 2016 | NAHP | To evaluate cloud computing services. | Neutrosophic MCDM analysis approach based on AHP. | IAAS, PAAS, SAAS | - | 5 criteria and 3 alternatives.
10 | Rajanpreet Kaur Chahal and Sarbjeet Singh [26] | 2016 | AHP | To rank CSPs. | AHP is used for comparison of the criteria and ranking the alternatives. | Overall | - | 4 criteria and 5 alternatives.
11 | Rakesh Ranjan Kumar et al. [29] | 2017 | AHP and Fuzzy TOPSIS | Prioritizing the solution of cloud service selection. | AHP for calculating the weights of the criteria and Fuzzy TOPSIS for the final rank of the alternatives. | - | - | 10 criteria and 6 alternatives.
15
Literature Review (Comparison Table)
Sl. No. | Author's Name | Year | MCDM Method | Objective | Approach | Service | Tool | Criteria and Alternatives
12 | Neeraj Yadav, Major Singh Goraya [31] | 2017 | AHP | Service mapping in the cloud environment. | Two-way service mapping approach between the Service Requesting Customers (SRCs) and the CSPs based on AHP ranking. | Overall | - | 3 criteria and 3 alternatives.
13 | Rakesh Ranjan Kumar, Chiranjeev Kumar [30] | 2018 | AHP and TOPSIS | To select and rank cloud services. | AHP for weighting the criteria and TOPSIS for ranking the alternatives. | IAAS, PAAS, SAAS | - | 10 criteria and 6 alternatives.
14 | Jagpreet Sidhu, Sarbjeet Singh [33] | 2017 | AHP, TOPSIS, PROMETHEE | To determine the trustworthiness of CSPs. | Trust is evaluated using three MCDM techniques (AHP, TOPSIS and PROMETHEE) and the results are compared. | - | - | 10 attributes and 18 CSPs.
15 | Zoie Radulescu, Cristina Radulescu [34] | 2017 | E-TOPSIS | To rank CSPs. | Extended TOPSIS using the Minkowski distance is used to rank the CSPs. | - | - | 32 criteria and 10 CSPs.
16
Literature Review (Comparison Table)
Sl. No. | Author's Name | Year | MCDM Method | Objective | Approach | Service | Tool | Criteria and Alternatives
16 | Omar Boutkhoum et al. [35] | 2017 | Fuzzy AHP, Fuzzy TOPSIS | To select a suitable cloud solution for big data projects. | Decision making approach consisting of FAHP for assigning weights to criteria and FTOPSIS for ranking alternatives. | - | - | 10 criteria and 5 alternatives.
17 | R. Krishankumar et al. [36] | 2017 | IF-GDM, IF-AHP | To select the best cloud vendor. | Intuitionistic Fuzzy Group Decision Making (IF-GDM) approach based on IF-AHP for pairwise comparison of criteria and ranking of clouds. | - | - | 5 criteria and 4 alternatives.
18 | Chandrashekar Jatoth et al. [37] | 2018 | AHP and G-TOPSIS | To select the cloud service. | AHP for defining priorities of criteria and EG-TOPSIS for selecting and ranking the alternatives. | - | - | 5 criteria and 19 alternatives.
17
Literature Review (Comparison Table)
Sl. No. | Author's Name | Year | MCDM Method | Objective | Approach | Service | Tool | Criteria and Alternatives
19 | Osama Sohaib, Mohsen Naderpour [38] | 2017 | Fuzzy TOPSIS | Suitable adoption of cloud computing in e-commerce. | Categorized the criteria into TOE factors and ranked the cloud services by Fuzzy TOPSIS. | SAAS, PAAS, IAAS | - | 12 criteria and 3 alternatives.
20 | Deepti Rai, Pavan Kumar V [39] | 2016 | TOPSIS, VIKOR | To select the best cloud service. | Daily-basis ranking comparison of cloud services using TOPSIS and VIKOR. | IAAS | CloudSim | 3 criteria and 10 alternatives.
21 | Hamzeh Mohammad Alabool, Ahmad Kamil Mahmood [40] | 2013 | FM-VIKOR | Trust-based cloud service selection. | Modified VIKOR is extended using fuzzy sets to evaluate the trust of CIS, which are then ranked based on their degree of trust. | IAAS | - | 15 criteria and 5 CSPs.
22 | Zohreh Akbarizadeh, Mahdi Faghihi [41] | 2017 | SWARA, VIKOR | To rank CSPs. | SWARA for assigning weights and VIKOR for ranking the CSPs. | - | - | 28 criteria and 4 alternatives.
18
Literature Review (Comparison Table)
Sl. No. | Author's Name | Year | MCDM Method | Objective | Approach | Service | Tool | Criteria and Alternatives
23 | Jagpreet Sidhu, Sarbjeet Singh [42] | 2019 | I-PROMETHEE, AHP | To select trustworthy Cloud Database Servers. | AHP for the relative importance of the criteria and Improved PROMETHEE for ranking the CDSs by calculating the positive and negative outranking flows. | - | - | 10 parameters and 18 alternatives.
24 | Hua Ma et al. [43] | 2017 | N-ELECTRE | Trustworthy ranking prediction for cloud services. | Improved ELECTRE is formed by combining INS and KRCC; INS is used for measuring trust and KRCC is used for ranking. | - | - | 8 CSPs.
25 | Gulcin Buyukozkan et al. [44] | 2018 | IVIF MCDM methods | CCT selection based on IVIF MCDM methods. | IVIF AHP for pairwise comparison of criteria, IVIF | - | - | 6 criteria and 27 sub-criteria, 4
19
Literature Review (Comparison Table)
Sl. No. | Author's Name | Year | MCDM Method | Objective | Approach | Service | Tool | Criteria and Alternatives
26 | Radulescu Constanța Zoie et al. [45] | 2016 | DEMATEL and AHP | To assign weights and rank criteria for CSP. | A hybrid method, DANP, is used for calculating criteria and cluster weights, and the global weights and the rank are evaluated from the super-matrix. | - | - | 32 criteria and 3 clusters.
27 | Chandrashekar Jatoth et al. [46] | 2016 | AHP, ANP, M-DEA, M-SDEA | To evaluate the efficiency of cloud services. | AHP and ANP for determining the priority and weight of the QoS attributes, and DEA and SDEA for calculating the efficiency to rank the cloud services. | - | - | 7 criteria and 11 alternatives.
28 | Nivethitha Somu et al. [47] | 2017 | HGCM, MDHP | To rank CSPs. | The Helly property and hypergraphs are used to assign weights, and MDHP is used for ranking the alternatives. | Overall | - | 6 criteria and 5 alternatives.
20
Literature Review (Comparison Table)
Sl. No. | Author's Name | Year | MCDM Method | Objective | Approach | Service | Tool | Criteria and Alternatives
29 | Chinu Singla et al. [48] | 2018 | FDM, FAHP | Decision making model for multimedia cloud based on computational intelligence. | FDM for selection of decision criteria and FAHP for determining the importance of each criterion and ranking the alternatives. | IAAS | CloudSim and MATLAB | 5 criteria and 5 alternatives.
30 | Gireesha Obulaporam et al. [52] | 2019 | CRITIC and Grey Relational Analysis | Ranking approach for cloud service selection. | To overcome the rank reversal of many MCDM methods, GCRITICPA is used: CRITIC to determine the weights of the criteria and GRA to rank the CSPs. | - | - | 19 CSPs and 9 attributes.
21
Problem Statement
Consider a set of m clouds C = (C1, C2, C3, ..., Cm) and a set of n criteria A = (A1, A2, A3, ..., An), in which each criterion Ai has a weight Wi.
Note that the weights sum to one, i.e., W1 + W2 + ... + Wn = 1.
A criterion Ai, where 1 ≤ i ≤ n, refers to one of the attributes of the cloud.
22
Problem Statement
The criteria are further categorized into beneficial criteria and non-beneficial criteria. Here, the value of a beneficial criterion is to be maximized, whereas the value of a non-beneficial criterion is to be minimized. The value of each cloud with respect to each criterion is presented in the form of an m × n matrix M, called the multi-criteria decision making (MCDM) matrix, as shown in Eq. (1).
An element Mij (1 ≤ i ≤ m, 1 ≤ j ≤ n) of the MCDM matrix denotes the performance value of cloud Ci on criterion Aj.
23
Problem Statement
Given an MCDM matrix, the problem is to select the best cloud out of a set of m clouds, or to rank the set of clouds, such that the best cloud service provider (CSP) holds the maximum beneficial and minimum non-beneficial criteria values.
Moreover, the problem is to select the best criteria of each cloud provider, such that a composite service can be provided through collaboration among the CSPs.
A minimal illustrative sketch of this setup is given below.
24
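As a minimal sketch of the problem setup in Python/NumPy: the MCDM matrix M, the criteria weights W and the beneficial/non-beneficial split. The performance values here are hypothetical; only the B-O-C-R criteria names and the weight vector are taken from the case study later in this presentation.

```python
import numpy as np

# Hypothetical problem instance: m = 3 clouds (rows), n = 4 criteria (columns).
clouds = ["Cloud-1", "Cloud-2", "Cloud-3"]
criteria = ["Benefits", "Opportunities", "Cost", "Risk"]

# MCDM matrix M: M[i, j] is the performance value of cloud Ci on criterion Aj.
M = np.array([
    [0.80, 0.60, 0.30, 0.20],
    [0.50, 0.70, 0.40, 0.10],
    [0.60, 0.50, 0.20, 0.30],
])

# Criteria weights W, which must sum to 1.
W = np.array([0.5, 0.167, 0.25, 0.083])
assert np.isclose(W.sum(), 1.0)

# Benefits and Opportunities are beneficial (to be maximized);
# Cost and Risk are non-beneficial (to be minimized).
beneficial = np.array([True, True, False, False])
```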
Proposed Method
Objective: a B-O-C-R model for cloud selection using ANP and VIKOR.
The computation of the service index is done using the quality of service (QoS) data of three cloud providers, namely Amazon EC2, Windows Azure and Rackspace.
The QoS data is collected from various evaluation studies [Garg et al.]; the unavailable data is assigned randomly.
25
QoS requirements are Accountability, Agility, Assurance, Performance,
VM cost, Data cost, Storage cost, Adaptability, Flexibility, Serviceability,
Provider’s risk, Compliances, HR risk.
Proposed Method
Main steps to model the Cloud Selection problem
• Group the QoS requirements into the B-O-C-R model (Benefits, Opportunities, Cost, Risk).
• Compute the relative importance of the QoS requirements in each group and
find the local priority of the alternatives in each group using ANP.
• Rank the alternatives using VIKOR.
26
Proposed Method
A brief explanation of the Analytic Hierarchy Process (AHP)
AHP is a well-known MCDM algorithm which performs pair-wise comparisons of criteria and sub-criteria, resulting in local priorities or weighting factors.
[Figure: AHP hierarchy with the goal at the top, criteria C1, C2, C3 in the middle and alternatives A1, A2, A3 at the bottom]
27
Proposed Method
C1, C2, C3 are criteria and A1, A2, A3 are alternatives.
[Figure: AHP hierarchy with the goal, criteria C1, C2, C3 and alternatives A1, A2, A3]
The goal is to find the best alternative.
 The criteria weights are assigned through a relative (pair-wise) comparison matrix, and the local priorities are calculated from the principal eigenvector of the matrix.
 By applying the global priorities to the alternatives, we finally get a ranking of the alternatives with respect to the criteria and sub-criteria.
A small numerical sketch of this computation follows.
28
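As an illustration of how AHP local priorities can be computed (a generic sketch, not the Super Decision computation used later; the 3 × 3 pairwise comparison matrix below is hypothetical), the priorities are taken from the principal eigenvector of the comparison matrix, and the consistency of the judgements can be checked with the consistency ratio.

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparison matrix for criteria C1, C2, C3
# (reciprocal matrix: A[j, i] = 1 / A[i, j]).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Local priorities = principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()
print("local priorities:", np.round(w, 4))

# Consistency ratio (CR); random index RI = 0.58 for a 3 x 3 matrix.
lambda_max = eigvals.real[k]
CI = (lambda_max - 3) / (3 - 1)
print("consistency ratio:", round(CI / 0.58, 4))
```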
Proposed Method
We have used the Analytic Network Process (ANP) instead of AHP in our algorithm.
 ANP is a decision-making method and a generalization of AHP.
 ANP can model complex decision problems where a hierarchical model such as AHP is not sufficient.
 In ANP, criteria, sub-criteria and alternatives are all treated equally as nodes in a network.
 Each node may be compared to any other node, as long as there is a relation between them.
 In ANP, nodes may be grouped into clusters, e.g., beneficial and non-beneficial.
 Besides the local priorities from comparing one node to a set of other nodes, cluster priorities can be introduced.
29
Proposed Method
We have used the Analytic Network Process (ANP) instead of AHP in our algorithm.
 The comparison of nodes to other nodes follows the same principle and method as in AHP.
 Local priorities result from the eigenvector of the comparison matrix.
[Figure: the goal, criteria C1, C2, C3 and alternatives A1, A2, A3 arranged as a network]
30
Proposed Method with Case Study
We have used the “Super Decision” tool for computing ANP.
31
Proposed Method with Case Study
Step 1 Group the QoS requirements in B-O-C-R model (Benefit,
Opportunities, Cost, Risk)
32
Proposed Method with Case Study
Step 1 Group the QoS requirements in B-O-C-R model (Benefit,
Opportunities, Cost, Risk)
33
Proposed Method with Case Study
Step 2. Compute the relative importance of the QoS requirements in each
group and find the local priority of the alternatives in each group using
ANP.
34
Proposed Method with Case Study
Step 2. Compute the relative importance of the QoS requirements in each
group and find the local priority of the alternatives in each group using
ANP.
35
Proposed Method with Case Study
Similarly, relative comparison is performed for every node.
Finally, a super decision matrix is formed which contains the local priorities of each alternative in each individual group.
Matrix for Benefits
36
Proposed Method with Case Study
Similarly, relative comparison is performed for every node.
Finally, a super decision matrix is formed which contains the local priorities of each alternative in each individual group.
Matrix for Opportunities
37
Proposed Method with Case Study
Similarly, relative comparison is performed for every node.
Finally, a super decision matrix is formed which contains the local priorities of each alternative in each individual group.
Matrix for Cost
38
Proposed Method with Case Study
Similarly, relative comparison is performed for every node.
Finally, a super decision matrix is formed which contains the local priorities of each alternative in each individual group.
Matrix for Risk
39
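The cluster matrices above are produced by the Super Decision tool. As a hedged sketch of the general ANP mechanics (not the actual matrices of this case study), the code below builds a small hypothetical weighted supermatrix, which is column-stochastic, and raises it to successive powers until it converges; the columns of the resulting limit supermatrix give the global (limiting) priorities of the nodes.

```python
import numpy as np

# Hypothetical weighted supermatrix over 5 nodes: 2 criteria (C1, C2) and
# 3 alternatives (A1, A2, A3), with some inner dependence among the criteria.
# Each column holds local priorities and therefore sums to 1 (column-stochastic).
W = np.array([
    #  C1    C2    A1    A2    A3
    [0.10, 0.10, 0.40, 0.50, 0.30],  # C1
    [0.10, 0.10, 0.60, 0.50, 0.70],  # C2
    [0.30, 0.20, 0.00, 0.00, 0.00],  # A1
    [0.30, 0.30, 0.00, 0.00, 0.00],  # A2
    [0.20, 0.30, 0.00, 0.00, 0.00],  # A3
])
assert np.allclose(W.sum(axis=0), 1.0)

# Limit supermatrix: multiply by W until the matrix stops changing.
L = W.copy()
for _ in range(1000):
    nxt = L @ W
    if np.allclose(nxt, L, atol=1e-10):
        break
    L = nxt

print(np.round(L, 4))  # the columns converge to (approximately) the same global priorities
```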
Proposed Method with Case Study
Overall value after final comparison
Sl No. Alternatives Benefits Opportunities Cost Risk
1 Amazon EC2 0.248367 0.166850 0.156623 0.255299
2 Rackspace 0.114261 0.164199 0.156356 0.124611
3 Windows Azure 0.137372 0.168951 0.187021 0.120089
40
Proposed Method with Case Study
The overall values of the final comparison are represented in a radar chart.
[Figure: radar chart over the axes Benefits, Opportunities, Cost and Risk for Amazon EC2, Rackspace and Windows Azure; values range from 0 to 0.3]
41
Proposed Method with Case Study
Step 3 Rank the alternatives using VIKOR.
Vise Kriterijumska Optimizacija I Kompromisno Resenje (VIKOR) is a Serbian term; VIKOR is an MCDM algorithm.
It proceeds in five phases:
• Normalization
• Difference
• Weighted and normalized distance
• Combined weight
• Selection
42
Proposed Method with Case Study
In this phase, the maximum and minimum values of the criteria are determined mathematically.
The maximum value of each beneficial criterion is marked in red and the minimum value of each non-beneficial criterion is marked in violet.
Sl No. Alternatives Benefits Opportunities Cost Risk
1 Amazon EC2 0.248367 0.166850 0.156623 0.255299
2 Rackspace 0.114261 0.164199 0.156356 0.124611
3 Windows Azure 0.137372 0.168951 0.187021 0.120089
43
Proposed Method with Case Study
Sl No. Alternatives Benefits Opportunities Cost Risk
1 Amazon EC2 0.248367 0.166850 0.156623 0.255299
2 Rackspace 0.114261 0.164199 0.156356 0.124611
3 Windows Azure 0.137372 0.168951 0.187021 0.120089
Normalize data in range (0, 1)
x = x/xmax for beneficial attributes
x = xmin/x for non-beneficial attributes
44
Proposed Method with Case Study
Sl No. Alternatives Benefits Opportunities Cost Risk
1 Amazon EC2 0.248367/0.248367 0.166850/0.168951 0.156356/0.156623 0.120089/0.255299
2 Rackspace 0.114261/0.248367 0.164199/0.168951 0.156355/0.156356 0.120089/0.124611
3 Windows Azure 0.137372/0.248367 0.168951/0.168951 0.156356/0.187021 0.120089/0.120089
Normalize data in range (0, 1)
x = x/xmax for beneficial attributes
x = xmin/x for non-beneficial attributes
45
Proposed Method with Case Study
Sl No. Alternatives Benefits Opportunities Cost Risk
1 Amazon EC2 1 0.9875 0.9982 0.6004
2 Rackspace 0.4600 0.9718 1 0.9637
3 Windows Azure 0.5531 1 0.8360 1
Normalize data in range (0, 1)
x = x/xmax for beneficial attributes
x = xmin/x for non-beneficial attributes
46
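The normalization rule stated above can be sketched in a few lines of Python/NumPy (a verification aid, not part of the thesis' MATLAB implementation). It applies x/xmax to the beneficial columns and xmin/x to the non-beneficial columns of the overall B-O-C-R values; the output can be compared against the normalized table shown on this slide.

```python
import numpy as np

# Overall B-O-C-R values of the three alternatives (rows) from the ANP step.
alternatives = ["Amazon EC2", "Rackspace", "Windows Azure"]
criteria = ["Benefits", "Opportunities", "Cost", "Risk"]
X = np.array([
    [0.248367, 0.166850, 0.156623, 0.255299],
    [0.114261, 0.164199, 0.156356, 0.124611],
    [0.137372, 0.168951, 0.187021, 0.120089],
])

# Benefits and Opportunities are beneficial; Cost and Risk are non-beneficial.
beneficial = np.array([True, True, False, False])

# Slide rule: x / xmax for beneficial columns, xmin / x for non-beneficial columns.
N = np.where(beneficial, X / X.max(axis=0), X.min(axis=0) / X)

for name, row in zip(alternatives, np.round(N, 4)):
    print(name, row)
```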
Proposed Method with Case Study
Sl No. Alternatives Benefits Opportunities Cost Risk
1 Amazon EC2 1 0.9875 0.9982 0.6004
2 Rackspace 0.4600 0.9718 1 0.9637
3 Windows Azure 0.5531 1 0.8360 1
4 Max 1 1 1 1
5 Min 0.4600 0.9718 0.8360 0.6004
6 Difference (Max-Min) 0.54 0.0282 0.164 0.3996
Find the difference MAX-MIN
47
Proposed Method with Case Study
Sl No. Alternatives Benefits Opportunities Cost Risk
weight 0.5 0.167 0.25 0.083
1 Amazon EC2 1 0.9875 0.9982 0.6004
2 Rackspace 0.4600 0.9718 1 0.9637
3 Windows Azure 0.5531 1 0.8360 1
4 Max 1 1 1 1
5 Min 0.4600 0.9718 0.8360 0.6004
6 Difference (Max-Min) 0.54 0.0282 0.164 0.3996
Assign weights to the criteria
48
Proposed Method with Case Study
Sl No. Alternatives Benefits Opportunities Cost Risk
Weight 0.5 0.167 0.25 0.083
1 Amazon EC2 1 0.9875 0.9982 0.6004
2 Rackspace 0.4600 0.9718 1 0.9637
3 Windows Azure 0.5531 1 0.8360 1
4 Max 1 1 1 1
5 Min 0.4600 0.9718 0.8360 0.6004
6 Difference (Max-Min) 0.54 0.0282 0.164 0.3996
Find the weighted and normalized distance E.
MX - maximum value, MN - minimum value, N - criteria value, WNMD - weighted normalized distance value
49
Proposed Method with Case Study
The combined weight is calculated; the alternative with the minimum combined weight is ranked 1, and as the value increases the rank increases.
All the simulations have been performed using MATLAB 2016a.
Sl No. Alternatives Combined Weight Rank
1 Amazon EC2 0 1
2 Rackspace 0.8642 2
3 Windows Azure 1 3
50
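For reference, the sketch below applies the textbook VIKOR aggregation (group utility S, individual regret R and compromise index Q with v = 0.5) to the normalized table and weights shown on the previous slides. The exact combined-weight formula used in the deck is not fully reproduced here, so the intermediate values can differ from the table above, but the minimum-Q alternative again comes out as Amazon EC2 with Q = 0, which receives rank 1.

```python
import numpy as np

# Normalized decision matrix (rows: Amazon EC2, Rackspace, Windows Azure;
# columns: Benefits, Opportunities, Cost, Risk). After normalization,
# 1 is the best achievable value in every column.
alternatives = ["Amazon EC2", "Rackspace", "Windows Azure"]
N = np.array([
    [1.0000, 0.9875, 0.9982, 0.6004],
    [0.4600, 0.9718, 1.0000, 0.9637],
    [0.5531, 1.0000, 0.8360, 1.0000],
])
w = np.array([0.5, 0.167, 0.25, 0.083])      # criteria weights from the slides

best = N.max(axis=0)                          # best value per criterion
worst = N.min(axis=0)                         # worst value per criterion
d = w * (best - N) / (best - worst)           # weighted normalized distances

S = d.sum(axis=1)                             # group utility
R = d.max(axis=1)                             # individual regret
v = 0.5                                       # weight of the majority strategy
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))

for rank, i in enumerate(np.argsort(Q), start=1):
    print(rank, alternatives[i], "Q =", round(float(Q[i]), 4))
```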
Conclusion
• This work is based on multi-criteria decision making algorithms for selecting the best cloud service provider by analyzing and comparing different beneficial and non-beneficial quality of service requirements.
• Different cloud service selection models such as AHP, Fuzzy TOPSIS, ANP, VIKOR, MOORA, PROMETHEE and DEA have been reviewed. It has been observed that most of the frameworks assign weights to the service attributes and assign ranks after processing and comparing the attributes.
• A comparison table is drawn based on the survey, and a hybrid algorithm that combines ANP and VIKOR is proposed to select the best cloud among 3 different CSPs.
51
Road Map
52
References
[1]. El-Gazzar, R. F. (2014, June). A literature review on cloud computing adoption issues in enterprises.
In International Working Conference on Transfer and Diffusion of IT (pp. 214-242). Springer, Berlin,
Heidelberg.
[2]. Mell, P., & Grance, T. (2011). The NIST definition of cloud computing.
[3]. Kumar, R. R., Mishra, S., & Kumar, C. (2017). Prioritizing the solution of cloud service selection using
integrated MCDM methods under Fuzzy environment. The Journal of Supercomputing, 73(11), 4652-
4682.
[4]. Garg, S. K., Versteeg, S., & Buyya, R. (2013). A framework for ranking of cloud computing
services. Future Generation Computer Systems, 29(4), 1012-1023.
[5]. Garrison, G., Wakefield, R. L., & Kim, S. (2015). The effects of IT capabilities and delivery model on cloud
computing success and firm performance for cloud supported processes and operations. International
Journal of Information Management, 35(4), 377-393.
[6]. Buyya, R., Vecchiola, C., & Selvi, S. T. (2013). Mastering cloud computing: foundations and
applications programming, Mc Graw Hill Education, 1st edition.
[7]. Low, C., & Chen, Y. H. (2012). Criteria for the evaluation of a cloud-based hospital information system
outsourcing provider. Journal of medical systems, 36(6), 3543-3553.
[8]. Whaiduzzaman, M., Gani, A., Anuar, N. B., Shiraz, M., Haque, M. N., & Haque, I. T. (2014). Cloud service
selection using multicriteria decision analysis. The Scientific World Journal, 2014.
[15] Ferrer, A. J., HernáNdez, F., Tordsson, J., Elmroth, E., Ali-Eldin, A., Zsigri, C., ... & Ziegler, W. (2012).
OPTIMIS: A holistic approach to cloud service provisioning. Future Generation Computer Systems, 28(1),
66-77.
[16] El-Gazzar, R., Hustad, E., & Olsen, D. H. (2016). Understanding cloud computing adoption issues: A
Delphi study approach. Journal of Systems and Software, 118, 64-84.
[17] Salleh, S. M., Teoh, S. Y., & Chan, C. (2012, July). Cloud Enterprise Systems: A Review Of Literature And
Its Adoption. In PACIS (p. 76). 53
References
[18] Buyya, R., Yeo, C. S., & Venugopal, S. (2008, September). Market-oriented cloud computing: Vision, hype,
and reality for delivering IT services as computing utilities. In 2008 10th IEEE international conference on
high performance computing and communications (pp. 5-13). IEEE.
[19] Dutta, A., Peng, G. C. A., & Choudhary, A. (2013). Risks in enterprise cloud computing: the perspective of
IT experts. Journal of Computer Information Systems, 53(4), 39-48.
[20] Khajeh‐Hosseini, A., Greenwood, D., Smith, J. W., & Sommerville, I. (2012). The cloud adoption toolkit:
supporting cloud adoption decisions in the enterprise. Software: Practice and Experience, 42(4), 447-465.
[21] Motahari-Nezhad, H. R., Stephenson, B., & Singhal, S. (2009). Outsourcing business to cloud computing
services: Opportunities and challenges. IEEE Internet Computing, 10(4), 1-17.
[22] Sidhu, J., & Singh, S. (2017). Improved topsis method based trust evaluation framework for determining
trustworthiness of cloud service providers. Journal of Grid Computing, 15(1), 81-105.
[23] Jatoth, C., Gangadharan, G. R., & Fiore, U. (2017). Evaluating the efficiency of cloud services using
modified data envelopment analysis and modified super-efficiency data envelopment analysis. Soft
Computing, 21(23), 7221-7234.
[24] Kumar, R. R., Mishra, S., & Kumar, C. (2017). Prioritizing the solution of cloud service selection using
integrated MCDM methods under Fuzzy environment. The Journal of Supercomputing, 73(11), 4652-4682.
[25] Tran, V. X., Tsuji, H., & Masuda, R. (2009). A new QoS ontology and its QoS-based ranking algorithm for
Web services. Simulation Modelling Practice and Theory, 17(8), 1378-1398.
[26] Garg, S. K., Versteeg, S., & Buyya, R. (2013). A framework for ranking of cloud computing
services. Future Generation Computer Systems, 29(4), 1012-1023.
[27] Liu, S., Chan, F. T., & Ran, W. (2016). Decision making for the selection of cloud vendor: An improved
approach under group decision-making with integrated weights and objective/subjective attributes. Expert
Systems with Applications, 55, 37-47.
54
References
[29] Kumar, R. R., Mishra, S., & Kumar, C. (2017). Prioritizing the solution of cloud service selection using
integrated MCDM methods under Fuzzy environment. The Journal of Supercomputing, 73(11), 4652-4682.
[30] Kumar, R. R., & Kumar, C. (2018). A Multi Criteria Decision Making Method for Cloud Service Selection
and Ranking. International Journal of Ambient Computing and Intelligence (IJACI), 9(3), 1-14.
[31] Yadav, N., & Goraya, M. S. (2018). Two-way ranking based service mapping in cloud environment. Future
Generation Computer Systems, 81, 53-66.
[32] Tran, V. X., Tsuji, H., & Masuda, R. (2009). A new QoS ontology and its QoS-based ranking algorithm for
Web services. Simulation Modelling Practice and Theory, 17(8), 1378-1398.
[33] Sidhu, J., & Singh, S. (2017). Design and comparative analysis of MCDM-based multi-dimensional trust
evaluation schemes for determining trustworthiness of cloud service providers. Journal of Grid
Computing, 15(2), 197-218.
[34] Rădulescu, C. Z., & Rădulescu, I. C. (2017). An extended TOPSIS approach for ranking cloud service
providers. Stud. Inform. Control, 26, 183-192.
[35] Boutkhoum, O., Hanine, M., Agouti, T., & Tikniouine, A. (2017). A decision-making approach based on
fuzzy AHP-TOPSIS methodology for selecting the appropriate cloud solution to manage big data
projects. International Journal of System Assurance Engineering and Management, 8(2), 1237-1253.
[36] Krishankumar, R., Arvinda, S. R., Amrutha, A., Premaladha, J., & Ravichandran, K. S. (2017, July). A
decision making framework under intuitionistic fuzzy environment for solving cloud vendor selection
problem. In 2017 International Conference on Networks & Advances in Computational Technologies
(NetACT) (pp. 140-144). IEEE.
[37] Jatoth, C., Gangadharan, G. R., Fiore, U., & Buyya, R. (2018). SELCLOUD: a hybrid multi-criteria
decision-making model for selection of cloud services. Soft Computing, 1-15.
[38] Sohaib, O., & Naderpour, M. (2017, July). Decision making on adoption of cloud computing in e-
commerce using fuzzy TOPSIS. In 2017 IEEE International Conference on Fuzzy Systems (FUZZ-
IEEE) (pp. 1-6). IEEE.
55
References
[39] Rai, D., & Kumar, P. (2016). Instance based multi criteria decision model for cloud service selection using
TOPSIS and VIKOR. Int. J. Comput Eng. Technol, 7, 78-87.
[40] Alabool, H. M., & Mahmood, A. K. (2013). Trust-based service selection in public cloud computing using
fuzzy modified VIKOR method. Australian Journal of Basic and Applied Sciences, 7(9), 211-220.
[41] Akbarizade, Z., & Faghihi, M. (2017). Ranking CloudService Providers using SWARA and VIKOR (A case
of Irancell Company). International Journal of Information, Security and Systems Management, 6(2), 679-
686.
[42] Sidhu, J., & Singh, S. (2019). Using the Improved PROMETHEE for Selection of Trustworthy Cloud
Database Servers. International Arab Journal of Information Technology, 16(2), 194-
202.
[43] Ma, H., Zhu, H., Hu, Z., Li, K., & Tang, W. (2017). Time-aware trustworthiness ranking prediction for
cloud services using interval neutrosophic set and ELECTRE. Knowledge-Based Systems, 138, 27-45.
[44] Büyüközkan, G., Göçer, F., & Feyzioğlu, O. (2018). Cloud computing technology selection based on
interval-valued intuitionistic fuzzy MCDM methods. Soft Computing, 22(15), 5091-5114.
[45] Zoie, R. C., Alexandru, B., Mihaela, R. D., & Mihail, D. (2016, October). A decision making framework for
weighting and ranking criteria for Cloud provider selection. In 2016 20th International Conference on
System Theory, Control and Computing (ICSTCC) (pp. 590-595). IEEE.
[46] Jatoth, C., Gangadharan, G. R., & Fiore, U. (2017). Evaluating the efficiency of cloud services using
modified data envelopment analysis and modified super-efficiency data envelopment analysis. Soft
Computing, 21(23), 7221-7234.
[47] Somu, N., Kirthivasan, K., & VS, S. S. (2017). A computational model for ranking cloud service providers
using hypergraph based techniques. Future Generation Computer Systems, 68, 14-30.
[48] Singla, C., Kaushal, S., Verma, A., & Kumar, H. (2018). A Hybrid Computational Intelligence Decision
Making Model for Multimedia Cloud Based Applications. In Computational Intelligence for Multimedia
Big Data on the Cloud with Engineering Applications (pp. 147-157). Academic Press.
56
References
[49] Lee, S., & Seo, K. K. (2016). A hybrid multi-criteria decision-making model for a
cloud service selection problem using BSC, fuzzy Delphi method and fuzzy
AHP. Wireless Personal Communications, 86(1), 57-75.
[50] Ghafori, V., & Sarhadi, R. M. (2013). Best cloud provider selection using
integrated ANP-DEMATEL and prioritizing SMI attributes. International Journal
of Computer Applications, 71(16).
[51] Liu, S., Chan, F. T., & Ran, W. (2016). Decision making for the selection of cloud
vendor: An improved approach under group decision-making with integrated
weights and objective/subjective attributes. Expert Systems with Applications, 55,
37-47.
[52] Obulaporam, G., Somu, N., Ramani, G. R. M., Boopathy, A. K., & Sankaran, S. S.
V. (2018, December). GCRITICPA: A CRITIC and Grey relational analysis based
service ranking approach for cloud service selection. In International Conference
on Intelligent Information Technologies (pp. 3-16). Springer, Singapore.
57
58

More Related Content

What's hot

Multi-Criteria Decision-Making (MCDM) as a powerful tool for sustainable deve...
Multi-Criteria Decision-Making (MCDM) as a powerful tool for sustainable deve...Multi-Criteria Decision-Making (MCDM) as a powerful tool for sustainable deve...
Multi-Criteria Decision-Making (MCDM) as a powerful tool for sustainable deve...
nitinrane33
 
Multi criteria decision support system on mobile phone selection with ahp and...
Multi criteria decision support system on mobile phone selection with ahp and...Multi criteria decision support system on mobile phone selection with ahp and...
Multi criteria decision support system on mobile phone selection with ahp and...
Reza Ramezani
 
Simulated Binary Crossover
Simulated Binary CrossoverSimulated Binary Crossover
Simulated Binary Crossover
paskorn
 
Genetic algorithm artificial intelligence presentation
Genetic algorithm   artificial intelligence presentationGenetic algorithm   artificial intelligence presentation
Genetic algorithm artificial intelligence presentation
Tauhidul Khandaker
 
KubeFlow + GPU + Keras/TensorFlow 2.0 + TF Extended (TFX) + Kubernetes + PyTo...
KubeFlow + GPU + Keras/TensorFlow 2.0 + TF Extended (TFX) + Kubernetes + PyTo...KubeFlow + GPU + Keras/TensorFlow 2.0 + TF Extended (TFX) + Kubernetes + PyTo...
KubeFlow + GPU + Keras/TensorFlow 2.0 + TF Extended (TFX) + Kubernetes + PyTo...
Chris Fregly
 

What's hot (20)

Simulated Annealing
Simulated AnnealingSimulated Annealing
Simulated Annealing
 
Collaborative filtering
Collaborative filteringCollaborative filtering
Collaborative filtering
 
Genetic Algorithms
Genetic AlgorithmsGenetic Algorithms
Genetic Algorithms
 
Collaborative Filtering Recommendation System
Collaborative Filtering Recommendation SystemCollaborative Filtering Recommendation System
Collaborative Filtering Recommendation System
 
Multi-Criteria Decision-Making (MCDM) as a powerful tool for sustainable deve...
Multi-Criteria Decision-Making (MCDM) as a powerful tool for sustainable deve...Multi-Criteria Decision-Making (MCDM) as a powerful tool for sustainable deve...
Multi-Criteria Decision-Making (MCDM) as a powerful tool for sustainable deve...
 
Introduction to Genetic Algorithms
Introduction to Genetic AlgorithmsIntroduction to Genetic Algorithms
Introduction to Genetic Algorithms
 
Multi criteria decision support system on mobile phone selection with ahp and...
Multi criteria decision support system on mobile phone selection with ahp and...Multi criteria decision support system on mobile phone selection with ahp and...
Multi criteria decision support system on mobile phone selection with ahp and...
 
Logistic regression
Logistic regressionLogistic regression
Logistic regression
 
Sentiment Analysis of Social Issues - Negation Handling
Sentiment Analysis of Social Issues - Negation Handling Sentiment Analysis of Social Issues - Negation Handling
Sentiment Analysis of Social Issues - Negation Handling
 
Simulated Binary Crossover
Simulated Binary CrossoverSimulated Binary Crossover
Simulated Binary Crossover
 
Genetic algorithm artificial intelligence presentation
Genetic algorithm   artificial intelligence presentationGenetic algorithm   artificial intelligence presentation
Genetic algorithm artificial intelligence presentation
 
Clustering
ClusteringClustering
Clustering
 
Genetic algorithms in Data Mining
Genetic algorithms in Data MiningGenetic algorithms in Data Mining
Genetic algorithms in Data Mining
 
Pagerank and hits
Pagerank and hitsPagerank and hits
Pagerank and hits
 
Collaborative filtering
Collaborative filteringCollaborative filtering
Collaborative filtering
 
KubeFlow + GPU + Keras/TensorFlow 2.0 + TF Extended (TFX) + Kubernetes + PyTo...
KubeFlow + GPU + Keras/TensorFlow 2.0 + TF Extended (TFX) + Kubernetes + PyTo...KubeFlow + GPU + Keras/TensorFlow 2.0 + TF Extended (TFX) + Kubernetes + PyTo...
KubeFlow + GPU + Keras/TensorFlow 2.0 + TF Extended (TFX) + Kubernetes + PyTo...
 
Recommendation Systems
Recommendation SystemsRecommendation Systems
Recommendation Systems
 
Training Series - Intro to Neo4j
Training Series - Intro to Neo4jTraining Series - Intro to Neo4j
Training Series - Intro to Neo4j
 
Building an Implicit Recommendation Engine with Spark with Sophie Watson
Building an Implicit Recommendation Engine with Spark with Sophie WatsonBuilding an Implicit Recommendation Engine with Spark with Sophie Watson
Building an Implicit Recommendation Engine with Spark with Sophie Watson
 
Ensemble learning
Ensemble learningEnsemble learning
Ensemble learning
 

Similar to Design of Multi-Criteria Decision making algorithm for Cloud.pptx

Cloud service ranking with an integration of k-means algorithm and decision-m...
Cloud service ranking with an integration of k-means algorithm and decision-m...Cloud service ranking with an integration of k-means algorithm and decision-m...
Cloud service ranking with an integration of k-means algorithm and decision-m...
IJECEIAES
 
A Cloud Service Selection Model Based on User-Specified Quality of Service Level
A Cloud Service Selection Model Based on User-Specified Quality of Service LevelA Cloud Service Selection Model Based on User-Specified Quality of Service Level
A Cloud Service Selection Model Based on User-Specified Quality of Service Level
csandit
 
A cloud service selection model based
A cloud service selection model basedA cloud service selection model based
A cloud service selection model based
csandit
 
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATIONWEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
ijwscjournal
 
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATIONWEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
ijwscjournal
 
Priority Based Prediction Mechanism for Ranking Providers in Federated Cloud ...
Priority Based Prediction Mechanism for Ranking Providers in Federated Cloud ...Priority Based Prediction Mechanism for Ranking Providers in Federated Cloud ...
Priority Based Prediction Mechanism for Ranking Providers in Federated Cloud ...
IJERA Editor
 

Similar to Design of Multi-Criteria Decision making algorithm for Cloud.pptx (20)

Cloud service ranking with an integration of k-means algorithm and decision-m...
Cloud service ranking with an integration of k-means algorithm and decision-m...Cloud service ranking with an integration of k-means algorithm and decision-m...
Cloud service ranking with an integration of k-means algorithm and decision-m...
 
A CLOUD COMPUTING USING ROUGH SET THEORY FOR CLOUD SERVICE PARAMETERS THROUGH...
A CLOUD COMPUTING USING ROUGH SET THEORY FOR CLOUD SERVICE PARAMETERS THROUGH...A CLOUD COMPUTING USING ROUGH SET THEORY FOR CLOUD SERVICE PARAMETERS THROUGH...
A CLOUD COMPUTING USING ROUGH SET THEORY FOR CLOUD SERVICE PARAMETERS THROUGH...
 
A CLOUD COMPUTING USING ROUGH SET THEORY FOR CLOUD SERVICE PARAMETERS THROUGH...
A CLOUD COMPUTING USING ROUGH SET THEORY FOR CLOUD SERVICE PARAMETERS THROUGH...A CLOUD COMPUTING USING ROUGH SET THEORY FOR CLOUD SERVICE PARAMETERS THROUGH...
A CLOUD COMPUTING USING ROUGH SET THEORY FOR CLOUD SERVICE PARAMETERS THROUGH...
 
A cloud computing using rough set theory for cloud service parameters through...
A cloud computing using rough set theory for cloud service parameters through...A cloud computing using rough set theory for cloud service parameters through...
A cloud computing using rough set theory for cloud service parameters through...
 
A Cloud Service Selection Model Based on User-Specified Quality of Service Level
A Cloud Service Selection Model Based on User-Specified Quality of Service LevelA Cloud Service Selection Model Based on User-Specified Quality of Service Level
A Cloud Service Selection Model Based on User-Specified Quality of Service Level
 
A cloud service selection model based
A cloud service selection model basedA cloud service selection model based
A cloud service selection model based
 
INSTANCE BASED MULTI CRITERIA DECISION MODEL FOR CLOUD SERVICE SELECTION USI...
 INSTANCE BASED MULTI CRITERIA DECISION MODEL FOR CLOUD SERVICE SELECTION USI... INSTANCE BASED MULTI CRITERIA DECISION MODEL FOR CLOUD SERVICE SELECTION USI...
INSTANCE BASED MULTI CRITERIA DECISION MODEL FOR CLOUD SERVICE SELECTION USI...
 
A cloud broker approach with qos attendance and soa for hybrid cloud computin...
A cloud broker approach with qos attendance and soa for hybrid cloud computin...A cloud broker approach with qos attendance and soa for hybrid cloud computin...
A cloud broker approach with qos attendance and soa for hybrid cloud computin...
 
A CLOUD BROKER APPROACH WITH QOS ATTENDANCE AND SOA FOR HYBRID CLOUD COMPUTIN...
A CLOUD BROKER APPROACH WITH QOS ATTENDANCE AND SOA FOR HYBRID CLOUD COMPUTIN...A CLOUD BROKER APPROACH WITH QOS ATTENDANCE AND SOA FOR HYBRID CLOUD COMPUTIN...
A CLOUD BROKER APPROACH WITH QOS ATTENDANCE AND SOA FOR HYBRID CLOUD COMPUTIN...
 
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATIONWEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
 
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATIONWEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
WEB SERVICE SELECTION BASED ON RANKING OF QOS USING ASSOCIATIVE CLASSIFICATION
 
Differentiating Algorithms of Cloud Task Scheduling Based on various Parameters
Differentiating Algorithms of Cloud Task Scheduling Based on various ParametersDifferentiating Algorithms of Cloud Task Scheduling Based on various Parameters
Differentiating Algorithms of Cloud Task Scheduling Based on various Parameters
 
F017633538
F017633538F017633538
F017633538
 
Priority Based Prediction Mechanism for Ranking Providers in Federated Cloud ...
Priority Based Prediction Mechanism for Ranking Providers in Federated Cloud ...Priority Based Prediction Mechanism for Ranking Providers in Federated Cloud ...
Priority Based Prediction Mechanism for Ranking Providers in Federated Cloud ...
 
QoS Based Scheduling Techniques in Cloud Computing: Systematic Review
QoS Based Scheduling Techniques in Cloud Computing: Systematic ReviewQoS Based Scheduling Techniques in Cloud Computing: Systematic Review
QoS Based Scheduling Techniques in Cloud Computing: Systematic Review
 
Compatibility aware cloud service composition under fuzzy preferences of users
Compatibility aware cloud service composition under fuzzy preferences of usersCompatibility aware cloud service composition under fuzzy preferences of users
Compatibility aware cloud service composition under fuzzy preferences of users
 
Revenue Maximization with Good Quality of Service in Cloud Computing
Revenue Maximization with Good Quality of Service in Cloud ComputingRevenue Maximization with Good Quality of Service in Cloud Computing
Revenue Maximization with Good Quality of Service in Cloud Computing
 
Modeling Local Broker Policy Based on Workload Profile in Network Cloud
Modeling Local Broker Policy Based on Workload Profile in Network CloudModeling Local Broker Policy Based on Workload Profile in Network Cloud
Modeling Local Broker Policy Based on Workload Profile in Network Cloud
 
User-Centric Optimization for Constraint Web Service Composition using a Fuzz...
User-Centric Optimization for Constraint Web Service Composition using a Fuzz...User-Centric Optimization for Constraint Web Service Composition using a Fuzz...
User-Centric Optimization for Constraint Web Service Composition using a Fuzz...
 
USER-CENTRIC OPTIMIZATION FOR CONSTRAINT WEB SERVICE COMPOSITION USING A FUZZ...
USER-CENTRIC OPTIMIZATION FOR CONSTRAINT WEB SERVICE COMPOSITION USING A FUZZ...USER-CENTRIC OPTIMIZATION FOR CONSTRAINT WEB SERVICE COMPOSITION USING A FUZZ...
USER-CENTRIC OPTIMIZATION FOR CONSTRAINT WEB SERVICE COMPOSITION USING A FUZZ...
 

More from MunmunSaha7 (6)

cloud computing.ppt
cloud computing.pptcloud computing.ppt
cloud computing.ppt
 
vssutcloud computing.pptx
vssutcloud computing.pptxvssutcloud computing.pptx
vssutcloud computing.pptx
 
sla nptl.pptx
sla nptl.pptxsla nptl.pptx
sla nptl.pptx
 
nptl cc video.pptx
nptl cc video.pptxnptl cc video.pptx
nptl cc video.pptx
 
Network lab.pptx
Network lab.pptxNetwork lab.pptx
Network lab.pptx
 
cloudintro-lec01.ppt
cloudintro-lec01.pptcloudintro-lec01.ppt
cloudintro-lec01.ppt
 

Recently uploaded

Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
negromaestrong
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
ciinovamais
 

Recently uploaded (20)

Asian American Pacific Islander Month DDSD 2024.pptx
Asian American Pacific Islander Month DDSD 2024.pptxAsian American Pacific Islander Month DDSD 2024.pptx
Asian American Pacific Islander Month DDSD 2024.pptx
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.
 
ICT role in 21st century education and it's challenges.
ICT role in 21st century education and it's challenges.ICT role in 21st century education and it's challenges.
ICT role in 21st century education and it's challenges.
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdf
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
On National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan FellowsOn National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan Fellows
 
Mehran University Newsletter Vol-X, Issue-I, 2024
Mehran University Newsletter Vol-X, Issue-I, 2024Mehran University Newsletter Vol-X, Issue-I, 2024
Mehran University Newsletter Vol-X, Issue-I, 2024
 
Role Of Transgenic Animal In Target Validation-1.pptx
Role Of Transgenic Animal In Target Validation-1.pptxRole Of Transgenic Animal In Target Validation-1.pptx
Role Of Transgenic Animal In Target Validation-1.pptx
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptx
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docx
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701
 
Micro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfMicro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdf
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 

Design of Multi-Criteria Decision making algorithm for Cloud.pptx

  • 1. Design of Multi-Criteria Decision Making Algorithms for Cloud Computing Submitted By: Guided By: Munmun Saha Dr. Suvasini Panigrahi Reg No – 1810040005 Co-Guide: Dr. Sanjaya Kumar Panda Department of Computer Science and Engineering Veer Surendra Sai University of Technology Burla,Odisha, India 1
  • 2. CONTENT  Introduction to Cloud Computing  Overview of Multi-Criteria Decision Making  Motivation and Objectives  Previous Work  Problem Statement  Proposed Methodology and Case Study  Road Map  Conclusions  Reference 2
  • 3. Definition of Cloud Computing Some Definitions of Cloud Computing • Cloud Computing is a general term which simply means, distributed computing over the internet, or delivering computing service on the internet . • The practice of using a network of remote servers hosted on the Internet to store, manage, and process data, rather than a local server or a personal computer. This is known as Cloud computing. • National Institute of Standards and Technology (NIST), which defines cloud computing as, a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. 3
  • 4. Services of Cloud Computing 4 Figure 1 Services of Cloud Computing
  • 5. Deployment Models of Cloud Computing 5 Figure 2 Deployment Models of Cloud Computing
  • 6. Multi-Criteria Decision Making Multi Criteria Decision Making (MCDM) refers to making decisions in the presence of multiple usually conflicting criteria. 6 Figure 3 Multi-Criteria Decision Making
  • 7. Real Life of Multi-Criteria Decision Making Suppose we want to buy a Car. We have different alternatives- BMW, Ford, Honda, Toyota etc. To Select the Best Car Price MPG Style Riding Comfort 7 Figure4 Real life example of Multi-Criteria Decision Making
  • 8. Motivation and Objectives In spite of huge significance, scant attention has been given in the area of MCDM in cloud computing by considering all the performance parameters including both beneficial and non beneficial attribute. The primary objective of MCDM is to select the best Cloud Service Provider without compromising any SLA index, and considering both the objective and subjective criteria. Moreover, whatever the algorithms that have been implemented in the existing works that is also limited to many applications specially minimizing the non- beneficial attribute and maximizing the beneficial attribute value. Following these ideas and motivated from previous works, the objective of MCDM algorithm are as follows: 8
  • 9. Motivation and Objectives • To select the best CSP among various homogenous alternatives. • To evaluate different QoS parameter of the Cloud Service Provider. • Not to Compromise the Key Performance index mentioned in the SLAs. • To categorized the QoS parameter in B-O-C-R model (Benefits, Opportunities, Cost, Risk) where Benefit and Opportunities are beneficial attribute and Cost and Risk is Non-Beneficial attributes. • To select the CSP having the maximum beneficial and minimum non-beneficial criteria value. • To select best criteria of each cloud provider, such that a composite service is to be provided by collaborating among the CSPs. 9
  • 10. Elements of Multi-Criteria Decision Making MCDM problem has five elements • A Goal • At least two Alternatives • Two or more criteria • Criteria weights • Decision Makers 10
  • 11. Type of Multi Criteria Decision Making They further classified into 1. Multi Object Decision Making (MODM) 2. Multi Attribute Decision Making (MADM) MODM is applied on Continuous search space whereas MADM is applied on finite number of alternatives or discrete search space. MCDM MODM MADM 11 Figure 5(a) Multi-Criteria Decision Making , Figure 5(b) Multi-Criteria Decision Making
  • 12. Some existing algorithms of MCDM MCDM MODM MADM Weighted Sum Method (WSM) Weighted Product Method (WPM) Analytic Hierarchy Process (AHP) Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) Preference Organization Ranking Method for Enrichment of Evaluation (PROMETHEE) Vise Kriterijumska Optimizacija I Kompromisno Resenjie (VIKOR) Multi Objective Optimization using Ratio Analysis (MOORA) Graph Theory Matrix Approach (GTMA) 12
  • 13. Literature Review (Comparison Table) Sl. No. Author’s Name Year MCDM Method Objective Approach Service Tool Criteria and Alter- natives 1 Manish Godse, Shrikant Mulik [27] 2008 AHP To select Appropriate SAAS product. Weight of the parameter and final product of the score of the alternatives are calculated by AHP to avoid subjective opinion. SAAS - 5 criteria and 16 sub criteria and 3 alternatives . 2 Vuong Xuan Tran et al. [32] 2009 AHP To rank web Service. QoS based ranking Algorithm combining AHP to rank web service. - - 8 QoS property and 5 web service. 3 Chen-Tung Chen, Kuan- Hung Lin [28] 2010 FAHP For evaluating Cloud Service Interval value Fuzzy sets combining with AHP are used for evaluation of cloud service. Overall services - 3 criteria 9 Sub- criteria and 3 alternativ 13 Table 1 Comparison Table
  • 14. Literature Review (Comparison Table) 4 Saurabh Kumar, Garg,Steve Versteeg,Rajkum ar Buyyaa [20] 2013 AHP Framework To Rank Cloud Computing service. AHP for both assigning weight to criteria and ranking the alternatives. Overall Services. - 6 criteria and 9 sub criteria and rank 3 alternatives 5 Hong-Kyu Kwon1, Kwang Kyu Seo [23] 2013 Fuzzy AHP To select suitable IAAS provider. Fuzzy AHP for assigning weight to the criteria and AHP for ranking the alternatives. IAAS Exper t Choic e 3 criteria and 8 Sub criteria and 5 IAAS Provider. 6 Mingzhe Wang, Yu Liu [24] 2013 ANP Evaluation of QoS requirement in Cloud Service Architecture (CSA). Control Hierarchy is constructed using ANP, relative superiority is calculated from the super-matrix. IAAS PAAS SAAS CPN+ and Clous - Sim 35 criteria and 5 alternatives. 7 Gultekin Atas ,Vehbi Cagri Gungor [21] 2014 AHP,LSP To Evaluate the performance of the PAAS Provider. AHP for decomposing performance variable and LSP for logical scoring PAAS - 3 criteria 19 sub criteria and 3 alternatives. 14
  • 15. Literature Review (Comparison Table) 8 Ramachandran N. et al.[22] 2014 AHP To deploy a appropriate model of cloud computing in an academic institution. AHP for comparing the criteria and ranking the cloud model. Overall service Super Deci- sion 6 main factor and 28 sub factors, and consider 4 cloud deployment model. 9 Mohamed AbdelBasset et. al.[25] 2016 NAHP To evaluate Cloud Computing service. Neutrosophic MCDM analysis approach based on AHP. IAAS PAAS SAAS - 5 criteria 3 alternatives. 10 Rajanpreet Kaur Chahal and Sarbjeet Singh[26] 2016 AHP To rank CSP. AHP is used for comparison of the criteria and ranking the alternatives. Overall - 4 criteria and 5 alternatives. 11 Rakesh Ranjan Kumar et al. [29] 2017 AHP and Fuzzy TOPSIS Prioritizing the solution of cloud ser- vice selection. AHP for calculation weight of the criteria and Fuzzy TOP- SIS for final rank of alternatives. - - 10 Criteria and 6 alternatives. 15
  • 16. Literature Review (Comparison Table)
  12. Neeraj Yadav, Major Singh Goraya [31], 2017. Method: AHP. Objective: service mapping in the cloud environment. Approach: two-way service mapping between Service Requesting Customers (SRCs) and CSPs based on AHP ranking. Service: overall. Tool: -. Criteria/alternatives: 3 criteria, 3 alternatives.
  13. Rakesh Ranjan Kumar, Chiranjeev Kumar [30], 2018. Method: AHP and TOPSIS. Objective: select and rank cloud services. Approach: AHP for weighting the criteria and TOPSIS for ranking the alternatives. Service: IaaS, PaaS, SaaS. Tool: -. Criteria/alternatives: 10 criteria, 6 alternatives.
  14. Jagpreet Sidhu, Sarbjeet Singh [33], 2017. Method: AHP, TOPSIS, PROMETHEE. Objective: determine the trustworthiness of CSPs. Approach: trust is evaluated using three MCDM techniques (AHP, TOPSIS, PROMETHEE) and the results are compared. Service: -. Tool: -. Criteria/alternatives: 10 attributes, 18 CSPs.
  15. Zoie Radulescu, Cristina Radulescu [34], 2017. Method: E-TOPSIS. Objective: rank CSPs. Approach: extended TOPSIS using the Minkowski distance is used to rank the CSPs. Service: -. Tool: -. Criteria/alternatives: 32 criteria, 10 CSPs.
  • 17. Literature Review (Comparison Table)
  16. Omar Boutkhoum et al. [35], 2017. Method: Fuzzy AHP, Fuzzy TOPSIS. Objective: select a suitable cloud solution for big-data projects. Approach: decision-making approach consisting of FAHP for assigning weights to the criteria and FTOPSIS for ranking the alternatives. Service: -. Tool: -. Criteria/alternatives: 10 criteria, 5 alternatives.
  17. R. Krishankumar et al. [36], 2017. Method: IF-GDM, IF-AHP. Objective: select the best cloud vendor. Approach: Intuitionistic Fuzzy Group Decision Making (IF-GDM) approach based on IF-AHP for pairwise comparison of the criteria and ranking of the clouds. Service: -. Tool: -. Criteria/alternatives: 5 criteria, 4 alternatives.
  18. Chandrashekar Jatoth et al. [37], 2018. Method: AHP and G-TOPSIS. Objective: select the cloud service. Approach: AHP for defining the priorities of the criteria and EG-TOPSIS for selecting and ranking the alternatives. Service: -. Tool: -. Criteria/alternatives: 5 criteria, 19 alternatives.
  • 18. Literature Review (Comparison Table)
  19. Osama Sohaib, Mohsen Naderpour [38], 2017. Method: Fuzzy TOPSIS. Objective: suitable adoption of cloud computing in e-commerce. Approach: categorized the criteria into TOE factors and ranked the cloud services by Fuzzy TOPSIS. Service: SaaS, PaaS, IaaS. Tool: -. Criteria/alternatives: 12 criteria, 3 alternatives.
  20. Deepti Rai, Pavan Kumar V [39], 2016. Method: TOPSIS, VIKOR. Objective: select the best cloud service. Approach: daily ranking comparison of cloud services using TOPSIS and VIKOR. Service: IaaS. Tool: CloudSim. Criteria/alternatives: 3 criteria, 10 alternatives.
  21. Hamzeh Mohammad Alabool, Ahmad Kamil Mahmood [40], 2013. Method: FM-VIKOR. Objective: trust-based cloud service selection. Approach: modified VIKOR extended with fuzzy sets to evaluate the trust of CISs and rank them based on their degree of trust. Service: IaaS. Tool: -. Criteria/alternatives: 15 criteria, 5 CSPs.
  22. Zohreh Akbarizadeh, Mahdi Faghihi [41], 2017. Method: SWARA, VIKOR. Objective: rank CSPs. Approach: SWARA for assigning weights and VIKOR for ranking the CSPs. Service: -. Tool: -. Criteria/alternatives: 28 criteria, 4 alternatives.
  • 19. Literature Review (Comparison Table)
  23. Jagpreet Sidhu, Sarbjeet Singh [42], 2019. Method: I-PROMETHEE, AHP. Objective: select trustworthy Cloud Database Servers. Approach: AHP for the relative importance of the criteria and Improved PROMETHEE for ranking the CDSs by calculating the positive and negative outranking flows. Service: -. Tool: -. Criteria/alternatives: 10 parameters, 18 alternatives.
  24. Hua Ma et al. [43], 2017. Method: N-ELECTRE. Objective: trustworthy ranking prediction for cloud services. Approach: improved ELECTRE formed by combining INS and KRCC; INS is used for measuring trust and KRCC for ranking. Service: -. Tool: -. Criteria/alternatives: 8 CSPs.
  25. Gulcin Buyukozkan et al. [44], 2018. Method: IVIF MCDM methods. Objective: CCT selection based on IVIF MCDM methods. Approach: IVIF AHP for pairwise comparison of the criteria, IVIF ... Service: -. Tool: -. Criteria/alternatives: 6 criteria, 27 sub-criteria, 4 ...
  • 20. Literature Review (Comparison Table)
  26. Radulescu Constanța Zoie et al. [45], 2016. Method: DEMATEL and AHP. Objective: assign weights to and rank the criteria for CSP selection. Approach: a hybrid DANP method is used for calculating the criteria and cluster weights; the global weights and the ranking are evaluated from the super-matrix. Service: -. Tool: -. Criteria/alternatives: 32 criteria, 3 clusters.
  27. Chandrashekar Jatoth et al. [46], 2016. Method: AHP, ANP, M-DEA, M-SDEA. Objective: evaluate the efficiency of cloud services. Approach: AHP and ANP for determining the priorities and weights of the QoS attributes, and DEA and SDEA for calculating the efficiency to rank the cloud services. Service: -. Tool: -. Criteria/alternatives: 7 criteria, 11 alternatives.
  28. Nivethitha Somu et al. [47], 2017. Method: HGCM, MDHP. Objective: rank CSPs. Approach: the Helly property and hypergraphs are used to assign weights and MDHP is used for ranking the alternatives. Service: overall. Tool: -. Criteria/alternatives: 6 criteria, 5 alternatives.
  • 21. Literature Review (Comparison Table)
  29. Chinu Singla et al. [48], 2018. Method: FDM, FAHP. Objective: decision-making model for the multimedia cloud based on computational intelligence. Approach: FDM for selection of the decision criteria and FAHP for determining the importance of each criterion and ranking the alternatives. Service: IaaS. Tool: CloudSim and MATLAB. Criteria/alternatives: 5 criteria, 5 alternatives.
  30. Gireesha Obulaporam et al. [52], 2019. Method: CRITIC and Grey Relational Analysis. Objective: ranking approach for cloud service selection. Approach: GCRITICPA is used to overcome the rank reversal of many MCDM methods; CRITIC determines the criteria weights and GRA ranks the CSPs. Service: -. Tool: -. Criteria/alternatives: 19 CSPs, 9 attributes.
  • 22. Problem Statement: Consider a set of m clouds C = (C1, C2, C3, ..., Cm) and a set of n criteria A = (A1, A2, A3, ..., An), in which each criterion Ai has a weight Wi. Note that the weights sum to one, i.e. W1 + W2 + ... + Wn = 1. A criterion Ai, where 1 ≤ i ≤ n, refers to one of the attributes of the cloud. 22
  • 23. Problem Statement: The criteria are further categorized into beneficial and non-beneficial criteria. The value of a beneficial criterion is to be maximized, whereas the value of a non-beneficial criterion is to be minimized. The value of each cloud with respect to each criterion is presented in the form of a matrix, called the multi-criteria decision making (MCDM) matrix M = [Mij], i = 1, ..., m, j = 1, ..., n, as shown in Eq. 1. An element Mij (1 ≤ i ≤ m, 1 ≤ j ≤ n) of the MCDM matrix denotes the performance value of cloud Ci on criterion Aj. 23
  • 24. Problem Statement: Given an MCDM matrix, the problem is to select the best cloud out of the set of m clouds, or to rank the set of clouds, such that the best cloud service provider (CSP) holds the maximum beneficial and the minimum non-beneficial criteria values. Moreover, the problem is to select the best criteria of each cloud provider, such that a composite service can be provided through collaboration among the CSPs. 24
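  For concreteness, here is a minimal sketch of how the MCDM matrix, the criteria weights and the beneficial/non-beneficial split can be represented. The numbers and criteria are hypothetical placeholders, not data from the case study.

```python
import numpy as np

# Hypothetical MCDM matrix M: rows are clouds C1..C3, columns are criteria A1..A4.
M = np.array([
    [0.80, 0.60, 120.0, 0.30],   # C1
    [0.70, 0.75,  95.0, 0.45],   # C2
    [0.65, 0.70, 110.0, 0.25],   # C3
])

W = np.array([0.4, 0.3, 0.2, 0.1])                  # criteria weights, must sum to 1
beneficial = np.array([True, True, False, False])   # A3, A4 are non-beneficial (minimize)

assert np.isclose(W.sum(), 1.0)
m, n = M.shape    # m clouds, n criteria; M[i, j] is the value of cloud Ci on criterion Aj
```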
  • 25. Proposed Method: Objective: a B-O-C-R model for cloud selection using ANP and VIKOR. The computation of the service index is done using the quality of service (QoS) data of three cloud providers, namely Amazon EC2, Windows Azure and Rackspace. The QoS data is collected from various evaluation studies [Garg et al.]; the unavailable data is assigned randomly. The QoS requirements are: Accountability, Agility, Assurance, Performance, VM cost, Data cost, Storage cost, Adaptability, Flexibility, Serviceability, Provider's risk, Compliances and HR risk. 25
  • 26. Proposed Method: Main steps to model the cloud selection problem: • Group the QoS requirements into the B-O-C-R model (Benefits, Opportunities, Cost, Risk). • Compute the relative importance of the QoS requirements in each group and find the local priority of the alternatives in each group using ANP. • Rank the alternatives using VIKOR. 26
  • 27. Proposed Method: A brief explanation of the Analytical Hierarchy Process (AHP). AHP is a well-known MCDM algorithm which performs pair-wise comparison of criteria and sub-criteria, resulting in local priorities or weighting factors. [Figure: AHP hierarchy with a goal at the top, criteria C1, C2, C3 in the middle and alternatives A1, A2, A3 at the bottom] 27
  • 28. Proposed Method: C1, C2 and C3 are criteria and A1, A2 and A3 are alternatives; the goal is to find the best alternative. [Figure: AHP hierarchy with goal, criteria C1, C2, C3 and alternatives A1, A2, A3]  The criteria weights are assigned through pairwise comparison matrices, and the local priorities are calculated from the principal eigenvector of each matrix.  By applying the global priorities to the alternatives, we finally get a ranking of the alternatives with respect to the criteria and sub-criteria. 28
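  As an illustration of the AHP step described above, here is a minimal sketch that derives local priorities from a pairwise comparison matrix via its principal eigenvector. The 3x3 matrix is a hypothetical example on Saaty's 1-9 scale, not a matrix from the case study (Super Decisions performs this computation internally).

```python
import numpy as np

def ahp_priorities(pairwise):
    """Local priorities of an AHP pairwise comparison matrix (principal eigenvector)."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)          # index of the principal (largest) eigenvalue
    w = np.abs(eigvecs[:, k].real)       # corresponding eigenvector
    return w / w.sum()                   # normalize so the priorities sum to 1

# Hypothetical comparison of criteria C1, C2, C3 on Saaty's 1-9 scale.
C = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
print(np.round(ahp_priorities(C), 3))    # roughly [0.65, 0.23, 0.12]
```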
  • 29. Proposed Method: We have used the Analytical Network Process (ANP) instead of AHP in our algorithm.  ANP is a decision-making method and a generalization of AHP.  ANP can model complex decision problems where the hierarchical model of AHP is not sufficient.  In ANP, criteria, sub-criteria and alternatives are all treated equally as nodes in a network.  Each node may be compared to any other node, as long as there is a relation between them.  In ANP, nodes may be grouped into clusters, e.g. beneficial and non-beneficial.  Besides the local priorities obtained from comparing one node to a set of other nodes, cluster priorities can be introduced. 29
  • 30. Proposed Method: We have used the Analytical Network Process (ANP) instead of AHP in our algorithm.  The comparison of nodes to other nodes follows the same principle and method as in AHP.  The local priorities result from the eigenvector of the comparison matrix. [Figure: ANP network connecting the goal, criteria C1, C2, C3 and alternatives A1, A2, A3] 30
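  To show how ANP turns these local priorities into global ones, here is a minimal sketch of the limit-supermatrix step, assuming a small hypothetical weighted (column-stochastic) supermatrix over two criteria and two alternatives; Super Decisions carries out the equivalent computation in the case study. For networks with cyclic structure the limit is usually taken as an average of successive powers; repeated squaring suffices for this small example.

```python
import numpy as np

def limit_priorities(weighted_supermatrix, tol=1e-9, max_iter=100):
    """Raise a column-stochastic weighted supermatrix to successive powers until it
    converges; any column of the limit matrix then contains the global priorities."""
    W = np.asarray(weighted_supermatrix, dtype=float)
    for _ in range(max_iter):
        W_next = W @ W                     # repeated squaring
        if np.max(np.abs(W_next - W)) < tol:
            break
        W = W_next
    return W_next[:, 0]

# Hypothetical weighted supermatrix over the nodes [C1, C2, A1, A2]; columns sum to 1.
S = np.array([
    [0.0, 0.2, 0.6, 0.4],
    [0.0, 0.0, 0.4, 0.6],
    [0.5, 0.3, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
])
print(np.round(limit_priorities(S), 3))    # global priorities of C1, C2, A1, A2
```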
  • 31. Proposed Method with Case Study: We have used the "Super Decisions" tool for computing the ANP priorities. 31
  • 32. Proposed Method with Case Study: Step 1: Group the QoS requirements into the B-O-C-R model (Benefits, Opportunities, Cost, Risk). 32
  • 33. Proposed Method with Case Study: Step 1 (continued): Group the QoS requirements into the B-O-C-R model (Benefits, Opportunities, Cost, Risk). 33
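  These two slides appear to show the grouping as configured in Super Decisions; as a rough textual stand-in, the sketch below gives one plausible grouping of the thirteen QoS requirements. The Cost and Risk assignments follow directly from the attribute names, while the split of the remaining attributes between Benefits and Opportunities is an assumption for illustration only.

```python
# Illustrative B-O-C-R grouping of the QoS requirements (assumed, not read from the
# Super Decisions model shown on the slides).
bocr = {
    "Benefits":      ["Accountability", "Assurance", "Performance", "Serviceability"],
    "Opportunities": ["Agility", "Adaptability", "Flexibility"],
    "Cost":          ["VM cost", "Data cost", "Storage cost"],
    "Risk":          ["Provider's risk", "Compliances", "HR risk"],
}
alternatives = ["Amazon EC2", "Rackspace", "Windows Azure"]
```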
  • 34. Proposed Method with Case Study: Step 2: Compute the relative importance of the QoS requirements in each group and find the local priority of the alternatives in each group using ANP. 34
  • 35. Proposed Method with Case Study: Step 2 (continued): Compute the relative importance of the QoS requirements in each group and find the local priority of the alternatives in each group using ANP. 35
  • 36. Proposed Method with Case Study: Similarly, a relative comparison is performed for every node. Finally, a super decision matrix is formed which contains the local priorities of each alternative in each individual group. Matrix for Benefits. 36
  • 37. Proposed Method with Case Study: Similarly, a relative comparison is performed for every node. Finally, a super decision matrix is formed which contains the local priorities of each alternative in each individual group. Matrix for Opportunities. 37
  • 38. Proposed Method with Case Study: Similarly, a relative comparison is performed for every node. Finally, a super decision matrix is formed which contains the local priorities of each alternative in each individual group. Matrix for Cost. 38
  • 39. Proposed Method with Case Study: Similarly, a relative comparison is performed for every node. Finally, a super decision matrix is formed which contains the local priorities of each alternative in each individual group. Matrix for Risk. 39
  • 40. Proposed Method with Case Study: Overall values after the final comparison:
  Sl No.  Alternatives    Benefits   Opportunities  Cost       Risk
  1       Amazon EC2      0.248367   0.166850       0.156623   0.255299
  2       Rackspace       0.114261   0.164199       0.156356   0.124611
  3       Windows Azure   0.137372   0.168951       0.187021   0.120089
  • 41. Proposed Method with Case Study: The overall values from the final comparison are represented in a radar chart. [Radar chart: Benefits, Opportunities, Cost and Risk values (scale 0 to 0.3) for Amazon EC2, Rackspace and Windows Azure] 41
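  A radar chart like the one on this slide can be reproduced from the table of overall values; the sketch below uses Python with matplotlib (not the tool used for the original figure) and the B-O-C-R values listed above, so the styling is illustrative only.

```python
import numpy as np
import matplotlib.pyplot as plt

criteria = ["Benefits", "Opportunities", "Cost", "Risk"]
values = {
    "Amazon EC2":    [0.248367, 0.166850, 0.156623, 0.255299],
    "Rackspace":     [0.114261, 0.164199, 0.156356, 0.124611],
    "Windows Azure": [0.137372, 0.168951, 0.187021, 0.120089],
}

# One angle per criterion, repeating the first angle to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(criteria), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for name, vals in values.items():
    data = vals + vals[:1]
    ax.plot(angles, data, label=name)
    ax.fill(angles, data, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(criteria)
ax.legend(loc="upper right", bbox_to_anchor=(1.35, 1.1))
plt.show()
```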
  • 42. Proposed Method with Case Study: Step 3: Rank the alternatives using VIKOR. Vise Kriterijumska Optimizacija I Kompromisno Resenje (VIKOR) is a Serbian term, and VIKOR is an MCDM algorithm. It proceeds in five phases: • Normalization • Difference • Weighted and normalized distance • Combined weight • Selection. 42
  • 43. Proposed Method with Case Study: In this phase, the maximum and minimum values of the criteria are determined. The maximum value of each beneficial criterion is marked in red and the minimum value of each non-beneficial criterion in violet.
  Sl No.  Alternatives    Benefits   Opportunities  Cost       Risk
  1       Amazon EC2      0.248367   0.166850       0.156623   0.255299
  2       Rackspace       0.114261   0.164199       0.156356   0.124611
  3       Windows Azure   0.137372   0.168951       0.187021   0.120089
  • 44. Proposed Method with Case Study: Normalize the data into the range (0, 1): x = x/xmax for beneficial attributes and x = xmin/x for non-beneficial attributes.
  Sl No.  Alternatives    Benefits   Opportunities  Cost       Risk
  1       Amazon EC2      0.248367   0.166850       0.156623   0.255299
  2       Rackspace       0.114261   0.164199       0.156356   0.124611
  3       Windows Azure   0.137372   0.168951       0.187021   0.120089
  • 45. Proposed Method with Case Study: Normalize the data into the range (0, 1): x = x/xmax for beneficial attributes and x = xmin/x for non-beneficial attributes.
  Sl No.  Alternatives    Benefits            Opportunities       Cost                Risk
  1       Amazon EC2      0.248367/0.248367   0.166850/0.168951   0.156356/0.156623   0.120089/0.255299
  2       Rackspace       0.114261/0.248367   0.164199/0.168951   0.156356/0.156356   0.120089/0.124611
  3       Windows Azure   0.137372/0.248367   0.168951/0.168951   0.156356/0.187021   0.120089/0.120089
  • 46. Proposed Method with Case Study: Normalize the data into the range (0, 1): x = x/xmax for beneficial attributes and x = xmin/x for non-beneficial attributes.
  Sl No.  Alternatives    Benefits   Opportunities  Cost      Risk
  1       Amazon EC2      1          0.9875         0.9982    0.6004
  2       Rackspace       0.4600     0.9718         1         0.9637
  3       Windows Azure   0.5531     1              0.8360    1
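  Below is a minimal sketch of the normalization rule stated above; it is applied to a small hypothetical matrix rather than to the case-study values, so the printed numbers are purely illustrative.

```python
import numpy as np

def normalize(P, beneficial):
    """x / x_max for beneficial criteria, x_min / x for non-beneficial criteria."""
    P = np.asarray(P, dtype=float)
    beneficial = np.asarray(beneficial, dtype=bool)
    return np.where(beneficial, P / P.max(axis=0), P.min(axis=0) / P)

# Hypothetical 2-alternative, 2-criterion example: first column beneficial, second not.
print(normalize([[0.8, 0.2],
                 [0.4, 0.1]], [True, False]))
# -> [[1.   0.5]
#     [0.5  1. ]]
```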
  • 47. Proposed Method with Case Study: Find the difference Max - Min for each criterion.
  Sl No.  Alternatives          Benefits   Opportunities  Cost      Risk
  1       Amazon EC2            1          0.9875         0.9982    0.6004
  2       Rackspace             0.4600     0.9718         1         0.9637
  3       Windows Azure         0.5531     1              0.8360    1
  4       Max                   1          1              1         1
  5       Min                   0.4600     0.9718         0.8360    0.6004
  6       Difference (Max-Min)  0.54       0.0282         0.164     0.3996
  • 48. Proposed Method with Case Study: Assign weights to the criteria.
  Sl No.  Alternatives          Benefits   Opportunities  Cost      Risk
          Weight                0.5        0.167          0.25      0.083
  1       Amazon EC2            1          0.9875         0.9982    0.6004
  2       Rackspace             0.4600     0.9718         1         0.9637
  3       Windows Azure         0.5531     1              0.8360    1
  4       Max                   1          1              1         1
  5       Min                   0.4600     0.9718         0.8360    0.6004
  6       Difference (Max-Min)  0.54       0.0282         0.164     0.3996
  • 49. Proposed Method with Case Study: Find the weighted and normalized distance for each entry: WNMD = W * (MX - N) / (MX - MN), where MX is the maximum value, MN the minimum value, N the criteria value, W the criteria weight and WNMD the weighted normalized distance value.
  Sl No.  Alternatives          Benefits   Opportunities  Cost      Risk
          Weight                0.5        0.167          0.25      0.083
  1       Amazon EC2            1          0.9875         0.9982    0.6004
  2       Rackspace             0.4600     0.9718         1         0.9637
  3       Windows Azure         0.5531     1              0.8360    1
  4       Max                   1          1              1         1
  5       Min                   0.4600     0.9718         0.8360    0.6004
  6       Difference (Max-Min)  0.54       0.0282         0.164     0.3996
  • 50. Proposed Method with Case Study: The combined weight is calculated; the alternative with the minimum combined weight is ranked 1, and the rank increases as the value increases. All the simulations have been performed using MATLAB 2016a.
  Sl No.  Alternatives    Combined weight  Rank
  1       Amazon EC2      0                1
  2       Rackspace       0.8642           2
  3       Windows Azure   1                3
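  For reference, here is a minimal sketch of a textbook VIKOR combination step (weighted distances, the S and R measures, and the combined index Q with the usual trade-off v = 0.5), run on a small hypothetical normalized matrix. The MATLAB implementation behind the slides may differ in its exact formulation, so this is an illustration of the technique rather than a reproduction of the table above.

```python
import numpy as np

def vikor(N, weights, v=0.5):
    """Textbook VIKOR on an already-normalized matrix N (higher = better in every column).
    S: weighted sum of distances to the per-criterion best value,
    R: worst single weighted distance,
    Q: combined index; the alternative with the smallest Q is ranked first."""
    N = np.asarray(N, dtype=float)
    w = np.asarray(weights, dtype=float)
    best, worst = N.max(axis=0), N.min(axis=0)
    D = w * (best - N) / (best - worst)          # weighted, normalized distances
    S, R = D.sum(axis=1), D.max(axis=1)
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    ranks = Q.argsort().argsort() + 1            # rank 1 = lowest Q
    return Q, ranks

# Hypothetical normalized matrix for three alternatives and four criteria,
# combined with the criteria weights used in the case study.
N = [[1.00, 0.90, 0.95, 0.80],
     [0.60, 0.95, 1.00, 0.90],
     [0.70, 1.00, 0.85, 1.00]]
Q, ranks = vikor(N, [0.5, 0.167, 0.25, 0.083])
print(np.round(Q, 4), ranks)
```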
  • 51. Conclusion
  • This work is based on multi-criteria decision making algorithms for selecting the best cloud service provider by analyzing and comparing different beneficial and non-beneficial quality of service requirements.
  • Different cloud service selection models, such as AHP, Fuzzy TOPSIS, ANP, VIKOR, MOORA, PROMETHEE and DEA, have been reviewed. It has been observed that most of the frameworks assign weights to the service attributes and assign ranks after processing and comparing the attributes.
  • A comparison table is drawn based on the survey, and a hybrid algorithm that combines ANP and VIKOR is proposed to select the best cloud among three different CSPs.
  • 53. References
  [1] El-Gazzar, R. F. (2014, June). A literature review on cloud computing adoption issues in enterprises. In International Working Conference on Transfer and Diffusion of IT (pp. 214-242). Springer, Berlin, Heidelberg.
  [2] Mell, P., & Grance, T. (2011). The NIST definition of cloud computing.
  [3] Kumar, R. R., Mishra, S., & Kumar, C. (2017). Prioritizing the solution of cloud service selection using integrated MCDM methods under Fuzzy environment. The Journal of Supercomputing, 73(11), 4652-4682.
  [4] Garg, S. K., Versteeg, S., & Buyya, R. (2013). A framework for ranking of cloud computing services. Future Generation Computer Systems, 29(4), 1012-1023.
  [5] Garrison, G., Wakefield, R. L., & Kim, S. (2015). The effects of IT capabilities and delivery model on cloud computing success and firm performance for cloud supported processes and operations. International Journal of Information Management, 35(4), 377-393.
  [6] Buyya, R., Vecchiola, C., & Selvi, S. T. (2013). Mastering cloud computing: foundations and applications programming. McGraw Hill Education, 1st edition.
  [7] Low, C., & Chen, Y. H. (2012). Criteria for the evaluation of a cloud-based hospital information system outsourcing provider. Journal of Medical Systems, 36(6), 3543-3553.
  [8] Whaiduzzaman, M., Gani, A., Anuar, N. B., Shiraz, M., Haque, M. N., & Haque, I. T. (2014). Cloud service selection using multicriteria decision analysis. The Scientific World Journal, 2014.
  [15] Ferrer, A. J., Hernández, F., Tordsson, J., Elmroth, E., Ali-Eldin, A., Zsigri, C., ... & Ziegler, W. (2012). OPTIMIS: A holistic approach to cloud service provisioning. Future Generation Computer Systems, 28(1), 66-77.
  [16] El-Gazzar, R., Hustad, E., & Olsen, D. H. (2016). Understanding cloud computing adoption issues: A Delphi study approach. Journal of Systems and Software, 118, 64-84.
  [17] Salleh, S. M., Teoh, S. Y., & Chan, C. (2012, July). Cloud Enterprise Systems: A Review of Literature and Its Adoption. In PACIS (p. 76).
  • 54. References
  [18] Buyya, R., Yeo, C. S., & Venugopal, S. (2008, September). Market-oriented cloud computing: Vision, hype, and reality for delivering IT services as computing utilities. In 2008 10th IEEE International Conference on High Performance Computing and Communications (pp. 5-13). IEEE.
  [19] Dutta, A., Peng, G. C. A., & Choudhary, A. (2013). Risks in enterprise cloud computing: the perspective of IT experts. Journal of Computer Information Systems, 53(4), 39-48.
  [20] Khajeh-Hosseini, A., Greenwood, D., Smith, J. W., & Sommerville, I. (2012). The cloud adoption toolkit: supporting cloud adoption decisions in the enterprise. Software: Practice and Experience, 42(4), 447-465.
  [21] Motahari-Nezhad, H. R., Stephenson, B., & Singhal, S. (2009). Outsourcing business to cloud computing services: Opportunities and challenges. IEEE Internet Computing, 10(4), 1-17.
  [22] Sidhu, J., & Singh, S. (2017). Improved TOPSIS method based trust evaluation framework for determining trustworthiness of cloud service providers. Journal of Grid Computing, 15(1), 81-105.
  [23] Jatoth, C., Gangadharan, G. R., & Fiore, U. (2017). Evaluating the efficiency of cloud services using modified data envelopment analysis and modified super-efficiency data envelopment analysis. Soft Computing, 21(23), 7221-7234.
  [24] Kumar, R. R., Mishra, S., & Kumar, C. (2017). Prioritizing the solution of cloud service selection using integrated MCDM methods under Fuzzy environment. The Journal of Supercomputing, 73(11), 4652-4682.
  [25] Tran, V. X., Tsuji, H., & Masuda, R. (2009). A new QoS ontology and its QoS-based ranking algorithm for Web services. Simulation Modelling Practice and Theory, 17(8), 1378-1398.
  [26] Garg, S. K., Versteeg, S., & Buyya, R. (2013). A framework for ranking of cloud computing services. Future Generation Computer Systems, 29(4), 1012-1023.
  [27] Liu, S., Chan, F. T., & Ran, W. (2016). Decision making for the selection of cloud vendor: An improved approach under group decision-making with integrated weights and objective/subjective attributes. Expert Systems with Applications, 55, 37-47.
  • 55. References
  [29] Kumar, R. R., Mishra, S., & Kumar, C. (2017). Prioritizing the solution of cloud service selection using integrated MCDM methods under Fuzzy environment. The Journal of Supercomputing, 73(11), 4652-4682.
  [30] Kumar, R. R., & Kumar, C. (2018). A Multi Criteria Decision Making Method for Cloud Service Selection and Ranking. International Journal of Ambient Computing and Intelligence (IJACI), 9(3), 1-14.
  [31] Yadav, N., & Goraya, M. S. (2018). Two-way ranking based service mapping in cloud environment. Future Generation Computer Systems, 81, 53-66.
  [32] Tran, V. X., Tsuji, H., & Masuda, R. (2009). A new QoS ontology and its QoS-based ranking algorithm for Web services. Simulation Modelling Practice and Theory, 17(8), 1378-1398.
  [33] Sidhu, J., & Singh, S. (2017). Design and comparative analysis of MCDM-based multi-dimensional trust evaluation schemes for determining trustworthiness of cloud service providers. Journal of Grid Computing, 15(2), 197-218.
  [34] Rădulescu, C. Z., & Rădulescu, I. C. (2017). An extended TOPSIS approach for ranking cloud service providers. Stud. Inform. Control, 26, 183-192.
  [35] Boutkhoum, O., Hanine, M., Agouti, T., & Tikniouine, A. (2017). A decision-making approach based on fuzzy AHP-TOPSIS methodology for selecting the appropriate cloud solution to manage big data projects. International Journal of System Assurance Engineering and Management, 8(2), 1237-1253.
  [36] Krishankumar, R., Arvinda, S. R., Amrutha, A., Premaladha, J., & Ravichandran, K. S. (2017, July). A decision making framework under intuitionistic fuzzy environment for solving cloud vendor selection problem. In 2017 International Conference on Networks & Advances in Computational Technologies (NetACT) (pp. 140-144). IEEE.
  [37] Jatoth, C., Gangadharan, G. R., Fiore, U., & Buyya, R. (2018). SELCLOUD: a hybrid multi-criteria decision-making model for selection of cloud services. Soft Computing, 1-15.
  [38] Sohaib, O., & Naderpour, M. (2017, July). Decision making on adoption of cloud computing in e-commerce using fuzzy TOPSIS. In 2017 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE) (pp. 1-6). IEEE.
  • 56. References
  [39] Rai, D., & Kumar, P. (2016). Instance based multi criteria decision model for cloud service selection using TOPSIS and VIKOR. Int. J. Comput. Eng. Technol., 7, 78-87.
  [40] Alabool, H. M., & Mahmood, A. K. (2013). Trust-based service selection in public cloud computing using fuzzy modified VIKOR method. Australian Journal of Basic and Applied Sciences, 7(9), 211-220.
  [41] Akbarizade, Z., & Faghihi, M. (2017). Ranking Cloud Service Providers using SWARA and VIKOR (A case of Irancell Company). International Journal of Information, Security and Systems Management, 6(2), 679-686.
  [42] Sidhu, J., & Singh, S. (2019). Using the Improved PROMETHEE for Selection of Trustworthy Cloud Database Servers. International Arab Journal of Information Technology, 16(2), 194-202.
  [43] Ma, H., Zhu, H., Hu, Z., Li, K., & Tang, W. (2017). Time-aware trustworthiness ranking prediction for cloud services using interval neutrosophic set and ELECTRE. Knowledge-Based Systems, 138, 27-45.
  [44] Büyüközkan, G., Göçer, F., & Feyzioğlu, O. (2018). Cloud computing technology selection based on interval-valued intuitionistic fuzzy MCDM methods. Soft Computing, 22(15), 5091-5114.
  [45] Zoie, R. C., Alexandru, B., Mihaela, R. D., & Mihail, D. (2016, October). A decision making framework for weighting and ranking criteria for Cloud provider selection. In 2016 20th International Conference on System Theory, Control and Computing (ICSTCC) (pp. 590-595). IEEE.
  [46] Jatoth, C., Gangadharan, G. R., & Fiore, U. (2017). Evaluating the efficiency of cloud services using modified data envelopment analysis and modified super-efficiency data envelopment analysis. Soft Computing, 21(23), 7221-7234.
  [47] Somu, N., Kirthivasan, K., & VS, S. S. (2017). A computational model for ranking cloud service providers using hypergraph based techniques. Future Generation Computer Systems, 68, 14-30.
  [48] Singla, C., Kaushal, S., Verma, A., & Kumar, H. (2018). A Hybrid Computational Intelligence Decision Making Model for Multimedia Cloud Based Applications. In Computational Intelligence for Multimedia Big Data on the Cloud with Engineering Applications (pp. 147-157). Academic Press.
  • 57. References [49] Lee, S., & Seo, K. K. (2016). A hybrid multi-criteria decision-making model for a cloud service selection problem using BSC, fuzzy Delphi method and fuzzy AHP. Wireless Personal Communications, 86(1), 57-75. [50] Ghafori, V., & Sarhadi, R. M. (2013). Best cloud provider selection using integrated ANP-DEMATEL and prioritizing SMI attributes. International Journal of Computer Applications, 71(16). [51] Liu, S., Chan, F. T., & Ran, W. (2016). Decision making for the selection of cloud vendor: An improved approach under group decision-making with integrated weights and objective/subjective attributes. Expert Systems with Applications, 55, 37-47. [52] Obulaporam, G., Somu, N., Ramani, G. R. M., Boopathy, A. K., & Sankaran, S. S. V. (2018, December). GCRITICPA: A CRITIC and Grey relational analysis based service ranking approach for cloud service selection. In International Conference on Intelligent Information Technologies (pp. 3-16). Springer, Singapore. 57