An AHP-based Framework for Quality and Security Evaluation

Nowadays, open and dynamic cooperative architectures based on services (SOA) are widely diffused. In general, a customer is not only interested in service functionalities, but also in their quality (i.e. performance, cost, reliability, security, and so on). In this scenario, models, techniques, and tools are needed to support the effective selection of the service that provides the best quality. In this paper, we propose an evaluation framework that includes a flexible quality meta-model for formalizing the Customer and Provider views of quality, and a decisional model defining a systematic approach for comparing offered and requested quality of services. We also illustrate the applicability of the framework in a Web Service scenario.


  1. An AHP-based Framework for Quality and Security Evaluation
     V. Casola, A.R. Fasolino, N. Mazzocca, P. Tramontana
     Dipartimento di Informatica e Sistemistica
     Università degli Studi di Napoli Federico II, Naples, Italy
     {casolav, fasolino, n.mazzocca, ptramont}@unina.it
  2. Outline
     • Rationale: context and open issues
     • Our approach to quality and security evaluation
     • Methodology
       – Policy formalization
       – Evaluation technique
     • Applicability in Web Service architectures
     • Conclusions and future work
  3. Context: service cooperation, a security point of view
     • Service Oriented Architectures are capable of intelligent interaction and are able to discover and compose themselves into more complex services.
     • The open issue is: how can we guarantee the “quality and security” of a service built at run time in a potentially untrusted domain?
  4. How can a Customer choose the Web Service that best fits his “quality” requirements?
     [Diagram: a Customer with a Quality Service Request must choose among Service Providers 1…n, each publishing its own Quality Offer.]
  5. Our approach to quality/security evaluation
     Currently, these problems are faced by explicit agreements among services:
     • Each service defines its own Service Level Agreement (SLA) and Security Policy and publishes them in a public document.
     • People from the various organizations that want to cooperate manually evaluate the different SLAs and decide whether or not to agree.
     • SLAs and Policies are expressed as free-text documents; they usually contain provisions on the “quality” of services and on the “security” mechanisms adopted, and they are used to decide whether to extend trust to other services.
     • These documents are mostly evaluated manually.
  6. Security evaluation methodology
     We are working on different methodologies to:
     • Express quality/security through a semi-formal, unambiguous model; the chosen formalization must be “easy to adopt” for both technical and organizational people;
     • Evaluate the quality/security level that a security infrastructure is able to guarantee, by aggregating the security associated with all policy provisions (a multi-criteria decision approach);
     • Compare different services according to the measured quality/security level.
  7. The proposed approach
     • Models are needed to formally express the quality and security of Web Services (quality of protection, QoS, security, and so on) requested by Customers and offered by Providers.
     1. We defined a quality meta-model and formally express quality as an instance of the meta-model;
     2. We investigated the adoption of a decision framework for quality evaluation based on AHP (the Analytic Hierarchy Process), proposed by Saaty.
  8. The Quality Meta-Model
     • Quality Characteristic: any quality requirement, such as Performance, Security, Cost, Maintainability.
     • Characteristics may be arranged in a hierarchy (Measurable Characteristics are the leaves).
     • Measurable Characteristic: a Quality Characteristic that can be measured directly; each has a Measurement Unit.
     Example: the Characteristic Efficiency has the sub-Characteristic Time Behavior, which has the sub-Characteristic Response Time, whose Measurable Characteristics are Average Response Time, Standard Deviation of Response Time, and Maximum Response Time.
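The meta-model above can be sketched as a small class hierarchy. This is an illustrative sketch, not the authors' implementation: the class and field names are assumptions, while the example instances mirror the Response Time hierarchy from the slide.

```python
from dataclasses import dataclass, field

@dataclass
class Characteristic:
    """A quality characteristic; non-leaf nodes aggregate sub-characteristics."""
    name: str
    sub_characteristics: list["Characteristic"] = field(default_factory=list)

@dataclass
class MeasurableCharacteristic(Characteristic):
    """A leaf characteristic that can be measured directly, in a given unit."""
    measurement_unit: str = ""

# The Response Time example from the slide (the "ms" unit is an assumption)
response_time = Characteristic("Response Time", [
    MeasurableCharacteristic("Average Response Time", measurement_unit="ms"),
    MeasurableCharacteristic("Standard Deviation of Response Time", measurement_unit="ms"),
    MeasurableCharacteristic("Maximum Response Time", measurement_unit="ms"),
])
efficiency = Characteristic("Efficiency",
                            [Characteristic("Time Behavior", [response_time])])
```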
  9. The Analytical Hierarchy Process
     1. The decision model design activity (a security expert designs the decision model):
        1. Weight Assignment step: the relative importance of the characteristics is rated;
        2. Clustering step: for each measurable characteristic, the sets of values that will be considered equivalent for the aims of the evaluation are defined;
        3. Rating step: each set is associated with a rating value.
     2. The decision making activity: the decision maker applies the decision model to compare the quality of an offered service (formalized in a Quality Offer Model) against the requestor's needs (formalized in a Quality Request Model).
  10. Building the decisional model — Step 1: Weight Assignment
      For each Characteristic that is not directly measurable, the decision process designer estimates the relative Intensity of Importance of every pair of its n Sub-Characteristics, by defining an n×n comparison matrix.

      Intensity of Importance and its interpretation:
        1 — Equal Importance
        3 — Moderate Importance
        5 — Strong Importance
        7 — Very Strong Importance
        9 — Extreme Importance

      1. Build the comparison matrix (example for Response Time):
                                        Average RT   Std. Dev. of RT   Maximum RT
        Average Response Time               1              3                7
        Std. Dev. of Response Time         1/3             1                5
        Maximum Response Time              1/7            1/5               1

      By construction, m(i,j) = 1/m(j,i) for all i,j, and m(i,i) = 1 for all i.
  11. Building the decisional model — Step 1: Weight Assignment (cont.)
      2. Normalize the matrix:

           m̄(i,j) = m(i,j) / Σ_{h=1..n} m(h,j)   for all i,j

      Normalized matrix and resulting weights:
                                        Average RT   Std. Dev. of RT   Maximum RT   Weight
        Average Response Time             21/31          15/21            7/13       0.64
        Std. Dev. of Response Time         7/31           5/21            5/13       0.28
        Maximum Response Time              3/31           1/21            1/13       0.07

      Weights are assigned by comparing relative importance:

           w(i) = (1/n) Σ_{k=1..n} m̄(i,k)   for all i
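The two-step procedure above (normalize each column, then average each row) can be sketched in a few lines of Python; the comparison matrix is the Response Time example from the slides, and the function name is illustrative:

```python
def ahp_weights(m):
    """Derive priority weights from an AHP pairwise comparison matrix:
    normalize each column to sum to 1, then average each row."""
    n = len(m)
    col_sums = [sum(m[i][j] for i in range(n)) for j in range(n)]
    normalized = [[m[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in normalized]

# Pairwise comparisons for Average RT, Std. Dev. of RT, Maximum RT
comparison = [
    [1,   3,   7],
    [1/3, 1,   5],
    [1/7, 1/5, 1],
]
weights = ahp_weights(comparison)  # ≈ [0.64, 0.28, 0.07], as on the slide
```

The weights always sum to 1, since each normalized column sums to 1.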
  12. Building the decisional model — Step 2: Clustering
      We need a Utility Function to ORDER the possible values on the basis of relative (not absolute) preferences (LOCAL SECURITY LEVELS).
      In general, a utility function R assigns ordered values (of utility) to the members of a set: given two values x and y of the set, if x is preferred to y then R(x) > R(y).
      Example: for the Average Response Time characteristic, let R = Offered_value / Requested_value. The possible values are clustered into three levels:
        • R < 0.5 (very fast response);
        • 0.5 ≤ R < 1 (sufficiently fast response);
        • 1 ≤ R < 2 (quite slow response).
  13. Building the decisional model — Step 3: Rating
      After clustering the possible values, we rate the clusters according to their goodness. Ratings are assigned to clusters by comparing their relative Goodness.

      Intensity of Goodness and its interpretation:
        1 — Equivalent
        3 — Moderately better
        5 — Strongly better
        7 — Very strongly better
        9 — Extremely better

      Goodness comparison matrix and resulting ratings:
                     R<0.5   0.5≤R<1   1≤R<2   Rating
        R < 0.5        1        3        5      0.63
        0.5 ≤ R < 1   1/3       1        3      0.26
        1 ≤ R < 2     1/5      1/3       1      0.11

      This yields the relative satisfaction function S_sc(R) of a cluster:
        S_sc(R) = 0.63 if R < 0.5
                  0.26 if 0.5 ≤ R < 1
                  0.11 if 1 ≤ R < 2
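Steps 2 and 3 combine naturally in code: the cluster ratings are obtained from the goodness matrix with the same normalize-and-average procedure as the weights, and the result is a step-wise satisfaction function. A minimal sketch (function names, and the behavior for R ≥ 2, are assumptions):

```python
def ahp_priorities(m):
    """Normalize each column of a pairwise comparison matrix, then average rows."""
    n = len(m)
    col_sums = [sum(m[i][j] for i in range(n)) for j in range(n)]
    return [sum(m[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

# Goodness comparisons for the clusters R<0.5, 0.5<=R<1, 1<=R<2
goodness = [
    [1,   3,   5],
    [1/3, 1,   3],
    [1/5, 1/3, 1],
]
ratings = ahp_priorities(goodness)  # ≈ [0.63, 0.26, 0.11]

def satisfaction_response_time(offered, requested):
    """Step-wise satisfaction S_sc(R) with R = offered / requested."""
    r = offered / requested
    if r < 0.5:
        return ratings[0]
    if r < 1:
        return ratings[1]
    if r < 2:
        return ratings[2]
    return 0.0  # assumption: values outside the clustered range give no satisfaction
```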
  14. The Decision Making Activity
      The quality of different Web Services is compared by evaluating:
      1. a Satisfaction Function for each Measurable Characteristic;
      2. a Satisfaction Function for each non-Measurable Characteristic c:

           S_c(request, offer) = Σ_{sc ∈ C(c)} w_sc · S_sc(request, offer)

         i.e., the satisfaction functions of all measurable sub-characteristics sc of c are weighted and summed. For example, for the non-measurable characteristic Confidentiality, the sub-characteristics could be Encryption Algorithm, Key Length, Key Protection.
  15. The Decision Making Activity (cont.)
      3. the Overall Satisfaction Function:

           S(request, offer) = Σ_{c ∈ Characteristics} w_c · S_c(request, offer)

      The Web Service with the greatest Satisfaction Function value is chosen.
  16. Application of the evaluation model: evaluating measurable and non-measurable characteristics
      S_Integrity(Customer, Provider1) = 0.12·0.8 + 0.12·0.88 + 0.38·0.75 + 0.38·0.75 = 0.77
      S_Integrity(Customer, Provider2) = 0.12·0.8 + 0.12·0.88 + 0.38·0.19 + 0.38·0.06 = 0.30
      S_ResponseTime(Customer, Provider1) = 0.64·0.26 + 0.07·0.75 + 0.28·0.07 = 0.24
      S_ResponseTime(Customer, Provider2) = 0.64·0.11 + 0.07·0.25 + 0.28·0.25 = 0.16
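Each of the weighted sums above (and the overall satisfaction function of slide 15) reduces to a single dot-product helper. The sketch below replays the Response Time comparison with the numbers reported on slide 16; the helper's name is an assumption:

```python
def weighted_satisfaction(weights, satisfactions):
    """S_c = sum of w_sc * S_sc over sub-characteristics (slides 14-15)."""
    return sum(w * s for w, s in zip(weights, satisfactions))

# Response Time weights for (Average RT, Maximum RT, Std. Dev. of RT), and each
# provider's per-sub-characteristic satisfaction values, as on slide 16
rt_weights = [0.64, 0.07, 0.28]
provider1 = weighted_satisfaction(rt_weights, [0.26, 0.75, 0.07])  # ≈ 0.24
provider2 = weighted_satisfaction(rt_weights, [0.11, 0.25, 0.25])  # ≈ 0.16
# Provider 1 yields the higher satisfaction, matching the slides' conclusion
```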
  17. Application of the evaluation model: overall evaluation
      Finally, we evaluate the overall satisfaction function (GLOBAL SECURITY LEVEL). The first provider is chosen on the basis of the security level it provides.
  18. An idea on how to automatically enforce the evaluation: a reference architecture
      V. Casola et al., “An Architectural Model for Trusted Domains in Web Services”, Journal of Information Assurance and Security 2 (2006).
  19. Conclusions and future work
      • We are working on methodologies to automatically evaluate the quality and security provided by an Internet service on the basis of its published policies;
      • The AHP methodology allows us to address measurable and non-measurable quality and security aspects in a unifying way, and we propose an evaluation model based on it;
      • We are going to integrate this methodology into the TRUMAN architecture and compare it with existing ones.
