Poor information quality often stems from ineffective information management, including information production and delivery, and this situation poses a real risk. A holistic information risk management model has been proposed previously, but it has limitations in risk calculation and risk priority ranking because it does not consider the effectiveness of existing controls. This paper proposes a new risk assessment method that improves the model's estimate of the total impact of risks and the accuracy of its risk priority ranking by modifying the extended risk matrix approach (RMA) to take existing control effectiveness into account as a new dimension. On a real-case illustration, our approach improves the accuracy of the assessment results by 7.15% and reduces their ambiguity by 1.34.
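The core idea of folding control effectiveness into a risk matrix can be sketched roughly as follows; the scales, discount formula, and example risks are illustrative assumptions, not the paper's actual model:

```python
# Hypothetical sketch of an extended risk matrix calculation that adds a
# control-effectiveness dimension; the 1-5 scales and the linear discount
# are illustrative assumptions, not the paper's exact formula.

def residual_risk(likelihood, impact, control_effectiveness):
    """Scale inherent risk (likelihood x impact) down by how well existing
    controls mitigate it (0.0 = no control, 1.0 = fully effective)."""
    inherent = likelihood * impact
    return inherent * (1.0 - control_effectiveness)

# Rank three example risks on 1-5 likelihood/impact scales.
risks = {
    "stale data feed":    residual_risk(4, 5, 0.6),  # strong controls exist
    "manual entry error": residual_risk(5, 3, 0.2),  # weak controls
    "report delay":       residual_risk(3, 2, 0.5),
}
ranking = sorted(risks, key=risks.get, reverse=True)
print(ranking)
```

Note how the ranking changes once controls count: the heavily controlled "stale data feed" drops below "manual entry error" even though its inherent likelihood-times-impact score is higher.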
Information security risk analysis methods and research trends ahp and fuzzy ...ijcsit
Information security risk analysis is becoming an increasingly essential component of an organization's
operations. Traditional information security risk analysis relies on quantitative and qualitative methods,
each of which has advantages for information risk analysis. The analytic hierarchy process (AHP) has also
been widely used in security assessment. A future research direction may be the development and
application of soft computing techniques such as rough sets, grey sets, fuzzy systems, genetic algorithms,
support vector machines, Bayesian networks, and hybrid models, where hybrid models are built by
integrating two or more existing models. Practical advice for evaluating information security risk is
discussed; the approach combines AHP with the fuzzy comprehensive evaluation method.
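The AHP-plus-fuzzy combination described above can be sketched in a few lines; the 3x3 comparison matrix, the criteria names, and the membership grades below are invented example data, not values from the paper:

```python
import numpy as np

# Illustrative sketch: derive criterion weights with AHP (principal
# eigenvector of a pairwise comparison matrix), then apply them in a
# fuzzy comprehensive evaluation. All numbers are made-up examples.

# AHP pairwise comparison of three risk criteria
# (asset value vs. threat vs. vulnerability).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                      # weights from the principal eigenvector

# Fuzzy membership of each criterion in the ratings (low, medium, high).
R = np.array([[0.1, 0.3, 0.6],
              [0.2, 0.5, 0.3],
              [0.5, 0.4, 0.1]])
B = w @ R                            # weighted fuzzy evaluation vector
verdict = ["low", "medium", "high"][int(np.argmax(B))]
print(B, verdict)
```

With these example grades, the weighted evaluation vector leans toward the "high" rating because the heaviest-weighted criterion mostly belongs to it.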
PROVIDING A METHOD FOR DETERMINING THE INDEX OF CUSTOMER CHURN IN INDUSTRYIJITCA Journal
Customer churn is one of the most important issues in customer relationship management and marketing, especially in industries such as telecommunications, finance, and insurance. Much research has been done in this area in recent decades. In this research, the set of indicators behind the reasons customers churn is of particular importance. This study aims to provide a formula for a customer churn index, so that the reasons customers churn can be better understood. To evaluate the proposed formula, six classification methods (QUEST, C5.0, CHAID, and CART decision trees, a Bayesian network, and a neural network) are applied alongside the individual indicators.
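The evaluation step can be illustrated with one of the six listed classifiers; CART is available in scikit-learn as `DecisionTreeClassifier` (QUEST and CHAID are proprietary and not shown). The synthetic features below stand in for real churn indicators and are purely illustrative:

```python
# Sketch: score churn indicators with a CART decision tree and
# cross-validation. The two features (complaint calls, tenure) and the
# synthetic churn rule are invented stand-ins for real indicators.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
calls = rng.poisson(3, n)            # complaint calls per month
tenure = rng.integers(1, 60, n)      # months as a customer
X = np.column_stack([calls, tenure])
# Churn is more likely with many complaints and short tenure.
churn = (calls * 2 - tenure / 12 + rng.normal(0, 1, n) > 4).astype(int)

cart = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(cart, X, churn, cv=5)
print(round(scores.mean(), 3))
```

A churn-index formula could then be compared against the classifiers by checking whether it ranks the same customers as high-risk.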
One of the most important issues organizations have to deal with is the timely identification and detection of risk factors, with the aim of preventing incidents. The tendency of managers and engineers to minimize risk factors in a service, process, or design system obliges them to analyze the reliability of such systems in order to minimize risks and identify probable errors. To this end, a more accurate Failure Mode and Effects Analysis (FMEA) is adopted based on fuzzy logic and fuzzy numbers, and Fuzzy TOPSIS is used to identify, rank, and prioritize error and risk factors. This paper uses FMEA as a risk identification tool; the Fuzzy Risk Priority Number (FRPN) is then calculated, and the triangular fuzzy numbers are prioritized through Fuzzy TOPSIS. A case study is presented to give a better understanding of these concepts.
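The FRPN step can be sketched with triangular fuzzy numbers; the component-wise product approximation and centroid defuzzification below are one common convention, and the failure modes and ratings are invented examples (a full treatment would rank via Fuzzy TOPSIS rather than centroids):

```python
# Minimal FRPN sketch: severity (S), occurrence (O) and detection (D) are
# triangular fuzzy numbers (a, b, c); their product is approximated
# component-wise and compared by centroid defuzzification.
# The failure modes and ratings are made-up examples.

def tfn_mul(x, y):
    """Approximate product of two triangular fuzzy numbers."""
    return tuple(xi * yi for xi, yi in zip(x, y))

def centroid(t):
    """Centroid (defuzzified value) of a triangular fuzzy number."""
    return sum(t) / 3.0

failure_modes = {
    "sensor drift": [(6, 7, 8), (4, 5, 6), (3, 4, 5)],    # S, O, D
    "seal leakage": [(8, 9, 10), (2, 3, 4), (5, 6, 7)],
}
frpn = {name: centroid(tfn_mul(tfn_mul(s, o), d))
        for name, (s, o, d) in failure_modes.items()}
worst = max(frpn, key=frpn.get)
print(frpn, worst)
```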
A comprehensive study on disease risk predictions in machine learning IJECEIAES
Over recent years, multiple disease risk prediction models have been developed. These models use various patient characteristics to estimate the probability of outcomes over a certain period of time and hold the potential to improve decision making and individualize care. Discovering hidden patterns and interactions in medical databases, along with ongoing evaluation of disease prediction models, has become crucial; traditional clinical findings require many trials, which can complicate disease prediction. A comprehensive study of the different strategies used to predict disease is presented in this paper. Applying these techniques to healthcare data has improved risk prediction models for finding the patients who would benefit from disease management programs, reducing hospital readmissions and healthcare costs, but the results of these endeavors have been mixed.
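A minimal instance of such a risk model is logistic regression over a few patient characteristics; the synthetic cohort and the two features below are invented for demonstration, and no clinical data are used:

```python
# Illustrative disease risk prediction sketch: logistic regression over
# age and BMI on a synthetic cohort. The generating rule and all numbers
# are assumptions for demonstration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
age = rng.integers(30, 80, n)
bmi = rng.normal(27, 4, n)
# Synthetic outcome: probability of disease rises with age and BMI.
p = 1 / (1 + np.exp(-(0.05 * (age - 55) + 0.15 * (bmi - 27))))
y = (rng.random(n) < p).astype(int)

model = LogisticRegression(max_iter=1000).fit(np.column_stack([age, bmi]), y)
# Predicted risk for a hypothetical 70-year-old patient with BMI 32:
risk = model.predict_proba([[70, 32]])[0, 1]
print(round(risk, 2))
```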
A RISK-AWARE BUSINESS PROCESS MANAGEMENT REFERENCE MODEL AND IT...IJCSES Journal
Due to the environmental pressures on organizations, the demand for Business Process Management
(BPM) automation suites has increased. This has led to a growing need for managing process-related
risks, so the management of risks in business processes has been the subject of much research in the
past few years. However, most of this research has focused on only one or two stages of the BPM life
cycle. This paper aims to provide a reference model for Risk-Aware BPM that addresses all stages of
the BPM life cycle, and lists some current techniques for implementing this model. Additionally, a case
study of a business process in an Egyptian university is introduced in order to apply the model in a
real-world environment, and the results are analyzed and discussed.
Instance Selection and Optimization of Neural NetworksITIIIndustries
Credit scoring is an important tool in financial institutions that can be used in credit-granting decisions. Credit applications are scored by credit scoring models: those with high scores are treated as "good", while those with low scores are regarded as "bad". As data mining techniques develop, automatic credit scoring systems are welcomed for their high efficiency and objective judgments. Many machine learning algorithms have been applied to training credit scoring models, and ANNs are among those with good performance. This paper presents a higher-accuracy credit scoring model based on MLP neural networks trained with the backpropagation algorithm. Our work focuses on enhancing credit scoring models in three aspects: optimizing the data distribution in datasets using a new method called Average Random Choosing; comparing the effects of different training-validation-test split sizes; and finding the most suitable number of hidden units. Another contribution of this paper is summarizing the tendency of model scoring accuracy as the number of hidden units increases. The experimental results show that our methods can achieve high credit scoring accuracy on imbalanced datasets. Thus, credit-granting decisions can be made by data mining methods using MLP neural networks.
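The baseline of such a model (before the paper's Average Random Choosing and hidden-unit tuning) can be sketched with scikit-learn's `MLPClassifier`; the imbalanced synthetic applications below are illustrative stand-ins for real credit data:

```python
# Sketch of an MLP credit scoring baseline trained with backpropagation.
# The two features (income, debt ratio) and the ~80/20 imbalanced label
# rule are invented for demonstration.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 1000
income = rng.normal(50, 15, n)       # thousands per year
debt_ratio = rng.uniform(0, 1, n)
# Imbalanced labels: mostly "good" (1); bad credit tied to high debt ratio.
good = ((income / 50 - debt_ratio + rng.normal(0, 0.3, n)) > 0).astype(int)

X = np.column_stack([income, debt_ratio])
X_tr, X_te, y_tr, y_te = train_test_split(X, good, test_size=0.3,
                                          random_state=0)
mlp = make_pipeline(
    StandardScaler(),                # scale features before the MLP
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=0),
)
mlp.fit(X_tr, y_tr)
acc = mlp.score(X_te, y_te)
print(round(acc, 3))
```

Varying `hidden_layer_sizes` and the train/validation/test split on such a harness mirrors two of the three aspects the abstract describes.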
Adding Analytics to your Cybersecurity Toolkit with CompTIA Cybersecurity Ana...CompTIA
In this document:
- Adding Analytics to your Cybersecurity Toolkit with CompTIA Cybersecurity Analyst (CSA+)
- Measuring CompTIA CSA+ Difficulty
- Why Hybrid Testing Approaches Work Best
- Mapping the NICE Cybersecurity Workforce Framework
Corporate bankruptcy prediction using Deep learning techniquesShantanu Deshpande
Corporate bankruptcy prediction using recurrent neural networks – the aim is to build a recurrent neural network-based model to predict whether a company will become bankrupt, using financial ratios of Polish companies.
Methodologies & Tools: CRISP-DM, SMOTE-ENN, GA Algorithm, LSTM network (type of RNN)
Information Technology (IT) Security Framework for Kenyan Small and Medium En...CSCJournals
To address challenges faced by Small and Medium Enterprises (SMEs), especially in Kenya, this paper aims to establish an Information Technology (IT) security framework that allows SMEs to implement cost-effective security measures. In particular, the paper discusses IT security requirements and appropriate metrics. There is evidence from the survey to suggest that, despite having some IT security measures in place, Kenyan SMEs still face serious IT security challenges. In light of these challenges, this work recommends a framework intended, among other things, to provide metrics for evaluating the effectiveness of implemented security measures. The framework is likely to assist SME stakeholders in measuring the effectiveness of their security-enhancing mechanisms.
Application Development Risk Assessment Model Based on Bayesian NetworkTELKOMNIKA JOURNAL
This paper describes a new risk assessment model for application development and its
implementation. The model is developed using a Bayesian network and Boehm's software risk principles.
The Bayesian network is created after mapping the top twenty risks in software projects onto an
interrelationship digraph of risk area categories. The probability of risk on the network is analyzed and
validated using both numerical simulation and subjective probabilities from several experts in the field
and a team of application developers. After the Bayesian network model is obtained, risk exposure is
calculated using Boehm's risk principles. Finally, the implementation of the proposed model in a
government institution is shown as a real case illustration.
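The final step of the pipeline above rests on Boehm's classic definition of risk exposure, RE = P(risk) x L(loss). A minimal sketch, using illustrative probabilities of the kind a Bayesian network would produce (the risk names and numbers are invented examples):

```python
# Boehm-style risk exposure: RE = probability of the risk occurring
# times the loss if it does. Probabilities and losses are invented.

def risk_exposure(probability, loss):
    return probability * loss

risks = {
    "requirements volatility": risk_exposure(0.40, 90),   # loss in person-days
    "key staff turnover":      risk_exposure(0.15, 200),
    "unrealistic schedule":    risk_exposure(0.60, 40),
}
# Rank risks by exposure, highest first.
for name in sorted(risks, key=risks.get, reverse=True):
    print(f"{name}: {risks[name]:.1f}")
```

Note that a high-probability risk with modest loss can rank below a moderate-probability risk with large loss, which is exactly why exposure rather than probability alone drives the ranking.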
Abstract
Big data plays a serious role in business, enabling better predictions over business information collected from the real world. Finance is a new sector where big data technologies such as Hadoop and NoSQL are making their mark in analysts' predictions from financial data. This is especially interesting for stock market decisions, where better predictions can mean greater profits. Such stock market analysis requires both day-to-day and historical data for the specific market. Various techniques are used to analyze unstructured information such as stock market reviews (day-to-day data) and historical time series of financial data, respectively. This paper discusses the methods used for analyzing both kinds of data.
Keywords: Big data, prediction, finance, stock market, business intelligence
Review on Algorithmic and Non Algorithmic Software Cost Estimation Techniquesijtsrd
Effective software cost estimation is among the most challenging and important activities in software development. Developers want a simple and accurate method of effort estimation, but estimating cost before work starts is a prediction, and predictions are not always accurate. Software effort estimation is a critical task in software engineering, and a suitable estimation technique is crucial for controlling quality and efficiency. This paper reviews the available software effort estimation methods, focusing mainly on algorithmic and non-algorithmic models; these existing methods are illustrated and their aspects discussed. No single technique is best for all situations, so a careful comparison of the results of several approaches is most likely to produce realistic estimates. The paper provides a detailed overview of existing software cost estimation models and techniques, presents the strengths and weaknesses of various methods, and examines some of the relevant reasons for inaccurate estimation. Pa Pa Win | War War Myint | Hlaing Phyu Phyu Mon | Seint Wint Thu, "Review on Algorithmic and Non-Algorithmic Software Cost Estimation Techniques", International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN 2456-6470, Volume-3, Issue-5, August 2019. URL: https://www.ijtsrd.com/papers/ijtsrd26511.pdf Paper URL: https://www.ijtsrd.com/engineering/-/26511/review-on-algorithmic-and-non-algorithmic-software-cost-estimation-techniques/pa-pa-win
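One concrete member of the algorithmic family such reviews cover is basic COCOMO, where effort in person-months is a power function of size in thousands of lines of code (KLOC). The coefficients below are the published basic-COCOMO values for an "organic" (small, in-house) project; the 32 KLOC figure is just an example input:

```python
# Basic COCOMO effort estimate: effort = a * KLOC**b person-months.
# a = 2.4, b = 1.05 are the published coefficients for "organic" projects.

def cocomo_effort(kloc, a=2.4, b=1.05):
    return a * kloc ** b

# Estimated person-months for a hypothetical 32 KLOC organic project:
print(round(cocomo_effort(32), 1))
```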
Automatically Estimating Software Effort and Cost using Computing Intelligenc...cscpconf
In the IT industry, precisely estimating each software project's effort, development cost, and
schedule counts for much to a software company, so precise estimation of manpower is becoming more
important. In the past, IT companies estimated manpower effort through human experts using statistical
methods, but the outcomes often left the management level unsatisfied. Recently, whether computing
intelligence techniques can do better in this field has become an interesting topic. This research uses
computing intelligence techniques to estimate software project effort: the Pearson product-moment
correlation coefficient and one-way ANOVA to select key factors, and the K-Means clustering algorithm
to cluster projects. The experimental results show that estimating software project effort with computing
intelligence techniques yields more precise and more effective estimates than traditional estimation by
human experts.
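The clustering step can be sketched as estimation by analogy: cluster past projects with K-Means, then estimate a new project's effort as the mean effort of its cluster. The project features (size, team experience) and effort figures below are invented examples:

```python
# Sketch of K-Means-based effort estimation by analogy. Two synthetic
# groups of past projects (small/experienced vs. large/inexperienced);
# all features and efforts are invented for demonstration.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
size = np.concatenate([rng.normal(10, 2, 30), rng.normal(50, 5, 30)])  # KLOC
exp_ = np.concatenate([rng.normal(5, 1, 30), rng.normal(2, 0.5, 30)])  # years
effort = size * 8 + rng.normal(0, 10, 60)                # person-days

X = np.column_stack([size, exp_])
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Estimate a new 48 KLOC project staffed by a less experienced team.
new_project = np.array([[48.0, 2.5]])
cluster = km.predict(new_project)[0]
estimate = effort[km.labels_ == cluster].mean()
print(round(estimate, 1))
```

The factor-selection step (Pearson correlation, one-way ANOVA) would run before this, choosing which columns enter `X`.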
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
PORM: Predictive Optimization of Risk Management to Control Uncertainty Probl...IJECEIAES
Irrespective of the various research approaches to risk management, developing a precise risk management model is computationally challenging owing to the critically vague definition of where problems originate. This work introduces a model called PORM, i.e., Predictive Optimization of Risk Management, from a software engineering perspective. The significant contribution of PORM is to offer a reliable computation of risk analysis by considering a generalized practical scenario of software development practices in the Information Technology (IT) industry. The proposed PORM system is also designed to give better risk factor assessment with the aid of a machine learning approach, without heavy reliance on iteration. The study outcome shows that the PORM system offers computationally cost-effective analysis of risk factors, assessed with respect to different quality standards of the object-oriented systems involved in software projects.
A NEW MATHEMATICAL RISK MANAGEMENT MODEL FOR AGILE SOFTWARE DEVELOPMENT METHO...ijseajournal
This paper proposes a new mathematical model for estimating the cost of explicit Agile software
development risk management together with its impact benefits (savings/profits). This is necessitated by
the fact that, despite the growing need to manage risks explicitly in medium-to-large-scale agile software
development projects, there are no known ways to estimate explicit risk management costs and benefits.
With the proposed model, explicit risk management procedures and risk management estimation
techniques are made known to stakeholders, who can then make the right decisions about risk
management costs and impacts, as well as about when to use implicit or explicit risk management. The
proposed system proves feasible and dependable, and is evidently capable of enhancing agile methods
for software projects of all sizes while maintaining the swiftness of the agile process.
The use of Monte Carlo simulation in quantitative risk assessment of IT projectsEswar Publications
Estimating the likely time and cost to complete a project, taking into account the likelihood of occurrence and severity of risks, is one of the main concerns occupying organizational project managers. At the same time, the diversity and sensitivity of information technology risks mean that proper risk management influences these projects more strongly than other concerns. Therefore, to describe the potential consequences and probability of occurrence of incidents accurately, IT project managers benefit from quantitative assessment. One of the most effective tools for quantitative assessment and probabilistic forecasting of risks is Monte Carlo simulation, which, by generating random numbers, calculates the individual components of a project and determines the impact of each of them on the project. In this study, we offer a functional model of the impact of risks on the performance indicators of an information technology project, and propose an appropriate time and cost for completing the project under study through a case study using the Primavera Risk Analysis software for Monte Carlo simulation.
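The simulation idea is straightforward to sketch without the commercial tool: sample each task's duration from a three-point (triangular) distribution many times and read confidence levels off the simulated totals. The three tasks and their day estimates below are illustrative assumptions:

```python
# Monte Carlo sketch of IT project schedule risk: each task duration is
# sampled from a triangular (optimistic, most likely, pessimistic)
# distribution; the totals give probabilistic completion estimates.
# The task list and all day figures are invented examples.
import numpy as np

rng = np.random.default_rng(4)
tasks = [(10, 15, 30), (20, 25, 45), (5, 8, 20)]   # (min, mode, max) days
n_sim = 10_000
totals = sum(rng.triangular(lo, mode, hi, n_sim) for lo, mode, hi in tasks)

mean_days = totals.mean()
p80 = np.percentile(totals, 80)   # duration committable with 80% confidence
print(round(mean_days, 1), round(p80, 1))
```

The gap between the mean and the 80th percentile is the schedule contingency the simulation recommends; the same loop applied to cost estimates yields a cost contingency.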
future internet
Article
ERMOCTAVE: A Risk Management Framework for IT
Systems Which Adopt Cloud Computing
Masky Mackita 1, Soo-Young Shin 2 and Tae-Young Choe 3,*
1 ING Bank, B-1040 Brussels, Belgium; [email protected]
2 Department of IT Convergence Engineering, Kumoh National Institute of Technology, Gumi 39177, Korea;
[email protected]
3 Department of Computer Engineering, Kumoh National Institute of Technology, Gumi 39177, Korea
* Correspondence: [email protected]; Tel.: +82-54-478-7526
Received: 22 June 2019; Accepted: 3 September 2019; Published: 10 September 2019
Abstract: Many companies are adopting cloud computing technology because moving to the cloud
has an array of benefits. During decision-making processes for adopting cloud computing,
the importance of risk management is progressively recognized. However, traditional risk management
methods cannot be applied directly to cloud computing when data are transmitted and processed by
external providers. When they are applied directly, risk management processes can fail by ignoring
the distributed nature of cloud computing, leaving numerous risks unidentified. To address this gap,
this paper introduces a new risk management method, Enterprise Risk Management
for Operationally Critical Threat, Asset, and Vulnerability Evaluation (ERMOCTAVE), which combines
Enterprise Risk Management and Operationally Critical Threat, Asset, and Vulnerability Evaluation for
mitigating risks that can arise with cloud computing. ERMOCTAVE composes the two risk management
methods by combining each component of one with corresponding processes of the other for a comprehensive perception of risks.
In order to explain ERMOCTAVE in detail, a case study scenario is presented where an Internet seller
migrates some modules to Microsoft Azure cloud. The functionality comparison with ENISA and
Microsoft cloud risk assessment shows that ERMOCTAVE has additional features, such as key objectives
and strategies, critical assets, and risk measurement criteria.
Keywords: risk management; ERM; OCTAVE; cloud computing; Microsoft Azure
1. Introduction
Cloud computing is a technology that uses virtualized resources to deliver IT services through the
Internet. It can also be defined as a model that allows network access to a pool of computing resources
such as servers, applications, storage, and services, which can be quickly offered by service providers [1].
One of the properties of the cloud is its distributed nature [2]. Data in cloud environments have become
gradually distributed, moving from a centralized model to a distributed model. That distributed nature
causes cloud computing actors to face problems such as loss of data control, difficulties in demonstrating
compliance, and additional legal risks when data migrate from one legal jurisdiction to another. An example
is Salesforce.com, which suffered a huge outage, locking more than 900,000 subscribers out of important
resources needed for business transactions.
Contents
1 Introduction
1.1 Project Description
1.2 Purpose
1.3 Scope
1.4 Compliance Laws and Regulations
2 Risk Management Organization
2.1 Responsibilities
2.1.1 Risk Manager
2.1.3 Department Heads
2.1.4 Staff
3 Risk Management Strategy
3.1 Process
3.2 Risk Identification
3.3 Risk Assessment
3.3.1 Qualitative Risk Assessment
3.4 Risk Response Planning
3.4.1 Proposed Schedule
References
1 Introduction

1.1 Project Description

Health Network Inc provides patients and clinics with information technology solutions that help them make payments, communicate, and make decisions through the three products it offers: HNetExchange, HNetPay, and HNetConnect. The company has branches in Minneapolis, Portland, and Arlington, USA, with three production data centers. These data centers host production servers and various hardware, such as company-issued laptops and mobile devices.

1.2 Purpose

Health Network Inc is exposed to various risks because it uses an information technology infrastructure to run its business operations. This poses a threat to Health Network Inc's business continuity, and to manage this risk the company has adopted effective risk management strategies. The purpose of this risk management plan is to review the company's outdated risk management plan and deal with the threats identified during that review. These threats include loss of company data and information, loss of customers, insider threats, changes in regulations, and internet threats.

Updating and developing a new risk management plan will prepare Health Network Inc for how unexpected events and their consequences will impact the business. It will enable the company to develop proper risk management strategies to avoid financial loss, intellectual property loss, reputation damage, damage to business operations, harm to staff, and legal penalties (Yildirim, 2017).

In addition, risk management is important because it enables the company to develop an effective risk management strategy for information security. It will increase customer and shareholder confidence in the company and increase Health Network Inc's competitiveness (Yildirim, 2017). Furthermore, the risk management plan will enable the company to find measures that reduce costs and increase staff knowledge of information security, so that staff become more conscious and responsible (Yildirim, 2017).

1.3 Scope

The risk management policy will cover Health Network Inc's assets, equipment, and information technology infrastructure. All referenced material will be preserved with the previous documents to ensure document control. The risk manager will direct all departments at Health Network Inc, patient care professionals, and services to collaborate.

1.4 Compliance Laws and Regulations
Cyber risks such as malware, hacking, and viruses affect business organizations, and therefore Health ...
Efficacy of OCTAVE Risk Assessment Methodology in Information Systems Organiz...Editor IJCATR
With the increasing use of computers in business, information security has become a key issue in organizations. Risk assessment in organizations is vital in order to identify threats and take appropriate measures. Various risk assessment methodologies exist, which organizations use depending on their type and needs. In this research, the OCTAVE methodology was used, following a comparative study of various methodologies, due to its flexibility and simplicity. The methodology was implemented in a financial institution, and the results regarding its efficacy are discussed.
Attack graph based risk assessment and optimisation approachIJNSA Journal
Attack graphs are models that offer significant capabilities to analyse security in network systems. An attack graph allows the representation of vulnerabilities, exploits and conditions for each attack in a single unifying model. This paper proposes a methodology to explore the graph using a genetic algorithm (GA). Each attack path is considered as an independent attack scenario from the source of attack to the target. Many such paths form the individuals in the evolutionary GA solution. The population-based strategy of a GA provides a natural way of exploring a large number of possible attack paths to find the paths that are most important. Thus, unlike many other optimisation solutions, a range of solutions can be presented to a user of the methodology.
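The idea of treating each attack path as an individual in an evolutionary search can be sketched as follows. The toy attack graph, its severity scores, and the simplified selection loop (fresh random walks stand in for crossover and mutation) are illustrative assumptions, not the paper's actual algorithm.

```python
import random

# Toy attack graph: node -> list of (next_node, exploit_severity).
# Nodes and severities are invented for illustration.
GRAPH = {
    "internet": [("webserver", 5), ("vpn", 3)],
    "webserver": [("appserver", 7), ("db", 4)],
    "vpn": [("appserver", 6)],
    "appserver": [("db", 8)],
    "db": [],  # attack target
}

def random_path(rng, src="internet", dst="db"):
    """Random walk from attack source to target; one attack scenario."""
    path, node = [src], src
    while node != dst:
        nxt = GRAPH[node]
        if not nxt:          # dead end short of the target
            return None
        node = rng.choice(nxt)[0]
        path.append(node)
    return path

def fitness(path):
    """Total exploit severity along a path (higher = more critical)."""
    return sum(dict(GRAPH[a])[b] for a, b in zip(path, path[1:]))

def evolve(pop_size=20, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [p for p in (random_path(rng) for _ in range(pop_size)) if p]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]               # selection
        offspring = [random_path(rng) for _ in survivors]
        pop = survivors + [c for c in offspring if c]
    return max(pop, key=fitness)

best = evolve()
print("most critical path:", " -> ".join(best), "| severity:", fitness(best))
```

Because the whole population is kept, the final generation offers a ranked range of attack scenarios rather than a single answer, mirroring the paper's point about population-based search.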
This presentation talks about how risks in a project are analyzed and quantified. The presentation also discusses benefits of quantification of risks and the various tools at our disposal to manage risks effectively through quantification.
Accelerating the Development of Medical Devices: The Value of Proactive Risk ...Cognizant
Identifying risks and working to mitigate them during the early stages of product development is critical for medical-device manufacturers worldwide. By focusing on four strategies - risk limitation, risk transfer, risk avoidance and risk acceptance, companies can evaluate risk effectively, take appropriate actions, and reduce the time and costs associated with New Product Development (NPD).
Similar to Risk assessment of information production using extended risk matrix approach (20)
Amazon products reviews classification based on machine learning, deep learni...TELKOMNIKA JOURNAL
In recent times, the trend of online shopping through e-commerce stores and websites has grown to a huge extent. Whenever a product is purchased on an e-commerce platform, people leave their reviews about the product. These reviews are very helpful for store owners and product manufacturers for the betterment of their work processes as well as product quality. An automated system is proposed in this work that operates on two datasets, D1 and D2, obtained from Amazon. After certain preprocessing steps, N-gram and word embedding-based features are extracted using term frequency-inverse document frequency (TF-IDF) and bag of words (BoW), and global vectors (GloVe) and Word2vec, respectively. Four machine learning (ML) models, support vector machine (SVM), random forest (RF), logistic regression (LR), and multinomial Naïve Bayes (MNB), two deep learning (DL) models, convolutional neural network (CNN) and long short-term memory (LSTM), and a standalone bidirectional encoder representations from transformers (BERT) model are used to classify reviews as either positive or negative. The results obtained by the standard ML and DL models and BERT are evaluated using certain performance evaluation measures. BERT turns out to be the best-performing model in the case of D1, with an accuracy of 90% on features derived by word embedding models, while the CNN provides the best accuracy of 97% on word embedding features in the case of D2. The proposed model shows better overall performance on D2 as compared to D1.
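As a small illustration of the TF-IDF feature extraction mentioned above, the sketch below computes TF-IDF weights from scratch for a few toy review snippets; the reviews and the simple whitespace tokenization are invented for the example.

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF weights for a list of tokenized documents."""
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({t: (tf[t] / len(doc)) * math.log(n / df[t])
                        for t in tf})
    return weights

# Invented toy reviews (already lowercased and tokenized).
reviews = [
    "great product fast delivery".split(),
    "terrible product broke fast".split(),
    "great quality great price".split(),
]
w = tfidf(reviews)
# "delivery" occurs in one document, "product" in two, so in the first
# review "delivery" carries more weight than "product".
print(w[0]["delivery"], w[0]["product"])
```

Weight vectors of this kind would then feed a classifier such as the SVM or LR models the abstract lists.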
Design, simulation, and analysis of microstrip patch antenna for wireless app...TELKOMNIKA JOURNAL
In this study, a microstrip patch antenna that works at 3.6 GHz was built and tested to evaluate its performance. Rogers RT/Duroid 5880 has been used as the substrate material, with a dielectric permittivity of 2.2 and a thickness of 0.3451 mm; it serves as the base for the examined antenna. The computer simulation technology (CST) studio suite is utilized to model the recommended antenna design. The goal of this study was to obtain a wider transmission bandwidth, a lower voltage standing wave ratio (VSWR), and a lower return loss, but the main goal was to obtain higher gain, directivity, and efficiency. After simulation, the return loss, gain, directivity, bandwidth, and efficiency of the presented antenna are found to be -17.626 dB, 9.671 dBi, 9.924 dBi, 0.2 GHz, and 97.45%, respectively. Besides, the simulation revealed that the side-lobe level at phi, at -28.8 dB, was much better than those of earlier works. Thus, it makes a solid contender for wireless technology and more robust communication.
Design and simulation an optimal enhanced PI controller for congestion avoida...TELKOMNIKA JOURNAL
In this paper, the snake optimization algorithm (SOA) is used to find the optimal gains of an enhanced controller for controlling the congestion problem in computer networks. M-file and Simulink platforms are adopted to evaluate the response of the active queue management (AQM) system, and a comparison with two classical controllers is made; all tuned controller gains are obtained using the SOA method, and the fitness function chosen to monitor system performance is the integral time absolute error (ITAE). Transient analysis and robustness analysis are used to show the proposed controller's performance. Two robustness tests are applied to the AQM system: one varies the queue size value over different periods, and the other changes the number of transmission control protocol (TCP) sessions by ±20% from the original value. The simulation results reflect stable and robust behavior, and the best performance clearly appears in achieving the desired queue size without any noise or transmission problems.
Improving the detection of intrusion in vehicular ad-hoc networks with modifi...TELKOMNIKA JOURNAL
Vehicular ad-hoc networks (VANETs) are wireless-equipped vehicles that form networks along the road. The security of this network has been a major challenge. The identity-based cryptosystem (IBC) previously used to secure the networks suffers from membership authentication security features. This paper focuses on improving the detection of intruders in VANETs with a modified identity-based cryptosystem (MIBC). The MIBC is developed using a non-singular elliptic curve with Lagrange interpolation. The public key of vehicles and roadside units on the network are derived from number plates and location identification numbers, respectively. Pseudo-identities are used to mask the real identity of users to preserve their privacy. The membership authentication mechanism ensures that only valid and authenticated members of the network are allowed to join the network. The performance of the MIBC is evaluated using intrusion detection ratio (IDR) and computation time (CT) and then validated with the existing IBC. The result obtained shows that the MIBC recorded an IDR of 99.3% against 94.3% obtained for the existing identity-based cryptosystem (EIBC) for 140 unregistered vehicles attempting to intrude on the network. The MIBC shows lower CT values of 1.17 ms against 1.70 ms for EIBC. The MIBC can be used to improve the security of VANETs.
Conceptual model of internet banking adoption with perceived risk and trust f...TELKOMNIKA JOURNAL
Understanding the primary factors of internet banking (IB) acceptance is critical for both banks and users; nevertheless, our knowledge of the role of users’ perceived risk and trust in IB adoption is limited. As a result, we develop a conceptual model by incorporating perceived risk and trust into the technology acceptance model (TAM) theory toward the IB. The proper research emphasized that the most essential component in explaining IB adoption behavior is behavioral intention to use IB adoption. TAM is helpful for figuring out how elements that affect IB adoption are connected to one another. According to previous literature on IB and the use of such technology in Iraq, one has to choose a theoretical foundation that may justify the acceptance of IB from the customer’s perspective. The conceptual model was therefore constructed using the TAM as a foundation. Furthermore, perceived risk and trust were added to the TAM dimensions as external factors. The key objective of this work was to extend the TAM to construct a conceptual model for IB adoption and to get sufficient theoretical support from the existing literature for the essential elements and their relationships in order to unearth new insights about factors responsible for IB adoption.
Efficient combined fuzzy logic and LMS algorithm for smart antennaTELKOMNIKA JOURNAL
Smart antennas are broadly used in wireless communication. The least mean square (LMS) algorithm is a procedure concerned with controlling the smart antenna pattern to accommodate specified requirements, such as steering the beam toward the desired signal and placing deep nulls in the directions of unwanted signals. The conventional LMS (C-LMS) has some drawbacks, such as slow convergence speed and high steady-state fluctuation error. To overcome these shortcomings, the present paper adopts an adaptive fuzzy control step size least mean square (FC-LMS) algorithm to adjust its step size. Computer simulation outcomes illustrate that the given model has a fast convergence rate as well as a low mean square steady-state error.
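A minimal sketch of the conventional LMS update the abstract refers to is shown below, applied to a toy system-identification task; the filter length, step size, and test signal are illustrative assumptions, and the paper's FC-LMS additionally adapts the step size with a fuzzy controller, which is not reproduced here.

```python
import math

def lms(x, d, n_taps=4, mu=0.05):
    """Conventional LMS adaptive filter with a fixed step size mu.

    x: input samples, d: desired samples.
    Returns the final weights and the squared error per iteration.
    """
    w = [0.0] * n_taps
    errors = []
    for i in range(n_taps, len(x)):
        window = x[i - n_taps:i][::-1]       # most recent sample first
        y = sum(wk * xk for wk, xk in zip(w, window))
        e = d[i] - y
        w = [wk + mu * e * xk for wk, xk in zip(w, window)]  # LMS update
        errors.append(e * e)
    return w, errors

# Identify a known 4-tap system from a noiseless two-tone input.
true_w = [0.5, -0.3, 0.2, 0.1]
x = [math.sin(0.3 * i) + math.cos(1.1 * i) for i in range(5000)]
d = [0.0] * len(x)
for i in range(4, len(x)):
    win = x[i - 4:i][::-1]
    d[i] = sum(tw * xk for tw, xk in zip(true_w, win))
w, errs = lms(x, d)
print(f"final squared error: {errs[-1]:.2e}")
```

The fixed-mu trade-off visible here (smaller mu converges slowly, larger mu fluctuates more at steady state) is exactly what the fuzzy-controlled step size is meant to relax.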
Design and implementation of a LoRa-based system for warning of forest fireTELKOMNIKA JOURNAL
This paper presents the design and implementation of a forest fire monitoring and warning system based on long range (LoRa) technology, a novel ultra-low power consumption and long-range wireless communication technology for remote sensing applications. The proposed system includes a wireless sensor network that records environmental parameters such as temperature, humidity, wind speed, and carbon dioxide (CO2) concentration in the air, as well as taking infrared photos. The data collected at each sensor node will be transmitted to the gateway via LoRa wireless transmission. Data will be collected, processed, and uploaded to a cloud database at the gateway. An Android smartphone application that allows anyone to easily view the recorded data has been developed. When a fire is detected, the system will sound a siren and send a warning message to the responsible personnel, instructing them to take appropriate action. Experiments in Tram Chim Park, Vietnam, have been conducted to verify and evaluate the operation of the system.
Wavelet-based sensing technique in cognitive radio networkTELKOMNIKA JOURNAL
Cognitive radio is a smart radio that can change its transmitter parameter based on interaction with the environment in which it operates. The demand for frequency spectrum is growing due to a big data issue as many Internet of Things (IoT) devices are in the network. Based on previous research, most frequency spectrum was used, but some spectrums were not used, called spectrum hole. Energy detection is one of the spectrum sensing methods that has been frequently used since it is easy to use and does not require license users to have any prior signal understanding. But this technique is incapable of detecting at low signal-to-noise ratio (SNR) levels. Therefore, the wavelet-based sensing is proposed to overcome this issue and detect spectrum holes. The main objective of this work is to evaluate the performance of wavelet-based sensing and compare it with the energy detection technique. The findings show that the percentage of detection in wavelet-based sensing is 83% higher than energy detection performance. This result indicates that the wavelet-based sensing has higher precision in detection and the interference towards primary user can be decreased.
A novel compact dual-band bandstop filter with enhanced rejection bandsTELKOMNIKA JOURNAL
In this paper, we present the design of a new wide dual-band bandstop filter (DBBSF) using nonuniform transmission lines. The method used to design this filter is to replace conventional uniform transmission lines with nonuniform lines governed by a truncated Fourier series. Based on how impedances are profiled in the proposed DBBSF structure, the fractional bandwidths of the two 10 dB-down rejection bands are widened to 39.72% and 52.63%, respectively, and the physical size has been reduced compared to that of the filter with the uniform transmission lines. The results of the electromagnetic (EM) simulation support the obtained analytical response and show an improved frequency behavior.
Deep learning approach to DDoS attack with imbalanced data at the application...TELKOMNIKA JOURNAL
A distributed denial of service (DDoS) attack is where one or more computers attack or target a server computer by flooding internet traffic to the server. As a result, the server cannot be accessed by legitimate users. This attack causes enormous losses for a company because it can reduce the level of user trust and damage the company's reputation, losing customers due to downtime. One of the services at the application layer that can be accessed by users is a web-based lightweight directory access protocol (LDAP) service, which can provide safe and easy access to directory applications. We used a deep learning approach to detect DDoS attacks on the CICDDoS 2019 dataset on a complex computer network at the application layer to get fast and accurate results when dealing with unbalanced data. Based on the results obtained, it is observed that DDoS attack detection using a deep learning approach on imbalanced data performs better for binary classes when implemented with the synthetic minority oversampling technique (SMOTE). On the other hand, the proposed deep learning approach performs better for detecting DDoS attacks in multiclass settings when implemented with the adaptive synthetic (ADASYN) method.
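A minimal sketch of the SMOTE idea used for the binary case can look like this; the 2-D minority points and parameters are invented for illustration, and a production implementation (e.g. imbalanced-learn) handles many details omitted here.

```python
import math
import random

def smote(minority, n_new, k=3, seed=7):
    """Create synthetic minority samples by interpolating between each
    chosen sample and one of its k nearest minority-class neighbours."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not a),
                            key=lambda p: math.dist(a, p))[:k]
        b = rng.choice(neighbours)
        gap = rng.random()               # interpolation factor in [0, 1)
        synthetic.append(tuple(ai + gap * (bi - ai)
                               for ai, bi in zip(a, b)))
    return synthetic

# Invented 2-D minority-class points (e.g. features of rare attack flows).
minority = [(0.0, 0.0), (1.0, 0.2), (0.4, 0.9), (0.8, 0.7)]
new_pts = smote(minority, n_new=6)
print(len(new_pts), "synthetic samples generated")
```

Because new points lie on segments between existing minority samples, the oversampled class fills in its own region of feature space instead of duplicating points, which is what helps the detector on imbalanced traffic.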
The appearance of uncertainties and disturbances often affects the characteristics of both linear and nonlinear systems. Moreover, the stabilization process may deteriorate, incurring a catastrophic effect on system performance. As such, this manuscript addresses the concept of the matching condition for systems suffering from mismatched uncertainties and exogenous disturbances. The perturbation of the system at hand is assumed to be known and unbounded. To reach this outcome, uncertainties and their classifications are reviewed thoroughly. The structural matching condition is proposed and stated in Proposition 1. Two types of mathematical expressions are presented to distinguish a system with matched uncertainty from a system with mismatched uncertainty. Lastly, two-dimensional numerical expressions are provided to exercise the proposed proposition. The outcome shows that the matching condition has the ability to transform the system into a design-friendly model for asymptotic stabilization.
Implementation of FinFET technology based low power 4×4 Wallace tree multipli...TELKOMNIKA JOURNAL
Many systems, including digital signal processors, finite impulse response (FIR) filters, application-specific integrated circuits, and microprocessors, use multipliers. The demand for low power multipliers is gradually rising day by day in the current technological trend. In this study, we describe a 4×4 Wallace multiplier based on a carry select adder (CSA) that uses less power and has a better power delay product than existing multipliers. HSPICE tool at 16 nm technology is used to simulate the results. In comparison to the traditional CSA-based multiplier, which has a power consumption of 1.7 µW and power delay product (PDP) of 57.3 fJ, the results demonstrate that the Wallace multiplier design employing CSA with first zero finding logic (FZF) logic has the lowest power consumption of 1.4 µW and PDP of 27.5 fJ.
Evaluation of the weighted-overlap add model with massive MIMO in a 5G systemTELKOMNIKA JOURNAL
The flaw in 5G orthogonal frequency division multiplexing (OFDM) becomes apparent in high-speed situations. Because the Doppler effect causes frequency shifts, the orthogonality of OFDM subcarriers is broken, lowering both bit error rate (BER) and throughput performance. As part of this research, we use a novel design that combines massive multiple input multiple output (MIMO) and weighted overlap and add (WOLA) to improve the performance of 5G systems. To determine which design is superior, throughput and BER are calculated for both the proposed design and OFDM. The results show a massive improvement in performance over the conventional system and significant improvements with massive MIMO, including the best throughput and BER. When compared to conventional systems, the improved system has a throughput that is around 22% higher, the best performance in terms of BER, and around 25% less error than OFDM.
Reflector antenna design in different frequencies using frequency selective s...TELKOMNIKA JOURNAL
In this study, the aim is to obtain, from a single antenna, two different asymmetric radiation patterns at two different frequencies: one from an antenna shaped like the cross-section of a parabolic reflector (a fan-blade type antenna) and one from an antenna with a cosecant-squared radiation characteristic. For this purpose, a fan-blade type antenna is first designed, and then its reflective surface is completed into the shape of the reflective surface of a cosecant-squared antenna using a frequency selective surface designed to provide the required characteristics. The designed frequency selective surface provides transmission as close to perfect as possible at the 4 GHz operating frequency, while acting as a band-stop filter and a reflective surface for electromagnetic waves at the 5 GHz operating frequency. Thanks to this frequency selective surface used as a reflective surface in the antenna, a fan-blade type radiation characteristic is obtained at the 4 GHz operating frequency, while a cosecant-squared radiation characteristic is obtained at the 5 GHz operating frequency.
Reagentless iron detection in water based on unclad fiber optical sensorTELKOMNIKA JOURNAL
A simple and low-cost fiber-based optical sensor for iron detection is demonstrated in this paper. The sensor head consists of an unclad optical fiber with an unclad length of 1 cm and a straight structure. The results show a linear relationship between the output light intensity and iron concentration, illustrating the functionality of this iron optical sensor. Based on the experimental results, the sensitivity and linearity achieved are 0.0328/ppm and 0.9824, respectively, at a wavelength of 690 nm. At the same wavelength, other performance parameters are also studied: the resolution and limit of detection (LOD) are found to be 0.3049 ppm and 0.0755 ppm, respectively. This iron sensor is advantageous in that it does not require any reagent for detection, making the implementation of iron sensing simpler and more cost-effective.
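Sensitivity and linearity figures of this kind come from fitting a straight line to intensity-versus-concentration calibration data. A least-squares sketch on hypothetical calibration points (not the paper's measurements) looks like this:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = m*x + c; returns (m, c, r)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    m = sxy / sxx                        # slope: sensitivity per ppm
    c = my - m * mx                      # intercept
    r = sxy / (sxx * syy) ** 0.5         # correlation: linearity measure
    return m, c, r

# Hypothetical calibration: normalized intensity vs iron concentration (ppm).
conc = [0, 2, 4, 6, 8, 10]
resp = [1.000, 0.935, 0.868, 0.805, 0.742, 0.671]
m, c, r = linear_fit(conc, resp)
print(f"sensitivity {abs(m):.4f}/ppm, linearity {r * r:.4f}")
```

The absolute slope plays the role of the sensitivity, and the closeness of the correlation to 1 quantifies how linear the sensor response is.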
Impact of CuS counter electrode calcination temperature on quantum dot sensit...TELKOMNIKA JOURNAL
In place of the commercial Pt electrode used in quantum sensitized solar cells, the low-cost CuS cathode is created using electrophoresis. High resolution scanning electron microscopy and X-ray diffraction were used to analyze the structure and morphology of structural cubic samples with diameters ranging from 40 nm to 200 nm. The conversion efficiency of solar cells is significantly impacted by the calcination temperatures of cathodes at 100 °C, 120 °C, 150 °C, and 180 °C under vacuum. The fluorine doped tin oxide (FTO)/CuS cathode electrode reached a maximum efficiency of 3.89% when it was calcined at 120 °C. Compared to other temperature combinations, CuS nanoparticles crystallize at 120 °C, which lowers resistance while increasing electron lifetime.
A progressive learning for structural tolerance online sequential extreme lea...TELKOMNIKA JOURNAL
This article discusses progressive learning for the structural tolerance online sequential extreme learning machine (PSTOS-ELM). PSTOS-ELM can maintain robust accuracy while incorporating new data and new-class data during online training. The robust accuracy arises from using the householder block exact QR decomposition recursive least squares (HBQRD-RLS) within PSTOS-ELM. This method is suitable for applications that involve streaming data and frequent new-class data. Our experiment compares the accuracy and accuracy robustness of PSTOS-ELM during data updates against the batch extreme learning machine (ELM) and the structural tolerance online sequential extreme learning machine (STOS-ELM), both of which must retrain on the data when new-class data arrive. The experimental results show that PSTOS-ELM has accuracy and robustness comparable to ELM and STOS-ELM while also being able to incorporate new-class data immediately.
Electroencephalography-based brain-computer interface using neural networksTELKOMNIKA JOURNAL
This study aimed to develop a brain-computer interface that can control an electric wheelchair using electroencephalography (EEG) signals. First, we used the Mind Wave Mobile 2 device to capture raw EEG signals from the surface of the scalp. The signals were transformed into the frequency domain using fast Fourier transform (FFT) and filtered to monitor changes in attention and relaxation. Next, we performed time and frequency domain analyses to identify features for five eye gestures: opened, closed, blink per second, double blink, and lookup. The base state was the opened-eyes gesture, and we compared the features of the remaining four action gestures to the base state to identify potential gestures. We then built a multilayer neural network to classify these features into five signals that control the wheelchair’s movement. Finally, we designed an experimental wheelchair system to test the effectiveness of the proposed approach. The results demonstrate that the EEG classification was highly accurate and computationally efficient. Moreover, the average performance of the brain-controlled wheelchair system was over 75% across different individuals, which suggests the feasibility of this approach.
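Band-power features of the kind extracted from the EEG in this study can be sketched with a direct DFT on a synthetic signal; the 10 Hz "alpha rhythm", the sampling rate, and the band limits are illustrative assumptions, not the study's data.

```python
import cmath
import math

def band_power(signal, fs, lo, hi):
    """Signal power inside [lo, hi] Hz via a direct DFT (O(n^2))."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        if lo <= k * fs / n <= hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2 / n
    return power

# Synthetic "EEG": a pure 10 Hz alpha rhythm sampled at 128 Hz for 1 s.
fs, n = 128, 128
eeg = [math.sin(2 * math.pi * 10 * t / fs) for t in range(n)]
alpha = band_power(eeg, fs, 8, 13)    # alpha band dominates here
beta = band_power(eeg, fs, 14, 30)    # essentially empty for this signal
print(f"alpha power {alpha:.1f}, beta power {beta:.2e}")
```

In practice an FFT library replaces the direct DFT, and per-band powers computed over sliding windows become the features fed to the neural network classifier.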
Adaptive segmentation algorithm based on level set model in medical imagingTELKOMNIKA JOURNAL
For image segmentation, level set models are frequently employed. It offer best solution to overcome the main limitations of deformable parametric models. However, the challenge when applying those models in medical images stills deal with removing blurs in image edges which directly affects the edge indicator function, leads to not adaptively segmenting images and causes a wrong analysis of pathologies wich prevents to conclude a correct diagnosis. To overcome such issues, an effective process is suggested by simultaneously modelling and solving systems’ two-dimensional partial differential equations (PDE). The first PDE equation allows restoration using Euler’s equation similar to an anisotropic smoothing based on a regularized Perona and Malik filter that eliminates noise while preserving edge information in accordance with detected contours in the second equation that segments the image based on the first equation solutions. This approach allows developing a new algorithm which overcome the studied model drawbacks. Results of the proposed method give clear segments that can be applied to any application. Experiments on many medical images in particular blurry images with high information losses, demonstrate that the developed approach produces superior segmentation results in terms of quantity and quality compared to other models already presented in previeous works.
Automatic channel selection using shuffled frog leaping algorithm for EEG bas...TELKOMNIKA JOURNAL
Drug addiction is a complex neurobiological disorder that necessitates comprehensive treatment of both the body and mind. It is categorized as a brain disorder due to its impact on the brain. Various methods such as electroencephalography (EEG), functional magnetic resonance imaging (FMRI), and magnetoencephalography (MEG) can capture brain activities and structures. EEG signals provide valuable insights into neurological disorders, including drug addiction. Accurate classification of drug addiction from EEG signals relies on appropriate features and channel selection. Choosing the right EEG channels is essential to reduce computational costs and mitigate the risk of overfitting associated with using all available channels. To address the challenge of optimal channel selection in addiction detection from EEG signals, this work employs the shuffled frog leaping algorithm (SFLA). SFLA facilitates the selection of appropriate channels, leading to improved accuracy. Wavelet features extracted from the selected input channel signals are then analyzed using various machine learning classifiers to detect addiction. Experimental results indicate that after selecting features from the appropriate channels, classification accuracy significantly increased across all classifiers. Particularly, the multi-layer perceptron (MLP) classifier combined with SFLA demonstrated a remarkable accuracy improvement of 15.78% while reducing time complexity.
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...Amil Baba Dawood bangali
Contact with Dawood Bhai Just call on +92322-6382012 and we'll help you. We'll solve all your problems within 12 to 24 hours and with 101% guarantee and with astrology systematic. If you want to take any personal or professional advice then also you can call us on +92322-6382012 , ONLINE LOVE PROBLEM & Other all types of Daily Life Problem's.Then CALL or WHATSAPP us on +92322-6382012 and Get all these problems solutions here by Amil Baba DAWOOD BANGALI
#vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore#blackmagicformarriage #aamilbaba #kalajadu #kalailam #taweez #wazifaexpert #jadumantar #vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore #blackmagicforlove #blackmagicformarriage #aamilbaba #kalajadu #kalailam #taweez #wazifaexpert #jadumantar #vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore #Amilbabainuk #amilbabainspain #amilbabaindubai #Amilbabainnorway #amilbabainkrachi #amilbabainlahore #amilbabaingujranwalan #amilbabainislamabad
We have compiled the most important slides from each speaker's presentation. This year’s compilation, available for free, captures the key insights and contributions shared during the DfMAy 2024 conference.
Final project report on grocery store management system..pdfKamal Acharya
In today’s fast-changing business environment, it’s extremely important to be able to respond to client needs in the most effective and timely manner. If your customers wish to see your business online and have instant access to your products or services.
Online Grocery Store is an e-commerce website, which retails various grocery products. This project allows viewing various products available enables registered users to purchase desired products instantly using Paytm, UPI payment processor (Instant Pay) and also can place order by using Cash on Delivery (Pay Later) option. This project provides an easy access to Administrators and Managers to view orders placed using Pay Later and Instant Pay options.
In order to develop an e-commerce website, a number of Technologies must be studied and understood. These include multi-tiered architecture, server and client-side scripting techniques, implementation technologies, programming language (such as PHP, HTML, CSS, JavaScript) and MySQL relational databases. This is a project with the objective to develop a basic website where a consumer is provided with a shopping cart website and also to know about the technologies used to develop such a website.
This document will discuss each of the underlying technologies to create and implement an e- commerce website.
Welcome to WIPAC Monthly the magazine brought to you by the LinkedIn Group Water Industry Process Automation & Control.
In this month's edition, along with this month's industry news to celebrate the 13 years since the group was created we have articles including
A case study of the used of Advanced Process Control at the Wastewater Treatment works at Lleida in Spain
A look back on an article on smart wastewater networks in order to see how the industry has measured up in the interim around the adoption of Digital Transformation in the Water Industry.
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Dr.Costas Sachpazis
Terzaghi's soil bearing capacity theory, developed by Karl Terzaghi, is a fundamental principle in geotechnical engineering used to determine the bearing capacity of shallow foundations. This theory provides a method to calculate the ultimate bearing capacity of soil, which is the maximum load per unit area that the soil can support without undergoing shear failure. The Calculation HTML Code included.
TELKOMNIKA Vol. 17, No. 3, June 2019: 1324-1337, ISSN: 1693-6930
Risk assessment of information production using extended risk matrix... (Jaka Sembiring)
1325
In this paper, we propose an information production risk assessment procedure by deriving a new calculation method for total impact on financial objectives. We employ the threat dependency scenario model as a building block to obtain the total impact while simultaneously considering the effectiveness of controls that have already been implemented. In addition, we propose a new method to increase the accuracy of risk priority ranking by taking into account the effectiveness of existing controls. We adopt the ISO 27005:2008 standard framework, in which the risk assessment process includes asset, threat, control and vulnerability as risk factors. Considering existing controls, we propose recoverability as an additional dimension to the risk matrix approach derived in [10] to improve its function. Finally, we provide a real-case implementation of our proposed method in a government institution. The rest of this paper is organized as follows: section 2 describes related works, section 3 describes our proposed method, section 4 elaborates a real-case illustration, and section 5 concludes our study.
2. Related Works
As mentioned before, TIRM provides holistic and systematic information risk management. The model is based on the ISO 31000 standard and provides a mathematical model to calculate total impact through the frequency of a task, the probability of required information, the frequency of information quality problems, and the probabilities of direct and indirect impact [7]. In the model, examination of existing risk controls is a very important step to identify whether existing controls have already been implemented to prevent IQ problems and/or their consequences [7]. Without understanding what kind of control is applied in response to a risk, errors will arise in risk analysis and evaluation [3]. To assess existing risk controls, one has to understand how effective the controls applied in the organization are. In this context, the TIRM model has a limitation: it does not explicitly include existing control effectiveness in the calculation of total impact. The model only identifies what the existing controls are, and it estimates likelihood and impact without modelling the probability of control effectiveness. As a consequence, there is no guarantee that the result reflects how effective the existing controls are, what the likelihood is, and how big the residual impact is. These shortcomings may cause errors in the risk analysis results.
In another development, the risk matrix approach (RMA) is a technique used in risk assessment, especially for risk priority ranking, proposed by the Electronic Systems Center. The technique has been widely used in industry [7, 11] and has been integrated into the TIRM model at the risk evaluation and ranking stage [7]. In RMA, the risk matrix consists of two key dimensions: impact and likelihood of risk. The technique is useful for qualitatively identifying which risks are the most critical and has enabled industry to determine the priority for corrective action [10, 11]. In addition, the risk matrix is an effective and widely used tool for improving risk management decisions [11]. Nevertheless, the technique has some disadvantages: the index classification is less accurate, the assessment mechanism is based on subjective calculation, and the matrix has not always been able to meet the needs and complexity of diverse risk assessments [10]. Moreover, the technique is not as simple as it is claimed to be. It takes many considerations and requirements to create an ideal matrix that improves risk management decisions, since it is based only on the aggregation or merging of two attributes, namely likelihood and impact [11].
To overcome these weaknesses, several attempts have already been made to improve the applicability of the risk matrix approach, such as a clustering algorithm to improve the risk matrix classification index [12]. The Borda method has also been developed to improve risk matrix precision, although this effort cannot eliminate risk ties completely [13]. There are also proposed risk calculation algorithms to improve the objectivity of the assessment process [14]. All of these developments rely on the originally published risk matrix, whose dimension is limited to two, and in many cases this limitation creates inflexibility with respect to risk assessment problems and requirements [10]. Some frameworks have been proposed to extend the risk matrix approach. An extension framework of the RMA technique is proposed by [10] to address complexity and to meet risk assessment requirements. The purpose of this extension is to widen its applicability, as input variables can be selected from a variety of options in different combinations according to the requirements of the actual situation [15]. Recoverability has been proposed as an additional dimension to address the complexity of risk assessment in the supply
chain area, where recoverability is defined as the system's ability to achieve acceptable limits or
levels of operation after a risk event occurs [10].
In the previously mentioned risk assessment mathematical model and risk matrix approach, the role of existing control effectiveness has not been treated or considered. To improve the overall accuracy of the model, in this paper we formulate residual likelihood and impact involving existing control effectiveness. We will show that the resulting risk assessment better represents actual conditions. We further develop a new dimension, namely recoverability in the context of information quality, to improve the accuracy of the extended risk matrix approach. In the final section we show the implementation of our proposed method in a real government institution.
3. Proposed Method
In this paper, our proposed information quality risk assessment method is limited to the information production domain and consists of four steps. We utilize the dimensions in this domain to assess information quality as an input to the risk assessment process. Then, we define information system components as assets that support information quality. We develop a conceptual model to map the risk factors and their relations. Based on these relations, we develop a probability model in a step-by-step procedure to calculate the risk assessment.
3.1. Conceptual Model
Information production, our object of research, is defined as an information creation phase supported by information system components. These information system components are considered assets, where each asset has threats, vulnerabilities and controls. The existing control identification in the proposed model adopts the types of control described in [16], i.e. preventive, dissuasive, protective, palliative and recuperative. Our construction of the conceptual model is based on the following principles:
- The controls in use have different types; depending on these types, one can determine the effectiveness of a control in reducing likelihood and impact;
- A control is applied to a threat that exploits one or several vulnerabilities in the relevant assets;
- Threats can exploit more than one vulnerability;
- A threat that successfully exploits vulnerabilities can affect more than one technical impact;
- Reductions of information quality characteristics affect the financial impact.
Based on the above principles, we create the conceptual model illustrated in Figure 1, which in general adopts the model proposed in [16]. The conceptual model in Figure 1 describes the components of the model and the relationships between them. The components consist of controls, threats, vulnerabilities, and both technical and financial impacts. The list of assets used in this study is described in Table 1, adopted from [17]. Each asset can be seen in our unified conceptual model.
Figure 1. Proposed conceptual model
Table 1. List of Assets
Code Asset
A1 Data and information
A2 Software
A3 Hardware
A4 Network
A5 Auxiliary equipment
A6 Physical infrastructure
A7 Personnel
Notes
Asset n : Asset – n
IQD n : Information quality dimension – n
Vn : Vulnerability – n
Tn : Threat – n
Cn : Control – n
DISS : Control combination effectiveness for Dissuasive
PREV : Control combination effectiveness for Preventive
PROT : Control combination effectiveness for Protective
PALL : Control combination effectiveness for Palliative
RECU : Control combination effectiveness for Recuperative
LR(Tn) : Control combination effectiveness for likelihood reduction
IR(Tn) : Control combination effectiveness for impact reduction
3.2. Probability Model
The detailed construction of the above conceptual model can be described step by step as follows:
Step 1: Determining scope of assessment.
In this step, we define the assessment scope of the business process. The scope can be defined as a primary or supporting business process, or based on business process criticality. Then we determine the business objective, such as financial, operational efficiency, strategy, customer satisfaction, etc. In this paper, we focus only on the financial aspect of the business objective.
Step 2: Performing information quality assessment.
Information quality assessment is conducted to obtain the ideal (target) and existing quality. In this step we refer to the information process flow described in [18, 19], as illustrated in Figure 2. The information process consists of two phases: the information production phase (source, transfer and process) and the information delivery phase (access and use). Each process (source, transfer and so on) has different dimensions as its information quality parameters.
Figure 2. Information process flow [18, 19]
To determine the IQ dimension standard, we create an IQ dimension–attribute catalogue. This catalogue describes which attributes will be used in the IQ assessment. In this paper we focus only on risk assessment of the information production phase. Therefore, we use IQ
dimensions related to source, transfer and process (the information production phase). There are eight dimensions for source and process, and one dimension for transfer [8]. Although the transferability dimension is rarely mentioned in the literature, we still assume that this characteristic is important. Transferability is the distribution value from one process to another and a part of the information network (communication infrastructure and access to data and information) [8]. To develop the IQ dimension–attribute catalogue, we refer to [20] for a summary of dimension and attribute comparisons across many published works. Based on analysis of this summary, our IQ dimension–attribute catalogue can be seen in Table 2. Each dimension has its own attributes, which explain the qualitative parameters for information quality.
Table 2. IQ Dimension–Attribute Catalogue

Dimension | Attribute | Code | Description
Accuracy | Correct | AC1 | The degree of correctness of the form of information presentation to the user
Accuracy | Free of error | AC2 | Free of error during the information production process
Objectivity | Unbiased | OB1 | The data was obtained not because of assumptions or conjectures
Reliability | From good source | RE1 | The data was obtained from a correct source
Transferability | Free of network failure | TR1 | Free of network failure during the transfer process
Transferability | Sending media are good | TR2 | The media used is effective and efficient
Content | Data are clear without ambiguity | CONT1 | The data presented is clear and does not cause ambiguity
Consistency | Compatible with previous data | CONS1 | In accordance with the data obtained, processed or stored before
Consistency | Presented in the same format | CONS2 | Data elements are presented in the same format
Completeness | Complete | COM1 | No missing data elements
Completeness | Include all necessary values | COM2 | Includes all important information elements
Timeliness | Up to date | TI1 | The degree of response to updates
Timeliness | Delivered on time | TI2 | The degree to which all elements can be delivered on time
Security | Restricted appropriately | SE1 | Information is restricted appropriately
Security | Secure | SE2 | Security is maintained throughout the information production process
The input for this step is the scope or detailed description of the business process from the previous step. The details of the business process should be mapped to the IP-MAP model [21]. From this model we obtain information and a description of how the information production process is performed. Through this description, we can understand which activity maps to each information production process (source, transfer, process) and what the quality dimension requirement of each activity is. In practice, we create a questionnaire instrument based on the attributes of each dimension from the catalogue and their relation to each activity. This instrument is used as a reference for the IQ assessment process.
Step 3: Identifying and estimating risk factors and risk profile.
In this step, risk factor identification is based on the catalogues of ISO/IEC 27005:2008 and ISO 27001:2005 and a brief description in [16]. A risk is the probability of losses caused by threats, vulnerabilities and impacts [22]. Therefore, a risk is an accumulation of the probabilities associated with the risk itself. In this study, probabilities are calculated as subjective probabilities based on the knowledge and experience of the personnel involved in a process or system, or of experts. In Bayesian conditional probability, a prior probability represents a belief distribution reflecting the amount of initial trust of the agents contributing to the hypothesis of an event [23]. In general, to calculate this probability value we refer to GB/T 20984-2007 [14, 24]. The detailed calculation of the probabilities of risk factors and risk profile is derived in the following steps.
1) Risk factor identification and estimation
a) Asset
An asset is defined as anything that has value to the organization and needs protection. In our conceptual model an asset is expressed as the variable Asset, and asset valuation can be divided into two variables: criticality and asset cost [25]. In our case, asset criticality is expressed on a qualitative scale following the standard given in [24]. The identification result, the asset criticality level, is expressed on a scale of 1 to 5.
b) Threat
In the conceptual model, a threat is denoted by the variable Tn. A threat has the potential to harm assets such as information, processes, systems and even organizations. Threats can be of natural or human origin, and can be deliberate or unintentional. According to [26], a vulnerability does not cause a risk if there is no threat to exploit it. Therefore, in this paper, we assume that a threat has dependent relationships with vulnerabilities: as the probability of a vulnerability increases, it becomes easier for threats to exploit it. The probability of vulnerability indicates the degree of influence of the vulnerability on the possible threat. Therefore, threat is modelled as a probability conditional on vulnerability: a threat occurs given that vulnerabilities occur. We use subjective Bayesian probability with expert perception as the prior probability. These relations are described mathematically as follows:
P(Tnt|Vnt) = P(Vnt|Tnt) P(Tnt) / Σnt=1..NT P(Vnt|Tnt) P(Tnt)   (1)
𝑃(𝑅𝑇𝐿 𝑛𝑡) = 𝑃(𝑇𝑛𝑡|𝑉𝑛𝑡) × (1 − 𝑃(𝐿𝑅 𝑛𝑡)) (2)
𝑃(𝑅𝑇𝐼 𝑛𝑡) = 𝑃(𝑇𝑛𝑡|𝑉𝑛𝑡) × (1 − 𝑃(𝐼𝑅 𝑛𝑡)) (3)
where:
P(Tnt|Vnt) : Probability of the threat given the vulnerability (posterior).
P(Tnt) : Probability of the threat based on subject expert judgment (prior).
P(Vnt|Tnt) : Probability stating the degree of influence of the vulnerability on the threat.
P(RTLnt) : Probability of residual threat likelihood.
P(RTInt) : Probability of residual threat impact.
After we obtain the threat probability, we can map the result to the threat classification, where the qualitative classification consists of five levels as in [24]. This classification is used to make the classification of threat probability easier.
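The posterior in (1) and the residual likelihood and impact in (2) and (3) can be sketched as follows. This is a minimal illustration: all probability inputs and the helper name posterior_threat are hypothetical, not values from the paper.

```python
# Hedged sketch of (1)-(3); all input probabilities are hypothetical
# expert-judgment values, not data from the paper.

def posterior_threat(prior, influence):
    """P(Tnt|Vnt) via Bayes' rule, normalized over the NT threats."""
    evidence = sum(v * t for v, t in zip(influence, prior))
    return [v * t / evidence for v, t in zip(influence, prior)]

prior = [0.2, 0.5, 0.3]       # P(Tnt): expert prior per threat
influence = [0.6, 0.3, 0.8]   # P(Vnt|Tnt): vulnerability influence per threat

post = posterior_threat(prior, influence)

# Residual threat likelihood (2) and impact (3) after existing controls
p_lr, p_ir = 0.4, 0.25        # hypothetical control-effectiveness probabilities
rtl = [p * (1 - p_lr) for p in post]
rti = [p * (1 - p_ir) for p in post]
```

Note that the residual probabilities are simply the posterior scaled down by the corresponding control effectiveness, exactly as (2) and (3) state.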
c) Existing Control
A control, as a way to lessen risk, is divided into two types: (i) controls that serve to reduce likelihood and (ii) controls that serve to reduce impact. According to [16], the likelihood reducers are the dissuasive and preventive types of control, and the impact reducers are the protective, palliative and recuperative types. The parameters α and β are the weights of each type of control, with weight ratios α1:α2 = 1:2 and β1:β2:β3 = 1:2:2 [16]. The parameter P(LRnt) is the effectiveness probability of the likelihood reducer controls and P(IRnt) is the effectiveness probability of the impact reducer controls.
P(LRnt) = (α1 × P(DISSnt) + α2 × P(PREVnt)) / (α1 + α2)   (4)

P(IRnt) = (β1 × P(PROTnt) + β2 × P(PALLnt) + β3 × P(RECUnt)) / (β1 + β2 + β3)   (5)
where:
𝑃(𝐿𝑅 𝑛𝑡) : Probability of likelihood reducer control effectiveness.
𝑃(𝐼𝑅 𝑛𝑡) : Probability of impact reducer control effectiveness.
d) Vulnerability
According to [26], incorrect or malfunctioning controls can become a vulnerability. Therefore, in this paper, the probability of vulnerability is calculated based on the value of the ineffective controls associated with the vulnerability. One particular threat can exploit more than one vulnerability, while a control is explicitly dedicated to overcoming such a threat, so the probability of vulnerability is calculated based on the ineffective controls of the threats that exploit the relevant vulnerabilities. Therefore, even though the parameter Vnt consists of V1nt, …, Vnv nt, it has only one probability value to represent the value of the vulnerabilities under one relevant threat. We assume that the ineffectiveness probabilities of likelihood reduction do not affect each other (independent), and that they are also independent of the ineffectiveness probability of impact reduction, so that it is possible to use the multiplication operation in (6).
7. ISSN: 1693-6930
TELKOMNIKA Vol. 17, No. 3, June 2019: 1324-1337
1330
𝑃(𝑉𝑛𝑡) = (1 − 𝑃(𝐿𝑅 𝑛𝑡)) × (1 − 𝑃(𝐼𝑅 𝑛𝑡)) (6)
This vulnerability probability value is then mapped to a vulnerability classification. We adopt the qualitative classification levels in [24], which consist of five levels. This classification is used to make the classification of vulnerability probability easier.
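Equations (4) to (6) can be sketched together, using the weight ratios α1:α2 = 1:2 and β1:β2:β3 = 1:2:2 from [16]. The individual control-effectiveness inputs below are hypothetical illustrations.

```python
# Hedged sketch of (4)-(6); weight ratios from [16], inputs hypothetical.

A1, A2 = 1, 2
B1, B2, B3 = 1, 2, 2

def likelihood_reducer(p_diss, p_prev):
    """(4): weighted effectiveness of dissuasive and preventive controls."""
    return (A1 * p_diss + A2 * p_prev) / (A1 + A2)

def impact_reducer(p_prot, p_pall, p_recu):
    """(5): weighted effectiveness of protective, palliative, recuperative controls."""
    return (B1 * p_prot + B2 * p_pall + B3 * p_recu) / (B1 + B2 + B3)

def vulnerability(p_lr, p_ir):
    """(6): joint in-effectiveness of likelihood and impact reducers."""
    return (1 - p_lr) * (1 - p_ir)

p_lr = likelihood_reducer(0.3, 0.6)   # (1*0.3 + 2*0.6) / 3 = 0.5
p_ir = impact_reducer(0.5, 0.4, 0.7)  # (0.5 + 0.8 + 1.4) / 5 = 0.54
p_v = vulnerability(p_lr, p_ir)       # 0.5 * 0.46 = 0.23
```

The multiplication in (6) is justified only by the independence assumption stated above; with correlated controls a joint model would be needed.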
2) Probability of threat risk calculation
To calculate the likelihood and impact of each threat, we refer to [14]. The likelihood of threat risk (f1) is a function of threat and vulnerability, while the impact of threat risk (f2) is a function of asset and vulnerability [14]. To calculate f1, we use the formula and weights (notated α and β) from [14] as in (7). Using (7), the result of the threat risk likelihood calculation on two matrix dimensions can be seen in Table 3. The probability of likelihood is the result of f1 divided by the maximum value of f1 (25), which occurs at threat level 5 with parameter α = 3 and vulnerability level 5 with parameter β = 2.
𝑓1 = 𝛼𝑡 + 𝛽𝑣 (7)
Table 3. Calculation of Threat Risk Likelihood Based on Two Matrix Dimensions [14]

f1     V=1  V=2  V=3  V=4  V=5
T=1     3    4    5   10   12
T=2     5    6    7   12   14
T=3     7    8    9   14   16
T=4    13   14   15   20   22
T=5    16   17   18   23   25
For function f2, we perform some modifications, since according to the research result in [26], vulnerability exploitation by a threat can pose a risk: if there is no threat related to a certain vulnerability, then no risk appears. Therefore, an impact is a function of threat, vulnerability and asset criticality. We assume that threat and vulnerability have a mutually exclusive relationship, and that asset is independent of both threat and vulnerability, so that
𝑓2 = (𝛼𝑡 + 𝜑𝑣) × 𝛾𝑎 (8)
The parameters α, φ and γ are the weights of each variable (t for threat level, v for vulnerability level and a for asset criticality level). In this study, we adopt the weight values from [14]. Based on our modification, we develop a new three-dimensional matrix for impact, shown in Table 4, which presents the matrix calculation results using formula (8). As an illustration, using the weights from [14]: (i) when the threat level is 4, the parameter α is 3; (ii) when the vulnerability level is 2, the parameter φ is 2; and (iii) when the asset level is 5, the parameter γ is 2.5. Based on these estimates and (8), f2 is (3 × 4 + 2 × 2) × 2.5 × 5 = 200. The probability of impact is the result of f2 divided by the maximum value of f2 (375), which occurs when the threat level is 5 (α = 3), the vulnerability level is 5 (φ = 3), and the asset level is 5 (γ = 2.5).
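The worked example for (7) and (8) can be checked mechanically. Since the full level-to-weight mapping of [14] is only partially quoted in the text, this sketch takes the weights as explicit arguments and uses only the stated values.

```python
# Hedged sketch of (7)-(8); the level-dependent weights from [14] are passed
# in explicitly because the text only quotes a few of them.

def f1(alpha, t, beta, v):
    """(7): likelihood score from threat level t and vulnerability level v."""
    return alpha * t + beta * v

def f2(alpha, t, phi, v, gamma, a):
    """(8): impact score adding asset criticality level a."""
    return (alpha * t + phi * v) * gamma * a

# Worked example from the text: T=4 (alpha=3), V=2 (phi=2), A=5 (gamma=2.5)
impact = f2(3, 4, 2, 2, 2.5, 5)        # (12 + 4) * 12.5 = 200
# Maximum cell: T=5 (alpha=3), V=5 (phi=3), A=5 (gamma=2.5)
impact_max = f2(3, 5, 3, 5, 2.5, 5)    # (15 + 15) * 12.5 = 375
p_impact = impact / impact_max         # probability of impact, 200/375

f1_max = f1(3, 5, 2, 5)                # 25, the divisor for likelihood
```

Both divisors (25 and 375) agree with the maximum cells of Tables 3 and 4.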
Following the description in [10], recoverability is defined as the system's ability to achieve acceptable limits or levels of operation after a risk event has occurred. To calculate recoverability, we define it as the percentage (probability) of threat reduction (both reduced threat likelihood and reduced threat impact) after controls are implemented. This probability value represents the organization's ability to reduce a certain threat. The parameter P(RecLnt) is the recoverability related to the threat likelihood, as in (9), and P(RecInt) is the recoverability related to the threat impact, as in (10).
Table 4. Calculation of Threat Risk Impact Based on Three Matrix Dimensions

f2           A=1   A=2   A=3    A=4   A=5
T=1  V=1      4     8    30     40    50
     V=2      6    12    45     60    75
     V=3     11    22    82.5  110   137.5
     V=4     14    28   105    140   175
     V=5     17    34   127.5  170   212.5
T=2  V=1      6    12    45     60    75
     V=2      8    16    60     80   100
     V=3     13    26    97.5  130   162.5
     V=4     16    32   120    160   200
     V=5     19    38   142.5  190   237.5
T=3  V=1      8    16    60     80   100
     V=2     10    20    75    100   125
     V=3     15    30   112.5  150   187.5
     V=4     18    36   135    180   225
     V=5     21    42   157.5  210   262.5
T=4  V=1     14    28   105    140   175
     V=2     16    32   120    160   200
     V=3     21    42   157.5  210   262.5
     V=4     24    48   180    240   300
     V=5     27    54   202.5  270   337.5
T=5  V=1     17    34   127.5  170   212.5
     V=2     19    38   142.5  190   237.5
     V=3     24    48   180    240   300
     V=4     27    54   202.5  270   337.5
     V=5     30    60   225    300   375
P(RecLnt) = (P(Tnt|Vnt) − P(RTLnt)) / P(Tnt|Vnt)   (9)

P(RecInt) = (P(Tnt|Vnt) − P(RTInt)) / P(Tnt|Vnt)   (10)
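A quick numerical check of (9): because the residual likelihood in (2) is the posterior scaled by (1 − P(LR)), the recoverability ratio collapses to the control effectiveness itself. The input values below are hypothetical.

```python
# Hedged sketch of (9)-(10); input probabilities are hypothetical.

def recoverability(p_threat, p_residual):
    """Fractional reduction of a threat after controls, per (9)/(10)."""
    return (p_threat - p_residual) / p_threat

p_t = 0.6                   # P(Tnt|Vnt)
p_lr = 0.4                  # effectiveness of likelihood-reducer controls
p_rtl = p_t * (1 - p_lr)    # residual likelihood, per (2)

rec_l = recoverability(p_t, p_rtl)
# Algebraically rec_l equals p_lr: (T - T*(1-LR)) / T = LR
```

The same identity holds between (10) and the impact-reducer effectiveness in (3).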
3) Probability of asset risk calculation
After the probability value of each risk dimension (likelihood, impact and recoverability) for each threat is obtained, we can calculate the probability of each risk dimension related to a certain asset. The likelihood and impact probabilities of an asset are joint probabilities of the threat risks relevant to the asset. For the probability value of recoverability, we use the mean recoverability probability over the threats on each asset. We assume that the threats on the same asset are independent: threats on an asset may occur at the same time, but the probabilities of those threats are independent of each other.
P(A1 ∪ … ∪ AN) = Σk=1..N P(Ak) − Σj<k P(Ak ∩ Aj) + … + (−1)^(N+1) P(A1 ∩ … ∩ AN)   (11)
notes:
P(Ak) : Probability of likelihood/impact of a threat risk.
P(Ak ∩ Aj) : Joint probability of the likelihood/impact of one threat risk with another. Since threats on an asset are independent, the joint probability is calculated as P(Ak) × P(Aj).
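Under the independence assumption above (joint probabilities taken as products), the inclusion-exclusion sum in (11) agrees with the simpler complement form 1 − Π(1 − p). The per-threat probabilities below are hypothetical.

```python
# Hedged sketch of (11) for independent threat risks on one asset.
from itertools import combinations
from math import prod

def union_inclusion_exclusion(ps):
    """Inclusion-exclusion with joint probabilities taken as products."""
    total = 0.0
    for r in range(1, len(ps) + 1):
        for combo in combinations(ps, r):
            total += (-1) ** (r + 1) * prod(combo)
    return total

def union_independent(ps):
    """Equivalent complement form for independent events."""
    return 1 - prod(1 - p for p in ps)

ps = [0.2, 0.35, 0.1]       # hypothetical per-threat likelihoods on one asset
p_asset = union_inclusion_exclusion(ps)
```

In practice the complement form is both faster and less error-prone when the number of threats per asset grows.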
Step 4: Calculating risk priority rank and total impact.
1) Risk ranking estimation
Each dimension probability is mapped to a dimension classification of risk to represent the risk matrix. The classification of each dimension is shown in Table 5, where we follow the classification described in [14]. The impact classification is created using the data in Table 4 and the k-means method [12] for classification. Our proposed classification method can eliminate the subjectivity of classification by a human decision maker. Meanwhile,
9. ISSN: 1693-6930
TELKOMNIKA Vol. 17, No. 3, June 2019: 1324-1337
1332
the recoverability classification is calculated based on organization perception since it will be
different from one organization to the other depends on their capability.
Table 5. Classification of Risk Matrix Dimension

Numeric   Likelihood Range       Impact Range
1         0.01% – 20.00%         0.01% – 16.00%
2         20.01% – 44.00%        16.01% – 34.67%
3         44.01% – 64.00%        34.68% – 50.67%
4         64.01% – 84.00%        50.68% – 72.00%
5         84.01% – 100.00%       72.01% – 100.00%
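The k-means step from [12] can be sketched with a plain one-dimensional Lloyd's algorithm. Note that [12] does not fix an initialization, so the seeding below (even quantiles of the sorted data) is an assumption of this sketch, and its exact class boundaries may differ from Table 5:

```python
def kmeans_1d(values, k, iters=100):
    """Plain 1-D Lloyd's algorithm: groups scalar impact values into k
    classes without a human-chosen cut-off. Initialization: centroids
    seeded at even quantiles of the sorted data (an assumption here)."""
    data = sorted(values)
    centroids = [data[int(i * (len(data) - 1) / (k - 1))] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            # assign each value to its nearest centroid
            idx = min(range(k), key=lambda c: abs(x - centroids[c]))
            clusters[idx].append(x)
        new = [sum(c) / len(c) if c else centroids[i]
               for i, c in enumerate(clusters)]
        if new == centroids:   # converged
            break
        centroids = new
    return clusters

# Sample impact values taken from the last column of the Table 4 data
impact_values = [75, 100, 162.5, 200, 237.5, 100, 125, 187.5, 225, 262.5,
                 175, 200, 262.5, 300, 337.5, 212.5, 237.5, 300, 337.5, 375]
classes = kmeans_1d(impact_values, 5)
```

The minimum and maximum of each resulting cluster can then be rescaled to percentage ranges in the style of Table 5.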
In this paper, risk priority ranking is developed using the extended risk matrix approach and the Borda rank method. This is more accurate than simply multiplying the level of each risk matrix dimension as in [15], because the matrix is extended with a new dimension (recoverability) and the Borda rank method evaluates four dimensions (likelihood, impact, likelihood recoverability and impact recoverability). The objective of risk priority ranking is to inform decision makers of the priority of risks, listing the assets from the highest to the lowest risk by considering the levels of likelihood, impact, and recoverability. The highest rank means the asset has the highest level of likelihood and impact but the lowest level of recoverability.
b_i = Σ_{k=1}^{m} (N − R_ik)    (12)

where
N : number of risks (assets).
k : evaluation criterion (L, I, RecL, RecI).
m : number of criteria (m = 4).
R_ik : number of risks scoring higher than risk i on criterion k.
b_i : Borda index of risk i.
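Equation (12) can be checked directly against Table 10. The sketch below recomputes the Borda index from the four levels (L, I, RecL, RecI) of each asset and reproduces the B column:

```python
def borda_rank(levels):
    """Borda index per (12): b_i = sum over the m criteria of (N - R_ik),
    where R_ik counts the risks scoring strictly higher than risk i on
    criterion k. Higher recoverability *levels* in Table 10 encode worse
    (less recoverable) assets, so all four criteria point the same way."""
    n = len(levels)
    m = len(levels[0])
    scores = []
    for i in range(n):
        b = 0
        for k in range(m):
            r_ik = sum(1 for j in range(n) if levels[j][k] > levels[i][k])
            b += n - r_ik
        scores.append(b)
    return scores

# Levels (L, I, RecL, RecI) for assets A1-A7 from Table 10
assets = [(5, 5, 4, 4), (5, 5, 4, 5), (4, 5, 4, 4), (4, 5, 3, 5),
          (3, 4, 4, 4), (5, 5, 5, 4), (4, 4, 4, 5)]
print(borda_rank(assets))  # → [24, 27, 21, 19, 13, 25, 19] (B column of Table 10)
```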
2) Calculating total impact
In this paper, the calculation of total impact is limited to the financial impact only. This impact is derived from the technical impact on each dimension of information quality. We calculate the financial impact of each asset through the gap percentage between the ideal and the actual information quality after a risk occurs, then multiply that percentage by the asset cost.
Tech_nd,na = (1/NI) × Σ_{ni=1}^{NI} (P(I_ni) × IQI_nd)    (13)

F_nd,na = [(IQI_nd − (IQA_nd − Tech_nd,na)) / IQI_nd] × AC    (14)

where
Tech_nd,na : technical impact on dimension nd of asset na.
P(I_ni) : probability of risk impact.
IQI_nd : ideal value of IQ on dimension nd.
IQA_nd : actual value of IQ on dimension nd.
NI : number of dimensions that receive the impact.
F_nd,na : financial impact on dimension nd of asset na.
AC : asset cost.
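Equations (13) and (14) can be traced with the A7 timeliness figures from Tables 6, 7 and 9. In this sketch the averaging is taken over the threats hitting the dimension, which is how the "Average" row of Table 11 is formed:

```python
def technical_impact(impact_probs, iqi):
    """Equation (13): mean technical impact of the threats affecting one
    IQ dimension; each threat contributes P(I) x ideal quality IQI."""
    return sum(p * iqi for p in impact_probs) / len(impact_probs)

def financial_impact(iqi, iqa, tech, asset_cost):
    """Equation (14): gap between ideal quality and post-risk actual
    quality, as a fraction of ideal quality, scaled by asset cost."""
    actual_after_risk = iqa - tech          # e.g. 4.9167 - 2.01 = 2.90
    gap = (iqi - actual_after_risk) / iqi   # e.g. (5.33 - 2.90) / 5.33 ≈ 46%
    return gap * asset_cost

# A7 timeliness: impact probabilities 40%, 40%, 33.33% (Table 9),
# ideal quality 5.33 and existing quality 4.9167 (Table 6), cost IDR 162000
tech = technical_impact([0.40, 0.40, 1 / 3], iqi=5.33)            # ≈ 2.01
fin = financial_impact(iqi=5.33, iqa=4.9167, tech=tech,
                       asset_cost=162000)                          # ≈ IDR 73762
```

This reproduces the Average (2.01), the 46% gap, and the IDR 73.762 financial impact reported in Table 11.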
4. Real Case Illustration and Discussion
In this section, we present a real case illustration of our proposed method in a government institution in Indonesia. Due to the nature of the organization, we cannot disclose its name. We focus on their administration business process as the scope of implementation. We define and detail this scope in an IP-MAP and classify each activity into an information production phase (source, transfer and process). The summary of research results in [19] states that the important dimensions in the government context are accuracy, transferability, completeness and security. Using the IQ dimension–attribute catalogue in Table 2, we created several questionnaire instruments to find the existing and the target information quality. The result of this IQ assessment is shown in Table 6, where we can see that the lowest existing quality is in the completeness dimension.
Risk assessment of information production using extended risk matrix... (Jaka Sembiring)
Table 6. Summary of Information Quality

Dimension   AC      TR      COM    TI       SE
Target      5.083   5.611   5.75   5.33     5.67
Existing    4.167   4.67    4.33   4.9167   4.89
To simplify the elaboration of the process, we concentrate only on the personnel asset (A7); the other assets follow the same procedure. Based on the identification step, there are 3 threats on asset A7; this situation is described in Figure 3 and Table 7, where the implemented controls are those maintaining asset A7. To estimate the probability of the risk factors (threats, controls and vulnerabilities), we employ (1), (2), (3), (4), (5), and (6), and to estimate the probability of the risk profile of A7 (likelihood, impact, recoverability of likelihood and recoverability of impact) we use (7), (8), (9) and (10); the results can be seen in Table 8. To calculate the probability of asset risk, we use (11) as the joint probability of the threats on an asset, and the result is shown in Table 9 as the risk profile of asset A7. This procedure is repeated for the other assets until we obtain the complete risk profile of our case illustration institution, as in Table 10. Our calculation shows that the first priority is asset A2, which means that asset A2 has high likelihood and high impact but low recoverability.
Figure 3. Implementation model
Table 7. Risk Factors Detail of A7

Asset criticality: 5
Asset cost: IDR 162000 (this asset cost is only for simulation)

Threats                            Controls
Code   Description                 Code   Description
N6     Storm                       C6     Personal Liability Insurance
HA4    Organization deficiency     C4     Determining the responsibility of termination and bonding managers
                                   C1     Determining role and responsibility of information quality
                                   C3     Organization structure and job description
HA19   Lack of staff               C2     Personnel recruitment
                                   C5     Personnel rotation
[Figure 3 depicts the implementation model: for each asset (A1, ..., A7), vulnerabilities (VN6, VHA4, ...) expose threats (N6, HA4, ...), which are reduced by preventive (PREV), protective (PROT) and recovery (RECU) controls into reduced likelihood LR and reduced impact IR per threat; these feed the technical impact on the IQ dimensions (AC, TR, COM, TI, SE) and finally the financial impact.]
Table 8. Threats-controls Estimation of A7
(columns: control type and effectiveness; likelihood reducer P(LR_nt); impact reducer P(IR_nt); vulnerability P(V_nt) with level L; unreduced threat P(T_nt|V_nt) with level L; reduced likelihood of threat P(RTL_nt) with level L; reduced impact of threat P(RTI_nt) with level L)

Threat  Control  Type  Effectiveness   P(LR_nt)  P(IR_nt)   P(V_nt)  L   P(T_nt|V_nt)  L   P(RTL_nt)  L   P(RTI_nt)  L
N6      C6       RECU  62.30%          0.00%     24.92%     75.08%   4   20.08%        2   20.08%     2   15.08%     2
HA4     C4       PROT  62.30%          41.53%    12.46%     51.18%   3   52.65%        4   30.78%     3   46.09%     3
        C1       PREV  62.30%
        C3       PREV  –
HA19    C2       PREV  62.30%          41.53%    9.30%      53.03%   3   27.27%        3   15.95%     2   24.74%     2
        C5       PROT  46.50%
Table 9. Summary of A7 Risk Profile

Code    f1/25     f2/375     Recoverability
                             P(RecL_nt)   P(RecI_nt)
N6      32.00%    40.00%     0.00%        24.92%
HA4     36.00%    40.00%     41.53%       12.46%
HA19    28.00%    33.33%     41.53%       9.30%
Table 10. Summary of Overall Assets Risk Profile
(P = probability, L = level; B = Borda index, R = priority rank)

Asset   Likelihood     Impact         Rec. Likelihood   Rec. Impact      B    R
Code    P        L     P        L     P        L        P        L
A1      87.58%   5     93.53%   5     34.22%   4        20.20%   4       24   3
A2      85.73%   5     90.95%   5     42.87%   4        15.64%   5       27   1
A3      77.05%   4     82.10%   5     48.41%   4        22.23%   4       21   4
A4      78.43%   4     83.08%   5     53.40%   3        17.77%   5       19   6
A5      54.52%   3     51.96%   4     32.86%   4        25.04%   4       13   7
A6      86.32%   5     87.95%   5     16.30%   5        20.08%   4       25   2
A7      68.67%   4     76.00%   4     27.69%   4        15.56%   5       19   5
We calculate the technical and financial impact of each threat using (13) and (14), as shown in Table 11. From the table, we see that the difference between the existing condition after the risk occurred and the target quality is 46%. It means the organization would need about IDR 73.762 (scale 1:1000) of additional budget to recover. The result provides a way to analyze which dimensions are affected by a certain threat. For example, the occurrence of threat N6 could cause a staff member's absence; in this case other staff must cover his or her tasks and responsibilities, which may cause tasks to accumulate. As a result, the tasks cannot be delivered in a timely manner. In our case illustration, all threats on the personnel asset happen to affect only the timeliness dimension; for other assets the case might be different.
To evaluate the accuracy of the risk priority ranking, we use the mean absolute error, comparing the actual rank with the predicted rank for both cases, without and with recoverability, as illustrated in Table 12. As shown in the table, the total error of the risk rank without recoverability is higher than that with recoverability. It means that the prediction with recoverability produces less error and increases the accuracy of the risk rank. For the case where we add the new dimension, considering not only likelihood and impact but also recoverability, we apply the Borda method. With this additional dimension, the Borda values become more diverse, which decreases the ambiguity of the priority risk ranking.
Table 11. Total Impact of A7

Code      f2/375    Ideal quality    Tech_nd,na (Timeliness)
N6        40.00%    5.33             2.13
HA4       40.00%                     2.13
HA19      33.33%                     1.78
Average                              2.01
Actual after risk (4.9167 – 2.01)    2.90
Gap after risk (5.33 – 2.90)         46%
Financial impact of timeliness       IDR 73.762
Table 12. Summary of Comparing Risk Rank

Asset   Actual   Without recoverability    With recoverability
Code    R        L   I   B    R            L   I   RecL   RecI   B    R
A1      2        5   5   14   2            5   5   4      4      24   3
A2      1        5   5   14   1            5   5   4      5      27   1
A3      4        5   5   14   5            4   5   4      4      21   4
A4      6        5   5   14   4            4   5   3      5      19   6
A5      7        4   4   3    7            3   4   4      4      13   7
A6      3        5   5   14   3            5   5   5      4      25   2
A7      5        4   5   9    6            4   4   4      5      19   5
Total error      4                         2
From the simulation results, we can identify at least two benefits of our proposed method. First, the organization is equipped with an asset priority from the risk ranking, where this ranking considers not only the magnitude and level of risk likelihood and impact but also the likelihood recoverability and impact recoverability, as shown in Table 12. Second, the organization is provided with information on how much it would cost when a risk occurs. In our case illustration, we use real organizational data and calculate the financial impact of timeliness as presented in Table 11, amounting to IDR 73.762. Using this value and the asset priority ranking obtained with the Borda method in Table 10, we can build a two-dimensional quadrant relation of the seven assets, as illustrated in Figure 4. Assets in quadrant I should receive close attention and mitigation, since they have a high cost and a high risk rank. Meanwhile, asset risks in quadrant IV could be accepted or put last in the mitigation priority.
Figure 4. Two-dimensional quadrant relation of 7 assets
To compare the results of the risk assessment, we evaluate them using the mean absolute error and the risk ties density shown in (15) and (16) below. We use the mean absolute error in (15) to evaluate risk rank accuracy, whereas the risk ties density in (16) is used to compare the ambiguity of the assessment result. In several cases, an assessment puts more than one element (in our paper, IT assets) at the same rank, which may confuse the decision maker. With the extended risk matrix approach, by adding the recoverability dimension instead of using only likelihood and impact, the assessment result is more complete and offers more consideration than likelihood and impact alone. As the result of the evaluation, the measured risk rank accuracy with and without the recoverability dimension was 92.86% and 85.71%, respectively, while the measured risk ties density without and with the recoverability dimension was 1.67 and 0.33, respectively.
Accuracy = (1 − (1/n) × Σ_{i=1}^{n} |f_i − y_i|) × 100%    (15)

d_t = (Σ_{j=1}^{L} T_j) / L    (16)
where
n : number of data (assets).
f_i : rank predicted by the system.
y_i : actual (reference) rank.
d_t : risk ties density.
T_j : number of risk ties at level j.
L : total number of risk levels.
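Both metrics can be recomputed from Table 12. Equation (15) leaves its normalization implicit; dividing the total rank error by n × m (n = 7 assets, m = 4 criteria) is one reading that reproduces the reported 85.71% and 92.86%, so the sketch below uses that assumption:

```python
from collections import Counter

def rank_accuracy(pred, actual, m=4):
    """One reading of (15): total absolute rank error normalized by n*m
    (n assets, m criteria). This normalization is an assumption that
    reproduces the figures reported in the paper."""
    n = len(pred)
    total_err = sum(abs(f - y) for f, y in zip(pred, actual))
    return (1 - total_err / (n * m)) * 100

def ties_density(borda_scores):
    """Equation (16): tied risks summed over levels, per occupied level."""
    counts = Counter(borda_scores)
    levels = len(counts)                                  # L
    ties = sum(c for c in counts.values() if c > 1)       # sum of T_j
    return ties / levels

# Table 12 data (rank columns R and Borda columns B)
actual = [2, 1, 4, 6, 7, 3, 5]
rank_without = [2, 1, 5, 4, 7, 3, 6]
rank_with = [3, 1, 4, 6, 7, 2, 5]
borda_without = [14, 14, 14, 14, 3, 14, 9]
borda_with = [24, 27, 21, 19, 13, 25, 19]
```

With these inputs, the accuracies come out to 85.71% (without) and 92.86% (with recoverability), and the ties densities to 1.67 and 0.33, matching the text.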
5. Conclusion
This paper proposed a new method to calculate information quality risk using an extended risk matrix approach based on a threat scenario dependency model. We include existing control effectiveness and the IQ assessment result to calculate the total impact. Through a real implementation in a government institution, we have shown that the total impact reflects the real condition more naturally than the previously announced method. By considering existing control effectiveness in the calculation, we propose a new dimension of the risk matrix called recoverability. In our real case illustration, by comparing the actual ranking and the predicted ranking, we can conclude that with our proposed method the accuracy increases and the risk ties in the risk ranking decrease. It means that we can provide the organization with a more accurate asset risk priority ranking, created by considering not only the magnitude and level of risk likelihood and impact but also the likelihood recoverability and impact recoverability. Moreover, we have shown that, using our proposed method, we can provide the organization with information on how much it would cost when a risk occurs, using a simple but informative quadrant system.
References
[1] Wang RY, Ziad M, Lee YW. Data Quality. New York, Boston, Dordrecht, London, Moscow: Kluwer Academic Publishers; 2002.
[2] Higson C, Waltho D. Valuing Information as an Asset. SAS Power To Know. 2010: 1–17.
[3] Borek A, Woodall P. Towards a Process for Total Information Risk Management. In: Proceedings of the 16th International Conference on Information Quality (ICIQ-2011). 2011: 477–491.
[4] Albarda, Supangkat SH, et al. Characteristics in Classification of Information Use (IU). Int J Informatics Commun Technol. 2014; 3(2): 180–185.
[5] O'Brien JA, Marakas GM. Management Information Systems. 2012.
[6] Laudon KC, Laudon JP. Management Information Systems: Managing the Digital Firm. 1968.
[7] Borek A, Parlikad AK, Webb J, et al. Total Information Risk Management. 2014.
[8] Albarda. Information Characterization of Information Resource Services. Dissertation. Bandung: Postgraduate ITB; 2014.
[9] Yan L, You Z, Jian L. Marketing Outsourcing Risk Assessment in the Real Estate Based on Risk Matrix Model. In: International Conference on Information Systems for Crisis Response and Management. China. 2011: 134–138.
[10] Li ZP, Yee QMG, Tan PS, et al. An Extended Risk Matrix Approach for Supply Chain Risk Assessment. 2013: 1699–1704.
[11] Elmonstri M. Review of the Strengths and Weaknesses of Risk Matrices. 2014; 4: 49–57.
[12] He L, Chen Y, Lin LY. A Risk Matrix Approach Based on Clustering Algorithm. Journal of Applied Sciences. 2013; 13(20): 4188–4194.
[13] Zhu Q, Kuang, et al. Risk Matrix Method and Its Application in the Field of Technical Project Risk Management. Engineering Science. 2003; 5: 89–94.
[14] Xiangmo Z, Ming D, Shuai R, et al. Risk Assessment Model of Information Security for Transportation Industry System Based on Risk Matrix. 2014; 8(3): 1301–1306.
[15] Ni H, Chen A, Chen N. Some Extensions on Risk Matrix Approach. Safety Science. 2010; 48(10): 1269–1278.
[16] Rahmad B, Supangkat SH, Sembiring J, et al. Threat Scenario Dependency-Based Model of Information Security Risk Analysis. 2010; 10(8): 93–102.
[17] ISO/IEC 27005:2008. Information technology - Security techniques - Information security risk management. 2008.
[18] Albarda, Supangkat SH, Kuspriyanto, et al. Information Interchange Layer based on Classification of Information Use (IU). TELKOMNIKA Telecommunication Computing Electronics and Control. 2014; 12(2): 485–492.
[19] Nasution WS, Albarda. Improvement of Business Process in order to Manage the Quality of Information. In: International Conference on ICT for Smart Society (ICISS). 2013.
[20] Knight S, Burn J. Developing a Framework for Assessing Information Quality on the World Wide Web. Informing Science Journal. 2005; 8.
[21] Shankaranarayanan G, Wang RY, Ziad M. IP-MAP: Representing the Manufacture of an Information Product. In: Proceedings of the 2000 Conference on Information Quality. 2000: 1–16.
[22] Alberts CJ. Common Elements of Risk.
[23] Joyce JM. The Development of Subjective Bayesianism. 2009; 10: 1–62.
[24] GB/T 20984-2007. Information security technology - Risk assessment specification for information security. 2007.
[25] Sahinoglu M. Security Meter: A Practical Decision-Tree Model to Quantify Risk. IEEE Security & Privacy. 2005; 3(3): 18–24.
[26] ISO/IEC 27001:2005. Information technology - Security techniques - Information security management systems - Requirements. 2005.