This document discusses an integrated model for sensitivity analysis and scenario analysis using breakeven analysis for operational and investment risk analysis. It was developed by Prof. Sreedhara Ramesh Chandra and Dr. Krishna Banana. The model aims to address limitations in existing sensitivity, scenario, and breakeven analysis models by integrating the three approaches. It introduces proportions and percentages to more precisely determine variable values. It also establishes relationships between scenario values and measures sensitivity through changes from a predetermined relational constant value (sales revenue). The model allows consideration of all cash flow determinants and provides a direct link between operational and investment risk measurements to improve investment decisions.
This document discusses various forecasting methods and principles. It covers:
- Qualitative methods like expert surveys, intentions surveys, and simulated interaction.
- Quantitative methods like extrapolation, rule-based forecasting, and simple regression which use numerical data.
- Checklists can improve forecasting by ensuring the latest evidence is included. The document provides a checklist for developing knowledge models.
- Forecasting principles like being conservative and choosing simple explanations are discussed.
- Estimating forecast uncertainty is important. Methods discussed include using empirical prediction intervals and decomposing errors by source.
Detailed insight into Analytical Steps required for generating reliable insights from analysis - Univariate, Bivariate, Multivariate, OLS & Logistic Models, etc
Put together to train friends and mentees. Based on personal learning and research, it contains no proprietary information and makes no claim of 100% accuracy. Every institution/organization/team uses its own steps and methodologies, so please use the one relevant for you; this is for training purposes only.
Predictive analytics uses past data to forecast future outcomes. The document discusses various predictive analytics techniques including simple forecasting methods, decision trees, and regression. Simple forecasting techniques like moving averages are easiest to implement but lack explanatory power, while decision trees and regression provide more accurate predictions at an individual level but require more complex deployment. The key is selecting the right technique based on the problem, data, and ability to implement predictive models in real-world applications.
This document describes a genetic learning algorithm called GLOWER that is designed for financial prediction problems. It discusses how financial prediction is difficult due to high dimensionality, weak nonlinear relationships between variables, and important variable interactions. Standard algorithms like decision trees are limited by their greedy search approach, which can miss complex patterns. Genetic algorithms can perform a more thorough search without biases. The paper evaluates GLOWER on several datasets and finds it uncovers more effective patterns than other algorithms for difficult problems with weak structure.
Hypothesis Testing: Central Tendency – Non-Normal (Compare 2+ Factors) – Matt Hansen
An extension on hypothesis testing, this lesson reviews the Mood’s Median & Kruskal-Wallis tests as central tendency measurements for non-normal distributions.
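As a minimal sketch of the two tests named above, assuming SciPy and invented skewed samples:

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Skewed (non-normal) samples for three factor levels
a = rng.exponential(scale=1.0, size=50)
b = rng.exponential(scale=1.3, size=50)
c = rng.exponential(scale=1.0, size=50)

# Mood's median test: do the groups share a common median?
stat, p_median, grand_median, table = stats.median_test(a, b, c)
print(f"Mood's median test: p = {p_median:.4f}")

# Kruskal-Wallis: do the groups differ in central tendency?
h, p_kw = stats.kruskal(a, b, c)
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p_kw:.4f}")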
Advantages of Regression Models Over Expert Judgement for Characterizing Cybe... – Thomas Lee
Expert judgment is the foundation of many risk assessment methodologies, but the research is robust on its inaccuracy with regard to rare events, and large data breach events are rare. Regression models, which are a statistical characterization of cross-company historical events, are substantially more accurate than expert judgment, or even than models built on a foundation of expert judgment.
This document provides instruction on using the 1 variance test for hypothesis testing. It begins with an overview of why hypothesis testing is needed to build a transfer function model. It then reviews the 4-step process for hypothesis testing and provides a decision tree to help select the appropriate statistical test based on data type and characteristics. The document demonstrates how to perform a 1 variance test using Minitab through examples comparing standard deviation to a target value. It concludes by prompting the reader to apply the 1 variance test to factors identified in a previous lesson and consider how the results could influence organizational decisions and goals.
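A minimal sketch of the chi-square statistic behind the 1 variance test, computed outside Minitab with SciPy and invented sample figures:

import numpy as np
from scipy import stats

sample = np.array([4.9, 5.2, 5.1, 4.7, 5.4, 5.0, 4.8, 5.3, 5.1, 4.9])
target_sd = 0.15                      # hypothesized process standard deviation

n = len(sample)
s2 = sample.var(ddof=1)               # sample variance
chi2_stat = (n - 1) * s2 / target_sd**2
# Two-sided p-value from the chi-square distribution with n-1 d.f.
cdf = stats.chi2.cdf(chi2_stat, df=n - 1)
p_value = 2 * min(cdf, 1 - cdf)
print(f"s = {np.sqrt(s2):.3f}, chi2 = {chi2_stat:.2f}, p = {p_value:.4f}")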
Hypothesis Testing: Central Tendency – Non-Normal (Compare 1:Standard) – Matt Hansen
An extension on hypothesis testing, this lesson reviews the 1 Sample Sign & Wilcoxon tests as central tendency measurements for non-normal distributions.
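A minimal sketch of both tests, assuming SciPy; the sample and hypothesized median are invented:

import numpy as np
from scipy import stats

x = np.array([7.1, 6.8, 9.4, 8.2, 7.7, 10.1, 6.5, 8.8, 7.3, 9.0])
median0 = 7.0                                # hypothesized median

# 1 Sample Sign test: binomial test on the count of values above the median
above = int(np.sum(x > median0))
n = int(np.sum(x != median0))                # ties are dropped
p_sign = stats.binomtest(above, n, p=0.5).pvalue
print(f"Sign test: p = {p_sign:.4f}")

# Wilcoxon signed-rank test on the differences from the hypothesized median
w, p_wilcoxon = stats.wilcoxon(x - median0)
print(f"Wilcoxon: W = {w}, p = {p_wilcoxon:.4f}")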
Making Analytics Actionable for Financial Institutions (Part II of III) – Cognizant
To identify meaningful use cases for analytics-driven banking and financial services solutions, organizations need a thorough understanding of how customer interactions align with context and anticipate needs, while simplifying the decision-making process.
Reduction in customer complaints - Mortgage Industry – Pranov Mishra
The project analyzes customer complaints/inquiries received by a US-based mortgage (loan) servicing company.
The goal of the project is to build a predictive model using the identified significant contributors and to come up with recommendations for changes that will lead to:
1. Reduced re-work
2. Reduced operational cost
3. Improved customer satisfaction
4. Improved company preparedness to respond to customers.
Three models were built: Logistic Regression, Random Forest, and Gradient Boosting. Accuracy, AUC (area under the curve), sensitivity, and specificity all improved markedly as model complexity increased from simple to complex.
Logistic regression did not generalize well to the non-linear data, so the model suffered from both bias and variance. Random Forest is an ensemble technique in itself and helps reduce variance to a great extent; Gradient Boosting, with its sequential learning ability, helps reduce bias. The results from random forest and gradient boosting did not differ by much. This is consistent with the bias-variance trade-off: complex, flexible models do well on non-linear data, while inflexible simple models have high bias and can also have high variance.
Additionally, a lift chart was built, showing a cumulative lift of 133% in the first four deciles.
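For illustration, a minimal sketch (not the project's actual code) of how a cumulative decile lift can be computed from model scores, assuming pandas and NumPy and synthetic data:

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
score = rng.random(10_000)                               # model-predicted probabilities
actual = (rng.random(10_000) < score * 0.4).astype(int)  # synthetic outcomes

df = pd.DataFrame({"score": score, "actual": actual})
# Decile 1 holds the highest-scoring records
df["decile"] = pd.qcut(df["score"].rank(method="first", ascending=False),
                       10, labels=list(range(1, 11)))

base_rate = df["actual"].mean()
by_decile = df.groupby("decile", observed=True)["actual"].agg(["sum", "count"])
cum_rate = by_decile["sum"].cumsum() / by_decile["count"].cumsum()
cum_lift = 100 * cum_rate / base_rate                    # cumulative lift in percent
print(cum_lift.round(1))                                 # read off the first four deciles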
Hypothesis Testing: Central Tendency – Non-Normal (Compare 1:1) – Matt Hansen
This document provides instruction on using the Mann-Whitney test to compare the medians of two independent samples. It discusses when to use the Mann-Whitney test, how to run it in Minitab, and provides an example comparing the medians of two columns of sample data labeled MetricC1 and MetricC2. The results of running the Mann-Whitney test on this example are interpreted to determine if the medians are statistically different between the two samples. The document encourages applying the test to factors identified in a previous lesson and discussing how the results could impact an organization.
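A minimal sketch, assuming SciPy, with invented stand-ins for the document's MetricC1 and MetricC2 columns:

import numpy as np
from scipy import stats

metric_c1 = np.array([12.1, 14.3, 11.8, 13.5, 15.0, 12.7, 13.9, 14.8])
metric_c2 = np.array([10.2, 11.5, 9.8, 12.0, 10.9, 11.1, 10.4, 11.8])

u, p = stats.mannwhitneyu(metric_c1, metric_c2, alternative="two-sided")
print(f"U = {u}, p = {p:.4f}")  # a small p suggests the medians differ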
This document discusses the topic of probability and its applications in business. It defines probability theory and describes three types of probability: classical, empirical, and subjective. Probability distributions are also introduced. The document then discusses how probability is used in business for calculating long-term gains and losses, risk evaluation, sales forecasting, manufacturing decisions, and scenario analysis.
1. The document discusses the nature of uncertainties facing firms in strategic decision making, including Knightian uncertainty about unknown probabilities versus risk with known probabilities.
2. It describes frameworks for conceptualizing uncertainty, such as Courtney's four levels ranging from a clear future to true ambiguity. Traditional strategic approaches are less effective under higher uncertainty levels.
3. Firms can cope with uncertainty by developing a strategic intent, using an opportunities approach, employing a portfolio of actions like big bets and options, creating simple rules, or following a semicoherent strategic direction with continuous change.
1) This document outlines an agenda for a workshop on programmatic risk management that covers topics such as risk management principles, basic statistics, Monte Carlo simulation theory, using Microsoft Project and Risk+ software, risk ranking, and building a credible schedule.
2) It discusses five key principles of managing programmatic risk: having a strategy rather than relying on hope, understanding that single point estimates are inaccurate without variance data, integrating cost, time and technical performance, using a risk management process and model rather than "driving in the dark," and ensuring effective risk communication.
3) The mechanics section describes how to set up a Risk+ simulation integrated with
Final Stress Test paper for Federal Reserve – Joe Barber
This document analyzes stress tests and capital adequacy measurements for banks in the Dallas FDIC region. It discusses previous research on stress tests and their ability to predict financial institution stability. The study aims to use historical data from a sample of banks to predict future financial performance and default risk. It focuses on smaller banks under the Dallas FDIC office to determine what factors influence the accuracy of capital adequacy measurements, finding that institutions with initially higher capital levels tended to experience greater decreases during adverse conditions.
International Journal of Engineering and Mathematical Modelling, Vol. 1, No. 1 (2015) – IJEMM
This document discusses using the CIR++ model to estimate default risk through simulation. It begins by describing structural and reduced-form approaches to default estimation. It then introduces the CIR and CIR++ processes, which can model the evolution of short-term interest rates and default intensities. The document outlines how the CIR++ model can be calibrated to market data on yield curves and credit default swap prices to estimate default probabilities. It concludes by stating the calibrated CIR++ model will be tested against Deutsche Bank estimates to evaluate its ability to model default risk.
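For intuition only, a minimal sketch of simulating basic CIR dynamics (the core of CIR++ before its deterministic shift) with a full-truncation Euler scheme, assuming NumPy; the parameters are illustrative and not calibrated to yield curves or CDS spreads as the paper describes:

import numpy as np

def simulate_cir(r0, kappa, theta, sigma, T, steps, n_paths, seed=0):
    """dr = kappa*(theta - r)*dt + sigma*sqrt(r)*dW, truncated at zero."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    r = np.full(n_paths, r0, dtype=float)
    for _ in range(steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        r_pos = np.maximum(r, 0.0)          # full truncation keeps sqrt real
        r = r + kappa * (theta - r_pos) * dt + sigma * np.sqrt(r_pos) * dw
    return np.maximum(r, 0.0)

# Terminal default intensities at T = 1 year for 10,000 simulated paths
paths = simulate_cir(r0=0.02, kappa=0.5, theta=0.03, sigma=0.1,
                     T=1.0, steps=252, n_paths=10_000)
print(f"mean intensity at T: {paths.mean():.4f}")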
Forecasting is the process of making predictions about events that have not yet occurred based on past data and other information. There are many different forecasting methods that can be qualitative or quantitative, including time series analysis, causal modeling, judgmental approaches, and more recently artificial intelligence techniques. Accuracy is important in forecasting and is typically measured using values like mean absolute error or mean squared error. Forecasting has wide applications in domains like business, economics, weather, earthquakes, and more. Limitations to forecasting accuracy exist, such as the chaotic nature of systems like the weather beyond two weeks.
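As a quick illustration of the accuracy measures named above, a minimal sketch with invented numbers:

import numpy as np

actual = np.array([102.0, 98.0, 105.0, 110.0, 107.0])
forecast = np.array([100.0, 101.0, 103.0, 108.0, 109.0])

errors = actual - forecast
mae = np.mean(np.abs(errors))   # mean absolute error
mse = np.mean(errors ** 2)      # mean squared error
print(f"MAE = {mae:.2f}, MSE = {mse:.2f}")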
IRJET – Overview of Forecasting Techniques – IRJET Journal
This document provides an overview of different forecasting techniques, including qualitative and quantitative methods. It discusses several qualitative techniques like the Delphi method, consumer market surveys, and jury of executive opinion. It also examines various quantitative techniques such as the moving average method, weighted moving average method, exponential smoothing, and least squares. The document serves to introduce students to common forecasting approaches and provide examples of each type of technique.
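A minimal sketch of three of the quantitative techniques listed above on a toy demand series (illustrative numbers, assuming NumPy):

import numpy as np

demand = np.array([120.0, 132.0, 128.0, 140.0, 138.0, 150.0])

# 3-period simple moving average forecast for the next period
sma = demand[-3:].mean()

# 3-period weighted moving average (heaviest weight on the most recent period)
weights = np.array([0.2, 0.3, 0.5])
wma = float(weights @ demand[-3:])

# Simple exponential smoothing: F_{t+1} = alpha*A_t + (1 - alpha)*F_t
alpha, f = 0.3, demand[0]
for a in demand:
    f = alpha * a + (1 - alpha) * f
print(f"SMA = {sma:.1f}, WMA = {wma:.1f}, SES = {f:.1f}")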
This document discusses the relevance and implications of forecasting retail deposits. Forecasting retail deposits involves analyzing macroeconomic data to build models that can accurately predict future deposit levels given economic conditions. Accurately forecasting deposits is important for banks to inform strategic planning and decisions around operations, technology, and infrastructure needs. The implications of deposit forecasting are discussed from social and philosophical perspectives, including how forecasting stems from humans' innate desire to understand and prepare for an uncertain future.
This document discusses validating risk models using intraday value-at-risk (VaR) and expected shortfall (ES) approaches with the Multiplicative Component GARCH (MC-GARCH) model. The study assesses different distributional assumptions for innovations in the MC-GARCH model and evaluates their effects on modeling and forecasting performance. Backtesting procedures are used to validate the models' predictive power for VaR and ES. Results show non-normal distributions best fit the intraday data and forecast ES, while an asymmetric distribution best forecasts VaR.
Aon FI Risk Advisory - CCAR Variable Selection – Evan Sekeris
Operational risk projections for CCAR pose challenges as the relationships between losses and economic factors are unclear. While established models exist for market and credit risk, the best approach for operational risk is less certain. Banks must use economic scenarios but many operational risk drivers are independent of economic changes. The document discusses whether variable selection should be objective via algorithms or subjective through expert judgment based on economic theory. It acknowledges limitations of both approaches. Regulators are looking for banks to consider both objective and subjective aspects in their models. Hybrid approaches that include multiple techniques like regression analysis, scenario analysis and historical averages may be most appropriate.
This document summarizes a research article that develops models to explain how firm conduct and competitive interactions jointly influence risk-return relationships at the industry level. The models show that two main mechanisms impact risk-return relations: 1) firm conduct, including heterogeneity in firms' costs and imperfect control over operations, which leads to a negative risk-return effect, and 2) a "reflection effect" whereby a firm's actions impact its competitors' profits, leading to a positive risk-return effect that is dampened as the number of firms increases. By integrating considerations of both firm conduct and industry competition, the models offer novel predictions about when risk-return relations will be negative, positive, or U-shaped, providing a more nuanced understanding
- The document discusses various techniques for analyzing risk in capital budgeting decisions such as payback period, certainty equivalent, risk-adjusted discount rate, sensitivity analysis, scenario analysis, and simulation analysis.
- It also covers using decision trees for sequential investment decisions and incorporating utility theory to explicitly include a decision-maker's risk preferences in the capital budgeting analysis.
This document describes a major project aimed at predicting health insurance costs using regression models. The objectives are to implement efficient algorithms that provide accurate predictions and to compare different regression algorithms. The project will use multiple linear regression, decision tree regression, and gradient boosting regression on health insurance data to predict costs. Literature on using machine learning and deep learning models for health insurance cost prediction is reviewed. The hardware, software, methods, and key concepts of multiple linear regression, decision tree regression, and gradient boosting regression are described.
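A minimal sketch of the kind of comparison the project describes, using scikit-learn on synthetic data rather than the actual health insurance records:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=1000, n_features=6, noise=20.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "multiple linear regression": LinearRegression(),
    "decision tree regression": DecisionTreeRegressor(max_depth=5, random_state=0),
    "gradient boosting regression": GradientBoostingRegressor(random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: R^2 = {r2_score(y_te, model.predict(X_te)):.3f}")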
Forecasting operational risk losses for CCAR poses a number of challenges, not least of which is uncertainty about the best methodology for modeling the relationship between a bank’s future losses and the macroeconomic environment.
The document outlines various techniques for stand-alone risk analysis, including sensitivity analysis, scenario analysis, break-even analysis, simulation analysis, and decision tree analysis. It provides examples and procedures for conducting each type of analysis. Sensitivity analysis and scenario analysis are discussed in detail through examples. Simulation analysis covers defining probability distributions, dealing with correlations, and issues in application. Decision tree analysis is introduced as a tool for sequential decision making under risk.
Investment decisions, risk and uncertainty, types of risk, techniques of measuring risk, cost of capital, its importance, factors affecting cost of capital, computation of cost of capital, capital structure, capital structure theories, dividend theories, Walter model, Gordon model, MM model, working capital management, types of working capital, factors influencing working capital, preparation of cash budget, problems on working capital, corporate valuation, methods.
Multi-dimensional time series based approach for Banking Regulatory Stress Te... – Genpact Ltd
Under the regulatory paradigm of banking risk management, banks are required to stress-test internally computed risk parameters to ensure they hold an adequate amount of capital to offset the effects of downturn events. For this purpose, most contemporary stress-testing practices are limited to one dimension of the calculation, in which endogenous risk parameters are predicted from modeled, scenario-based values of exogenous parameters (macroeconomic variables).
The document discusses various methods for demand forecasting, including qualitative and quantitative approaches. Qualitative methods involve expert judgment through individual specialists, group consensus, or the Delphi method. Quantitative methods include causal models using regression analysis and time series analysis. Simple time series models discussed are the projection method, simple moving average, weighted moving average, and basic exponential smoothing. Accuracy of forecasts is also addressed through measures like average error, bias, and mean absolute deviation. The document provides an example application of these time series models to a sample demand data set.
COMPARISON OF BANKRUPTCY PREDICTION MODELS WITH PUBLIC RECORDS AND FIRMOGRAPHICS – cscpconf
Many business operations and strategies rely on bankruptcy prediction. In this paper, we aim to study the impacts of public records and firmographics and to predict bankruptcy over a 12-month-ahead horizon using different classification models, adding value to the traditionally used financial ratios. Univariate analysis shows the statistical association and significance of public records and firmographics indicators with bankruptcy. Seven statistical models and machine learning methods were developed: Logistic Regression, Decision Tree, Random Forest, Gradient Boosting, Support Vector Machine, Bayesian Network, and Neural Network. The performance of the models was evaluated and compared based on classification accuracy, Type I error, Type II error, and ROC curves on the hold-out dataset. Moreover, an experiment was set up to show the importance of oversampling for rare event prediction. The results also show that the Bayesian Network is comparatively more robust than the other models without oversampling.
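As a hedged illustration of the oversampling experiment described above, a minimal sketch using simple random oversampling with scikit-learn utilities on synthetic imbalanced data (the paper's dataset and exact method are not reproduced):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

X, y = make_classification(n_samples=5000, weights=[0.97], flip_y=0.01,
                           random_state=0)          # ~3% rare "bankrupt" class
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Oversample the minority class in the training set only
minority = y_tr == 1
X_min_up, y_min_up = resample(X_tr[minority], y_tr[minority],
                              n_samples=int((~minority).sum()), random_state=0)
X_bal = np.vstack([X_tr[~minority], X_min_up])
y_bal = np.concatenate([y_tr[~minority], y_min_up])

for label, (Xf, yf) in {"raw": (X_tr, y_tr), "oversampled": (X_bal, y_bal)}.items():
    clf = RandomForestClassifier(random_state=0).fit(Xf, yf)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{label}: AUC = {auc:.3f}")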
Risk Assessment Model and its Integration into an Established Test Process – ijtsrd
In industry, testing has to be performed under severe pressure due to limited resources. Risk-based testing, which uses risks to guide the test process, is applied to allocate resources and reduce product risks. Risk assessment, i.e., risk identification, analysis, and evaluation, determines the significance of the risk values assigned to tests and therefore the quality of the overall risk-based test process. In this paper we provide a risk assessment model and its integration into an established test process. The framework is derived from best practices extracted from published risk-based testing approaches and applied to an industrial test process. Ashwani Kumar | Prince Sood, "Risk Assessment Model and its Integration into an Established Test Process," International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN 2456-6470, Volume 3, Issue 5, August 2019. URL: https://www.ijtsrd.com/papers/ijtsrd26757.pdf Paper URL: https://www.ijtsrd.com/engineering/computer-engineering/26757/risk-assessment-model-and-its-integration-into-an-established-test-process/ashwani-kumar
This document provides an introduction to regression analysis, which investigates the relationship between dependent and independent variables in order to model and analyze data. It outlines different types of regression, including linear, polynomial, stepwise, ridge, lasso, and elastic net regression. Regression analysis is used for predictive modeling, forecasting, and determining the impact of variables; its benefits are that it indicates significant relationships between variables and the strength of their impact.
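A minimal sketch, assuming scikit-learn, fitting several of the regression variants listed above to the same synthetic data; note how the L1-penalized variants zero out coefficients:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, Lasso, LinearRegression, Ridge

X, y = make_regression(n_samples=200, n_features=10, n_informative=4,
                       noise=10.0, random_state=1)
for name, model in [("linear", LinearRegression()),
                    ("ridge", Ridge(alpha=1.0)),
                    ("lasso", Lasso(alpha=1.0)),
                    ("elastic net", ElasticNet(alpha=1.0, l1_ratio=0.5))]:
    model.fit(X, y)
    n_used = int(np.sum(np.abs(model.coef_) > 1e-6))
    print(f"{name}: {n_used} non-zero coefficients")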
THE ENGINEERING ECONOMIST – todd701
Using mean-Gini and stochastic dominance to choose project portfolios with parameter uncertainty
Guilherme Augusto Barucke Marcondes (a,b), Rafael Coradi Leme (b), Marcela da Silveira Leme (b), and Carlos Eduardo Sanches da Silva (b)
(a) National Institute of Telecommunications, Santa Rita do Sapucaí, Brazil; (b) Institute of Industrial Engineering and Management, Federal University of Itajubá, Itajubá, Brazil
ABSTRACT
Although a variety of models have been studied for project portfolio selection, many organizations still struggle to choose a potentially diverse range of projects while ensuring the most beneficial results. The use of the mean-Gini framework and stochastic dominance to select portfolios of research and development (R&D) projects has been gaining attention in the literature, despite the fact that such approaches do not consider uncertainty regarding the projects' parameters. This article discusses, with relation to project portfolio selection through a mean-Gini approach and stochastic dominance, the impact of uncertainty on project parameters. In the process, Monte Carlo simulation is considered in evaluating the impact of parametric uncertainty on project selection. The results show that the influence of uncertainty is significant enough to mislead managers. A more robust selection policy using the mean-Gini approach and Monte Carlo simulation is proposed.
Introduction
Managers select projects by prioritizing some options over others and by excluding options that are not aligned with their company strategy or that may lead to a loss; the choices about which the manager makes decisions are usually treated as a portfolio. Portfolio theory seeks to manage risk in a group of assets to determine a combination that offers the lowest risk and the highest expected return. Such a group is called an optimal portfolio. As with a financial portfolio, portfolio management focuses primarily on selecting projects to ensure that risks, complexity, potential returns, and resource allocation are aligned to the organization's strategy to provide optimal benefits (Petit 2012). Thus, if a project's expected return and its associated risk can be estimated, portfolio theory can be used to select the most attractive options. The concept of portfolio selection, introduced by the seminal work of Markowitz (1952), established the optimal strategy for maximizing return and minimizing the associated variance. When this strategy is followed, the efficient frontier is reached, where for a given variance level there exists no other portfolio with a greater expected return; similarly, for a given expected return level there exists no other portfolio with smaller variance.
This document discusses techniques for evaluating projects under uncertainty, including sensitivity analysis, switching values, and probability distributions of outcomes. Sensitivity analysis systematically tests how changes to estimates impact a project's worth. Probability distributions collapse uncertain outcomes into point estimates but lack variance data. Direct or simulated calculations of probability distributions provide a full picture of outcome likelihoods when events are independent.
International Journal of Engineering Technology, Management and Applied Sciences
www.ijetmas.com, November 2016, Volume 4, Issue 11, ISSN 2349-4476
Dr. KB and Prof SRC's Integrated Model of Sensitivity to Scenario Analysis with Breakeven Analysis for Operational and Investment Risk Analysis

Prof. Sreedhara Ramesh Chandra, Associate Professor & HOD, ML Eng. College, Singarayakonda, Andhra Pradesh, India; Research Scholar, Acharya Nagarjuna University, Guntur, Ongole Campus, Ongole, Andhra Pradesh, India.
Dr. Krishna Banana, Assistant Professor, Department of Commerce & Business Administration, Acharya Nagarjuna University, Guntur, Ongole Campus, Ongole, Andhra Pradesh, India.
ABSTRACT
Among the decisions on the financial aspects of any business, the most important and critical is the long-term investment decision. It locks up a large amount of funds for a long future period, coupled with the risk that returns will be affected by the economic and business scenarios of the environment. Decisions as important as long-term investments are therefore supported by in-depth assessment of those economic and business scenarios. The difficulty of forecasting a number of scenarios, and the inability to establish relations between the values of variables across scenarios, motivates further improvement of the methods used in investment analysis. Price, costs, and volume are the determinants of investment decisions. The economic environment is dynamic and scenarios are ever changing; the determinants of the decision change with the scenario. An analysis that works in tune with the scenarios therefore gives more accurate results. Although the existing methods and techniques consider scenario effects on the determinants, issues remain that must be addressed for more precise analysis and more accurate decisions. Building on the two methods of sensitivity and scenario analysis, the Crystal Ball and simulation software models give greater consideration to scenario impacts on the determinants, yet some issues still need to be addressed in their application. This paper is one such attempt to eliminate the limitations of Crystal Ball. The model is designed by integrating scenario and sensitivity analysis through breakeven analysis, and it can measure all the impacts at once, manually, without software. It considers multiple scenarios with multivariable sensitivity and a deterministic relation between scenario values, which is a limitation of Crystal Ball. This is made possible by expressing the sensitivity to scenarios as the extent of change from one scenario to another, using simple concepts and calculations of proportions. The model further provides a direct link between operational risks and investment risks, and it is equally useful for operating risk measurement and control apart from investment risk analysis.
1. INTRODUCTION:
Several analytical models are in use for analyzing and measuring risk in investment decisions under uncertain environmental conditions. Among the important ones are sensitivity analysis, scenario analysis, break-even analysis, the Hillier model of probabilities applied to uncorrelated and correlated cash flows, simulation, risk-adjusted discount rates, certainty equivalents, and decision tree analysis. Almost all of these methods start from the prediction/forecast of annual cash flows. Cash flows are a function of three important factor determinants: price, costs, and volume. The impact of scenario effects on cash flows is commonly measured through these determinants in scenario analysis. The behavior of the determinants depends in turn on the internal and external environmental factors that influence them. The predicted cash flows, and more particularly the cash inflows, need to
be determined by measuring the forecasted scenario impact on the determinants, i.e., price, costs, and volume.
Measuring the impact of scenario changes on cash flows through predicted changes in the determinants, whether from experience or expertise, is called scenario analysis. Measuring the extent of the responsiveness of cash flows to a determinant is sensitivity analysis. In reality, studying an impact means measuring the extent of the response from the existing, standard, desired, or ideal figures, not merely the results obtained from the distinct variable values considered under present scenario analysis. Those values represent discrete scenarios, and the difficulty of establishing relations between distinct scenarios from discrete values of the variables is a major limitation of scenario analysis. The scenario concept is also the core of Crystal Ball and simulation analysis, which do not go beyond feeding many scenario values of the determinants into software to derive the different scenario cash flows.
In the context of fast-changing technologies and the integration of world economies, the economic environment has become more dynamic. As a result, the risk of uncertainty has increased further, and with it the need for effective assessment of risk in investment decisions. Several modern methods and models have been developed in response. Among the latest techniques, Crystal Ball and simulation, together with sensitivity analysis and scenario analysis, consider scenario effects on the chief determinants of cash flows. What is still needed is to overcome the limitation of considering only a few determinants, to bring all the relevant variables into the analysis, and to increase the accuracy of measuring the risk factor in investment. The proposed model is distinctly designed using the simple basic relationships among the variables, expressed in proportions, ensuring an integrative effect that brings totality to the process and brings the most realistic factors to the fore in risk analysis.
The proposed model integrates the sensitivity, scenario, and breakeven concepts in such a way that it provides better and more accurate values in terms of proportions, with interrelations between the different scenario values and a precise sensitivity effect: a clear measurement of the effect, obtained by measuring the extent of deviation from the predetermined values. It further solves the major problem of error effects from correlated and uncorrelated cash flows by measuring all effects with reference to a unified constant variable (sales revenue).
The problem of measuring the values of the determinants under different scenarios is made simple by applying a corresponding multiplication constant for each of the chief determinants and sub-determinants of cash flows to any predicted scenario, either in general calculation or in an Excel worksheet. With an Excel worksheet the effects can be measured visibly without much technical know-how. The result closely resembles a semi-Crystal-Ball simulation, with physically visible calculations using deterministic values of the variables expressed as proportions.
Among the available models, sensitivity, scenario, and breakeven analysis originate at the level of the determinants, apart from Crystal Ball simulation. These models give due consideration to measuring the extent of environmental impact on cash flows: sensitivity analysis changes one of the determinants at each instance, while scenario analysis sets the values of all the variables independently in each scenario. The three analytical models are similar in the data they consider, each has its own way of projection, and each suffers from certain limitations. Integrating the three overcomes those limitations and brings totality to the analysis.
The proposed model covers both micro-level and macro-level information in an indexed manner (predetermined on current price, cost, and volume figures) on one hand, and the exact measurement of effects on results through sensitivity to changes observed or predicted from time to time on the other, making it possible to measure the impact on results with respect to a changed scenario of any kind.
2. Key words:
Variable cost ratio, BEP ratio, P/V ratio, sensitivity constants, relational parametric constant sales revenue, percentage of profit on the parametric constant sales revenue, percentage of cash flows on the parametric sales, % of PVCIF on the parametric sales, % of NPV on the parametric sales, change in sales as a percentage of the new or scenario sales.
3. Research gap:
1. The limitation of having no scope to consider all the determinants of cash flows needs to be removed, so that all the determinants can enter the analysis together when measuring the risk of investment proposals.
2. There is an unfulfilled need for direct integration between operational risk and investment risk measurements, to understand and disclose the relation between them for effective investment decisions.
3. Sensitivity constants are lacking; they would simplify both the measurement effort and the results.
4. Scope should be given to all the possible determinants of cash flows in risk analysis.
5. Crystal Ball simulation suffers from the way it considers the values of the determinant variables (price, volume, costs, etc.) as independent monetary or numerical figures, requiring multiple complex calculations. It offers no possibility of conducting the same analysis physically/manually with minimal values such as proportions of the variables, and at present no scope for using the analysis as a guide for operational risk control of the project after commencement.
4. Objectives:
1. Ensure greater accuracy in determining the values of the determinants, with more precision and much greater simplicity, through the introduction of proportions or percentages for risk measurement.
2. Ensure a more effective way of reflecting the risk aspect of the economic environment on all the determinants of cash flows, by applying the extent of the response deviations and measuring the effect against a relational constant value.
3. Ensure calculation of the sensitivity of results/profits with respect to all the factor determinants of profit against the parametric relational constant value (constant sales revenue), and design a model for both operational risk and investment risk assessment that considers all the determinants of any kind of financial risk.
4. Provide a unique and simplified way of calculating NPV, PI, and IRR directly with proportions of cash flows instead of monetary figures of cash flows.
5. Ensure simplicity in measuring risk with the integrative model of sensitivity and scenario analysis under breakeven analysis, reproducing the effects of Crystal Ball simulation physically and visibly in a Microsoft Excel worksheet, without redundancy, using a multiplication constant of sensitivity (elasticity factor) for each of the determinant variables, viz. costs, price, and volume.
6. Ensure the responsiveness of the cash flows/profits is expressed in proportion to a common constant referral value, the revenue, which minimizes or eliminates errors in interpreting results.
5. Research methodology:
Innovative conceptual research, methodically expanding advanced applications of breakeven analysis for operational and investment risk analysis, is used.
6. Limitations of the study:
1. Only hypothetical examples are used for explanation; no live examples are used in the experimental analysis of situations.
2. The practicability of the hypothetical scenario conditions is untested.
3. The scarcity of research publications and empirical research literature in this area is also a limitation.
7. Literature review:
Working in isolation, the existing scenario analysis, sensitivity analysis, and breakeven analysis have very limited scope for measuring risk through predicted changes in all the variables under multiple scenarios.
As the aim of the paper is to formulate the calculations innovatively, the commonly used existing formulae were collected from the review of literature.
The concepts used in BEA:
1. Sales (S): sales or selling price.
2. Variable costs (V): unit variable cost, or total variable cost proportional to output.
3. Fixed costs (F): total fixed cost, irrespective of the level of output.
4. Contribution margin (C): C = S − V.
5. P/V ratio (profit-volume ratio): the ratio of contribution to sales, C/S × 100.
All formulas of BEA revolve around the equation S − V = C = F + P and, dividing each term by sales, (S − V)/S = C/S = (F + P)/S.
Existing other formulas in breakeven analysis for profit planning:
- Breakeven point (BEP) in units: F/Cpu; in sales value/revenue: F/(P/V ratio).
- Sales required to earn a profit of Rs. P: in units, (F + desired P)/Cpu; in revenue, (F + desired P)/(P/V ratio).
- Amount of profit (P) when sales are S units: P = (S × Cpu) − F.
- Amount of profit (P) when target sales are S rupees: P = (S × P/V ratio) − F.
- Safety margin sales (SMS): SMS = total sales − BEP (in units or value); SMS in units: P/Cpu; SMS in value: P/(P/V ratio).
The newly invented formula from the existing concepts is % of P = P/V ratio × (1 − BEP ratio), i.e., P/V ratio − (P/V ratio × BEP ratio). These formulas are generally accessible in any textbook of cost and management accounting and are drawn from the references.
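To make the formulas concrete, a minimal sketch in Python with illustrative figures (price Rs. 10, unit variable cost Rs. 6, fixed cost Rs. 20,000; these numbers are not from the paper):

price, vc_unit, fixed = 10.0, 6.0, 20_000.0

cpu = price - vc_unit                 # contribution per unit, C = S - V
pv_ratio = cpu / price                # P/V ratio = C/S
bep_units = fixed / cpu               # breakeven point in units: F / Cpu
bep_value = fixed / pv_ratio          # breakeven in sales value: F / (P/V ratio)

sales_units = 8_000
sales_value = sales_units * price
bep_ratio = bep_value / sales_value   # BEP as a proportion of sales
profit = sales_units * cpu - fixed    # P = (S x Cpu) - F
pct_profit = pv_ratio * (1 - bep_ratio)   # the paper's "% of P" formula
print(f"BEP = {bep_units:.0f} units, profit = {profit:.0f}, "
      f"% of P on sales = {pct_profit:.2%} (check: {profit / sales_value:.2%})")

With these figures, both the formula and the direct computation give a profit of 15% of sales, illustrating the equivalence the paper claims.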
The scenario analysis:
1. It depends on the assumption that there are well-delineated scenarios, which may not be true in many cases. The economy does not necessarily lie in three discrete states, viz. recession, stability, and boom; it can in fact be anywhere on the continuum between the extremes. When a continuum is converted into three discrete states, some information is lost.
2. Independent scenario analysis determines the values of the variables for each scenario state independently. It is therefore inflexible and does not provide for correlative implications with the other scenarios under consideration.
(Prasanna Chandra, Projects: Planning, Analysis, Selection, Financing, Implementation, and Review, CFM-TMH Professional Series in Finance, 7th edition, pp. 11.7-11.8.)
3. In the case of sensitivity analysis, the major limitation is that it measures changes with respect to a change in only one variable at a time, which is not sufficient for correct inference.
(Prasanna Chandra, op. cit., pp. 11.5-11.7.)
4. The financial breakeven, i.e., the present-value breakeven, is also a measure of risk in investment decisions.
(Prasanna Chandra, op. cit., pp. 11.5-11.11.)
5. The conversion of a deterministic Microsoft Excel spreadsheet model into a dynamic simulation model involves three steps:
a. Identify the inputs to the model (selling price, quantity, etc.) that are subject to uncertainty, and define a distribution for each of them. In Crystal Ball these uncertain inputs are called assumptions.
b. Select a value for each of the assumptions, calculate the value of each of the output variables, and store the result.
c. Repeat this process for a number of iterations. The resulting values may be summarized using descriptive statistics.
(Prasanna Chandra, op. cit., pp. 11.5-11.19.)
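A minimal sketch of the three steps in plain Python (NumPy) instead of Crystal Ball; the distributions and figures are illustrative assumptions, not the paper's:

import numpy as np

rng = np.random.default_rng(7)
n_iter, years, rate, outlay = 10_000, 5, 0.12, 150_000.0

# Step (a): distributions for the uncertain inputs ("assumptions")
price = rng.normal(10.0, 1.0, n_iter)                    # selling price per unit
volume = rng.triangular(8_000, 10_000, 13_000, n_iter)   # units sold per year
vc_ratio = rng.uniform(0.55, 0.65, n_iter)               # variable cost / sales

# Step (b): compute the output (NPV) for each sampled set of assumptions
annual_cf = price * volume * (1 - vc_ratio) - 20_000.0   # fixed cost assumed Rs. 20,000
annuity = (1 - (1 + rate) ** -years) / rate
npv = annual_cf * annuity - outlay

# Step (c): summarize the iterations with descriptive statistics
print(f"mean NPV = {npv.mean():,.0f}, P(NPV < 0) = {(npv < 0).mean():.1%}")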
Additional formulas used apart from the common BEA formulas (see the sketch after this list):
1. The component-wise multiplication constant of each component of variable cost sensitivity, per Re. 1 of sales, is the proportion of that component in total variable cost multiplied by the VC ratio, i.e. VC ratio × proportion of the VC component, applied to the % change in that component of VC.
2. The sensitivity constant for measuring the effect of a change in FC through the FC ratio: FC constant = proportion of each component of fixed cost in total fixed cost × BEP ratio × PV ratio, per Re. 1.
3. The sensitivity constant for measuring the effect of a change in volume: BEP proportion × PV ratio proportion, per Re. 1.
4. Sales revenue is the only variable with a direct link between costs, price, and volume on one end and profits on the other. Further, sales revenue is the most important and most common standard for measuring performance and efficiency, and it is the usual object of target setting. In view of this, sales revenue is identified as the common constant value against which the worth of every determinant variable, and the extent of its effect on performance, is measured. Treating the predetermined value of sales as a constant factor makes it possible to derive the multiplication constants for each variable and sub-variable.
5. Determining a multiplication constant (elasticity) for each determinant factor makes it possible to bring totality to the analysis.
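A short Python sketch of formulas 1-3 above; the function and variable names are mine, and the example figures anticipate Tables 9.2 and 9.4 below:

```python
def vc_constant(component_vc, total_vc, vc_ratio):
    # Formula 1: share of the component in total VC, times the VC ratio.
    return (component_vc / total_vc) * vc_ratio

def fc_constant(component_fc, total_fc, bep_ratio, pv_ratio):
    # Formula 2: share of the component in total FC, times BEP ratio, times PV ratio.
    return (component_fc / total_fc) * bep_ratio * pv_ratio

def volume_constant(bep_ratio, pv_ratio):
    # Formula 3: BEP proportion times PV-ratio proportion.
    return bep_ratio * pv_ratio

# With the worked-example figures (VC ratio 0.4375, PV ratio 0.5625,
# BEP ratio 0.40296), direct materials of 25000 out of 65625 total VC give:
print(round(vc_constant(25000, 65625, 0.4375), 4))          # 0.1667, as in Table 9.2
print(round(fc_constant(7500, 34000, 0.40296, 0.5625), 3))  # ~0.05,  as in Table 9.4
print(round(volume_constant(0.40296, 0.5625), 3))           # 0.227,  as in Table 9.4
```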
8. Innovation Hypothesis:
a. Every business follows the principle of determining a target volume and value of sales at a given price for the period of analysis, under a predetermined context/scenario held constant. Changes in volume can then be measured as a percentage of the volume/revenue after the change, and price changes as a percentage of the prefixed price, to arrive at the results after the effects of the changes.
b. It is possible to determine the variable costs (either in total or per unit) and the fixed costs in total for the corresponding target volume in any and every business, on any prefixed basis, i.e. ideal/existing facts/standards/budgets. Further, it is possible to determine the extent of likely changes in the costs under other contexts as a percentage of those costs.
c. The extent of the effect of a change in price, cost, or volume can be measured, making it possible to bring the prefixed/predicted results to the actual results by measuring and adjusting for the extent of the effect, i.e. the sensitivity, produced by the changed situation/scenario, without altering the prefixed parameters and without redundancy; this indicates the uniqueness of the model.
d. The analytical model assesses the effectiveness of an investment proposal with a clear picture of how results are affected by each scenario and of the movement of results from scenario to scenario.
e. Breakeven analysis need not retain its usual assumptions as prerequisites for conducting the analysis. More importantly, the model solves the problem of FC being treated as a constant that does not change.
f. The actual results can be measured through a smooth passage from the initial prefixed value to the actual value by measuring the extent of response (sensitivity).
9. ANALYSIS:
The sensitivity-to-scenario analysis is conceptualized and developed from the following established relationships between price, costs, and revenues, together with the concepts and formulas commonly used in BEA and the innovative formulas developed and published in May/June 2016 in the international journals IJBM and IRJBM.
1. The amount of change in VC is measured and considered as a percentage of VC (∆VC/VC × 100), and evidently the rate (%) of change in VC is the same as the rate (%) of change in the VC ratio. The amount of change in VC, expressed as a rate of change, is therefore synonymous with the rate of change in the VC ratio. In view of this the sensitivity constants are determined as follows (a symbolic summary appears after this list):
2. The sensitivity constant = VC ratio/100, or VC/SP, i.e. the variable cost proportion, whether applied to total VC or to its components.
3. Similarly, the sensitivity constant for FC is calculated as the product of the PV ratio and the BEP proportion (BEP/total sales), because the BEP ratio moves in proportion to FC on one hand and shows a similar response to volume on the other.
4. The volume sensitivity constant is therefore calculated as the product of the contribution proportion (C/S) and the BEP proportion (BEP/sales). The scenario value of a change in volume is calculated on the volume after the change, not on the predetermined volume.
5. When both volume and fixed cost change in tandem, the sub-effect sensitivity is calculated as (BEP proportion × contribution proportion) × (% change in FC × % change in volume).
6. The sensitivity constant for a change in SP is always 1, so the percentage change in price is itself the extent of response, because the change × 1 is the same as the % change.
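Taken together, relations 1-6 amount to a single linear decomposition. In symbols (the notation is mine, not part of the original formulation):

```latex
k_{VC}=\frac{VC}{S},\qquad
k_{FC}=k_{Vol}=\frac{BEP}{S}\cdot\frac{C}{S},\qquad
k_{SP}=1,
\qquad\text{and}\qquad
\Delta\pi\%=\sum_i \pm\, k_i\,\Delta_i\%
```

where Δi% is the scenario change in determinant i, k_i its sensitivity constant, and Δπ% the resulting net change in the rate of profit on sales.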
With these simple multiplication constants used as the sensitivity constants (elasticities), the scenario effect on results is calculated by multiplying each constant by the scenario impact on the determinant, expressed as a % of the predetermined value of that determinant; this gives the sensitivity to the variable for each identified determinant independently. The net effect is calculated by adding or deducting the sensitivities of all determinants, and the net result after the scenario effects is arrived at by adding/deducting this net effect to/from the predetermined rate of profit. In this way, as many different results can be determined as the number of scenarios one predicts, and the interrelations among them can be viewed directly, because the extent of the sensitivity or scenario effect is measured as a proportion/percentage of the constant reference value of sales revenue. How the integrated innovative model is developed is elaborated with the following hypothetical example.
Any number of determinants and sub-determinants one wishes to use can be applied, and their effects visibly verified. Though there is no limit on the number of scenarios that can be visualized, three are used here, apart from the ideal/standard/targeted case.
The following figures are arrived at keeping in view the existing/prevailing/ideal/standard conditions and environment:
Project cost Rs. 75,000; scrap value Rs. 5,000; life 5 years; tax rate 40%.
The following values are arrived at through standard costing/budgeting under ideal conditions.
Table: 9.1
Changes predicted under different scenarios
(Cost PU and Total columns show the standard/budgeted/target values under ideal conditions; the scenario columns show the predicted change in %, except for capacity and output, which are shown in absolute terms.)

Particulars                          | Cost PU | Total  | Normal | Pessimistic | Optimistic
level or % of capacity               |         | 100%   | 80%    | 60%         | 100%
Output and sales (units)             |         | 12500  | 10000  | 7500        | 12500
cost of direct materials consumed    | 2       | 25000  | 3      | 10          | -5
cost of direct labour                | 1       | 12500  | 2      | 7           | -3
direct other exp                     | 0.2     | 2500   | 1      | 2           | 0
prime cost                           | 3.2     | 40000  |        |             |
add: variable factory OH             | 0.8     | 10000  | 6      | 19          | -8
add: fixed factory OH                | 0.6     | 7500   | 10     | 20          | 5
factory/manufacturing cost           | 4.6     | 57500  |        |             |
add: office and admn variable OH     | 0.5     | 6250   | 2      | 5           | -2
add: office and admn fixed OH        | 0.6     | 7500   | 10     | 20          | 0
cost of production                   | 5.7     | 71250  |        |             |
add: selling & dist variable OH      | 0.75    | 9375   | 0      | -2          | -5
add: selling & dist fixed OH         | 0.32    | 4000   | 10     | 20          | 5
project depreciation                 | 1.2     | 15000  | 0      | 5           | 0
total cost                           | 7.97    | 99625  |        |             |
profit                               | 4.03    | 50375  |        |             |
sales                                | 12      | 150000 | 0      | -5          | 3
Table: 9.2
Determination of profits under breakeven analysis and of the CV sensitivity constants
(ideal conditions; output and sales 12,500 units; sales Rs. 150,000 at Rs. 12 per unit)

Element                               | Total cost | VCPU          | VC sensitivity constant (v/SP)
cost of direct materials consumed     | 25000      | 2             | 0.1667
cost of direct labour                 | 12500      | 1             | 0.0833
direct other exp                      | 2500       | 0.2           | 0.0167
prime cost                            | 40000      | 3.2           |
variable element of factory OH        | 10000      | 0.8           | 0.0667
variable element of office & admn OH  | 6250       | 0.5           | 0.0417
variable element of selling & dist OH | 9375       | 0.75          | 0.0625
total VC / VCPU / VC ratio            | 65625      | 5.25 (43.75%) | 0.4375
contribution / CPU / PV ratio         | 84375      | 6.75 (56.25%) | 0.5625
sales                                 | 150000     | 12            | 1.00

Element                               | Amount | Proportion of each element of FC in total FC
fixed element of factory OH           | 7500   | 0.221
fixed element of office & admn OH     | 7500   | 0.221
fixed element of selling & dist OH    | 4000   | 0.118
project depreciation                  | 15000  | 0.441
total fixed cost                      | 34000  | 1.0

total cost (F + V) (65625 + 34000)    | 99625
profit                                | 50375
sales                                 | 150000
% profit on sales                     | 33.58
BEP in units (F/c)                    | 5037
BEP in revenue (F/PV ratio)           | 60444.44
% of BEP in target sales (BEP/TS × 100) | 40.296
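The derived figures in Table 9.2 follow from a handful of inputs. A quick Python check, using the Table 9.1 values (variable names are mine):

```python
sales, total_vc, total_fc = 150000, 65625, 34000
price = 12.0

contribution = sales - total_vc          # 84375
pv_ratio = contribution / sales          # 0.5625
vc_ratio = total_vc / sales              # 0.4375
profit_pct = 100 * (contribution - total_fc) / sales   # 33.58
bep_revenue = total_fc / pv_ratio        # 60444.44
bep_ratio = bep_revenue / sales          # 0.40296 (40.296% of target sales)
bep_units = bep_revenue / price          # ~5037 units

print(f"profit % {profit_pct:.2f}, BEP ratio {bep_ratio:.5f}, BEP units {bep_units:.0f}")
```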
Table: 9.3
Determination of the responsiveness of profits to VC sensitivity as affected by scenarios
(col. 1: sensitivity constant = VCPU/SP; cols. 2-4: predicted change, i.e. ∆VC as % of VC, under each scenario; cols. 5-7: sensitivity of results to the scenario = constant × change)

Particulars                         | Constant | Normal | Pess. | Opt. | Normal | Pess.  | Opt.
cost of direct materials consumed   | 0.167    | 3      | 10    | -5   | 0.500  | 1.667  | -0.833
cost of direct labour               | 0.083    | 2      | 7     | -3   | 0.167  | 0.583  | -0.250
direct other exp                    | 0.017    | 1      | 2     | 0    | 0.017  | 0.033  | 0.000
prime cost                          | 0.267    |        |       |      | 0.683  | 2.283  | -1.083
add: variable factory OH            | 0.067    | 2      | 5     | -2   | 0.133  | 0.333  | -0.133
add: office and admn variable OH    | 0.042    | 0      | -2    | -5   | 0.000  | -0.083 | -0.208
add: selling & dist variable OH     | 0.063    | 3      | 8     | 0    | 0.188  | 0.500  | 0.000
total variable OH                   |          |        |       |      | 0.321  | 0.750  | -0.342
responsiveness of total variable cost (VC ratio 0.4375) |  |  |  | 1.0042 | 3.0333 | -1.4250
aggregate weighted average change in VC under each scenario = SP × responsiveness/100 |  |  |  | 0.1205 | 0.364 | -0.171
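The normal-scenario column of Table 9.3 is a plain sum of products, as the following sketch shows (element names abbreviated; constants are computed as VCPU/SP rather than copied, to avoid rounding error):

```python
SP = 12.0
# (variable cost per unit, predicted % change under the normal scenario),
# taken from Tables 9.2 and 9.3
elements = {
    "materials":        (2.00, 3),
    "labour":           (1.00, 2),
    "other direct exp": (0.20, 1),
    "factory OH":       (0.80, 2),
    "office OH":        (0.50, 0),
    "selling OH":       (0.75, 3),
}
# sensitivity constant = VCPU/SP; effect = constant * % change in the element
responsiveness = sum((vcpu / SP) * pct for vcpu, pct in elements.values())
print(round(responsiveness, 4))  # 1.0042: profit % on sales falls by about 1 point
```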
Table: 9.4
Calculation of sensitivity constants for the determinants, i.e. fixed cost and volume
(col. 1: proportion of the element in total FC, i.e. FOH/TFOH; col. 2: BEP proportion × that proportion; col. 3: PV ratio; col. 4 = col. 2 × col. 3, the sensitivity constant)

Particulars                              | Amount | 1    | 2       | 3      | 4
Volume of sales                          |        |      | 0.40296 | 0.5625 | 0.227
Factory fixed OH other than depreciation | 7500   | 0.22 | 0.0889  | 0.5625 | 0.050
Admn fixed OH                            | 7500   | 0.22 | 0.0889  | 0.5625 | 0.050
Selling fixed OH                         | 4000   | 0.12 | 0.0474  | 0.5625 | 0.027
Project depreciation                     | 15000  | 0.44 | 0.1778  | 0.5625 | 0.100
total fixed cost                         | 34000  | 1.00 | 0.40296 | 0.5625 | 0.227
Volume and fixed cost sub-effect         |        |      | 0.40296 | 0.5625 | 0.227
Selling price: the constant is the extent of the % change in price itself, i.e. it is always 1.
Table: 9.5
Determination of the responsiveness of profits to FC, volume and price sensitivity as affected by scenarios
(change columns: predicted scenario changes in FC as % of the ideal FC values; for volume, the % change is measured on the scenario volume; effect columns = sensitivity constant × change)

Particulars                              | Constant | Normal | Pess.  | Opt. | Normal | Pess.  | Opt.
Volume of sales                          | 0.227    | -25.00 | -66.67 | 0.00 | -5.67  | -15.11 | 0.00
Factory fixed OH other than depreciation | 0.050    | 10.00  | 20.00  | 5.00 | 0.50   | 1.00   | 0.25
Admn fixed OH                            | 0.050    | 10.00  | 20.00  | 0.00 | 0.50   | 1.00   | 0.00
Selling fixed OH                         | 0.027    | 10.00  | 20.00  | 5.00 | 0.27   | 0.53   | 0.13
Project depreciation                     | 0.100    | 0.00   | 5.00   | 0.00 | 0.00   | 0.50   | 0.00
total fixed cost                         | 0.227    |        |        |      | 1.27   | 3.03   | 0.38
Volume and FC sub-effect (both changing in tandem; the change column is the aggregate % change in FC, i.e. the total FC effect above divided by the constant; effect = constant × % change in volume × % change in FC / 100, e.g. 0.227 × 5.59 × (-25)/100 ≈ -0.32)
                                         | 0.227    | 5.59   | 13.38  | 1.69 | -0.32  | -2.02  | 0.00
Selling price sensitivity (% change in SP) | 1      |        |        |      | 0      | -5     | 3
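The volume row and the joint sub-effect row of Table 9.5 can be verified directly. A sketch for the normal scenario (variable names are mine):

```python
bep_ratio, pv_ratio = 0.40296, 0.5625
k = bep_ratio * pv_ratio          # 0.227: common volume/FC sensitivity constant

vol_change_pct = -25.0            # % change measured on the scenario volume
fc_change_pct = 5.59              # aggregate % change in FC (total FC effect / k)

volume_effect = k * vol_change_pct                        # -5.67
sub_effect = k * fc_change_pct * vol_change_pct / 100     # -0.32 when both move in tandem
print(round(volume_effect, 2), round(sub_effect, 2))
```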
Table: 9.6
Effects of the aggregate scenario sensitivity, i.e. the responsiveness of profits and cash flows

Determinant factor                                               | Normal = 80% | Pessimistic = 60% | Optimistic = 100%
Initial profit as % of planned sales                             | 33.58        | 33.58             | 33.58
less: responsiveness of profits to the VC scenario effect        | 1.0042       | 3.0333            | -1.4250
Profit after the scenario VC effect                              | 32.579       | 30.547            | 35.008
less: responsiveness of profits to the FC scenario effect
(including the volume-and-FC sub-effect)                         | 1.58         | 5.06              | 0.38
Profit after the scenario FC effect                              | 30.996       | 25.494            | 34.625
add: scenario volume effect                                      | -5.67        | -15.11            | 0.00
Profit after the volume effect                                   | 25.329       | 10.383            | 34.625
add: scenario price effect                                       | 0            | -5                | 3
% profit after all scenario sensitivity effects                  | 25.329       | 5.383             | 37.625
% profit on the ideal revenues
(profit after scenario × new volume/old volume): PBIT            | 20.26        | 3.23              | 37.63
After tax at 40%: PAIT                                           | 12.158       | 1.938             | 22.575
add: depreciation (14000/150000 × 100)                           | 9.33         | 9.33              | 9.33
Cash inflows from the project                                    | 21.49        | 11.27             | 31.91
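The waterfall of Table 9.6 chains these effects in a fixed order. A sketch reproducing the normal-scenario column from the figures in Tables 9.3, 9.5, and 9.6:

```python
def cash_inflow_pct(initial_pct, vc_eff, fc_eff, vol_eff, price_eff,
                    vol_factor, tax_rate, depreciation_pct):
    """Chain the scenario effects and restate the result on ideal revenue."""
    p = initial_pct - vc_eff - fc_eff + vol_eff + price_eff  # profit % after effects
    pbit = p * vol_factor              # restated as % of the predetermined sales
    pait = pbit * (1 - tax_rate)       # after tax
    return pait + depreciation_pct     # add back depreciation as % of sales

# Normal scenario: 80% volume, no price change, depreciation 14000/150000*100
cif = cash_inflow_pct(33.58, 1.0042, 1.58, -5.67, 0, 0.8, 0.40, 9.33)
print(round(cif, 2))  # ~21.49, the normal-scenario cash inflow as % of sales
```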
Table: 9.7
Calculation of the NPV of the project under the different predicted scenarios (expected cost of capital 8%)

Particulars                                            | Normal = 80%    | Pessimistic = 60% | Optimistic = 100%
PV annuity factor for 5 years @ 8%                     | 3.993           | 3.993             | 3.993
PV of cash inflows (CIF × 3.993)                       | 85.81           | 45.00             | 127.40
PV of scrap (5000/150000 × 100 × 0.6806)               | 2.27            | 2.27              | 2.27
PV of cash flows as % of sales                         | 88.08           | 47.27             | 129.67
Initial investment as % of sales (75000/150000 × 100)  | 50              | 50                | 50
NPV as % of predetermined sales (% PV of inflows - % of outflows) | 38.08 | -2.73            | 79.67
IRR                                                    | 33.25           | 5.95              | 57.65
PI (PV of inflows / investment)                        | 88.08/50 = 1.76 | 47.27/50 = 0.95   | 129.67/50 = 2.59
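Table 9.7 then applies ordinary discounting to the percentage cash flows. A sketch (3.993 and 0.6806 are the standard 8%, 5-year annuity and single-payment PV factors; small differences from the table arise from rounding of intermediate figures):

```python
annuity_5y_8 = 3.993        # PV of Re. 1 per year for 5 years at 8%
pvf_5y_8 = 0.6806           # PV of Re. 1 received after 5 years at 8%
scrap_pct = 5000 / 150000 * 100     # scrap value as % of predetermined sales
invest_pct = 75000 / 150000 * 100   # initial outlay as % of predetermined sales

for name, cif in [("normal", 21.49), ("pessimistic", 11.27), ("optimistic", 31.91)]:
    pv_inflows = cif * annuity_5y_8 + scrap_pct * pvf_5y_8
    npv_pct = pv_inflows - invest_pct       # NPV as % of predetermined sales
    pi = pv_inflows / invest_pct            # profitability index
    print(f"{name}: PV {pv_inflows:.2f}  NPV% {npv_pct:.2f}  PI {pi:.2f}")
```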
10. Evaluation:
With the sensitivity constants determined in Table 9.2 for VC and Table 9.4 for FC and volume, it becomes very simple to determine the responsiveness of profits/results to each and every determinant, as given in Tables 9.3 and 9.5, and the aggregate effect, as detailed in Table 9.6. By measuring the results after the sensitivity impacts and converting the resultant rate into a % of the predetermined sales revenue (multiplying the result by the ratio of actual scenario volume to prefixed volume), the PBIT is determined. The NPV/IRR/PI are then determined as usual, with the one difference that, instead of the physical value of cash inflows, the calculation uses CIF as a % of predetermined sales. Using predetermined sales as the key constant for measuring all effects, including the NPV, is the core idea that establishes and ensures integration of the total process of the model. It produces error-free relativity among the scenarios, the determinants, and the process, and is therefore a rational analytical tool for studying the extent of difference between the scenario values. If necessary, calculating the standard deviation is sufficient to measure the deviation/coefficient of variation needed to arrive at the investment decision. Further, the model serves as an integrated operational control tool after the commencement of the project.
This model conducts the entire analysis in proportions, with full interrelation among the determinants and between the scenarios from beginning to end. The method of proportional value analysis enables integration of the scenario effects throughout: from the initial assessment of changes as a % of costs, price, and volume, through the sensitivity impacts, to the final breakeven-based measurement of the effects of the determinants on results. The sensitivity-scenario integration enabled through breakeven analysis therefore overcomes the limitations and the multiplicity of value considerations of the individual techniques. Further, it ensures that all the variables one considers important are involved in the investment analysis.
This integrated model is also well suited to operational risk analysis, enabling the study of day-to-day changes in costs, prices, and projected volume for strategy formulation and strategic operational decisions, including pricing.
11. Conclusion:
This tool represents a blend of techniques into one integrated system: the absorption technique for the collection and representation of cost data; direct costing and breakeven techniques for relating and processing the data and deriving the results; and budgeting/standard costing techniques for setting the standards used in comparing and interpreting the measured results. It is therefore a complete model of operational risk measurement in addition to investment risk analysis.
12. References:
i. S.P. Jain & K.L. Narang, Cost Accounting: Principles and Practice, 22nd revised edition (2011), Kalyani Publishers.
ii. M.N. Arora, A Textbook of Cost and Management Accounting, 10th edition, Vikas Publishing.
iii. Prasanna Chandra, Projects: Planning, Analysis, Selection, Financing, Implementation, and Review, CFM-TMH Professional Series in Finance, 7th edition.
iv. Colin Drury, Management and Cost Accounting, 6th edition, Cengage Learning India edition.
v. Chandra, Sreedhara Ramesh & Banana, Krishna (June 2016), Innovative Formulations and Enhanced Scope of Break Even Analysis, IRJBM, Volume IX, Issue 6.
vi. Chandra, Sreedhara Ramesh & Banana, Krishna (May 2016), Innovative Formulations and Enhanced Additional Applications of Break Even Analysis, IRJBM, Volume IX, Issue 5.
vii. SRCSPS Karivena, Effects in Application of Breakeven Analysis for Strategic Pricing: Advanced Applications in Breakeven Applications, IJMB, June 2016, Vol. 4, Issue 6.
viii. Prof. SRC & Dr. KB, Innovations in Measuring the Impact and Action Recourse for Changes in Costs, Prices, Product Mix and Volume on Profits, Developed as an Effective Mathematical Tool for Reporting and Decision Making by Interlinking the Economic Analysis and Financial Analysis through Breakeven Analysis.
ix. Risk Management System Implementation: Improving the Role of Internal Control Unit (SPI), September 2016, theijbm, 2-MB1609-009.
Declaration:
I solemnly declare that the above is entirely an intuitive thought of my own and that nothing in the above innovative formulations and analysis is copied, except the generalizations and references cited. If the same is found anywhere in prior work, it is purely due to non-accessibility to such work; kindly give the details of that work(s), if possible/necessary, and they will be duly regarded. Your cooperation in this regard is earnestly solicited.
Your comments and suggestions are earnestly solicited.
Thanking you,
With regards,
S. Ramesh Chandra