Economics is the study of how individuals and societies choose to use the scarce resources that nature and the previous generation have provided. The world's resources are limited and scarce. Resources that are not scarce are called free goods; resources that are scarce are called economic goods.
Prisoner's Dilemma is a paradox in decision analysis in which two individuals acting in their own best interest pursue a course of action that does not result in the ideal outcome. The typical prisoner's dilemma is set up in such a way that both parties choose to protect themselves at the expense of the other participant. As a result of following a purely logical thought process to help oneself, both participants find themselves in a worse state than if they had cooperated with each other in the decision-making process.
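The dilemma can be made concrete with a payoff matrix and checked mechanically. Below is a minimal Python sketch using the conventional textbook sentence lengths (the exact numbers are illustrative); it verifies that defecting is each prisoner's best response no matter what the other does, even though mutual cooperation leaves both better off:

```python
# Standard prisoner's dilemma payoffs (years in prison, negated as utility).
# Keys: (player 1's action, player 2's action). The numbers are the usual
# illustrative textbook values, not from any specific source.
PAYOFFS = {
    ("cooperate", "cooperate"): (-1, -1),   # both stay silent: light sentence
    ("cooperate", "defect"):    (-3,  0),   # the silent one takes the full blame
    ("defect",    "cooperate"): ( 0, -3),
    ("defect",    "defect"):    (-2, -2),   # both confess: moderate sentence
}
ACTIONS = ("cooperate", "defect")

def best_response(opponent_action, player):
    """Action maximizing this player's payoff, given the opponent's action."""
    if player == 1:
        return max(ACTIONS, key=lambda a: PAYOFFS[(a, opponent_action)][0])
    return max(ACTIONS, key=lambda a: PAYOFFS[(opponent_action, a)][1])

# Whatever the other prisoner does, defecting is the better reply...
assert all(best_response(a, 1) == "defect" for a in ACTIONS)
assert all(best_response(a, 2) == "defect" for a in ACTIONS)
# ...yet mutual defection (-2, -2) is worse for both than mutual cooperation (-1, -1).
```

Because "defect" dominates for both players, (defect, defect) is the equilibrium outcome even though both would prefer (cooperate, cooperate).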
In this session, we will be looking at the Prisoner's Dilemma and how it affects our decision making, group and team dynamics, and business decisions. We'll look at real-world case studies and examples from nature with the goal of understanding this dilemma better.
Operations research and its applications
• Operations: the activities carried out in an organization.
• Research: the process of observation and testing characterized by the scientific method — situation, problem statement, model construction, validation, experimentation, candidate solutions.
Multiple Linear Regression II and ANOVA I (James Neill)
Explains advanced use of multiple linear regression, including residuals, interactions and analysis of change, then introduces the principles of ANOVA starting with explanation of t-tests.
Regression Analysis and model comparison on the Boston Housing Data (Shivaram Prakash)
Creation of regression models to predict the median housing price using the Boston Housing dataset. Models used: generalized linear model, generalized additive model, artificial neural networks, and regression tree.
Malmquist Total Factor Productivity Index for Modeling Dairy Sector (ARNAB ROY)
Productivity growth in the agriculture sector is considered important to the development process, allowing countries to produce more food at lower cost, improve nutrition and welfare, and release resources to other sectors. Total Factor Productivity (TFP) growth, traditionally calculated as the ratio of total output to the weighted sum of inputs, is often interpreted as a shift of the production function. TFP represents the increase in total production in excess of the increase that results from increased inputs; it results from intangible factors such as technological change, education, research and development, and synergies. We apply the Malmquist index, a data-envelopment-analysis-type nonparametric technique, to decompose productivity growth into technical efficiency change and technological change for the Indian dairy sector. The results indicate that the increase in dairy productivity is mainly attributable to technical change, and the efficiency gain found is largely the result of improvements in scale efficiency.
Analysing the efficiency of energy use on farms using Data Envelopment Analys... (Daniel Sandars)
We use Data Envelopment Analysis (DEA) to investigate the efficiency of fossil energy use on farms in England and Wales. We find that DEA identifies outliers in the sample, useful for cross-checking data quality, as well as finding peer groups against which farmers can benchmark their own improvements. I gave this presentation to the Department of Land Economy at the University of Cambridge.
It describes the different lending schemes of the IMF, along with eligibility criteria and access limits under concessional and non-concessional conditionalities.
It describes sovereign risk and theories of indebtedness. Discussion is based on the following three papers:
(i) Eaton, Gersovitz, Stiglitz (1986), “The Pure Theory Of Country Risk”, EER
(ii) Krugman (1985), “International Debt Strategies In An Uncertain World” in Smith & Cuddington (Ed.) “International Debt and the Developing Countries”
(iii) Basu (1991), ‘The International Debt Problem, Credit Rationing and Loan Pushing: Theory and Experience’, Princeton Studies In International Finance No- 70
Analyzing the solutions of DEA through information visualization and data min... (Gurdal Ertek)
Data envelopment analysis (DEA) has proven to be a useful tool for assessing the efficiency or productivity of organizations, which is of vital practical importance in managerial decision making. DEA provides a significant amount of information from which analysts and managers derive insights and guidelines to improve their existing performance. Given this, effective and methodical analysis and interpretation of DEA solutions are critical. The main objective of this study is therefore to develop a general decision support system (DSS) framework to analyze the solutions of basic DEA models. The paper formally shows how the solutions of DEA models should be structured so that they can be examined and interpreted effectively by analysts through information visualization and data mining techniques. An innovative and convenient DEA solver, SmartDEA, is designed and developed in accordance with the proposed analysis framework. The developed software provides a DEA solution that is consistent with the framework and is ready to analyze with data mining tools, through a table-based structure. The developed framework is tested and applied in a real-world project for benchmarking the vendors of a leading Turkish automotive company. The results show the effectiveness and the efficacy of the proposed framework.
http://research.sabanciuniv.edu.
Financial Benchmarking Of Transportation Companies In The New York Stock Exc... (ertekg)
Download Link > https://ertekprojects.com/gurdal-ertek-publications/blog/financial-benchmarking-of-transportation-companies-in-the-new-york-stock-exchange-nyse-through-data-envelopment-analysis-dea-and-visualization/
In this paper, we present a benchmarking study of industrial transportation companies traded in the New York Stock Exchange (NYSE). There are two distinguishing aspects of our study: First, instead of using operational data for the input and the output items of the developed Data Envelopment Analysis (DEA) model, we use financial data of the companies that are readily available on the Internet. Secondly, we visualize the efficiency scores of the companies in relation to the subsectors and the number of employees. These visualizations enable us to discover interesting insights about the companies within each subsector, and about subsectors in comparison to each other. The visualization approach that we employ can be used in any DEA study that contains subgroups within a group. Thus, our paper also contains a methodological contribution.
Marketing Research Approaches to Demand Estimation
• Consumer Surveys: data from survey questions
• Observational Research: data from observed behavior
• Consumer Clinics: data from laboratory experiments
• Market Experiments: data from real market tests
Regression Analysis
Scatter Diagram
Regression Analysis
Regression Line: Line of Best Fit
Regression Line: Minimizes the sum of the squared vertical deviations (et) of each point from the regression line.
Ordinary Least Squares (OLS) Method
Model: Yt = a + bXt + et
Objective: Determine the slope and intercept that minimize the sum of the squared errors.
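For simple regression, the minimizing slope and intercept have a well-known closed form; a minimal Python sketch on made-up data:

```python
def ols_fit(x, y):
    """Estimate intercept (a) and slope (b) of y = a + b*x by OLS:
    b = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
    a = y_bar - b * x_bar
    """
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    b = sxy / sxx
    a = y_bar - b * x_bar
    return a, b

# Hypothetical X (e.g. advertising) and Y (e.g. sales) observations:
x = [1, 2, 3, 4, 5]
y = [3, 5, 7, 9, 11]          # exactly linear: y = 1 + 2x
a, b = ols_fit(x, y)
print(a, b)                   # -> 1.0 2.0
```

The slope is the co-variation of X and Y divided by the variation of X, and the intercept forces the fitted line through the point of means.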
Estimation Procedure
Estimation Example
Tests of Significance
Standard Error of the Slope Estimate
Example Calculation
Calculation of the t Statistic
Degrees of Freedom = (n-k) = (10-2) = 8
Critical Value at the 5% level = 2.306
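These quantities can be reproduced numerically. A minimal Python sketch (with made-up data for the n = 10, k = 2 case on the slide) computes the standard error of the slope and the t statistic, to be compared with the 5% critical value of 2.306 at 8 degrees of freedom:

```python
def slope_t_statistic(x, y):
    """t statistic for the OLS slope: t = b / SE(b), with
    SE(b) = sqrt( SSE / ((n - 2) * sum((x_i - x_bar)^2)) ),
    where SSE is the sum of squared residuals."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
    a = y_bar - b * x_bar
    sse = sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y))
    se_b = (sse / ((n - 2) * sxx)) ** 0.5
    return b / se_b, n - 2        # t statistic and degrees of freedom (n - k, k = 2)

# Ten made-up, noisy observations, matching the slide's n = 10, df = 8 setup:
x = list(range(1, 11))
y = [2, 4, 6, 8, 10, 12, 14, 16, 18, 21]
t, df = slope_t_statistic(x, y)
print(t > 2.306, df)    # slope significant at the 5% level (critical value 2.306)
```

A |t| above the critical value leads us to reject the hypothesis that the true slope is zero.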
Decomposition of Sum of Squares
Total Variation = Explained Variation + Unexplained Variation
Coefficient of Determination
Coefficient of Correlation
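The decomposition and both coefficients can be checked on a toy dataset. In the Python sketch below (illustrative numbers), TSS = ESS + RSS holds exactly and R² = ESS / TSS:

```python
def sums_of_squares(x, y):
    """Decompose total variation: TSS = ESS + RSS, returning (TSS, ESS, RSS).
    Then R^2 = ESS / TSS, and the correlation coefficient is
    r = sign(slope) * sqrt(R^2)."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
    a = y_bar - b * x_bar
    fitted = [a + b * xi for xi in x]
    tss = sum((yi - y_bar) ** 2 for yi in y)                 # total variation
    ess = sum((fi - y_bar) ** 2 for fi in fitted)            # explained variation
    rss = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))   # unexplained variation
    return tss, ess, rss

tss, ess, rss = sums_of_squares([1, 2, 3, 4, 5], [2, 4, 5, 4, 5])
r2 = ess / tss
print(tss, ess, rss, r2)   # -> 6.0, 3.6, 2.4, R^2 = 0.6 (up to floating point)
```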
Multiple Regression Analysis
Model: Y = a + b1X1 + b2X2 + … + bkXk + e
Adjusted Coefficient of Determination
Analysis of Variance and F Statistic
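Both statistics follow from R², n, and k alone. A small Python sketch with illustrative values:

```python
def adjusted_r2(r2, n, k):
    """Adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - k); penalizes R^2 for the
    number of estimated coefficients k."""
    return 1 - (1 - r2) * (n - 1) / (n - k)

def f_statistic(r2, n, k):
    """F = (Explained/(k - 1)) / (Unexplained/(n - k)), expressed via R^2."""
    return (r2 / (k - 1)) / ((1 - r2) / (n - k))

# Illustrative values: R^2 = 0.6 from n = 5 observations, k = 2 coefficients.
print(adjusted_r2(0.6, 5, 2))   # -> about 0.467
print(f_statistic(0.6, 5, 2))   # -> 4.5 up to floating point
```

A large F (relative to the critical F at (k - 1, n - k) degrees of freedom) indicates the regression as a whole is significant.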
Problems in Regression Analysis
• Multicollinearity: two or more explanatory variables are highly correlated.
• Heteroskedasticity: the variance of the error term is not independent of the Y variable.
• Autocorrelation: consecutive error terms are correlated.
Durbin-Watson Statistic
Test for Autocorrelation
If d = 2, autocorrelation is absent.
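The statistic itself is easy to compute from the residuals; a minimal Python sketch:

```python
def durbin_watson(residuals):
    """d = sum_{t=2..n} (e_t - e_{t-1})^2 / sum_t e_t^2.
    d near 2: no first-order autocorrelation; d near 0: positive
    autocorrelation; d near 4: negative autocorrelation."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

# Illustrative residual sequences:
print(durbin_watson([1, 1, 1, -1, -1, -1]))   # runs of one sign -> d well below 2
print(durbin_watson([1, -1, 1, -1, 1, -1]))   # alternating signs -> d well above 2
```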
Steps in Demand Estimation
• Model Specification: Identify Variables
• Collect Data
• Specify Functional Form
• Estimate Function
• Test the Results
Functional Form Specifications
Linear Function: Y = a + b1X1 + b2X2 + e (estimated directly by OLS)
Power Function: Y = a(X1^b1)(X2^b2)
Estimation Format: ln Y = ln a + b1 ln X1 + b2 ln X2
Chapter 5 Appendix
Getting Started
• Install the Analysis ToolPak add-in from the Excel installation media if it has not already been installed.
• Attach the Analysis ToolPak add-in: from the menu, select Tools and then Add-Ins...
• When the Add-Ins dialog appears, select Analysis ToolPak and then click OK.
Entering Data
• Data on each variable must be entered in a separate column.
• Label the top of each column with a symbol or brief description to identify the variable.
• Multiple regression analysis requires that all data on independent variables be in adjacent columns.
Example Data
Running the Regression
• Select the Regression tool from the Analysis ToolPak dialog.
• From the menu, select Tools and then Data Analysis...
• On the Data Anal.
MINING DISCIPLINARY RECORDS OF STUDENT WELFARE AND FORMATION OFFICE: AN EXPLO... (IJITCA Journal)
Data mining is the process of analyzing large datasets, understanding their patterns, and discovering useful information from a large amount of data. A decision tree, one of the common data mining algorithms, is a tree structure consisting of internal and terminal nodes that processes the data to eventually produce a classification. Classification is the process of partitioning a dataset into classes such that the members of each class are as close as possible to one another and different classes are as far apart as possible, where distance is measured with respect to the specific variable(s) you are trying to predict. Data Envelopment Analysis is a technique in which the productivity of a unit is evaluated by comparing the amount of output(s) produced with the amount of input(s) used; the performance of a unit is calculated by comparing its efficiency with the best observed performance in the dataset. In this study, a model for measuring the efficiency of Decision-Making Units (DMUs) is presented, along with related methods of implementation and interpretation. Among the many classification techniques and algorithms, this study used a classification decision tree with the CHAID algorithm, which identifies the relationship between the demographic profile of the students and the category of offenses. Cross tabulation, a tool for analyzing categorical data, is a matrix-format table that shows the multivariate frequency distribution of the variables and gives a basic picture of the interrelation between two variables. Both the CHAID algorithm and cross tabulation obtained the same results, implying that a higher percentage of students commit minor offenses regardless of college, gender, year level, month, and course. The CHAID algorithm, used in the software application Student Offenses Remediation System (STORES), serves as a remediation plan for the university.
Further studies should be conducted to identify the effectiveness of the remediation plan by conducting an empirical investigation on the rule set and/or implementing another algorithm to determine the program's efficiency.
Technical Efficiency of Management wise Schools in Secondary School Examinati... (IOSRJM)
In this paper we measure Board of Secondary Education data with the CCR model for the state of Andhra Pradesh for the academic years 2012-2013 and 2013-2014, to see the pattern in the CCR technical efficiency of management-wise school results prior to the division of the state into two separate states. The performance of the management-wise schools is presented along with the performance of the peer management schools of the state as a whole.
Multi-criteria decision making (MCDM) techniques in today's organizations, as a key to performance measurement, come more to the foreground with advances in high technology. During recent years, many studies have been conducted to obtain a ranking among many alternatives by measuring the performance of each of them against many criteria. Managerial decision-making problems such as supplier selection, weapon selection, project selection, and site selection are dealt with by many MCDM methods, such as TOPSIS and AHP-TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluation), ELECTRE, and VIKOR, in crisp form throughout the literature. In this work, we first compare several MCDM methodologies to validate their consistency on a standard dataset of a plant layout problem. We propose the M-TOPSIS and A-TOPSIS procedures to select a suitable layout for the comparative study. The results of M-TOPSIS and A-TOPSIS have been employed to build an unsupervised artificial neural network (ANN) to obtain a new ranking of alternatives. This study proposes an approach for deriving the rank value, in order to get the optimal configuration, from the average of more than one set of rank results obtained through the deployment of MCDM methodologies.
PREDICTING BANKRUPTCY USING MACHINE LEARNING ALGORITHMS (IJCI JOURNAL)
This paper predicts bankruptcy using different machine learning algorithms. Whether a company will go bankrupt or not is one of the most challenging questions to answer in the 21st century. Bankruptcy is defined as the final stage of failure for a firm: a company declares bankruptcy when, at that moment, it does not have enough funds to pay its creditors. It is a global problem. This paper provides a unique methodology to classify companies as bankrupt or healthy by applying predictive analytics. The prediction model stated in this paper yields better accuracy, with standard parameters used for bankruptcy prediction, than previously applied prediction methodologies.
Automatically Estimating Software Effort and Cost using Computing Intelligenc... (cscpconf)
In the IT industry, precisely estimating each software project's development effort, cost, and schedule counts for much to a software company, so precise estimation of manpower is becoming more important. In the past, IT companies estimated the manpower effort through human experts using statistical methods; however, the outcomes have often been unsatisfying to management. Recently, whether computational intelligence techniques can do better in this field has become an interesting topic. This research uses computational intelligence techniques, such as the Pearson product-moment correlation coefficient and one-way ANOVA to select key factors, and the K-means clustering algorithm for project clustering, to estimate software project effort. The experimental results show that using computational intelligence techniques to estimate software project effort can yield more precise and more effective estimates than traditional human experts did.
Benchmarking The Turkish Apparel Retail Industry Through Data Envelopment Ana... (ertekg)
Download Link > https://ertekprojects.com/gurdal-ertek-publications/blog/benchmarking-the-turkish-apparel-retail-industry-through-data-envelopment-analysis-dea-and-data-visualization/
This paper presents a benchmarking study of the Turkish apparel retailing industry. We have applied the Data Envelopment Analysis (DEA) methodology to determine the efficiencies of the companies in the industry. In the DEA model the number of stores, number of corners, total sales area and number of employees were included as inputs and annual sales revenue was included as the output. The efficiency scores obtained through DEA were visualized for gaining insights about the industry and revealing guidelines that can aid in strategic decision making.
1. HAKEEM–UR–REHMAN
PhD (Scholar) Management Science & Engineering
Center of Logistics & Operations Management,
Antai College of Economics and Management,
Shanghai Jiao Tong University, Shanghai, China
Data Envelopment Analysis (DEA):
An Overview
[Figure: scatter chart of Sales (output) versus Employee (input) for DMUs A–H, showing the production possibility set and the efficient frontier.]
2. Outline
What is Operations Research?
What is Linear Programming?
Data Envelopment Analysis (DEA)
DEA vs. Regression
CCR Model
Model Selection
DEA Software
DEA: Review Papers & Books
3. What is Operations Research?
Optimal decision-making in, and modeling of, deterministic and probabilistic systems that originate from real life. These applications, which occur in government, business, engineering, economics, and the natural and social sciences, are largely characterized by the need to allocate limited resources. In these situations, considerable insight can be obtained from scientific analysis, such as that provided by Operations Research. (Hillier–Lieberman)
OR is a relatively new field, which started in the late 1930s and has grown and expanded tremendously in the last 30 years.
TECHNIQUES (OR/MS)
Deterministic OR:
o Linear Programming
o Integer Programming
o Network Analysis
o Dynamic Programming
o Non-linear Programming
o …
Stochastic OR:
o Queuing Theory
o Decision Theory
o MCMC
o Markov Decision Process
o Simulation
o …
Heuristics:
o Problem Based
Meta-heuristics:
o Simulated Annealing
o Neural Network
o Genetic Algorithms
o Ant Colony Optimization
o …
4. What is Linear Programming?
A linear programming problem (LP) is a class of mathematical programming problem, a constrained optimization problem, in which:
– We maximize (or minimize) a linear function of the decision variables (the objective function).
– The values of the decision variables must satisfy a set of constraints, each of which must be a linear inequality or linear equality.
– Each variable carries a sign restriction: for each variable 𝑋𝑗, either 𝑋𝑗 ≥ 0, 𝑋𝑗 ≤ 0, or 𝑋𝑗 is unrestricted.
Maximize (or Minimize) Z = Σ_{j=1..n} c_j x_j
Subject to Σ_{j=1..n} a_ij x_j {≤, =, ≥} b_i for i = 1, 2, …, m
x_j ≥ 0 for j = 1, 2, …, n
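As a sketch of this standard form, a small LP can be solved with SciPy's `linprog`. The data below are the well-known Wyndor Glass example from the Hillier–Lieberman text quoted above, used here purely for illustration; they do not come from this deck:

```python
from scipy.optimize import linprog

# Maximize Z = 3*x1 + 5*x2
# subject to: x1 <= 4,  2*x2 <= 12,  3*x1 + 2*x2 <= 18,  x1, x2 >= 0
# linprog minimizes, so the objective coefficients are negated.
c = [-3, -5]
A_ub = [[1, 0],
        [0, 2],
        [3, 2]]
b_ub = [4, 12, 18]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(-res.fun, res.x)  # optimal Z = 36 at (x1, x2) = (2, 6)
```

Note the sign flip: because `linprog` only minimizes, a maximization problem is passed with negated objective coefficients and the optimal value is recovered as `-res.fun`.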
5. Data Envelopment Analysis (DEA)
“A performance measurement tool”
[Figure: scatter plot of the eight DMUs A–H with Employees on the x-axis and Sales on the y-axis, showing the production possibility set and the efficient frontier]
6. What is Data Envelopment Analysis (DEA)?
“A performance measurement tool”
Data Envelopment Analysis (DEA) is a linear programming methodology for measuring the efficiency of multiple decision-making units (DMUs) when the production process has a structure of multiple inputs and outputs.
EXAMPLES:
DMUs: banks, nations, schools, …
Inputs: labor, capital, fixed assets, …
Outputs: revenues, profit, …
Objective of a DEA assessment:
o Compare the performance of homogeneous decision-making units (DMUs) that use multiple inputs for the production of multiple outputs.
o The efficiency measure compares the output/input ratio of the DMU assessed with the value of this ratio observed in the other DMUs analyzed.
7. Data Envelopment Analysis (DEA)
Example: “Measuring managerial ability”
Measuring managerial ability is key to many important research questions, such as those examining managerial contributions to firm performance and investment decisions, and cross-country productivity differences.
However, it is difficult to construct a precise measure of managerial ability. The paper below uses DEA to propose a measure of managerial ability based on managers’ efficiency in generating revenues. The authors find that the measure is more suitable than a number of alternative measures of ability (historical industry-adjusted stock returns, historical industry-adjusted return on assets, chief executive officer (CEO) pay, and so on).
Demerjian P, Lev B, McVay S. “Quantifying managerial ability: A new measure and
validity tests.” Management Science, 2012, 58(7): 1229-1248.
8. Data Envelopment Analysis (DEA)
Example: “Measuring Olympics”
The problem of constructing an Olympic ranking using just the number of medals won by each national team is still unsolved, as the President of the International Olympic Committee, Jacques Rogge, noted during the 2008 Olympic Games. The following papers use DEA to establish fair models for measuring and benchmarking the performance of nations at the Olympic Games:
Lozano S, Villa G, Guerrero F, et al. Measuring the performance of nations at the Summer Olympics using data envelopment analysis. Journal of the Operational Research Society, 2002: 501-511.
Li Y, Liang L, Chen Y, Morita H. Models for measuring and benchmarking Olympics achievements. Omega, 2008, 36(6): 933-940.
Wu J, Liang L, Yang F. Achievement and benchmarking of countries at the Summer Olympics using cross efficiency evaluation method. European Journal of Operational Research, 2009, 197(2): 722-730.
Lei X, Li Y, Xie Q, Liang L. Measuring Olympics achievements based on a parallel DEA approach. Annals of Operations Research, 2014: 1-18.
9. Data Envelopment Analysis (DEA)
Example: “Allocating resources”
Resource allocation decisions are crucial for the success of an organization, but they are not an easy job. Resources include scientific funding, employees, and so on.
Athanassopoulos A D. Decision support for target-based resource allocation of public services in multiunit and multilevel systems. Management Science, 1998, 44(2): 173-187.
Korhonen P, Syrjänen M. Resource allocation based on efficiency analysis. Management Science, 2004, 50(8): 1134-1144.
Chen C M, Zhu J. Efficient resource allocation via efficiency bootstraps: an application to R&D project budgeting. Operations Research, 2011, 59(3): 729-741.
10. Data Envelopment Analysis (DEA)
Example: “Allocating fixed costs”
When a central bank invests in a common electronic trading system for its branches, the expense should be covered by the branches. How should this kind of expense be assigned equitably to the various peer subunits? The following papers use DEA to propose alternative approaches to this problem:
Cook W D, Kress M. Characterizing an equitable allocation of shared costs: A DEA approach. European Journal of Operational Research, 1999, 119(3): 652-661.
Beasley J E. Allocating fixed costs and resources via data envelopment analysis. European Journal of Operational Research, 2003, 147(1): 198-216.
Li Y, Yang F, Liang L, et al. Allocating the fixed cost as a complement of other cost inputs: A DEA approach. European Journal of Operational Research, 2009, 197(1): 389-401.
11. Why DEA?
DEA publication statistics from Science Direct
[Chart: number of DEA papers published per year, 1994–2014]
Databases including:
Science Direct (www.sciencedirect.com)
EBSCO (www.ebsco.com)
Google Scholar (http://scholar.google.com)
JSTOR (http://uk.jstor.org/)
Pro-Quest (http://proquest.umi.com)
12. Why DEA?...
Top 20 most influential journals in the DEA field
Liu J S, Lu L Y Y, Lu W M, et al. Data envelopment analysis 1978–2010: A citation-based literature survey. Omega, 2013, 41(1): 3-15.
13. What is DEA?...
The PRODUCER is usually referred to as a Decision Making Unit (DMU)
o All DMUs must exist in the same basic environment and convert the same set of inputs into the same set of outputs
It is a deterministic & non-parametric technique (i.e. it makes no assumptions about the functional form of the production frontier)
The technique is so named because it builds a frontier by enveloping all the observed input–output vectors
o The efficiency of each firm is measured by the distance of its input–output vector from the frontier
It fits a piece-wise linear frontier using a linear programming technique
It is an extreme-point method: it compares each producer only with the "best" producers
14. DEA: Measuring Performance Efficiency
Shoe Shops: Example (Single Input and Single Output Case). Suppose there are 8 branch stores, which we label A to H. Each store uses employees (input) to generate sales (output), measured in 100,000 dollars. Their “production processes” are similar, so they are comparable.
Efficiency = Output / Input
o Compared with the best store, B, the others are inefficient. We can measure the efficiency of the others relative to B.
o The worst store, F, has efficiency 0.4 / 1 = 0.4, which is 40% of B's efficiency.
o Arrange them in this order.
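This ratio calculation can be sketched in a few lines. The employee/sales pairs below are an assumption: they are the classic single-input, single-output shoe-shop data consistent with the efficiency scores this deck reports later (0.5, 1, 0.667, 0.75, 0.8, 0.4, 0.5, 0.625), not figures read from the missing table:

```python
# Hypothetical (employees, sales) data for stores A..H, chosen to match the
# relative efficiencies reported in the deck.
stores = {"A": (2, 1), "B": (3, 3), "C": (3, 2), "D": (4, 3),
          "E": (5, 4), "F": (5, 2), "G": (6, 3), "H": (8, 5)}

# Efficiency = Output / Input for each store
ratios = {k: out / inp for k, (inp, out) in stores.items()}

# Normalize by the best store (B, whose ratio is 3/3 = 1.0)
best = max(ratios.values())
efficiency = {k: r / best for k, r in ratios.items()}
print(efficiency["F"])  # 0.4, i.e. 40% of B's efficiency
```

Dividing every store's output/input ratio by the best ratio gives the relative efficiencies the deck tabulates later.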
15. DEA vs. Regression
Shoe Shops: Example (Single Input and Single Output Case) …
8 stores; sales & number of employees
Regression can accommodate multiple inputs or multiple outputs, but not both
Regression requires a functional relationship between inputs and outputs
Regression provides only average relationships, not best practice
[Figure: the 8 stores with the DEA piece-wise linear best-practice frontier versus the regression line of predicted average behavior; stores on the frontier are efficient, those below it are inefficient]
16. Why Use DEA?
Reasons:
o DEA provides an ordinal ranking of relative efficiency
compared to the Pareto-efficient frontier—the best performance
that can be practically achieved.
o DEA allows each DMU to select an optimal set of weights by
itself. (If two firms produce the same output, but do so with
different mixes of inputs, even if the dollar value of the inputs
differs, both are considered efficient.)
17. DEA: General Case
Suppose there are n DMUs:
o DMU_j, where j = 1, 2, 3, …, n (i.e. DMU_1, DMU_2, …, DMU_n)
Each DMU consumes m inputs and generates s outputs. Let the
o input data for DMU_j be (x_1j, x_2j, …, x_mj) and
o output data for DMU_j be (y_1j, y_2j, …, y_sj)
Then, for each DMU, we form a virtual input and a virtual output with (as yet unknown) weights (v_i) and (u_r):
o Virtual Input = v_1 x_10 + … + v_m x_m0 (a total cost)
o Virtual Output = u_1 y_10 + … + u_s y_s0 (a total revenue)
We then determine the optimal weights, using linear programming, so as to maximize the Ratio = Virtual Output / Virtual Input.
The optimal weights may (and generally will) vary from one DMU to another.
18. DEA: The CCR Model
Let the DMU_j to be evaluated be designated DMU_o.
We solve the following fractional programming problem to obtain values for the input "weights" (v_i: i = 1, …, m) and the output "weights" (u_r: r = 1, …, s).
These weights are decision variables.
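The fractional program itself appeared as an image in the original slides. In the notation above, the standard CCR formulation (FPo) of Charnes, Cooper and Rhodes reads:

```latex
(FP_o):\quad \max_{u,v}\ \theta \;=\; \frac{\sum_{r=1}^{s} u_r\, y_{ro}}{\sum_{i=1}^{m} v_i\, x_{io}}
\quad\text{subject to}\quad
\frac{\sum_{r=1}^{s} u_r\, y_{rj}}{\sum_{i=1}^{m} v_i\, x_{ij}} \le 1 \quad (j = 1,\dots,n),
\qquad u_r \ge 0,\; v_i \ge 0.
```

That is, DMU_o chooses its own weights to maximize its virtual-output/virtual-input ratio, subject to no DMU's ratio exceeding 1 under those same weights.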
19. DEA: The CCR Model…
The optimal solution of (LPo), denoted (v*, u*), measures the efficiency of DMU_o.
CCR-Efficiency:
i. DMU_o is CCR-efficient if θ* = 1 and there exists at least one optimal (v*, u*) with v* > 0 and u* > 0.
ii. Otherwise, DMU_o is CCR-inefficient.
The bigger a DMU's score, the more efficient it is and the higher it ranks.
The fractional program (FPo) can be converted into the following linear program (LPo).
Theorem 2.1: The fractional program (FPo) is equivalent to (LPo). (Charnes–Cooper transformation)
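The linear program was also an image in the original slides. The standard CCR multiplier form, obtained by normalizing the virtual input to 1 via the Charnes–Cooper transformation, is:

```latex
(LP_o):\quad \max_{u,v}\ \theta \;=\; \sum_{r=1}^{s} u_r\, y_{ro}
\quad\text{subject to}\quad
\sum_{i=1}^{m} v_i\, x_{io} = 1,\qquad
\sum_{r=1}^{s} u_r\, y_{rj} \;-\; \sum_{i=1}^{m} v_i\, x_{ij} \le 0 \quad (j = 1,\dots,n),
\qquad u_r \ge 0,\; v_i \ge 0.
```

Fixing the denominator of (FPo) at 1 turns the ratio objective into a linear one, and clearing the denominators in the ratio constraints yields the linear constraints above.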
20. DEA: The CCR Model…
Shoe Shops: Example (Single Input and Single Output Case). The following table shows 8 DMUs with 1 input (Employees) and 1 output (Sales) (the earlier example).
We can evaluate the efficiency of DMU A by solving the CCR linear program for A, with u, v ≥ 0.
The optimal solution, easily obtained by simple ratio calculations, is (v* = 0.5, u* = 0.5, θ* = 0.5). Thus, the CCR-efficiency of A is θ* = 0.5.
21. DEA: The CCR Model…
Shoe Shops: Example (Single Input and Single Output Case) …
Definition: Reference set
o If, when DMU A is evaluated, A's optimal solution makes DMU B's efficiency equal to 1, then B belongs to the reference set of A.
o A's optimal solution is (v* = 0.5, u* = 0.5). In this case, DMU B's efficiency is u*y / (v*x) = 0.5 × 3 / (0.5 × 3) = 1.
o Thus, the performance of B is used to characterize A and rates it as inefficient even with the best weights that the data admit for A.
The efficiency of B can be similarly evaluated from the data in the table: the optimal solution is (v* = 0.3333, u* = 0.3333, θ* = 1), so B is CCR-efficient.
Note: The optimal solution for B is different from A's.
22. DEA: The CCR Model…
Similarly, all DMUs' efficiencies can be obtained as follows:
No. DMU Score Rank Reference set
1 A 0.5 6 B
2 B 1 1 B
3 C 0.6666667 4 B
4 D 0.75 3 B
5 E 0.8 2 B
6 F 0.4 8 B
7 G 0.5 6 B
8 H 0.625 5 B
How to improve the performance of the inefficient DMU D?
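The full table of scores can be cross-checked by solving the CCR multiplier LP once per DMU with SciPy. As before, the employee/sales data are a hypothetical reconstruction consistent with the scores reported above, not figures read from the deck:

```python
from scipy.optimize import linprog

# Assumed single-input/single-output data for shops A..H, consistent with
# the score table above (hypothetical reconstruction).
x = [2, 3, 3, 4, 5, 5, 6, 8]   # input: employees
y = [1, 3, 2, 3, 4, 2, 3, 5]   # output: sales

def ccr_score(o):
    """CCR multiplier LP for DMU o: max u*y_o s.t. v*x_o = 1, u*y_j - v*x_j <= 0."""
    c = [0.0, -y[o]]                              # variables [v, u]; minimize -u*y_o
    A_ub = [[-xj, yj] for xj, yj in zip(x, y)]    # u*y_j - v*x_j <= 0 for every DMU j
    b_ub = [0.0] * len(x)
    A_eq = [[x[o], 0.0]]                          # normalization: v*x_o = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None), (0, None)])
    return -res.fun

scores = [ccr_score(o) for o in range(8)]
print([round(s, 4) for s in scores])  # A..H: [0.5, 1.0, 0.6667, 0.75, 0.8, 0.4, 0.5, 0.625]
```

Each DMU solves its own LP with its own optimal weights, which is why the optimal (v*, u*) for B differ from A's even though both problems share the same constraint set.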
24. DEA: Model Selection
DEA Models: CCR, BCC, FG, ST
o The main difference among these four models lies in their different production possibility sets and the corresponding efficient frontiers.
o The production possibility set encodes an assumption about the production functions of the DMUs' inputs and outputs, defined as P(x, y) = {(x, y) | x can produce y}. It represents the feasible production area.
o The efficient frontier is the outer boundary of the production possibility set.
Different models can produce quite different evaluation results, so it is important to select an appropriate model before using DEA. Which model to select depends on the returns-to-scale assumption about the true production function.
[Figure: the eight DMUs A–H (Employees vs. Sales) with the CCR, BCC, FG and ST efficient frontiers]
CCR: Constant returns-to-scale assumption
BCC: Variable returns-to-scale assumption
ST: Increasing returns-to-scale assumption
FG: Decreasing returns-to-scale assumption
25. How to Solve DEA Models: Software
Matlab
– (www.mathworks.cn/products/matlab/)
Efficiency Measurement System (EMS): developed under the supervision of Prof. Emmanuel Thanassoulis and Ali Emrouznejad.
– (http://www.wiso.uni-dortmund.de/lsfg/or/scheel/ems/)
DEA-Solver: Excel-based software
– (www.deafrontier.net/joezhu/)
26. Data Envelopment Analysis (DEA)
“GOOD REVIEWS”
Seiford L M. A bibliography for data envelopment analysis (1978-1996). Annals
of Operations Research, 1997, 73: 393-438.
Cook W D, Seiford L M. Data envelopment analysis (DEA)–Thirty years on.
European Journal of Operational Research, 2009, 192(1): 1-17.
Dyson R G, Allen R, Camanho A S, et al. Pitfalls and protocols in DEA.
European Journal of Operational Research, 2001, 132(2): 245-259.
The last review is particularly important for DEA researchers, as it explains some key unsolved problems in the DEA field.