This document provides an overview of Optimization Direct, an IBM business partner that specializes in optimization software and consulting. It discusses Optimization Direct's experience implementing optimization technology for various industries. The document also summarizes Optimization Direct's product offerings, which focus on IBM ILOG CPLEX Optimization Studio. It then highlights several recent case studies where Optimization Direct helped customers solve scheduling, resource allocation, and pricing problems using analytics and optimization modeling approaches like MIP and heuristic algorithms. Finally, it shares an example of how Optimization Direct helped a retail client optimize markdown pricing and promotions to improve sales and margins.
CPLEX Optimization Studio solves large-scale optimization problems, enabling better business decisions and financial benefits in areas such as supply chain management, operations, healthcare, retail, transportation, logistics and asset management. It has been applied in sectors as diverse as manufacturing, processing, distribution, retailing, transport, finance and investment. CPLEX Optimization Studio is an analytical decision-support toolkit for rapid development and deployment of optimization models using mathematical and constraint programming. It combines an integrated development environment (IDE) with the powerful Optimization Programming Language (OPL) and high-performance ILOG CPLEX optimizer solvers. CPLEX Optimization Studio enables clients to: optimize business decisions with high-performance optimization engines; develop and deploy optimization models quickly using flexible interfaces and prebuilt deployment scenarios; and create real-world applications that can significantly improve business outcomes. Optimization Direct has partnered with IBM and entered into a technology licensing and distribution agreement. Combining the founders' industry and software experience with IBM's CPLEX Optimization Studio and its arsenal of optimization modeling and solving tools provides customers with the most powerful capabilities in the industry.
Literally, Kanban is a Japanese word that means "visual card". At Toyota, Kanban is the term used for the visual and physical signaling system that ties together the whole Lean Production system. Kanban as used in Lean Production is over half a century old, and it is now being adopted in newer disciplines such as software development.
If you are new to Kanban, this presentation is for you. It briefly covers lean principles, the elements of the Kanban Method, and the difference between Kanban with a capital K and kanban with a small k.
Agenda
Kanban Self Assessment
Lean Principles
What is Kanban?
Motivation to use Kanban
Elements of the Kanban Method
Kanban Practices
Kanban and Scrum
iEnable - the talking QMS from Adaptive Processes (LN Mishra, CBAP)
iEnable - the talking QMS from Adaptive Processes
World’s first audio- and video-based Business Management System (BMS), iEnable overcomes the shortcomings of traditional QMSs
iEnable explains critical templates through audio and video demonstrations
Flash-based interface
Integrated system for Agile, CMMI, ISO 9001, ISO 27001, BABoK
Implementing Kanban to Improve your Workflow (Jennifer Davis)
Tutorial from LOPSA East
System, network, and security senior engineers manage intricate relationships ensuring that everything from simple tasks to complex projects gets completed in a timely manner. In this workshop, we will talk about using agile processes to identify, visualize, and improve work.
Outline:
Overview of the kanban process. What is kanban?
Identify common problems.
Define common terminology explicitly.
Work through common problems as a group using kanban.
Identify metrics for improvement.
Review, next steps, additional resources.
At the end of this tutorial, attendees will have a solid understanding of kanban and agile processes to take back to their environments.
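As a concrete example of the "identify metrics for improvement" step, here is a minimal sketch computing two common kanban metrics, average lead time and throughput, from ticket dates; the data is made up for illustration.

```python
from datetime import date

# Hypothetical completed tickets: (start date, done date).
tickets = [
    (date(2024, 1, 1), date(2024, 1, 4)),
    (date(2024, 1, 2), date(2024, 1, 8)),
    (date(2024, 1, 5), date(2024, 1, 7)),
]

# Lead time: calendar days from start to done, per ticket.
lead_times = [(done - start).days for start, done in tickets]
avg_lead_time = sum(lead_times) / len(lead_times)

# Throughput: tickets completed per day over the observed window.
window_days = (max(d for _, d in tickets) - min(s for s, _ in tickets)).days
throughput = len(tickets) / window_days

print(round(avg_lead_time, 2))   # 3.67
print(round(throughput, 2))      # 0.43
```

Tracking these two numbers over time is usually enough to see whether a WIP-limit change is actually improving flow.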
Improved Swing-Cut Modeling for Planning and Scheduling of Oil-Refinery Disti... (Alkis Vazacopoulos)
Nonlinear planning and scheduling models for crude-oil atmospheric and vacuum distillation units are essential to manage the increased complexity and narrow margins present in the petroleum industry. Traditionally, conventional swing-cut modeling is based on fixed yields with fixed properties for the hypothetical cuts that swing between adjacent light and heavy distillates, which can lead to inaccuracies in the predictions of both quantity and quality. A new extension is proposed to better predict quantities and qualities for the distilled products by recognizing that corresponding light and heavy swing-cuts with appropriately varying qualities are required. By computing interpolated qualities relative to the light and heavy swing-cut quantities, we can show an improvement in the accuracy of the blended or pooled quality predictions. Additional nonlinear variables and constraints are necessary in the model, but it is shown that these are relatively easy to deal with in the nonlinear optimization.
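A simplified rendering of the interpolation idea (not the paper's exact formulation): the swing-cut quality is interpolated between the adjacent light and heavy distillate qualities in proportion to how the swing quantity is split; all numbers here are hypothetical.

```python
def swing_quality(q_light, q_heavy, swing_to_light, swing_total):
    """Linearly interpolate the swing-cut quality between the adjacent
    light and heavy distillate qualities, in proportion to the share of
    the swing quantity assigned to the light cut."""
    f = swing_to_light / swing_total
    return f * q_light + (1.0 - f) * q_heavy

# Hypothetical cut qualities (e.g., specific gravity) and a 60/40 split.
print(round(swing_quality(0.72, 0.84, 60.0, 100.0), 3))  # 0.768
```

In the full model this interpolated quality becomes a nonlinear (bilinear) term, since both the split fraction and the pooled quality are decision variables.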
Presented in this short document is a description of how to model and solve multi-utility scheduling optimization (MUSO) problems in IMPL. Multi-utility systems (co/tri-generation) are typically found in petroleum refineries and petrochemical plants (multi-commodity systems) especially when fuel-gas (i.e., off-gases of methane and ethane) is a co- or by-product of the production from which multi-pressure heating-, motive- and process-steam are generated on-site. Other utilities include hydrogen, electricity, water, cooling media, air, nitrogen, chemicals, etc. where a multi-utility system is shown in Figure 1 with an intermediate or integrated utility (both produced and consumed) such as fuel-gas, steam or electricity. Itemized benefit areas just for better management of an integrated steam network can be found in Pelham (2013) where his sample multi-pressure steam utility flowsheet is found in Figure 2.
Conditional interval variables: A powerful concept for modeling and solving c... (Philippe Laborie)
Scheduling is not only about deciding when to schedule a predefined set of activities. Most of real-world scheduling problems also involve selecting a subset of activities (oversubscribed problems) and a particular way to execute them (resource or mode allocation, alternative recipes, preemptive activity splitting, etc.). We present the notion of conditional interval variable in the context of Constraint Programming and show how this concept can be leveraged to model and solve complex scheduling problems involving both temporal and non-temporal decisions.
This slide deck was presented at the 21st International Symposium on Mathematical Programming (ISMP 2012).
Philippe Laborie
Modeling and Solving Resource-Constrained Project Scheduling Problems with IB... (Philippe Laborie)
Since version 2.0, IBM ILOG CP Optimizer provides a new scheduling language supported by a robust and efficient automatic search. We show how the main features of resource-constrained project scheduling such as work-breakdown structures, optional tasks, different types of resources, multiple modes and skills, resource calendars and objective functions such as earliness/tardiness, unperformed tasks or resource costs can be modeled in CP Optimizer. The robustness of the automatic search will be illustrated on some classical resource-constrained project scheduling benchmarks.
This slide deck was presented at EURO 2009 conference (http://www.euro-2009.de/).
Philippe Laborie
This presentation introduces CP Optimizer, a model-and-run optimization engine for solving discrete combinatorial problems, with a particular focus on scheduling problems.
Modeling and Solving Scheduling Problems with CP Optimizer (Philippe Laborie)
This presentation focuses on using CP Optimizer to address scheduling problems. We will initially cover modeling concepts related to scheduling in CP Optimizer. Using examples, we will then provide details on tools, functionality and tips for speeding up the development of your scheduling models and improving their efficiency.
Solving Large Scale Optimization Problems using CPLEX Optimization Studio (optimizatiodirectdirect)
Recent advancements in Linear and Mixed-Integer Programming give us the capability to solve larger optimization problems. In this talk, using CPLEX Optimization Studio, we will discuss modeling practices and case studies and demonstrate good practices for solving hard optimization problems. We will also discuss recent CPLEX performance improvements and recently added features.
Flowsheet Decomposition Heuristic for Scheduling: A Relax & Fix Method (Alkis Vazacopoulos)
Decomposing large problems into several smaller subproblems is well-known in any problem-solving endeavor and forms the basis for our flowsheet decomposition heuristic (FDH) described in this short note. It can be used as an effective strategy to decrease the time necessary to find good integer-feasible solutions when solving closed-shop scheduling problems found in the process industries. The technique is to appropriately assign each piece of equipment (i.e., process-units and storage-vessels) into groups and then to sequence these groups according to the material-flow-path of the production network, following the engineering structure of the problem. As many mixed-integer linear programming (MILP) problems are solved as there are groups, in a pre-specified order, fixing the binary variables after each MILP and proceeding to the next. In each MILP, only the binary variables associated with the current group are explicit search variables; the binary variables not yet searched (those of the next in-line equipment) are relaxed. Three examples are detailed which establish the effectiveness of this relax-and-fix type heuristic.
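A minimal sketch of the relax-and-fix pattern described above, with brute-force enumeration standing in for the MILP solver; the function names and toy objective are illustrative and not IMPL's API.

```python
from itertools import product

def solve_subproblem(obj, n, fixed, group):
    """Stand-in for one restricted MILP solve: enumerate 0/1 values for the
    variables in `group`, keep already-fixed variables at their values, and
    relax the remaining (not-yet-searched) binaries to 0.5."""
    best_val, best_assign = float("inf"), None
    for bits in product([0, 1], repeat=len(group)):
        x = [0.5] * n                      # relaxed value for unsearched binaries
        for i, v in fixed.items():
            x[i] = v
        for i, b in zip(group, bits):
            x[i] = b
        val = obj(x)
        if val < best_val:
            best_val, best_assign = val, dict(zip(group, bits))
    return best_assign

def relax_and_fix(obj, n, groups):
    """Solve one restricted problem per group, in order, fixing each group's
    binaries before moving to the next group."""
    fixed = {}
    for group in groups:
        fixed.update(solve_subproblem(obj, n, fixed, group))
    return [fixed[i] for i in range(n)]

# Toy objective: penalize deviation from a hidden target pattern.
target = [1, 0, 1, 1, 0, 0]
obj = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))

# Two equipment groups sequenced along the material-flow path.
x = relax_and_fix(obj, 6, [[0, 1, 2], [3, 4, 5]])
print(x)  # [1, 0, 1, 1, 0, 0]
```

In a real FDH the enumeration is replaced by a MILP solver call, and the group ordering follows the flowsheet's material-flow path rather than an arbitrary split.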
Addressing Uncertainty: How to Model and Solve Energy Optimization Problems (optimizatiodirectdirect)
During the past twenty years, IBM's CPLEX Optimization Studio has been used extensively to solve hard energy optimization problems. CPLEX Optimization Studio's modeling capabilities, fast solvers and easy deployment features empower users to deploy energy applications quickly and reliably. Now, with a newly updated superset offering known as Decision Optimization Center, the capabilities for this class of problems are greatly enhanced. Using the Unit Commitment Problem as a paradigm, we will demonstrate the advantages of using Decision Optimization Center and CPLEX for modeling complex problems and application development in the energy field.
Most importantly, we will introduce new research for solving problems under uncertainty. We will showcase a new approach that allows the user to test and deploy stochastic, robust or deterministic models all on the same platform, so you can truly understand your options as you explore the best plan for your business and develop a robust and repeatable process.
The usual performance metrics that most Paid Search professionals optimise bids against are CPA and ROAS. I explore other metrics that could be used and that would still drive better performance than one usually sees, looking at ways not just to drive revenue growth but also to improve cost efficiency.
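For reference, the two baseline metrics mentioned can be sketched as follows; the campaign figures are hypothetical.

```python
def cpa(cost, conversions):
    """Cost per acquisition: spend divided by conversions."""
    return cost / conversions

def roas(revenue, cost):
    """Return on ad spend: revenue generated per unit of spend."""
    return revenue / cost

# Hypothetical campaign figures.
spend, conversions, revenue = 500.0, 25, 2000.0
print(cpa(spend, conversions))   # 20.0
print(roas(revenue, spend))      # 4.0
```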
Canary Analyze All The Things: How We Learned to Keep Calm and Release Often (C4Media)
Video and slides synchronized; mp3 and slide download available at http://bit.ly/1ph8Rq1.
Roy Rapoport discusses canary analysis deployment and observability patterns he believes are generally useful, and talks about the difference between manual and automated canary analysis. Filmed at qconnewyork.com.
Roy Rapoport manages the Insight Engineering group at Netflix, responsible for building Netflix's Operational Insight platforms, including cloud telemetry, alerting, and real-time analytics. He originally joined Netflix as part of its datacenter-based IT/Ops group, and prior to transferring over to Product Engineering, was managing Service Delivery for IT/Ops.
Rethinking an organization in an Agile manner is a challenge that affects every organizational aspect and is surrounded by risks that must be appropriately managed.
Beyond the methodologies and frameworks used, the goal is always to develop a mindset that allows the organization to "stand on its own feet" and embrace antifragility.
In this talk we describe a concrete transformation experience at a company in the medical sector, with its operational office in Italy, and how it was completely revolutionized. We will talk about the successful changes and the less fortunate experiments, and how the company developed its Way of Working (WoW) in an agile manner, even going so far as to reorganize its internal physical spaces. We will also look at how aspects of the Program were developed: from the Portfolio to the Risk Management System, up to the revision of the Quality procedures.
PECB Webinar: Achieve business excellence through the power of Six Sigma (PECB)
We will cover:
• Why every company needs Six Sigma implementation
• How processes are improved by using Six Sigma
• Real benefits in profits, and reductions in defects
Presenter:
This webinar will be presented by M.Youssef.K, Executive Consultant & Trainer at Six Sigma Associates - SSA.
Do you have an OEE calculator? TBM Operations consultants share their framework for demonstrating process improvements in financial terms so you can convince senior management that OEE improvement should be a top priority in 2022.
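OEE is conventionally computed as Availability x Performance x Quality; here is a minimal sketch of that textbook formula. The figures are hypothetical, and this is not TBM's framework.

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness: product of the three standard
    factors, each expressed as a fraction between 0 and 1."""
    return availability * performance * quality

# Hypothetical shift: 90% uptime, 95% of ideal rate, 98% good parts.
print(round(oee(0.90, 0.95, 0.98), 4))  # 0.8379
```

Because the three factors multiply, a seemingly healthy 90/95/98 line still loses about 16% of its theoretical output, which is the kind of gap such calculators translate into financial terms.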
This is the story of a fictional company, but one most likely very similar to yours, so I am sure you will recognize yourself and your company.
The story is about how IT is affected by company policies, and about what we can do in IT to mitigate those policies with new technologies and DevOps.
Form your future.
Presenter's notes have been added as lines in the presentation.
Manufacturing's Holy Grail: A Practical Science for Executives and Managers (UBMCanon)
Mark Spearman, President and CEO, Factory Physics
In this session we will discuss:
- Manufacturing Myths that Muddle Management:
- Bottlenecks and non-bottlenecks: meeting demand
- One-Piece Flow: what is the real cost?
- ABC Inventory Policies: how low can you go?
And many more!
Mark L. Spearman is President and CEO of Factory Physics, Inc., a firm that provides management consulting, training, and software to improve manufacturing and supply chain management. In his former life as an academic, he was Head of the Department of Industrial and Systems Engineering at Texas A&M University and also a professor at Georgia Tech and Northwestern University. He is coauthor, with Wallace J. Hopp, of the book "Factory Physics", which was named the IIE Book of the Year. Over the last twenty-five years he has helped more than one hundred companies apply the principles of factory physics to improve operations by increasing productivity, reducing cycle times and inventories, and developing integrated supply-chain approaches that are both simple and effective.
CPLEX Optimization Studio, Modeling, Theory, Best Practices and Case Studies (optimizatiodirectdirect)
Recent advancements in Linear and Mixed-Integer Programming give us the capability to solve larger optimization problems. CPLEX Optimization Studio solves large-scale optimization problems and enables better business decisions and resulting financial benefits in areas such as supply chain management, operations, healthcare, retail, transportation, logistics and asset management. In this workshop, using CPLEX Optimization Studio, we will discuss modeling practices and case studies and demonstrate good practices for solving hard optimization problems. We will also discuss recent CPLEX performance improvements and recently added features.
We tested ODH|CPLEX 4.24 on the MIPLIB Open-v7 models, a public collection of 286 models for which an optimal solution has not been proven; 257 of these are known to have a feasible solution.
ODH|CPLEX proved optimality on 6 models and, within 2 hours, found better solutions for 40% of the models with 12 threads and 35% with 8 threads. ODH|CPLEX matched the best known solution on 21% of the models.
Missing-Value Handling in Dynamic Model Estimation using IMPL (Alkis Vazacopoulos)
Presented in this short document is a description of how IMPL handles missing-values or missing-data when estimating dynamic models which inherently involve time-lagged or time-shifted input and output variables. Missing-values in a data set imply that for some reason the data is not available, most likely due to a malfunctioning instrument or even lack of proper accounting. Missing-data handling is relatively well-studied especially for time-series or dynamic data, given that it is not as easy as removing, ignoring or deleting bad sections of data when static or steady-state models are calibrated (Honaker and King, 2010; Smits and Baggelaar, 2010; Fisher and Waclawski, 2015). Unfortunately, all of their methods involve what is known as "imputation", i.e., replacing or substituting missing-data with some reasonably assumed value, which is at the very least a biased estimate. When regression techniques such as PLS and PCR are used (Nelson et al., 2006), missing-data can be handled without imputation by computing the input-output covariance matrices excluding the contribution from the missing-values, given the temporal and structural redundancy in the system. However, it is shown in Dayal (1996) that using PLS and other types of regression techniques such as Canonical Correlation Regression (CCR) and Reduced Rank Regression (RRR) to fit non-parsimonious and non-parametric finite impulse/step response models (FIR/FSR) is not as reliable as fitting lower-ordered transfer functions, especially considering the robust stability of the resulting model predictive controller if that is its intended use.
Finite Impulse Response Estimation of Gas Furnace Data in IMPL Industrial Mod... (Alkis Vazacopoulos)
Presented in this short document is a description of how to estimate deterministic and stochastic non-parametric finite impulse response (FIR) models in IMPL, applied to industrial gas furnace data identical to that found in TSE-GFD-IMF using parametric transfer-functions. The methodology of time-series analysis or system identification involves essentially three (3) stages (Box and Jenkins, 1976): (1) model structure identification, (2) model parameter estimation and (3) model checking and diagnostics. We do not address (1), which requires stationarity and seasonality assessment/adjustment, auto-, cross- and partial-correlation, etc. to establish the parametric transfer function polynomial degrees, especially since we are using non-parametric FIR estimation. Instead we focus only on the parameter estimation and diagnostics. These types of parameter estimation problems involve dynamic and nonlinear relationships shown below, and we solve them using IMPL's Sequential Equality-Constrained QP Engine (SECQPE) and Supplemental Observability, Redundancy and Variability Estimator (SORVE). Another type of non-parametric identification, known as Subspace Identification (Qin, 2006), can be used to estimate state-space models.
Our Industrial Modeling Service (IMS) involves several important (but rarely implemented) methods to significantly improve and advance your existing models and data. Since it is well-known that good decision-making requires good models and data, IMS is ideally suited to support this continuous-improvement endeavour. IMS is specifically designed either to co-exist with your existing design, planning, scheduling, etc. applications, or these same models and data can be used seamlessly within our Industrial Modeling and Programming Language (IMPL) to create new value-added applications. The following techniques form the basis of our IMS offering.
This short note describes a relatively simple methodology, procedure or approach to increase the performance of already installed industrial models used for optimization, control, simulation and/or monitoring purposes. The method is called Excess or X-Model Regression (XMR) where the concept of “excess modeling” or an X-model is taken from the field of thermodynamics to describe the departure or residual behaviour of real (non-ideal) gases and liquids from their ideal state (Kyle, 1999; Poling et. al., 2001; Smith et. al., 2001). It has also been applied to model the non-ideal or nonlinear behaviour of blending motor gasoline octanes with its synergistic and antagonistic interactional effects (Muller, 1992).
The fundamental idea of XMR is to calibrate, train, fit or estimate, using actual data and multiple linear regression (MLR) or ordinary least squares (OLS), the deviations of the measured responses from the existing model responses. The existing model may be a glass, grey or black-box model (known or unknown, linear or nonlinear, implicit/open or explicit/closed) depending on the use of the model. That is, for optimization and control the model structure and parameters are available given that derivative information is required although for simulation and monitoring, the model may only be observed through the dependent output variables given the necessary independent input variables.
Advanced Parameter Estimation (APE) for Motor Gasoline Blending (MGB) Indust... (Alkis Vazacopoulos)
Presented in this short document is a description of how to model and solve advanced parameter estimation (APE) problems in IMPL. APE is the term given to the application of estimating, fitting or calibrating parameters in models involving a network, topology, superstructure or flowsheet. When estimating parameters with multiple linear regression (MLR), ordinary least squares (OLS), ridge regression (RR), principal component regression (PCR) and partial least squares (PLS) there is no explicit model but simply an X-block and Y-block of data. Hence, these methods are referred to as “non-parametric” or “data-based” methods as opposed to the “parametric” or “model-based” method used here. To solve these types of problems we use what is commonly referred to as “error-in-variables” (EIV) regression which is conveniently implemented as nonlinear data reconciliation and regression (NDRR) using the technology found in Kelly (1998a; 1998b; 1999) and Kelly and Zyngier (2008a). The primary benefit of using EIV (NDRR) over the other regression methods is that we can easily handle the inclusion of conservation laws and constitutive relations, explicitly, a must for any industrial estimation problem (IEP).
Presented in this short document is a description of modeling and solving partial differential equations (PDE’s) in both the temporal and spatial dimensions using IMPL. The sample PDE problem is taken from Cutlip and Shacham (1999 and 2014) and models the process of unsteady-state heat transfer or conduction in a one dimensional (1D) slab with one face insulated and constant thermal conductivity as discussed by Geankoplis (1993).
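An explicit finite-difference scheme is one simple way to march such a 1D conduction problem forward in time; the following is a minimal sketch under illustrative parameters, not IMPL's or Cutlip and Shacham's exact setup.

```python
# Explicit finite-difference sketch of 1D unsteady heat conduction:
# dT/dt = alpha * d2T/dx2, one face held at 100, the other face insulated.
alpha, dx, dt = 1.0e-5, 0.01, 2.0          # diffusivity, grid step, time step
r = alpha * dt / dx**2                     # must be <= 0.5 for stability
n = 11
T = [0.0] * n                              # initial slab temperature
T[0] = 100.0                               # constant-temperature face

for _ in range(5000):                      # march forward in time
    Tn = T[:]
    for i in range(1, n - 1):
        Tn[i] = T[i] + r * (T[i+1] - 2*T[i] + T[i-1])
    Tn[-1] = Tn[-2]                        # insulated face: zero gradient
    T = Tn

print(all(T[i] >= T[i+1] for i in range(n - 1)))  # True: monotone profile
```

The stability ratio `r` being at most 0.5 is the standard constraint for the explicit scheme; implicit discretizations, as typically used in equation-oriented tools, remove that restriction at the cost of solving a linear system per step.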
Presented in this short document is a description of what is well-known as Advanced Process Control (APC) applied to a small linear three (3) manipulated variable (MV) by two (2) controlled variable (CV) problem. These problems are also known as Model Predictive Control (MPC) (Grimm et. al., 1989) and Moving Horizon Control (MHC). Figure 1 shows the 3 x 2 APC problem configured in our unit-operation-port-state superstructure (UOPSS) (Kelly, 2004, 2005; Zyngier and Kelly, 2012) as an Advanced Planning and Scheduling (APS) problem as opposed to a traditional APC problem.
Although there is a tremendous amount of stability, performance and robustness theory associated with APC that can be directly applied to APS problems (Mastragostino et al., 2014), our approach is to show that APC can equally be set into an APS framework, except that APS has far less sensitivity technology due to its inherent discrete and nonlinear modeling complexities, i.e., especially non-convexities. In order to eliminate the steady-state offset between the actual value and its target, it is well-known to apply bias-updating, though other forms of "parameter-feedback" are possible. Typically, APS applications only employ "variable-feedback", i.e., opening or initial inventories, properties, etc., but this alone will not alleviate the steady-state offset, as demonstrated by Kelly and Zyngier (2008).
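The bias-updating idea can be sketched in a few lines: feed the latest measured-minus-predicted error back into an additive bias term so the steady-state offset vanishes. The gains here are hypothetical and this is not the Kelly and Zyngier formulation.

```python
# Bias updating: after each cycle, shift the model prediction by the latest
# measured-minus-predicted error to drive the steady-state offset to zero.
model = lambda u: 1.8 * u                  # plant model with gain mismatch
plant = lambda u: 2.0 * u                  # "true" process (unknown gain)

u, bias = 5.0, 0.0
for _ in range(3):                         # repeated control/measurement cycles
    predicted = model(u) + bias
    measured  = plant(u)
    bias += measured - predicted           # parameter-feedback on the bias term

print(model(u) + bias)                     # 10.0: offset eliminated at this u
```

Note the bias only corrects the offset at the current operating point; if the input moves, the gain mismatch reappears until the next update, which is why bias-updating is feedback rather than re-identification.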
Presented in this short document is a description of our three separate techniques to analyze the data by checking, clustering and componentizing it before it is used by other IMPL’s routines especially in on-line/real-time decision-making applications. We also have other data consistency or analysis techniques which have been described in other IMPL documents and these relate to the application of data reconciliation and regression with diagnostics but require an explicit model (model-based) whereas the techniques below do not i.e., they are data-based techniques.
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. for more details visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...John Andrews
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
2. Agenda
• Introduction, Recent Optimization Case Studies – Alkis Vazacopoulos, Optimization Direct
• Recent advances and future direction(s) in IBM ILOG CPLEX Optimization Studio – Xavier Nodet, IBM
• Analyzing uncertainty and optimizing: a case study in retail – Robert Ashford, Optimization Direct
• Solving Planning and Scheduling with CPLEX – Filippo Focacci, DecisionBrain
3. Optimization Direct
• IBM Business Partner
• More than 30 years of experience in developing and selling optimization software
• Experience in implementing optimization technology across all verticals
• Sold to end users, including Fortune 500 companies
• Train our customers to get the maximum out of the IBM software
• Help customers get a kick start and get the maximum from the software right from the start
4. What software do we sell?
• IBM ILOG CPLEX Optimization Studio
• CPLEX is the leader in optimization technology
• CPLEX can handle large-scale problems and solve them very fast
6. Why IBM? Why CPLEX?
• Fast
• Reliable
• IBM software
• Large scale
• Gives you the ability to model, develop and solve your decision problem
• Complete solution
7. What types of problems?
• Price & revenue optimization (travel industry, etc.)
• Retail – optimization of campaigns
• Financial: trading, portfolio optimization
• Process industries: schedule your refinery
• Big Data: We see new innovations in the human/machine interface and in how operations research experts solve complicated data mining problems
8. How can we help?
• Benchmark your problems
• Help you with next steps for developing your solution!
• Develop optimization prototypes using OPL
9. Why Optimization Direct?
• Experience
• Responsive
• Benchmark faster against competition
• Expertise
• 15 years of experience competing with CPLEX
• Understand the differentiators
• Know how to sell against competitors
10. Recent Analytics & Optimization Case
Studies
• Hospital (OPL MODEL + MIP)
• DNA Screening Company (MIP + CP)
• Workforce scheduling Problem (CPLEX + ODH)
• Sports (MIP, MIP + Local Search, Regression)
• Customized Offers Company (Analytics + MIP)
• Packaging and Fulfillment (MIP, MIP+CP)
• Pharma Co (Analytics, Robust Opt, MIP)
• Energy Co (MIP, extend to Stochastic MIP)
• Financial company (Complex QCPs, MIP)
• Retail Clothing (Analytics, MIP)
11. Hospital Scheduling (non-emergency units)
• Patients
• Block is a combination of room/day/week
• Surgeons, nurses and doctors
• PROBLEM A: COMPOSITION & ASSIGNMENTS: Create teams of doctors + staff to assign to patients
• PROBLEM B: BLOCK ASSIGNMENT: Assign Patients to
Block
12. Hospital Scheduling
• PROBLEM A: Starting to get attention lately in the health analytics area; experiments to determine whether an optimal team composition is beneficial
• PROBLEM B: Complex rules and a complex objective function
• Objective function: variation between the number of patients in hospital, the number waiting for surgery, and similar objectives
• Many experiments at a large hospital
• Model developed in OPL and solved as a MIP with CPLEX
13. DNA Screening - Scheduling problems
• New Innovative DNA Screening Companies
• Goal: Make custom-built robots to turn blood and saliva
samples into purified DNA.
• Samples: These samples come from men and women
across the globe.
• DNA Sample and Robots: The robots can analyze
thousands of DNA samples at the same time, and can
work nonstop seven days a week.
14. DNA Screening Problem
• This is a flowshop scheduling problem with many side constraints
• Challenge: Increase utilization of the robots – decrease idle time
• Solver: Constraint programming & MIP combination
• Time horizon: Determine daily sequences easily and develop a rolling-horizon schedule
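The rolling-horizon idea can be sketched in a few lines of Python. This is only an illustration: the job data and the greedy one-day "sub-solver" are invented stand-ins for the CP/MIP sub-models used on the real DNA screening problem.

```python
# Rolling-horizon sketch: solve one day at a time, fix that day's decisions,
# then roll forward to the next day. The job data and the greedy one-day
# "sub-solver" below are invented stand-ins for the real CP/MIP sub-models.

def solve_day(jobs, capacity):
    """Pick jobs for a single day, longest-first, until capacity runs out."""
    chosen, used = [], 0
    for job in sorted(jobs, key=lambda j: -j["hours"]):
        if used + job["hours"] <= capacity:
            chosen.append(job)
            used += job["hours"]
    return chosen

def rolling_horizon(jobs, capacity_per_day):
    """Fix each day's sequence in turn until every job is scheduled."""
    schedule, remaining, day = {}, list(jobs), 0
    while remaining:
        day += 1
        today = solve_day(remaining, capacity_per_day)
        if not today:
            raise ValueError("a job exceeds the daily capacity")
        schedule[day] = [j["id"] for j in today]
        fixed = {j["id"] for j in today}
        remaining = [j for j in remaining if j["id"] not in fixed]
    return schedule

jobs = [{"id": i, "hours": h} for i, h in enumerate([5, 3, 7, 2, 6, 4])]
plan = rolling_horizon(jobs, capacity_per_day=8)
print(plan)
```

The point of the pattern is that each daily sub-problem stays small enough to solve well, at the cost of giving up a provably optimal whole-horizon schedule.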
16. Workforce Scheduling Example: Large-Scale Scheduling Models
• Schedule entities over 64 periods
• No usable (say within 30% gap) solution to the small model after 3 days' run time on fast desktop hardware (Intel i7-4790K "Devil's Canyon")
Model    Entities  Rows     Columns  Integers
Small    314       389560   94200    94200
Medium   406       371964   149132   149132
Large    508       554902   426390   426390
17. Solution: ODHeuristics
• Uses CPLEX as a solver
• Solves sequence of sub-models
• Delivers usable solutions (12%-16% gap)
• Takes 4-36 hours run time
• Multiple instances can be run concurrently with
different seeds
• Can run on only one core
• Can interrupt at any point and take the best solution so far (time limit / callback / SIGINT)
18. Heuristic Results on Scheduling Models
Model    Seed   Solution  Time (s)  Gap
Small    1234   118       65818     15.2%
Small    5678   118       41122     15.2%
Small    9012   117       38243     14.5%
Small    21098  117       27623     14.5%
Medium   1234   703.32    100000
Medium   5678   728.64    100000
Medium   9012   718.23    100000
Medium   21098  832.43    100000
Large    1234   1039.67   60000
Large    5678   1039.47   60000
Large    9012   1039.43   60000
Large    21098  1044.09   60000
Best bound of 100 established by a separate CPLEX run.
Times are in seconds on Intel i7-4790K @ 4.4GHz (1 core).
19. Small Model Heuristic Behavior
[Chart: solution value (100–300) vs. time in seconds (0–30000) for seeds 1234, 5678, 9012, 21098]
20. Medium Model Heuristic Behavior
[Chart: solution value (500–1900) vs. time in seconds (0–160000) for seeds 1234, 5678, 9012, 21098]
21. Large Model Heuristic Behavior
[Chart: solution value (1020–1180) vs. time in seconds (0–70000) for seeds 1234, 5678, 9012, 21098]
22. Parallel Heuristic Approach
• Run several heuristic threads with different seeds
simultaneously
• CPLEX callable library is very flexible, so we can:
• Exchange solution information between runs
• Kill sub-model solves when a better solution has been found elsewhere
• Improve sub-model selection
• 4 instances run on 4 core i7-4790K
• Each heuristic thread runs with a single CPLEX thread, i.e. 1 core each
• Compare with serial runs using a single CPLEX thread
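The multi-seed idea behind these runs can be sketched as follows: run the same randomized heuristic under several seeds and keep the best solution found. The knapsack instance and random-restart heuristic here are invented for illustration (and run sequentially for clarity); they are not ODHeuristics itself, where each seed would occupy its own core.

```python
import random

# Multi-seed heuristic sketch: run the same randomized search under several
# seeds and keep the best result found, mimicking (in miniature) the parallel
# ODHeuristics runs described above. The knapsack instance and random-restart
# heuristic are invented for illustration; they are not ODHeuristics itself.

ITEMS = [(10, 5), (7, 3), (4, 2), (9, 6), (6, 4)]  # (value, weight) pairs
CAPACITY = 10

def random_solution(rng):
    """Greedily fill the knapsack in a random item order."""
    order = list(range(len(ITEMS)))
    rng.shuffle(order)
    picked, weight = [], 0
    for i in order:
        value_i, weight_i = ITEMS[i]
        if weight + weight_i <= CAPACITY:
            picked.append(i)
            weight += weight_i
    return picked

def value(picked):
    return sum(ITEMS[i][0] for i in picked)

def heuristic(seed, restarts=50):
    """Random-restart search; each seed explores a different trajectory."""
    rng = random.Random(seed)
    return max((random_solution(rng) for _ in range(restarts)), key=value)

# One run per seed (sequential here; the deck runs each seed on its own core).
seeds = [1234, 5678, 9012, 21098]
best = max((heuristic(s) for s in seeds), key=value)
print(value(best), sorted(best))
```

Because each seed follows an independent trajectory, taking the best across seeds reduces the variance of the result, which is the "more consistent" benefit claimed on the next slides.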
23. Small Model Parallel Heuristic Behavior
[Chart: solution value (100–300) vs. time in seconds (0–30000) for serial seeds 1234, 5678, 9012, 21098 plus the parallel run]
24. Medium Model Parallel Heuristic Behavior
[Chart: solution value (500–1900) vs. time in seconds (0–160000) for serial seeds 1234, 5678, 9012, 21098 plus the parallel run]
25. Large Model Parallel Heuristic Behavior
[Chart: solution value (1020–1180) vs. time in seconds (0–70000) for serial seeds 1234, 5678, 9012, 21098 plus the parallel run]
26. Parallel Heuristic Advantages
• Better results
• Better objective value
• More consistent
• Faster
• Compare time to interesting (i.e. good) solutions
• Speedup depends on model (as with straight MIP)
• Depends on which serial run used for comparison
• A factor of 2 to 4 with 4 cores is typical
Model Speedup factor
Small 1.4 to 3.7
Medium 2.5 to 8.3
Large 1.8 to 2.8
27. Customized Offers Company:
• Products (a portfolio of products)
• Customers
• Products are added to or deleted from the portfolio
• Customers are added or deleted every month; customers have monthly or yearly subscriptions
• Objective: retain customers, increase sales
• Action: Send a package recommendation to a customer; the customer has the option to cancel or select another package
29. Solution
• Use CPLEX libraries
• MIP
• Use ODHeuristics to solve some harder problems
30. Use Case: Dynamic Pricing Using Promotions, Markdowns and Clearance Strategies
31. Retail Optimization Use Case
• Vertical: Retail
• Products: Apparel & Accessories
• Objective: Maximize Revenue, Maximize margin, Reduce Inventory
• Decisions: Dynamic Pricing
• What do I have: Initial Plan
• Status: Review the week
• Decisions: Pricing
• Dynamic Pricing
• Markdowns
• Price Points
• Clearance
• Promotions
32. PLAN – Last week
Sales $ Units Sold Margin
$7,689,140 568,000 53.73%
Our sales plan for last week was:
REVENUE TARGET
33. PLAN – Last week
Sales $ Units Sold Margin
$7,689,140 568,000 53.73%
Actual – Last Week
Sales $ Units Sold Margin
$7,083,935 559,390 51%
How did we do? Plan vs. Actual
REVENUE TARGET ACTUAL Revenue
34. PLAN – Last week
Sales $ Units Sold Margin
$7,689,140 568,000 53.73%
Actual – Last Week
Sales $ Units Sold Margin
$7,083,935 559,390 51%
How did we do? Plan vs. Actual
We missed on sales revenue, units sold and margin
35. PLAN – Last week
Sales $ Units Sold Margin
$7,689,140 568,000 53.73%
Actual – Last Week
Sales $ Units Sold Margin
$7,083,935 559,390 51%
How did we do? Plan vs. Actual
Which Season/s was the problem?
36. PLAN – Last week – SPRING SEASON
Sales $ Units Sold Margin
$5,515,500 310,000 61.73%
Actual – Last Week
Sales $ Units Sold Margin
$4,571,196 269,470 61.48%
Where did we miss?
We missed on revenue and on units
SPRING 2016 is the problem!
37. What can we do?
• Using TM1 we can analyze the data and identify variance in the plan vs. actual
• How can we affect the demand?
• Promotions
• Markdowns
• Clearance
• How do we decide which products, which groups, and when to act?
38. Technology
• We use predictive analytics
• To predict sales for the next week(s)
• To identify slow- and fast-moving products
• To identify products that react well to markdowns and promotions
• We use prescriptive analytics – optimization
• To decide optimal prices that maximize our revenue
• To decide when to offer promotions to maximize our revenue
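In its simplest form, the prescriptive step for a single SKU can be sketched as an enumeration over candidate discount levels. All the numbers below (base demand, price elasticity, remaining stock) are invented for illustration; the real system would solve a MIP over all SKUs and weeks with CPLEX rather than a per-SKU enumeration.

```python
# Toy prescriptive step for one SKU: enumerate candidate discount levels and
# pick the one that maximizes expected revenue for next week. Base demand,
# elasticity and stock are invented for illustration; the real system would
# solve a MIP over all SKUs and weeks with CPLEX.

BASE_PRICE = 24.04   # current list price (from the SKU table)
BASE_DEMAND = 30     # assumed expected units next week at full price
STOCK = 1174         # assumed units still on hand (1794 ordered - 620 sold)
ELASTICITY = 2.5     # assumed % demand lift per % price cut

def expected_revenue(discount):
    """Expected revenue next week at a given discount in [0, 1]."""
    price = BASE_PRICE * (1 - discount)
    demand = BASE_DEMAND * (1 + ELASTICITY * discount)
    return price * min(demand, STOCK)  # cannot sell more than stock on hand

candidates = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
best = max(candidates, key=expected_revenue)
print(f"promote {best:.0%}, expected revenue ${expected_revenue(best):,.2f}")
```

Under these invented numbers the enumeration happens to recommend a 30% promotion, the same shape of output as the per-SKU recommendations on the following slides.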
39. What data do we need for each SKU?
SKU ID: SKU999
Price: $24.04
Cost: $6.85
Days on the floor: 77 days
Total quantity ordered: 1794
Revenues: $9969
Cost of sold: $4247
Current margin: 57.4%
Total sold units: 620
Total length of selling period: 24 weeks
Liquidation price: $6.85
Average price: $16.07
Avg. sales-through: 3.14%
Note: Total Sold × Price IS NOT EQUAL to REVENUES SO FAR
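The note on this slide can be verified with one line of arithmetic each way: units sold times the list price overstates the revenue actually booked, because many units sold at markdown prices, so the realized average price is revenue divided by units sold (about $16.08, shown as $16.07 on the slide depending on rounding).

```python
# Check the slide's note: units sold times the list price is not the revenue
# booked so far, because many units sold at markdown prices. The realized
# average price is revenue divided by units sold. Figures from the SKU table.
list_price = 24.04
revenues = 9969
units_sold = 620

implied = units_sold * list_price      # revenue if every unit sold at list
average_price = revenues / units_sold  # realized average selling price

print(round(implied, 2), round(average_price, 2))
```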
40. What is the output of the optimization?
SKU ID: SKU999
Price: $24.04
Cost: $6.85
Days on the floor: 77 days
Total quantity ordered: 1794
Revenues: $9969
Cost of sold: $4247
Current margin: 57.4%
Total sold units: 620
Total length of selling period: 24 weeks
Liquidation price: $6.85
Average price: $16.07
Avg. sales-through: 3.14%
Recommendation: PROMOTE 30% NEXT WEEK
41. Markdown & Promotion strategy for a "slow-moving" product
[Chart: weekly revenues ($0–$12,000) over 24 weeks, with markdowns of 20%, 30%, 40% and 50% off applied across the selling period]
42. Markdown & Promotion strategy for a "fast-moving" product
[Chart: weekly revenues ($0–$12,000) over 24 weeks, with promotions of 10%, 30%, 20% and 50% off applied across the selling period]
49. Tutorial: Monday
• Optimization Direct will also be presenting a technology tutorial at INFORMS 2016
• Monday April 11, 3:40pm–4:30pm, Track 10 – Technology Tutorials, Regency 6
• Solving Large Scale Optimization Problems using CPLEX Optimization Studio