The document describes the IMPRESS industrial modeling and presolving system. IMPRESS provides optimization and estimation capabilities for planning, scheduling, data reconciliation, and real-time control problems. It includes components for modeling, presolving, interfacing with solvers, and interactive use via programming languages. IMPRESS supports linear, nonlinear, discrete, and mixed-integer formulations.
Generators take a high-level software specification and produce an implementation. GenVoca is an approach to building generators that composes reusable component layers. It models software as realms of components with vertical and horizontal parameters. Components are implemented as C++ templates containing member classes. Composition validation ensures semantics are correct. Aspect-oriented programming (AOP) and GenVoca both aim to improve code reuse but differ in focus, concepts, and implementation mechanisms like aspect languages versus type expressions. Generators automate implementation through transformations while GenVoca provides a systematic approach through composable and customizable components.
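The layered-composition idea can be illustrated outside C++ templates. The sketch below is a toy Python analogue (all class and layer names are invented, not GenVoca's actual realm components): each layer wraps the realm below it and refines its operations, and a "type expression" composes the stack.

```python
# Illustrative sketch of GenVoca-style layered composition (hypothetical names).
# Each "layer" wraps the realm below it and refines its operations.

class BaseList:
    """Bottom layer: a minimal list realm."""
    def __init__(self):
        self.items = []

    def insert(self, x):
        self.items.append(x)

def Logging(Lower):
    """Layer (a vertical parameter): adds logging around insert."""
    class Layer(Lower):
        def insert(self, x):
            print(f"insert({x!r})")
            super().insert(x)
    return Layer

def Bounded(limit):
    """Layer with a horizontal parameter: caps the container size."""
    def apply(Lower):
        class Layer(Lower):
            def insert(self, x):
                if len(self.items) >= limit:
                    raise OverflowError("container full")
                super().insert(x)
        return Layer
    return apply

# A "type expression" composing layers, analogous to nested C++ templates:
Container = Logging(Bounded(2)(BaseList))
c = Container()
c.insert("a")
c.insert("b")
```

Composition validation in the GenVoca sense would additionally check that each layer's export realm matches the import realm of the layer above it; this sketch omits that step.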
Systems Navigator - Added value of simulation in the design of terminals (jhjsmits)
The document discusses the added value of using simulation in the design of liquid bulk terminals. Simulation can help quantify design decisions, provide insight into terminal behavior under varying conditions, and allow for scenario comparisons. Case studies demonstrate how simulation helped with terminal layout selection, improving refinery throughput, and evaluating options for expanding an existing terminal. Simulation allows terminal designers to raise questions and optimize designs.
The document discusses formal verification methods like formal equivalence checking and how they can be used at different stages of the design process, including verifying that design representations are functionally equivalent from specification to layout for new designs and across derivative designs. It also explains the difference between boolean and sequential equivalence checking and provides an example comparing two implementations of a two cycle adder to illustrate the types of mismatches each approach can find.
Transnet's Competitive Supplier Development Program (CSDP) aims to localize its supply chain through developing local suppliers' capabilities. The CSDP will focus first on rolling stock and port equipment suppliers. It will take a phased approach, first engaging priority suppliers and preparing staff. Key activities include a supplier benchmarking process to assess suppliers' competitiveness and develop improvement plans. The benchmarking process involves supplier self-assessments, development of improvement plans, and implementing improvements over multiple cycles. It aims to improve supplier-buyer alignment and increase local participation in supply chains. Initial focus is facilitating participation of railway and harbor suppliers through onboarding workshops and a training program to implement benchmarking.
The document describes a demonstration of using an SPC chart to monitor a machining process that creates locator pads. It shows the process being monitored over several subgroups, with the chart indicating when the process is in or out of control. When out of control points are detected, corrective actions are taken like fixing a clamping pressure issue, and the process is monitored again until it remains in control.
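The in-control/out-of-control decision behind such a chart can be sketched in a few lines. The example below is a simplified X-bar check with invented measurement data, estimating 3-sigma limits from the spread of subgroup means established on an in-control baseline (real SPC practice uses A2/d2 factor tables for subgroup-based limits):

```python
# Simplified X-bar control-chart check (hypothetical data, 3-sigma limits
# taken from the standard deviation of baseline subgroup means).
from statistics import mean, pstdev

def control_limits(baseline):
    """Center line and 3-sigma limits estimated from in-control baseline subgroups."""
    means = [mean(g) for g in baseline]
    cl = mean(means)
    sigma = pstdev(means)
    return cl - 3 * sigma, cl, cl + 3 * sigma

def out_of_control(baseline, monitored):
    """Indices of monitored subgroups whose mean falls outside the limits."""
    lcl, _, ucl = control_limits(baseline)
    return [i for i, g in enumerate(monitored)
            if not (lcl <= mean(g) <= ucl)]
```

A flagged index would then trigger a corrective action (such as the clamping-pressure fix in the demonstration) before monitoring resumes.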
The document discusses various software development life cycle models. It describes the waterfall model as the most common and classic linear model where each phase must be completed before the next begins. The V-model is a variant where each development phase is paired with a testing phase. The incremental/iterative model breaks development into smaller iterations with working software delivered in each cycle and requirements refined. Other models covered include the prototype, spiral, rapid application development, and fourth generation techniques models.
Seminar: New Stochastic Programming Features for MPL - Nov 2011 (Bjarni Kristjánsson)
The document discusses new stochastic programming features for the MPL modeling language including:
1) New keywords and syntax for formulating stochastic models in MPL for scenario-based and independent variable models.
2) A callable library interface for connecting stochastic models in MPL to solvers like BendX for solving.
3) Examples of stochastic models expressed in MPL including a farmer model and aircraft allocation model.
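The scenario-based idea underlying such models can be sketched without MPL: a first-stage decision is evaluated against its expected second-stage outcome over weighted scenarios. The numbers and names below are invented for illustration, not taken from the MPL farmer or aircraft examples.

```python
# Toy two-stage evaluation: expected second-stage revenue minus
# first-stage planting cost, over discrete yield scenarios.

scenarios = [  # (probability, yield multiplier) -- hypothetical values
    (0.3, 0.8),
    (0.4, 1.0),
    (0.3, 1.2),
]

def expected_profit(acres, base_yield=2.5, price=150.0, cost=170.0):
    """Expected revenue across scenarios minus planting cost."""
    revenue = sum(p * (acres * base_yield * m) * price
                  for p, m in scenarios)
    return revenue - acres * cost

# Compare candidate first-stage decisions by expected profit.
best = max([100, 120, 140], key=expected_profit)
```

A real stochastic program would of course optimize over continuous first-stage variables and scenario-specific recourse decisions, which is where solvers like BendX come in.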
DCRA Inc. Supply Chain S&OP Solutions Summary (Jon Kirkegaard)
DCRA Inc. has helped shape the demand and supply balancing operations of dozens of leading global companies. Our trademarked Total Order Fulfillment methodology has delivered triple-digit internal rates of return in weeks and months.
Additionally, we have developed patented supply chain solutions, including S&OP, order commitment, and others, that rapidly integrate with any existing supply chain solution, accelerating change and improving financial returns while reducing the IT complexity that causes so many projects to fail. Contact us at www.selectclassics.com.
The document contains a quiz with multiple choice questions about music, sports, history, literature and companies. The questions cover topics like the writer of "Stairway to Heaven", the number of Formula One championships won by Michael Schumacher, the founder of Ferrari and the director of the movie "The Bodyguard".
School counselors prove valuable partners in understanding student needs (Learning Forward)
According to the newest MetLife Survey of the American Teacher, 64% of teachers are reporting seeing an increase in the number of students and families needing health and social support services, while 28% of teachers have seen reductions or eliminations of those same services. As a result, schools may be faced with redefining their school counselors’ roles by necessity, and in the process may find better ways to leverage the counselor’s contributions to teachers and to student achievement.
Traditionally, school counselors have been asked to address students’ emotional or academic planning needs. See how one Alabama school is redefining the role of school counselor to become more active contributors to schools’ improvement plans, student achievement, and teacher professional learning.
Learn more about professional learning at all levels of education with Learning Forward, an international membership association of learning educators: www.learningforward.org
Membership in Learning Forward gives you access to a wide range of publications, tools, and opportunities to advance professional learning for student success.
World War II involved massive land battles across Europe between Allied and Axis forces from 1942 to 1945. Major Allied military leaders like Montgomery, Clark, Alexander, Patton, Bradley, and Devers led multiple army groups and task forces against Axis armies in theaters across Western, Central, and Eastern Europe. Overlapping operations across many army groups advanced the Allied front lines back towards Germany over three years of costly fighting until final victory in Europe in 1945.
The document discusses the different types of verb tenses in English including present, past, and future tenses. It outlines the four types of each tense - simple, progressive, perfect, and perfect progressive - and provides examples of how to conjugate verbs for each tense. Key tenses include present simple (S + V1), past simple (S + V2), and future simple (S + will/shall + V1).
The document describes the equipment and technical processes used by three community radio stations in Peru. The stations use Windows 7 PCs and 8-to-12-channel audio consoles. They use programs such as Audacity and Zara Radio for editing and automation. They make live equalization adjustments and use few sound effects. They monitor audio levels with VU meters and do not produce complex productions. They use programs such as Nero and CyberLink to burn CDs and automate programming. They face
Monitorul Oficial 255/17.04.2012 - Law approving Government Emergency Ordinance no. 79/2011, regulating certain measures necessary for the entry into force of Law no. 287/2009 on the Civil Code
This document provides information on available homes for sale in the River Valley Highlands community in Lancaster, Ohio. It describes two inventory homes that are currently available, including a 3 bedroom, 2.5 bathroom home priced at $196,898 located on Zachariah Ave. It also lists available home sites, including some with walkout basements, and notes that 5 sites remain across from the new school. Financing is available through Pulte Mortgage with $3,000 in contributions.
The document summarizes the process of making a pot. It describes wedging and shaping the clay into a rectangular form, making coils to build up the pot, applying slip to join the coils, firing the pot in a kiln, glazing it after the first firing, and firing it a final time. It also briefly discusses the importance of water for survival and culture, and how Native Americans used pots to carry and transport water and other items as well as for cooking.
This document contains the schedule for the HSBC Sevens World Series tournament in Hong Kong from March 23-25. It lists the teams competing in each pool and the schedule of matches to be played each day of the tournament, including quarterfinals, semifinals and finals of the Cup, Plate, Bowl and qualifier competitions.
Software variability management involves developing software systems that can be efficiently extended, changed, customized, or configured for different contexts. Variability in software can come from bundles, command line parameters, plugins, configuration files, and microservices. Managing variability is challenging due to the large number of potential product configurations. Software product lines address this challenge by systematically combining common and variable features using techniques like feature modeling. Feature models document the features and options of a product line through a tree structure showing mandatory, optional, and mutually exclusive relationships between parent and child features. This allows efficiently developing, testing, and maintaining many derived products that share common parts.
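The constraint checking that feature models enable can be sketched directly. The feature names below are invented for illustration: one mandatory feature, one optional feature, and an XOR group of mutually exclusive alternatives, with a validity check over a proposed product configuration.

```python
# Toy feature-model validity check (hypothetical feature names).

MANDATORY = {"engine"}                  # must appear in every product
OPTIONAL = {"logging"}                  # freely selectable
XOR_GROUPS = [{"sqlite", "postgres"}]   # exactly one per group

def valid_product(selected):
    """True if a feature selection satisfies the model's constraints."""
    if not MANDATORY <= selected:
        return False                    # a mandatory feature is missing
    allowed = MANDATORY | OPTIONAL | set().union(*XOR_GROUPS)
    if not selected <= allowed:
        return False                    # unknown feature selected
    # each XOR group must contribute exactly one feature
    return all(len(selected & group) == 1 for group in XOR_GROUPS)
```

Real feature-modeling tools additionally handle nested trees and cross-tree constraints (requires/excludes), but the core derivation check has this shape.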
Recover 30% of your day with IBM Development Tools (Smarter Mainframe Develop...) (Susan Yoskin)
If you need to attract new developers, and want to keep your company’s name out of the headlines, then this session is for you. When your business depends on your mainframe apps working and performing well—all the time—you need to be alerted to issues as they occur and have the tools to help you find and fix the problems and test your solutions before disaster strikes (we’ve all been in those late night and weekend drills). You also need to continue supporting these applications for years to come, and that will require new talent.
This session will introduce you to the development environments that college grads are already comfortable with, and help your applications become more resilient at the same time. We’ll walk you through the tools to help you accomplish all of this and demo some scenarios to show you how efficiently our tools can perform the tasks that slow you down.
Building Cogeneration Planning Scheduling Systems using IBM ILOG ODME, CPLEX ... (Alkis Vazacopoulos)
This document discusses using IBM's ODME and iMPress software for building cogeneration planning and scheduling applications. It provides an overview of ODME, iMPress, and how they can be implemented together. Key benefits include fitting business models/processes, sophisticated optimization, quick adaptation, and scenario analysis. A proof-of-concept is proposed to select a plant type, integrate data sources, solve models, and tune models for accuracy and tractability.
IBM's Problem Determination Tools have evolved since their introduction in 2000 to become more robust and functionally superior through ongoing releases. Customers are migrating to the tools due to issues with older products, demands for more sophisticated development and testing tools, and rising maintenance fees for other solutions. The Problem Determination Tools suite features capabilities for supporting SOA/composite applications, optimizing performance, debugging applications, managing and testing data, and conducting various types of testing.
This presentation introduces the audience to BDD - the Behavior-Driven Development method and how it can be applied to development and testing of GUI applications. We will also try to debunk myths and false hopes surrounding it.
BDD centers around stories written in a "ubiquitous language" that describe the expected behavior of an application. The use of a human-readable language allows technical as well as non-technical project stakeholders to participate in authoring feature descriptions. Those descriptions then serve as a base for the work of both developers and testers.
Classic agile and test-driven programming takes an inside-out approach by focusing on the specification and testing of the API of individual software components. BDD, on the other hand, looks at the application as a whole and puts interaction sequences and their expected outcomes into the foreground.
An introduction to the de-facto standard BDD language Gherkin will be given. It became popular as part of the Cucumber Ruby testing framework but has found its way into various free and commercial tools that will be listed.
A sample feature file including scenarios, outlines and backgrounds descriptions will be developed live using the Squish GUI Tester. This feature file can already be "run" in dry mode. We'll see different types of usage of this input:
* A means to communicate with the customer.
* Documentation for the acceptance test before delivery.
* A sequence to walk through during manual testing.
* Automated GUI testing through tools like Squish.
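The mechanics of "running" Gherkin text are simpler than they look: steps are matched against registered patterns and dispatched to step implementations. The toy runner below sketches that idea (it is not Cucumber or Squish internals; the cart scenario and step patterns are invented):

```python
# Toy Gherkin-style runner: regex step definitions plus a line matcher.
import re

steps = []

def step(pattern):
    """Decorator registering a step implementation under a regex."""
    def register(fn):
        steps.append((re.compile(pattern), fn))
        return fn
    return register

context = {}

@step(r"Given an empty cart")
def given_cart():
    context["cart"] = []

@step(r"When I add (\d+) items?")
def when_add(n):
    context["cart"].extend(range(int(n)))

@step(r"Then the cart holds (\d+) items?")
def then_holds(n):
    assert len(context["cart"]) == int(n)

def run(scenario):
    """Match each scenario line against the registered steps."""
    for line in scenario.strip().splitlines():
        line = line.strip()
        for pattern, fn in steps:
            m = pattern.fullmatch(line)
            if m:
                fn(*m.groups())
                break

run("""
    Given an empty cart
    When I add 2 items
    Then the cart holds 2 items
""")
```

Real frameworks add scenario outlines, backgrounds, and hooks on top of this dispatch loop, but the pattern-to-step mapping is the core.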
1) The document describes a modeling framework called IMPRESS that can represent and optimize supply chain problems.
2) It then provides an example of modeling a jet fuel supply chain involving an oil refinery, rail transportation, and an airport using IMPRESS.
3) Scenario analysis is performed on the example problem by generating scenarios to explore the impact of demand variability, tank availability, and train reliability issues while maintaining feasibility of the logistics sub-problem.
RSI GmbH provides solutions for linking vehicle ecosystems by enabling efficient data transfer and display between vehicles and user domains like OEMs, suppliers, and end users. Their portfolio includes hardware like ECU simulators and connectors as well as software like CAN/OBD libraries, servers, and plugins to support areas like production, development, testing, and enhancing the user experience. They are working on a modular vehicle system software stack, including an RSI server and an HTML5-based user interface, to "internetify" vehicles, and they demonstrate this through live tests of their ECU simulator and JavaFX-based dashboard. Their next milestone is demonstrating their RSI plugin with a GUI at CeBIT 2018.
Salesforce.com uses Hadoop to analyze large amounts of customer data generated from over 130,000 customers and 800 million daily transactions to track product usage and customer behavior. Some key use cases discussed include analyzing product metrics to understand feature adoption, examining user behavior to improve products, and enabling collaborative filtering recommendations. The document outlines Salesforce.com's Hadoop ecosystem and data pipelines used to collect, process, and visualize insights from petabytes of customer data.
This document provides an overview of SAP solutions. It introduces the SAP R/3 system and its functionality, which integrates all areas of a business. It describes SAP's technology environment including its client/server architecture and support for various platforms, operating systems, databases, and languages. The document also discusses why SAP R/3 has been successful due to its real-time processing, comprehensive and integrated functionality, and support for best business practices. Finally, it briefly introduces mySAP.com solutions and SAP's strategy to enable integration and collaboration across enterprises.
This document provides an overview of SAP solutions. It introduces the SAP R/3 system and its functionality, describes SAP's implementation methodology (ASAP), and discusses mySAP.com solutions for new economy businesses. The SAP R/3 is an integrated business software package that combines areas like sales, distribution, materials management, production planning, financial accounting, and more. It runs on a client/server architecture using SAP's proprietary ABAP language. SAP has over 10 million users worldwide and is the 4th largest software company.
Pipeline as code for your infrastructure as code (Kris Buytaert)
This document discusses infrastructure as code (IAC) and continuous delivery pipelines. It introduces Puppet as an open-source configuration management tool for defining infrastructure as code. It emphasizes treating infrastructure configuration like code by versioning it, testing it, and promoting changes through environments like development, test, and production. The document also discusses using Jenkins for continuous integration to test application and infrastructure code changes and building automated pipelines for packaging and deploying changes.
Conduct data discovery or rapid BI prototyping without becoming a Hadoop expert by analyzing big data with standard BI tools, including Cognos. View the webinar video recording and download this deck: http://www.senturus.com/resources/running-cognos-on-hadoop/.
See a cost effective, scalable solution that does not have the barriers to entry common with big data applications. The webinar explains: 1) use cases for Hadoop, 2) pros and cons of different visualization tools and their integration with Hadoop and 3) a demonstration of BigInsights, IBM’s solution.
Senturus, a business analytics consulting firm, has a resource library with hundreds of free recorded webinars, trainings, demos and unbiased product reviews. Take a look and share them with your colleagues and friends: http://www.senturus.com/resources/.
The document discusses data migration from legacy systems to SAP using the Legacy System Migration Workbench (LSMW). It describes LSMW as a tool that supports importing data from non-SAP systems into R/3 via methods like batch input, direct input, BAPIs, and IDocs. The key steps for using LSMW are outlined as 1) selecting a project and object, 2) executing to view the process steps, and 3) proceeding through each step which may include importing, converting, and importing data into the SAP database. Common import methods and their advantages/disadvantages are also summarized.
This document discusses advanced modeling techniques for industrial optimization problems using the IMPRESS modeling system. It describes modeling objects like unit operations and port states to represent the problem structure. It also describes modeling attributes like quantities, logic, and qualities to represent phenomena like flows, sequences, and properties. The IMPRESS system allows problems to be configured interactively rather than coded, and provides capabilities like automatic derivative calculation and presolving to simplify problems.
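The object-based configuration style described can be sketched abstractly: unit operations connected through ports, with stream quantities that must balance. The classes and numbers below are invented for illustration and are not the IMPRESS API.

```python
# Toy sketch of unit operations connected by port streams,
# with a simple mass-balance check per unit.

class Unit:
    def __init__(self, name):
        self.name = name
        self.inlets = []    # flow quantities entering the unit
        self.outlets = []   # flow quantities leaving the unit

    def balanced(self, tol=1e-9):
        """Total in must equal total out within tolerance."""
        return abs(sum(self.inlets) - sum(self.outlets)) <= tol

def connect(src, dst, flow):
    """A port connection: the same stream leaves src and enters dst."""
    src.outlets.append(flow)
    dst.inlets.append(flow)

# Hypothetical flowsheet fragment: refinery feeding a storage tank.
refinery = Unit("refinery")
tank = Unit("tank")
refinery.inlets.append(100.0)    # crude feed
connect(refinery, tank, 100.0)   # product stream via a port
tank.outlets.append(100.0)       # dispatch
```

In a system like IMPRESS these balances become constraints handed to the solver, alongside logic and quality attributes, rather than Python-level checks.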
Since its birth in late 2010, the jBPM migration tool project has been marching forward to support the transformation of your jBPM3 processes to the latest versions of jBPM. It has been a journey that covers the support of a vast array of use cases, example enterprise process projects, supports various process designers and has finally been included into the Drools / jBPM project team as an official project.
This session will outline the status of the jBPM migration tooling project. We will take a look at the background of jBPM 3 process projects and detail what is supported right now to get your processes deployed onto the current version of jBPM. We will demo the existing tooling on several real life enterprise jBPM projects and outline our strategy for the various conceptual problems we encountered in moving your process constructs to BPMN2. These examples will provide you with real life scenarios to take home as an example for your own BPM projects.
We will finish up with a demonstration of the jBPM migration tooling running in the Cloud. Each participant will depart this session fully enabled with their very own Cloud deployed jBPM Migration tooling.
Accelerating the Development of Efficient CP Optimizer Models (Philippe Laborie)
The IBM Constraint Programming optimization system CP Optimizer was designed to provide automatic search and a simple modeling of discrete optimization problems, with a particular focus on scheduling applications. It is used in industry for solving operational planning and scheduling problems. We will give an overview of CP Optimizer and then describe in further detail a set of features such as input/output file format, warm-start or conflict refinement that help accelerate the development of efficient models.
The document discusses software architecture patterns and principles. It provides examples of how to apply Model-View-Controller (MVC), client-server, and other patterns to Android and web application development. Key strategies mentioned include refactoring existing code, separating concerns, and using patterns like observer and strategy to reduce coupling between architectural components.
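The observer pattern's decoupling effect is easy to show concretely. In this minimal sketch (the `Model` class and values are invented), views subscribe to the model, so the model never references concrete view types:

```python
# Minimal observer-pattern sketch: the model notifies subscribers
# without knowing anything about concrete views.

class Model:
    def __init__(self):
        self._observers = []
        self._value = 0

    def subscribe(self, callback):
        self._observers.append(callback)

    def set_value(self, value):
        self._value = value
        for notify in self._observers:
            notify(value)   # model stays ignorant of view classes

seen = []
model = Model()
model.subscribe(seen.append)   # a "view" reacting to changes
model.set_value(42)
```

This is the same dependency direction MVC relies on: the view depends on the model, never the reverse, which keeps the architectural components loosely coupled.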
Similar to IMPRESS Presentation Carnegie Mellon University
This document summarizes a presentation given at the INFORMS Annual Meeting in 2008 about using Xpress-Tuner to automatically fine-tune the heuristics in Xpress-MP to improve the solving of mixed integer programs (MIPs). Xpress-Tuner tunes control parameters to find settings that reduce solve times by 2-10x on benchmark problems. Examples show the tuning of advertising and retail planning MIPs to meet time and gap targets. The document highlights Xpress-Tuner's new parallel tuning features and its ability to specialize heuristic threads to further optimize solutions.
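The core loop of such a tuner can be sketched generically: sample parameter settings, benchmark each, and keep the best. Everything below is a hedged stand-in — the parameter names and the `solve_time` function are invented, not Xpress controls:

```python
# Toy random-search parameter tuner (hypothetical controls; solve_time
# is a stand-in for running a real solver benchmark).
import random

def solve_time(params):
    """Pretend benchmark: some settings are simply faster."""
    base = 100.0
    if params["heuristic_level"] == 2:
        base *= 0.5
    if params["cuts"] == "aggressive":
        base *= 0.7
    return base

def tune(trials=50, seed=0):
    """Sample the parameter space and return (best_time, best_params)."""
    rng = random.Random(seed)
    space = {"heuristic_level": [0, 1, 2],
             "cuts": ["off", "default", "aggressive"]}
    best = None
    for _ in range(trials):
        cand = {k: rng.choice(v) for k, v in space.items()}
        t = solve_time(cand)
        if best is None or t < best[0]:
            best = (t, cand)
    return best

best_time, best_params = tune()
```

Production tuners add parallel evaluation, early termination against time/gap targets, and smarter search than uniform sampling, but the evaluate-and-keep-best skeleton is the same.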
This document discusses using the Xpress-Mosel modeling environment for solving data mining problems. It provides an overview of key Mosel features like integration of modeling and solving. It then discusses how Mosel can be used to model various common data mining problems like classification, regression, and clustering as optimization problems. These include formulations as linear programs, mixed integer programs, and stochastic programs.
We tested ODH|CPLEX 4.24 on the Miplib Open-v7 models, a public collection of 286 models for which an optimal solution has not been proven. 257 of these are known to have a feasible solution.
Within 2 hours, ODH|CPLEX proved optimality on 6 models and found better solutions to 40% of the models with 12 threads and 35% with 8 threads. ODH|CPLEX matched the best known solution on 21% of the models.
This document discusses optimizing fantasy football teams to maximize points scored over a 17-week season. It develops a nonlinear programming model using an evolutionary solver with 200 binary variables representing each player. The model aims to select players with the highest historical scoring averages who provide consistent weekly points and availability. Future improvements will add constraints to reduce total variance across positions and simulate different draft outcomes based on pick order, as well as calculating optimal bench players for bye-weeks.
This document outlines the scheduling rules for the National Football League (NFL) and presents an optimization of the NFL game schedule to maximize commercial value. The NFL earned $11.09 billion in revenue in 2014. The scheduling of games is crucial to maximize commercial value. The project uses a step-by-step optimization approach that considers attendance ratios to develop a schedule that achieves the greatest commercial value for the NFL.
2017 Business Intelligence & Analytics Corporate Event Stevens Institute of T... - Alkis Vazacopoulos
The document summarizes student poster presentations from a Business Intelligence & Analytics program event at Stevens Institute of Technology. It provides background on the BI&A program, which has grown from 4 students to over 220 students. The posters presented research conducted by students on a wide range of topics under faculty guidance. Over 80 company representatives and 150 students/faculty attended the event to view the 76 posters presenting analytics projects and research.
The team participated in Google's Online Marketing Challenge to build an AdWords campaign for the nonprofit True Mentors. They designed campaigns around search and display ads to promote brand awareness, fundraising events, and donations. Their campaign included 23 ad groups and over 200 ads targeting 700 keywords. The campaign finished as a finalist in the Social Impact category and semi-finalist in the Business category, ranking in the top 10, 15, and 5 respectively. The live campaign ran for 3 weeks on the AdWords platform.
- Large optimization models are increasingly challenging to solve optimally due to super-linear growth in solving effort as model size increases. Parallel heuristic methods provide good quality solutions within practical time limits by solving smaller submodels simultaneously on multiple processor threads.
- Testing on scheduling, supply chain, and telecommunications models found parallel heuristics found high quality solutions for most models in hours, while optimal solutions were impossible within days for some larger models. However, using too many threads showed diminishing returns and even degradation in solution quality due to memory bus bandwidth limitations.
- A retailer buys seasonal stock in advance from overseas suppliers and tries to sell it all over a limited season, usually using discounts.
- An optimization model was created to analyze past sales data, model demand as a function of price, and determine optimal pricing and season length to maximize revenue.
- The model found that allowing for small price increases over the season and extending the season length could significantly increase total expected revenue compared to the retailer's current approach.
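The pricing logic summarized above can be illustrated with a toy model. Assuming (our assumption, not the document's) a linear demand curve d(p) = a - b*p fitted from past sales, expected revenue p*d(p) has a closed-form maximizer at p* = a/(2b):

```python
# Toy illustration (not the retailer's actual model): with a linear demand
# curve d(p) = a - b*p fitted from past sales, expected revenue p*d(p) is
# maximized at p* = a / (2*b), which we confirm with a coarse grid search.
def optimal_price(a, b):
    """Closed-form revenue-maximizing price for demand d(p) = a - b*p."""
    return a / (2.0 * b)

def revenue(p, a, b):
    return p * max(a - b * p, 0.0)

a, b = 1000.0, 4.0                       # assumed demand parameters
p_star = optimal_price(a, b)
grid_best = max((i * 0.5 for i in range(0, 501)),
                key=lambda p: revenue(p, a, b))
print(p_star, grid_best)                 # both 125.0
```

Extending the season (more periods with their own demand curves) turns this into the multi-period price-optimization the summary describes.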
Optimization Direct: Introduction and recent case studies - Alkis Vazacopoulos
This document provides an overview of Optimization Direct, an IBM business partner that specializes in optimization software and consulting. It discusses Optimization Direct's experience implementing optimization technology for various industries. The document also summarizes Optimization Direct's product offerings, which focus on IBM ILOG CPLEX Optimization Studio. It then highlights several recent case studies where Optimization Direct helped customers solve scheduling, resource allocation, and pricing problems using analytics and optimization modeling approaches like MIP and heuristic algorithms. Finally, it shares an example of how Optimization Direct helped a retail client optimize markdown pricing and promotions to improve sales and margins.
Informs 2016 Solving Planning and Scheduling Problems with CPLEX - Alkis Vazacopoulos
This document discusses optimization solutions for planning and scheduling problems using CPLEX. It begins with an introduction to DecisionBrain and examples of applications in manufacturing, supply chain, and maintenance scheduling. Case studies are presented on production planning in electronics manufacturing, container terminal optimization, and field service scheduling. Best practices are discussed around choosing the right optimization technology, emphasizing decision support over pure optimization, understanding business goals, and integrating process improvements with advanced decision support. Project risks around not achieving benefits, performance issues, and user acceptance are also addressed.
CPLEX Optimization Studio solves large-scale optimization problems and enables better business decisions and resulting financial benefits in areas such as supply chain management, operations, healthcare, retail, transportation, logistics and asset management. It has been applied in sectors as diverse as manufacturing, processing, distribution, retailing, transport, finance and investment. CPLEX Optimization Studio is an analytical decision support toolkit for rapid development and deployment of optimization models using mathematical and constraint programming. It combines an integrated development environment (IDE) with the powerful Optimization Programming Language (OPL) and high-performance ILOG CPLEX optimizer solvers. CPLEX Optimization Studio enables clients to: optimize business decisions with high-performance optimization engines; develop and deploy optimization models quickly by using flexible interfaces and prebuilt deployment scenarios; and create real-world applications that can significantly improve business outcomes. Optimization Direct has partnered with IBM and entered into a technology licensing and distribution agreement. Combining the founders' industry and software experience with IBM's CPLEX Optimization Studio and its arsenal of optimization modeling and solving tools gives customers the most powerful capabilities in the industry.
Missing-Value Handling in Dynamic Model Estimation using IMPL - Alkis Vazacopoulos
Presented in this short document is a description of how IMPL handles missing-values or missing-data when estimating dynamic models, which inherently involve time-lagged or time-shifted input and output variables. Missing-values in a data set imply that for some reason the data is not available, most likely due to a malfunctioning instrument or even lack of proper accounting. Missing-data handling is relatively well-studied, especially for time-series or dynamic data, given that it is not as easy as removing, ignoring or deleting bad sections of data as it is when static or steady-state models are calibrated (Honaker and King, 2010; Smits and Baggelaar, 2010; Fisher and Waclawski, 2015). Unfortunately, all of their methods involve what is known as "imputation", i.e., replacing or substituting missing-data with some reasonably assumed value, which is at the very least a biased estimate. When regression techniques such as PLS and PCR are used (Nelson et al., 2006), missing-data can be handled without imputation by computing the input-output covariance matrices excluding the contribution from the missing-values, given the temporal and structural redundancy in the system. However, it is shown in Dayal (1996) that using PLS and other types of regression techniques such as Canonical Correlation Regression (CCR) and Reduced Rank Regression (RRR) to fit non-parsimonious and non-parametric finite impulse/step response models (FIR/FSR) is not as reliable as fitting lower-ordered transfer functions, especially considering the robust stability of the resulting model predictive controller if that is its intended use.
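The imputation-free idea above can be shown in miniature. This is our own sketch in Python, not IMPL's implementation: a covariance is formed using only the time instants where both values are present, so missing values contribute nothing rather than being replaced by assumed values.

```python
# Sketch (illustrative only, not IMPL's code): covariance between two series
# using only the instants where BOTH values are present, so missing values
# (None) contribute nothing instead of being imputed.
def pairwise_covariance(x, y):
    pairs = [(a, b) for a, b in zip(x, y) if a is not None and b is not None]
    n = len(pairs)
    if n < 2:
        raise ValueError("need at least two complete pairs")
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    return sum((a - mx) * (b - my) for a, b in pairs) / (n - 1)

u = [1.0, 2.0, None, 4.0, 5.0]   # input with one missing value
y = [2.1, 3.9, 6.2, None, 10.1]  # output with one missing value
print(pairwise_covariance(u, y)) # computed from the 3 complete pairs only
```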
Finite Impulse Response Estimation of Gas Furnace Data in IMPL Industrial Mod... - Alkis Vazacopoulos
Presented in this short document is a description of how to estimate deterministic and stochastic non-parametric finite impulse response (FIR) models in IMPL, applied to industrial gas furnace data identical to that found in TSE-GFD-IMF using parametric transfer-functions. The methodology of time-series analysis or system identification involves essentially three (3) stages (Box and Jenkins, 1976): (1) model structure identification, (2) model parameter estimation and (3) model checking and diagnostics. We do not address (1), which requires stationarity and seasonality assessment/adjustment, auto-, cross- and partial-correlation, etc., to establish the parametric transfer function polynomial degrees, especially when we are using non-parametric FIR estimation. Instead we focus only on the parameter estimation and diagnostics. These types of parameter estimation problems involve dynamic and nonlinear relationships shown below, and we solve them using IMPL's Sequential Equality-Constrained QP Engine (SECQPE) and Supplemental Observability, Redundancy and Variability Estimator (SORVE). Other types of non-parametric identification, known as Subspace Identification (Qin, 2006), can be used to estimate state-space models.
Our Industrial Modeling Service (IMS) involves several important (but rarely implemented) methods to significantly improve and advance your existing models and data. Since it is well-known that good decision-making requires good models and data, IMS is ideally suited to support this continuous-improvement endeavour. IMS is specifically designed either to co-exist with your existing design, planning, scheduling, etc. applications, or these same models and data can be carried seamlessly into our Industrial Modeling and Programming Language (IMPL) to create new value-added applications. The following techniques form the basis of our IMS offering.
Dither Signal Design Problem (DSDP) for Closed-Loop Estimation Industrial Mod... - Alkis Vazacopoulos
This document describes a methodology for designing dither signals to improve closed-loop system identification of industrial models. It formulates the dither signal design as an optimization problem (DSDP) that determines signal amplitudes to maximize excitation within input/output constraints. The DSDP can be solved in the physical or transformed space, and accounts for process gain information to better excite ill-conditioned processes. The designed dither signals are intended to provide data for updating existing industrial models used for control, optimization, and monitoring.
This short note describes a relatively simple methodology, procedure or approach to increase the performance of already installed industrial models used for optimization, control, simulation and/or monitoring purposes. The method is called Excess or X-Model Regression (XMR), where the concept of "excess modeling" or an X-model is taken from the field of thermodynamics to describe the departure or residual behaviour of real (non-ideal) gases and liquids from their ideal state (Kyle, 1999; Poling et al., 2001; Smith et al., 2001). It has also been applied to model the non-ideal or nonlinear behaviour of blending motor gasoline octanes with its synergistic and antagonistic interactional effects (Muller, 1992).
The fundamental idea of XMR is to calibrate, train, fit or estimate, using actual data and multiple linear regression (MLR) or ordinary least squares (OLS), the deviations of the measured responses from the existing model responses. The existing model may be a glass, grey or black-box model (known or unknown, linear or nonlinear, implicit/open or explicit/closed) depending on the use of the model. That is, for optimization and control the model structure and parameters are available given that derivative information is required although for simulation and monitoring, the model may only be observed through the dependent output variables given the necessary independent input variables.
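The XMR calibration step can be sketched in miniature. This is purely illustrative (the existing model, data and helper names are ours, not from the note): ordinary least squares fits a linear correction to the residuals between measured responses and the existing model's responses, which is then applied as an additive excess term.

```python
# Minimal sketch of the XMR idea (illustrative only): fit, by ordinary least
# squares, a linear correction to the residuals between measured responses
# and an existing model's responses, then apply it as an additive term.
def ols_line(x, y):
    """Closed-form simple OLS: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

def existing_model(x):               # stand-in for an installed (imperfect) model
    return 2.0 * x

x = [1.0, 2.0, 3.0, 4.0]
measured = [2.5, 4.9, 7.6, 10.1]     # actual plant data (true behaviour ~2.5*x)
residual = [m - existing_model(xi) for m, xi in zip(measured, x)]
b0, b1 = ols_line(x, residual)       # the "excess" or X-model
corrected = [existing_model(xi) + b0 + b1 * xi for xi in x]
```

The existing model is left untouched; only the cheap excess model is re-estimated as new data arrives, which is the appeal of the approach for already-installed applications.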
Distillation Curve Optimization Using Monotonic Interpolation - Alkis Vazacopoulos
The document discusses optimization of distillation curves for blending multiple distillation streams. It presents a methodology to interconvert temperatures between ASTM D86 and true boiling point (TBP) scales, interpolate the points to generate evaporation curves using monotonic interpolation, and blend the components using an ideal blending law. The methodology allows manipulating cutpoints of one or more streams' distillation curves to shift the front and back-ends and optimize the final blended product properties and flows while satisfying demands and specifications. An example optimization problem is presented to maximize the flow of two distillation streams in a blend subject to property and flow constraints.
Presented in this short document is a description of how to model and solve multi-utility scheduling optimization (MUSO) problems in IMPL. Multi-utility systems (co/tri-generation) are typically found in petroleum refineries and petrochemical plants (multi-commodity systems) especially when fuel-gas (i.e., off-gases of methane and ethane) is a co- or by-product of the production from which multi-pressure heating-, motive- and process-steam are generated on-site. Other utilities include hydrogen, electricity, water, cooling media, air, nitrogen, chemicals, etc. where a multi-utility system is shown in Figure 1 with an intermediate or integrated utility (both produced and consumed) such as fuel-gas, steam or electricity. Itemized benefit areas just for better management of an integrated steam network can be found in Pelham (2013) where his sample multi-pressure steam utility flowsheet is found in Figure 2.
Advanced Parameter Estimation (APE) for Motor Gasoline Blending (MGB) Indust... - Alkis Vazacopoulos
Presented in this short document is a description of how to model and solve advanced parameter estimation (APE) problems in IMPL. APE is the term given to the application of estimating, fitting or calibrating parameters in models involving a network, topology, superstructure or flowsheet. When estimating parameters with multiple linear regression (MLR), ordinary least squares (OLS), ridge regression (RR), principal component regression (PCR) and partial least squares (PLS) there is no explicit model but simply an X-block and Y-block of data. Hence, these methods are referred to as “non-parametric” or “data-based” methods as opposed to the “parametric” or “model-based” method used here. To solve these types of problems we use what is commonly referred to as “error-in-variables” (EIV) regression which is conveniently implemented as nonlinear data reconciliation and regression (NDRR) using the technology found in Kelly (1998a; 1998b; 1999) and Kelly and Zyngier (2008a). The primary benefit of using EIV (NDRR) over the other regression methods is that we can easily handle the inclusion of conservation laws and constitutive relations, explicitly, a must for any industrial estimation problem (IEP).
IMPRESS Presentation Carnegie Mellon University
1. industrIALgorithms
IMPRESS
Industrial Modeling & Presolving System
Are you ready to get IMPRESSed?
Brenno, December, 2012
12/10/2012
2. Commercial Scheduling Tools
industrIALgorithms
[Comparison table; the original column layout was lost in extraction. Recoverable content:]
• Approaches: simulation based (three suites) and optimization based (two suites, including IMPRESS).
• Tool suites: (SIMTO Sched, MBlend, Dock Sol); (Flower, Compass, Optimix); (PIMS, APS, MBO, IMOS); (RPMS, PS, BLEND, SAND); (IMPRESS).
• Scheduling blend tools: MBO, SIMTO, M-Blend, Optimix, BLEND, CBS/PBS-IMF.
• Interfaces: easy interface installed in the machine; interface-less interactive access by intranet; no user-interface.
• Integration: by data base; by web service; by XML; by file, Excel and APIs.
• Users include BP, VALERO, PETROCHINA, CONOCO, CHEVRON, INDIAN OIL and SHELL.
3. Types of Optimization & Estimation
IMPRESS is designed for both industrial decision-making and data-mining problems deployed off-line, in-line and on-line, such as:
• Planning & Scheduling Optimization (active).
• Data Reconciliation & Regression Estimation (passive).
• Real-Time Control & Optimization (active).
• Monitoring, Tracking & Tracing (passive).
The terms active and passive imply the "degree of causality": active models must be "causal" (cause and effect amongst variables), while passive models may or may not be causal (no cause and effect required), also known as "observational".
7. Advanced Modeling System: IMPRESS
Problems are configured either by interfacing with a flat-file language (IML = Industrial Modeling Language) or interactively using a programming language such as Python, Java, C#, C++, C or Fortran (IPL = Industrial Programming Language).
IMPRESS currently has bindings to several linear and nonlinear programming solvers, including COINMP, GLPK, LPSOLVE, SCIP, XPRESS, XPRESS-SLP, CONOPT, IPOPT, KNITRO, NOVA and SLPQPE.
8. System Architecture: SIIMPL
SIIMPL = Server, Interfacer, Interacter, Modeler, Presolver Libraries. There are five DLL components in IMPRESS:
• Server = data modeling & presolving routines.
• Interfacer = parsing for the language.
• Interacter = inserting, updating & viewing routines for the interchange.
• Modeler = formulating of the variables, constraints, derivatives & expressions, including "dependent" sets, lists & parameters.
• Presolver = bindings for 3rd-party solving-systems.
10. Time Model
• Logistics (Quantity*Logic (proxy'd-quality), MILP):
– "Discrete-time", where each time-period has the same duration.
– Time-periods may be "small-buckets" (un-ary) or "big-buckets" (N-ary): if un-ary then only one activity per time-period (scheduling), but if N-ary then multiple activities per time-period, where a "time-portion" variable for each operation is applied (planning).
• Quality (Quantity*Quality (fixed-logic), NLP):
– "Distributed-time", where each time-period may have a different duration (global/common time-grid).
– Otherwise the same as logistics.
• Planuling = Planning + Scheduling.
• All input data is entered in "continuous-time" (begin- and end-times) and digitized, i.e., discretized or distributed accordingly.
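The digitization step described above can be sketched as follows. This is a hypothetical helper, not IMPRESS's actual routine: it maps a continuous-time activity (begin, end) onto a uniform discrete-time grid and returns the "time-portion" of each period the activity occupies, the fraction a big-bucket (N-ary) formulation would assign per operation.

```python
# Illustrative sketch (assumed helper, not IMPRESS's API): digitize a
# continuous-time activity (begin, end) onto a uniform discrete-time grid,
# returning the fraction ("time-portion") of each period it occupies.
def digitize(begin, end, horizon, dt):
    n = int(horizon / dt)
    portions = []
    for k in range(n):
        lo, hi = k * dt, (k + 1) * dt
        overlap = max(0.0, min(end, hi) - max(begin, lo))
        portions.append(overlap / dt)
    return portions

# An activity from t=1.5 to t=3.25 on a 6-period grid of duration 1.0 each:
print(digitize(1.5, 3.25, 6.0, 1.0))   # [0.0, 0.5, 1.0, 0.25, 0.0, 0.0]
```

With un-ary (small-bucket) periods one would instead round these portions to 0/1 occupancy, which is the scheduling case the slide contrasts with planning.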
11. Scheduling
[Diagram slide; recoverable content:] Crude planning is performed without the blender scheduling details; its solution gives the duration of each blender grade within each time-period. Scheduling is then performed for the blender only, including setup, startup, switchover and shutdown.
12. [Flowsheet slide; recoverable content:] Crudes A-D are converted via crude recipes through the refining units into products A-H via product recipes. Crude scheduling, operational-modes scheduling and products planning pass their resulting targets to blender scheduling/RTO, including setup, startup, switchover and shutdown: 1st Planuling (big-buckets), 2nd Scheduling (small-buckets).
13. User/Adhoc Linear Constraints
There are two ways to add user, adhoc or custom linear/logistics constraints or formulations:
• Inside IML/IPL, using special frames with access to UOPSS flow, holdup, setup, startup, etc. variables, i.e.:
UOLConstraint-&sUnit,&sOperation,&sName,&iBegin,&iEnd,@sType,@rValue,@rWeight
• Import a "foreign" LP (CPLEX) file generated by GAMS, AMPL, MOSEL, etc. Variables are referenced as "Xnnn" and constraints as "Fmmm"; if "nnn" corresponds to an IMPRESS-generated variable index number then the IMPRESS variable will be used instead. All "mmm" constraint index numbers correspond to new constraint instances (or cuts/valid inequalities).
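The "Xnnn"/"Fmmm" naming convention above can be sketched with a small classifier. This is a hypothetical illustration, not the actual IMPRESS parser: names from a foreign LP file are mapped onto existing IMPRESS variable indices when the index already exists, while every constraint name is treated as a new instance.

```python
import re

# Hypothetical sketch of the naming convention (not the IMPRESS parser):
# "Xnnn" reuses an existing IMPRESS variable when index nnn is already
# generated; every "Fmmm" becomes a new constraint (cut / valid inequality).
def classify(name, existing_variable_indices):
    m = re.fullmatch(r"([XF])(\d+)", name)
    if not m:
        raise ValueError("expected Xnnn or Fmmm, got %r" % name)
    kind, idx = m.group(1), int(m.group(2))
    if kind == "X":
        reused = idx in existing_variable_indices
        return ("variable", idx, "existing" if reused else "new")
    return ("constraint", idx, "new")

print(classify("X17", {3, 17, 42}))   # ('variable', 17, 'existing')
print(classify("F5", {3, 17, 42}))    # ('constraint', 5, 'new')
```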
14. User/Adhoc Nonlinear Constraints
There are three ways to add user, adhoc or custom nonlinear/quality constraints or formulations:
• Single-value formulas using conditions on "black-box" subtype unit-operations to calculate port flows, rates, yields, densities, components, etc.
• Single-value functions for "coefficients" as functions of conditions on "black-box" subtype unit-operations, written in C, C++ or Fortran; useful for physical properties such as enthalpy, entropy and fugacity.
• Multi-value functions on "black-box" subtype unit-operations, written in C, C++ or Fortran.
16. Formula Intrinsic/Extrinsic Functions
• Unary intrinsic functions supported: ABS, SQRT, LN, LOG, EXP, SIN, COS, TAN, DEG, RAD, INT, ROUND, FLOOR, CEILING, FACT, IF, NOT, and URN, NRN (uniform and normal random numbers).
• Binary and N-ary intrinsic functions: EQ, NE, LE, LT, GE, GT, MIN, MAX, and MNL, MXL, KIP, LIP, SIP, KIP2, LIP2, SIP2 (monotonic splines using piecewise Hermite polynomials).
• Extrinsic unary, binary and N-ary functions: XFU1, XFU2, XFU3, XFB1, XFB2, XFB3, and XFN1, XFN2, XFN3.
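A monotonic piecewise-Hermite spline of the kind named above (KIP/LIP/SIP) can be sketched with the Fritsch-Carlson construction. This is our own compact sketch, not IMPRESS's code, and the example data are assumed: monotone knots yield a monotone curve, exactly the property a distillation (evaporation) curve needs.

```python
import math

# Monotone piecewise-cubic Hermite interpolant (Fritsch-Carlson construction;
# our sketch, not IMPRESS's implementation of KIP/LIP/SIP).
def monotone_hermite(xs, ys):
    n = len(xs)
    d = [(ys[i + 1] - ys[i]) / (xs[i + 1] - xs[i]) for i in range(n - 1)]
    m = [d[0]] + [(d[i - 1] + d[i]) / 2.0 for i in range(1, n - 1)] + [d[-1]]
    for i in range(1, n - 1):              # flat where the slope changes sign
        if d[i - 1] * d[i] <= 0.0:
            m[i] = 0.0
    for i in range(n - 1):                 # clamp into the monotone region
        if d[i] == 0.0:
            m[i] = m[i + 1] = 0.0
        else:
            a, b = m[i] / d[i], m[i + 1] / d[i]
            s = a * a + b * b
            if s > 9.0:
                t = 3.0 / math.sqrt(s)
                m[i], m[i + 1] = t * a * d[i], t * b * d[i]

    def f(x):
        i = 0
        while i < n - 2 and x > xs[i + 1]:
            i += 1
        h = xs[i + 1] - xs[i]
        t = (x - xs[i]) / h
        h00 = (1 + 2 * t) * (1 - t) ** 2   # cubic Hermite basis functions
        h10 = t * (1 - t) ** 2
        h01 = t * t * (3 - 2 * t)
        h11 = t * t * (t - 1)
        return h00 * ys[i] + h10 * h * m[i] + h01 * ys[i + 1] + h11 * h * m[i + 1]

    return f

# Assumed evaporation-curve data: % evaporated vs temperature.
xs = [0.0, 10.0, 50.0, 90.0, 100.0]
ys = [90.0, 120.0, 180.0, 260.0, 320.0]
f = monotone_hermite(xs, ys)
```

Unlike a natural cubic spline, this interpolant cannot overshoot between knots, so interpolated cutpoint temperatures stay physically plausible.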
17. Solving via SLPQPE (not SQP, SLQP, SQLP)
• What is SLPQPE?
– It is identical to SLP (Zhang et al., 1985) except that at each major iteration of the nonlinear program a QP is solved instead of the LP if there are quadratic terms in the objective function.
– J. Zhang et al., "An Improved Successive Linear Programming Algorithm", Management Science, 1985.
• Where is it useful?
– When the objective function is a mix of linear (1-norm) and quadratic (2-norm) terms, such as found in industrial planning, scheduling, control and reconciliation problems, SLPQPE has been found to be extremely effective in practice.
• Where is it usually not useful?
– When the objective function has an indefinite Hessian (mixed-sign eigenvalues), i.e., mixed-sign diagonal quadratic terms.
– However, industrial optimization and estimation problems do not usually have an indefinite objective-function Hessian, although the Hessian of the Lagrangian most likely will be indefinite.
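One ingredient of SLP can be shown in miniature. This toy is our illustration, not the SLPQPE code: a smooth function is minimized by repeatedly solving the linearized problem inside a shrinking box (trust region); in one dimension the LP subproblem's solution is simply the box endpoint in the descent direction.

```python
# Toy successive-linear-programming loop (illustration only, not SLPQPE):
# minimize smooth f(x) via linearized subproblems on a shrinking box.
def slp_minimize(f, df, x0, step=1.0, shrink=0.5, iters=60):
    x = x0
    for _ in range(iters):
        g = df(x)
        cand = x - step if g > 0 else x + step  # LP solution sits on box edge
        if f(cand) < f(x):
            x = cand                            # accept the step
        else:
            step *= shrink                      # reject: shrink trust region
    return x

f = lambda x: (x - 2.0) ** 2 + 1.0
df = lambda x: 2.0 * (x - 2.0)
x_star = slp_minimize(f, df, x0=10.0)
print(x_star)   # converges close to the minimizer at 2.0
```

In SLPQPE the same outer loop would hand the quadratic objective terms to a QP subproblem instead of linearizing them away, which is why it converges so well on mixed 1-norm/2-norm industrial objectives.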
18. Other Techniques found in IMPRESS
Differentiation Engine:
• All 1st-order partial derivatives are either supplied analytically or computed numerically, to analytical quality, using complex-step differencing and graph-colouring.
• The modeler or user is not responsible for providing derivatives.
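The complex-step trick mentioned above can be shown in a few lines (our own minimal sketch, with an assumed test function): for an analytic f, Im(f(x + ih))/h approximates f'(x) with no subtractive cancellation, so h can be made tiny and the result is of analytical quality.

```python
import cmath
import math

# Complex-step differencing in miniature (our sketch): no subtraction of
# nearly equal numbers, so the step h can be far below sqrt(machine-eps).
def complex_step_derivative(f, x, h=1e-30):
    return f(complex(x, h)).imag / h

f = lambda z: cmath.exp(z) * cmath.sin(z)        # assumed analytic test function
x = 0.7
exact = math.exp(x) * (math.sin(x) + math.cos(x))  # d/dx e^x sin x
approx = complex_step_derivative(f, x)
print(abs(approx - exact))                       # essentially machine precision
```

A forward difference (f(x+h)-f(x))/h with h this small would return 0; the complex step does not, which is what makes the derivatives "of analytical quality".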
Diagnostic Engine (work in progress):
• Experience says that over 95% of IOPs have infeasibilities that occur in the linear part of the model, where these can be quickly identified in LP primal presolve.
• The rest require artificial, or what we call "excursion", variables in both the integer and nonlinear parts of the model; these are generated automatically.