This document summarizes Maria Fox's talk on planning with theories. The talk introduces temporal planning, heuristic search techniques like relaxed plan construction, and challenges in hybrid planning involving continuous change. It discusses an application in nuclear waste processing where rods heat over time and interactions are temperature dependent. The talk presents an overall framework called "Planning Modulo Theories" that combines discrete planning with reasoning about continuous processes and constraints. It shows how processes can be modeled in PDDL+ and discusses building linear programs from developing plans to determine action timings.
Deep Reinforcement Learning with Double Q-Learning – SeungHyeok Baek
presentation for Lab seminar
Double DQN Algorithm of Deepmind
Van Hasselt, Hado, Arthur Guez, and David Silver. "Deep Reinforcement Learning with Double Q-Learning." AAAI. Vol. 2. 2016.
Deep Reinforcement Learning: Q-Learning – Kai-Wen Zhao
This slide deck reviews deep reinforcement learning, especially Q-Learning and its variants. We introduce the Bellman operator and approximate it with a deep neural network. Last but not least, we review the classic DeepMind paper in which an Atari-playing agent surpasses human performance. Some tips for stabilizing DQN are also included.
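The core idea behind Double DQN can be shown in its simplest, tabular form. The sketch below is illustrative, not the paper's deep-network implementation: one table selects the greedy action, the other evaluates it, which reduces the maximization bias of standard Q-learning. The `actions` argument and dictionary-based tables are assumptions for this toy setting.

```python
import random

def double_q_update(QA, QB, s, a, r, s_next, actions, alpha=0.1, gamma=0.99):
    """One tabular double Q-learning step: select the greedy next action
    with one table, evaluate it with the other (the idea Double DQN
    lifts to deep networks via the online and target nets)."""
    if random.random() < 0.5:
        QA, QB = QB, QA  # update the other table, symmetrically
    best = max(actions, key=lambda ap: QA.get((s_next, ap), 0.0))  # argmax w.r.t. QA
    target = r + gamma * QB.get((s_next, best), 0.0)               # value from QB
    q_sa = QA.get((s, a), 0.0)
    QA[(s, a)] = q_sa + alpha * (target - q_sa)
```

Decoupling selection from evaluation is the whole trick: a single table that both picks and scores the maximum systematically overestimates noisy Q-values.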
This problem represents an interesting opportunity for scientists and statisticians to collaborate since the problem is too big for either community. The science is not well established, although fairly sophisticated ice flow models exist. They are even becoming relevant to explain some of the complexity seen in observational data. At the same time, the complex phenomena we see in observations may not be particularly relevant to assessing the risks of significant increases in sea level rise over the near future. The talk will review what we have learned about this problem through the PISCEES SciDAC project. This problem is rich with challenges and opportunities, particularly for realigning how our two communities engage each other. The talk will review the computational, scientific, and mathematical "reality checks" that might stop any reasonable person from considering this topic further. I then will point out how each of these challenges could be mitigated if these different perspectives were better integrated.
We present a survey of computational and applied mathematical techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties.
Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
This document discusses challenges and opportunities for using machine learning and data mining techniques on big climate data. It describes various types of climate and Earth observation data available from satellites and models. Research highlights are presented on using pattern mining to track ocean eddies, extreme value theory to study heatwaves and rainfall, and relationship mining to study seasonal hurricane activity. The challenges of analyzing multi-scale, heterogeneous climate data are also discussed.
GRASP approach to RCPSP with min-max robustness objective – csandit
This paper deals with the Resource-Constrained Project Scheduling Problem (RCPSP) under activity-duration uncertainty. Based on scenarios, the objective is to minimize the worst-case performance among a set of initial scenarios, which is referred to as the min-max robustness objective. Given the complexity of the tackled problem, we propose applying the GRASP method, a simple and effective multi-start metaheuristic. The proposed approach incorporates an adaptive greedy function based on priority rules to construct new solutions, and a local search with a forward-backward heuristic in the improvement phase. Two benchmark data sets are investigated: the Patterson set and the PSPLIB J30 set. Comparative results show that the proposed enhanced GRASP outperforms the basic procedure in robustness optimization.
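The overall GRASP loop with a min-max objective can be sketched generically. This is a minimal skeleton, not the paper's algorithm: `construct`, `local_search`, and `makespan` stand in for the problem-specific priority-rule construction, forward-backward improvement, and scenario evaluation.

```python
import random

def grasp_minmax(construct, local_search, makespan, scenarios, iters=50, seed=1):
    """GRASP skeleton for the min-max robustness objective: each restart
    builds a randomized greedy solution, improves it by local search, and
    keeps the solution whose worst-case makespan over all scenarios is
    lowest. All three callables are hypothetical placeholders."""
    rng = random.Random(seed)
    best, best_worst = None, float('inf')
    for _ in range(iters):
        sol = local_search(construct(rng))
        worst = max(makespan(sol, sc) for sc in scenarios)  # min-max objective
        if worst < best_worst:
            best, best_worst = sol, worst
    return best, best_worst
```

The only change relative to a standard GRASP is the acceptance test: solutions are compared by their worst scenario rather than by a single deterministic makespan.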
In the first part of the talk, we will present a sensitivity analysis of a novel sea ice model. neXtSIM is a continuous Lagrangian numerical model that uses an elastobrittle rheology to simulate the ice response to external forces. The response of the model is evaluated in terms of simulated ice drift distances from its initial position and from the mean position of the ensemble. The simulated ice drift is decomposed into advective and diffusive parts that are characterized separately, both spatially and temporally, and compared to what is obtained with a free-drift model, i.e. when the ice rheology does not play any role. Overall, the large-scale response of neXtSIM is correlated to the ice thickness and wind velocity fields, while the free-drift model response is mostly correlated to the wind velocity pattern only. The seasonal variability of the model sensitivity shows the role of ice compactness and rheology at both local and Arctic scales. Indeed, the ice drift simulated by neXtSIM in summer is close to the free-drift model, while the more compact and solid ice pack shows a significantly different mechanical and drift behavior in winter. In contrast to the free-drift model, neXtSIM reproduces the sea ice Lagrangian diffusion regimes found in observed trajectories. The forecast capability of neXtSIM is also evaluated using a large set of real buoy trajectories. We found that neXtSIM performs better in simulating sea ice drift, both in terms of forecast error and as a tool to assist search-and-rescue operations. Adaptive meshes, such as the one used in neXtSIM, are used to model a wide variety of physical phenomena. Some of these models, in particular those of sea ice movement, use a remeshing process to remove and insert mesh points at various points in their evolution.
This represents a challenge in developing compatible data assimilation schemes, as the dimension of the state space we wish to estimate can change over time when these remeshings occur.
In the second part of the talk, we highlight the challenges that such a modeling framework poses for the data assimilation setup. We then describe a remeshing scheme for an adaptive mesh in one dimension. The development of advanced data assimilation methods appropriate for such a moving, remeshed grid is presented. Finally, we discuss the extension of these techniques to two-dimensional models such as neXtSIM.
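The dimension-change problem can be made concrete with a toy one-dimensional remap. This is an illustrative sketch, not the talk's scheme: when remeshing replaces the mesh points, the state vector must be carried to the new mesh, here by simple linear interpolation, and its length changes with the number of nodes.

```python
def remap_state(old_x, old_u, new_x):
    """Map a piecewise-linear state from one 1-D mesh (nodes old_x, values
    old_u) to another (nodes new_x) by linear interpolation; a minimal
    stand-in for the remeshing step that changes the state dimension
    between assimilation cycles."""
    def interp(x):
        # locate the interval [old_x[i], old_x[i+1]] containing x
        for i in range(len(old_x) - 1):
            if old_x[i] <= x <= old_x[i + 1]:
                w = (x - old_x[i]) / (old_x[i + 1] - old_x[i])
                return (1 - w) * old_u[i] + w * old_u[i + 1]
        return old_u[0] if x < old_x[0] else old_u[-1]  # clamp outside the mesh
    return [interp(x) for x in new_x]
```

An ensemble filter must apply such a remap consistently to every member before the analysis step, since covariances are only meaningful between states of the same dimension on the same mesh.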
Summit Entertainment is an American film studio and subsidiary of Lions Gate Entertainment headquartered in California. It was founded in 1991 to handle film sales internationally and later expanded into film production and distribution. Some of Summit's early hits included the American Pie films. In 2006 it became an independent studio with access to over $1 billion in financing. The Twilight film series was Summit's most successful franchise, with the latter films in the series grossing over $700 million each worldwide.
Webinar presentation March 3, 2016.
The CSCC deliverable, Practical Guide to Hybrid Cloud Computing, contains prescriptive guidance for the successful deployment of hybrid cloud computing. The whitepaper outlines the key considerations that customers must take into account as they adopt hybrid cloud computing and covers the strategic and tactical activities for decision makers implementing hybrid cloud solutions as well as technical considerations for deployment.
Download the deliverable: http://www.cloud-council.org/resource-hub
Steel and concrete modular construction is better suited than wood-framed modular for commercial and high-end residential buildings as it allows for larger openings, more flexible floor plans, fire resistance, and LEED certification. Modular units are constructed in a controlled factory setting using an assembly line process, then transported and assembled on-site, providing construction speed, cost savings, and faster occupancy compared to traditional on-site construction. Steel and concrete modular structures are more durable and rigid with longer lifespan than wood structures.
Hybrid integration reference architecture – Kim Clark
The ownership boundary of the typical enterprise now encompasses a much broader IT landscape. It is common to see that landscape stretch out to cloud native development platforms, software as a service, dependencies on external APIs from business partners, a mobile workforce and an ever growing range of digital channels. The integration surface area is dramatically increased and the integration patterns to support it are evolving just as quickly. These are the challenges we recognise as "hybrid integration". We will explore what a reference architecture for hybrid integration might look like, and how IBM's integration portfolio is growing and changing to meet the needs of digital transformation. This deck comes from the following article http://ibm.biz/HybridIntRefArch and is also described in this video http://ibm.biz/HybridIntRefArchYouTube
Webinar presentation March 9, 2017
IT environments are now fundamentally hybrid in nature – devices, systems, and people are spread across the globe, and at the same time virtualized. Achieving integration across this ever changing environment, and doing so at the pace of modern digital initiatives, is a significant challenge.
This presentation introduces a hybrid integration reference architecture published by the Cloud Standards Customer Council. Learn best practices from leading-edge enterprises that are starting to leverage a hybrid integration platform to take advantage of best of breed cloud-based and on-premises integration approaches.
This webinar draws from the CSCC's deliverable, Cloud Customer Architecture for Hybrid Integration. Read it here: http://www.cloud-council.org/deliverables/cloud-customer-architecture-for-hybrid-integration.htm
Five Steps to Creating a Secure Hybrid Cloud Architecture – Amazon Web Services
A hybrid architecture is one of the easiest ways to securely address new application requirements and cloud-first development initiatives. This approach allows you to start small and expand as your requirements change while maintaining a strong security posture. In this session, you will learn the five key steps to building a hybrid architecture on AWS using the VM-Series next-generation firewall.
This document discusses virtualization and cloud computing, specifically hybrid cloud architectures. It defines hybrid cloud as a cloud computing environment where an organization provides and manages some resources in-house and has others provided externally from a public cloud provider. The document outlines the key considerations for hybrid cloud planning, examines hybrid cloud architecture which combines a private cloud with at least one public cloud, and discusses the advantages of cost efficiency, isolation, availability and flexibility as well as the disadvantages of data beyond the firewall and greater internal IT maintenance required.
Modular coordination is a concept where buildings and components are dimensioned and positioned based on basic modular units. This allows for dimensional compatibility and simplifies construction. The basic module is 100mm denoted as 1M. Multiples and fractions of the basic module can also be used. A modular reference system establishes grids to coordinate the placement and sizing of building elements and components. Structural elements like walls, floors and columns are dimensioned to fit within the modular grids, as are non-structural components and finishes. This standardization aims to reduce waste and improve construction efficiency.
Modular construction involves prefabricating building components off-site and transporting them to the construction location for assembly. This document discusses the benefits of modular construction compared to standard construction methods. Some key benefits include reduced construction costs through industrialized manufacturing, faster installation times, standardized components that improve productivity, and reusability of modular units that provide flexibility. The document also explains modular coordination, which involves dimensioning buildings and components using a standard module unit of 100mm to facilitate industrialized production and assembly of standardized building parts.
Differentiating between web APIs, SOA, & integration…and why it matters – Kim Clark
At a high level, both SOA and web APIs seem to solve the same problem – expose business function in real-time and in a reusable way. This tutorial looks at how these initiatives are different and how they align into an evolving integration architecture. It discusses how API Management differs from the integration architectures that came before it, such as SOA and EAI.
This document discusses industrialized building systems (IBS) and modular coordination. It defines IBS as building systems where prefabricated structural components are manufactured off-site and assembled with minimal work. Modular coordination standardizes dimensions to facilitate industrial production and assembly of building components. The document outlines various IBS classifications including frame, panel and box systems. It notes advantages like reduced labor, waste and faster completion compared to traditional construction methods.
AWS re:Invent 2016: Hybrid Architecture Design: Connecting Your On-Premises W... – Amazon Web Services
You’re trying to minimize your time to deploy applications, reduce capital expenditure, and take advantage of the economies of scale made possible by using Amazon Web Services; however, you have existing on-premises applications that are not quite ready for complete migration. Hybrid architecture design can help! In this session, we discuss the fundamentals that any architect needs to consider when building a hybrid design from the ground up. Attendees get exposure to Amazon VPC, VPNs, Amazon Direct Connect, on-premises routing and connectivity, application discovery and definition, and how to tie all of these components together into a successful hybrid architecture.
Learn how Aerospike's Hybrid Memory Architecture brings transactions and analytics together to power real-time Systems of Engagement (SOEs) for companies across AdTech, financial services, telecommunications, and eCommerce. We take a deep dive into the architecture, including use cases, topology, Smart Clients, XDR and more. Aerospike delivers predictable performance and high uptime and availability at the lowest total cost of ownership (TCO).
The document summarizes a presentation on building automation systems (BAS) and their role in managing energy usage and demand in green, intelligent buildings. It discusses how BAS can integrate with the smart grid to support distributed energy and demand response. It also outlines the agenda, benefits of intelligent buildings, services that building systems can provide, and the vision of the Continental Automated Buildings Association (CABA) to advance integrated technology in buildings.
Big data is transforming terrestrial ecosystem science by enabling new approaches to model evaluation, development and prediction. Several examples are provided where large datasets on atmospheric measurements, remote sensing, and flux towers are integrated with models. This allows processes to be better understood from data analysis and provides opportunities to improve models. However, tools are still needed to easily facilitate comparison and assimilation of diverse data with models. The eMAST initiative aims to develop infrastructure for predictive ecosystem models that are fully informed by all relevant data.
The document discusses the equations of motion used in weather forecasting and climate change studies. It begins with an introduction to geophysical fluid dynamics and the distinguishing effects of rotation and stratification. It then outlines the basic equations of motion, including conservation of momentum, mass, and energy, together with the equation of state. It describes how these equations are solved on grids using numerical models. It discusses the challenges of modeling processes at different spatial scales, from synoptic to urban. It also addresses challenges in tropical weather prediction and how dynamical prediction of weather over South Asia has improved.
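For reference, the conservation laws named above can be written in a standard local-Cartesian, hydrostatic textbook form (this is the generic set used in geophysical fluid dynamics, not necessarily the exact form presented in the talk):

```latex
\begin{aligned}
\frac{Du}{Dt} - fv &= -\frac{1}{\rho}\frac{\partial p}{\partial x} + F_x,
\qquad
\frac{Dv}{Dt} + fu = -\frac{1}{\rho}\frac{\partial p}{\partial y} + F_y
&& \text{(momentum)}\\
\frac{\partial p}{\partial z} &= -\rho g
&& \text{(hydrostatic balance)}\\
\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{u}) &= 0
&& \text{(mass)}\\
c_p \frac{DT}{Dt} - \frac{1}{\rho}\frac{Dp}{Dt} &= Q
&& \text{(energy)}\\
p &= \rho R T
&& \text{(state)}
\end{aligned}
```

Here $f$ is the Coriolis parameter, $F_x, F_y$ are friction terms, $Q$ is diabatic heating, and $D/Dt$ is the material derivative; rotation ($f$) and stratification (the hydrostatic and energy equations) are exactly the two distinguishing effects the introduction highlights.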
Physical processes in the earth system are modeled with mathematical representations called parameterizations. This talk will describe some of the conceptual approaches and mathematics used to describe physical parameterizations, focusing on cloud parameterizations. This includes tracing physical laws to discrete representations in coarse-scale models. Clouds illustrate several of the complexities and techniques common to many physical parameterizations, including the problem of differing scales and sub-grid-scale variability. Mathematical methods for dealing with the sub-grid scale will be discussed. Inexactness and indeterminacy in both weather and climate will also be covered, including the problems of indeterminate parameterizations and inexact initial conditions. Different mathematical methods, including stochastic methods, will be described, with examples from contemporary earth system models.
The Climateprediction.net programme, big data climate modelling – David Wallom
The Climateprediction.net program harnesses over 600,000 volunteers and their computers to conduct large ensemble climate simulations through the BOINC distributed computing platform. It has run over 130 million model years across more than 25 subprojects exploring uncertainties in climate predictions and extreme weather attribution. Current work includes super-ensembles examining stratosphere-troposphere coupling and mid-latitude dynamics, as well as the HAPPI project simulating 1.5°C and 2°C warming scenarios consistent with the goals of the Paris Agreement.
Towards Exascale Simulations for Regional-Scale Earthquake Hazard and Risk – inside-BigData.com
The document discusses the goals and progress of the Department of Energy's Exascale Computing Project (ECP) to develop exascale simulations for regional-scale earthquake hazard and risk assessments. The ECP aims to (1) develop computational frameworks coupling geophysics and infrastructure modeling codes, (2) increase frequency resolution and reduce runtimes through advances in hardware, software, and algorithms, and (3) establish performance benchmarks to track progress towards exascale capabilities. Initial regional demonstrations in 2017 showed promising realism in simulated ground motions and infrastructure response. Further work includes waveform inversions, GPU optimizations, and assessing how far simulations can augment probabilistic hazard assessments.
Climate Modeling and Future Climate Change Projections – Jesbin Baidya
Climate models are mathematical representations of the physical processes that control the climate system. The most sophisticated climate models are called General Circulation Models (GCMs) which attempt to simulate all relevant atmospheric and oceanic processes. GCMs are based on fundamental laws of physics and solve complex equations using computers. They allow scientists to project potential future climate changes from increasing greenhouse gases by assessing how the climate system may respond to restore equilibrium. While climate models have uncertainties, they provide valuable insights when evaluated against historical climate data.
FPBM – PG subject in Construction Management – deepika977036
This document discusses heuristic procedures for reactive project scheduling. It outlines four reactive scheduling procedures: priority-list scheduling, fixed resource allocation, a sampling approach, and a heuristic weighted earliness-tardiness procedure. It also describes the properties of heuristic algorithms and an experimental setup testing the reactive procedures on 600 project instances from the PSPLIB 120 data set. The conclusions suggest that further research could apply sampling to minimize expected makespan or use different priority lists.
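Priority-list scheduling, the first of the procedures above, is usually realized with a serial schedule-generation scheme. The sketch below is a simplified illustration with a single renewable resource, not the document's procedure: activities are placed one at a time in priority order, each at the earliest start respecting precedence and capacity.

```python
def serial_sgs(durations, preds, demand, capacity, priority):
    """Serial schedule-generation scheme for one renewable resource:
    schedule activities in the given priority order, each at the earliest
    start that satisfies its predecessors and the resource capacity."""
    horizon = sum(durations.values()) + 1
    usage = [0] * horizon          # resource units in use at each period
    start, finish = {}, {}
    for j in priority:
        est = max((finish[p] for p in preds.get(j, [])), default=0)
        t = est
        # push the start right until the activity fits within capacity
        while any(usage[tau] + demand[j] > capacity
                  for tau in range(t, t + durations[j])):
            t += 1
        start[j], finish[j] = t, t + durations[j]
        for tau in range(t, finish[j]):
            usage[tau] += demand[j]
    return start
```

Reactive procedures built on this scheme differ mainly in how the priority list is produced: fixed rules, sampling from a rule-induced distribution, or weights derived from earliness-tardiness penalties.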
AWS re:Invent 2016: Hybrid Architecture Design: Connecting Your On-Premises W...Amazon Web Services
You’re trying to minimize your time to deploy applications, reduce capital expenditure, and take advantage of the economies of scale made possible by using Amazon Web Services; however, you have existing on-premises applications that are not quite ready for complete migration. Hybrid architecture design can help! In this session, we discuss the fundamentals that any architect needs to consider when building a hybrid design from the ground up. Attendees get exposure to Amazon VPC, VPNs, Amazon Direct Connect, on-premises routing and connectivity, application discovery and definition, and how to tie all of these components together into a successful hybrid architecture.
Learn how Aerospike's Hybrid Memory Architecture brings transactions and analytics together to power real-time Systems of Engagement ( SOEs) for companies across AdTech, financial services, telecommunications, and eCommerce. We take a deep dive into the architecture including use cases, topology, Smart Clients, XDR and more. Aerospike delivers predictable performance, high uptime and availability at the lowest total cost of ownership (TCO).
The document summarizes a presentation on building automation systems (BAS) and their role in managing energy usage and demand in green, intelligent buildings. It discusses how BAS can integrate with the smart grid to support distributed energy and demand response. It also outlines the agenda, benefits of intelligent buildings, services that building systems can provide, and the vision of the Continental Automated Buildings Association (CABA) to advance integrated technology in buildings.
Big data is transforming terrestrial ecosystem science by enabling new approaches to model evaluation, development and prediction. Several examples are provided where large datasets on atmospheric measurements, remote sensing, and flux towers are integrated with models. This allows processes to be better understood from data analysis and provides opportunities to improve models. However, tools are still needed to easily facilitate comparison and assimilation of diverse data with models. The eMAST initiative aims to develop infrastructure for predictive ecosystem models that are fully informed by all relevant data.
The document discusses equations of motion used in weather forecasting and climate change studies. It begins with an introduction to geophysical fluid dynamics and the distinguishing effects of rotation and stratification. It then outlines the basic equations of motion, including conservation of momentum, mass, energy, and state. It describes how these equations are solved on grids using numerical models. It discusses the challenges of modeling processes at different spatial scales from synoptic to urban. It also addresses challenges in tropical weather prediction and how dynamical prediction of weather over South Asia has improved.
Physical processes in the earth system are modeled with mathematical representations called parameterizations. This talk will describe some of the conceptual approaches and mathematics used do describe physical parameterizations focusing on cloud parameterizations. This includes tracing physical laws to discrete representations in coarse scale models. Clouds illustrate several of the complexities and techniques common to many physical parameterizations. This includes the problem of different scales, sub-grid scale variability. Discussions of mathematical methods for dealing with the sub-grid scale will be discussed. In-exactness or indeterminate problems for both weather and climate will be discussed, including the problems of indeterminate parameterizations, and inexact initial conditions. Different mathematical methods, including the use of stochastic methods, will be described and discussed, with examples from contemporary earth system models.
The Climateprediction.net programme, big data climate modellingDavid Wallom
The Climateprediction.net program harnesses over 600,000 volunteers and their computers to conduct large ensemble climate simulations through the BOINC distributed computing platform. It has run over 130 million model years across more than 25 subprojects exploring uncertainties in climate predictions and extreme weather attribution. Current work includes super-ensembles examining stratosphere-troposphere coupling and mid-latitude dynamics, as well as the HAPPI project simulating 1.5°C and 2°C warming scenarios consistent with the goals of the Paris Agreement.
Towards Exascale Simulations for Regional-Scale Earthquake Hazard and Riskinside-BigData.com
The document discusses the goals and progress of the Department of Energy's Exascale Computing Project (ECP) to develop exascale simulations for regional-scale earthquake hazard and risk assessments. The ECP aims to (1) develop computational frameworks coupling geophysics and infrastructure modeling codes, (2) increase frequency resolution and reduce runtimes through advances in hardware, software, and algorithms, and (3) establish performance benchmarks to track progress towards exascale capabilities. Initial regional demonstrations in 2017 showed promising realism in simulated ground motions and infrastructure response. Further work includes waveform inversions, GPU optimizations, and assessing how far simulations can augment probabilistic hazard assessments.
Climate Modeling and Future Climate Change ProjectionsJesbin Baidya
Climate models are mathematical representations of the physical processes that control the climate system. The most sophisticated climate models are called General Circulation Models (GCMs) which attempt to simulate all relevant atmospheric and oceanic processes. GCMs are based on fundamental laws of physics and solve complex equations using computers. They allow scientists to project potential future climate changes from increasing greenhouse gases by assessing how the climate system may respond to restore equilibrium. While climate models have uncertainties, they provide valuable insights when evaluated against historical climate data.
fpbm- pg subject in Construction Managamentdeepika977036
This document discusses heuristic procedures for reactive project scheduling. It outlines four reactive scheduling procedures - priority lists scheduling, fixed resource allocation, sampling approach, and heuristic weighted earliness-tardiness procedure. It also describes the properties of heuristic algorithms and experimental setup testing the reactive procedures on 600 project instances from the PSPLIB 120 data set. The conclusions suggest that further research could apply sampling to minimize expected makespan or use different priority lists.
DSD-SEA 2023 Climate Stress Test Toolbox - BoisgontierDeltares
Presentation by Hélène Boisgontier (Deltares) at the Seminar Models and decision-making in the wake of climate uncertainties, during the Deltares Software Days South-East Asia 2023. Wednesday, 22 February 2023, Singapore.
An Exact Branch And Bound Algorithm For The General Quadratic Assignment ProblemJoe Andelija
The document describes an exact branch and bound algorithm for solving the general quadratic assignment problem. It reviews several existing exact algorithms and integer programming formulations for the QAP. The author proposes a new exact algorithm based on linearizing the general QAP into a linear assignment problem that is smaller in size. Computational results and comparisons to other methods are discussed.
This document is the table of contents for Calculus, Early Transcendentals, Second Edition by Michael Sullivan. It lists 16 chapters that cover topics in calculus including limits, derivatives, integrals, vector calculus, and differential equations. The preface provides advice for students on how to effectively use the textbook to learn calculus, such as reading actively before class and using the examples and features to build understanding. The table of contents provides an overview of the scope and organization of content covered in the calculus textbook.
Climate models are mathematical representations of physical processes that determine climate. They are used to understand climate processes and project future climate scenarios. Simplifications are needed due to complex interactions and limited computational capabilities. Models have improved over time with increased resolution and process representation. Observational evidence shows unequivocal warming globally with some regional precipitation variability. Projections show continued warming and changes in precipitation patterns for South Asia over the 21st century, but models have uncertainties. Continued improvements aim to better capture regional climate impacts.
Multiphase Flow Modeling and Simulation: HPC-Enabled Capabilities Today and T...inside-BigData.com
In this video from the 2014 HPC User Forum in Seattle, Igor Bolotnov from North Carolina State University presents: Multiphase Flow Modeling and Simulation: HPC-Enabled Capabilities Today and Tomorrow.
Learn more: http://insidehpc.com/video-gallery-hpc-user-forum-2014-seattle/
Dynamic scene understanding using temporal association rulesijunejo
This document describes a thesis defense presentation on dynamic scene understanding using temporal association rules. The presentation covers feature extraction using mean-shift tracking, event modeling through spectral clustering of object trajectories, mining frequent temporal patterns and association rules to learn a traffic scene model, and detecting anomalies by comparing test sequences to the learned model. Accuracy of 97% is achieved on junction and roundabout datasets for spatio-temporal anomaly detection.
20181128 3 voskov efficient and efficient geothermal simulationAlexandros Daniilidis
The document discusses advances in simulation capabilities for complex geothermal processes using the ADGPRS and DARTS simulation platforms. ADGPRS started as a research simulator and has expanded capabilities to include geomechanics, fractures, and chemical interactions. DARTS improves performance further using an operator-based linearization approach on CPU and GPU. Recent applications include modeling of acidizing wells, fracture propagation, uncertainties in geothermal reservoirs, and high-enthalpy geothermal processes. DARTS has demonstrated orders of magnitude better performance than ADGPRS while maintaining accuracy.
This document discusses using deep learning techniques to detect extreme weather patterns in climate data. It begins by outlining the scientific motivation and successes of deep learning in computer vision. It then describes early successes applying deep learning to climate science tasks like classifying tropical cyclones, atmospheric rivers, and weather fronts. Challenges include dealing with multi-variate climate data and lack of labeled examples. Future work involves creating unified deep learning models that can perform detection, localization, and segmentation of extreme weather across different climate datasets.
This document discusses coupling the near-field plume model CORMIX with the far-field hydrodynamic model Delft3D-FLOW to accurately simulate cooling water discharges over different spatial scales. It presents the distributed entrainment sink approach used to dynamically couple the models. Validation shows the coupled model reproduces observed physical phenomena in laboratory and field measurements better than traditional modeling. The coupled approach allows more realistic assessment of environmental impacts and intake temperatures.
Repairing Learning-Enabled Controllers While Preserving What WorksIvan Ruchkin
Presented at the 15th ACM/IEEE International Conference on Cyber-Physical Systems (ICCPS 2024).
Abstract: Learning-enabled controllers have been adopted in various cyber-physical systems (CPS). When a learning-enabled controller fails to accomplish its task from a set of initial states, researchers leverage repair algorithms to fine-tune the controller's parameters. However, existing repair techniques do not preserve previously correct behaviors. Specifically, when modifying the parameters to repair trajectories from a subset of initial states, another subset may be compromised. Therefore, the repair may break previously correct scenarios, introducing new risks that may not be accounted for. Due to this issue, repairing the entire initial state space may be hard or even infeasible. As a response, we formulate the Repair with Preservation (RwP) problem, which calls for preserving the already-correct scenarios during repair. To tackle this problem, we design the Incremental Simulated Annealing Repair (ISAR) algorithm, which leverages simulated annealing on a barriered energy function to safeguard the already-correct initial states while repairing as many additional ones as possible. Moreover, formal verification is utilized to guarantee the repair results. Case studies on an Unmanned Underwater Vehicle (UUV) and OpenAI Gym Mountain Car (MC) show that ISAR not only preserves correct behaviors from previously verified initial state regions, but also repairs 81.4% and 23.5% of broken state spaces in the two benchmarks. Moreover, the average signal temporal logic (STL) robustnesses of the ISAR repaired controllers are larger than those of the controllers repaired using baseline methods.
Computationally Efficient Protocols to Evaluate the Fatigue Resistance of Pol...npaulson
Evaluation and selection of polycrystalline microstructures for
fatigue resistance through computational means is hampered by the high cost of CPFEM for elastic-plastic analysis. In this work, novel approaches are employed to compare the projected HCF (and LCF) resistance of alpha-beta titanium microstructures with a variety of textures and boundary conditions based on mesoscopic FIPs. Specifically, a materials knowledge system approach for modeling of local grain responses based on spatial statistics is developed to quickly evaluate strain fields for a set of statistical volume elements (SVEs) representing a particular microstructure. Then, an explicit integration scheme (or a calibrated function) is developed to estimate the plastic strain in each voxel, allowing for the calculation of FIPs for each SVE and the evaluation of the robustness of each microstructure for HCF applications. This data science approach is orders of magnitude faster than traditional CPFEM methods, making it possible to compare large numbers of microstructures and identify those most suitable.
This document provides notes for an introduction to computational fluid dynamics (CFD) course. It outlines the course organization, learning objectives, and contents. The key points are:
- The course grade is based on homeworks, a report, and a final exam. Interaction is extremely important.
- The learning objectives are to understand the role of C programming in fluid dynamics and numerical methods, learn various CFD terminology and best practices, and be able to set up and analyze simple aerodynamic problems.
- The contents include introductions to partial differential equations, finite difference methods, grids/boundaries, Euler/RANS equations, and case studies. Hands-on lab sessions make up a large part of
Similar to A modular architecture for hybrid planning with theories cp2014 (20)
Main Java[All of the Base Concepts}.docxadhitya5119
This is part 1 of my Java Learning Journey. This Contains Custom methods, classes, constructors, packages, multithreading , try- catch block, finally block and more.
How to Add Chatter in the odoo 17 ERP ModuleCeline George
In Odoo, the chatter is like a chat tool that helps you work together on records. You can leave notes and track things, making it easier to talk with your team and partners. Inside chatter, all communication history, activity, and changes will be displayed.
Macroeconomics- Movie Location
This will be used as part of your Personal Professional Portfolio once graded.
Objective:
Prepare a presentation or a paper using research, basic comparative analysis, data organization and application of economic information. You will make an informed assessment of an economic climate outside of the United States to accomplish an entertainment industry objective.
How to Build a Module in Odoo 17 Using the Scaffold MethodCeline George
Odoo provides an option for creating a module by using a single line command. By using this command the user can make a whole structure of a module. It is very easy for a beginner to make a module. There is no need to make each file manually. This slide will show how to create a module using the scaffold method.
Assessment and Planning in Educational technology.pptxKavitha Krishnan
In an education system, it is understood that assessment is only for the students, but on the other hand, the Assessment of teachers is also an important aspect of the education system that ensures teachers are providing high-quality instruction to students. The assessment process can be used to provide feedback and support for professional development, to inform decisions about teacher retention or promotion, or to evaluate teacher effectiveness for accountability purposes.
How to Manage Your Lost Opportunities in Odoo 17 CRMCeline George
Odoo 17 CRM allows us to track why we lose sales opportunities with "Lost Reasons." This helps analyze our sales process and identify areas for improvement. Here's how to configure lost reasons in Odoo 17 CRM
Introduction to AI for Nonprofits with Tapp NetworkTechSoup
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
Thinking of getting a dog? Be aware that breeds like Pit Bulls, Rottweilers, and German Shepherds can be loyal and dangerous. Proper training and socialization are crucial to preventing aggressive behaviors. Ensure safety by understanding their needs and always supervising interactions. Stay safe, and enjoy your furry friends!
Liberal Approach to the Study of Indian Politics.pdf
A modular architecture for hybrid planning with theories cp2014
1. A Modular Architecture for Hybrid Planning with Theories
Maria Fox
Planning Group, Dept of Informatics
King’s College London, UK
2. The Topic of This Talk
•Planning is moving towards ever more demanding applications:
•What challenges arise for planning in the physical world?
–Time, numeric quantities, continuous change etc
•How do inference and relaxation go together in hybrid planning?
•How does a planner reason about structured types?
3. Outline
•Quick introduction to temporal planning and relaxed plan search
•A challenging application for planning and constraint reasoning
•How physical dynamics complicate planning
•Planning with structured types
•Overall framework: Planning Modulo Theories
“Planning Modulo Theories”, Peter Gregory, Derek Long, Maria Fox and J. Christopher Beck, ICAPS 2012
4. Planning
•Planning is the problem of finding a sequence of concurrent collections of actions to transform an initial state into a goal state
•Suitable when there are long causal chains and inter-dependencies
•Assumes the world can be modelled as a finite collection of state variables and that actions cause changes in the values of those variables
Actions: Preconditions determine whether transitions are possible, effects assign values to state variables
The search space is enormous, so it is explored using relaxations of the problem
5. Planning
•Planning is the problem of finding a sequence of concurrent collections of actions to transform an initial state into a goal state
•Suitable when there are long causal chains and inter-dependencies
•Assumes the world can be modelled as a finite collection of state variables and that actions cause changes in the values of those variables
……Until a plan is found that transforms the initial state into one satisfying the goal
The search space is enormous, so it is explored using relaxations of the problem
6. Heuristic Forward Search for Temporal Planning
Current state
Possible next states
Relaxed plans generated for each evaluated candidate next state
Goal condition
State progression
Heuristic function computation
Abstracted reachability and relaxed plan extraction
State variable assignments and temporal constraints
“The FF Planning System: Fast Plan Generation through Heuristic Search”, Jörg Hoffmann and Bernhard Nebel, JAIR 2001; “Forward-Chaining Partial-Order Planning (POPF)”, A. J. Coles, A. I. Coles, M. Fox, and D. Long, ICAPS, May 2010
Temporal reasoning and constraint propagation
E.g. the delete relaxation: ignore negative effects
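The delete relaxation can be sketched in a few lines. This is a toy example of mine (the actions and fact names are invented, not from the talk): negative effects are ignored, so the reachable fact set only grows, and the number of layers needed to cover the goal gives an FF-style heuristic estimate.

```python
# Toy sketch of the delete relaxation used by FF-style heuristics:
# ignore negative effects and count the fact layers needed to reach the goal.
# (Illustrative example; actions and names are my own.)
actions = {
    "fetch": (set(), {"have_water"}),                    # (preconditions, add effects)
    "heat":  ({"have_water"}, {"heating"}),
    "boil":  ({"have_water", "heating"}, {"boiled"}),
}

def relaxed_layers(state, goal):
    layers = 0
    while not goal <= state:
        # apply every action whose preconditions hold, keeping only add effects
        new = set().union(*(eff for pre, eff in actions.values() if pre <= state))
        if new <= state:
            return None          # goal unreachable even under the relaxation
        state = state | new
        layers += 1
    return layers

print(relaxed_layers(set(), {"boiled"}))   # 3
```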
7. Relaxing the Plan Search Space
Initial state
Reachable in 1 action
Reachable in 2 actions
Reachable in n actions
Relaxation: collect the individual states at each step into a single abstract state at that step. How many steps does it take to reach an abstract state that satisfies the goal?
A state is a valuation for a finite set of variables, an abstract state is an abstract valuation
We have to construct an abstract domain for each of the variables in the state
8. (:durative-action boilwater
:parameters (?w - water)
:duration (= ?duration 93)
:condition (and (over all (heating ?w))
(at start (= (temperature ?w) 7))
)
:effect (and (at start (heating ?w))
(at end (assign (temperature ?w) 100))
(at end (not (heating ?w)))
(at end (boiled ?w))
)
)
?duration is fixed, assuming that water starts at cold tap temperature
The action starts the heating process
The action has the discrete effect of setting the temperature of the water to 100 degrees
“PDDL: The Planning Domain Definition Language” D. McDermott, M.Ghallab, A.Howe, C.Knoblock, A.Ram, M.Veloso, D.Weld, D.Wilkins (The Rules Committee for the First International Planning Competition, 1998)
“PDDL2.1: An extension to PDDL for Expressing Temporal Planning Domains” Maria Fox and Derek Long, JAIR 2003
over all is used to express invariant conditions
[Diagram labels: the durative-action construct of PDDL2.1 with its start conditions, start effects, invariants, end conditions, end effects, and ?duration; (temperature ?w) is a numeric state variable]
9. Abstraction to Semi-lattice
Consider a variable V ∈ ℝ
Domain Abstraction
Applying relaxed steps in relaxed plan construction always causes variable value to climb up the lattice
New assignments combine original value with all newly achieved values at each relaxed step: this is a lattice join operation
[Figure: semi-lattice of intervals — singleton intervals [x1,x1], [x2,x2], …, [xn,xn] at the bottom, wider intervals such as [l2,u2] above them, with the top element ⊤ at the root]
“The Metric-FF Planning System: Translating ‘Ignoring Delete Lists’ to Numeric State Variables”, Jörg Hoffmann, JAIR 2003
11. Adding Constraints
(:action increment...)
(:action decrement...)
(:action double
:precondition (and (<= (x) 3)
(>= (x) 2))
:effect (scaleup (x) 2))
x = 2
x = [2,2]
{increment,decrement,double}
x = [1,4]
{increment,decrement,double}
x = [0,6]
{increment,decrement,double}
x = [-1,7]
{increment,decrement,double}
x = [-2,8]
Goal achieved in 4 steps
(2 <= x <= 3) and x = [1,4]
The lattice meet operation gives x = [2,3], so double yields x = [4,6]
Also: x = [2,5] (increment) and x = [0,3] (decrement)
so the join gives x = [0,6]
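The interval propagation above can be reproduced with a short sketch (my own illustration, not the planner's code; the guard on double follows its precondition 2 <= x <= 3 from the model):

```python
# One relaxed step over the interval abstraction: join the current interval
# with the results of increment, decrement, and double.
def widen(interval):
    lo, hi = interval
    candidates = [(lo, hi), (lo + 1, hi + 1), (lo - 1, hi - 1)]
    glo, ghi = max(lo, 2), min(hi, 3)   # lattice meet with the guard [2,3]
    if glo <= ghi:                      # meet is non-empty, so double applies
        candidates.append((2 * glo, 2 * ghi))
    # lattice join: the smallest interval containing all candidates
    return (min(c[0] for c in candidates), max(c[1] for c in candidates))

x = (2, 2)
trace = [x]
for _ in range(4):                      # four relaxed steps reach the goal
    x = widen(x)
    trace.append(x)
print(trace)   # [(2, 2), (1, 4), (0, 6), (-1, 7), (-2, 8)]
```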
12. A Challenging Problem
•Sellafield is the site of a nuclear fuel reprocessing plant and also of two old nuclear plants, Windscale and Calder Hall, which are being decommissioned
•Around 240 of Sellafield's 1,400 buildings are nuclear facilities. All have to be decommissioned within 100 years at an estimated cost of £50bn
13. •Processing of waste includes remote operations on old fuel rods, which are stored in water for cooling
A Challenging Problem
•When the rods are removed from the water, a heating process starts
•If rods overheat a chain reaction could occur, releasing a huge amount of radioactive gas
•The interactions with the rods are temperature dependent and constrained by the heating process
•The rods and components of the rods can be partially cooled during the processing
•Treatment of key elements of the rods must be completed within a time window determined by the combined effects of heating and cooling
A Challenging Problem
16. Planning
•Planners need to combine discrete decisions, temporal and resource reasoning with awareness of continuous change
•All activities are time-dependent and time-critical
•Richer relaxations are required for search control
•Stronger inference is needed for pruning search and propagating consequences of decisions
•Modelling languages have to capture mixed discrete-continuous interactions
17. Related Work in Hybrid Planning
•Model-Directed Autonomous Systems, Nayak and Williams, AI Magazine 1998
•Sapa: A Multi-objective Metric Temporal Planner, Do, Kambhampati, JAIR 2003
•Integrated AI in Space: The Autonomous Sciencecraft on Earth Observing One, Chien, AAAI 2006
•Generative Planning for Hybrid Systems based on Flow Tubes, Li and Williams, ICAPS 2008
•UPMurphi: A Tool for Universal Planning on PDDL+ Problems, Della Penna, Magazzeni, Mercorio, Intrigila, ICAPS 2009
•Temporal Planning with Problems Requiring Concurrency through Action Graphs and Local Search, Gerevini, Saetti and Serina, ICAPS 2010
•A Planning-based Framework for Controlling Hybrid Systems, Lohr, Eyerich, Keller and Nebel, ICAPS 2012
•Planning with MIP for Supply Restoration in Power Distribution Systems, Thiébaux, Coffrin, Hijazi and Slaney, IJCAI 2013
18. Planning with Continuous Change
(:durative-action boilwater
:parameters (?w - water)
:duration (> ?duration 0)
:condition (and (over all (heating ?w))
(at end (= (temperature ?w) 100))
)
:effect (and (at start (heating ?w))
(at end (boiled ?w))
(increase (temperature ?w) (* #t 1))
(at end (not (heating ?w)))
)
)
?duration is a numeric parameter, whose value is chosen by the planner
The action has the continuous effect of increasing the temperature linearly with rate 1
d(temperature)/dt = 1
“PDDL2.1: An extension to PDDL for Expressing Temporal Planning Domains” Maria Fox and Derek Long, JAIR 2003 “COLIN: Planning with Continuous Linear Numeric Change”, Coles, Coles, Fox, Long, JAIR 2012
20. Temporal Reachability
Current state
Goal condition
“Planning with Problems Requiring Temporal Coordination”, Coles, Fox, Long, Smith, AAAI 2008
“COLIN: Planning with Continuous Linear Numeric Change”, Coles, Coles, Fox, Long, JAIR 2012
[Diagram: a durative action A is split into snap actions Astart and Aend. Astart carries the start conditions and start effects, Aend the end conditions and end effects; they are separated by ?duration, with the invariants required to hold in between]
In relaxation:
Ensure that Aend cannot be applied before Astart
Aend effects are separated from Astart by ?duration
Ignore conflicts with invariants
In state progression:
Prune states that violate invariants
21. Continuous Processes
•Physical processes, such as boiling water, can be modelled directly in PDDL+
(:process boiling
:parameters (?w - water)
:precondition (heating ?w)
:effect (increase (temperature ?w) (* #t 1))
)
(:event boiled
:parameters (?w - water)
:precondition (and (heating ?w)
(= (temperature ?w) 100))
:effect (and (not (heating ?w)) (boiled ?w))
)
“Modelling Mixed Discrete-Continuous Domains for Planning” Maria Fox and Derek Long, JAIR 2006
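The process/event semantics can be illustrated with a toy discretized simulation (my own sketch; PDDL+ semantics is genuinely continuous, and the names mirror the model above):

```python
# Toy discretized simulation of the boiling process and the boiled event.
# The process increases the temperature at rate 1 while (heating ?w) holds;
# the event fires as soon as the temperature reaches 100.
def run(temperature, dt=0.1):
    heating, boiled, t = True, False, 0.0
    while heating:
        temperature += 1.0 * dt      # process effect: d(temperature)/dt = 1
        t += dt
        if temperature >= 100.0:     # event precondition
            heating, boiled = False, True   # event effects
    return t, boiled

elapsed, boiled = run(7.0)           # starting at cold-tap temperature 7
```

Starting at 7 degrees, the simulated duration comes out at roughly 93 time units, matching the fixed ?duration of the boilwater durative action shown earlier.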
24. Concurrent Continuous Processes
•The cooling rate depends on the current temperature and the room temperature:
(:process cooling
:parameters (?w - water)
:precondition (> (temperature ?w) (roomtemp))
:effect (decrease (temperature ?w)
(* #t (- (temperature ?w) (roomtemp))))
)
•Since the cooling process is active whenever the water is hotter than the room, it runs concurrently with heating, and the rate of change of the water temperature is given by the sum of the process effects:
d(temperature)/dt = heatingrate – (temperature – roomtemp)
a nonlinear rate of change
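A minimal sketch of how the combined rate behaves, assuming illustrative values of my own for the heating rate and room temperature (Euler integration, not the planner's reasoning):

```python
# Euler integration of dT/dt = heatingrate - (T - roomtemp): the two
# concurrent process effects sum, and the temperature approaches the
# equilibrium roomtemp + heatingrate (here 30) along an exponential curve.
def simulate(temp, heatingrate=10.0, roomtemp=20.0, dt=0.01, t_end=5.0):
    t = 0.0
    while t < t_end:
        dTdt = heatingrate - (temp - roomtemp)   # sum of process effects
        temp += dTdt * dt
        t += dt
    return temp

print(simulate(20.0))   # close to, but below, the equilibrium 30.0
```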
25. More Complex Models
Suppose we want to model some physical process that the planner needs to interact with, such as an alarm system.
When the window is opened a circuit is made, leading to the capacitor charging. When the required voltage is reached, the alarm is set off.
26. Goal 1: awake
Plan: 0: (openwindow) ……
Goal 2: (and (deeplyasleep) (freshair))
Plan: 0: (openwindow) t: (closewindow)
Must be late enough to get the fresh air, and early enough to avoid the alarm going off
28. Cascading Events
The capacitor starts to store charge as soon as the circuit is made, continuing till the circuit voltage is reached
[Timeline diagram: openwindow changes windowclosed to windowopen and makes the magnet no longer operational; makecircuit establishes the circuit; the chargecapacitor process increases (charge) at rate (* #t (/ 1 (resistance))); when (>= (charge) circuitvoltage) holds, the voltageavailable event fires]
(:process chargecapacitor
:parameters ( )
:precondition (and (circuit) (not (voltage)))
:effect (increase (charge) (* #t (/ 1 (resistance))))
)
29. Cascading Events
(:event voltageavailable
:parameters ( )
:precondition (and (>= (charge) 5)
(not (voltage)))
:effect (and (voltage))
)
(:event alarmtriggered
:parameters ( )
:precondition (and (circuit)
(alarmdisabled)
(voltage) )
:effect (and (alarmenabled)
(not (alarmdisabled))
(ringing))
)
Circuit voltage = 5V
Resistance = 2Ω
As soon as the circuit voltage is reached, the event of voltageavailable is triggered, which in turn triggers the alarm
32. When should the Prince do the Kiss?
•To wake her up, the planner has only to exploit the fact that opening the window will cause a circuit resulting in the alarm going off.
•The kiss action can then be timed to occur when the capacitor has had time to charge to the full circuit voltage, and the alarm has had time to ring.
•The capacitor is fully charged when charge = 5.
•The time it takes for the charge to reach 5 (given that resistance = 2) is resistance × circuit voltage = 2 × 5 = 10. It will take an additional 0.001 time units to rouse the princess.
•The kiss must take place no earlier than 10.002 to guarantee that the princess is fully awoken.
33. A Linear Program Constructed Alongside the Developing Plan
[Diagram: the Planner passes a linear program, built from its plan choices, to an LP Solver; the LP solution determines the timing of the actions in the plan.]
34. A Linear Program Constructed Alongside the Developing Plan
Find the earliest time at which to do the kiss action, given resistance = 2, circuit voltage = 5 and reaction time = 0.001. The variables are the times of the corresponding plan steps:

minimise timeofkiss
subject to:
openwindow >= 0
makecircuit = openwindow
chargestart = makecircuit
chargeend - chargestart = 2 * charge
charge >= 5
chargeend = voltageavailable
triggeredalarm = voltageavailable
ringingstart = triggeredalarm
rouseprincess - ringingstart >= 0.001
timeofkiss >= rouseprincess + 0.001
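As an illustrative sketch (not the planner's own code), the LP above can be solved with scipy.optimize.linprog; the variable ordering and the row helper are my own scaffolding, while the constraints follow the slide:

```python
# Solve the slide's LP. Variables, in order: openwindow, makecircuit,
# chargestart, chargeend, charge, voltageavailable, triggeredalarm,
# ringingstart, rouseprincess, timeofkiss. All are non-negative by
# linprog's default bounds.
import numpy as np
from scipy.optimize import linprog

IDX = dict(openwindow=0, makecircuit=1, chargestart=2, chargeend=3,
           charge=4, voltageavailable=5, triggeredalarm=6,
           ringingstart=7, rouseprincess=8, timeofkiss=9)

def row(**coef):
    r = np.zeros(len(IDX))
    for name, v in coef.items():
        r[IDX[name]] = v
    return r

c = row(timeofkiss=1.0)                          # minimise timeofkiss

A_eq = np.array([
    row(makecircuit=1, openwindow=-1),           # makecircuit = openwindow
    row(chargestart=1, makecircuit=-1),          # chargestart = makecircuit
    row(chargeend=1, chargestart=-1, charge=-2), # chargeend - chargestart = 2*charge
    row(chargeend=1, voltageavailable=-1),       # chargeend = voltageavailable
    row(triggeredalarm=1, voltageavailable=-1),  # triggeredalarm = voltageavailable
    row(ringingstart=1, triggeredalarm=-1),      # ringingstart = triggeredalarm
])
b_eq = np.zeros(6)

# >= constraints rewritten in <= form:
A_ub = np.array([
    row(charge=-1),                              # charge >= 5
    row(ringingstart=1, rouseprincess=-1),       # rouseprincess - ringingstart >= 0.001
    row(rouseprincess=1, timeofkiss=-1),         # timeofkiss >= rouseprincess + 0.001
])
b_ub = np.array([-5.0, -0.001, -0.001])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
print(res.fun)  # earliest kiss time: 10.002
```

The optimum puts openwindow at 0, the charge at exactly 5, so the alarm rings at 10 and the kiss is scheduled at 10.002, agreeing with the hand calculation on the previous slide.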
35. [Timeline diagram: at time 0, openwindow replaces windowclosed with windowopen and makes the magnet non-operational; makecircuit establishes (circuit); the chargecapacitor process increases the charge at rate 1/(resistance) until it reaches 5, when voltageavailable fires; alarmtriggered starts the ringing; after 0.001 time units rouseprincess leaves the princess almostawake, and the kiss action then makes her awake.]
36. Avoiding Event Effects
•To give her fresh air without waking her up, the planner must choose the moment at which to close the opened window
•Let x be the control parameter: the amount of charge in the capacitor
•From the initial facts we have that x <= 5 and the resistance is 2
•The window is open for non-zero time, so x > 0
•The window must be closed while x ∈ (0, 5)
•The time it takes for the charge to reach x is 2x
•To avoid rousing her, the planner must close the window in the interval t ∈ (0, 10)
The interval is open on the left because x, and hence t, is strictly greater than zero. It is open on the right because we must avoid the voltageavailable event: closing the window breaks the circuit, which is mutex with the alarmtriggered event, so the close must come strictly before 10.
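The interval reasoning above can be sketched in a few lines (my own illustration, using the slide's numbers):

```python
# Derive the open interval of times in which the window may be closed.
# The charge x grows at rate 1/resistance, so reaching charge x takes
# t = resistance * x; x must stay strictly inside (0, max_charge).

def close_window_interval(max_charge=5.0, resistance=2.0):
    return (0.0, resistance * max_charge)  # open interval endpoints

lo, hi = close_window_interval()
print(lo, hi)  # 0.0 10.0 -- close strictly between these times
```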
37. Increasing Complexity
•Everything so far can be modelled in PDDL+
•All the state variables are Boolean or Numeric
•At least two generic planners exist that can solve PDDL+ problems:
UPMurphi: Della Penna, Magazzeni, Mercorio and Intrigila
POPF: latest version by Coles and Coles (ICAPS 2014)
•In more realistic domains there are structured types that encapsulate specialised behaviours
•Planning Modulo Theories is a planning framework designed for managing structured types in hybrid domains
38. 33kV network, load and supply
Supply Profile
Profiles modelled using Timed Initial Fluents:
(at 5 (= (load b1) 3.5))
(at 10 (= (load b2) 6))
(at 17 (= (supply g1) 20))…etc, added to the initial state
Planning Problem: to maintain voltages within bounds over a period of time (eg: 24 hours) given demand and supply at busbars in the network
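A small sketch (hypothetical representation, not the planner's internal one) of Timed Initial Fluents as (time, fluent, value) triples added to the initial state, with a lookup of a fluent's value at a query time:

```python
# Timed Initial Fluents from the slide, as (time, fluent, value) triples.
timed_initial_fluents = [
    (5,  ("load", "b1"), 3.5),
    (10, ("load", "b2"), 6.0),
    (17, ("supply", "g1"), 20.0),
]

def fluent_value(fluent, t, tifs, default=0.0):
    """Latest value assigned to `fluent` at or before time t."""
    value = default
    for time, f, v in sorted(tifs):
        if f == fluent and time <= t:
            value = v
    return value

print(fluent_value(("load", "b1"), 12, timed_initial_fluents))  # 3.5
```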
39. 33kV network, load and supply
[Plots: tap-ratio changes over the 24-hour horizon for selected taps (tap0, tap6, tap9, tap14, tap15, tap16), and the resulting voltage profiles at busbars 6, 7, 32 and 33 against the threshold lines, comparing the planner's schedule with a reactive controller.]
40. Temporal Voltage Control
•Requires solution of the AC power flow equations, involving real and reactive power, complex voltages and phase angles
•Local changes have global effects
•Requires an external solver to find network properties at time points
•The solver computes voltages at busbars in the context of the current settings
41. Planner choice: setTap
[Architecture diagram: the planner proposes settings (setTap choices); the AC power flow solver computes the consequences on voltages and checks whether all constraints are satisfied; the accept/reject verdict plus constraints feed back to prune and guide the search, producing a temporal plan.]
42. Planner choice: setTap
[Architecture diagram: as above, but the planner searches over an abstract network; the network and proposed settings are passed to the AC power flow solver, which evaluates how good the choice was and returns the network together with constraints.]
44. Processes over Networks
(:process rampingUp
 :parameters (?g - generator)
 :precondition (currentlyRampingUp ?g)
 :effect (increase (theNetwork) (* #t (rampUpRate ?g))))

(:process rampingDown
 :parameters (?g - generator)
 :precondition (currentlyRampingDown ?g)
 :effect (decrease (theNetwork) (* #t (rampDownRate ?g))))
[Plot: voltage at some busbar over time, with marked points at which G1 starts ramping up and G2 starts ramping down.]
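The effect of concurrent ramping processes on a shared network quantity can be sketched with Euler integration (event and generator names here are hypothetical, chosen to mirror the slide):

```python
# Each active generator contributes +rate ('up') or -rate ('down') to a
# shared network quantity while its ramping flag holds.

def simulate_ramping(events, rates, t_end, dt=0.01):
    """events: list of (time, generator, 'up' or 'down') triples."""
    value = 0.0
    state = {}                          # generator -> 'up' / 'down'
    pending = sorted(events)
    for step in range(int(round(t_end / dt))):
        t = step * dt
        while pending and pending[0][0] <= t:
            _, g, mode = pending.pop(0)
            state[g] = mode
        for g, mode in state.items():
            value += dt * rates[g] * (1.0 if mode == 'up' else -1.0)
    return value

# G1 ramps up from t = 0 at rate 2; G2 ramps down from t = 5 at rate 1.
v = simulate_ramping([(0, 'g1', 'up'), (5, 'g2', 'down')],
                     {'g1': 2.0, 'g2': 1.0}, t_end=10.0)
print(round(v, 1))  # 2.0*10 - 1.0*5 = 15.0
```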
45. Planning Modulo Theories
[Architecture: MDDL, a language for defining structured types and their functions as modules, and CDDL, a language for defining actions, processes and events using those structured types, both feed the Core Planner. We now have a range of types beyond Boolean and Numeric.]
46. Abstract Network Type
Abstraction: projection onto the proportional effects of tap changes on busbar voltages
[Lattice sketch: the domain is D = Rⁿ; each element is a tuple of real intervals ([x1], [x2], … [xn]) giving alternative real ranges for each busbar, with ⊤ as the top element. A tap change scales the intervals by real-valued proportions pi, which decay outwards from the tap, and each value in the ordering is obtained by combining the previous values with the generated new values.]
“Combining a Temporal Planner with an External Solver for the Power Balancing Problem in an Electricity Network”, Chiara Piacentini, Maria Fox and Derek Long, ICAPS 2013
48. Meet operation
•Load and Supply profiles imply voltage constraints at busbars
•A lattice meet operation can be applied to reduce the range of reachable voltages at individual busbars, ensuring that operational ranges are maintained
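On interval abstractions like the one above, the meet is just interval intersection; a minimal sketch (the values are illustrative, not from the talk):

```python
# Lattice meet on busbar voltage ranges as interval intersection,
# pruning reachable voltages against operational bounds.

def meet(a, b):
    """Intersect two closed intervals; None denotes bottom (empty)."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

reachable = (0.93, 1.12)      # reachable voltages at a busbar (p.u.)
operational = (0.95, 1.10)    # operational bounds implied by the profiles
print(meet(reachable, operational))  # (0.95, 1.1)
```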
50. Planning Modulo Theories
•Identify the structured types required in the domain
•Decide on appropriate abstractions for these types
•For each one, build the join operation and the meet operation
•Combine all of the domain lattices into a single heuristic function for the planning domain
•Evaluate the informativeness of the heuristic
•If not good enough, go back and revise the abstractions of the types