COPPER (COmmon Persistable Process Execution Runtime) is an open-source, high-performance workflow engine that persists workflow instance (process) state to a database. As a result, there is no limit on how long a process can run: weeks, months, or years. This strategy also makes the engine crash-safe.
A workflow can describe a business process, for example, but any kind of use case is supported. The "modelling" language is Java, which has several advantages:
* with COPPER, any Java developer can design workflows
* Java developers can keep working in a language they know
* many Java libraries can be integrated with COPPER
* many Java tools, such as IDEs, can be used
* COPPER increases your productivity when using a workflow engine
* building on Java protects your investment
* COPPER is open source under the Apache License 2.0
Please visit copper-engine.org for details.
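COPPER workflows themselves are written in Java; purely as a language-neutral sketch of the persist-the-state idea described above, here is a minimal Python example (all names are hypothetical, not COPPER's API) that checkpoints each workflow step to SQLite so a crashed run can resume where it left off.

```python
import json
import sqlite3

# Sketch of the persist-the-process-state idea behind COPPER
# (COPPER itself is Java; this illustrates the concept, not its API).
# Each workflow instance checkpoints its current step and local data
# to a database, so a restarted engine can resume where it left off.

def init_db(conn):
    conn.execute("""CREATE TABLE IF NOT EXISTS workflow_instance (
                        id TEXT PRIMARY KEY, step INTEGER, state TEXT)""")

def checkpoint(conn, wf_id, step, state):
    conn.execute("INSERT OR REPLACE INTO workflow_instance VALUES (?, ?, ?)",
                 (wf_id, step, json.dumps(state)))
    conn.commit()

def resume(conn, wf_id):
    row = conn.execute("SELECT step, state FROM workflow_instance WHERE id = ?",
                       (wf_id,)).fetchone()
    return (0, {}) if row is None else (row[0], json.loads(row[1]))

def run(conn, wf_id, steps):
    step, state = resume(conn, wf_id)       # pick up after a crash
    while step < len(steps):
        state = steps[step](state)          # execute one workflow step
        step += 1
        checkpoint(conn, wf_id, step, state)
    return state

conn = sqlite3.connect(":memory:")
init_db(conn)
result = run(conn, "order-42", [
    lambda s: {**s, "validated": True},
    lambda s: {**s, "shipped": True},
])
```

Because every step commits its state, a process that waits days between steps costs nothing while idle, which is what makes unbounded runtimes feasible.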
Applications have to be integrated – no matter which programming languages, databases, or infrastructures are used. However, realizing integration scenarios is a complex and time-consuming task. Over 10 years ago, Enterprise Integration Patterns (EIP) became the worldwide de facto standard for splitting huge, complex integration scenarios into smaller, recurring problems. These patterns appear in almost every integration project.
This session revisits EIPs and shows the status quo. After a short introduction with several examples, the audience will learn which EIPs still have a "right to exist" and which new EIPs have emerged in the meantime. The session closes with different frameworks and tools that already implement EIPs and thereby help the architect to significantly reduce effort.
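To give a flavour of what these patterns look like in code, here is a minimal sketch of one classic EIP, the Splitter, which breaks a composite message into individual messages that can be processed independently; the message shape is made up for illustration.

```python
# Minimal sketch of the Splitter pattern from Enterprise Integration
# Patterns: one composite message (an order) is split into a series of
# per-item messages. Field names are illustrative only.

def split_order(order):
    """Emit one message per line item, tagged with the parent order id."""
    return [{"orderId": order["id"], "item": item} for item in order["items"]]

messages = split_order({"id": 7, "items": ["book", "pen"]})
```

Downstream components can then process each item message in isolation, which is exactly the decoupling these patterns aim for.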
This presentation gives a detailed overview of Workday Integration and explains why and how to enroll in the course. Workday Integration helps shape the future of an organization by increasing efficiency, insights, and opportunities.
Simple cloud migration with OpenText Migrate (OpenText)
Migrate any server workload to any target destination with the OpenText Migrate cloud migration platform. Learn about common migration challenges and how to choose the right cloud migration tool.
DevOps Continuous Integration & Delivery - A Whitepaper by RapidValue
In this whitepaper, we deep dive into the concepts of continuous integration, continuous delivery, and continuous deployment, and explain how businesses can benefit from them. We also explain how to build an effective CI/CD pipeline and share some best practices for your enterprise DevOps journey.
Developing an API strategy should be considered a journey, not a project with a predetermined outcome. This presentation describes Netflix's journey to discover a winning API strategy as well as future directions for the API.
Migrating your .NET Applications to the AWS Serverless Platform (Amazon Web Services)
Windows and .NET-based workloads are first-class citizens on AWS. In this session, we show how you can easily move an existing .NET application to the AWS cloud and take advantage of its serverless capabilities. We cover migration and architectural considerations for porting your C# application to AWS Lambda, and for using API Gateway to create a façade for your application so you can safely make changes as you migrate.
Speakers:
Stephen Liedig, Public Sector Solutions Architect, Amazon Web Services
Shane Baldacchino, Solutions Architect, Amazon Web Services
Cost optimization is a frequently cited reason for adopting cloud computing. However, some organizations are finding that, during the migration process, replicating what is in their data centers doesn't yield ideal results and consumes significant resources. Cloud Technology Partners Principal Architect Kacy Clarke and Global Alliance Manager Stuart Robertson discuss the realistic results you should keep in mind during your initial TCO/ROI analysis and suggest best practices to realize those goals.
Showcase the strategies used in software upgrades with our professionally designed Deployment Strategies PowerPoint Presentation Slides. Discuss deployment approaches, along with assumptions and risks, using the application deployment PPT slideshow. The slides cover the rolling deployment pattern and describe its architecture, explain blue-green deployment strategies with examples, and show how to create blue-green deployments with a ready-to-use slide deck. They also explain how a canary deployment environment works, guide your audience through the canary deployment pattern, present the technique for testing a new version of the application, and compare the deployment strategies across different criteria. Captivate and inform your audience at the same time with our readily available PPT slideshow. https://bit.ly/3vWRPsv
Washington DC MuleSoft Meetup: CI/CD Pipeline with MuleSoft and Azure DevOps (Big Compass)
Do your clients want a fast, mess-free, organized delivery process? Learn how to set up a streamlined CI/CD pipeline that deploys your APIs through Runtime Manager to three different deployment targets using Azure DevOps. You'll see how to set up your MuleSoft APIs to deploy to CloudHub 1.0, CloudHub 2.0, and Runtime Fabric.
Main Takeaway/Learning Points
+ Get a glimpse of the components and customization capabilities offered in Azure DevOps
+ Build a CI/CD Pipeline in Azure DevOps
+ Utilize Azure DevOps to deploy MuleSoft APIs to CloudHub 1.0, CloudHub 2.0, and Runtime Fabric
Technical Overview of CDS View - SAP HANA Part II (Ashish Saxena)
It is important for developers to understand that, technically, CDS is an enhancement of SQL that provides a Data Definition Language (DDL) for defining semantically rich database tables/views (CDS entities) and user-defined types in the database. Unlike SAP HANA CDS, ABAP CDS is independent of the database system. The entities of models defined in ABAP CDS provide enhanced access functions compared with existing database tables and views defined in the ABAP Dictionary, making it possible to optimize Open SQL-based applications. Because of these advantages, ABAP CDS is the preferred methodology for the Code-to-Data paradigm.
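At their core, CDS entities are semantically rich SQL views. As a plain-SQL analogy (not actual ABAP CDS syntax), the sketch below defines a view that adds a derived column on top of a base table, much as a CDS view exposes computed fields over a dictionary table; the table and column names are invented.

```python
import sqlite3

# Plain-SQL analogy for a CDS entity: a view that enriches a base table
# with a derived column. This is an illustration of the idea, not ABAP
# CDS syntax; all names are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales_order (id INTEGER, net_amount REAL, tax_rate REAL);
INSERT INTO sales_order VALUES (1, 100.0, 0.19), (2, 200.0, 0.07);
-- The view exposes a computed gross_amount, analogous to a CDS view
-- adding semantics on top of a dictionary table.
CREATE VIEW sales_order_enriched AS
SELECT id, net_amount, net_amount * (1 + tax_rate) AS gross_amount
FROM sales_order;
""")
rows = conn.execute(
    "SELECT id, gross_amount FROM sales_order_enriched ORDER BY id"
).fetchall()
```

Pushing the computation into the view is the "Code to Data" idea in miniature: the calculation runs in the database, next to the data, rather than in the application layer.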
An Overview of Best Practices for Large Scale Migrations - AWS Transformation... (Amazon Web Services)
Whether you are moving a small application or entire datacenters, migrating to the cloud can be a complex process. In this session, we will share some of the common challenges that our customers face on their journey to the cloud and discuss how these challenges can be overcome. We will outline the patterns of success that we have observed from partnering with hundreds of customers on their large-scale migrations as well as highlight the mechanisms we have created to help our customers migrate faster.
About the Event:
AWS Transformation Day is designed for enterprise organizations migrating to the cloud to become more responsive, agile and innovative, while staying secure and compliant. Join us for this one-day event and we’ll share our experiences of helping enterprise customers accelerate the pace of migration and adoption of strategic services.
Who should attend?
This event is recommended for IT and business leaders who are looking to create sustainable benefits and a competitive advantage by using the AWS Cloud: CIOs, CTOs, CISOs, CDOs, CFOs, IT leaders and IT professionals, enterprise developers, business decision makers, and finance executives.
Using Software Architecture Principles in Practice (Eoin Woods)
Architects have to balance providing clear guidance for important decisions with the need to let people get on and build their aspects of the system without interference. In this talk Eoin Woods explores how architecture principles can help achieve this by making constraints and priorities clear without being unnecessarily prescriptive about how they are to be implemented.
Presented at O'Reilly Software Architecture Conference in London during October 2016.
ENT211: How to Assess Your Organization's Readiness to Migrate at Scale to AWS (Amazon Web Services)
Migrating to the cloud provides an opportunity to reinvent your organization's operations and the management of your IT landscape. In this session, we discuss how to evaluate your organizational readiness for the cloud and how to develop foundational capabilities before the migration. We also review key considerations developed by AWS Professional Services to help organizations prepare for a migration at scale through the Migration Readiness Assessment (MRA) and Migration Readiness and Planning (MRP) programs.
Enterprise Application Integration Technologies (Peter R. Egli)
Overview of Enterprise Application Integration Technologies.
Enterprise Application Integration, or EAI for short, aims at integrating different applications into an IT application landscape. Traditionally, EAI was understood as all applications using the same communication infrastructure, without service orientation in mind. This meant that the benefits of a shared infrastructure were limited, while costs were driven up by additional integration platforms.
Service Oriented Architectures (SOA) brought a new paradigm by decomposing applications into reusable and shareable services. Service orientation requires careful design of services. A hierarchic scheme of services may help to define a suitable service decomposition.
While SOA is technically based on big web service technologies, namely SOAP, WSDL and BPEL, WOA or Web Oriented Architecture stands for the lightweight service paradigm. WOA makes use of REST-based technologies like JSON and HTTP.
In many cases, an Enterprise Service Bus (ESB) is used as an infrastructure element to achieve the technical integration of the services. The ESB core functions like message routing, filtering and transformation provide the mediation services required to integrate heterogeneous application landscapes.
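The mediation functions mentioned above can be sketched in a few lines. The following example (names illustrative, not any specific ESB product's API) filters, transforms, and routes messages between channels.

```python
# Minimal sketch of ESB core mediation: message filtering,
# transformation between application formats, and routing to channels.
# Message shapes and channel names are made up for illustration.

def message_filter(msg):
    return msg.get("amount", 0) > 0          # drop invalid messages

def transform(msg):
    # translate the source application's format into the target's
    return {"orderId": msg["id"], "total": msg["amount"]}

def router(msg):
    # content-based routing on the transformed message
    return "priority" if msg["total"] >= 1000 else "standard"

def mediate(messages):
    channels = {"priority": [], "standard": []}
    for msg in messages:
        if message_filter(msg):
            out = transform(msg)
            channels[router(out)].append(out)
    return channels

channels = mediate([{"id": 1, "amount": 1500},
                    {"id": 2, "amount": 50},
                    {"id": 3, "amount": 0}])
```

An ESB packages exactly these three concerns as configurable infrastructure, so the applications on either side stay unaware of each other's formats.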
Speaker: David Guest
Host: Angel Alberici
VirtualMuleys: 63
https://meetups.mulesoft.com/events/details/mulesoft-online-group-english-presents-event-driven-architecture-with-mulesoft/
In this session, we will look at:
Event-driven (asynchronous) vs. synchronous
Event-Driven Infrastructure
Event-Driven Patterns
MuleSoft Implementation
This presentation shows all the possible options for moving an Oracle BI on-premises system to Oracle Analytics Cloud. We walk through all the steps to perform this migration, as well as the issues we have seen and how to troubleshoot them. In addition, we review the most common administration tasks.
[NEW LAUNCH!] Introducing Amazon Managed Streaming for Kafka (Amazon MSK) (AN..., Amazon Web Services)
Discover the power of running Apache Kafka on a fully managed AWS service. In this session, we describe how Amazon Managed Streaming for Kafka (Amazon MSK) runs Apache Kafka clusters for you, demo Amazon MSK and a migration, show you how to get started, and walk through other important details about the new service.
SAP Cloud Platform - Integration, Extensibility & Services (Andrew Harding)
SAP Cloud Platform enables businesses to extend their SAP solutions, create new applications, and integrate with other SAP solutions and external third parties (applications, businesses & government), with cloud services providing access to the latest technologies such as IoT, machine learning, and intelligent RPA.
There are any number of tricks and traps around getting the query optimizer to produce an optimal execution plan that returns your data quickly and efficiently. But at the end of the day, the principal driving factor for the optimizer, and therefore for your queries, is the statistics that describe your data. This session teaches you how those statistics are built and maintained by SQL Server. Different types of maintenance result in different levels of accuracy within statistics, so we detail what the structures and information look like after each kind of maintenance. Understanding how the optimizer works with statistics will help you see why you are getting the performance and execution plans that you are getting, and will enable you to write better T-SQL statements and deal with performance problems such as bad parameter sniffing.
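The session targets SQL Server, but the idea that the optimizer is driven by statistics over your data is universal. As a small self-contained illustration, SQLite keeps its (much simpler) statistics in `sqlite_stat1` after `ANALYZE`:

```python
import sqlite3

# Illustration of optimizer statistics using SQLite (far simpler than
# SQL Server's, but the same principle: the optimizer plans queries
# from summaries of the data, not the data itself).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("CREATE INDEX idx_x ON t(x)")
conn.executemany("INSERT INTO t VALUES (?)", [(i % 10,) for i in range(1000)])
conn.execute("ANALYZE")   # rebuild optimizer statistics
# each stat row looks like "<total rows> <avg rows per distinct key>"
stats = conn.execute("SELECT tbl, idx, stat FROM sqlite_stat1").fetchall()
```

Stale or missing statistics mislead the planner in any engine; in SQL Server the analogous information is visible via `DBCC SHOW_STATISTICS`.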
A great PowerPoint presentation covering DBMS concepts from start to end, with good examples chapter by chapter. Please go through the chapters sequentially.
Easy-going study material for a better understanding of Database Management System concepts.
This session will address the business aspects of BPM as well as the technical ones. How will BPM make your organization run more efficiently? Activiti improves the collaboration between business and IT. What is BPMN 2.0 and what can you do with it? The second part of this session is more concrete and includes some demonstrations: how to get your first process running, how we made BPMN 2.0 developer-friendly, and how to embed Activiti into your application.
My Node.js workshop from Sela's Developer Conference 2015.
In the workshop we covered the basic Node.js APIs and the Express web application framework.
FBTFTP: an open-source framework to build dynamic TFTP servers (Angelo Failla)
Talk given at EuroPython2016, Bilbao:
https://ep2016.europython.eu/conference/talks/fbtftp-facebooks-python3-framework-for-tftp-servers
TFTP was first standardized in ’81 (same year I was born!) and one of its primary uses is in the early stage of network booting. TFTP is very simple to implement, and one of the reasons it is still in use is that its small footprint allows engineers to fit the code into very low resource, single board computers, system-on-a-chip implementations and mainboard chipsets, in the case of modern hardware.
It is therefore a crucial protocol deployed in almost every data center environment. It is used, together with DHCP, to chain load Network Boot Programs (NBPs), like Grub2 and iPXE. They allow machines to bootstrap themselves and install operating systems off of the network, downloading kernels and initrds via HTTP and starting them up.
At Facebook, we had been using the standard in.tftpd daemon for years; however, we started to reach its limitations, which were partly due to our scale and the way TFTP was deployed in our infrastructure, but also to the protocol specification being based on requirements from the '80s.
To address those limitations we ended up writing our own framework for creating dynamic TFTP servers in Python3, and we decided to open source it.
I will take you through the framework and the features it offers, and discuss the specific problems that motivated us to create it. We will look at practical examples of how to use it, along with a little code, to build your own servers tailored to your infrastructure's needs.
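TFTP's small footprint comes from how little is in a packet. As a protocol illustration (RFC 1350, not fbtftp's actual API), here is a read request (RRQ) built and parsed with the standard library:

```python
import struct

# TFTP (RFC 1350) read request: 2-byte opcode (1 = RRQ), filename,
# NUL, transfer mode, NUL. This sketches the wire format only; it is
# not fbtftp's API.

OP_RRQ = 1

def build_rrq(filename, mode="octet"):
    return (struct.pack("!H", OP_RRQ)
            + filename.encode() + b"\x00"
            + mode.encode() + b"\x00")

def parse_rrq(packet):
    (opcode,) = struct.unpack("!H", packet[:2])
    if opcode != OP_RRQ:
        raise ValueError("not a read request")
    filename, mode, _ = packet[2:].split(b"\x00", 2)
    return filename.decode(), mode.decode()

# e.g. the request a PXE client sends for its network boot program
pkt = build_rrq("pxelinux.0")
```

That simplicity is why TFTP fits in boot ROMs and system-on-a-chip firmware, and also why the protocol carries so many 1980s-era limitations at scale.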
Presentation delivered by Matt Done, Head Of Platform Development at expanz Pty. Ltd. during DDD Sydney event on 2 July 2011.
Matt demonstrates what it takes to setup a highly sophisticated load test, using the Azure environment and how to use the results to optimise a fully blown application development platform and application server running on Azure.
Recording of this presentation can be found at www.youtube.com/expanzTV
Play Framework makes it easy to build web applications with Java & Scala. This presentation gives an idea of how Play is implemented using Netty, how routes work, and how calls reach a controller's actions, with a walk-through of Guice and logging.
Passenger 6 generic language support presentation (Hongli Lai)
YouTube video: https://www.youtube.com/watch?v=QyMQSYdctv0
Blog: https://blog.phusion.nl/2018/11/06/how-passenger-6-generic-language-support-is-implemented/
Introductory presentation to the Passenger 6 generic language support coding session.
Apache Beam (formerly the Google Cloud Dataflow SDK) is a unified model and set of language-specific SDKs for defining and executing data-processing workflows. You design pipelines that simplify the mechanics of large-scale batch and streaming data processing and that can run on a number of runtimes, such as Apache Flink, Apache Spark, and Google Cloud Dataflow (a cloud service).
This presentation introduces the Beam programming model and shows how you can use it to design your pipelines, transporting PCollections and applying PTransforms. You will see how the same code is "translated" to a target runtime by a specific runner. You will also get an overview of the current roadmap, including interesting new features.
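As a self-contained sketch of the programming model (not the real `apache_beam` API), the following toy classes show the idea of applying a chain of PTransforms to a PCollection; in real Beam, a runner then translates that chain for Flink, Spark, or Dataflow.

```python
# Toy model of Beam's core abstractions, for illustration only:
# a PCollection is a dataset, a PTransform is a step applied to it,
# and chaining transforms builds the pipeline. (Real Beam uses the
# `|` operator and defers execution to a runner.)

class PCollection:
    def __init__(self, elements):
        self.elements = list(elements)

    def apply(self, transform):            # analogous to Beam's `|`
        return PCollection(transform(self.elements))

def Map(fn):
    return lambda elements: [fn(e) for e in elements]

def Filter(pred):
    return lambda elements: [e for e in elements if pred(e)]

result = (PCollection([1, 2, 3, 4])
          .apply(Map(lambda x: x * 10))
          .apply(Filter(lambda x: x > 15)))
```

The key point the toy model captures is that the pipeline is just data plus a declarative chain of transforms, which is what lets the same code target different runtimes.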
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... (Ramesh Iyer)
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation takes real work: vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at every stage.
Transcript: Selling digital books in 2024: Insights from industry leaders - T... (BookNet Canada)
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Smart TV Buyer Insights Survey 2024 (91mobiles)
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
JMeter webinar - integration with InfluxDB and Grafana (RTTS)
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
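As an aside on how such an integration moves data: JMeter's backend listener ships results to InfluxDB using the InfluxDB line protocol. The sketch below builds line-protocol strings of that general shape; the measurement, tag, and field names are illustrative, not JMeter's exact schema.

```python
# Sketch of InfluxDB line protocol, the format used to push JMeter
# metrics into InfluxDB: measurement,tags fields timestamp.
# Names are illustrative; real line protocol also marks integer
# fields with a trailing "i" suffix, which is omitted here.

def to_line_protocol(measurement, tags, fields, timestamp_ns):
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {timestamp_ns}"

line = to_line_protocol("jmeter", {"transaction": "login"},
                        {"avg": 42.0, "count": 10}, 1700000000000000000)
```

Grafana then queries these points back out of InfluxDB to render the real-time dashboards shown in the webinar.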
Generating a custom Ruby SDK for your web service or Rails API using Smithy (g2nightmarescribd)
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Essentials of Automations: Optimizing FME Workflows with Parameters (Safe Software)
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
2. Short Profile
High performance, lightweight workflow engine for Java
Highlights:
Java is the workflow description language!
Open source under Apache License 2.0
Runs in any container, e.g. Spring, JEE, ...
Support for various RDBMS, currently:
Oracle
MySQL
PostgreSQL
Apache Derby
3. Why use Java for Workflow Design?
Source: www.bpm-guide.de/bpmn
4. Why use Java for Workflow Design?
Problems of graphical process modeling
Simple issues become simpler, complex issues become more complex
The business process gets obscured as execution details slip in
The development process gets cumbersome
Too opaque for users, too unwieldy for developers
5. Why use Java for Workflow Design?
Use the widely known Java language
Utilize the complete range of Java features
Use your favourite development environment
Use the mature Java tooling for
editing workflows
workflow compilation, debugging and profiling
teamwork support
Avoid team setup costs for additional languages, notations, tools and runtimes
Many skilled Java professionals are available
6. Core Workflow Engine Requirements
Readable and reasonable workflow description
Usually, workflows orchestrate multiple partner systems
Generally, the lifetime of a workflow is long
from seconds to hours, days, even months
Conclusion:
Workflow instances have to survive the Java process lifetime (persistence)
A workflow engine has to cope with an unlimited number of workflow instances at the same time
Performance optimization with regard to throughput and latency
7. Why plain Java is not enough
Straightforward workflow definition in pure Java:
public void execute(Process processData) {
    Contract contract = crmAdapter.getContractData(processData.getCustomerId());
    if (contract.isPrepay())
        sepAdapter.recharge(processData.getAmount());
    else
        postpayInvoice.subtract(processData.getAmount());
    smsAdapter.message(processData.getMSISDN(), "recharging successful");
}
This is simple to read, but:
Every workflow instance occupies one Java thread
limited number of parallel workflow instances
A running Java thread cannot be persisted
no long running workflows, no crash safety
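The second point can be observed directly: `java.lang.Thread` is not serializable, so a plain blocked thread can never be written to a database. A minimal, self-contained JDK snippet (all names invented here, not COPPER code) demonstrating this:

```java
import java.io.*;

// A running thread's state cannot be persisted with plain Java:
// java.lang.Thread is not Serializable, so serialization always fails.
public class ThreadPersistDemo {
    public static boolean tryPersist(Thread t) {
        try (ObjectOutputStream oos =
                new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(t);
            return true;                     // never reached
        } catch (NotSerializableException e) {
            return false;                    // threads cannot be serialized
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

This is exactly why a persistent workflow engine cannot simply park a thread; it has to capture the workflow's state in some serializable form instead.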
8. Try it asynchronously
One Thread occupied per Workflow instance?
Why not call the partner system asynchronously?
public void execute(Process processData) throws InterruptedException {
    ResponseReference r = new ResponseReference();
    Contract contract = null;
    synchronized (r) {
        crmAdapter.sendContractDataRequest(processData.getCustomerId(), r);
        r.wait();
        contract = r.getContractData();
    }
    …
}
But: r.wait() still blocks the thread...
9. Don't block the thread
So, we try to avoid Object.wait:
private String correlationId = null;
public void execute(Process processData) {
    if (correlationId == null) {
        correlationId = … // create a GUID
        crmAdapter.sendContractDataRequest(processData.getCustomerId(), correlationId);
        // somehow register this workflow instance to wait for correlationId
        // execute is called again when the response is available
        return;
    } else {
        Contract contract = crmAdapter.getResponse(correlationId);
        // continue to process the workflow
        …
    }
}
But: this approach hurts readability, especially with larger workflows
10. COPPER approach
Substitute Object.wait:
public void execute(Process processData) {
    String correlationId = getEngine().createUUID();
    crmAdapter.sendContractDataRequest(processData.getCustomerId(), correlationId);
    this.wait(WaitMode.ALL, 10000, correlationId);
    Contract contract = this.getAndRemoveResponse(correlationId);
    // continue to process the workflow
    …
}
Interrupt and resume anywhere within the workflow
Call stack is persisted and restored
Internally implemented via bytecode instrumentation
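COPPER captures the real Java call stack transparently via instrumentation. As a rough, hand-rolled illustration of the underlying idea (plain JDK, no COPPER, all names invented here), a workflow can record its position and data, be serialized at the wait point, and be restored later to resume where it left off:

```java
import java.io.*;

// Hand-rolled sketch: the workflow records its position and data, is
// serialized at the "wait", and resumes after deserialization. COPPER
// does this transparently for the real call stack via instrumentation.
public class ResumableWorkflow implements Serializable {
    private static final long serialVersionUID = 1L;
    int step = 0;              // where to resume
    String contract;           // data gathered so far

    /** Runs until it has to wait; returns true when finished. */
    public boolean execute(String response) {
        switch (step) {
            case 0:
                // send request to partner system, then "wait":
                step = 1;
                return false;          // instance is parked now
            case 1:
                contract = response;   // response correlated back to us
                step = 2;
                return true;           // workflow finished
            default:
                return true;
        }
    }

    /** Serialize instance state, as a persistent engine would. */
    public static byte[] persist(ResumableWorkflow wf) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(wf);
        }
        return bos.toByteArray();
    }

    /** Restore instance state, e.g. after a crash or restart. */
    public static ResumableWorkflow restore(byte[] bytes)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois =
                new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (ResumableWorkflow) ois.readObject();
        }
    }
}
```

The instrumentation spares the developer exactly this kind of explicit state machine, which is what makes the workflow code on this slide readable again.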
11. Some more features
Crash recovery
Change Management of Workflows
supports Versioning as well as Modification of workflows
hot workflow deployment
Management & Monitoring via JMX
Distributed Execution on multiple coupled engines enables
Load Balancing
Redundancy
High Availability (requires a highly available DBMS, e.g. Oracle RAC)
Fast and generic Audit Trail
13. COPPER Architecture explained
ProcessingEngine
The main entity in the COPPER architecture, responsible for the execution of workflow instances. Offers a Java API to launch workflow instances, notify waiting workflow instances, etc.
The engine supports transient or persistent workflows; this depends on the concrete configuration (both provided out-of-the-box).
An engine runs in a single JVM process. A JVM process may host several engines.
14. COPPER Architecture explained
Workflow Repository
Encapsulates the storage and handling of workflow definitions (i.e. their corresponding Java files) and makes the workflows accessible to one or more COPPER processing engines.
Reads workflow definitions from the file system
Observes the filesystem for modified files --> hot deployment
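The "observe the filesystem" part can be done with the JDK's own WatchService. A minimal sketch (not COPPER's actual implementation; class and method names invented here) of detecting changed workflow source files:

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.concurrent.TimeUnit;

// Minimal sketch of filesystem observation for hot deployment:
// watch a directory and report a workflow source file that was
// created or modified within the given timeout.
public class WorkflowDirWatcher {
    public static String pollForChange(Path dir, long timeoutMs)
            throws IOException, InterruptedException {
        try (WatchService ws = dir.getFileSystem().newWatchService()) {
            dir.register(ws, StandardWatchEventKinds.ENTRY_CREATE,
                             StandardWatchEventKinds.ENTRY_MODIFY);
            WatchKey key = ws.poll(timeoutMs, TimeUnit.MILLISECONDS);
            if (key == null) return null;               // nothing changed
            for (WatchEvent<?> ev : key.pollEvents()) {
                Path changed = (Path) ev.context();
                if (changed.toString().endsWith(".java")) {
                    return changed.toString();          // recompile & redeploy here
                }
            }
            key.reset();
            return null;
        }
    }
}
```

In a real repository this loop would run continuously and trigger recompilation and redeployment of the changed workflow class.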
15.-21. Execution Animation
[Diagram sequence across slides 15-21, showing the COPPER runtime with its Input Channel, Processor Pool, Queue, Workflow Repository (backed by the filesystem), Correlation Map, and a Remote Partner System:
A workflow instance (id = 4711, data = foo) is created via invoke() on the Input Channel; the runtime calls newInstance(), injects dependencies, and enqueues the instance.
A processor pool thread dequeues the instance and runs it.
At the wait point, the Java call stack is serialized and stored persistently, keyed by the correlation id.
The processor thread is now free to process other workflows.
When the remote partner system's response arrives, it is matched via the Correlation Map, the persisted Java call stack is retrieved, and the instance is enqueued again.
A processor thread dequeues the instance, resumes it right at the wait point, and continues processing until removeWorkflow().]
23. COPPER Architecture explained
Processor Pool
A named set of threads executing workflow instances
Configurable name and number of processing threads
Each processor pool owns a queue containing the workflow instances ready for execution, e.g. after initial enqueue or wake-up
a transient engine's queue resides in memory
a persistent engine's queue resides in the database
Supports changing the number of threads dynamically at runtime via JMX
COPPER supports multiple processor pools; a workflow instance may change its processor pool at any time
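A "named set of threads" with a runtime-adjustable size maps naturally onto the JDK's ThreadPoolExecutor. A sketch (plain JDK, invented names, not COPPER's internal classes) of what such a JMX resize operation would do underneath:

```java
import java.util.concurrent.*;

// Sketch of a named processor pool whose thread count can be
// changed at runtime, as COPPER allows via JMX.
public class ProcessorPool {
    private final String name;
    private final ThreadPoolExecutor executor;

    public ProcessorPool(String name, int threads) {
        this.name = name;
        // the unbounded queue holds workflow instances ready for execution
        this.executor = new ThreadPoolExecutor(threads, threads,
                0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>());
    }

    /** Change the number of threads at runtime, e.g. invoked from JMX. */
    public void resize(int threads) {
        // order matters: core size must never exceed maximum size
        if (threads > executor.getMaximumPoolSize()) {
            executor.setMaximumPoolSize(threads);
            executor.setCorePoolSize(threads);
        } else {
            executor.setCorePoolSize(threads);
            executor.setMaximumPoolSize(threads);
        }
    }

    public int size() { return executor.getCorePoolSize(); }
    public String getName() { return name; }
    public void enqueue(Runnable workflowStep) { executor.execute(workflowStep); }
    public void shutdown() { executor.shutdown(); }
}
```

A persistent engine would back the queue with the database instead of an in-memory LinkedBlockingQueue, as the slide notes.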
24. COPPER Architecture explained
COPPER runtime
Short running tasks pay for the cost induced by long running tasks because of thread pool saturation
[Diagram: a single processor pool whose queue mixes long running tasks (e.g. a complex database query) with short running tasks]
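This saturation effect, and why separate processor pools fix it, is easy to reproduce with plain JDK executors (illustrative code, not COPPER itself; names invented here):

```java
import java.util.concurrent.*;

// Demonstrates thread pool saturation: with one shared thread, a short
// task must wait for a long task; with its own pool it starts at once.
public class SaturationDemo {
    /** Submits a long task, then a short task, on the given pool(s);
     *  returns how long the short task waited before starting (ms). */
    public static long shortTaskDelay(ExecutorService longPool,
                                      ExecutorService shortPool)
            throws Exception {
        long submitted = System.nanoTime();
        longPool.submit(() -> sleep(300));             // long running task
        Future<Long> started = shortPool.submit(
                () -> (System.nanoTime() - submitted) / 1_000_000);
        return started.get();
    }

    static void sleep(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

Giving long running tasks their own, separately sized pool is exactly the motivation for COPPER's multiple processor pools.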
27. COPPER Architecture explained
Database Layer
Encapsulates the access to persistent workflow instances and queues
Decouples the core COPPER components from the database
Enables implementation of custom database layers, e.g. with application-specific optimizations or for unsupported DBMSs
Audit Trail
Simple and generic audit trail implementation
Logs data to the database for traceability and analysis
28. COPPER Architecture explained
Batcher
Enables simple use of database batching/bulking
Collects single database actions (mostly insert, update, delete) and bundles them into a single batch
Usually increases database throughput by a factor of 10 or more
Widely used by the COPPER database layer, but open for custom use
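The pattern itself is simple. A generic, database-free sketch (invented names, not COPPER's actual Batcher API) of collecting single actions and flushing them as bundles:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Generic batching: single actions are collected and flushed as one
// bundle once the batch is full, amortizing the per-round-trip cost.
// With JDBC the flush would use PreparedStatement.addBatch()/executeBatch().
public class Batcher<T> {
    private final int batchSize;
    private final Consumer<List<T>> flushAction;   // e.g. one DB round trip
    private final List<T> pending = new ArrayList<>();
    private int flushes = 0;

    public Batcher(int batchSize, Consumer<List<T>> flushAction) {
        this.batchSize = batchSize;
        this.flushAction = flushAction;
    }

    public synchronized void submit(T action) {
        pending.add(action);
        if (pending.size() >= batchSize) flush();
    }

    public synchronized void flush() {
        if (pending.isEmpty()) return;
        flushAction.accept(new ArrayList<>(pending));
        pending.clear();
        flushes++;
    }

    public synchronized int getFlushCount() { return flushes; }
}
```

The throughput gain comes from replacing many round trips with a few: 25 single inserts become three batched round trips with a batch size of 10.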