Many of us follow well-established practices for the development side of an integration project with tools like BizTalk. Yet even after years of doing integration, many projects still struggle with working out what actually needs to be done, which places a heavy burden on the development team: they are asked to deliver a project with poor information about the requirements. Analysis before development can be non-existent, or it can drag on for a long time and still fail to capture the right information.
How do we do an effective job of gathering just the right amount of information to make life easy for a developer to deliver a good solution that is fit for purpose?
In this session Mike will share some ideas on this part of a project, with the aim of encouraging some community activity to help people in this area.
Big Data Agile Analytics by Ken Collier - Director Agile Analytics, Thoughtworks
We are in the midst of an exciting time. There is an explosion of very interesting data, an emergence of powerful new technologies for harnessing it, and devices that enable humans to receive tremendous benefits from it. What is required are innovative processes that enable the creation and delivery of value from all of that data. More often than not, it is the predictive (what will happen?) and prescriptive (how to make it happen!) analytics that produce this value, not the raw data itself.
Agile software teams are continuously involved in projects that involve rich, complex, and messy data. Often this data represents innovative analytics opportunities. Being analytics-aware gives these teams the opportunity to collaborate with stakeholders to innovate by creating additional value from the data. This session is aimed at making Agile software teams more analytics-aware so that they will recognize these innovation opportunities.
The trouble with conventional analytics (like conventional software development) is that it involves long, phased, sequential steps that take too long and fail to deliver actionable results. This talk will examine the convergence of the following elements of an exciting emerging field called Agile Analytics:
•sophisticated analytics techniques, plus
•lean learning principles, plus
•agile delivery methods, plus
•so-called "big data" technologies
Learn:
•The analytical modeling process and techniques
•How analytical models are deployed using modern technologies
•The complexities of data discovery, harvesting, and preparation
•How to apply agile techniques to shorten the analytics development cycle
•How to apply lean learning principles to develop actionable and valuable analytics
•How to apply continuous delivery techniques to operationalize analytical models
Data Infrastructure for Your Retail Digital Strategy - Atif Shaikh
Retailers are facing disruptive times and are under pressure to digitize their businesses. While data remains at the center of all operations, it can also be leveraged as a core enabler of your strategy if your data infrastructure follows these tenets.
No doubt visualization of data is a key component of our industry. The path data travels, from the moment it is created until it takes shape in a chart, is sometimes obscure and overlooked, as it tends to live on the engineering side (when volume is relevant), an area data scientists tend to visit but not the typical web/marketing data analyst. Nowadays there are many options for taming that journey and making the best of it, and they don't require extensive engineering knowledge. Small or big data, let's see what "Store, Extract, Transform, Load, Visualize" is all about.
Building Better Models Faster Using Active Learning - CrowdFlower
Active learning is an increasingly popular technique for rapidly iterating the construction of machine learning models, exploiting the fact that the current state of the model can be used to predict which additional examples will be the most informative. Active learning is appealing for two main reasons: it optimizes ongoing human involvement in the model building process, and it helps overcome the negative effects of imbalanced training data. In this talk, Nick explains how active learning helps overcome common obstacles to building successful models, and also offers a peek at CrowdFlower AI, CrowdFlower's new active-learning-based offering.
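The mechanism this abstract describes, using the current model's uncertainty to decide which examples to label next, can be sketched in a few lines. Below is a minimal uncertainty-sampling loop built on scikit-learn; the oracle callback standing in for human annotators is a hypothetical placeholder, not part of CrowdFlower's actual API.

```python
# Minimal active-learning loop via uncertainty sampling (illustrative sketch).
import numpy as np
from sklearn.linear_model import LogisticRegression

def active_learning_loop(X_labeled, y_labeled, X_pool, oracle, rounds=10, batch=20):
    model = LogisticRegression(max_iter=1000)
    for _ in range(rounds):
        model.fit(X_labeled, y_labeled)
        # Examples whose top-class probability is lowest are the ones the
        # current model is least sure about, hence the most informative.
        uncertainty = 1.0 - model.predict_proba(X_pool).max(axis=1)
        ask = np.argsort(uncertainty)[-batch:]
        new_labels = oracle(X_pool[ask])            # human annotation step
        X_labeled = np.vstack([X_labeled, X_pool[ask]])
        y_labeled = np.concatenate([y_labeled, new_labels])
        X_pool = np.delete(X_pool, ask, axis=0)     # shrink the unlabeled pool
    return model
```

Because each batch is drawn from the model's weakest regions, rare classes tend to get labeled disproportionately often, which is how the approach counteracts imbalanced training data.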
Lean Analytics is a set of rules to make data science more streamlined and productive. It touches on many aspects of what a data scientist should be and how a data science project should be defined to be successful. During this presentation Richard will present where data science projects go wrong, how you should think of data science projects, what constitutes success in data science and how you can measure progress. This session will be loaded with terms, stories and descriptions of project successes and failures. If you're wondering whether you're getting value out of data science, how to get more value out of it and even whether you need it then this talk is for you!
What you will take away from this session
Learn how to make your data science projects successful
Evaluate how to track progress and report on the efficacy of data science solutions
Understand the role of engineering and data scientists
Understand your options for processes and software
Agile Analytics: Delivering on Promises by Atif Abdul Rahman - Agile ME
Big Data is all the hype in town, yet the real value still remains in delivering analytics that create business impact. Agile Analytics sets out to unleash the true promise usually lost in lengthy, elephantine projects and in data management purists' years-long pursuit of perfection. That is exactly what separates these big data technologies: they promise greater agility. But is a supportive technology enough, or even mandatory, to become more agile? We will go through the value chain of delivering high-impact analytics using agile practices and devise a jumpstarter kit for you to adopt and adapt.
Creating a DevOps Practice for Analytics - Strata Data, September 28, 2017 - Caserta
Over the past eight or nine years, applying DevOps practices to various areas of technology within business has grown in popularity and produced demonstrable results. These principles are particularly fruitful when applied to a data analytics environment. Bob Eilbacher explains how to implement a strong DevOps practice for data analysis, starting with the necessary cultural changes that must be made at the executive level and ending with an overview of potential DevOps toolchains. Bob also outlines why DevOps and disruption management go hand in hand.
Topics include:
- The benefits of a DevOps approach, with an emphasis on improving quality and efficiency of data analytics
- Why the push for a DevOps practice needs to come from the C-suite and how it can be integrated into all levels of business
- An overview of the best tools for developers, data analysts, and everyone in between, based on the business’s existing data ecosystem
- The challenges that come with transforming into an analytics-driven company and how to overcome them
- Practical use cases from Caserta clients
This presentation was originally given by Bob at the 2017 Strata Data Conference in New York City.
Customer experience for doing good business.
Building a profitable enterprise no longer depends on finding the right product and setting the right price. We live in the age of the customer, who is well informed through the capabilities of the internet. Customer Experience has become an important element in doing good business. Companies will have to reinvent themselves in order to fully understand and serve their customers, employees and partners. Time to get started on building amazing experiences, aided by digital tools.
How well does your solution work? And how do you know? In this session, Mike will show how developers of integration solutions can use Azure Application Insights to complement existing monitoring solutions, giving developers an additional level of insight into the way their solutions behave in the real world, and how this can be applied to the types of integration components we normally build.
Before your newly developed R algorithms can be used in a real-life production system, some additional challenges need to be tackled. In this presentation I will discuss the integration of R algorithms in the .NET back-end of the cash supply chain optimization solution of c-Quilibrium, one of our customers. Specific topics that will be addressed include how to set up the communication between R and .NET, parallelization of the R algorithms, encryption of the R code, and logging of the algorithm’s status and results.
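The talk's host application is .NET, but the communication pattern involved (spawn an R process, exchange structured data, capture status for logging) is language-agnostic. Here is a sketch of that pattern with a Python host calling Rscript over stdin/stdout; the forecast.R script and its JSON contract are hypothetical stand-ins.

```python
# Illustrative host-to-R integration: run an R script as a child process
# and exchange JSON on stdin/stdout.
import json
import logging
import subprocess

logging.basicConfig(level=logging.INFO)

def run_r_algorithm(payload: dict) -> dict:
    proc = subprocess.run(
        ["Rscript", "forecast.R"],        # hypothetical R entry point
        input=json.dumps(payload),        # request passed on stdin as JSON
        capture_output=True, text=True, check=True,
    )
    logging.info("R run completed, stderr: %s", proc.stderr.strip())
    return json.loads(proc.stdout)        # the script prints one JSON document

result = run_r_algorithm({"atm_id": 42, "history_days": 90})
```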
BizTalk Server can connect to SQL AlwaysOn databases in some scenarios. A highly available BizTalk Server 2016 environment can be built using SQL Server 2016 availability groups. I will answer the following questions:
- What are SQL AlwaysOn databases?
- How do you connect to SQL AlwaysOn databases using BizTalk Server?
- How do you build a BizTalk Server 2016 environment to use SQL 2016 availability groups?
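BizTalk is pointed at its databases through its own configuration tooling, but the central idea, connecting through the availability group listener rather than a specific SQL Server node, can be illustrated with any client. A sketch using Python's pyodbc follows; the listener name is a placeholder, while MultiSubnetFailover is the driver keyword that makes failover to another subnet fast.

```python
# Illustrative connection through an AlwaysOn availability group listener.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:aglistener,1433;"     # AG listener, not an individual node
    "Database=BizTalkMgmtDb;"         # BizTalk management database
    "Trusted_Connection=Yes;"
    "MultiSubnetFailover=Yes;"        # reconnect quickly after failover
)
print(conn.execute("SELECT @@SERVERNAME").fetchone())  # shows the active node
```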
In this session we will look at the Azure Service Bus and its capabilities to deliver low-cost, massive-scale messaging. We will also look at some demos of how to use the Service Bus and some real-world use cases. We will cover Service Bus Relay, Messaging and Event Hubs.
This session will be an intermediate session where we will look at the product features, common use cases and some samples.
This presentation was given to the Tech Change Technology for Monitoring and Evaluation Diploma course on 25th September 2015. It covers:
Why visualise data?
Where to start?
Which tools to use?
It ends with an overview of Kwantu's approach to this area and the technology choices that we've made.
Learn the critical components for successful data governance to support business analytics. We discuss the importance of data governance, warning signs that might suggest you need to improve it and how to implement it while staying nimble. View this on-demand webinar: https://senturus.com/resources/why-bother-with-data-governance/
Senturus offers a full spectrum of services in business intelligence and training on Power BI, Tableau and Cognos. Our resource library has hundreds of free live and recorded webinars, blog posts, demos and unbiased product reviews available on our website at: http://www.senturus.com/senturus-resources/.
Context-Driven Requirements: Effective Requirements Engineering for Your Organization - ADDQ
Product life cycles keep getting shorter, time to market must be faster, and quality is no longer seen as merely a hygiene factor but as a must. That is why it is important to understand the circumstances and adapt your requirements work so that you meet the needs in the most effective way. What are these circumstances, how do they affect your project, and how do you adapt the work to achieve the greatest possible business value?
Best Practices for a Successful SharePoint Migration or Upgrade to the Cloud - Perficient, Inc.
Whether you are a chief information officer or an IT executive, this SlideShare will provide you with the key details needed for a successful upgrade to SharePoint Online in Office 365. We will share:
Top reasons to move from on-premises to SharePoint Online
Challenges and technical considerations when migrating to the cloud
Options for migrating to Office 365/SharePoint Online
Best practices for secure cloud computing with SharePoint Online
Microsoft Teams is the fastest-growing product in Microsoft history, providing a powerful platform for collaboration and communication. However, because Teams was built on the backs of two leading workloads, SharePoint and Exchange, managing the security, compliance, and governance of Teams comes with some additional complexity. In this session, Christian walks through 10 essentials for effective Teams governance to help you 'know where to go' to meet your organizational requirements.
What You Need to Know Before Upgrading to SharePoint 2013 - Perficient, Inc.
Ready to join the SharePoint 2013 revolution but not sure what is involved? Are you in the middle of a migration that is behind schedule? This presentation walks you through general guidelines and common pitfalls to avoid so your transition to SharePoint 2013 will be successful.
Speaker Suzanne George discusses tips and tricks to ensure a successful SharePoint 2013 implementation and describes common mistakes that organizations make during the transition.
Whether you are in the middle of migrating to SharePoint 2013 or you are just thinking about implementation, this session will give you tools that will help you successfully deploy SharePoint within your organization.
Presenter Suzanne George, MCTS, is a Senior Technical Architect at Perficient. She has developed, administered, and architected website applications since 1995 and has worked with top-100 companies such as Netscape, AOL, Sun Microsystems, and Verio. Her experience includes custom applications and SharePoint integration with applications such as ESRI, Deltek accounting software, and SAP. Suzanne sits on the MSL IT Manager Advisory Council, was a contributing author for SharePoint 2010 Administrators, and presents at SharePoint Saturdays around the country.
Building SharePoint Enterprise Platforms - Off the beaten path - Andy Talbot
To point and click our way through a SharePoint installation is relatively easy, but what about all the other 'stuff' that we might not have considered? These slides are from Andy Talbot's MetaVis webinar for a detailed discussion on building SharePoint platforms fit for enterprise customers.
In this webinar, Andy talked about some of the common challenges that can take enterprises by surprise, factors that should have been planned for, and common failure points. Attendees should have benefited from this discussion regardless of whether they were starting out with their deployment or already in production.
Project Server - Who can benefit from it and how? - SPC Adriatics
Strict quantification of the value provided by a software solution is very difficult. That is especially true for software that is supposed to increase the value of the project management discipline, whose benefits are themselves difficult to quantify. Difficulty quantifying doesn't imply that such value doesn't exist; it only means that we need to add extra effort to recognize it.
The purpose of this presentation is to make that extra effort and, by examining different attributes like industry, character and size of projects, and maturity of the organization, recognize patterns that lead to value realization. The key Project Server capabilities will be discussed in the context of how they provide value in specific usage scenarios.
As always, realized value has to be contrasted with the required investment, so we will estimate the cost of an EPM implementation in both on-premises and cloud models.
The target audience for the presentation is all of you who believe that projects could and should be better managed in your organizations, and who have a vote in the decision on using software for such an improvement.
Igor Slišković
Advanced Project Data Analytics for Improved Project Delivery - Mark Constable
Data analytics is already beginning to impact how projects are delivered. We can now automate minute-taking and capturing actions, we can use Flow to chase progress, and Power BI reduces the burden of reporting.
But we are just scratching the surface. It won't be long before we can leverage the rich dataset of experience to predict which risks are likely to occur, understand which WBS elements will be susceptible to variance, deduce what the optimum resource profile looks like, and define a schedule by leveraging data from the projects that have gone before.
The role of a project professional is about to change dramatically. In this webinar we will explore the challenges and opportunities, and how we should respond. It’s a call-to-action for the community to mobilise, help to reshape project delivery and understand the implications for you and your organisation.
Presenter Martin Paver is a Chartered Project Professional, APM Fellow and Chartered Engineer. In December 2017 he established the London Project Data Analytics meetup, which has quickly spread across the UK and expanded to 3,000+ members. Martin has major project experience, including leading a billion-dollar project with a team of 220 and a multi-billion-dollar PMO with a team of 50. He has a detailed grasp of project management and combines this with a broad understanding of recent developments in the field of data science. He is on a mission to ensure that the project management profession readies itself for a transformed future.
Learning outcomes:
- Understand the implications of advanced data analytics on project delivery
- Understand the scope of which functions it is likely to impact
- Help you to develop a strategy for how you engage with it
- Understand how to leverage the benefits and opportunities that will emerge from it
Presenter:
Martin Paver, CEO & Founder, Projecting Success Ltd
SharePoint 2013 Migration - Your 5 Rules for Success - Christian Buckley
An overview of SharePoint 2013, and best practices for organizing and orchestrating your migration to the latest version of SharePoint -- whether on prem, in the cloud, or a hybrid. Includes a quick overview of PointBeyond's migration planning services.
Scaling on Atlassian: Avoiding The Top 5 Pitfalls When Migrating From a Legac... - Cprime
New emerging platforms and technologies like Atlassian have caused us to revisit the many different software vendors that provide short-term band-aid solutions to scalability challenges.
As organizations continue to heavily invest in software tooling, the need to standardize on an integrated platform is becoming ever more necessary. This provides an opportunity to reduce complexity, get to a reliable system that reduces duplication of effort, enables better decision-making, and provides more flexible ways of being more competitive. While there are hundreds of software vendors providing point solutions to problems in this ecosystem, Atlassian has come along and caused many to rethink software, services, processes, workflows, work items and more.
With the ultimate pursuit of moving faster in an integrated way, we will highlight our journey and uncover what we encountered as we migrated to Atlassian and left our legacy systems behind.
Learn from the experts at Netwoven on how to define your cloud strategy for SharePoint.
Key Takeaways:
- Develop your cloud migration strategy for SharePoint Online
- How to prepare for your migration
- Design your SharePoint Online Information Architecture
- Avoiding common errors while moving content and users to the cloud
- How to develop a successful change management plan
- What tools do you need for successful migrations? What are the trade-offs?
- The hard part – best practices for defining the migration logic for your organization
- Testing strategies for ensuring complete data migration
Similar to The Analysis Part of Integration Projects
Optimise Business Activity Tracking – Insights from Smurfit Kappa - BizTalk360
Watch the webinar: https://bit.ly/3iye9nb
Smurfit Kappa is one of the leading providers of paper-based packaging and displays in the world. They have operations in 30 countries. Their branch office in The Netherlands is one of the early adopters of Atomic Scope.
For this webinar, we had invited middleware specialist Rob van der Horst to explain how his company is using Atomic Scope. During the session, Rob showcased how Smurfit Kappa is using the product and how they streamlined their internal processes with the help of Atomic Scope.
Key takeaways:
1. Know how Atomic Scope is used in a real-world scenario
2. Understand how your organization can benefit from the product
3. Hear about the performance and stability improvement in v8.1
What's inside the "Migrating to BizTalk Server 2020" book (BizTalk360 Webinar) - BizTalk360
Watch the full webinar: https://bit.ly/3mMzbS7
Hear from the renowned BizTalk Server panel (Sandro Pereira, Tom Canter, and Lex Hegt) as they highlight the challenges and solutions involved in migrating from older BizTalk Server versions to BizTalk Server 2020.
They will also guide you through all of the migration phases—Evaluate, Plan, and Implement—and will show you how to execute your upgrade in a controlled and timely way.
Note: This webinar shed light on what the audience could expect from BizTalk360's upcoming book, "Migrating to BizTalk Server 2020".
Integration Monday - Logic Apps: Development Experiences - BizTalk360
How can I start developing Logic Apps? What are the different tools I can use? What are the advantages and drawbacks of each developer approach? What are the deployment options that I have? These are some of the questions that Sandro and Pedro have answered in this session, along with several tips that will aim to improve your Logic Apps development experience.
Integration Monday - BizTalk Migrator Deep Dive - BizTalk360
Join Dan Probert as he takes you on a deep dive into Microsoft's BizTalk Migrator tool. Learn what all the moving parts do, how to modify the configuration, how to improve performance, how to create your scenarios, how to debug the tool when it runs, and how to view the model built behind the scenes. If you have questions, then this is the place to be, as we'll also have a Q&A session towards the end. In fact, there is so much content, we probably won't cover everything in one session!
Testing for Logic App Solutions | Integration Monday - BizTalk360
In this Integration Monday session, Mike discussed the challenges and approaches for some of the common testing scenarios when delivering integration solutions with Microsoft Azure.
System Integration using Reactive Programming | Integration Monday - BizTalk360
In the current software ecosystem, applications are becoming more eventful, and traditional message-based integration concepts and technologies can no longer be trusted on their own for system integration. In this session, Sagar will walk you through design considerations for event-based integrations, with Azure Event Grid as the technology backend for these integrations.
Building workflow solutions with Microsoft Azure and Cloud | Integration Monday - BizTalk360
Most will agree that a business process can be a workflow. But what do people think of when running workflows in the cloud, in particular Azure and the Microsoft Cloud? Microsoft Azure offers several options to build them: no-code/low-code, plus a code option, with Power Automate, Logic Apps, and Durable Functions. In this session, we'll explore each and focus on building workflows with them. Furthermore, we'll see the differences and how each could potentially complement the others.
Serverless Minimalism: How to architect your apps to save 98% on your Azure b... - BizTalk360
Hear how Daniel Bass, Senior Developer at M&G plc, saved 98% on their Azure bill by using a serverless architecture instead of a PaaS architecture, and learn how you can do the same. Also, get to know how resource costs were surfaced to developers, enabling them to make informed decisions on what architecture to choose!
Learn how Terraform, as an IaC tool applied with a DevOps mindset, can help organizations build a very predictable and version-controlled target cloud infrastructure.
Get to know the two stateful programming models of Azure serverless compute, workflows and actors, how these models can simplify development, and how they enable stateful and long-running application patterns within Azure's compute environments.
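Of the two models, the workflow one is the quicker to sketch. Assuming the Python programming model of Azure Durable Functions, a minimal orchestrator looks like the following; the activity names are hypothetical. The framework replays the function to rebuild its state, which is what lets a long-running sequence survive process restarts.

```python
# Minimal Durable Functions orchestrator (workflow model), Python sketch.
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    order = context.get_input()
    # Each yield checkpoints the workflow's state before awaiting the activity.
    validated = yield context.call_activity("ValidateOrder", order)
    receipt = yield context.call_activity("ChargeCard", validated)
    return receipt

main = df.Orchestrator.create(orchestrator_function)
```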
Learn how to build a sophisticated and user-configurable Slack Bot which gives customized trade reports to financial analysts using Serverless technologies on Azure. Learn the patterns we used and the architectural decisions we made from an experienced Serverless Enterprise developer and author.
Kubernetes is running. You have your deployments and services set. Now, how do you migrate the data store? Let's journey together on this code-focused tour through ConfigMaps, Secrets, Persistent Volumes, Persistent Volume Claims, and StatefulSets. We'll craft and launch a strategy to care for your users' data in this new container world. You can power your business on Kubernetes: stateless or stateful.
The Power Platform in Office 365 (Power BI, Power Apps, Flow, Forms, SharePoint Online, ...) is probably the best ecosystem in the world for a complete digital transformation in your company, and maybe you are already paying for it without any usage.
We are living through a complete digital transformation where people are not restricted by apps, devices, or even location. Work can be done anywhere and on any device, which leads to greater security concerns regarding business data living on mobile devices and shared with external (sometimes untrusted) users. Microsoft Unified Labeling protection leverages the power of the cloud and ease of use (a few clicks for implementation) to provide a complete information protection solution. Now, with the new unified Azure label client, users can administer labels from one location while being integrated across the whole Microsoft platform. Attendees will learn how to configure unified labels with real-world scenarios.
Network security is back! Whether you are using Azure Kubernetes Service, IaaS virtual machines, App Services, or any other PaaS feature, securing your application or data is critical to the business. Azure security is constantly evolving, and how we did things even one year ago isn't necessarily the best way anymore. Learn about Azure network security and design patterns, learn what is new, and even see some things that are coming soon.
Key Trends Shaping the Future of Infrastructure.pdf - Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open source: exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Observability Concepts EVERY Developer Should Know - DeveloperWeek Europe.pdf - Paige Cruz
Monitoring and observability aren't traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company's observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to the purview of ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on.
Securing your Kubernetes cluster: a step-by-step guide to success! - KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... - Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes much work: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
The New Frontiers of AI in RPA with UiPath Autopilot™ - UiPath Community
In this free online event, organized by the Italian UiPath Community, you can explore the new features of Autopilot, the tool that integrates artificial intelligence into the development and use of automations.
📕 Together we will look at some examples of how Autopilot is used across different tools in the UiPath Suite:
Autopilot for Studio Web
Autopilot for Studio
Autopilot for Apps
Clipboard AI
GenAI applied to Document Understanding
👨🏫👨💻 Speakers:
Stefano Negro, UiPath MVPx3, RPA Tech Lead @ BSP Consultant
Flavio Martinelli, UiPath MVP 2023, Technical Account Manager @UiPath
Andrei Tasca, RPA Solutions Team Lead @NTT Data
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Essentials of Automations: Optimizing FME Workflows with Parameters - Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Elevating Tactical DDD Patterns Through Object Calisthenics - Dorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... - UiPath Community
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdf - Peter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
The Analysis Part of Integration Projects
1. Connected Systems Consulting Ltd
Sponsored & Brought to you by
The Analysis Part of Integration Projects
Michael Stephenson
https://twitter.com/michael_stephen
https://www.linkedin.com/in/michaelstephensonuk1
2. The Analysis Part of Integration Projects
Connected Systems Consulting Ltd
3. Connected Systems Consulting Ltd
Who am I?
Michael Stephenson
– UK-Based Freelance Consultant specializing in:
• BizTalk
• Windows Azure
• Integration
– Microsoft Integration MVP for 6+ years
– Pluralsight Author
– Azure Insider/Advisor
– One of the organizers of the UK Connected Systems User Group
– Founder of BizTalk Maturity Assessment – www.biztalkmaturity.com
– Worked on approx. 30 projects that have leveraged Azure
– First project went live about 4 years ago
– Contact Info:
• Blog: http://microsoftintegration.guru
• Twitter: @michael_Stephen
• Linked In: http://www.linkedin.com/in/michaelstephensonuk1
• Email: michael_stephensonuk@yahoo.co.uk
5. Question
“How often do you feel thrown in at the deep end when starting an integration development project because no one seems to have a clear understanding of what is required?”
+1 in the chat window if you think this is common
7. Common Scenario 1
“I need a feed of customer data from system A to system B”
Err… OK, so what does this mean?
8. Common Scenario 2
[Diagram: trading partners Contoso, Fabrikam and Acme connected by arrows to a landscape of internal systems: a CRM system, an ERP system, a Finance system, a Marketing system, two Products systems, three Orders systems and a Customers system.]
“It’s this interface here, can we just do this?”
“Those arrows are pretty, but we don’t know anything about what it means.”
9. Common Scenario 3
“I need an additional data attribute added to an existing integration process”
This may be straightforward and we could just get on with it… but let’s get a little info to be sure.
10. Common Scenario 4
“We need to change the existing new customer process so it also updates the new CRM system”
I hope we have all of the data we might need.
12. [A cloud of questions:]
• What systems are involved?
• What does the data look like?
• How much data do we need to process?
• Why do things happen?
• When do things happen?
• What is the process flow?
• Are there any alternative or exception scenarios?
• How do I talk to each application?
• Are there any SLAs?
• Is there any business logic?
• What are the data transformation rules?
• Will we need any new infrastructure?
• Are there any performance requirements?
• Are there any security requirements?
• Is there any documentation?
13. “There’s just so much unknown that affects how we design and build the system”
14. How Important is Analysis?
Integration is easy when…
• The solution is very like something you have done before
• You are reusing stuff you have built in the past
• The applications involved are well understood
• Only a small number of people are involved
• The data is not complicated
• The process logic is simple
• You have a good test environment setup
• Everyone understands the process to be built
• You have good testing
• Your dependencies are well understood
Integration is hard when…
• You need new infrastructure
• You have lots of dependencies or they are poorly understood
• Lots of external vendors or people are involved
• You need to use patterns you have not implemented before
• The applications involved are not well understood
• The applications involved are unreliable
• You have a poor test environment setup
• You don’t follow good ALM practices
• You don’t have enough information
• Your requirements are not well understood or clear
• You have a poor or incorrect solution design
• You’re making it up as you go along
15. Getting it wrong
Not enough analysis
• Developer gets things wrong
• Architect makes bad assumptions
• Poor documentation
• Testing problems
• Delays to redo and fix things
• Developer ends up chasing around to do analysis that wasn’t done
• Takes longer to deliver value
• Unhappy team
• Unhappy customer
Too much analysis
• Wasted time
• Too much documentation
• Lots of the information doesn’t add value
• Takes longer to deliver
• Costs too much
• Over-complicates things
• Difficult to deal with change
• Time to delivering value is high
• Unhappy customer
16. Amount of Analysis
[Chart: cost of project risk plotted against the amount of analysis, from low to high. The sweet spot is “just enough” analysis; we are usually somewhere either side of it.]
18. Methodologies – In Theory
[Diagram: Waterfall runs Analysis, Design and Develop as one long sequence; Agile repeats small Analysis/Design/Develop cycles.]
19. Methodologies – In Practice
[Diagram: Waterfall still runs Analysis, Design, Develop in sequence, while Agile in practice often becomes “Analysis & Development” followed by re-doing the analysis and development that was missed or got wrong. Time spent and value delivered drift apart, and what happened to architecture?]
20. Agile vs. Waterfall
Agile
• Just give it to the scrum team
• We are doing agile, so things are assumed to be simpler
• The team will figure it out
• Often teams don’t do analysis sprints
• Often this should be a sprint 0 type activity
• You usually have more stakeholders than just the product owner
• Encourages POCs of risk areas, which is good
Waterfall
• Lots of analysis and design before any real-world validation
• Analyst/Architect not close enough to delivery to handle change
• Analysis sometimes too detailed and overkill in terms of presentation format
• Analysis that doesn’t add value
“In the real world a methodology often doesn’t solve the problem, as teams aren’t getting the right amount of the right information”
22. Analysis is…
“Gathering the information we need to allow us to design, build and test the solution the customer needs, and being able to communicate that information to the entire team”
23. “Just enough analysis to do an effective job”
“Analyst and Architect working together”
25. Definition of Ready
Do we feel ready?
• Do we feel confident about what we need to do?
• Is the project broken down into stories or features?
• Do we need, or do we have, the right analysis artefacts?
Measuring Readiness – INVEST
• I – Independent (of others)
• N – Negotiable
• V – Valuable
• E – Estimable
• S – Small
• T – Testable
26. Analysis Iterations
• Iteration 1 – High Level
• Iteration 2 – More Detail
• Iteration 3 – Reduce Risk
[Diagram: Iteration 1 → Iteration 2 → Iteration n]
28. Step 1 – The User Story
Structure
In order to … (why is it important)
As a … (who wants it)
I want … (what do you need)
Example
In order to collect money from customers
As a sales director
I want the CRM system to process credit card payments on a monthly basis
“If you can’t explain the value statement, then why do we want to do it?”
29. Step 2 – The Context Diagram
[Diagram: Old CRM, New CRM, Orders Website and Orders System, linked by flows labelled “Lookup Customer”, “Customer Transfer” and “Process Orders”.]
31. Step 4 – High Level Process Logic
• Could you easily explain the process logic to someone?
32. Step 5 – High Level Data Items
• What entities?
• Any relationships?
• Any key data attributes?
• Where does data come from?
33. Step 6 – Applications Involved
• What are the supported interfaces?
• What protocols are supported?
• Where are the applications?
• Who manages them?
• Who are the key contacts?
34. Step 7 – Non-Functional Requirements
• Is there a general list of NFRs?
• Are there any specific NFRs raised by the business?
• Are there any IT NFRs?
• How should the system perform?
35. Step 8 – Stakeholder Map
Keep Satisfied: CRM Development Team, Infrastructure Team
Actively Engage: Product Owner, HR Department, Support Team
Monitor: Warehouse Dept
Keep Informed: ERP Development Team, Finance Dept
• Make sure we identify stakeholders and where they fit into the map, so we can talk to the right people at the right time
• Make sure owners of error conditions are identified, not just owners of processes
36. Step 9 – Which type of Integration?
• API & Services
• ESB / Messaging
• Batch Based
• Simple Orchestration
• Complex Orchestration
• Mixed
37. Step 10 – Candidate Architecture
• Is it something we already do?
• What patterns do we need?
• How might we build it?
• How confident are we about the design?
38. Step 11 – Create Integration Catalogue
• Agree a friendly Interface Name
• Identify Source and Destination Systems
• Identify Candidate Integration Type
• Batch
• API
• ESB/Messaging
• Complex Orchestration
• Identify Types of Data involved
• Identify grouping of interfaces
• Identify related Use Cases
• Relate non-functional requirements
• Relate candidate architecture (one possible catalogue entry is sketched below)
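To make the catalogue concrete, here is what a single entry might capture, sketched as a plain Python data structure. Every field value is illustrative, reusing system names that appear in the context diagram earlier in the deck.

```python
# One illustrative row of an integration catalogue; all values are examples.
catalogue_entry = {
    "interface_name": "Customer Transfer",       # agreed friendly name
    "source_system": "Old CRM",
    "destination_system": "New CRM",
    "integration_type": "ESB/Messaging",         # batch | API | ESB | orchestration
    "data_entities": ["Customer", "Address"],
    "interface_group": "CRM migration feeds",
    "related_use_cases": ["UC-012", "UC-014"],
    "non_functional_requirements": [
        "10k customers per nightly run",
        "PII encrypted in transit",
    ],
    "candidate_architecture": "queue-based messaging via service bus",
}
```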
39. Checklist
What have we done
• Created an Integration Catalogue for the project which pulls together:
• User Stories
• High Level Context
• Identified Use Cases
• Drawn Process Logic
• High Level Data Model
• Applications Involved
• Stakeholders
• Non-Functional Req’s
• Candidate architecture
How do we feel
• Are we confident about the process?
• Are we confident about the data?
• Are we confident about the candidate architecture?
• Does it feel simple or complex?
• Do we feel we have a good understanding?
42. “We have decided we need more information & detail before proceeding, what do we get next?”
What next?
43. Where do we look – Key Areas
1. Processing Logic
• What processing logic is required?
• What data transformation is required?
• Are there any patterns?
2. Application Connectivity
• How might we connect to the application?
• Is the application interface well defined and easy to work with?
3. The Data
• Does the data make sense?
• Does the data map between systems well?
44. Step 1 – Flush out features
• Find alternative scenarios
• Specification by example
• Find exceptions
• Confirm owner of exception scenarios
45. Step 2 – Define Data
• Get sample data
• Data Model
• Get data specifications if existing
• XSD/XML
• WSDL/Swagger
• JSON
• Flat File Structure
• HL7
• EDI
46. Step 3 – Define Data Mappings
• Encoding formats
• Reference data mappings (look-up values)
• Mr → AA
• Mrs → AB
• Miss → AC
• Number formatting
• Leading zeros
• Decimal places
• Text
• Length
• Alignment
• Padding
• Date formatting
Important: this can often be an area which throws up more integration use cases or maintenance challenges.
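These rules are mechanical but easy to get subtly wrong, so it can help to see them in one place. A small illustrative sketch in Python; the lookup table, field names and formats are hypothetical.

```python
# Illustrative field-level mapping: reference-data lookup, leading zeros,
# decimal places, fixed-width text and date reformatting.
from datetime import datetime

TITLE_CODES = {"Mr": "AA", "Mrs": "AB", "Miss": "AC"}    # reference data mapping

def map_customer(src: dict) -> dict:
    return {
        "title_code": TITLE_CODES[src["title"]],
        "account_no": src["account_no"].zfill(10),        # leading zeros
        "balance": f"{src['balance']:.2f}",               # two decimal places
        "name": src["name"][:30].ljust(30),               # length, alignment, padding
        "created": datetime.strptime(src["created"], "%d/%m/%Y").strftime("%Y-%m-%d"),
    }

print(map_customer({"title": "Mrs", "account_no": "1234", "balance": 12.5,
                    "name": "Contoso Ltd", "created": "01/06/2015"}))
```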
47. Step 4 – Application Integration Capabilities
• Do we have connectors?
• Are the connectors well understood?
• Are there challenges doing the connection?
• How will security work?
• Are there any throttling requirements?
48. Step 5 – Non-Functionals
• Revisit the NFRs and validate them with stakeholders
49. Step 6 – Patterns
• Are there any obvious integration patterns involved?
• Patterns can help create a common understanding
50. Step 7 – Design Decisions
• Are there any key design decisions to make?
• Who will be involved?
• How will the decision be made?
• How do we communicate the decision to everyone?
• How do we ensure the decision is followed?
51. Step 8 – Update Candidate Architecture
• Is our candidate still good?
• Are there any new things
52. Checklist
What have we done
• More detailed process definition using Specification by Example
• Defined data in more detail
• Defined mappings
• Reviewed NFRs
• Identified patterns
• Updated architecture
How do we feel
• Are we confident about the process?
• Are we confident about the data?
• Are we confident about the candidate architecture?
• Do we understand the complexity?
55. “How do we reduce the risk around the areas we have concerns with”
What next?
56. Step 1 – Logical Process POC
• What to do
• Implement lightweight stubs to prove the applications will work
• What does it achieve
• Allows focus on “will the applications work”
• Mitigates risk that the process logic will not work
57. Step 2 – Application Integration POC
• What to do
• Test we can connect to the application
• Create BDD-style SpecFlow tests of the application interface
• What does this achieve
• Tests that the application behaves the way we expect it to
• Tests that the data looks like what we expect it to
• Tests assumptions about error conditions
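SpecFlow executes Gherkin scenarios, so the interface tests the slide describes might read like the sketch below. The service, identifiers and fields are hypothetical; the point is that expected behaviour, data shape and error conditions each get an executable example.

```gherkin
# Illustrative Gherkin consumed by a SpecFlow (or behave/pytest-bdd) suite.
Feature: Customer lookup interface

  Scenario: Known customer is returned with the expected shape
    Given the CRM test environment is reachable
    When I request customer "C-1001" from the lookup service
    Then the response status is 200
    And the response contains the fields "CustomerId", "Name" and "Status"

  Scenario: Unknown customer surfaces the agreed error condition
    When I request customer "C-9999" from the lookup service
    Then the response status is 404
```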
58. Step 3 – Update Candidate Architecture
• Is our candidate still good?
• Are there any new things
59. Checklist
What have we done
• POC’d the key risk areas
• Updated the candidate architecture
How do we feel
• Do we have any concerns?
• Have we mitigated the risks?
61. Analysis Iterations
• Just enough analysis to set us up to be successful
• Use the iterations that are needed
• Make sure information is clear and can be understood by anyone in the team
[Diagram: Iteration 1 → Iteration 2 → Iteration n]
62. Takeaway
• Define your organisation’s definition of ready
• What analysis artefacts do you usually need?
• Share it?
• Who is up for some community guidance around this?
• TechNet wiki or something?
Editor's Notes
Purple means they are related to the analysis and design phase of a project
Sometimes we use a methodology which proposes ideas around how things should be done; these are misinterpreted or misused, and the result is not doing an effective job.
Regardless of methodology, we need a certain minimum level of understanding of what we need to do; otherwise we will be guessing, which is never a good thing.
Sometimes too much analysis can be detrimental though
We need to appreciate that all stakeholders in the project have a different perspective and will see things in a different way.
As an architect we need to be able to imagine ourselves in each of these positions to understand the needs of each different stakeholder.