The document discusses six governance processes for data and business intelligence: data lifecycle, data models, data quality, data security, data warehousing, and metadata. For each process, it provides an overview of why governance is important in that area, and what the governance process will do to manage issues and ensure requirements are met. The governance processes aim to balance various factors, control changes, and provide oversight and accountability for data management.
White Paper - The Business Case For Business Intelligence - David Walker
This white paper looks at the business case that should lie behind the decision to build a data warehouse and provide a business intelligence solution.
There are three primary drivers for making the investment in a business intelligence solution:
1. Measurement and management of the business process
2. Analysis of why things change in the business in order to react better in the future
3. Providing information for stakeholders
As a consequence of the investment there will also be a number of secondary benefits that will help to justify the investment and these are also discussed. Finally there are a number of ‘anti-drivers’ – reasons for not embarking on a business intelligence programme.
Data Works Berlin 2018 - Worldpay - PCI Compliance - David Walker
A presentation from the Data Works conference in 2018 that looks at how Worldpay, a major payments provider, deployed a secure Hadoop cluster to meet business requirements and in the process became one of the few fully PCI-certified clusters in the world.
Big Data Week 2016 - Worldpay - Deploying Secure Clusters - David Walker
A presentation from the Big Data Week conference in 2016 that looks at how Worldpay, a major payments provider, deployed a secure Hadoop cluster to meet business requirements.
White Paper - Data Warehouse Governance - David Walker
An organisation that is embarking on a data warehousing project is undertaking a long-term development and maintenance programme for a computer system. This system will be critical to the organisation and will cost a significant amount of money, so control of the system is vital. Governance defines the model the organisation will use to ensure optimal use and re-use of the data warehouse, enforcement of corporate policies (e.g. business design, technical design and application security) and, ultimately, value for money.
This paper identifies five sources of change to the system, and the aspects of the system that each will influence, to help the organisation develop standards and structures that support the development and maintenance of the solution. These standards and structures must then evolve as the programme develops to meet its changing needs.
“Documentation is not understanding, process is not discipline, formality is not skill”
The best governance is only an aid to development, not an end in itself. Data warehouses are successful because of the good understanding, discipline and skill of those involved. On the other hand, systems built to a template without understanding, discipline and skill will inevitably fail to meet the users’ needs and, sooner rather than later, will be left on the shelf or maintained at a very high cost but with little real use.
Data Works Summit Munich 2017 - Worldpay - Multi-Tenancy Clusters - David Walker
A presentation from the Data Works Summit conference in 2017 that looks at how Worldpay, a major payments provider, deployed a secure Hadoop cluster to support multiple business cases in a multi-tenancy cluster.
Wallchart - Data Warehouse Documentation Roadmap - David Walker
All projects need documentation and many companies provide templates as part of a methodology. This document describes the templates, tools and source documents used by Data Management & Warehousing. It serves two purposes:
• For projects using other methodologies, or creating their own set of documents, to use as a checklist. This allows the project to ensure that the documentation covers the essential areas for describing the data warehouse.
• To demonstrate our approach to our clients by describing the templates and deliverables that are produced.
Documentation, methodologies and templates are inherently both incomplete and flexible. Projects may wish to add, change, remove or ignore any part of any document. Some may also believe that aspects of one document would sit better in another. If this is the case then users of this document and these templates are encouraged to change them to fit their needs.
Data Management & Warehousing believes that the approach, or methodology, for building a data warehouse should be a series of guides and checklists. This lets small teams of relatively skilled resources cover all aspects of the project while remaining free to deal with the specific issues of their environment and deliver exceptional solutions, rather than relying on a rigid methodology designed so that large teams of relatively unskilled staff can meet a minimum standard.
How Real-Time Data Changes the Data Warehouse - Mark Madsen
Surveys show a growing demand for more up-to-date data in our BI environments. Meeting these needs requires moving from a strict reliance on nightly batch-style ETL to other methods. What is often ignored is how this affects the data warehouse. The shift introduces new technology and methods, which means the warehouse must support new types of workloads. This talk covers:
• Methods and tools for processing up-to-date data
• New requirements for your data warehouse database or platform
• What to look for as you address these requirements
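The move away from nightly batch ETL that the abstract describes can be illustrated with a minimal, hypothetical sketch (not taken from the presentation): a watermark-driven micro-batch load that runs frequently and moves only the rows changed since the previous run.

```python
from datetime import datetime

# Hypothetical source rows, each carrying a last-modified timestamp.
SOURCE = [
    {"id": 1, "amount": 100, "updated": datetime(2024, 1, 1, 0, 5)},
    {"id": 2, "amount": 250, "updated": datetime(2024, 1, 1, 9, 30)},
    {"id": 3, "amount": 75,  "updated": datetime(2024, 1, 1, 9, 45)},
]

def micro_batch(watermark: datetime):
    """Return the rows changed since `watermark` plus the new watermark.

    Unlike a nightly full extract, this can run every few minutes,
    moving only the delta into the warehouse.
    """
    delta = [r for r in SOURCE if r["updated"] > watermark]
    new_watermark = max((r["updated"] for r in delta), default=watermark)
    return delta, new_watermark

# First run picks up everything since midnight; the second run,
# using the advanced watermark, finds nothing new to load.
rows, wm = micro_batch(datetime(2024, 1, 1))
more, wm = micro_batch(wm)
```

The design choice worth noting is the watermark itself: the warehouse must now tolerate frequent small appends and late-arriving updates, which is exactly the new workload type the talk refers to.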
Next Generation BI: current state and changing product assumptions - Mark Madsen
A short talk on the current state of BI products, changing assumptions about use, and the new design points, with the goal of changing what you consider in an evaluation. I missed attributing the last slide on software features and user success (my variation) to the excellent Kathy Sierra.
The webcast with full audio and demo by Tableau is archived at http://www.tableausoftware.com/resources/webinars/next-generation-bi-impact
MDM Institute: Why is Reference data mission critical now? - Orchestra Networks
Learn why market-leading enterprises are focusing on RDM in this exclusive webinar from MDM research analyst Aaron Zornes
More than 55% of large enterprises surveyed by the MDM Institute plan to implement reference data management (RDM) in the next 18 months.
Why is RDM mission critical today?
How does RDM differ from (how is it similar to) MDM?
What are the top business drivers for RDM?
What are the “top 10” technical evaluation criteria?
Where are most organizations focusing their RDM efforts?
Aaron Zornes, Chief Research Officer of the MDM Institute, answers these questions and more as he reveals findings from the first ever RDM market study, based on a Q1 2014 survey of more than 75 Global 5000-size enterprises.
Govern and Protect Your End User Information - Denodo
Watch this Fast Data Strategy session with speakers Clinton Cohagan, Chief Enterprise Data Architect, Lawrence Livermore National Lab & Nageswar Cherukupalli, Vice President & Group Manager, Infosys here: https://buff.ly/2k8f8M5
In its recent report “Predictions 2018: A year of reckoning”, Forrester predicts that 80% of firms affected by GDPR will not comply with the regulation by May 2018. Of those noncompliant firms, 50% will intentionally not comply.
Compliance doesn’t have to be this difficult! What if you could facilitate compliance with a mature technology and a significant cost reduction? Data virtualization is a mature, cost-effective technology that enables privacy by design to facilitate compliance.
Attend this session to learn:
• How data virtualization provides a compliance foundation with data catalog, auditing, and data security.
• How you can enable a single enterprise-wide data access layer with guardrails.
• Why data virtualization is a must-have capability for compliance use cases.
• How Denodo’s customers have facilitated compliance.
DAMA, Oregon Chapter, 2012 presentation - an introduction to Data Vault modeling. I will cover parts of the methodology and compare and contrast issues in general for the EDW space, followed by a brief technical introduction to the Data Vault modeling method.
After the presentation I will provide a demonstration of the ETL loading layers, LIVE!
You can find more on-line training at: http://LearnDataVault.com/training
A retrospective on a Business Intelligence project delivered at EVAM using an Agile methodology and a Data Vault data model. Presented at the Swiss Data Forum on 24 November 2015 in Lausanne.
Data-Ed Online Presents: Data Warehouse Strategies - DATAVERSITY
Integrating data across systems has been a perpetual challenge. Unfortunately, current technology-focused solutions have not helped IT to improve its dismal project success statistics. Data warehouses, BI implementations, and general analytical efforts achieve the same levels of success as other IT projects: approximately one third are considered successes when measured against price, schedule, or functionality objectives. The first step is determining the appropriate analysis approach to the data system integration challenge. The second step is understanding the strengths and weaknesses of the various approaches. It turns out that proper analysis at this stage makes the actual technology selection far more accurate. Only once these are accomplished can proper matching between problem and capabilities be achieved as the third step, and true business value be delivered. This webinar will illustrate that good systems development more often than not depends on at least three data management disciplines to provide a solid foundation.
Takeaways:
Data system integration challenge analysis
Understanding of a range of data system-integration technologies, including the problem space (BI, Analytics, Big Data), data approaches (Warehousing, Vault, Cube) and alternatives (Virtualization, Linked Data, Portals, Meta-models)
Understanding foundational data warehousing & BI concepts based on the Data Management Body of Knowledge (DMBOK)
How to utilize data warehousing & BI in support of business strategy
Implementing Data Virtualization for Data Warehouses and Master Data Management - Denodo
The ongoing evolution of business requirements and the growth of data volumes continue to put added challenges on existing DW and MDM implementations, challenges that in many cases cannot be met. Data virtualization complements existing DW, MDM and other architectures and business initiatives, providing the agility and flexibility, at a lower cost, to enable virtual MDM, self-service BI, operational BI, rapid prototyping and real-time analytics.
More information and free registration for this webinar: http://goo.gl/asYztF
Landing page for the entire Packed Lunch webinar series: http://goo.gl/NATMHw
Attend & get unique insights into:
How Data Virtualization can provide a simple and low cost alternative to traditional DW and MDM solutions
How Data Virtualization can enhance and extend existing DW or MDM solutions to provide a more agile data integration architecture
Case studies that demonstrate how Data Virtualization has increased agility to meet complex information needs
Webinar: How Banks Manage Reference Data with MongoDB - MongoDB
Managing and distributing reference data globally has always been a challenge for financial institutions. Managing and maintaining database schemas while integrating and replicating that data across geographies is costly and time-consuming. MongoDB's native replication capabilities and partitioned architecture make it simple to distribute and synchronize data efficiently across the globe. MongoDB's dynamic schema dramatically reduces database maintenance for schema migrations: data structure changes can be applied with no downtime and with no impact on existing applications.
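The "dynamic schema" point in the abstract can be illustrated conceptually: documents in the same collection need not share a structure, so a new field can appear without an ALTER-TABLE-style migration. The sketch below is a plain-Python stand-in (no live MongoDB; `collection` and `find` are hypothetical illustrations, not pymongo code).

```python
# Conceptual stand-in for a MongoDB collection: documents of
# different shapes coexist, so adding a field needs no migration.
collection = [
    {"_id": 1, "code": "USD", "name": "US Dollar"},
    # A newer document carries an extra field; older ones are untouched.
    {"_id": 2, "code": "EUR", "name": "Euro", "decimal_places": 2},
]

def find(coll, **criteria):
    """Tiny query helper mimicking a find() over mixed-shape documents."""
    return [d for d in coll if all(d.get(k) == v for k, v in criteria.items())]

# Queries work across both shapes; documents missing a field simply
# do not match criteria on that field.
euros = find(collection, code="EUR")
with_decimals = [d for d in collection if "decimal_places" in d]
```

In a real deployment the same idea applies at the driver level: existing applications keep reading old-shape documents while new writers add fields, which is what makes zero-downtime structure changes possible.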
This is a slide deck assembled as a result of months of project work at a global multinational. Collaboration with some incredibly smart people resulted in content that I wish I had come across before having to assemble this.
Data Virtualization for Business Consumption (Australia) - Denodo
Watch full webinar here: https://bit.ly/3llCY4s
A successful data virtualization initiative bridges the gap between two very different perspectives of data management: IT and business. However, most of the emphasis in these initiatives is put on the IT side: modeling, performance, security, etc. Business users are often left with a large library of data sets that is hard to use and navigate.
Denodo’s data catalog has been designed to cover the needs of those users and simplify the use and understanding of the virtual layer from the business perspective. It provides the extra capabilities required for self-service initiatives to succeed, while avoiding many of the common pitfalls of other cataloging solutions.
Attend this session to learn:
- The role of the data catalog in a logical architecture
- How to incorporate the data catalog in the life of “citizen analysts”
- Best practices in documentation and metadata management
- Advanced usage of Denodo’s data catalog
Increasing Agility Through Data Virtualization - Denodo
During the Data Summit conference in New York, our CMO Ravi Shankar and BJ Fesq, Chief Data Officer at CIT Group, discussed the modernization of data architectures with data virtualization.
This presentation explores how data virtualization is being used to dramatically reduce data proliferation and ensure that all consumers are working with a single source of the truth. It also looks at how data virtualization can drive standardization, measure and improve data quality, abstract data consumers from data providers, expose data lineage, enable cross-company data integration, and serve as a common provisioning point from which to access all authoritative sources of data.
Oracle Application User Group-sponsored Collaborate 2009 presentation, 'Building a Practical Strategy for Managing Data Quality', by Alex Fiteni CPA, CMA.
Data Offload for the Chief Data Officer – how to move data onto Hadoop withou... - DataWorks Summit
“The CDO bears responsibility for the firm’s data and information strategy, governance, control, policy development, and exploitation of data assets to create business value.” (Source: Gartner)
In this session we will show how the CDO can manage and exploit all of a company’s data assets on Hadoop in a controlled manner, where the quality of the data is verified, security access is controlled, and all data activities are logged and recorded automatically in Atlas. We will demonstrate how to use Bluemetrix Data Manager (BDM) and show how easy it is to ingest, transform and control data on Hadoop while automatically deploying governance on Atlas.
Speaker
Liam English, CEO, Bluemetrix
Data Virtualization for Compliance – Creating a Controlled Data Environment - Denodo
CIT modernized its data architecture in response to intense regulatory scrutiny. In this presentation, CIT presents how data virtualization is being used to drive standardization, enable cross-company data integration, and serve as a common provisioning point from which to access all authoritative sources of data.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/CCqUeT.
The Shifting Landscape of Data Integration - DATAVERSITY
Enterprises and organizations of every industry and scale are working to leverage data to achieve their strategic objectives, whether those are to be more profitable, effective, risk-tolerant, prepared, sustainable, and/or adaptable in an ever-changing world. Data has exploded in volume during the last decade as humans and machines alike produce it at an exponential pace. Exciting technologies have also emerged around that data to improve our abilities and capabilities around what we can do with it.
Behind this data revolution, there are forces at work, causing enterprises to shift the way they leverage data and accelerate the demand for leverageable data. Organizations (and the climates in which they operate) are becoming more and more complex. They are also becoming increasingly digital and, thus, dependent on how data informs, transforms, and automates their operations and decisions. With increased digitization comes an increased need for both scale and agility at scale.
In this session, we have undertaken an ambitious goal of evaluating the current vendor landscape and assessing which platforms have made, or are in the process of making, the leap to this new generation of Data Management and integration capabilities.
Big Data Analytics 2017 - Worldpay - Empowering Payments - David Walker
A presentation from the Big Data Analytics conference in 2017 that looks at how Worldpay, a major payments provider, uses data science and big data analytics to influence successful card payments.
A discussion on how insurance companies could use telematics data, social media and open data sources to analyse and better price policies for their customers
Data Driven Insurance Underwriting (Dutch Language Version) - David Walker
A discussion on how insurance companies could use telematics data, social media and open data sources to analyse and better price policies for their customers
An introduction to data virtualization in business intelligence - David Walker
A brief description of what Data Virtualisation is and how it can be used to support business intelligence applications and development. Originally presented to the ETIS Conference in Riga, Latvia in October 2013
A presentation to the ETIS Business Intelligence & Data Warehousing Working Group in Brussels, 22 March 2013, discussing what SaaS and Cloud mean and how they will affect BI in telcos.
Business intelligence requirements are changing and business users are moving more and more from historical reporting into predictive analytics in an attempt to get both a better and deeper understanding of their data. Traditionally, building an analytical platform has required an expensive infrastructure and a considerable amount of time for setup and deployment. Here we look at a quick and simple alternative.
Using the right data model in a data mart - David Walker
A presentation describing how to choose the right data model design for your data mart. It discusses the pros and cons of different data models with different RDBMS technologies and tools.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Neuro-symbolic is not enough, we need neuro-*semantic* - Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this is illustrated with link prediction over knowledge graphs, but the argument is general.
Accelerate your Kubernetes clusters with Varnish Caching - Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
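The Helm-based deployment the talk describes can be sketched in a few commands. Note the repository URL, chart name, and values keys below are illustrative placeholders, not the actual Varnish chart's; check the chart's own documentation for the real ones.

```shell
# Add the Helm repository that hosts the Varnish chart
# (URL and chart name are placeholders -- substitute the real ones).
helm repo add varnish-charts https://example.com/helm-charts
helm repo update

# Install Varnish in front of an existing backend Service,
# overriding a couple of hypothetical chart values.
helm install edge-cache varnish-charts/varnish \
  --namespace caching --create-namespace \
  --set replicaCount=2 \
  --set backend.service=my-app.default.svc.cluster.local
```

A setup like this puts the Varnish pods between the cluster ingress and the backend Service, so cache hits are served without touching the application pods.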
DevOps and Testing slides at DASA ConnectKari Kakkonen
Slides presented by me and Rik Marselis at the DASA Connect conference on 30.5.2024. We discuss what testing is, what agile testing is, and finally what testing in DevOps looks like. We also ran a lovely workshop with the participants, exploring different ways to think about quality and testing in different parts of the DevOps infinity loop.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation introduction
UI automation sample
Desktop automation flow
Speakers:
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But if the “Reject” button is pushed, colleagues will be alerted via a Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation, however, takes much work: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at every stage.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology pushes into IT, I found myself wondering, as an “infrastructure container Kubernetes guy”, how this fancy AI technology gets managed from an infrastructure operations point of view. Is it possible to apply our beloved cloud-native principles here as well? What benefits could the two technologies bring to each other?
Let me take these questions and lead you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need in order to apply AI to our own infrastructure and make it work from an enterprise perspective. I want to give an overview of the infrastructure requirements and technologies that could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I have already got working for real.