Speakers: Paige Bartley, Senior Analyst, Data and Enterprise Intelligence, Ovum + Cameron Tovey, Head of Information Security, Confluent
For many organizations that want to adopt streaming data, strengthening their governance protocols is a key requirement. Weak control of data not only poses a challenge for compliance with data protection regulations and standards; it also limits the potential of data in broader enterprise initiatives that aim to maximize the value of information.
There’s a prevailing enterprise perception that compliance with data protection regulations and standards, such as General Data Protection Regulation (GDPR) in the EU, Payment Card Industry (PCI), International Standards Organization (ISO) and many others is a burden: limiting the leverage of data. However, the core requirement of compliance—better control of data—has multiple downstream benefits. When compliance objectives are aligned with existing business objectives, the business can experience net gain.
Learning objectives:
-Understand how data compliance can be a facilitator of existing business objectives rather than a burden
-Find out how to align existing business initiatives with compliance initiatives for maximum business benefit
-Learn about the place of streaming data and data-in-motion in the compliance effort
-Identify governance and tooling needs, existing controls and how they apply to new and emerging technology
-Discover your options for improving governance
(Bjørn Kvernstuen + Tommy Jocumsen, Norwegian Directorate for Work and Welfare) Kafka Summit SF 2018
NAV (Norwegian Work and Welfare Department) currently distributes more than one third of the national budget to citizens in Norway or abroad. We’re there to assist people through all phases of life within the domains of work, family, health, retirement and social security. Events happening throughout a person’s life determine which services we provide to them, how we provide them and when we provide them.
Today, each person has to apply for these services, resulting in many tasks that are largely handled manually by various case workers in the organization. Their access to insight and useful information is limited and often hard to find, causing frustration for both our case workers and our users. By streaming a person’s life events through our Kafka pipelines, we can revolutionize the way our users experience our services and the way we work.
NAV and the government as a whole have access to vast amounts of data about our citizens, reported by health institutions, employers, various government agencies or the users themselves. Some data is distributed in large batches, while other data is available on demand through APIs. We’re changing these patterns into streams using Kafka, the Kafka Streams API and Java microservices. We aim to distribute and act on events about birth, death, relationships, employment, income and business processes to vastly improve the user experience, provide real-time insight and reduce the need to apply for services we already know are needed.
This talk will touch on the following topics:
-How we move from data-on-demand to streams
-How streams of life events will free our case workers from mundane tasks
-How life and business events make valuable insight
-How we protect our users and comply with GDPR
-Why we chose Confluent Platform
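As a sketch of the idea described above, the routing from life events to proactive services might look like the following. This is an illustrative-only example: the event types, service names and routing rules are assumptions, not NAV's actual model.

```python
# Hypothetical sketch of event-driven service routing: each life event
# arriving on a stream triggers the services a person may be entitled to,
# without requiring an application. All names here are made up.

LIFE_EVENT_RULES = {
    "birth": ["parental_benefit_info", "child_benefit_enrollment"],
    "job_loss": ["unemployment_benefit_info"],
    "retirement_age_reached": ["pension_calculation"],
}

def route_life_event(event):
    """Return the proactive service actions a life event should trigger."""
    return LIFE_EVENT_RULES.get(event["type"], [])

# In a real deployment, this function would sit inside a Kafka Streams
# processor (or a plain Kafka consumer loop) reading a life-events topic.
actions = route_life_event({"type": "birth", "person_id": "123"})
```

The point of the sketch is the inversion: instead of a citizen applying and a case worker looking up data, the arriving event itself selects the services.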
Event-driven Business: How Leading Companies Are Adopting Streaming Strategies (Confluent)
With the evolution of data-driven strategies, event-based business models are influential in innovative organizations. These new business models are built around the availability of real-time information on customers, payments and supply chains. As businesses look to expand traditional revenues, sourcing events from enterprise applications, mobile apps, IoT devices and social media in real time becomes essential to staying ahead of the competition.
Join John Santaferraro, Research Director at leading IT analyst firm Enterprise Management Associates (EMA), and Lyndon Hedderly, Director of Customer Solutions at Confluent, to learn how business and technology leaders are adopting streaming strategies and how the world of streaming data implementations has changed for the better.
You will also learn how organizations are:
-Adopting streaming as a strategic decision
-Using streaming data for a competitive advantage
-Using real-time processing for their applications
-Overcoming evolving roadblocks to streaming data
-Creating business value with a streaming platform
Digital Transformation Mindset - More Than Just Technology (Confluent)
Many enterprises faced with siloed, batch-oriented legacy systems struggle to compete in this new digital-first world. Adhering to the ‘If it’s not broken, don’t fix it’ mentality leaves the door wide open for digital-native challengers to grow and succeed. To stay competitive, your organization must respond in real time to every customer experience, transaction, sale, and market movement. But how do you get there? First, you must change your mindset.
As streaming platforms become central to data strategies, companies both small and large are re-thinking their enterprise architecture with real-time context at the forefront. Monoliths are evolving into microservices. Datacenters are moving to the cloud. What was once a ‘batch’ mindset is quickly being replaced with stream processing as the demands of the business impose real-time requirements on technology leaders.
Join Argyle, in partnership with Confluent, in our 2018 CIO Virtual Event: The Digital Transformation Mindset – More Than Just Technology. During the webinar we’ll learn how leading companies across industries rely on a streaming platform to make event-driven architectures central to their business, including:
• How data strategies and IT initiatives are improving digital customer experiences
• How executives are reducing risk with real-time monitoring and anomaly detection
• How organizations are increasing operational agility with microservices and IoT architectures
Event-Based Business Architecture: Orchestrating Enterprise Communications (Confluent)
(Gary Samuelson, GarySamuelson) Kafka Summit SF 2018
A business-oriented view, illustrating both process models and in-flight task progress, is critical to understanding organizational health, efficiency and alignment to strategic goals. The intent of this talk is to illustrate the real-time relationship between Kafka-managed events (event driven) and business architecture via actionable models (real-time analytics).
Takeaways:
-Understand how business views technology in terms of capabilities aligned to strategy.
-Introduce process model and performance views into an event-oriented dashboard. This view illustrates the organization in terms of collaborating human and automated services.
-Illustrate how system architecture dovetails into business goals with aligned business/IT architectures.
Digital Shift in Insurance: How is the Industry Responding with the Influx of... (DataWorks Summit)
The digitally connected world is having an impact on the technology environments that insurers must create to thrive in the new era of computing. The nature of customer interactions and of business processes across product, risk and claims management is continuously changing. During this session we will review recent research and insights from insurance companies in the life, general and reinsurance markets, and discuss what this means for insurers as the industry weighs the implications for core systems, predictive and preventive analytics, and improvements to the customer experience.
Millions of dollars are being spent annually by the insurance industry on InsurTech investments, from risk listening and customer interactions (chatbots, SMS messaging, smart interactive conversations) to new methods of evaluating claims (digital capture at notice of incident, dashcams, connected homes/vehicles).
These are all new types of data which the industry hasn't previously had to manage and govern.
At the heart of this is how to create new business opportunities from data. We will also have an interactive conversation exploring the insurance implications of the new computing environment, from AI and big data to IoT (edge computing).
We introduce LEAD, a formal complex event processing (CEP) language that was presented at DEBS 2019. Using a running example, we explain how the pattern-matching job is generated, using an Aged Colored Petri Net as the logical execution plan. The implementation currently runs on Flink Streaming.
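To make the pattern-matching idea concrete, here is a toy sketch of the kind of query a CEP engine compiles and executes: find every event of one type followed by an event of another type within a time window. This is illustrative only; it reflects neither LEAD's formal semantics nor its Petri-net execution plan.

```python
# Toy "A followed by B within a window" detector over a timestamped
# event stream. Real CEP engines compile such patterns into incremental
# operators; this sketch simply keeps open partial matches in a list.

def match_followed_by(events, first_type, second_type, window):
    """Return (a, b) pairs where a second_type event follows a
    first_type event within `window` time units."""
    pending = []   # first_type events still waiting for a follower
    matches = []
    for ev in events:
        if ev["type"] == second_type:
            for a in pending:
                if 0 <= ev["ts"] - a["ts"] <= window:
                    matches.append((a, ev))
        if ev["type"] == first_type:
            pending.append(ev)
    return matches

stream = [{"type": "A", "ts": 1}, {"type": "B", "ts": 3}, {"type": "B", "ts": 50}]
match_followed_by(stream, "A", "B", 10)   # matches only the B at ts=3
```

A production engine would also expire entries from `pending` once the window has passed, so state does not grow without bound.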
GDPR: 20 Million Reasons to Get Ready - Part 2: Living Compliance (Cloudera, Inc.)
Though the majority of organisations will spend plenty of time preparing for GDPR, it’s crucial they consider actually living the regulation. May 2018 is not the end of the need for compliance; it is the beginning. With preparation laying the foundation for a data subject hub, organisations now need to focus on efficiency in fulfilling data subject access rights. In this session, we will go into what it means to live GDPR compliance, covering topics like self-service and what it takes to be secure by design.
Kai Wähner – Real World Use Cases for Realtime In-Memory Computing (NoSQL matters)
NoSQL is not just about different storage alternatives such as document stores, key-value stores, graphs or column-based databases. The hardware is also becoming much more important. Besides common disks and SSDs, enterprises are beginning to use in-memory storage more and more, because a distributed in-memory data grid provides very fast data access and updates. While performance will vary depending on multiple factors, it is not uncommon for a grid to be 100 times faster than a corresponding database implementation. For this reason, and others described in this session, in-memory computing is a great solution for lifting the burden of big data, reducing reliance on costly transactional systems, and building highly scalable, fault-tolerant applications.
The session begins with a short introduction to in-memory computing. Afterwards, different frameworks and product alternatives are discussed for implementing in-memory solutions. Finally, the main part of the session shows several real-world use cases where in-memory computing delivers business value by supercharging the infrastructure.
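The read-through behavior that makes an in-memory data grid fast can be sketched in a few lines. This toy cache (the names and structure are my own, not any particular grid product) serves repeated reads from memory instead of hitting the slow backing store each time:

```python
# Minimal read-through cache sketch: the first read of a key goes to the
# backing store (a database, in practice); subsequent reads are served
# from memory. Real data grids add partitioning, replication, eviction
# and write-through/write-behind policies on top of this core idea.

class ReadThroughCache:
    def __init__(self, backing_store):
        self._store = backing_store   # e.g. a function doing a DB lookup
        self._cache = {}
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self._cache:
            self.hits += 1
        else:
            self.misses += 1
            self._cache[key] = self._store(key)
        return self._cache[key]

db_lookup = lambda k: k.upper()       # stand-in for a slow database call
cache = ReadThroughCache(db_lookup)
cache.get("a")                        # miss: loaded from the "database"
cache.get("a")                        # hit: served from memory
```

The "100 times faster" claim in the abstract comes from exactly this asymmetry: a memory lookup versus a disk-backed, network-hop database query.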
Driving Digital Transformation through Big Data Analytics and Machine Learning (WSO2)
Learn how Verizon’s big data and AI platform is embracing digital transformation, and how it delivers actionable insights, predictions, and trends to its digital consumers.
Adapting to the exponential development of technology (DataWorks Summit)
Digitalization is impacting every industry in every economy, disrupting markets and upending competitive hierarchies. Across every business discipline, from operations and manufacturing to marketing and sales, companies are struggling to integrate new data, new analytics, and new technologies into their existing processes and practices. Companies that successfully adapt to these new and accelerating changes will outperform their competitors.
The new challenges require new ways of organizing, new social and technical architectures. In this session, we identify the key challenges businesses face in the era of massive digitalization, and the organizational and architectural steps that will enable savvy competitors to find business value in this time of tumultuous technical change.
The gap between first steps and effective digitalization is large. For instance, businesses can run scrum teams without being agile; can have a data science lab without supporting autonomous model deployment to consumer-facing processes and applications; and can stand up a Hadoop ecosystem without understanding how the components fit together or what to do with it.
Based on our deep experience with dozens of clients, we describe key differences in levels of technology maturity, and practical tactics leading companies are using to turn digital innovation into competitive advantage.
Speaker
Santiago Cabrera-Naranjo, Consulting Director, Teradata
Real-time Streaming Analytics: Business Value, Use Cases and Architectural Co... (Impetus Technologies)
Impetus webcast ‘Real-time Streaming Analytics: Business Value, Use Cases and Architectural Considerations’ available at http://bit.ly/1i6OrwR
The webinar covers:
• How business value is preserved and enhanced using Real-time Streaming Analytics, with numerous use cases in different industry verticals
• Technical considerations for IT leaders and implementation teams looking to integrate Real-time Streaming Analytics into the enterprise architecture roadmap
• Recommendations for making Real-time Streaming Analytics a reality in your enterprise
• Impetus StreamAnalytix, an enterprise-ready platform for Real-time Streaming Analytics
Who changed my data? Need for data governance and provenance in a streaming w... (DataWorks Summit)
Enterprises have dealt with data governance over the years, but it has mostly been focused on master data. With the advent of IoT, web and app streams everywhere in the ecosystem surrounding an enterprise, data-in-motion has become a strong force to reckon with. Data-in-motion passes through several levels of transformation and augmentation before it becomes data-at-rest. Throughout this journey, it is important to preserve the integrity of such data, or at least to track its provenance through the various changes. This is very important for verticals with strong regulatory and compliance laws around "who changed what."
This session will go into detail around some specific use cases of how data gets changed, how it can be tracked seamlessly and why this is important for certain verticals. This will be presented in two parts. The first part will cover the industry angle to this and its importance weighed in by several regulatory bodies. The second part will address the technology aspect of it and discuss how companies can leverage Apache Atlas and Ranger in conjunction with NiFi and Kafka to embrace data governance and provenance of their data streams.
Speakers
Dinesh Chandrasekhar, Director, Hortonworks
Paige Bartley, Senior Analyst - Data and Enterprise Intelligence, Ovum
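As a minimal sketch of the "who changed what" idea, each transformation in a pipeline can append a lineage entry to the record it changes, loosely analogous to NiFi's provenance events or to metadata carried alongside Kafka records. The field and actor names below are hypothetical:

```python
# Minimal provenance sketch: every transformation step records which
# actor ran it and which fields it changed, building an audit trail
# that travels with the record. Names are illustrative only.

def apply_step(record, transform, actor):
    """Apply a transformation and log who did it and what changed."""
    before = dict(record["data"])
    record["data"] = transform(record["data"])
    changed = [k for k in record["data"] if record["data"][k] != before.get(k)]
    record["provenance"].append({"actor": actor, "changed_fields": changed})
    return record

rec = {"data": {"name": "alice", "ssn": "123-45-6789"}, "provenance": []}
rec = apply_step(rec, lambda d: {**d, "ssn": "***"}, actor="masking-service")
# rec["provenance"] now records that the masking service changed "ssn"
```

In the tooling the abstract names, Apache Atlas would hold this lineage centrally and Apache Ranger would enforce who is allowed to perform each step; the sketch only shows the per-record trail itself.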
Data Virtualization for Data Architects (Australia) (Denodo)
Watch full webinar here: https://bit.ly/35sp2Q0
Success or failure in the digital age will be determined by how effectively organisations manage their data. The speed, diversity and volume of data present today can overwhelm older data architectures, leaving business leaders lacking the insight and operational agility needed to respond to market opportunity or competitive challenges.
With the pace of today’s business, modernisation of a data architecture must be seamless, and ideally, built on existing capabilities. This webinar explores how data virtualization can help provide a seamless evolution to the capabilities of an existing data architecture without business disruption.
You will discover:
- How to modernise your data architectures without disturbing the existing analytical workload
- How to extend your data architecture to more quickly exploit existing, and new sources of data
- How to enable your data architecture to serve more low-latency data
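The core data-virtualization idea, answering a question across live sources at query time instead of copying them into a warehouse first, can be sketched as follows. The two "sources" and their fields are made up for illustration:

```python
# Toy sketch of a virtual view: the query federates two live sources on
# demand, with no ETL copy in between. In a real platform the sources
# would be remote systems reached through connectors, not dicts/lists.

crm = {"c1": {"name": "Acme"}, "c2": {"name": "Globex"}}          # source 1
policies = [{"cust": "c1", "premium": 100},
            {"cust": "c1", "premium": 250},
            {"cust": "c2", "premium": 80}]                        # source 2

def customer_premiums(customer_id):
    """Join both sources at request time to answer one question."""
    name = crm[customer_id]["name"]
    total = sum(p["premium"] for p in policies if p["cust"] == customer_id)
    return {"name": name, "total_premium": total}

customer_premiums("c1")   # combines CRM and policy data on the fly
```

Because nothing is copied, the answer always reflects the sources' current state, which is where the low-latency claim in the bullet above comes from.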
Cloudera Data Impact Awards 2021 - Finalists (Cloudera, Inc.)
This annual program recognizes organizations that are moving swiftly towards the future and building innovative solutions by making what was impossible yesterday possible today.
The winning organizations' implementations demonstrate outstanding achievements in fulfilling their mission, technical advancement, and overall impact.
The 2021 Data Impact Awards recognize organizations' achievements with the Cloudera Data Platform in seven categories:
Data Lifecycle Connection
Data for Enterprise AI
Cloud Innovation
Security & Governance Leadership
People First
Data for Good
Industry Transformation
Accelerating big data with ioMemory and Cisco UCS and NoSQL (Sumeet Bansal)
When great companies work together, an even greater outcome is possible. I am presenting this at Oracle OpenWorld 2012 at the Cisco theatre. Could one possibly support a Twitter-like workload with just one server and a few ioDrives? It's all here.
A Successful Data Strategy for Insurers in Volatile Times (ASEAN) (Denodo)
Watch full webinar here: https://bit.ly/3rpr4La
Data is an insurer’s most valuable asset. Capitalizing on all of that stored and incoming data to draw valuable insights for business decisions is what ultimately makes a competitive difference.
But insurers face challenges when it comes to modernizing and digitizing their data architectures. Most organizations rely on traditional systems and data integration processes that are time-consuming and slow. In addition, as many adopt cloud strategies, these traditional approaches burden the cloud modernization process with downtime and end-user frustration.
This is why insurers need a flexible and easily adaptable data integration technology that allows them to keep up with the ever-changing and growing data environment.
Data virtualization is that modern data integration technology. It can support insurers not only on their journey to digitization, but also on their future infrastructure changes and innovations, adding agility, flexibility and efficiency to data architectures. Data virtualization can help insurance companies create 360° views of deals and claims processes as well as gather quick social media or sensor data for on-the-go risk profiling.
Join this on-demand webinar to:
- Find out why data virtualization should be a part of your enterprise data strategy
- See how this technology can help you capitalize on your data
- Hear how many of your peers are already leveraging the Denodo Platform for Data Virtualization and the benefits they’re observing
To mesh or mess up your data organisation - Jochem van Grondelle (Prosus/OLX ...)
Recently the concept of a ‘data mesh’ was introduced by Zhamak Dehghani to solve the architectural and organizational challenges of getting value from data at scale more logically and efficiently. It is built around four principles:
* Domain-oriented decentralized data ownership
* Data as a product
* Self-serve data infrastructure as a platform
* Federated computational governance
This presentation will initially deep-dive into the ‘data mesh’ and how it fundamentally differs from the typical data lake architectures used today. Subsequently, it describes OLX Europe’s current data platform state aimed partially towards a more decentralized data architecture, covering its analytical data platform, data infrastructure, data discovery, and data privacy.
Finally, it will examine to what extent the main principles of the ‘data mesh’ can be applied to a future vision for our data platform, and what advantages and challenges implementing such a vision can bring for OLX and other companies.
For more information on data mesh principles, check out the original article by Zhamak: https://martinfowler.com/articles/data-mesh-principles.html.
Design advantages of Hadoop ETL offload with the Intel processor-powered Dell... (Principled Technologies)
High-level Hadoop analysis requires custom solutions to deliver the data you need, and the amount of time that even senior engineers require to create ETL jobs in a DIY hardware-and-software environment can be substantial.
We found that the Dell | Cloudera | Syncsort solution was so easy to use that an entry-level employee could use it to create optimized ETL jobs after only a few days of training. And he could do it quickly—our technician, who had no previous experience using Hadoop, developed three optimized ETL jobs in 31 hours. That is less than half of the 68 hours our expert with years of Hadoop experience needed to create the same jobs using open source tools.
Using the Dell | Cloudera | Syncsort solution means that your organization can implement a Hadoop solution using employees already on your staff rather than trying to recruit expensive, difficult-to-find specialists. Not only that, but the projects can be completed in a fraction of the time. This makes the Dell | Cloudera | Syncsort solution a winning business proposition.
Data Warehouse vs. Live Datamart - Comparison and Differences (Kai Wähner)
Data Warehouses have existed for many years in almost every company. While they are still as good and relevant for the same use cases as they were 20 years ago, they cannot solve newly emerging challenges, and those sure to come, in an ever-changing digital world. The upcoming sections clarify when to still use a Data Warehouse and when to use a modern Live Datamart instead.
Migrating to the Cloud – Is Application Performance Monitoring still required? (eG Innovations)
As more businesses adopt cloud technologies for their various benefits, it must be noted that not all cloud offerings are the same; they provide different services and infrastructure SLAs. Did you know that not all SLAs from cloud providers guarantee your application similar availability?
Depending on whether you leverage SaaS, PaaS, IaaS, etc. to deliver your applications, you will have a different level of visibility into, and control over, how you manage performance and deliver the user experience your users expect.
Join the session to find out what remains within your responsibility, and how you can monitor the various cloud infrastructure and services to gain the visibility needed to deliver the expected user experience without over-provisioning.
The General Data Protection Regulation (GDPR) went into effect on May 25, 2018, and this has immediate implications for handling data in your big data, machine learning, and analytics environments. Traditional architectural approaches will need to be adjusted to be compliant with several of the provisions. The good news is that Cloudera can help you!
We introduce LEAD a formal CEP that was presented at DEBS 2019. Using a running example, we explain how the pattern matching job is generated using Aged Colored Petri Net as logical execution plan. The implementation is currently under Flink Streaming.
GDPR: 20 Million Reasons to Get Ready - Part 2: Living ComplianceCloudera, Inc.
Though the majority of organisations will spend plenty of time preparing for GDPR, it’s crucial they consider actually living the regulation. May 2018 is not the end of the need for compliance, it is the beginning. With preparation putting in the foundation for a data subject hub, organisations now need to focus on efficiency in fulfilling the data subject access rights. In this session, we will go into what it means to live GDPR compliance with topics like self service and what it needs to be secure be design.
Kai Wähner – Real World Use Cases for Realtime In-Memory Computing - NoSQL ma...NoSQLmatters
Kai Wähner – Real World Use Cases for Realtime In-Memory Computing
NoSQL is not just about different storage alternatives such as document store, key value store, graphs or column-based databases. The hardware is also getting much more important. Besides common disks and SSDs, enterprises begin to use in-memory storages more and more because a distributed in-memory data grid provides very fast data access and update. While its performance will vary depending on multiple factors, it is not uncommon to be 100 times faster than corresponding database implementations. For this reason and others described in this session, in-memory computing is a great solution for lifting the burden of big data, reducing reliance on costly transactional systems, and building highly scalable, fault-tolerant applications.The session begins with a short introduction to in-memory computing. Afterwards, different frameworks and product alternatives are discussed for implementing in-memory solutions. Finally, the main part of this session shows several different real world uses cases where in-memory computing delivers business value by supercharging the infrastructure.
Driving Digital Transformation through Big Data Analytics and Machine LearningWSO2
Learn how Verizon’s big data and AI platform is embracing digital transformation, and how it delivers actionable insights, predictions, and trends to its digital consumers.
Adapting to the exponential development of technologyDataWorks Summit
Digitalization is impacting every industry in every economy, disrupting markets and upending competitive hierarchies. Across every business discipline, from operations and manufacturing to marketing and sales, companies are struggling to integrate new data, new analytics, and new technologies into their existing processes and practices. Companies that successfully adapt to these new and accelerating changes will outperform their competitors
The new challenges require new ways of organizing, new social and technical architectures. In this session, we identify the key challenges businesses face in the era of massive digitalization, and the organizational and architectural steps that will enable savvy competitors to find business value in this time of tumultuous technical change.
The gap between first steps and effective digitalization is large. For instance, businesses can run scrum teams without being agile; can have a data science lab without supporting autonomous model deployment to consumer-facing processes and applications; and can stand up a Hadoop ecosystem without understanding how the components fit together or what to do with it.
Based on our deep experience with dozens of clients, we describe key differences in levels of technology maturity, and practical tactics leading companies are using to turn digital innovation into competitive advantage.
Speaker
Santiago Cabrera-Naranjo, Consulting Director, Teradata
Real-time Streaming Analytics: Business Value, Use Cases and Architectural Co... - Impetus Technologies
Impetus webcast ‘Real-time Streaming Analytics: Business Value, Use Cases and Architectural Considerations’ available at http://bit.ly/1i6OrwR
The webinar talks about:
• How business value is preserved and enhanced using Real-time Streaming Analytics, with numerous use cases in different industry verticals
• Technical considerations for IT leaders and implementation teams looking to integrate Real-time Streaming Analytics into their enterprise architecture roadmap
• Recommendations for making Real-time Streaming Analytics – real – in your enterprise
• Impetus StreamAnalytix – an enterprise ready platform for Real-time Streaming Analytics
Who changed my data? Need for data governance and provenance in a streaming w... - DataWorks Summit
Enterprises have dealt with data governance over the years, but it has mostly been about master data. With the advent of IoT, web and app streams everywhere in the ecosystem surrounding an enterprise, data-in-motion has become a strong force to reckon with. Data-in-motion passes through several levels of transformation and augmentation before it becomes data-at-rest. Throughout this journey, it is important to preserve the sanctity of such data, or at least track its provenance through the various changes. This is very important for many verticals where strong regulatory and compliance laws exist around "who changed what."
This session will go into detail around some specific use cases of how data gets changed, how it can be tracked seamlessly and why this is important for certain verticals. This will be presented in two parts. The first part will cover the industry angle to this and its importance weighed in by several regulatory bodies. The second part will address the technology aspect of it and discuss how companies can leverage Apache Atlas and Ranger in conjunction with NiFi and Kafka to embrace data governance and provenance of their data streams.
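The "who changed what" requirement described above can be sketched with a small hash-chained change log (all names here are hypothetical and for illustration only; this is not the Atlas/Ranger/NiFi API): each transformation appends a provenance entry whose hash covers the previous entry, so later tampering with history is detectable.

```python
import hashlib
import json

def _entry_hash(entry: dict, prev_hash: str) -> str:
    # Hash the entry together with the previous entry's hash to chain them.
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def record_change(chain: list, actor: str, action: str, value) -> list:
    # Append a provenance entry describing who changed what.
    prev_hash = chain[-1]["hash"] if chain else ""
    entry = {"actor": actor, "action": action, "value": value}
    chain.append({**entry, "hash": _entry_hash(entry, prev_hash)})
    return chain

def verify(chain: list) -> bool:
    # Recompute every hash; any edit to an earlier entry breaks the chain.
    prev_hash = ""
    for item in chain:
        entry = {k: v for k, v in item.items() if k != "hash"}
        if item["hash"] != _entry_hash(entry, prev_hash):
            return False
        prev_hash = item["hash"]
    return True

chain = []
record_change(chain, "ingest-svc", "created", {"amount": 100})
record_change(chain, "enrich-svc", "added-currency", {"amount": 100, "ccy": "USD"})
assert verify(chain)
chain[0]["value"] = {"amount": 999}   # tamper with history
assert not verify(chain)
```

In a real deployment the chain entries would be emitted as events alongside the data stream; the sketch only shows why chaining makes the audit trail tamper-evident.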
Speakers
Dinesh Chandrasekhar, Director, Hortonworks
Paige Bartley, Senior Analyst - Data and Enterprise Intelligence, Ovum
Data Virtualization for Data Architects (Australia) - Denodo
Watch full webinar here: https://bit.ly/35sp2Q0
Success or failure in the digital age will be determined by how effectively organisations manage their data. The speed, diversity and volume of data present today can overwhelm older data architectures, leaving business leaders lacking the insight and operational agility needed to respond to market opportunity or competitive challenges.
With the pace of today’s business, modernisation of a data architecture must be seamless, and ideally, built on existing capabilities. This webinar explores how data virtualization can help provide a seamless evolution to the capabilities of an existing data architecture without business disruption.
You will discover:
- How to modernise your data architectures without disturbing the existing analytical workload
- How to extend your data architecture to more quickly exploit existing and new sources of data
- How to enable your data architecture to deliver more low-latency data
Cloudera Data Impact Awards 2021 - Finalists - Cloudera, Inc.
This annual program recognizes organizations that are moving swiftly towards the future and building innovative solutions by making what was impossible yesterday possible today.
The winning organizations' implementations demonstrate outstanding achievements in fulfilling their mission, technical advancement, and overall impact.
The 2021 Data Impact Awards recognize organizations' achievements with the Cloudera Data Platform in seven categories:
Data Lifecycle Connection
Data for Enterprise AI
Cloud Innovation
Security & Governance Leadership
People First
Data for Good
Industry Transformation
Accelerating big data with ioMemory and Cisco UCS and NOSQL - Sumeet Bansal
When great companies work together, an even greater outcome is possible. I am presenting this at Oracle OpenWorld 2012 in the Cisco theatre. Could one possibly support a Twitter-like workload with just one server and a few ioDrives? It's all here.
A Successful Data Strategy for Insurers in Volatile Times (ASEAN) - Denodo
Watch full webinar here: https://bit.ly/3rpr4La
Data is an insurer’s most valuable asset. Capitalizing on all of that stored and incoming data to draw valuable insights for business decisions is what ultimately makes a competitive difference.
But, insurers face challenges when it comes to modernizing and digitizing their data architectures. Most organizations rely on traditional systems and data integration processes that are time consuming and slow. In addition, as many adopt cloud strategies, these traditional approaches fill the cloud modernization process with downtime and end user frustration.
This is why insurers need a flexible and easily adaptable data integration technology that allows them to keep up with the ever-changing and growing data environment.
Data virtualization is that modern data integration technology. It can support insurers not only on their journey to digitization, but also on their future infrastructure changes and innovations, adding agility, flexibility and efficiency to data architectures. Data virtualization can help insurance companies create 360° views of deals and claims processes as well as gather quick social media or sensor data for on-the-go risk profiling.
Join this on-demand webinar to:
- Find out why data virtualization should be a part of your enterprise data strategy
- See how this technology can help you capitalize on your data
- Hear how many of your peers are already leveraging the Denodo Platform for Data Virtualization and the benefits they’re observing
To mesh or mess up your data organisation - Jochem van Grondelle (Prosus/OLX ... - Jochem van Grondelle
Recently the concept of a ‘data mesh’ was introduced by Zhamak Deghani to solve architectural and organizational challenges with getting value from data at scale more logically and efficiently, built around four principles:
* Domain-oriented decentralized data ownership
* Data as a product
* Self-serve data infrastructure as a platform
* Federated computational governance
This presentation will initially deep-dive into the ‘data mesh’ and how it fundamentally differs from the typical data lake architectures used today. Subsequently, it describes OLX Europe’s current data platform state aimed partially towards a more decentralized data architecture, covering its analytical data platform, data infrastructure, data discovery, and data privacy.
Finally, it will see to what extent the main principles around the ‘data mesh’ can be applied to a future vision for our data platform and what advantages and challenges implementing such a vision can bring for OLX and other companies.
For more information on data mesh principles, check out the original article by Zhamak: https://martinfowler.com/articles/data-mesh-principles.html.
Design advantages of Hadoop ETL offload with the Intel processor-powered Dell... - Principled Technologies
High-level Hadoop analysis requires custom solutions to deliver the data that you need, and the amount of time that even senior engineers require to create ETL jobs in a DIY hardware and software situation can be substantial.
We found that the Dell | Cloudera | Syncsort solution was so easy to use that an entry-level employee could use it to create optimized ETL jobs after only a few days of training. And he could do it quickly—our technician, who had no previous experience using Hadoop, developed three optimized ETL jobs in 31 hours. That is less than half of the 68 hours our expert with years of Hadoop experience needed to create the same jobs using open source tools.
Using the Dell | Cloudera | Syncsort solution means that your organization can implement a Hadoop solution using employees already on your staff rather than trying to recruit expensive, difficult-to-find specialists. Not only that, but the projects can be completed in a fraction of the time. This makes the Dell | Cloudera | Syncsort solution a winning business proposition.
Data Warehouse vs. Live Datamart - Comparison and Differences - Kai Wähner
Data Warehouses have existed for many years in almost every company. While they are still as good and relevant for the same use cases as they were 20 years ago, they cannot solve the new challenges that already exist, and those sure to come, in an ever-changing digital world. The upcoming sections will clarify when to still use a Data Warehouse and when to use a modern Live Datamart instead.
Migrating to the Cloud – Is Application Performance Monitoring still required? - eG Innovations
As more businesses adopt cloud technologies for their various benefits, it must be noted that not all cloud offerings are the same: they provide different services and infrastructure SLAs. Did you know that not every SLA from a cloud provider ensures similar availability for your application?
Depending on whether you are leveraging SaaS, PaaS, IaaS, etc. to deliver your applications, you will have a different level of visibility and control over how you manage performance and deliver the user experience expected by your users.
Join the session and find out what remains within your responsibility and how you can monitor the various cloud infrastructure and services to give yourself the visibility needed to deliver the expected user experience without over-provisioning.
The General Data Protection Regulation (GDPR) went into effect on May 25, 2018, and this has immediate implications for handling data in your big data, machine learning, and analytics environments. Traditional architectural approaches will need to be adjusted to be compliant with several of the provisions. The good news is that Cloudera can help you!
In this webinar, GDPR expert Richard Hogg answers the following questions:
What will the GDPR mean for my organization?
Where do I start on the journey to compliance?
What tools and technology are available to help?
Attendees: Operations, Finance, Compliance, Governance, IT
https://www.integro.com/recorded-webinar/nov-17-2016-gdpr
Symantec Webinar Part 2 of 6 GDPR Compliance - Symantec
Symantec is offering an opportunity to hear first-hand the challenges businesses face when adopting the cloud and adhering to compliance regulations.
To watch the webinar on demand click here: https://symc.ly/2Ivwblu.
Impact of GDPR on Third Party and M&A Security - EQS Group
GDPR impact has been dissected and examined to death. However, M&A activities, as well as third-party security posture, can be greatly affected as well, and this aspect has not been pursued very often. This session hopes to be useful for that.
Master Data in the Cloud: 5 Security Fundamentals - Sarah Fane
Your master data is essential to the smooth operation of your business. But it is also valuable to others. Master data is vulnerable to both internal and external attacks. As the future of business and data is increasingly cloud-based, we explore five fundamentals to ensure the security of your data.
General Data Protection Regulation (GDPR) Implications for Canadian Firms - accenture
The General Data Protection Regulation (GDPR) represents significant challenges for financial institutions to comply with the new data processing and record keeping requirements. This Accenture Finance & Risk presentation explores the impact of GDPR on Canadian firms, including lessons learned from our work with clients and knowledge gained that can be used for an effective GDPR journey.
GDPR: it's big, but it's not impossible.
With GDPR looming on the horizon, it’s understandable organisations might be worried. Few companies have stood up and declared compliance yet. Most are heads down identifying personal data and implementing plans for compliance. There are hurried glances at the clock as the time to 25 May ticks away; the reality of daunting fines comes ever closer.
Digital Enterprise Festival Birmingham 13/04/17 - Ian West Cognizant VP Data ... - CIO Edge
Learn what the EU General Data Protection Regulation means for your business. Carrot or stick, it's your choice, but with fines of €20m or up to 4% of global revenue (whichever is larger) applied for every data breach and every data misuse after May 2018, the carrot is the better option.
Are you aware? Are you prepared? Do you comply?
To book a free, non-sales consultation about GDPR with Ian West, contact us at enquiry@digitalenterprisefest.com
Today’s organizations give predominant importance to increased privacy regulations, stakeholders’ profitability demands and ever-changing consumer privacy expectations. As a result, the emphasis on personal data is growing, and companies face a complicated reputational, regulatory and data privacy risk environment. It’s a sad fact that the frequency of critical data breaches is increasing, and as a result management and IT departments focus on safeguarding their data systems more than ever before. Our experienced data security, privacy and information governance experts in the UAE help you reduce the risks associated with various privacy compliance frameworks while recognizing the value of your personal data.
Bridging the Gap between Privacy and Security: Using Technology to Manage Com... - IBM Security
Data breach and Cybersecurity incident reporting regulations are becoming more widespread. The introduction of GDPR in May 2018, with its 72-hour reporting requirement, resulted in organizations having to review their incident response processes and more regional and industry-specific regulations are being introduced all the time. Security Operations and Privacy teams need to be aligned to meet these new requirements. Technology such as Security Orchestration and Automation is also being adopted to collaborate on the investigation and remediation of security incidents.
This webinar, hosted by Privacy experts from Ovum and IBM, will look at how technology can close the gap between Privacy and Security to reduce the time to contain incidents and maintain compliance with complex breach laws.
View the recording: https://event.on24.com/wcc/r/1930112/BE462033358FFF36C4B27F76C9755753?partnerref=LI
GDPR: 20 Million Reasons to get ready - Part 1: Preparing for compliance - Cloudera, Inc.
The first webinar of the series starts at the beginning: preparing for GDPR compliance. In this session, we look at how technology and process come together to let organisations get to grips with the GDPR relevant data that flows around their companies and work towards compliance. We will give you practical examples on how to apply data discovery, data minimisation, data protection and security as well as the role of the record of processing in this.
Improve IT Security and Compliance with Mainframe Data in Splunk - Precisely
Avoid security blind spots with an enterprise-wide view.
If your organization relies on Splunk as its security nerve center, you can’t afford to leave out your mainframes. They work with the rest of your IT infrastructure to support critical business applications, and they need to be viewed in that wider context to address potential security blind spots.
Although the importance of including mainframe data in Splunk is undeniable, many organizations have left it out because Splunk doesn’t natively support IBM Z® environments. Learn how Precisely Ironstream can help with a straightforward, powerful approach for integrating your mainframe security data into Splunk, and making it actionable once it’s there.
Date: 15th November 2017
Location: AI Lab Theatre
Time: 16:30 - 17:00
Speaker: Elisabeth Olafsdottir / Santiago Castro
Organisation: Microsoft / Keyrus
Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente... - confluent
In our exclusive webinar, you'll learn why event-driven architecture is the key to unlocking cost efficiency, operational effectiveness, and profitability. Gain insights on how this approach differs from API-driven methods and why it's essential for your organization's success.
Unlocking the Power of IoT: A comprehensive approach to real-time insights - confluent
In today's data-driven world, the Internet of Things (IoT) is revolutionizing industries and unlocking new possibilities. Join Data Reply, Confluent, and Imply as we unveil a comprehensive solution for IoT that harnesses the power of real-time insights.
Hybrid workshop: Stream Processing with Flink - confluent
Stream processing is a prerequisite of the data streaming stack, powering real-time applications and pipelines.
It enables greater data portability, optimized resource utilization and a better customer experience by processing data streams in real time.
In our hands-on hybrid workshop, you will learn how to easily filter, join and enrich real-time data within Confluent Cloud using our serverless Flink service.
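The filter-join-enrich pattern that such workshops cover can be sketched in plain Python. This only illustrates the idea with hypothetical event and field names; in Confluent Cloud the same logic would be expressed in Flink SQL over Kafka topics, not in generators.

```python
def click_stream():
    # Stand-in for an unbounded stream of click events read from a topic.
    yield {"user_id": 1, "url": "/pricing"}
    yield {"user_id": 2, "url": "/health"}      # internal probe, to be filtered out
    yield {"user_id": 1, "url": "/checkout"}

# Lookup table playing the role of the "join" side (e.g. a users table).
USERS = {1: {"plan": "pro"}, 2: {"plan": "free"}}

def filtered(events):
    # Filter: drop internal health-check traffic.
    return (e for e in events if e["url"] != "/health")

def enriched(events):
    # Enrich: join each event with the user's plan from the lookup table.
    for e in events:
        yield {**e, "plan": USERS[e["user_id"]]["plan"]}

result = list(enriched(filtered(click_stream())))
assert result == [
    {"user_id": 1, "url": "/pricing", "plan": "pro"},
    {"user_id": 1, "url": "/checkout", "plan": "pro"},
]
```

The equivalent Flink SQL would be a `WHERE` clause for the filter and a lookup join against a reference table for the enrichment; the generator pipeline just makes the dataflow visible.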
Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark... - confluent
Our talk will explore the transformative impact of integrating Confluent, HiveMQ, and SparkPlug in Industry 4.0, emphasizing the creation of a Unified Namespace.
In addition to the creation of a Unified Namespace, our webinar will also delve into Stream Governance and Scaling, highlighting how these aspects are crucial for managing complex data flows and ensuring robust, scalable IIoT-Platforms.
You will learn how to ensure data accuracy and reliability, expand your data processing capabilities, and optimize your data management processes.
Don't miss out on this opportunity to learn from industry experts and take your business to the next level.
Event-driven architecture (EDA) will be the heart of MAPFRE's ecosystem. To remain competitive, today's companies depend more and more on real-time data analysis, which gives them faster insights and response times. Doing business with real-time data means becoming situationally aware, detecting and responding to what is happening in the world right now.
Events and Microservices - Santander TechTalk - confluent
During this session we will examine how the worlds of events and microservices complement and improve each other, exploring how event-driven patterns allow us to decompose monoliths in a scalable, resilient and decoupled way.
The purpose of the session is to dive into Apache Kafka, data streaming and Kafka in the cloud:
- Dive into Apache Kafka
- Data Streaming
- Kafka in the cloud
Build real-time streaming data pipelines to AWS with Confluent - confluent
Traditional data pipelines often face scalability issues and challenges related to cost, their monolithic design, and reliance on batch data processing. They also typically operate under the premise that all data needs to be stored in a single centralized data source before it's put to practical use. Confluent Cloud on Amazon Web Services (AWS) provides a fully managed cloud-native platform that helps you simplify the way you build real-time data flows using streaming data pipelines and Apache Kafka.
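A pipeline stage in such an architecture takes raw source records, transforms them, and hands them to a Kafka producer. A minimal sketch of the transform step follows; the topic and field names are hypothetical, and the producer call is shown only as a comment because it needs a running cluster.

```python
import json

def to_kafka_message(order: dict) -> tuple[bytes, bytes]:
    # Key by customer so all of a customer's orders land in one partition,
    # preserving per-customer ordering for downstream consumers.
    key = str(order["customer_id"]).encode()
    value = json.dumps({
        "order_id": order["order_id"],
        # Store money as integer cents to avoid float issues downstream.
        "amount_cents": int(round(order["amount"] * 100)),
    }).encode()
    return key, value

key, value = to_kafka_message({"customer_id": 42, "order_id": "A-1", "amount": 19.99})
assert key == b"42"
assert json.loads(value) == {"order_id": "A-1", "amount_cents": 1999}

# Against a real cluster (e.g. Confluent Cloud), the message would be sent with
# confluent_kafka's Producer:
#   producer.produce("orders", key=key, value=value)
```

Keeping the transform as a pure function makes the stage easy to unit test independently of any broker, which is one reason streaming pipelines decompose more cleanly than monolithic batch jobs.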
Q&A with Confluent Professional Services: Confluent Service Mesh - confluent
No matter whether you are migrating your Kafka cluster to Confluent Cloud, running a cloud-hybrid environment or are in a different situation where data protection and encryption of sensitive information is required, Confluent Service Mesh allows you to transparently encrypt your data without the need to make code changes to your existing applications.
Citi Tech Talk: Event Driven Kafka Microservices - confluent
Microservices have become a dominant architectural paradigm for building systems in the enterprise, but they are not without their tradeoffs. Learn how to build event-driven microservices with Apache Kafka.
Confluent & GSI Webinars series - Session 3 - confluent
An in depth look at how Confluent is being used in the financial services industry. Gain an understanding of how organisations are utilising data in motion to solve common problems and gain benefits from their real time data capabilities.
It will look more deeply into some specific use cases and show how Confluent technology is used to manage costs and mitigate risks.
This session is aimed at Solutions Architects, Sales Engineers and Pre Sales, and also the more technically minded business aligned people. Whilst this is not a deeply technical session, a level of knowledge around Kafka would be helpful.
Transforming applications built with traditional messaging solutions such as TIBCO, MQ and Solace to be scalable, reliable and ready for the move to cloud
How can applications built with traditional messaging technologies like TIBCO, Solace and IBM MQ be modernised and made cloud-ready? What are the advantages of event streaming approaches to pub/sub vs traditional message queues? What are the strengths and weaknesses of both approaches, and which use cases and requirements are actually a better fit for messaging than for Kafka?
This session will show why the old paradigm does not work and that a new approach to the data strategy needs to be taken. It aims to show how a Data Streaming Platform is integral to the evolution of a company’s data strategy, and how Confluent is not just an integration layer but the central nervous system for an organisation.
You will also learn how to:
• Build products and features faster using a complete suite of connectors and stream management tools, and connect your environments to data pipelines
• Protect your most critical data and workloads with built-in security, governance and resilience guarantees
• Deploy Kafka at scale in minutes while reducing the associated costs and operational burden
Confluent Partner Tech Talk with Synthesis - confluent
A discussion of the arduous planning process, and a deep dive into the design and architectural decisions.
Learn more about the networking, RBAC strategies, the automation, and the deployment plan.
UiPath Test Automation using UiPath Test Suite series, part 3 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Neuro-symbolic is not enough, we need neuro-*semantic* - Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... - DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
GraphRAG is All You need? LLM & Knowledge Graph - Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... - Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes much work: it takes vision, leadership and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
UiPath Test Automation using UiPath Test Suite series, part 4 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
PHP Frameworks: I want to break free (IPC Berlin 2024) - Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Key Trends Shaping the Future of Infrastructure.pdf - Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024 - Tobias Schneck
As AI technology pushes into IT, I was wondering, as an “infrastructure container Kubernetes guy”, how does this fancy AI technology get managed from an infrastructure operations point of view? Is it possible to apply our lovely cloud-native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and provide you a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need in order to apply it to our own infrastructure and get it to work from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and what could be beneficial for or limiting your AI use cases in an enterprise environment. An interactive demo will give you some insights into which approaches I have already gotten working for real.
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality - Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Compliance in Motion: Aligning Data Governance Initiatives with Business Objectives in the Streaming Era
1. Compliance in Motion: Aligning Data Governance Initiatives with Business Objectives in the Streaming Era
Paige Bartley, Senior Analyst, Data and Enterprise Intelligence, Ovum
Cameron Tovey, Head of Information Security, Confluent
2. 2
Cameron Tovey is the head of information security at Confluent. With nearly 20 years of experience protecting data, he ensures that Confluent’s information security program is complete and running smoothly. Before Confluent he protected data for technology startups, healthcare organizations, retail companies, banking institutions and other Fortune 100 entities.
Cameron Tovey
Head of Information Security, Confluent
Paige specializes in all aspects of the data lifecycle, including creation, cleansing, security, privacy and productivity. Working across the information management space, Paige researches how data use affects large organizations and individuals alike. Paige’s other areas of expertise include regulatory and legal matters, data preparation, data quality, unstructured data, master data and records management, as well as neuroscience and cognitive science.
Paige Bartley
Senior Analyst, Data and Enterprise Intelligence, Ovum
3. 3
Session Overview
● This session will be one hour
● The last 10-15 minutes will consist of Q&A
● Submit questions by entering them into the GoToWebinar panel
● The slides and recording will be available
27. 27
What is driving compliance in your organization?
● Confluent Cloud
● Managing risk
● Shortening the sales cycle
28. 28
Managing Risk
Governmental Regulations:
● General Data Protection Regulation (GDPR)
● Health Insurance Portability and Accountability Act (HIPAA)
● Federal Risk and Authorization Management Program (FedRAMP)
Many organizations must address these regulatory requirements, either because they directly process or store protected information, or because their customers need them to process or store protected information.
Data Protection Standards:
● ISO/IEC 27000 series standards
● Payment Card Industry Data Security Standard (PCI DSS)
● Service Organization Control 2 (SOC 2)
The ability of these standards to change at a reasonable rate, matching and staying in line with industry trends, is what keeps them applicable and reusable in an effective data security and compliance program.
30. 30
Regulations and Standards
Health Insurance Portability and Accountability Act (HIPAA)
Health Information Technology for Economic and Clinical Health Act (HITECH)
Compliance with this regulation includes performing a gap assessment to understand which holes need to be fixed in order to comply properly, then putting together and beginning implementation of a remediation plan.
General Data Protection Regulation (GDPR)
This European regulation establishes the rights of individuals to request a copy of, make changes to, or have personal information completely deleted from systems. It also requires clear communication to individuals regarding the purposes for which their information is used.
Federal Risk and Authorization Management Program (FedRAMP)
Federal Information Security Modernization Act (FISMA)
The Federal Risk and Authorization Management Program (FedRAMP) is a government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services. Required by U.S. government entities for cloud services, FedRAMP mandates implementation of National Institute of Standards and Technology (NIST) standards for data protection.
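The GDPR deletion right described above is awkward for append-only event logs, where individual records cannot simply be removed. One common pattern is crypto-shredding: encrypt each subject's payloads with a per-subject key and, on an erasure request, delete only the key. The sketch below illustrates the idea only; the SHA-256 XOR keystream stands in for a real AEAD cipher (e.g., AES-GCM) and must not be used in production.

```python
import hashlib
import os

# Per-subject encryption keys. Deleting a key "erases" that subject's
# records even though the underlying event log is append-only.
keys = {}

def _keystream(key: bytes, n: int) -> bytes:
    # Illustrative keystream derived from SHA-256 in counter mode.
    # NOT production cryptography; use an AEAD cipher in real systems.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_event(subject_id: str, payload: bytes) -> bytes:
    key = keys.setdefault(subject_id, os.urandom(32))
    return bytes(a ^ b for a, b in zip(payload, _keystream(key, len(payload))))

def decrypt_event(subject_id: str, ciphertext: bytes) -> bytes:
    if subject_id not in keys:
        raise KeyError("subject erased; data is unrecoverable")
    key = keys[subject_id]
    return bytes(a ^ b for a, b in zip(ciphertext, _keystream(key, len(ciphertext))))

def erase_subject(subject_id: str) -> None:
    # GDPR "right to erasure": discard the key, not the immutable log.
    keys.pop(subject_id, None)
```

Once the key is gone, every copy of that subject's ciphertext, including replicas and backups, becomes unreadable without rewriting the log.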
31. 31
Regulations and Standards
Payment Card Industry Data Security Standard (PCI DSS)
PCI DSS sets specific technical requirements designed to protect credit card data used in processing payments. It strictly controls which pieces of data normally contained on the magnetic stripe of a credit card may be retained by a company, and what controls must be in place in order to do so.
Service Organization Control (SOC 2)
Defined by the American Institute of Certified Public Accountants (AICPA), this audit standard provides an external and independent assessment of a service provider’s controls environment. Well recognized in the U.S., the audit assesses how well an organization has operated its security controls over a period of time.
ISO 27001 & ISO 27018
The International Organization for Standardization (ISO) provides guidance on how to implement an effective information security management system (ISMS). The 27001 standard sets requirements for establishing, implementing, maintaining and continually improving an ISMS; the 27018 standard focuses on protection of personal data in the cloud.
32. 32
Three Pillars of Data Protection
● Confidentiality – Data is only accessible to those for whom it is intended.
● Availability – The ability to access data at the moment it is required.
● Integrity – Ensuring that data does not change inappropriately.
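The integrity pillar is often enforced mechanically by fingerprinting records at write time and re-checking the fingerprint on read. A minimal sketch using SHA-256 (the record values are invented for illustration):

```python
import hashlib

def fingerprint(record: bytes) -> str:
    # SHA-256 digest computed when the record is written and stored
    # alongside it (or in a separate integrity log).
    return hashlib.sha256(record).hexdigest()

def is_intact(record: bytes, expected_digest: str) -> bool:
    # Any inappropriate change to the record yields a different digest.
    return fingerprint(record) == expected_digest

digest = fingerprint(b"balance=100")
print(is_intact(b"balance=100", digest))  # True: unchanged record passes
print(is_intact(b"balance=999", digest))  # False: tampered record fails
```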
33. 33
Emerging Technology
Streaming Technology
● Confidentiality
● Integrity
● Availability
● Authentication
● Authorization
● Audit/Non-Repudiation/Logging
Cloud Services
● Shared Responsibility Model
● What does your provider control?
● What controls do they evidence for you?
● Are their controls good enough for you?
● What do you control?
● What are your requirements for the things in your control?
34. 34
Streaming Technology
Confidentiality – What controls are available for data confidentiality?
● Does the system encrypt data in transit?
● Does the system encrypt data at rest?
● How are these accomplished?
● Does the system provide role-based access controls?
● Does the system integrate with directory services or use single sign-on (SSO)?
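As a concrete illustration of the confidentiality questions above, a Kafka client configuration that enables TLS encryption in transit and SASL authentication might look like the following. Hostnames, paths and credentials are placeholders; encryption at rest is typically handled separately at the broker or storage layer.

```properties
# Encrypt data in transit and authenticate the client (placeholder values)
bootstrap.servers=broker.example.com:9093
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="app-user" \
  password="app-secret";
# Trust store used to verify the broker's TLS certificate
ssl.truststore.location=/etc/kafka/client.truststore.jks
ssl.truststore.password=changeit
```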
Integrity – What controls are available to maintain data integrity?
● Who has access to make changes?
● How do you know your data is accurate?
● What is the backup model (i.e., traditional data or system backups or distributed copies of data)?
● How long does it take to recover from a problem?
Availability – What controls are utilized for availability and performance scaling to accommodate growth?
● What level of service is guaranteed?
● How do I measure compliance with your commitments for uptime?
● What is your plan to resolve issues in the event of downtime?
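When evaluating a guaranteed level of service, it helps to translate the SLA percentage into the downtime it actually permits. A quick sketch (the percentages and 30-day month are illustrative assumptions, not any provider's commitment):

```python
# Translate an uptime SLA percentage into a monthly downtime budget.
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

def downtime_budget_minutes(sla_percent: float) -> float:
    """Minutes of downtime per 30-day month allowed by the SLA."""
    return MINUTES_PER_MONTH * (1 - sla_percent / 100)

print(round(downtime_budget_minutes(99.9), 1))   # 43.2 minutes/month
print(round(downtime_budget_minutes(99.95), 1))  # 21.6 minutes/month
```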
35. 35
Streaming Technology
Authentication – What controls are available to authenticate both customer users and service provider users?
● Can customer authentication services like LDAP, Active Directory or single sign-on be integrated?
● Can password and other authentication settings be managed by the customer?
● Can users utilize multi-factor authentication?
● How do automated processes integrate with the service?
● How are authentication credentials protected?
Authorization – What different activities can the customer control or limit?
● Are role-based controls available?
● Can roles be defined in your system to match my organization?
● What are the critical functions that should or could be limited?
● Is the ability to limit read, write, delete, and administration functions available?
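The role questions above can be made concrete with a toy authorization check. This is a generic sketch, not any particular product's RBAC model; the role and operation names are invented for illustration:

```python
# Toy role-based authorization table (roles and operations are
# illustrative, not any specific product's model).
ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "producer": {"read", "write"},
    "admin": {"read", "write", "delete", "administer"},
}

def is_allowed(role: str, operation: str) -> bool:
    # Unknown roles receive no permissions (deny by default).
    return operation in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("analyst", "read"))    # True
print(is_allowed("analyst", "delete"))  # False
```

Mapping organizational roles onto the platform's roles, and limiting read, write, delete and administration separately, is exactly what the checklist above probes for.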
Audit – How can a customer monitor access and changes to data and environments?
● Are system activities logged?
● How are these logs available to the customer?
● Are the logs available in a format that can be automatically consumed and processed by customer systems?
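One common way to satisfy the "automatically consumed" requirement above is to emit each audit record as a single JSON object per line (JSON Lines). A minimal sketch; the field names are illustrative assumptions, not a standard schema:

```python
import json
from datetime import datetime, timezone

def audit_record(principal: str, action: str, resource: str, allowed: bool) -> str:
    # One JSON object per line, which downstream customer systems can
    # parse and process automatically (field names are illustrative).
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "principal": principal,
        "action": action,
        "resource": resource,
        "allowed": allowed,
    })

print(audit_record("alice", "read", "topic:payments", True))
```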
36. 36
Cloud Services
Shared Responsibility Model
Responsibility for the security of data and systems deployed with any cloud provider is always shared.
Cloud Service Provider Controls
Some controls are provided entirely by the cloud service provider, and the customer has little or no influence over them.
Optional Controls
Some controls are made available by the cloud service provider, and the customer can choose to implement them at their discretion.
Customer Controls
Some controls are completely under the control of the customer who utilizes the cloud service.
37. 37
Cloud Services
Key questions for evaluating a shared-responsibility arrangement:
● What controls does the cloud service provider make available to all customers?
● Is an external opinion or audit report available that explains the controls they put in place?
● Are the documented controls sufficient?
● What controls are the responsibility of the customer?
● Does the customer have an accounting of the customer-responsible controls?
● Are customer controls and settings documented?
● Are the available customer controls and settings configured correctly?
● How does a customer monitor for changes that expose customer data or systems?
38. 38
Resources and Next Steps
https://confluent.io
http://cnfl.io/slack
#security
@confluentinc