This presentation will discuss the stories of three companies spanning different industries: the challenges they faced, how cloud analytics solved them, the technologies implemented to address those challenges, and how the companies benefited from their new cloud analytics environments.
The objectives of this session include:
• Detail and explain the key benefits and advantages of moving BI and analytics workloads to the cloud, and why companies shouldn’t wait any longer to make their move.
• Compare the different analytics cloud options companies have, and the pros and cons of each.
• Describe some of the challenges companies may face when moving their analytics to the cloud, and what they need to prepare for.
• Provide the case studies of three companies, what issues they were solving for, what technologies they implemented and why, and how they benefited from their new solutions.
• Learn what to look for when considering a partner and trusted advisor to assist with an analytics cloud migration.
Data Lakes are meant to support many of the same analytics capabilities of Data Warehouses while overcoming some of the core problems. Yet Data Lakes have a distinctly different technology base. This webinar will provide an overview of the standard architecture components of Data Lakes.
This will include:
The Lab and the factory
The base environment for batch analytics
Critical governance components
Additional components necessary for real-time analytics and ingesting streaming data
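The "lab and factory" split above can be reduced to two zones of the same lake: an exploratory area where raw data lands, and a governed production area for repeatable batch pipelines. A minimal sketch of that idea follows; the zone names and the promotion rule are illustrative assumptions, not components named by the webinar.

```python
# Two-zone data lake sketch: a "lab" for exploration and a "factory"
# for governed batch pipelines. Zone names and the governance check
# are illustrative assumptions.

lake = {"lab": {}, "factory": {}}

def land_in_lab(dataset: str, rows: list) -> None:
    """Raw, schema-on-read data lands in the lab for exploration."""
    lake["lab"][dataset] = rows

def promote_to_factory(dataset: str, required_cols: set) -> bool:
    """Promote only datasets that pass a basic governance check."""
    rows = lake["lab"].get(dataset, [])
    if rows and all(required_cols <= set(row) for row in rows):
        lake["factory"][dataset] = rows
        return True
    return False

land_in_lab("clicks", [{"user": "a", "ts": 1}, {"user": "b", "ts": 2}])
promote_to_factory("clicks", {"user", "ts"})  # → True
```

The point of the sketch is the asymmetry: anything may land in the lab, but only datasets meeting governance criteria reach the factory.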
Data Modeling, Data Governance, & Data Quality | DATAVERSITY
Data Governance is often referred to as the people, processes, and policies around data and information, and these aspects are critical to the success of any data governance implementation. But just as critical is the technical infrastructure that supports the diverse data environments that run the business. Data models can be the critical link between business definitions and rules and the technical data systems that support them. Without the valuable metadata these models provide, data governance often lacks the “teeth” to be applied in operational and reporting systems.
Join Donna Burbank and her guest, Nigel Turner, as they discuss how data models & metadata-driven data governance can be applied in your organization in order to achieve improved data quality.
Data Lakehouse, Data Mesh, and Data Fabric (r1) | James Serra
So many buzzwords of late: Data Lakehouse, Data Mesh, and Data Fabric. What do all these terms mean and how do they compare to a data warehouse? In this session I’ll cover all of them in detail and compare the pros and cons of each. I’ll include use cases so you can see what approach will work best for your big data needs.
Data Architecture Strategies: Data Architecture for Digital Transformation | DATAVERSITY
MDM, data quality, data architecture, and more: these foundational data management approaches remain essential. At the same time, combining them with other innovative techniques can help drive organizational change as well as technological transformation. This webinar will provide practical steps for creating a data foundation for effective digital transformation.
The world of data architecture began with applications. Next came data warehouses. Then textual data was organized into the data warehouse.
Then one day the world discovered a whole new kind of data being generated by organizations: machine-generated data that could be transformed into valuable insights. This was the origin of what is today called the data lakehouse. The evolution of data architecture continues today.
Come listen to industry experts describe this transformation of ordinary data into a data architecture that is invaluable to business. Simply put, organizations that take data architecture seriously are going to be at the forefront of business tomorrow.
This is an educational event.
Several of the authors of the book Building the Data Lakehouse will be presenting at this symposium.
This is Part 4 of the GoldenGate series on Data Mesh - a series of webinars helping customers understand how to move off of old-fashioned monolithic data integration architecture and get ready for more agile, cost-effective, event-driven solutions. The Data Mesh is a kind of Data Fabric that emphasizes business-led data products running on event-driven streaming architectures, serverless, and microservices-based platforms. These emerging solutions are essential for enterprises that run data-driven services on multi-cloud, multi-vendor ecosystems.
Join this session to get a fresh look at Data Mesh; we'll start with core architecture principles (vendor agnostic) and transition into detailed examples of how Oracle's GoldenGate platform is providing capabilities today. We will discuss essential technical characteristics of a Data Mesh solution, and the benefits that business owners can expect by moving IT in this direction. For more background on Data Mesh, Part 1, 2, and 3 are on the GoldenGate YouTube channel: https://www.youtube.com/playlist?list=PLbqmhpwYrlZJ-583p3KQGDAd6038i1ywe
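Log-based replication platforms of the kind discussed here stream row-level change events, and downstream consumers apply them to keep replicas current. The essence of that pattern can be sketched in a few lines; the event shape below is a simplifying assumption, not GoldenGate's actual trail format.

```python
# Apply a stream of row-level change events (insert/update/delete)
# to a keyed replica table. The event dictionary shape is an
# illustrative assumption, not GoldenGate's actual format.

def apply_changes(replica: dict, events: list) -> dict:
    for ev in events:
        op, key = ev["op"], ev["key"]
        if op in ("insert", "update"):
            replica[key] = ev["row"]   # upsert the latest row image
        elif op == "delete":
            replica.pop(key, None)     # tolerate deletes of absent keys
    return replica

replica = {}
events = [
    {"op": "insert", "key": 1, "row": {"name": "Ada"}},
    {"op": "update", "key": 1, "row": {"name": "Ada L."}},
    {"op": "insert", "key": 2, "row": {"name": "Grace"}},
    {"op": "delete", "key": 2},
]
apply_changes(replica, events)  # replica → {1: {"name": "Ada L."}}
```

Real change-data-capture systems add ordering guarantees, schema evolution, and exactly-once delivery on top of this basic replay loop.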
Webinar Speaker: Jeff Pollock, VP Product (https://www.linkedin.com/in/jtpollock/)
Mr. Pollock is an expert technology leader for data platforms, big data, data integration and governance. Jeff has been CTO at California startups and a senior exec at Fortune 100 tech vendors. He is currently Oracle VP of Products and Cloud Services for Data Replication, Streaming Data and Database Migrations. While at IBM, he was head of all Information Integration, Replication and Governance products, and previously Jeff was an independent architect for US Defense Department, VP of Technology at Cerebra and CTO of Modulant – he has been engineering artificial intelligence based data platforms since 2001. As a business consultant, Mr. Pollock was a Head Architect at Ernst & Young’s Center for Technology Enablement. Jeff is also the author of “Semantic Web for Dummies” and "Adaptive Information,” a frequent keynote at industry conferences, author for books and industry journals, formerly a contributing member of W3C and OASIS, and an engineering instructor with UC Berkeley’s Extension for object-oriented systems, software development process and enterprise architecture.
The Path to Data and Analytics Modernization | Analytics8
Learn about the business demands driving modernization, the benefits of doing so, and how to get started.
Can your data and analytics solutions handle today’s challenges?
To stay competitive in today’s market, companies must be able to use their data to make better decisions. However, we are living in a world flooded by data, new technologies, and demands from the business for better and more advanced analytics. Most companies do not have the modern technologies and processes in place to keep up with these growing demands. They need to modernize how they collect, analyze, use, and share their data.
In this webinar, we discuss how you can build modern data and analytics solutions that are future-ready, scalable, real-time, high-speed, and agile, and that enable better use of data throughout your company.
We cover:
-The business demands and industry shifts that are impacting the need to modernize
-The benefits of data and analytics modernization
-How to approach data and analytics modernization: steps you need to take and how to get it right
-The pillars of modern data management
-Tips for migrating from legacy analytics tools to modern, next-gen platforms
-Lessons learned from companies that have gone through the modernization process
Creating a clearly articulated data strategy—a roadmap of technology-driven capability investments prioritized to deliver value—helps ensure from the get-go that you are focusing on the right things, so that your work with data has a business impact. In this presentation, the experts at Silicon Valley Data Science share their approach for crafting an actionable and flexible data strategy to maximize business value.
Emerging Trends in Data Architecture – What’s the Next Big Thing? | DATAVERSITY
With technological innovation and change occurring at an ever-increasing rate, it’s hard to keep track of what’s hype and what can provide practical value for your organization. Join this webinar to see the results of a recent DATAVERSITY survey on emerging trends in Data Architecture, along with practical commentary and advice from industry expert Donna Burbank.
Building a Data Strategy – Practical Steps for Aligning with Business Goals | DATAVERSITY
Developing a Data Strategy for your organization can seem like a daunting task – but it’s worth the effort. Getting your Data Strategy right can provide significant value, as data drives many of the key initiatives in today’s marketplace – from digital transformation, to marketing, to customer centricity, to population health, and more. This webinar will help demystify Data Strategy and its relationship to Data Architecture and will provide concrete, practical ways to get started.
Standing on the Shoulders of Open-Source Giants: The Serverless Realtime Lake... | Hosted by Confluent
Unlike just a few years ago, today the lakehouse architecture is an established data platform embraced by all major cloud data companies, such as AWS, Microsoft Azure, Google, Oracle, Snowflake, and Databricks.
This session kicks off with a technical, no-nonsense introduction to the lakehouse concept, dives deep into the lakehouse architecture and recaps how a data lakehouse is built from the ground up with streaming as a first-class citizen.
Then we focus on serverless for streaming use cases. Serverless concepts are well-known from developers triggering hundreds of thousands of AWS Lambda functions at a negligible cost. However, the same concept becomes more interesting when looking at data platforms.
We have all heard the principle "It runs best on PowerPoint", so I decided to skip slides here and bring a serverless demo instead:
A hands-on, fun, and interactive serverless streaming use case example where we ingest live events from hundreds of mobile devices (don't miss out - bring your phone and be part of it!!). Based on this use case I will critically explore how much of a modern lakehouse is serverless and how we implemented that at Databricks (spoiler alert: serverless is everywhere from data pipelines, workflows, optimized Spark APIs, to ML).
TL;DR benefits for data practitioners:
- Recap the OSS foundation of the lakehouse architecture and understand its appeal
- Understand the benefits of leveraging a lakehouse for streaming, and what's there beyond Spark Structured Streaming
- Meat of the talk: the serverless lakehouse. I give you the tech bits beyond the hype. How does a serverless lakehouse differ from other serverless offerings?
- Live, hands-on, interactive demo exploring serverless data engineering end-to-end. For each step we take a critical look and I explain what it means, e.g., for saving costs and removing operational overhead.
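The core pattern behind the live demo, ingesting events from many devices and aggregating them continuously, can be reduced to a tumbling-window count. A pure-Python sketch follows; the window size and event fields are assumptions, and the actual demo runs on Databricks serverless infrastructure rather than in-process.

```python
from collections import defaultdict

# Tumbling-window event counts per device: the core aggregation
# pattern behind the interactive streaming demo. Window size and
# event fields are illustrative assumptions.

def window_counts(events: list, window_s: int = 10) -> dict:
    counts = defaultdict(int)
    for ev in events:
        # Align each event's timestamp to the start of its window.
        window_start = (ev["ts"] // window_s) * window_s
        counts[(ev["device"], window_start)] += 1
    return dict(counts)

events = [
    {"device": "phone-1", "ts": 3},
    {"device": "phone-1", "ts": 7},
    {"device": "phone-2", "ts": 12},
]
window_counts(events)
# → {("phone-1", 0): 2, ("phone-2", 10): 1}
```

Engines such as Spark Structured Streaming apply the same windowing logic incrementally over unbounded streams, with watermarking to handle late events.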
Data at the Speed of Business with Data Mastering and Governance | DATAVERSITY
Do you ever wonder how data-driven organizations fuel analytics, improve customer experience, and accelerate business productivity? They are successful by governing and mastering data effectively so they can get trusted data to those who need it faster. Efficient data discovery, mastering and democratization is critical for swiftly linking accurate data with business consumers. When business teams can quickly and easily locate, interpret, trust, and apply data assets to support sound business judgment, it takes less time to see value.
Join data mastering and data governance experts from Informatica—plus a real-world organization empowering trusted data for analytics—for a lively panel discussion. You’ll hear more about how a single cloud-native approach can help global businesses in any economy create more value—faster, more reliably, and with more confidence—by making data management and governance easier to implement.
Enterprise Architecture vs. Data Architecture | DATAVERSITY
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how data architecture is a key component of an overall enterprise architecture for enhanced business value and success.
Wonder what this data mesh stuff is all about? What are the principles of data mesh? Can you or should you consider data mesh as the approach for your analytics platform? And most important - how can Snowflake help?
Given in Montreal on 14-Dec-2021
In this session, Sergio covered the Lakehouse concept and how companies implement it, from data ingestion to insight. He showed how you can use Azure Data Services to speed up your analytics project, from ingesting and modelling data to delivering insights to end users.
Data is the lifeblood of just about every organization and functional area today. As businesses struggle to come to grips with the data flood, it is even more critical to focus on data as an asset that directly supports business imperatives, as other organizational assets do. Organizations across most industries attempt to address data opportunities (e.g., Big Data) and data challenges (e.g., data quality) to enhance business unit performance. Unfortunately, however, the results of these efforts frequently fall far below expectations due to haphazard approaches. Overall, poor organizational data management capabilities are the root cause of many of these failures. This webinar covers three lessons (illustrated by examples), which will help you to establish realistic data management plans and expectations, and help demonstrate the value of such actions to both internal and external decision makers.
Check out more of our webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
To take a “ready, aim, fire” approach to implementing Data Governance, many organizations assess themselves against industry best practices. The process is not difficult or time-consuming, and it can directly ensure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
Big data architectures and the data lake | James Serra
With so many new technologies, it can be confusing to choose the best approach to building a big data architecture. The data lake is a great new concept, usually built in Hadoop, but what exactly is it and how does it fit in? In this presentation I'll discuss the four most common patterns in big data production implementations, the top-down vs. bottom-up approach to analytics, and how you can use a data lake and an RDBMS data warehouse together. We will go into detail on the characteristics of a data lake and its benefits, and how you still need to perform the same data governance tasks in a data lake as you do in a data warehouse. Come to this presentation to make sure your data lake does not turn into a data swamp!
DAS Slides: Enterprise Architecture vs. Data Architecture | DATAVERSITY
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key inter-relationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how Data Architecture is a key component of an overall enterprise architecture for enhanced business value and success.
Data Catalogs Are the Answer – What Is the Question? | DATAVERSITY
Organizations with governed metadata made available through their data catalog can answer questions their people have about the organization’s data. These organizations get more value from their data, protect their data better, gain improved ROI from data-centric projects and programs, and have more confidence in their most strategic data.
Join Bob Seiner for this lively webinar where he will talk about the value of a data catalog and how to build the use of the catalog into your stewards’ daily routines. Bob will share how the tool must be positioned for success and viewed as a must-have resource that is a steppingstone and catalyst to governed data across the organization.
How to Use a Semantic Layer to Deliver Actionable Insights at Scale | DATAVERSITY
Learn about using a semantic layer to enable actionable insights for everyone and streamline data and analytics access throughout your organization. This session will offer practical advice based on a decade of experience making semantic layers work for Enterprise customers.
Attend this session to learn about:
- Delivering critical business data to users faster than ever, at scale, using a semantic layer
- Enabling data teams to model and deliver a semantic layer on data in the cloud
- Maintaining a single source of governed metrics and business data
- Achieving speed-of-thought query performance and consistent KPIs across any BI/AI tool, such as Excel, Power BI, Tableau, Looker, DataRobot, Databricks, and more
- Providing dimensional analysis capability that accelerates performance with no need to extract data from the cloud data warehouse
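The "single source of governed metrics" idea above boils down to one mechanism: business-friendly metric names map to exactly one canonical definition, and every BI tool compiles its queries through that definition. A toy sketch follows; the metric names, table, and SQL shapes are hypothetical, not from the session.

```python
# A governed metric registry: each business metric has exactly one
# canonical SQL definition, so every tool that compiles through the
# semantic layer reports the same number. Names are hypothetical.

METRICS = {
    "total_revenue": "SUM(order_total)",
    "order_count": "COUNT(DISTINCT order_id)",
}

def compile_query(metric: str, table: str, group_by: str) -> str:
    expr = METRICS[metric]  # unknown metrics fail loudly (KeyError)
    return (
        f"SELECT {group_by}, {expr} AS {metric} "
        f"FROM {table} GROUP BY {group_by}"
    )

compile_query("total_revenue", "orders", "region")
# → 'SELECT region, SUM(order_total) AS total_revenue FROM orders GROUP BY region'
```

Production semantic layers add join resolution, caching, and access control, but the governance benefit comes from this single point of definition.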
Who should attend this session?
Data & Analytics leaders and practitioners (e.g., Chief Data Officers, data scientists, data literacy, business intelligence, and analytics professionals).
Databricks CEO Ali Ghodsi introduces Databricks Delta, a new data management system that combines the scale and cost-efficiency of a data lake, the performance and reliability of a data warehouse, and the low latency of streaming.
How to Build & Sustain a Data Governance Operating Model | DATUM LLC
Learn how to execute a data governance strategy through creation of a successful business case and operating model.
Originally presented to an audience of 400+ at the Master Data Management & Data Governance Summit.
Visit www.datumstrategy.com for more!
How to Build the Data Mesh Foundation: A Principled Approach | Zhamak Dehghan... | Hosted by Confluent
Organizations have been chasing the dream of data democratization, unlocking and accessing data at scale to serve their customers and business, for over half a century, since the early days of data warehousing. They have tried to reach this dream through multiple generations of architectures, such as the data warehouse and the data lake, through a Cambrian explosion of tools, and through large investments to build their next data platform. Despite the intentions and the investments, the results have been middling.
In this keynote, Zhamak shares her observations on the failure modes of the centralized paradigm of the data lake and its predecessor, the data warehouse.
She introduces Data Mesh, a paradigm shift in big data management that draws from modern distributed architecture: considering domains as the first class concern, applying self-sovereignty to distribute the ownership of data, applying platform thinking to create self-serve data infrastructure, and treating data as a product.
This talk introduces the principles underpinning data mesh and Zhamak's recent learnings in creating a path to bring data mesh to life in your organization.
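Treating "data as a product" implies that each domain publishes its data behind an explicit contract: an accountable owner, a published schema, and quality guarantees. A minimal sketch of such a contract follows; the field names and the conformance check are illustrative, not taken from the talk.

```python
from dataclasses import dataclass

# A data product contract: domain-owned, discoverable, with an
# explicit schema interface and a freshness guarantee. Field names
# are illustrative assumptions.

@dataclass
class DataProduct:
    name: str
    domain: str               # owning domain (domain ownership principle)
    owner: str                # accountable product owner
    schema: dict              # column -> type: the published interface
    freshness_sla_hours: int = 24

    def conforms(self, row: dict) -> bool:
        """Check a row against the published schema."""
        return set(row) == set(self.schema) and all(
            isinstance(row[col], typ) for col, typ in self.schema.items()
        )

orders = DataProduct(
    name="orders", domain="sales", owner="sales-data-team",
    schema={"order_id": int, "total": float},
)
orders.conforms({"order_id": 1, "total": 9.99})  # → True
```

The contract is the key shift: consumers depend on the published schema and SLA, not on the internals of the producing domain's pipelines.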
These slides--based on the webinar--from leading IT analyst firm Enterprise Management Associates (EMA) and Attunity provide insights into how organizations are overcoming inherent challenges to enable SAP data-driven initiatives and meet the analytics goals of their organization with modern data integration and management.
Data Virtualization, a Strategic IT Investment to Build Modern Enterprise Dat... | Denodo
This content was presented during the Smart Data Summit Dubai 2015 in the UAE on May 25, 2015, by Jesus Barrasa, Senior Solutions Architect at Denodo Technologies.
In the era of Big Data, IoT, Cloud and Social Media, Information Architects are forced to rethink how to tackle data management and integration in the enterprise. Traditional approaches based on data replication and rigid information models lack the flexibility to deal with this new hybrid reality. New data sources and an increasing variety of consuming applications, like mobile apps and SaaS, add more complexity to the problem of delivering the right data, in the right format, and at the right time to the business. Data Virtualization emerges in this new scenario as the key enabler of agile, maintainable and future-proof data architectures.
Creating a clearly articulated data strategy—a roadmap of technology-driven capability investments prioritized to deliver value—helps ensure from the get-go that you are focusing on the right things, so that your work with data has a business impact. In this presentation, the experts at Silicon Valley Data Science share their approach for crafting an actionable and flexible data strategy to maximize business value.
Emerging Trends in Data Architecture – What’s the Next Big Thing?DATAVERSITY
With technological innovation and change occurring at an ever-increasing rate, it’s hard to keep track of what’s hype and what can provide practical value for your organization. Join this webinar to see the results of a recent DATAVERSITY survey on emerging trends in Data Architecture, along with practical commentary and advice from industry expert Donna Burbank.
Building a Data Strategy – Practical Steps for Aligning with Business GoalsDATAVERSITY
Developing a Data Strategy for your organization can seem like a daunting task – but it’s worth the effort. Getting your Data Strategy right can provide significant value, as data drives many of the key initiatives in today’s marketplace – from digital transformation, to marketing, to customer centricity, to population health, and more. This webinar will help demystify Data Strategy and its relationship to Data Architecture and will provide concrete, practical ways to get started.
Standing on the Shoulders of Open-Source Giants: The Serverless Realtime Lake...HostedbyConfluent
"Unlike just a few years ago, today the lakehouse architecture is an established data platform embraced by all major cloud data companies such as AWS, Azure, Google, Oracle, Microsoft, Snowflake and Databricks.
This session kicks off with a technical, no-nonsense introduction to the lakehouse concept, dives deep into the lakehouse architecture and recaps how a data lakehouse is built from the ground up with streaming as a first-class citizen.
Then we focus on serverless for streaming use cases. Serverless concepts are well-known from developers triggering hundreds of thousands of AWS Lambda functions at a negligible cost. However, the same concept becomes more interesting when looking at data platforms.
We have all heard about the principle ""It runs best on Powerpoint"", so I decided to skip slides here and bring a serverless demo instead:
A hands-on, fun, and interactive serverless streaming use case example where we ingest live events from hundreds of mobile devices (don't miss out - bring your phone and be part of it!!). Based on this use case I will critically explore how much of a modern lakehouse is serverless and how we implemented that at Databricks (spoiler alert: serverless is everywhere from data pipelines, workflows, optimized Spark APIs, to ML).
TL;DR benefits for the Data Practitioners:
-Recap the OSS foundation of the Lakehouse architecture and understand its appeal
- Understand the benefits of leveraging a lakehouse for streaming and what's there beyond Spark Structured Streaming.
- Meat of the talk: The Serverless Lakehouse. I give you the tech bits beyond the hype. How does a serverless lakehouse differ from other serverless offers?
- Live, hands-on, interactive demo to explore serverless data engineering data end-to-end. For each step we have a critical look and I explain what it means, e.g for you saving costs and removing operational overhead."
Data at the Speed of Business with Data Mastering and GovernanceDATAVERSITY
Do you ever wonder how data-driven organizations fuel analytics, improve customer experience, and accelerate business productivity? They are successful by governing and mastering data effectively so they can get trusted data to those who need it faster. Efficient data discovery, mastering and democratization is critical for swiftly linking accurate data with business consumers. When business teams can quickly and easily locate, interpret, trust, and apply data assets to support sound business judgment, it takes less time to see value.
Join data mastering and data governance experts from Informatica—plus a real-world organization empowering trusted data for analytics—for a lively panel discussion. You’ll hear more about how a single cloud-native approach can help global businesses in any economy create more value—faster, more reliably, and with more confidence—by making data management and governance easier to implement.
Enterprise Architecture vs. Data Architecture - DATAVERSITY
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how data architecture is a key component of an overall enterprise architecture for enhanced business value and success.
Wonder what this data mesh stuff is all about? What are the principles of data mesh? Can you or should you consider data mesh as the approach for your analytics platform? And most important - how can Snowflake help?
Given in Montreal on 14-Dec-2021
In this session, Sergio covered the Lakehouse concept and how companies implement it, from data ingestion to insight. He showed how you can use Azure Data Services to speed up your analytics project, from ingesting and modelling data to delivering insights to end users.
Data is the lifeblood of just about every organization and functional area today. As businesses struggle to come to grips with the data flood, it is even more critical to focus on data as an asset that directly supports business imperatives, as other organizational assets do. Organizations across most industries attempt to address data opportunities (e.g., Big Data) and data challenges (e.g., data quality) to enhance business unit performance. Unfortunately, the results of these efforts frequently fall far below expectations due to haphazard approaches. Overall, poor organizational data management capabilities are the root cause of many of these failures. This webinar covers three lessons (illustrated by examples) that will help you establish realistic data management plans and expectations, and help demonstrate the value of such actions to both internal and external decision makers.
Check out more of our webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
To take a “ready, aim, fire” approach to implementing Data Governance, many organizations assess themselves against industry best practices. The process is neither difficult nor time-consuming, and it can directly assure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is a best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
Big data architectures and the data lake - James Serra
With so many new technologies, it can be confusing to choose the best approach to building a big data architecture. The data lake is a great new concept, usually built in Hadoop, but what exactly is it and how does it fit in? In this presentation I'll discuss the four most common patterns in big data production implementations, the top-down vs. bottom-up approach to analytics, and how you can use a data lake and an RDBMS data warehouse together. We will go into detail on the characteristics of a data lake and its benefits, and how you still need to perform the same data governance tasks in a data lake as you do in a data warehouse. Come to this presentation to make sure your data lake does not turn into a data swamp!
DAS Slides: Enterprise Architecture vs. Data Architecture - DATAVERSITY
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key inter-relationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how Data Architecture is a key component of an overall enterprise architecture for enhanced business value and success.
Data Catalogs Are the Answer – What is the Question? - DATAVERSITY
Organizations with governed metadata made available through their data catalog can answer questions their people have about the organization’s data. These organizations get more value from their data, protect their data better, gain improved ROI from data-centric projects and programs, and have more confidence in their most strategic data.
Join Bob Seiner for this lively webinar where he will talk about the value of a data catalog and how to build the use of the catalog into your stewards’ daily routines. Bob will share how the tool must be positioned for success and viewed as a must-have resource that is a steppingstone and catalyst to governed data across the organization.
How to Use a Semantic Layer to Deliver Actionable Insights at Scale - DATAVERSITY
Learn about using a semantic layer to enable actionable insights for everyone and streamline data and analytics access throughout your organization. This session will offer practical advice based on a decade of experience making semantic layers work for Enterprise customers.
Attend this session to learn about:
- Delivering critical business data to users faster than ever at scale using a semantic layer
- Enabling data teams to model and deliver a semantic layer on data in the cloud
- Maintaining a single source of governed metrics and business data
- Achieving speed-of-thought query performance and consistent KPIs across any BI/AI tool, including Excel, Power BI, Tableau, Looker, DataRobot, Databricks, and more
- Providing dimensional analysis capability that accelerates performance with no need to extract data from the cloud data warehouse
Who should attend this session?
Data & Analytics leaders and practitioners (e.g., Chief Data Officers, data scientists, data literacy, business intelligence, and analytics professionals).
Databricks CEO Ali Ghodsi introduces Databricks Delta, a new data management system that combines the scale and cost-efficiency of a data lake, the performance and reliability of a data warehouse, and the low latency of streaming.
How to Build & Sustain a Data Governance Operating Model - DATUM LLC
Learn how to execute a data governance strategy through creation of a successful business case and operating model.
Originally presented to an audience of 400+ at the Master Data Management & Data Governance Summit.
Visit www.datumstrategy.com for more!
How to Build the Data Mesh Foundation: A Principled Approach | Zhamak Dehghan... - HostedbyConfluent
Organizations have been chasing the dream of data democratization, unlocking and accessing data at scale to serve their customers and business, for over half a century, since the early days of data warehousing. They have tried to reach this dream through multiple generations of architectures, such as the data warehouse and the data lake, through a Cambrian explosion of tools, and through large investments to build their next data platform. Despite the intentions and the investments, the results have been middling.
In this keynote, Zhamak shares her observations on the failure modes of a centralized paradigm of a data lake, and its predecessor data warehouse.
She introduces Data Mesh, a paradigm shift in big data management that draws from modern distributed architecture: considering domains as the first-class concern, applying self-sovereignty to distribute the ownership of data, applying platform thinking to create self-serve data infrastructure, and treating data as a product.
This talk introduces the principles underpinning data mesh and Zhamak's recent learnings in creating a path to bring data mesh to life in your organization.
These slides from leading IT analyst firm Enterprise Management Associates (EMA) and Attunity, based on the webinar, provide insights into how organizations are overcoming inherent challenges to enable SAP data-driven initiatives and meet the analytics goals of their organization with modern data integration and management.
Data Virtualization, a Strategic IT Investment to Build Modern Enterprise Dat... - Denodo
This content was presented during the Smart Data Summit Dubai 2015 in the UAE on May 25, 2015, by Jesus Barrasa, Senior Solutions Architect at Denodo Technologies.
In the era of Big Data, IoT, Cloud and Social Media, Information Architects are forced to rethink how to tackle data management and integration in the enterprise. Traditional approaches based on data replication and rigid information models lack the flexibility to deal with this new hybrid reality. New data sources and an increasing variety of consuming applications, like mobile apps and SaaS, add more complexity to the problem of delivering the right data, in the right format, and at the right time to the business. Data Virtualization emerges in this new scenario as the key enabler of agile, maintainable and future-proof data architectures.
Is your big data journey stalling? Take the Leap with Capgemini and Cloudera - Cloudera, Inc.
Transitioning to a Big Data architecture is a big step, and the complexity of moving existing analytical services onto modern platforms like Cloudera can seem overwhelming.
Uncover how your business can save money and find new revenue streams.
Driving profitability is a top priority for companies globally, especially in uncertain economic times. It's imperative that companies reimagine growth strategies and improve process efficiencies to help cut costs and drive revenue – but how?
By leveraging data-driven strategies layered with artificial intelligence, companies can achieve untapped potential and help their businesses save money and drive profitability.
In this webinar, you'll learn:
- How your company can leverage data and AI to reduce spending and costs
- Ways you can monetize data and AI and uncover new growth strategies
- How different companies have implemented these strategies to achieve cost optimization benefits
BDW Chicago 2016 - Ramu Kalvakuntla, Sr. Principal - Technical - Big Data Pra... - Big Data Week
We are all aware of the challenges enterprises face with growing data and siloed data stores. The business can't make reliable decisions with untrusted data, and on top of that, it doesn't have access to all the data within and outside the enterprise needed to stay ahead of the competition and make key business decisions.
This session will take a deep dive into the challenges businesses are facing today and how to build a Modern Data Architecture using emerging technologies such as Hadoop, Spark, NoSQL data stores, and MPP data stores, along with scalable and cost-effective cloud solutions such as AWS, Azure, and Bigstep.
Looking to the Future: Embracing the Cloud for a More Modern Data Quality App... - Precisely
Data quality: it’s what we all strive for, and yet we don’t always have what we need to achieve it.
Embracing the cloud with a more holistic yet simplified user experience will help you find exponential value in your data today – and plan for tomorrow. Join us to learn about a more modern approach that will empower your teams to more deeply understand, trust, and proactively address anomalies in your critical data.
Learn more about the value of next-generation cloud solutions that will power your organization into the future by joining us on September 22 where you will hear from Precisely’s Emily Washington, SVP of Product Management, Chuck Kane, VP of Product Management, and David Woods, SVP of Strategic Services. Be sure to bring your questions for our team of experts to the live Q&A session following their presentations and demos.
Businesses of all sizes can benefit from better use of their data to gain insights. Learn how the cloud can help overcome common data challenges and accelerate transformation with cloud technology.
https://www.rapyder.com/cloud-data-analytics-services/
Booz Allen Hamilton uses its Cloud Analytics Reference Architecture to build technology infrastructures that can withstand the weight of massive datasets – and deliver the deep insights organizations need to drive innovation.
Organizations often struggle to select and implement big data projects that produce meaningful results.
Learning from the success and failure of other organizations will help you identify common pitfalls and get more value from your big data initiatives. A new study from 451 research takes an in-depth look into six organizations and their cloud-based big data adoption efforts.
In this webinar, we will share some of the key findings from this research and see how organizations across a variety of industries use the Cloud to drive measurable value from big data. You will learn the challenges they faced, the tools they use to address these challenges, and the benefits of using AWS Cloud to develop and deploy big data solutions.
Learning Objectives:
Hear the experiences of organizations in a variety of industries, including a mobile technology analytics platform provider; a mobile application platform provider; a financial services regulator; a technology consultancy; a marketing strategy firm; and a mainstream financial services firm
Identify some of the challenges of deploying big data solutions
Learn 5 ways the Cloud delivers value for big data users
Understand the benefits of using the AWS Cloud to develop and deploy big data solutions
Who Should Attend:
Business & technical decision makers, architects and director-level or above of development for Big Data solutions, business analysts, data scientists, VP/Directors of engineering, CIOs, CTOs
This presentation is for Analytic and Business Intelligence leads as well as IT leads who manage analytics. In addition, existing Oracle Business Intelligence and Analytic Customers will find it valuable to understand how they can leverage their existing investments along with Oracle Analytics Cloud.
How to Capitalize on Big Data with Oracle Analytics Cloud - Perficient, Inc.
The average age of a company listed on the S&P 500 has fallen from almost 60 years old in the 1950s to less than 20 years old today. Innovative companies that are willing to embrace transformative technologies make the list today, while businesses that are hesitant to embrace change risk becoming obsolete.
Innovators use big data solutions as a competitive advantage to increase revenue, reduce cost, and improve cash flow. Turn big data into actionable insights with Oracle Analytics Cloud.
We identified the big data opportunities in front of you and how to take advantage of them:
-Big data and its architecture
-Why a big data strategy is imperative to remaining relevant
-How Oracle Analytics Cloud can help you connect people, places, data, and systems to fundamentally change how you analyze, understand, and act on information
Visual Analytics combines human intuition and data science to derive knowledge from the data in a very efficient, effective and easy way. Visual Analytics empowers your people to interact with the data and generate new insights.
Big Data Expo 2015 - Talend Delivering Real Time - BigDataExpo
Pioneers like Mint in the financial sector, Amazon in retail, or Netflix in media proved that turning Big Data into actions and insights at the customer touch points delivers measurable outcomes – increased conversion rates, larger share of wallet, better customer acquisition, just-in-time fraud detection, etc. They showcased that it is possible today to put in place a platform for the management of customer data that is able to integrate and deliver information in real time, regardless of the interaction channel being used… and as a result establish the foundation to disrupt a whole industry with data-driven processes.
Now, this Customer Data Platform is reaching the mainstream through affordable technologies such as Hadoop and Spark, when empowered with embedded data and application integration, data governance, master data management, analytics, and real-time data processing. This platform, sometimes referred to as a Customer Data Platform (CDP) or a Data Management Platform (DMP), allows organizations to reconstruct the entire customer journey by centralizing and cross-referencing interactional or internal data, such as purchase history, preferences, satisfaction, and loyalty, with social or external data that can uncover customer intention as well as broader habits and tastes.
In this presentation, attendees will get knowledge of the key components of the platform, how to implement it, and how to run it in the context of the enterprise marketing activities.
Four Key Considerations for your Big Data Analytics Strategy - Arcadia Data
Learn 4 of the key things to consider as you create your big data analytics strategy from John Meyers (Enterprise Management Associates) and Steve Wooledge (Arcadia Data).
DAMA Webinar: Turn Grand Designs into a Reality with Data Virtualization - Denodo
Watch full webinar here: https://buff.ly/2HMdbUp
Data virtualization started to evolve as the most agile, real-time enterprise data fabric; it is now proving to go beyond its initial promise and is becoming one of the most important enterprise big data fabrics.
Attend this session to learn:
• What data virtualization really is
• How it differs from other enterprise data integration technologies
• Real-world examples of data virtualization in action from companies such as Logitech, Autodesk and Festo.
Next Generation Data Center - IT Transformation - Damian Hamilton
Computerworld CIO Event in Hong Kong sponsored by Dimension Data, EMC & Cisco.
Insights into Dimension Data's DC strategy and recent Client engagements
Increase your ROI with Hadoop in Six Months - Presented by Dell, Cloudera and... - Cloudera, Inc.
Are you struggling to validate the added costs of a Hadoop implementation? Are you struggling to manage your growing data?
The costs of implementing Hadoop may be more beneficial than you anticipate. Dell and Intel recently commissioned a study with Forrester Research to determine the Total Economic Impact of the Dell | Cloudera Apache Hadoop Solution, accelerated by Intel. The study determined customers can see a 6-month payback when implementing the Dell | Cloudera solution.
Join Dell, Intel and Cloudera, three big data market leaders, to understand how to begin a simplified and cost-effective big data journey and to hear case studies that demonstrate how users have benefited from the Dell | Cloudera Apache Hadoop Solution.
Accelerate SQL Server Migration to the AWS Cloud - Datavail
In today’s marketplace, moving to the public Cloud is a familiar and consistent trend within the SQL Server community.
But which cloud provider do you choose? After all, there are different AWS instance types, each with its own distinctive features. Migrations to the cloud are only going to gain greater momentum as organizations grapple with their on-premises alternatives.
Recent cloud breaches may leave some organizations hesitant to take the leap and move to the cloud; however, market-leading cloud providers are making every effort to adhere to compliance guidelines while boosting their security frameworks and reliability offerings. They are also becoming more competitive by managing their costs more effectively.
For both homogeneous and heterogeneous migrations, planning plays a critical role in moving to the cloud. Preparing a checklist and asking the right questions of stakeholders lays the groundwork for this planning. There are different methods to migrate databases from on-premises to the AWS cloud.
This webinar is in partnership with PASS, download the recording to learn more about:
Reasons to go to the cloud
SQL Server on AWS EC2 vs. AWS RDS
SQL Server high availability (HA) & disaster recovery (DR)
SQL Server migration methodology
The DBA's role in the cloud
MOUS 2020 - Hyperion 11.2 vs. Cloud: Should I Stay or Should I Go? - Datavail
Oracle has announced the 11.2 release of the Oracle Hyperion EPM on-premises suite, tentatively scheduled for Q1 2019. The impending release represents a decision point for many on-premises customers: Should I invest in upgrading to 11.2, or is this the right time to move to the cloud?
The presentation will cover:
• On-premise infrastructure impacts
• Hyperion/Oracle EPM 11.2.x.x. vs. Cloud
• Understanding Oracle’s Cloud strategy
• Alternative cloud migration approaches
We will share the most important considerations when making this decision and share some of our related real-world experience.
Oracle Enterprise Manager: Seven Robust Features to Put in Action - Datavail
Oracle Enterprise Manager (OEM) brings your Oracle deployments together in a single management, monitoring, and automation dashboard. Oracle developed this solution, so it offers deep integration with many of its technologies. The ease of integration, coupled with the support of both on-premise and cloud-based Oracle databases, allows it to fit into many enterprise infrastructures. Oracle Enterprise Manager can also monitor and manage non-Oracle databases, making it a cost-effective and central tool to manage IT environments with a mix of database platforms.
The single point of control is appealing for complex enterprise infrastructures, especially when they’re heavily invested in Oracle technologies. Out-of-the-box monitoring and reporting templates cover many common use cases and simplify the configuration of management automation for databases, applications, and more.
Watch the webinar to see a brief history of OEM and a deep dive into seven robust features organizations should consider implementing.
Lessons from Migrating Oracle Databases to Amazon RDS or Amazon Aurora - Datavail
Learn and leverage database migration best practices from moving off commercial Oracle databases to Amazon RDS or Aurora. We’ll cover common pitfalls, issues, the biggest differences between the engines, migration best practices, and how some of our customers have completed these migrations.
EPM 11.2: Lessons Learned and 2021 Preparedness - Datavail
As we all know, EPM 11.2 is here!
But…it was released too late in 2019 for most organizations to budget an on-premises EPM upgrade for Fiscal 2020. However, the end of support for 11.1.2.4 is also looming in 2021. If you’re staying on-premises, an upgrade to 11.2 should go live no later than December 2021 (earlier if subject to SOX controls).
Rather than waiting for the next budget cycle to roll around, this webinar will show attendees how to prepare for an upgrade this year without spending significant time and capital. We’ll also share what we’ve learned while upgrading to 11.2, and what you can expect post-install.
Optimizing Oracle Databases & Applications Gives Fast Food Giant Major Gains - Datavail
A leader in the fast-food industry began experiencing issues with database performance and financial close processes that were having major effects on the business. By implementing optimization techniques, re-architecting systems, migrating to the cloud, and properly distributing server load, this fast-food giant was able to:
Cut server lag from 24 hours to five minutes during even the most active periods
Decrease time to implement global changes to menus from one week to overnight
Speed their financial close time frame
Significantly reduce the frequency of crashes and downtime
And more!
Watch this webinar to learn HOW this was achieved with our 5S performance tuning methodology, so you can do the same in your own environment.
As an Oracle DBA manager, you have a lot to keep up with—new technologies, certifications, cloud migrations, upgrades, etc. Learn how to help your team stay ahead of technology shifts by following a few easy steps.
Delivered by DBA Managers with 15+ years of hands-on technical experience, this presentation provides proven methods these experts used to help their direct reports advance their skill sets and careers. Walk away with actionable steps you can use to elevate your DBA team’s expertise.
Upcoming Extended Support Deadlines & What They Mean for You - Datavail
Extended Support deadlines are drawing near for the technology undergirding on-premises Oracle® EPM/Hyperion systems.
Watch this on-demand webinar to learn about vendor Extended Support deadlines for Java, Oracle JRockit, Microsoft Windows Server, Microsoft SQL Server, Linux, and Oracle EPM 11.1.2.4 and prior and how they will affect your EPM/Hyperion applications. While some of these dates are a few years away, others are not and may surprise you.
Also, learn about the implications of upgrading vs. moving to the Cloud if your system is subject to Sarbanes-Oxley or similar change audit controls. If your Oracle EPM system is subject to these controls, take note of ways to avoid being red-flagged in a future year’s SOX audit.
Are you interested in running SQL on Linux but don’t know how to get started? In this presentation, we’ll share the software and hardware you need to get started. We’ll also cover these steps:
- Installing and configuring VirtualBox, Ubuntu Server, PuTTY, and SQL Server 2019 on Ubuntu
- Reviewing basic administration steps, such as starting and stopping the SQL Server services on Linux
- Backing up and restoring a database on Linux, and checking CPU usage, disk I/O, and disk space
By the end of the presentation, you will have the knowledge required to set up your own lab and continue your journey of learning SQL on Linux.
Reduce Cost by Tuning Queries on Azure DBaaS - Datavail
Poorly written queries slow down performance on on-premises servers. But in the case of Azure SQL Database, those queries not only degrade application performance but also cost money. When you tune queries in Azure SQL Database, you may benefit from reduced resource demands: your application might run at a lower compute size, and you can eventually reduce cost.
In any given SQL Server instance, there are likely 8 to 10 queries or stored procedures that are responsible for 80 to 90 percent of the server load. If you can identify these problem queries and tune them, you can make a significant impact on the overall performance of your database. This presentation will explain some simple techniques of tuning the queries and will demonstrate before and after performance differences.
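The "few queries cause most of the load" pattern above can be sketched in a few lines. This is a minimal illustration only: the query IDs and load numbers are hypothetical assumptions, and it is not tied to any particular monitoring tool or DMV; in practice you would feed it the per-query CPU or I/O figures your own tooling collects.

```python
# Minimal sketch: given per-query load measurements (e.g., cumulative
# CPU or I/O cost from your monitoring tool), pick out the small set
# of statements responsible for most of the server load.
# Query IDs and numbers are illustrative assumptions.

def top_load_queries(load_by_query, threshold=0.9):
    """Return query IDs that together account for `threshold` of total load."""
    total = sum(load_by_query.values())
    picked, running = [], 0.0
    # Walk queries from heaviest to lightest until the cumulative
    # share of load crosses the threshold.
    for query_id, cost in sorted(load_by_query.items(),
                                 key=lambda kv: kv[1], reverse=True):
        picked.append(query_id)
        running += cost
        if running / total >= threshold:
            break
    return picked

# Hypothetical load figures: two statements carry 80% of the load.
loads = {"q1": 500, "q2": 300, "q3": 100, "q4": 60, "q5": 40}
print(top_load_queries(loads))  # the heavy hitters to tune first
```

The short list this returns is where tuning effort pays off most, per the 80/90 observation above.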
MOUS 2019 - Keeping Pace with Change: Prepare for Tomorrow & Advance Your Car... - Datavail
There’s a lot to keep up with in the IT industry and especially as a DBA manager.
Oracle expert Steve Thompson takes on the topic of advancing your career while keeping pace with change.
Download the presentation to learn more about:
How to embrace changes
Determining your direction
How to focus on value and avoid distractions
Expanding skills
Mastering the cloud
Essbase On-Prem to Oracle Analytics Cloud - How, When, and Why - Datavail
In this presentation, you will get insight into the benefits of upgrading vs. moving to the cloud, scenarios and case studies from our recent years of experience, and how moving to the cloud might affect your budgeting, software updates & patches, existing investments, licensing costs, and more.
Is "Free" Good Enough for Your MySQL Environment? - Datavail
MySQL can be the perfect answer for fast-growing, highly performant, and geographically distributed database environments, but in order to function as a business-critical system with immediate response times, the ubiquitous database server needs a little help.
That’s where Continuent and Datavail come in. Combined, these two companies, which specialize in making MySQL and other databases perform continuously, have helped hundreds of enterprise, mid-market and start-up companies alike, including many in the data-dependent SaaS, e-commerce, financial services and gaming industries.
In addition, we’ll dive into why ‘managed’ database-as-a-service solutions may not be quite as self-managing as people would like to believe. You’ll hear several case studies on how clients are effectively utilizing Continuent Tungsten software and Datavail services to optimize their MySQL environments.
Critical Preflight Checks for Your EPM Applications - Datavail
The environment that houses your business-critical EPM applications is complex, maybe as complex as the cockpit of an aircraft. Just as a pilot might not be able to build or fix everything on their plane, you might be using applications but not know how to build or fix everything that’s being used. This shouldn’t stop you from doing a pre-flight check to ensure that all your Hyperion systems are running properly and set for you and your end users.
Let’s talk about some different strategies to achieve this and give you the confidence in your systems so that you can know when things are running well—or more importantly, when they need attention before takeoff.
In this presentation, we will assess the on-premises environment, determine which workloads and databases are ready to make the move, and discuss what you can do to improve their Azure readiness while reducing downtime during the migration. Planning and assessment play a critical role in moving to the cloud. We will review a wide range of resources and tools for completing an assessment with ease while identifying workload dependencies, with practical tips and tricks focusing on sizing and costs. Finally, we’ll assess the SQL instances and identify their readiness for Azure as well.
In this presentation from Kscope19, Oracle DBA Team Lead and expert Steve Thompson explores the Accidental DBA and the different ways to lead one.
The presentation explores:
What is an Accidental DBA, and what scenarios create an Accidental DBA?
Why it’s important to evaluate skill gaps, risks and benefits and plan for them.
Why companies should invest in training.
Managing an EPM platform is not for the faint of heart – and going at it without a plan can leave you frustrated, nervous, and accountable if trouble strikes. But how do you prepare?
This presentation helps you get all of your EPM planning in one place with an EPM Punch List. We’ll talk through all the areas you should be concerned about to keep your Hyperion and Oracle EPM applications running smoothly, and give you solid, actionable strategies so that you are prepared for the worst.
Why NBC Universal Migrated to MongoDB Atlas - Datavail
NBCUniversal, a worldwide mass media corporation, was looking for a more affordable and easier way to manage their database solution that hosts their extensive online digital assets. With Datavail’s assistance, NBCUniversal made the move from MongoDB 3.6 to MongoDB Atlas on AWS.
In this presentation, learn how making this move enabled the entertainment titan to reduce overhead and labor costs associated with managing its database environment.
In this presentation, we’ll explore the essential steps to get started and running SQL on Linux. Get up to speed quickly on identifying the software and hardware required plus the how-to on installation, configuration and administration for SQL on Linux.
Techniques to optimize the PageRank algorithm usually fall into two categories: reducing the work per iteration, and reducing the number of iterations. These goals are often at odds with one another. Skipping computation on vertices which have already converged can save iteration time. Skipping in-identical vertices (those with the same in-links) helps avoid duplicate computations and thus can also reduce iteration time. Road networks often have chains which can be short-circuited before PageRank computation to improve performance, since the final ranks of chain nodes are easily calculated; this can reduce both the iteration time and the number of iterations. If a graph has no dangling nodes, the PageRank of each strongly connected component can be computed in topological order. This can reduce the iteration time and the number of iterations, and also enables multi-iteration concurrency in the PageRank computation. The combination of all of the above methods is the STICD algorithm [sticd]. For dynamic graphs, unchanged components whose ranks are unaffected can be skipped altogether.
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. For more details, visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
As Europe's leading economic powerhouse and the fourth-largest #economy globally, Germany stands at the forefront of innovation and industrial might. Renowned for its precision engineering and high-tech sectors, Germany's economic structure is heavily supported by a robust service industry, accounting for approximately 68% of its GDP. This economic clout and strategic geopolitical stance position Germany as a focal point in the global cyber threat landscape.
In the face of escalating global tensions, particularly those emanating from geopolitical disputes with nations like #Russia and #China, #Germany has witnessed a significant uptick in targeted cyber operations. Our analysis indicates a marked increase in #cyberattack sophistication aimed at critical infrastructure and key industrial sectors. These attacks range from ransomware campaigns to #AdvancedPersistentThreats (#APTs), threatening national security and business integrity.
🔑 Key findings include:
🔍 Increased frequency and complexity of cyber threats.
🔍 Escalation of state-sponsored and criminally motivated cyber operations.
🔍 Active dark web exchanges of malicious tools and tactics.
Our comprehensive report delves into these challenges, using a blend of open-source and proprietary data collection techniques. By monitoring activity on critical networks and analyzing attack patterns, our team provides a detailed overview of the threats facing German entities.
This report aims to equip stakeholders across public and private sectors with the knowledge to enhance their defensive strategies, reduce exposure to cyber risks, and reinforce Germany's resilience against cyber threats.
1. Journey to Cloud Analytics: How 3 Companies’ Analytics Challenges Were Solved by Moving to the Cloud
Richmond Virtual CIO and IT Security Forum | May 25, 2021
2. Tom Hoblitzell, VP, Data Management, Datavail
Passion for solving complex global business challenges through advanced technology and leading-edge digital business intelligence
A forward-thinking strategist to drive outstanding business results
Built and successfully grew analytics practices at major systems integration firms, leading growth from start-up to mature practice with over $60 MM in revenue
Sold practices as part of strategic acquisitions (Fujitsu, Capgemini)
Acquired and integrated IT acquisitions into existing practices to increase growth and round out capabilities and competencies
Acts as a strategic advisor to key clients, enabling growth of analytics and digital transformation initiatives to leverage data as a strategic asset
Over 30 years of experience
www.datavail.com 2
3. Fill Out Our Cloud Analytics Survey
Fill out the Cloud Analytics survey for a chance to WIN a Sonos Move, a battery-powered smart speaker!
https://www.datavail.com/cloud-analytics-survey
4. Our Agenda
“Simply migrating to the cloud with a ‘lift and shift’ approach does not result in innovation, but it does add another level of complexity to operations.” - Gartner
Why cloud for analytics?
New capabilities in the cloud versus on-prem
Challenges moving analytics to the cloud
Maturity matters
Datavail’s approach
Players in the Cloud Analytics Space
Case Studies
Two major players to watch:
• Snowflake
• Redshift
5. Companies Are Investing in Cloud Analytics
“IDC Forecasts Revenues for Big Data and Business Analytics Solutions Will Reach $189.1 Billion This Year (2019) with Double-Digit Annual Growth Through 2022 (to $274.3 billion)”
[Chart: Cloud Analytics Growth - 44% of market in 2022. Cloud Analytics = 32% annual growth; On-Prem = 13% annual growth]
6. Why Cloud for Analytics? The Cloud as a Driver of Analytics Innovation
[Diagram: Analytics Innovation - Brainstorming, Ideation Visibility, Constant Refinement, Design Thinking, Incubation Focus. Source: Gartner]
7. Cloud Analytics: What Is It?
“Analytics is the process of gathering, cleansing, transforming, and modeling data with the goal of discovering useful information to support decision making.” (Source: Quantzig)
The goal of analytics is to make data accessible, useful, and actionable, which leads to digital transformation.
Cloud Analytics uses modern cloud technologies and approaches to achieve this goal with lower costs, faster scalability, and agile implementation.
8. Analytics: New Capabilities in the Cloud vs. On-Prem - Help Generate and Manage Innovation
[Diagram (Source: Gartner) contrasting on-premises and cloud analytics across the Onboarding, Prototype, Pilot, and Production stages (Ideation, Design, Incubation, Focus). On-premises: desktop or web-based training, prototypes with a limited audience, discussion, users tired of repetitive basic analytics. Cloud: attracting users with emerging capabilities, more prototypes with greater visibility, metadata-powered collaboration, focus on high-quality analytics. Enablers: sandbox, constant refinement, elasticity/automation]
9. What Organizations Want from Their Analytics
• Business needs self-service data exploration and discovery-oriented forms of advanced analytics
• Business needs data integrated into a single, trusted data store
• Want answers to any question across business processes
• Business wants both new and traditional data, thereby enabling analytics correlations across all data
• Low total cost of ownership (TCO)
• 360° view of any person or organization that touches the company
• Analytics systems should respond quickly and cheaply to changes in business conditions or acquisitions
• Scalable
• Fast response
10. But Moving to the Cloud Can Be Challenging…
Top Internal Challenges Adopting Data & Analytics in the Cloud:
• Challenges with technology infrastructure and/or architecture - 33%
• Solving risk and governance issues (security, ethics, privacy, data quality) - 32%
• Adding more agility and flexibility to our data and analytics initiatives - 29%
• Integrating multiple data sources - 29%
• Obtaining skills and capabilities needed - 27%
• Making data and analytics more usable for business consumers and front-line workers - 26%
n = 270, total respondents, excluding “don’t know”. Source: Gartner
11. BI Maturity Stages
Maturity is now critical to company competitiveness and success.
Stage 5 - Creating Market Agility and Differentiation
Stage 4 - Fostering Innovation and People Productivity
Stage 3 - Integrating Performance Management & the Business
Stage 2 - Measuring and Monitoring the Business
Stage 1 - Running the Business
12. Our Approach
We provide a consultative and advisory service entrenched in technology, people, and process.
• A fine-tuned service and flexible solution with several successful engagements under our belt
• Solves a business problem (“pain points”)
• Includes Datavail-developed accelerators
• Aims to be vendor-software agnostic
• Delivered as an “outcomes”-based solution with defined “Quick Wins”
13. Direct Efforts with a Focused Business Vision
Goal is to enable Systems of Insight to drive business value and efficiencies.
1. Data Foundation: Approach to delivering on a well-architected journey for tools and framework to drive data integration and analytics through an execution roadmap and timeline in accordance with compliance.
2. Transform Information: Raise the bar on operational excellence and corporate success by converting data into actionable insights, along with the ability to dynamically adjust to reporting requirements and compliance.
3. Modern Analytics: Guided, actionable analytics providing self-service, distributed analytics, dashboards, and future-proof scaling of data-to-information integration.
14. Players in the Cloud Analytics Space
[Logo grid of vendors by category: ETL, MDM, Cloud Providers, Database, Reporting/Analytics, Cloud & Big Data]
16. Case Studies
From small business to large enterprises, see how we’ve helped our clients gain value from their organizational data.
• Major Media Company
• Retail Company
• National Broadcasting Company
17. Case Studies
From small business to large enterprises, see how we’ve helped our clients gain value from their organizational data.
• Major Media Company
• Retail Company
• National Broadcasting Company
18. Case Study: Major Media Company
Challenge:
• The client’s IT staff was dedicated to providing custom reports based on client requirements, which required two to three dedicated resources.
• The cost of the database software license was becoming prohibitive.
• Basic problems with the existing on-prem analytics solution:
• Didn’t scale
• Costly (licenses and VM servers)
• IT bottleneck (required for each dataset developed)
• Dependence on Affinity ERP email capability (performance and file-size limitations)
• Dependency on internal staff for report customization
Solution:
• The proposed solution was to take advantage of AWS Cloud Analytics services.
• Serverless solution reduced cost (pay as you go)
• Scaled easily
• Provided self-service data visualization and data set delivery
• Automated data movement and processing
19. Media Company - New Architecture
[Architecture diagram:
1. Existing data source: SQL Server OLTP database
2. Stage data: S3 bucket with incremental data (AWS Data Pipeline, Lambda functions, Amazon CloudWatch)
3. Data marts: Amazon RDS (RDBMS), loaded with AWS Glue
4. Self-service analysis: Amazon QuickSight for internal and external users
5. Build and deliver data sets: AWS Glue writing to an S3 data set delivery bucket
6. Deliver reports: signed URL in email
7. REST API for data integration (Data Integration Services, Business Intelligence)]
20. Case Studies
From small business to large enterprises, see how we’ve helped our clients gain value from their organizational data.
• Major Media Company
• Retail Company
• National Broadcasting Company
21. Case Study: International Retail Company
Challenge:
• The existing vendor solution was not providing the reporting and analytics environment required to manage the business.
• Technology was obsolete
• Support was minimal (“keep the lights on”)
• Needed to expand from B2B to include B2C sales and operational data
• Needed to expand to include additional data sources
Solution:
• Determined that an AWS “data lake” solution would bring both structured and unstructured data into the data lake for processing to drive analytics for the business.
• Utilized an AWS Data Lab and a POC to prove the solution addressed business needs
• A support model was established so that Datavail was in a Build/Run position to provide support for the new solution - from data loads, to reporting, to governance and managing the environment
22. Existing Business Environment
• Agility for today’s and tomorrow’s business needs - cloud
• Flexibility and speed - time to deliver updates and data availability
• Proactive control of data quality
23. The Solution: Automated Data Profiling/Reporting
[Architecture diagram: CSV or other files move from on-prem into an S3 bucket; AWS Glue crawlers populate the Data Catalog; a data profiler running on EMR writes to a profiler metrics repository; analysts query the results through AWS Athena]
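The data profiler in this architecture computes per-column quality metrics over files as they land. As a dependency-free sketch of the kind of output such a profiler writes to a metrics repository (the column names and sample file are invented; the actual EMR job would compute this at scale):

```python
import csv
import io

def profile_columns(rows):
    """Compute simple per-column profiling metrics - row count, null/blank
    count, and distinct-value count - the kind of record a data profiler
    writes to a metrics repository for data-quality monitoring."""
    metrics = {}
    for row in rows:
        for col, value in row.items():
            m = metrics.setdefault(col, {"rows": 0, "nulls": 0, "values": set()})
            m["rows"] += 1
            if value is None or value == "":
                m["nulls"] += 1
            else:
                m["values"].add(value)
    return {
        col: {"rows": m["rows"], "nulls": m["nulls"], "distinct": len(m["values"])}
        for col, m in metrics.items()
    }

# Illustrative CSV payload, standing in for a file landed in the S3 bucket.
raw = "store_id,sales\nS1,100\nS2,\nS1,250\n"
rows = list(csv.DictReader(io.StringIO(raw)))
report = profile_columns(rows)
```

Tracking these metrics per load is what makes data quality "proactive": a sudden jump in null counts or distinct values on a feed is visible before bad data reaches reports.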
24. Case Studies
From small business to large enterprises, see how we’ve helped our clients gain value from their organizational data.
• Major Media Company
• Retail Company
• National Broadcasting Company
25. National Broadcasting Company
Challenges:
• The Broadcasting Company has an existing data warehouse that is not meeting users’ needs, and they want to re-engineer this warehouse to meet the users’ functional and analytical requirements
• The existing DW has obscure field names, which forces all reporting requests to go through a data scientist instead of enabling users to create their own reports
• External data is not integrated into the warehouse for trend analysis or other types of market analysis
• Improving the frequency of digital advertising data will improve and enhance fundraising campaigns and pledge drives
The existing DW environment:
• SQL Server
• Tableau and Microsoft BI for reporting
• Alteryx as the ETL tool
26. Solution Considerations
• Improve the flexibility, scalability, and overall capabilities of the warehouse to support business reporting and analytics, while providing data to the data science team to focus on analysis that is external to the Broadcasting Company
• Improve and reduce the support structure to make the solution easily supportable by the existing support team, including technical training, knowledge transfer, etc.
• Protect PCI and PII data in a secure manner
• Leverage the cloud to take advantage of potentially lower costs, assuming security can be maintained
• Provide an approach that starts with the Broadcasting Company’s digital business while extending the solution to other lines of business
27. A Modern Data Lake Architecture
[Architecture diagram - Ingest, Stage & Store, Model, Analyze, Reporting: data sources (SaaS, other data sources, prayer data, streaming data, web site data) are ingested with Azure Data Factory into Azure Data Lake, modeled in Snowflake DB, and analyzed through the Power BI Service for standard reporting and ad-hoc reporting and analysis]
29. Two Key Players to Watch and Learn From
Snowflake
• Snowflake’s Data Cloud is a Software-as-a-Service (SaaS) data platform that enables data storage, processing, and analytical solutions that are faster, easier to use, and more flexible than traditional analytics offerings
• Snowflake combines a new SQL query engine with an innovative architecture natively designed for the cloud
• Snowflake runs on the following cloud platforms: Azure, AWS, Google
• Snowflake processes queries using MPP (Massively Parallel Processing) compute clusters, with each node storing a portion of the entire data set locally, to offer the data management simplicity of a shared-disk architecture with the performance and scale-out benefits of a shared-nothing architecture
• Snowflake stores data in a columnar format, with the data accessible only through SQL query operations
AWS Redshift
• Redshift can be described as a fully managed, cloud-ready, petabyte-scale data warehouse service that can be seamlessly integrated with business intelligence (BI) tools.
• An Amazon Redshift data warehouse is an enterprise-class relational database query and management system.
• Amazon Redshift integrates with various data loading and ETL (extract, transform, and load) tools and business intelligence (BI) reporting, data mining, and analytics tools. Amazon Redshift is based on industry-standard PostgreSQL.
• Amazon Redshift supports client connections from many types of applications, including business intelligence (BI), reporting, data, and analytics tools.
• When you execute analytic queries, you are retrieving, comparing, and evaluating large amounts of data in multiple-stage operations to produce a final result.
• Amazon Redshift achieves efficient storage and optimum query performance through a combination of massively parallel processing, columnar data storage, and very efficient, targeted data compression encoding schemes.
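Both platforms cite columnar storage as a key performance lever. A toy Python sketch of why column orientation helps analytic workloads: an aggregate over one field touches only that field's contiguous values instead of every field of every row (the records and field names are invented for illustration).

```python
# Row-oriented layout: each record stores every field together, so an
# aggregate over "amount" still drags order_id and region through memory.
rows = [
    {"order_id": 1, "region": "east", "amount": 120.0},
    {"order_id": 2, "region": "west", "amount": 80.0},
    {"order_id": 3, "region": "east", "amount": 200.0},
]

# Column-oriented layout: one array per field, roughly how a columnar
# engine such as Snowflake or Redshift lays data out on disk.
columns = {key: [row[key] for row in rows] for key in rows[0]}

# The analytic aggregate now reads a single contiguous column; homogeneous
# arrays also compress far better (run-length and dictionary encoding),
# which is the "targeted data compression" Redshift describes.
total = sum(columns["amount"])
```

This is also why both engines pair columnar storage with MPP: each compute node scans its own slice of a few columns in parallel rather than whole rows.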
30. AWS Data Hub - with Snowflake
[Architecture diagram: ERP and other on-prem data sources feed a data hub built on an S3 data lake via AWS Database Migration Service, with Glue crawlers maintaining the Data Catalog and AWS Glue ETL and AWS Lambda moving data between tiers:
• Landing tier - CSV data files, delta only (some full), granular-level data, no transformations
• Analytics 1 tier - Parquet/ORC files, partitioned, coalescing partitions, optimized for analytics; views accommodate updates and deletes
• Analytics 2 tier - domain level, organized by use cases, optimized for special analysis
Machine learning algorithms (Amazon SageMaker, Amazon EMR) train and test models against these data sources. Snowflake DB (SaaS) serves as the warehouse. Information delivery: Amazon Athena, Amazon Elasticsearch Service, BI tools, and Data as a Service (data sets, 360 search, API, web apps, predictive models)]
31. AWS Data Hub - with Redshift
[Architecture diagram: identical to the Snowflake data hub on the previous slide - ERP and on-prem data sources feed the S3 data lake (landing, analytics 1, and analytics 2 tiers) via AWS Database Migration Service, AWS Glue ETL, and AWS Lambda, with machine learning on Amazon SageMaker and Amazon EMR and information delivery through Amazon Athena, Amazon Elasticsearch Service, BI tools, and Data as a Service - but with Amazon Redshift in place of Snowflake DB]
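The promotion from the landing tier (raw, granular CSV) to the analytics tiers (partitioned, analytics-optimized files) boils down to regrouping records by a partition key. In the real pipeline AWS Glue would write partitioned Parquet; the sketch below shows only the grouping idea, with invented field names.

```python
from collections import defaultdict

def partition_records(records, partition_key):
    """Group landing-tier records by a partition key, mimicking how an
    analytics tier lays data out as partitioned files so that queries
    scanning one partition can skip all the others (partition pruning)."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[rec[partition_key]].append(rec)
    return dict(partitions)

# Raw landing-tier rows: granular, untransformed.
landing = [
    {"date": "2021-05-01", "domain": "sales", "value": 10},
    {"date": "2021-05-01", "domain": "ads", "value": 7},
    {"date": "2021-05-02", "domain": "sales", "value": 12},
]

# Analytics 1 tier partitions by date; Analytics 2 might re-partition by domain.
by_date = partition_records(landing, "date")
```

Choosing the partition key per tier is the design decision the slide hints at: date-partitioned for general analytics, domain- or use-case-partitioned for the specialized second tier.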
33. Best Practices in Moving to the Cloud
• Get a clear view of your cloud strategy - and align: expected benefits of moving to the cloud, cloud data strategy, XaaS strategy, constraints, roadmap
• Assess your current state
• Use the cloud for experimentation
• Set the right migration approach based on your priorities
• Put analytics wherever the data is
• Utilize the power of the cloud to scale
• Use multiple clouds depending on your purpose
• Enable self-service analytics
34. Summary: Why Move Analytics to the Cloud?
Cost savings, agility, scalability, and solving new analytics requirements (use cases):
• “As a service” cloud - pay as you go instead of a capital outlay
• Increased scalability - think about your on-site IT infrastructure
• Faster insights
• Easier maintenance and disaster recovery
• Stronger decision making
• Can start with a small project!
35. Fill Out Our Cloud Analytics Survey
Fill out the Cloud Analytics survey for a chance to WIN a Sonos Move, a battery-powered smart speaker!
https://www.datavail.com/cloud-analytics-survey
This is not a “niche” solution – this is the future of how companies will achieve Strategic Agility and Differentiation.
Why do we get deals or credits from AWS and Azure? Because they can’t keep up with the demand – they need companies like Datavail!
Should Datavail focus on any particular area within that market?
Answer: Yes, the cloud area of Big Data and Business Analytics Solutions.
Cloud Analytics growing from 52 Billion this year to 121 Billion in 2022
The lack of innovation is not the result of laziness. People and organizations are simply too busy providing descriptive analytics to engage in in-depth thinking. Simply migrating to the cloud with a “lift and shift” approach does not result in innovation, but it does add another level of complexity to operations. The effort expended on maintaining the traditional analytics process turns into “analytics debt” for organizations, which impedes their ability to be creative and innovative. This lack of innovation ultimately costs organizations in terms of productivity in analytics, preventing them from adding value to the business. It is urgent, therefore, for organizations to explore the cloud, and inevitable that they will do so, as this is where new capabilities emerge owing to the effects of data gravity.
Cloud analytics offers new capabilities for users to generate business value through a trial-and-error-based environment. Organizations can introduce cloud analytics as a use case for users to generate more visible analytics prototypes that form the basis for innovation (see Figure 3). Data and analytics leaders need to pitch onboarding with cloud analytics as an ideation process to start analytics with the following steps.
Note: These are mostly business needs, not IT needs! IT should know this, but often doesn’t.
Basically, they want answers to business questions - when they want to ask them - not all up front in a “requirements gathering” phase. Scalable - they don’t want to wait for a procurement process. Hopefully, Modern Analytics addresses many of these needs.
Needed to rearchitect using new technologies and approaches.
What’s different about this solution? Serverless. Less delivery of data sets and more interaction, ad-hoc analyses by both internal and external users. Delivers Business Insights more than data sets. Very elegant solution!
Stan
Add bullets about flexible data ecosystem that enables them to adjust and change to meet their business objectives. Add more content and bullet points about this.
Cloud on demand
Agility of the business
Infrastructure flexibility – Cloud
Future demands
Real time reporting and inventory management
Quality of the solution – flexibility and quality
Time to delivery of changes to meet the reporting needs as they change
The solution must provide the ability to re-develop the existing process while providing a capability for managing services that process data from over 75,000 chains and 15,000 wholesalers, consolidating it with master data (Store, SAP Customer, Product, Employee pro) to create cubes in SQL Server. These cubes will be consumed by over 30 analysts/data scientists, and the resulting output (reports, dashboards) will be viewed by over 300 users and must provide functionality like what the analysts/data scientists leverage today.
Scope
The overall scope of the solution is to process data from the existing data sources and to then create data cubes for analysis. Data for processing the RBH data received from the following sources:
75000 + stores
15000+ wholesalers
The Master data that must be leveraged in the processing of the data received from these sources includes the following master data:
Store
SAP Customer
Product
Employee
There are two file specs expected for the chains; however, there are up to 5 different formats among the 30 participating chains. Some chains will submit weekly aggregated data, and the remaining chains will submit data at the daily aggregate level, but it will be submitted weekly. Some chains will provide data in Excel format, and the remaining chains will provide data in text file format.
The structure of the input data files from the store chains, wholesalers and master data will be provided by RBH or RBH will designate the service provider as the party that is authorized to communicate directly with the wholesalers and store chains to acquire the input data file structures.
The overall solution must be responsible for keeping track of receiving the input files from the wholesalers and store chains and the master data from RBH. The vendor will also be responsible for following up with the input file and master data providers in case of any delay in receiving these files. The overall solution must be authorized by RBH to communicate with the wholesalers and store chains on behalf of RBH.
The logic to perform the ETL process to transform the input files from the wholesalers and store chains into an output format that can be used by RBH analysts to execute existing reports, dashboards, or ad hoc queries will be developed for use by the existing team of analysts/data scientists, in a format similar to what they are using today. This format will be provided by RBH, as will the requirements for the overall process, including any transformation business rules that are not clear from the output format, as in the case of calculated fields. The new process must also be designed and developed to cleanse and consolidate point-of-sale data for products that are not present in the RBH master file.
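One concrete transformation implied by the mixed file specs above is normalizing submissions to a single grain: some chains send weekly aggregates while others send daily rows. A minimal sketch of the daily-to-weekly rollup, where the field names and the ISO-week convention are assumptions (the actual file structures come from RBH):

```python
import datetime
from collections import defaultdict

def to_weekly(rows):
    """Roll daily point-of-sale rows up to ISO-week totals per store, so
    daily and weekly chain submissions land at the same weekly grain
    before consolidation with master data."""
    weekly = defaultdict(float)
    for row in rows:
        day = datetime.date.fromisoformat(row["date"])
        iso_year, iso_week, _ = day.isocalendar()
        weekly[(row["store"], iso_year, iso_week)] += row["units"]
    return dict(weekly)

# Illustrative daily rows from one chain's submission.
daily = [
    {"store": "S1", "date": "2021-05-24", "units": 5.0},  # Monday of ISO week 21
    {"store": "S1", "date": "2021-05-25", "units": 3.0},
    {"store": "S2", "date": "2021-05-24", "units": 2.0},
]
weekly = to_weekly(daily)
```

Chains that already submit weekly totals would bypass this step and map straight onto the same (store, year, week) keys, which is what keeps the downstream cubes consistent across all 5 input formats.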
Stan
A little like switching from film pictures to digital pictures.
Answers the question, Who Needs Modern Analytics? Pretty much everyone.
Cloud analytics lets companies leverage the power of analytics more quickly, more powerfully, and at lower cost.
Cost Savings: Cost savings or financial benefits are one common reason for moving to the cloud. If you are only looking to move your spending from a capital expense model to an operational expense model, then your criterion is easily met by simply moving to the cloud and subscribing to “as a service” offerings.
Agility: Another major reason for moving to the cloud is agility, but this matters only if agility is important to you. For example, when on-premises capacity is sized to handle the peak jobs of the month, quarter, or year, moving resources to the cloud can allow you to right-size the on-premises infrastructure for the workloads it needs to handle most of the time and scale up to peak demand only when needed.