This document discusses the challenges of building reliable banking architectures in the cloud and how Starling Bank addressed them. It introduces key concepts such as distributed architectures, self-contained systems, and the DITTO architecture, which focuses on idempotency and eventual consistency. For Starling Bank, the benefits of this approach included safe instance termination, continuous delivery of backend changes up to five times a day via chat-ops releases, and the ability to "chaos test" the platform to ensure reliability.
What Is My Enterprise Data Maturity 2021 (DATAVERSITY)
Maturity frameworks have varying levels of Data Management maturity. Each level corresponds to not only increased data maturity but also increased organizational maturity and bottom-line ROI. There are recommended targets to achieve an effective information management program. The speaker’s maturity framework sequences the information management activities for your consideration. It is based on real client roadmaps. This webinar promises to offer a wealth of ideas for key quick wins to benefit the organization’s information management program.
Attendees can self-assess their current information management capabilities as we go through Data Strategy, organization, architecture, and technology, yielding an overall view of the current level of information management maturity.
This webinar provides a foundation for enhancing current data and analytic capabilities and updating the strategy and plans for the achievement of improved information management maturity, aligned with major initiatives.
Advanced Analytics: Analytic Platforms Should Be Columnar Orientation (DATAVERSITY)
A columnar database is an implementation of the relational theory, but with a twist. The data storage layer does not contain records. It contains a grouping of columns.
Due to the variable column lengths within a row, a small column with low cardinality, or variability of values, may reside completely within one block while another column with high cardinality and longer length may take a thousand blocks. In columnar, all the same data — your data — is there. It’s just organized differently (automatically, by the DBMS).
The main reason why you would want to utilize a columnar approach is simply to speed up the native performance of analytic queries.
Learn about the columnar orientation and how it can be effective for your needs. It is the native orientation of many databases, and several others offer optional column-oriented storage layers.
There is also an equivalent in the cloud storage world: the open Parquet format.
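As a rough illustration of that difference in organization, here is a minimal sketch in plain Python (not any particular DBMS): the same table held row-wise and column-wise, and a typical analytic aggregate that, in the columnar layout, only needs to touch the one column it uses.

```python
# Illustrative sketch: the same table stored row-oriented vs.
# column-oriented, and a typical analytic query over it.

rows = [  # row store: each record kept together
    {"id": 1, "region": "EU", "amount": 120.0},
    {"id": 2, "region": "US", "amount": 75.5},
    {"id": 3, "region": "EU", "amount": 42.0},
]

columns = {  # column store: each column kept together
    "id":     [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "amount": [120.0, 75.5, 42.0],
}

# Analytic query: SUM(amount). A row store must touch every whole
# record; a column store reads only the 'amount' column's blocks.
total_row_store = sum(r["amount"] for r in rows)
total_col_store = sum(columns["amount"])

assert total_row_store == total_col_store == 237.5
```

The values are the same either way; only the physical grouping differs, which is exactly why the DBMS can reorganize the data automatically without changing what your queries return.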
TeraStream - Data Integration/Migration/ETL/Batch Tool (DataStreams)
TeraStream™ is leading the Korean data migration and ETL market. Take a look at the powerful performance, features, and user conveniences of TeraStream™.
Slides: Moving from a Relational Model to NoSQL (DATAVERSITY)
Businesses are quickly moving to NoSQL databases to power their modern applications. However, a technology migration involves risk, especially if you have to change your data model. What if you could host a relatively unmodified RDBMS schema on your NoSQL database, then optimize it over time?
We’ll show you how Couchbase makes it easy to:
• Use SQL for JSON to query your data and create joins
• Optimize indexes and perform HashMap queries
• Build applications and analysis with NoSQL
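The idea of hosting a relatively unmodified RDBMS schema as JSON documents and optimizing it over time can be sketched as follows. This is a hedged, plain-Python illustration only; the table names, keys, and the `table::id` document-key convention are hypothetical examples, not a required Couchbase layout.

```python
import json

# Hypothetical relational rows: a 'customers' table and an 'orders'
# table related by customer_id, as exported from an RDBMS.
customer_row = {"customer_id": 42, "name": "Acme Ltd", "city": "Leeds"}
order_rows = [
    {"order_id": 1001, "customer_id": 42, "total": 250.0},
    {"order_id": 1002, "customer_id": 42, "total": 99.9},
]

# Step 1: host the schema "relatively unmodified" -- one JSON
# document per row, keyed by table name plus primary key.
docs = {f"customer::{customer_row['customer_id']}": customer_row}
for o in order_rows:
    docs[f"order::{o['order_id']}"] = o

# Step 2 (a later optimization): denormalize -- embed the orders
# inside the customer document so a common read needs no join.
customer_doc = dict(customer_row, orders=order_rows)

print(json.dumps(customer_doc, indent=2))
```

Starting from step 1 keeps the migration low-risk, since the document shapes mirror the rows the application already understands; step 2 is the kind of incremental remodeling the abstract alludes to.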
Slides: Migrate BI Dashboards to Run Directly on a Cloud Data Lake in Five Ea... (DATAVERSITY)
While BI dashboards are great at democratizing analytics in organizations, the architecture that traditionally powers them has hidden consequences that have serious impacts on the business.
This architecture is based on a 30-year-old paradigm that requires many different systems, ETL jobs, and copies of data in data marts, data warehouses, and BI extracts. One downside of many is that it takes many days if not weeks to answer a different business question with this architecture. The negative consequences are further multiplied by the tens, hundreds, or even thousands of dashboards needed to run a data-driven organization.
Now, there’s a straightforward way to overcome these challenges that many organizations are already taking advantage of: an open cloud data lake architecture and Dremio.
Join Jason Hughes, Technical Director at Dremio, for this webinar to learn how you can migrate BI dashboards to Dremio to quickly provide interactive dashboards to data consumers without the issues of the traditional architecture — and finally deliver the benefits always promised by BI.
What you’ll learn:
• Why BI dashboards’ traditional architecture implemented at scale causes many issues, which hinder the very insights it promises.
• How a Dremio-powered cloud data lake architecture eliminates or mitigates the negative consequences of the traditional approach.
• Step-by-step instructions for migrating a BI dashboard to run directly on a cloud data lake, both a self-contained example and your own dashboards.
Webinar: Decoding the Mystery - How to Know if You Need a Data Catalog, a Dat... (DATAVERSITY)
There’s a lot of confusion out there about the differences between a data catalog, a data dictionary, and a business glossary, and it's not always easy to understand who needs which and why. Join Malcolm Chisholm, Ph.D., President of Data Millennium, and Amichai Fenner, Product Lead at Octopai, as they help decode the mystery. Spoiler alert: one of these enables collaboration across BI and IT. Which is it?
Speed Matters - Intelligent Strategies to Accelerate Data-Driven Decisions (DATAVERSITY)
COVID-19 has shown us the importance of data in being able to quickly make decisions when market variables are out of our control. In order to accelerate and harness the process, an organization needs an agile approach to data integration and analytics that avoids the limitations of predefined schemas and data models.
Learn from 451 Research, now part of S&P Global Market Intelligence, a leading global IT research and advisory firm, and Qlik about best practices that can help you accelerate the data to decision path with agility. You’ll understand how to:
-Rethink traditional assumptions about data management and analytic roles and technologies
-Recognize trends that drive the demand to reduce the time required to investigate, analyze and take action on business data.
See a new state of business intelligence, where the data pipeline is optimized to enable organizations to make decisions and act in real time. Seeking alternatives to the traditional approaches to become more agile in today’s evolving market and economy? Then don’t miss this presentation!
Because every organization produces and propagates data as part of their day-to-day operations, data trends are becoming more and more important in the mainstream business world’s consciousness. For many organizations in various industries, though, comprehension of this development begins and ends with buzzwords: “big data,” “NoSQL,” “data scientist,” and so on. Few realize that any and all solutions to their business problems, regardless of platform or relevant technology, rely to a critical extent on the data model supporting them. As such, Data Modeling is not an optional task for an organization’s data effort, but rather a vital activity that facilitates the solutions driving your business. Since quality engineering/architecture work products do not happen accidentally, the more your organization depends on automation, the more important the data models driving the engineering and architecture activities of your organization become. This webinar illustrates Data Modeling as a key activity upon which so much technology depends.
DAS Slides: Graph Databases — Practical Use Cases (DATAVERSITY)
Graph databases are seeing a spike in popularity as their value in leveraging large data sets for key areas such as fraud detection, marketing, and network optimization becomes increasingly apparent. With graph databases, it’s been said that “the data model and the metadata are the database.” What does this mean in a practical application, and how can this technology be optimized for maximum business value?
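As a minimal illustration of why that matters for a use case like fraud detection, here is a plain-Python sketch (not a real graph DBMS) of the kind of relationship traversal a graph database makes natural: finding every account connected to a flagged account through shared devices or addresses. All node names and edges are invented for the example.

```python
from collections import deque

# Toy property graph as adjacency lists: accounts are linked when
# they share a device or a mailing address.
edges = {
    "acct:A":   ["device:1"],
    "acct:B":   ["device:1", "addr:9"],
    "acct:C":   ["addr:9"],
    "acct:D":   ["device:7"],
    "device:1": ["acct:A", "acct:B"],
    "addr:9":   ["acct:B", "acct:C"],
    "device:7": ["acct:D"],
}

def connected_accounts(start):
    """Breadth-first traversal from `start`, returning only the
    account nodes reached (a candidate fraud ring)."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(n for n in seen if n.startswith("acct:"))

print(connected_accounts("acct:A"))  # ['acct:A', 'acct:B', 'acct:C']
```

In a graph database this traversal is a first-class query rather than a chain of self-joins, which is the practical sense in which the model itself is the database.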
Platforming the Major Analytic Use Cases for Modern Engineering (DATAVERSITY)
We’ll describe a broad range of modern use cases that need a platform, along with some popular, viable technology stacks that enterprises use to accomplish them: customer churn, predictive analytics, fraud detection, and supply chain management.
In many industries, to achieve top-line growth, it is imperative that companies get the most out of existing customer relationships. Customer churn use cases are about generating high levels of profitable customer satisfaction through the use of knowledge generated from corporate and external data to help drive a more positive customer experience (CX).
Many organizations are turning to predictive analytics to increase their bottom line and efficiency and, therefore, competitive advantage. It can make the difference between business success and failure.
Fraudulent activity detection is exponentially more effective when risk actions are taken immediately (i.e., stopping the fraudulent transaction), rather than after the fact. Fast digestion of a wide range of risk exposures across the network is required to minimize adverse outcomes.
Supply chain leaders are under constant pressure to reduce overall supply chain management (SCM) costs while maintaining a flexible and diverse supplier ecosystem. They will leverage IoT, sensors, cameras, and blockchain. Major investments in advanced analytics, warehouse relocation, and automation, both in distribution centers and stores, will be essential for survival.
ADV Slides: The World in 2045 – What Has Artificial Intelligence Created? (DATAVERSITY)
How will technology and society change in the next 25 years? We have been discussing how technology has evolved in the last few years; in this episode, we look forward to the next 25 years.
The year 2045 may seem far away, but we already have predictions about the technological innovations prevalent in 2045. Hint: Artificial intelligence will have a huge impact.
Slides: How AI Makes Analytics More Human (DATAVERSITY)
People think AI makes analytics less human, replacing human decision making. But the truth is, AI actually makes analytics more human. Augmented analytics are helping organizations finally break through the low levels of adoption and limitations typical of 2nd generation visualization tools.
Most business problems cannot be solved purely by algorithms or machine learning — they require human interaction and perspective. Uniting precedent-based machine learning systems with natural human intuition and curiosity is the foundation of 3rd generation BI and democratizing data across an enterprise.
It is a natural flow to enhance your data eco-system by deploying a platform with augmented intelligence to work alongside users in the pursuit of surfacing new insights, automating tasks, and supporting natural language interaction. All work as accelerators for achieving active intelligence and Data Literacy.
It is a fascinating, explosive time for enterprise analytics.
It is from the position of analytics leadership that the mission will be executed and company leadership will emerge. The data professional is absolutely sitting on the performance of the company in this information economy and has an obligation to demonstrate the possibilities and originate the architecture, data, and projects that will deliver analytics. After all, no matter what business you’re in, you’re in the business of analytics.
The coming years will be full of big changes in enterprise analytics and Data Architecture. William will kick off the fourth year of the Advanced Analytics series with a discussion of the trends winning organizations should build into their plans, expectations, vision, and awareness now.
Too often I hear the question “Can you help me with our Data Strategy?” Unfortunately, for most, this is the wrong request because it focuses on the least valuable component – the Data Strategy itself. A more useful request is this: “Can you help me apply data strategically?” Yes, at early maturity phases the process of developing strategic thinking about data is more important than the actual product! Trying to write a good (much less perfect) Data Strategy on the first attempt is generally not productive – particularly given the widespread acceptance of Mike Tyson’s truism: “Everybody has a plan until they get punched in the face.” Refocus on learning how to iteratively improve the way data is strategically applied. This will permit data-based strategy components to keep up with agile, evolving organizational strategies. This approach can also contribute to three primary organizational data goals.
In this webinar, you will learn how improving your organization’s data, the way your people use data, and the way your people use data to achieve your organizational strategy will help in ways never imagined. Data are your sole non-depletable, non-degradable, durable strategic assets, and they are pervasively shared across every organizational area. Addressing existing challenges programmatically includes overcoming necessary but insufficient prerequisites and developing a disciplined, repeatable means of improving business objectives. This process (based on the theory of constraints) is where the strategic data work really occurs, as organizations identify prioritized areas where better assets, literacy, and support (Data Strategy components) can help an organization better achieve specific strategic objectives. Then the process becomes lather, rinse, and repeat. Several complementary concepts are also covered, including:
- A cohesive argument for why Data Strategy is necessary for effective Data Governance
- An overview of prerequisites for effective strategic use of Data Strategy, as well as common pitfalls
- A repeatable process for identifying and removing data constraints
- The importance of balancing business operation and innovation
ADV Slides: Modern Analytic Data Architecture Maturity Modeling (DATAVERSITY)
Maturity frameworks have varying levels of Data Management maturity. Each level corresponds to not only increased data maturity, but also increased organizational maturity and bottom-line ROI. There are recommended targets to achieve an effective information management program. The speaker’s maturity framework sequences the information management activities for your consideration. It is based on real client roadmaps. This webinar promises to offer a wealth of ideas for key quick wins to benefit the organization’s information management program.
Attendees can self-assess their current information management capabilities as we go through data strategy, organization, architecture, and technology, yielding an overall view of the current level of information management maturity.
This webinar provides a foundation for enhancing current data and analytic capabilities and updating the strategy and plans for achievement of improved information management maturity, aligned with major initiatives.
Measuring Data Quality Return on Investment (DATAVERSITY)
Data Quality is an elusive subject that can defy measurement and yet be critical enough to derail any project, strategic initiative, or even a company. The data layer of an organization is a critical component because it is so easy to ignore the quality of that data or to make overly optimistic assumptions about its efficacy. Having Data Quality as a focus is a business philosophy that aligns strategy, business culture, company information, and technology in order to manage data to the benefit of the enterprise. It is a competitive strategy.
Information management plays a critical role in supporting strategic business initiatives. Despite the apparent value of providing the data infrastructure for these initiatives, many executives question the economic feasibility of business intelligence and analytics. This requires information professionals to calculate and present the business value in terms business executives can understand.
Unfortunately, most IT professionals lack the knowledge required to develop comprehensive cost-benefit analyses and return on investment (ROI) measurements.
This session provides a framework to help IT professionals research, measure, and present the economic value of a proposed or existing information initiative. The session will provide practical advice about how to calculate ROI, which formula to use, and how to collect the necessary information.
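For reference, the basic ROI formula such a session typically builds on can be sketched as follows; the figures here are purely hypothetical placeholders, not numbers from the webinar.

```python
def roi(total_benefits, total_costs):
    """Standard ROI formula: net benefit over cost, as a percentage."""
    return (total_benefits - total_costs) / total_costs * 100

# Hypothetical data quality initiative: benefits from fewer reworked
# reports and avoided penalties vs. tooling and staffing costs.
benefits = 480_000.0   # estimated annual benefit
costs = 300_000.0      # annual cost of the initiative

print(f"ROI: {roi(benefits, costs):.0f}%")  # (480k - 300k) / 300k = 60%
```

The hard part in practice is not the arithmetic but defensibly quantifying the benefit side, which is exactly what the session's framework is meant to help with.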
DataEd Slides: Leveraging Data Management Technologies (DATAVERSITY)
Our architecturally solid stool requires three legs: people, process, and technologies. This webinar looks at the most misunderstood of these three components: technology. While most organizations begin with technologies, it turns out that technologies are the last component that should be considered. This webinar will survey a range of technologies that can be used to increase the productivity of Data Management efforts. The goal is to invest in as little infrastructure as possible while still achieving business/program objectives. This program’s learning objectives include:
• Understanding technology considerations
• Appreciating the overview of data technologies, and then specifically:
  • CASE technologies
  • Repositories
  • Profiling/discovery tools
  • Data Quality engineering tools
• Appreciating the complete Data Quality life cycle
ADV Slides: 2021 Trends in Enterprise Analytics (DATAVERSITY)
It is a fascinating, explosive time for enterprise analytics.
It is from the position of analytics leadership that the mission will be executed, and company leadership will emerge. The data professional is absolutely sitting on the performance of the company in this information economy and has an obligation to demonstrate the possibilities and originate the architecture, data, and projects that will deliver analytics. After all, no matter what business you’re in, you’re in the business of analytics.
The coming years will be full of big changes in enterprise analytics and Data Architecture. William will kick off the third year of the Advanced Analytics series with a discussion of the trends winning organizations should build into their plans, expectations, vision, and awareness now.
Presentation at Data Innovation Summit 2021. Trusted, well-managed data is key to AI and machine learning success. Data citizens need data insights, and data scientists need to spend more time building models. Everyone wants to spend less time finding, discovering, and munging data and ensuring data quality, and more time delivering business results. However, traditional data approaches lock data away, and slow AI implementation leaves much of this work on the data practitioner’s shoulders. This session will cover how AI is also helping solve these problems. New data tools that combine automation with human expertise are enabling data and knowledge sharing (including new data classes like IoT data), data democratization, and cloud migration. AI-driven data enablement ensures everyone can find the right data and make intelligent use of it. Join us for a lively discussion on the most critical resource for AI: your data.
ADV Slides: How to Improve Your Analytic Data Architecture Maturity (DATAVERSITY)
Many organizations are immature when it comes to data use. The answer lies in delivering a greater level of insight from data, straight to the point of need. Enter: machine learning.
In this webinar, William will look at categories of organizational response to the challenge across strategy, architecture, modeling, processes, and ethics. Machine learning maturity levels tend to move in harmony across these categories. As a general principle of maturity models, you can’t skip levels in any category, nor can you advance in one category well beyond the others.
Vis-à-vis ML, attaining and retaining momentum up the model is paramount for success. You will ascend the model through concerted efforts delivering business wins utilizing progressive elements of the model, and thereby increasing your machine learning maturity. The model will evolve. No plateaus are comfortable for long.
With ML maturity markers, sequencing, and tactics, this webinar provides a plan for how to build analytic Data Architecture maturity in your organization.
Case Manager for Content Management - A Customer's Perspective (The Dayhuff Group)
Motorists Mutual Insurance and Dayhuff Group share best practices and lessons learned from the Case Manager implementation at Motorists that is finally allowing the customer to realize the promise of Content Management.
ADV Slides: The Evolution of the Data Platform and What It Means to Enterpris... (DATAVERSITY)
Thirty years is a long time for a technology foundation to be as active as relational databases. Are their replacements here?
In this webinar, we look at this foundational technology for modern Data Management and show how it evolved to meet the workloads of today, as well as when other platforms make sense for enterprise data.
Slides: Enterprise Architecture vs. Data Architecture (DATAVERSITY)
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how data architecture is a key component of an overall enterprise architecture for enhanced business value and success.
Estimating the Total Costs of Your Cloud Analytics Platform (DATAVERSITY)
Organizations today need a broad set of enterprise data cloud services with key data functionality to modernize applications and utilize machine learning. They need a platform designed to address multi-faceted needs by offering multi-function Data Management and analytics to solve the enterprise’s most pressing data and analytic challenges in a streamlined fashion. They need a worry-free experience with the architecture and its components.
As digital channels continue to grow, they drive greater diversity in our data landscape. At Yorkshire Building Society, our purpose is to provide real help with real life and this relies on data from a myriad of sources. This diversity creates a need for points of intersection, where data can unite to feed customer and business insights. How do we create these hubs of intersection and what can modern technology offer?
Speaker:
Mark Walters
Lead Enterprise Data Architect for Data & Information
Yorkshire Building Society
How can Insurers Accelerate Digital Transformation with Data Virtualization (... (Denodo)
Watch full webinar here: https://bit.ly/2Qpwqo9
Insurers globally are accelerating their digital journey, making rapid strides with their digitisation efforts, and adding key capabilities to adapt and innovate in the new normal. However, many insurance organisations find this transformation challenging as they rely on established systems that are often not only poorly integrated, but also highly resistant to modernisation without downtime. Hear how peers in your industry are leveraging data virtualization, which facilitates digital transformation via a modern data integration / data delivery approach, to gain greater agility, flexibility, and efficiency.
In this session, you will learn:
- Industry key trends and challenges driving the digital transformation mandate
- What is data virtualization, use cases, and how it can enable insurers to develop critical capabilities
- Lessons from success stories of insurers who already use data virtualization to differentiate themselves from the competition, have a single view of all their data and a way to establish security controls across the entire infrastructure
Software Architecture as Systems Dissolve (Eoin Woods)
The way we build systems is changing. From our history of monolithic systems, then distributed systems, to Internet connected systems, we are now entering the era of cloud-hosted, microservice based, pay-for-usage system development. What does the history of software architecture tell us about the challenges of this new environment? And how does our approach to software architecture need to evolve in order to meet them?
Software architecture has been a mainstream discipline since the 1990s and in that time has become a recognised, widely researched, and often valued part of the software engineering process. However, architecture approaches must reflect the technologies and priorities of the systems we are building, and in this regard its future has never looked more uncertain or more exciting. From our history of monolithic compile-time architecture, to many-tiered distributed systems, to Internet-connected services, we are now entering the era of cloud-hosted, microservice-based, pay-for-usage systems development. In this new world the boundaries of “my” system are no longer so clear, and our systems are dissolving into complex webs of independently owned and evolved services, with nothing more in common than a shared credit card for billing and an agreement on the format of network requests. What can the history of software architecture tell us about the likely challenges in this environment? And how must it develop in order to meet them?
This version of the talk was presented at GOTO London in October 2016.
Minimum Viable Architecture - Good Enough is Good EnoughRandy Shoup
The “right” architecture and organization depends on the size and scale of your company. The only constant is change, and what works for 5 engineers does not work for 5000. Based upon lessons from Google and eBay, learn how to evolve both technology and organization together successfully.
This presentation is based on many hard-won lessons by the speaker, who led large-scale engineering teams at Google and eBay, but also co-founded a tiny startup and tried (unsuccessfully) to apply the same techniques. This session hopes to keep others from making the same mistakes by introducing the concept of a “Minimal Viable Architecture”. It outlines the common architectural evolution of a company or project through the search, execution, and scaling phases, and discusses the appropriate technologies, disciplines, and organizational structures at each phase. You'll start with a monolith and end up with microservices, and that's completely and entirely appropriate.
DAS Slides: Graph Databases — Practical Use CasesDATAVERSITY
Graph databases are seeing a spike in popularity as their value in leveraging large data sets for key areas such as fraud detection, marketing, and network optimization becomes increasingly apparent. With graph databases, it’s been said that ‘the data model and the metadata are the database’. What does this mean in a practical application, and how can this technology be optimized for maximum business value?
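As a concrete illustration of that practical application, the kind of relationship traversal a graph database excels at can be sketched in plain Python over an adjacency list. The fraud-ring example and all identifiers below are hypothetical; a real deployment would use a graph DBMS and its query language rather than hand-rolled traversal.

```python
from collections import defaultdict

# Toy property graph: edges link accounts to shared identifiers
# (phone numbers, devices). In a graph database this traversal is
# a short query; here it is spelled out for illustration.
edges = [
    ("acct:A", "phone:555-0101"),
    ("acct:B", "phone:555-0101"),
    ("acct:B", "device:X1"),
    ("acct:C", "device:X1"),
    ("acct:D", "phone:555-0199"),
]

graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

def connected_accounts(start, max_hops=4):
    """Breadth-first traversal: accounts reachable from `start`
    through shared identifiers -- a candidate fraud ring."""
    seen, frontier = {start}, [start]
    for _ in range(max_hops):
        frontier = [n for node in frontier for n in graph[node] if n not in seen]
        seen.update(frontier)
    return sorted(n for n in seen if n.startswith("acct:"))

print(connected_accounts("acct:A"))  # A, B, and C share identifiers; D does not
```

The point of the sketch is that "the data model is the database": the answer falls out of the relationships themselves, with no join tables in sight.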
Platforming the Major Analytic Use Cases for Modern EngineeringDATAVERSITY
We’ll describe several examples from the broad range of modern use cases that need a platform, and outline some popular, proven technology stacks that enterprises use to deliver them: customer churn, predictive analytics, fraud detection, and supply chain management.
In many industries, to achieve top-line growth, it is imperative that companies get the most out of existing customer relationships. Customer churn use cases are about generating high levels of profitable customer satisfaction through the use of knowledge generated from corporate and external data to help drive a more positive customer experience (CX).
Many organizations are turning to predictive analytics to increase their bottom line and efficiency and, therefore, competitive advantage. It can make the difference between business success or failure.
Fraudulent activity detection is exponentially more effective when risk actions are taken immediately (i.e., stop the fraudulent transaction), instead of after the fact. Fast digestion of a wide network of risk exposures across the network is required in order to minimize adverse outcomes.
Supply chain leaders are under constant pressure to reduce overall supply chain management (SCM) costs while maintaining a flexible and diverse supplier ecosystem. They will leverage IoT, sensors, cameras, and blockchain. Major investments in advanced analytics, warehouse relocation, and automation, both in distribution centers and stores, will be essential for survival.
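To make the fraud use case above concrete, here is a minimal sketch of scoring transactions as they arrive so that risky ones can be stopped in-flight rather than flagged after the fact. The rules, thresholds, and field names are invented for illustration; a production system would combine many more signals and trained models.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Txn:
    account: str
    amount: float
    country: str

class FraudScreen:
    """Score each transaction as it arrives so risky ones can be
    blocked immediately, instead of surfacing in a nightly batch."""
    def __init__(self, window=5):
        self.recent = {}          # account -> recent transactions
        self.window = window

    def check(self, txn):
        hist = self.recent.setdefault(txn.account, deque(maxlen=self.window))
        risk = 0
        if txn.amount > 10 * max((t.amount for t in hist), default=txn.amount):
            risk += 1             # sudden spike versus recent history
        if any(t.country != txn.country for t in hist):
            risk += 1             # rapid change of country
        hist.append(txn)
        return "block" if risk >= 2 else "allow"

screen = FraudScreen()
screen.check(Txn("a1", 20.0, "GB"))
screen.check(Txn("a1", 35.0, "GB"))
print(screen.check(Txn("a1", 900.0, "RU")))  # spike + new country -> "block"
```

The sliding window per account is the "fast digestion" piece: the decision uses only recent in-memory state, so it can run in the transaction path.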
ADV Slides: The World in 2045 – What Has Artificial Intelligence Created?DATAVERSITY
How will technology and society change in the next 25 years? We have been discussing how technology has evolved in the last few years; in this episode, we look forward to the next 25 years.
The year 2045 may seem far away, but we already have predictions about the technological innovations prevalent in 2045. Hint: Artificial intelligence will have a huge impact.
Slides: How AI Makes Analytics More HumanDATAVERSITY
People think AI makes analytics less human, replacing human decision making. But the truth is, AI actually makes analytics more human. Augmented analytics are helping organizations finally break through the low levels of adoption and limitations typical of 2nd generation visualization tools.
Most business problems cannot be solved purely by algorithms or machine learning — they require human interaction and perspective. Uniting precedent-based machine learning systems with natural human intuition and curiosity is the foundation of 3rd generation BI and democratizing data across an enterprise.
It is a natural flow to enhance your data eco-system by deploying a platform with augmented intelligence to work alongside users in the pursuit of surfacing new insights, automating tasks, and supporting natural language interaction. All work as accelerators for achieving active intelligence and Data Literacy.
It is a fascinating, explosive time for enterprise analytics.
It is from the position of analytics leadership that the mission will be executed and company leadership will emerge. The data professional is absolutely sitting on the performance of the company in this information economy and has an obligation to demonstrate the possibilities and originate the architecture, data, and projects that will deliver analytics. After all, no matter what business you’re in, you’re in the business of analytics.
The coming years will be full of big changes in enterprise analytics and Data Architecture. William will kick off the fourth year of the Advanced Analytics series with a discussion of the trends winning organizations should build into their plans, expectations, vision, and awareness now.
Too often I hear the question “Can you help me with our Data Strategy?” Unfortunately, for most, this is the wrong request because it focuses on the least valuable component – the Data Strategy itself. A more useful request is this: “Can you help me apply data strategically?” Yes, at early maturity phases the process of developing strategic thinking about data is more important than the actual product! Trying to write a good (much less a perfect) Data Strategy on the first attempt is generally not productive – particularly given the widespread acceptance of Mike Tyson’s truism: “Everybody has a plan until they get punched in the face.” Refocus on learning how to iteratively improve the way data is strategically applied. This will permit data-based strategy components to keep up with agile, evolving organizational strategies. This approach can also contribute to three primary organizational data goals.
In this webinar, you will learn how improving your organization’s data, the way your people use data, and the way your people use data to achieve your organizational strategy will help in ways never imagined. Data are your sole non-depletable, non-degradable, durable strategic assets, and they are pervasively shared across every organizational area. Addressing existing challenges programmatically includes overcoming necessary but insufficient prerequisites and developing a disciplined, repeatable means of improving business objectives. This process (based on the theory of constraints) is where the strategic data work really occurs, as organizations identify prioritized areas where better assets, literacy, and support (Data Strategy components) can help an organization better achieve specific strategic objectives. Then the process becomes lather, rinse, and repeat. Several complementary concepts are also covered, including:
- A cohesive argument for why Data Strategy is necessary for effective Data Governance
- An overview of prerequisites for effective strategic use of Data Strategy, as well as common pitfalls
- A repeatable process for identifying and removing data constraints
- The importance of balancing business operation and innovation
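The constraint-removal process above follows the theory of constraints: find the component that most limits the strategic use of data, improve it, re-measure, and repeat. As a caricature only, with invented component names and maturity scores, the loop looks like this:

```python
# Caricature of the theory-of-constraints loop applied to Data
# Strategy: repeatedly invest in the weakest component rather than
# polishing a strategy document up front. Scores are hypothetical.
def improve_iteratively(capabilities, rounds=4):
    for _ in range(rounds):
        constraint = min(capabilities, key=capabilities.get)
        capabilities[constraint] += 1   # targeted improvement effort
        # ...then re-measure against strategic objectives and repeat
    return capabilities

# Hypothetical maturity scores (1-5) for Data Strategy components.
print(improve_iteratively({"assets": 2, "literacy": 1, "support": 3}))
```

Lather, rinse, and repeat: each pass lifts whichever of assets, literacy, or support is currently the binding constraint.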
ADV Slides: Modern Analytic Data Architecture Maturity ModelingDATAVERSITY
Maturity frameworks have varying levels of Data Management maturity. Each level corresponds to not only increased data maturity, but also increased organizational maturity and bottom-line ROI. There are recommended targets to achieve an effective information management program. The speaker’s maturity framework sequences the information management activities for your consideration. It is based on real client roadmaps. This webinar promises to offer a wealth of ideas for key quick wins to benefit the organization’s information management program.
Attendees can self-assess their current information management capabilities as we go through data strategy, organization, architecture, and technology, yielding an overall view of the current level of information management maturity.
This webinar provides a foundation for enhancing current data and analytic capabilities and updating the strategy and plans for achievement of improved information management maturity, aligned with major initiatives.
Measuring Data Quality Return on InvestmentDATAVERSITY
Data Quality is an elusive subject that can defy measurement and yet be critical enough to derail any project, strategic initiative, or even a company. The data layer of an organization is a critical component because it is so easy to ignore the quality of that data or to make overly optimistic assumptions about its efficacy. Having Data Quality as a focus is a business philosophy that aligns strategy, business culture, company information, and technology in order to manage data to the benefit of the enterprise. It is a competitive strategy.
Information management plays a critical role in supporting strategic business initiatives. Despite the apparent value of providing the data infrastructure for these initiatives, many executives question the economic feasibility of business intelligence and analytics. This requires information professionals to calculate and present the business value in terms business executives can understand.
Unfortunately, most IT professionals lack the knowledge required to develop comprehensive cost-benefit analyses and return on investment (ROI) measurements.
This session provides a framework to help IT professionals research, measure, and present the economic value of a proposed or existing information initiative. The session will provide practical advice about how to calculate ROI, which formula to use, and how to collect the necessary information.
DataEd Slides: Leveraging Data Management TechnologiesDATAVERSITY
Our architecturally solid stool requires three legs: people, process, and technologies. This webinar looks at the most misunderstood of these three components: technology. While most organizations begin with technologies, it turns out that technologies are the last component that should be considered. This webinar will survey a range of technologies that can be used to increase the productivity of Data Management efforts. The goal is to invest in as little infrastructure as possible while still achieving business/program objectives. This program’s learning objectives include:
• Understanding technology considerations
• Appreciating the overview of data technologies and then specifically
• CASE technologies
• Repositories
• Profiling/discovery tools
• Data Quality engineering tools
• Appreciating the complete Data Quality life cycle
ADV Slides: 2021 Trends in Enterprise AnalyticsDATAVERSITY
It is a fascinating, explosive time for enterprise analytics.
It is from the position of analytics leadership that the mission will be executed, and company leadership will emerge. The data professional is absolutely sitting on the performance of the company in this information economy and has an obligation to demonstrate the possibilities and originate the architecture, data, and projects that will deliver analytics. After all, no matter what business you’re in, you’re in the business of analytics.
The coming years will be full of big changes in enterprise analytics and Data Architecture. William will kick off the third year of the Advanced Analytics series with a discussion of the trends winning organizations should build into their plans, expectations, vision, and awareness now.
Presentation at Data Innovation Summit 2021. Trusted, well-managed data is key to AI and machine learning success. Data citizens need data insights, and data scientists need to spend more time building models. Everyone wants to spend less time finding, discovering, and munging data and ensuring data quality to deliver business results. However, traditional data approaches lock data away, and slow AI implementation leaves much of this work on the data practitioner’s shoulders. This session will cover how AI is also helping solve these problems. New data tools that combine automation with human expertise are enabling data and knowledge sharing (including new data classes like IoT data), data democratization, and cloud migration. AI-driven data enablement ensures everyone can find the right data and make intelligent use of it. Join us for a lively discussion on the most critical resource for AI: your data.
ADV Slides: How to Improve Your Analytic Data Architecture MaturityDATAVERSITY
Many organizations are immature when it comes to data use. The answer lies in delivering a greater level of insight from data, straight to the point of need. Enter: machine learning.
In this webinar, William will look at categories of organizational response to the challenge across strategy, architecture, modeling, processes, and ethics. Machine learning maturity levels tend to move in harmony across these categories. As a general principle of maturity models, you can’t skip levels in any category, nor can you advance in one category well beyond the others.
Vis-à-vis ML, attaining and retaining momentum up the model is paramount for success. You will ascend the model through concerted efforts delivering business wins utilizing progressive elements of the model, and thereby increasing your machine learning maturity. The model will evolve. No plateaus are comfortable for long.
With ML maturity markers, sequencing, and tactics, this webinar provides a plan for how to build analytic Data Architecture maturity in your organization.
Case Manager for Content Management - A Customer's PerspectiveThe Dayhuff Group
Motorists Mutual Insurance and Dayhuff Group share best practices and lessons learned from the Case Manager implementation at Motorists that is finally allowing the customer to realize the promise of Content Management.
ADV Slides: The Evolution of the Data Platform and What It Means to Enterpris...DATAVERSITY
Thirty years is a long time for a technology foundation to be as active as relational databases. Are their replacements here?
In this webinar, we look at this foundational technology for modern Data Management and show how it evolved to meet the workloads of today, as well as when other platforms make sense for enterprise data.
Slides: Enterprise Architecture vs. Data ArchitectureDATAVERSITY
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how data architecture is a key component of an overall enterprise architecture for enhanced business value and success.
Estimating the Total Costs of Your Cloud Analytics PlatformDATAVERSITY
Organizations today need a broad set of enterprise data cloud services with key data functionality to modernize applications and utilize machine learning. They need a platform designed to address multi-faceted needs by offering multi-function Data Management and analytics to solve the enterprise’s most pressing data and analytic challenges in a streamlined fashion. They need a worry-free experience with the architecture and its components.
As digital channels continue to grow, they drive greater diversity in our data landscape. At Yorkshire Building Society, our purpose is to provide real help with real life and this relies on data from a myriad of sources. This diversity creates a need for points of intersection, where data can unite to feed customer and business insights. How do we create these hubs of intersection and what can modern technology offer?
Speaker:
Mark Walters
Lead Enterprise Data Architect for Data & Information
Yorkshire Building Society
How can Insurers Accelerate Digital Transformation with Data Virtualization (...Denodo
Watch full webinar here: https://bit.ly/2Qpwqo9
Insurers globally are accelerating their digital journey, making rapid strides with their digitisation efforts, and adding key capabilities to adapt and innovate in the new normal. However, many insurance organisations find this transformation challenging as they rely on established systems that are often not only poorly integrated, but also highly resistant to modernisation without downtime. Hear how peers in your industry are leveraging data virtualization that facilitates digital transformation via a modern data integration / data delivery approach to gain greater agility, flexibility, and efficiency.
In this session, you will learn:
- Industry key trends and challenges driving the digital transformation mandate
- What is data virtualization, use cases, and how it can enable insurers to develop critical capabilities
- Lessons from success stories of insurers who already use data virtualization to differentiate themselves from the competition, have a single view of all their data and a way to establish security controls across the entire infrastructure
[db tech showcase Tokyo 2018] #dbts2018 #B23 『Python, Oracle Cloud, Blockchai...Insight Technology, Inc.
[db tech showcase Tokyo 2018] #dbts2018 #B23
『Python, Oracle Cloud, Blockchain & Cryptocurrency - The Perfect Combination』
Francisco Munoz Alvarez, Director of Innovation, Data Intensity
In this MIT Enterprise Forum of Cambridge Innovation Series event, we heard from CEOs, founders, and technology experts at the companies on the forefront of this ongoing digital transformation of supply chain.
Speakers: Samantha Radocchia, Co-Founder and Chief Product Officer, Chronicled, Brigid McDermott, Vice President, Blockchain, IBM, Dan Harple, Founder and CEO, Context Labs, Dan Doles, CEO, Mojix and Bill McBeath, Chief Research Officer, ChainLink Research
Moderated by: Michael Casey, Senior Advisor - Blockchain Research, MIT Media Lab
This presentation introduces the idea of a "Minimal Viable Architecture". As a company and product evolves, its architecture should evolve as well. We talk about the different phases of a product -- from the idea phase, to the starting phase, scaling phase, and optimizing phase. For each phase, we discuss the goals and constraints on the business, and we suggest an appropriate software architecture to match. Throughout the presentation, we use examples from eBay, Google, StitchFix, and others.
Serverless architectures enable scalable and cost-effective apps to be built faster, dramatically increasing the odds of your startup's success!
In "Startups + Serverless = Match made in Heaven" meetup, www.ServerlessToronto.org members discussed how to help Entrepreneurs push their businesses up to "other side of the teeterboard" (without failing) using the Serverless technologies: https://www.youtube.com/watch?v=1SqfJo47kMA
Digitization solutions - A new breed of softwareUwe Friedrichsen
This slide deck is about the challenges we have to face if we deal with digitization solutions. As this term currently is massively overused, I first introduce a very simple definition to define what I mean with "digitization solution" in the context of the presentation.
Afterwards, I list the challenges - at least the most relevant ones - that arise from moving into the digitization solution domain. Based on that, I try to examine the trends, prerequisites, and limitations that you are confronted with from an IT point of view and need to adapt to if digitization reaches your company. Last, but not least, I try to derive some practical hints for how we, as individuals, can prepare for such an environment.
As always, the voice track is missing, but I hope also the slides on their own bear some value for you.
An Organizational View on the Disrupting Cloud
The (public) cloud is mainly considered to be an IT aspect of a business strategy. But this is in fact one of the main reasons why Cloud adoption often results in a perception of a (partial) failure. Unless an enterprise is willing to reinvent itself from a business perspective, the Cloud will not, in the long term, have any significant impact on the business sustainability.
This is, almost obviously, why organisations that adopt business agility as a strategy and are in fact moving towards what I call true BizDevOps, will actually benefit from the Cloud as they are Cloud Native Enterprises.
Luke Closs at URISA BC (Feb, 2012) talking about Innovation and Open Data, and how cities can better capture innovation created by open data communities to lower costs and provide better services.
Modern Digital Design: The power of Responsive DesignValtech UK
You've probably already heard of the term Responsive Design. Currently it's one of the hot topics being discussed in the digital space and something many businesses are trying to get their heads around.
So what exactly is Responsive Design? And why does it matter?
Transforming Consumer Banking with a 100% Cloud-Based Bank (FSV204) - AWS re:...Amazon Web Services
Customer demands for higher levels of service and value, constantly evolving technology capabilities, and stringent regulatory requirements are all powerful forces reshaping retail banking. Built exclusively on AWS, Starling Bank’s 100% cloud-based, mobile-only banking solution satisfies regulators in terms of its resilience, security, and reliability. It also satisfies consumers by giving them greater control over their data, streamlining the account opening process, accelerating payments, and providing access to innovative new services developed from scratch with open APIs, a developer platform, integration with Apple Pay, Google Pay, and Fitbit Pay and a custom backend ledger and payments integrations. Starling Bank is leading the open banking revolution. In this session, learn how Starling Bank delivers value to their customers and innovates at a very fast pace in a sector that can be slow to evolve.
Continuously Deploying Culture 2.0 - Agile ÍslandRich Smith
Presentation discussing the story of engineering culture at Etsy and the lessons learnt of maintaining a genuine and engaging culture in a rapidly growing technology company.
The truth about IoT field gateways (Sam Vanhoutte @IoT Convention Europe 2017) Codit
Should you connect devices directly to the cloud, or rather consolidate them via a field gateway? Discover the main reasons for introducing a gateway into your IoT architecture, how gateways accelerate a rollout, and what capabilities you should look for. Learn how gateways cope with connectivity issues and security challenges. Discover from Sam’s experiences how crucial IoT field gateways are for the future roadmap of your IoT solution. Becoming a connected company is no small decision, but you can make it easy.
Speed Up Your Apache Cassandra™ Applications: A Practical Guide to Reactive P...Matt Stubbs
Speaker: Cedrick Lunven, Developer Advocate, DataStax
Speaker Bio: Cedrick is a Developer Advocate at DataStax, where he finds opportunities to share his passions by speaking about distributed architectures and implementing reference applications for developers. In 2013, he created FF4j, an open-source framework for feature toggling, which he still actively maintains. He is now a contributor on the JHipster team.
Talk Synopsis: We have all introduced more or less functional programming and asynchronous operations into our applications in order to speed up and distribute processing (e.g., multi-threading, Future, CompletableFuture, etc.). To build truly non-blocking components, optimize resource usage, and avoid "callback hell", you have to think reactive: everything is an event.
From the frontend UI to database communications, it’s now possible to develop Java applications as fully reactive with frameworks like Spring WebFlux and Reactor. With high throughput and tunable consistency, applications built on top of Apache Cassandra™ fit perfectly within this pattern.
DataStax has been developing Apache Cassandra drivers for years, and in the latest version of the enterprise driver we introduced reactive programming.
During this session we will migrate, step by step, a vanilla CRUD Java service (SpringBoot / SpringMVC) into reactive with both code review and live coding. Bring home a working project!
Filmed at Skills Matter/Code Node London on 9th May 2019 as part of the Big Data LDN Meetup Blueprint Series.
Meetup sponsored by DataStax.
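The blocking-to-reactive migration this session walks through is Java-specific (SpringMVC to Spring WebFlux and Reactor), but the shape of the change can be sketched with Python's asyncio instead. The function names and timings below are illustrative only, not part of the talk.

```python
import asyncio
import time

# The shift described above -- blocking CRUD calls replaced by
# non-blocking, event-driven ones -- sketched with asyncio rather
# than Reactor/WebFlux, purely to show the shape of the change.

async def fetch_row(key):
    await asyncio.sleep(0.1)      # stands in for a non-blocking DB driver call
    return {"id": key, "ok": True}

async def handler(keys):
    # Requests run concurrently; no thread is parked waiting on I/O.
    return await asyncio.gather(*(fetch_row(k) for k in keys))

start = time.perf_counter()
rows = asyncio.run(handler(["a", "b", "c"]))
elapsed = time.perf_counter() - start
print(len(rows), elapsed < 0.3)   # ~0.1s total, not 0.3s sequential
```

The payoff is the same one the talk claims for Cassandra-backed services: three I/O waits overlap instead of stacking up, so throughput scales without adding threads.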
Blueprint Series: Expedia Partner Solutions, Data PlatformMatt Stubbs
Join Anselmo for an engaging overview of the new end-to-end data architecture at Expedia Group, taking a journey through cloud and on-prem data lakes, real-time and batch processes and streamlined access for data producers and consumers. Find out how the new architecture unifies a complex mix of data sources and feeds the data science development cycle. Expedia might appear to be a market-leading travel company – in reality, it’s a highly successful technology and data science company.
Blueprint Series: Architecture Patterns for Implementing Serverless Microserv...Matt Stubbs
Richard Freeman talks about how the data science team at JustGiving built KOALA, a fully serverless stack for real-time web analytics capture, stream processing, metrics API, and storage service, supporting live data at scale from over 26M users. He discusses recent advances in serverless computing, and how you can implement traditionally container-based microservice patterns using serverless-based architectures instead. Deploying Serverless in your organisation can dramatically increase the delivery speed, productivity and flexibility of the development team, while reducing the overall running, DevOps and maintenance costs.
Big Data LDN 2018: DATABASE FOR THE INSTANT EXPERIENCEMatt Stubbs
Date: 14th November 2018
Location: Customer Experience Theatre
Time: 12:30 - 13:00
Speaker: David Maitland
Organisation: Redis Labs
About: This session will cover the technology underpinnings, at the software infrastructure level, required to deliver the instant experience to end users and enterprises alike. Use cases and the value derived by major brands will be shared in this insightful session, based on the world's most loved database, Redis.
Big Data LDN 2018: BIG DATA TOO SLOW? SPRINKLE IN SOME NOSQLMatt Stubbs
Date: 14th November 2018
Location: Customer Experience Theatre
Time: 11:50 - 12:20
Speaker: Perry Krug
Organisation: Couchbase
About: Who wants to see an ad today for the shoes they bought last week? Everyone knows that customer experience is driven by data: don't waste an opportunity to get them the right data at the right time. Real-time results are critical, but raw speed isn't everything: you need power and flexibility to react to changes on the fly. Come learn how market-leading enterprises are using Couchbase as their speed layer for ingestion, incremental view and presentation layers alongside Kafka, Spark and Hadoop to liberate their data lakes.
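Stripped of the Couchbase, Kafka, Spark, and Hadoop specifics, the "speed layer" idea above reduces to cache-aside reads in front of a slower system of record. The Python sketch below uses invented names and a hypothetical backing lookup purely to show the shape of the pattern.

```python
import time

class SpeedLayer:
    """Cache-aside: serve hot reads from a fast in-memory layer,
    falling back to the slow system of record on a miss."""
    def __init__(self, backing_fetch, ttl=60.0):
        self.fetch = backing_fetch
        self.ttl = ttl
        self.cache = {}           # key -> (value, expiry)

    def get(self, key):
        hit = self.cache.get(key)
        if hit and hit[1] > time.monotonic():
            return hit[0]
        value = self.fetch(key)   # e.g., a query against the data lake
        self.cache[key] = (value, time.monotonic() + self.ttl)
        return value

calls = []
def slow_lookup(key):
    calls.append(key)             # pretend this hits the batch layer
    return f"profile-{key}"

layer = SpeedLayer(slow_lookup)
layer.get("u1")
layer.get("u1")
print(calls)                      # backing store consulted only once
```

The TTL is the lever for "reacting to changes on the fly": shorter windows trade backing-store load for freshness, which is exactly the knob a dedicated speed layer tunes at scale.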
Big Data LDN 2018: ENABLING DATA-DRIVEN DECISIONS WITH AUTOMATED INSIGHTSMatt Stubbs
Date: 13th November 2018
Location: Customer Experience Theatre
Time: 11:50 - 12:20
Speaker: Charlotte Emms
Organisation: seenit
About: How do you get your colleagues interested in the power of data? This session takes you through Seenit’s journey using Couchbase's NoSQL database to create a regular, fully automated update in an easily digestible format.
Big Data LDN 2018: DATA MANAGEMENT AUTOMATION AND THE INFORMATION SUPPLY CHAI...Matt Stubbs
Date: 14th November 2018
Location: Governance and MDM Theatre
Time: 10:30 - 11:00
Speaker: Mike Ferguson
Organisation: IBS
About: For most organisations today, data complexity has increased rapidly. In the area of operations, we now have cloud and on-premises OLTP systems with customers, partners and suppliers accessing these applications via APIs and mobile apps. In the area of analytics, we now have data warehouse, data marts, big data Hadoop systems, NoSQL databases, streaming data platforms, cloud storage, cloud data warehouses, and IoT-generated data being created at the edge. Also, the number of data sources is exploding as companies ingest more and more external data such as weather and open government data. Silos have also appeared everywhere as business users are buying in self-service data preparation tools without consideration for how these tools integrate with what IT is using to integrate data. Yet new regulations are demanding that we do a better job of governing data, and business executives are demanding more agility to remain competitive in a digital economy. So how can companies remain agile, reduce cost and reduce the time-to-value when data complexity is on the up?
In this session, Mike will discuss how companies can create an information supply chain to manufacture business-ready data and analytics to reduce time to value and improve agility while also getting data under control.
Date: 13th November 2018
Location: Governance and MDM Theatre
Time: 12:30 - 13:00
Organisation: Immuta
About: Artificial intelligence is rising in importance, but it’s also increasingly at loggerheads with data protection regimes like the GDPR—or so it seems. In this talk, Sophie will explain where and how AI and GDPR conflict with one another, and how to resolve these tensions.
Big Data LDN 2018: REALISING THE PROMISE OF SELF-SERVICE ANALYTICS WITH DATA ...Matt Stubbs
Date: 13th November 2018
Location: Governance and MDM Theatre
Time: 11:50 - 12:20
Speaker: Mark Pritchard
Organisation: Denodo
About: Self-service analytics promises to liberate business users to perform analytics without the assistance of IT, and this in turn promises to free IT to focus on enhancing the infrastructure.
Join us to learn how data virtualization can give you real-time access to enterprise-wide data and deliver self-service analytics. We will explore how you can seamlessly unify fragmented data, replace your high-maintenance, high-cost data integrations with a single, low-maintenance data virtualization layer, and preserve your data integrity while keeping data lineage fully traceable.
Big Data LDN 2018: TURNING MULTIPLE DATA LAKES INTO A UNIFIED ANALYTIC DATA L...Matt Stubbs
Date: 13th November 2018
Location: Governance and MDM Theatre
Time: 11:10 - 11:40
Organisation: TIBCO
About: The big data phenomenon continues to accelerate, resulting in multiple data lakes at most organisations. However, according to Gartner, “Through 2019, 90% of the information assets from big data analytic efforts will be siloed and unusable across multiple business processes.”
Are you ready to unleash this data from these silos and deliver the insights your organisation needs to drive compelling customer experiences, innovative new products and optimized operations? In this session you will learn how to apply data virtualisation to:
• Access, transform and deliver data from across your lakes, clouds and other data sources
• Empower a range of analytic users and tools with all the data they need
• Move rapidly to a modern and flexible data architecture for the long run
In addition, you will see a demonstration of data virtualisation in action.
Big Data LDN 2018: CONSISTENT SECURITY, GOVERNANCE AND FLEXIBILITY FOR ALL WO...Matt Stubbs
Date: 14th November 2018
Location: Data-Driven Ldn Theatre
Time: 12:30 - 13:00
Organisation: Cloudera
About: The growth of public cloud is reinforcing the need to think more carefully about taking a consistent approach to data governance as technology teams build out a flexible and agile infrastructure to meet the demands of the business.
Join this session to learn more about Cloudera's recommended approach for enterprise-grade security and governance and how to ensure a consistent framework across private, public and on-premises environments.
Big Data LDN 2018: MICROLISE: USING BIG DATA AND AI IN TRANSPORT AND LOGISTICSMatt Stubbs
Date: 14th November 2018
Location: Data-Driven Ldn Theatre
Time: 11:10 - 11:40
Organisation: Microlise
About: Microlise are a leading provider of technology solutions to the transport and logistics industry worldwide. Discover how, with over 400,000 connected assets generating billions of messages a day, Microlise is evolving its platform to bring real-time analytics to its customers to improve safety, security and efficiency outcomes.
Big Data LDN 2018: EXPERIAN: MAXIMISE EVERY OPPORTUNITY IN THE BIG DATA UNIVERSEMatt Stubbs
Date: 14th November 2018
Location: Data-Driven Ldn Theatre
Time: 10:30 - 11:00
Speaker: Anna Matty
Organisation: Experian
About: Today there is a widespread focus on the 'how' in relation to problem solving. How can we gain better knowledge of what consumers want, or need? How can we be more efficient, reduce the cost to serve, or grow the lifetime value of a customer? But how do you move to a place where you are not only solving a problem but redesigning its entire strategic potential, armed with insight into what the problem really is?
Data and innovation offer huge potential to revolutionise all markets. There is an opportunity to be one step ahead of the need, to redesign journeys and enhance enterprise strategies. To do this you need access to the most advanced analytics but also the best quality, including variations and types of data, and then the technology that can act on this insight. Data science can present a unique opportunity for uncovered growth and accelerate your business through strategic innovation – fast. In this session you will hear more about how today's analytics can move from a single task, to an ongoing strategic opportunity. An opportunity that helps you move at the speed of the market and helps you maximise every opportunity.
Big Data LDN 2018: A LOOK INSIDE APPLIED MACHINE LEARNINGMatt Stubbs
Date: 13th November 2018
Location: Data-Driven Ldn Theatre
Time: 13:10 - 13:40
Speaker: Brian Goral
Organisation: Cloudera
About: The field of machine learning (ML) ranges from the very practical and pragmatic to the highly theoretical and abstract. This talk describes several of the challenges facing organisations that want to leverage more of their data through ML, including some examples of the applied algorithms that are already delivering value in business contexts.
Big Data LDN 2018: DEUTSCHE BANK: THE PATH TO AUTOMATION IN A HIGHLY REGULATE...Matt Stubbs
Date: 13th November 2018
Location: Data-Driven Ldn Theatre
Time: 12:30 - 13:00
Speaker: Paul Wilkinson, Naveen Gupta
Organisation: Cloudera
About: Investment banks are faced with some of the toughest regulatory requirements in the world. In a market where data is increasing and changing at extraordinary rates the journey with data governance never ends.
In this session, Deutsche Bank will share their journey with big data and explain some of the processes and techniques they have employed to prepare the bank for today’s challenges and tomorrow’s opportunities.
Brought to you by Naveen Gupta, VP Software Engineering, Deutsche Bank and Paul Wilkinson, Principal Solutions Architect, Cloudera.
Big Data LDN 2018: FROM PROLIFERATION TO PRODUCTIVITY: MACHINE LEARNING DATA ...Matt Stubbs
Date: 14th November 2018
Location: Self-Service Analytics Theatre
Time: 13:50 - 14:20
Speaker: Stephanie McReynolds
Organisation: Alation
About: Raw data is proliferating at an enormous rate. But so are our derived data assets - hundreds of dashboards, thousands of reports, millions of transformed data sets. With self-service analytics, this noise makes it increasingly hard to understand and trust data for decision-making. This trust gap is holding your organisation back from business outcomes.
European analytics leaders have found a way to close the gap between data and decision-making. From MunichRe to Pfizer and Daimler, analytics teams are adopting data catalogues for thousands of self-service analytics users.
Join us in this session to hear how data catalogues that activate data by incorporating machine learning can:
• Increase analyst productivity by 20-40%
• Boost understanding of the nuances of data
• Establish trust in data-driven decisions with agile stewardship
Big Data LDN 2018: DATA APIS DON’T DISCRIMINATEMatt Stubbs
Date: 13th November 2018
Location: Self-Service Analytics Theatre
Time: 15:50 - 16:20
Speaker: Nishanth Kadiyala
Organisation: Progress
About: The exploding API economy, combined with an advanced analytics market projected to reach $30 billion by 2019, is forcing IT to expose more and more data through APIs. Business analysts, data engineers, and data scientists are still not happy, because their needs never really made it into the existing API strategies. This is because most APIs are designed for application integration, not for data workers looking for APIs that facilitate direct data access to run complex analytics. Data APIs are specifically designed to provide that frictionless data access experience to support analytics across standard interoperable interfaces such as OData (REST) or ODBC/JDBC (SQL). Consider expanding your API strategy to serve developers with open analytics in this $30 billion market.
Big Data LDN 2018: A TALE OF TWO BI STANDARDS: DATA WAREHOUSES AND DATA LAKESMatt Stubbs
Date: 13th November 2018
Location: Self-Service Analytics Theatre
Time: 14:30 - 15:00
Speaker: Zaf Khan
Organisation: Arcadia Data
About: The use of data lakes continues to grow, and a recent survey by Eckerson Group shows that organizations are getting real value from their deployments. However, there’s still a lot of room for improvement when it comes to giving business users access to the wealth of potential insights in the data lake.
While the data management aspect has been fairly well understood over the years, the success of business intelligence (BI) and analytics on data lakes lags behind. In fact, organizations often struggle with data lakes because they are only accessible by highly-skilled data scientists and not by business users. But BI tools have been able to access data warehouses for years, so what gives?
In this talk, we’ll discuss:
• Why traditional BI tools are architected well for data warehouses, but not data lakes
• Why every organization should have two BI standards: one for data warehouses and one for data lakes
• Innovative capabilities provided by BI for data lakes
As Europe's leading economic powerhouse and the fourth-largest economy globally, Germany stands at the forefront of innovation and industrial might. Renowned for its precision engineering and high-tech sectors, Germany's economic structure is heavily supported by a robust service industry, accounting for approximately 68% of its GDP. This economic clout and strategic geopolitical stance position Germany as a focal point in the global cyber threat landscape.
In the face of escalating global tensions, particularly those emanating from geopolitical disputes with nations like Russia and China, Germany has witnessed a significant uptick in targeted cyber operations. Our analysis indicates a marked increase in cyberattack sophistication aimed at critical infrastructure and key industrial sectors. These attacks range from ransomware campaigns to advanced persistent threats (APTs), threatening national security and business integrity.
🔑 Key findings include:
🔍 Increased frequency and complexity of cyber threats.
🔍 Escalation of state-sponsored and criminally motivated cyber operations.
🔍 Active dark web exchanges of malicious tools and tactics.
Our comprehensive report delves into these challenges, using a blend of open-source and proprietary data collection techniques. By monitoring activity on critical networks and analyzing attack patterns, our team provides a detailed overview of the threats facing German entities.
This report aims to equip stakeholders across public and private sectors with the knowledge to enhance their defensive strategies, reduce exposure to cyber risks, and reinforce Germany's resilience against cyber threats.
Opendatabay - Open Data Marketplace.pptxOpendatabay
Opendatabay.com unlocks the power of data for everyone. Open Data Marketplace fosters a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
First ever open hub for data enthusiasts to collaborate and innovate. A platform to explore, share, and contribute to a vast collection of datasets. Through robust quality control and innovative technologies like blockchain verification, opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence. Leverage cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay's AI-driven features streamline the data workflow. Finding the data you need shouldn't be complex. Opendatabay simplifies the data acquisition process with an intuitive interface and robust search tools. Effortlessly explore, discover, and access the data you need, allowing you to focus on extracting valuable insights. Opendatabay breaks new ground with dedicated, AI-generated synthetic datasets.
Leverage these privacy-preserving datasets for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By leveraging a powerful combination of distributed ledger technology and rigorous third-party audits Opendatabay ensures the authenticity and reliability of every dataset. Security is at the core of Opendatabay. Marketplace implements stringent security measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
Adjusting primitives for graph : SHORT REPORT / NOTESSubhajit Sahu
Short notes on adjusting primitives for graph algorithms such as PageRank. Compressed Sparse Row (CSR) is an adjacency-list-based graph representation.
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
Data Centers - Striving Within A Narrow Range - Research Report - MCG - May 2...pchutichetpong
M Capital Group (“MCG”) expects demand to grow and supply to evolve, facilitated by institutional investment rotating out of offices and into work from home (“WFH”), while the need for data storage keeps expanding alongside global internet usage, with experts predicting 5.3 billion users by 2023. These market factors will be underpinned by technological changes, such as progressing cloud services and edge sites, allowing the industry to see strong expected annual growth of 13% over the next 4 years.
Whilst competitive headwinds remain, represented through the recent second bankruptcy filing of Sungard, which blames “COVID-19 and other macroeconomic trends including delayed customer spending decisions, insourcing and reductions in IT spending, energy inflation and reduction in demand for certain services”, the industry has seen key adjustments, where MCG believes that engineering cost management and technological innovation will be paramount to success.
MCG reports that the more favorable market conditions expected over the next few years, helped by the winding down of pandemic restrictions and a hybrid working environment will be driving market momentum forward. The continuous injection of capital by alternative investment firms, as well as the growing infrastructural investment from cloud service providers and social media companies, whose revenues are expected to grow over 3.6x larger by value in 2026, will likely help propel center provision and innovation. These factors paint a promising picture for the industry players that offset rising input costs and adapt to new technologies.
According to M Capital Group: “Specifically, the long-term cost-saving opportunities available from the rise of remote managing will likely aid value growth for the industry. Through margin optimization and further availability of capital for reinvestment, strong players will maintain their competitive foothold, while weaker players exit the market to balance supply and demand.”
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...John Andrews
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
8. Who are Starling Bank?
• Tech start-up with a banking licence
• ~100% cloud-based, mobile-only
• Mastercard debit card
• Direct Debits (DDs) and Faster Payments
• Location-enriched transaction feed
• ApplePay, GooglePay, FitBitPay...
• Spending insights
• Granular card control
• Open APIs & developer platform
9. We built a bank in a year
• Jan 2014 - Founded by Anne Boden
• Jun 2014 - Kick-off with Regulators
• Sep 2015 - Technical prototypes
• Jul 2016 - Granted a partial banking license
• Nov 2016 - Launching the alpha app
• Feb 2017 - Launching the beta app
• Apr 2017 - Granted a full banking license
• May 2017 - Public launch
• Mar 2018 - Awarded Best British Bank
17. DITTO architecture
• Do everything at least once and at most once
• Retry (at least once)
• Idempotency (at most once)
• Work towards correctness, eventual consistency
• Reduce synchronicity to a minimum
• Save all requests to the database first
• Keep the smarts in the services, not in the pipes
• No distributed transactions
• Do not trust other services
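The bullets above combine at-least-once delivery (retry) with at-most-once processing (idempotency) to work towards exactly-once behaviour. A minimal sketch of that combination, with hypothetical names and an in-memory dict standing in for the database (this is not Starling's actual code):

```python
class PaymentService:
    """DITTO-style request handling sketch: save the request first,
    deduplicate on an idempotency key, let callers retry freely."""

    def __init__(self):
        self._requests = {}    # stands in for "save all requests to the database first"
        self._processed = set()

    def submit(self, idempotency_key, payload):
        # Persist the request before doing any work, so a crash mid-flight
        # leaves a record that a catch-up process can work towards correctness from.
        self._requests.setdefault(idempotency_key, payload)
        # At most once: a retry of an already-processed key is a safe no-op.
        if idempotency_key in self._processed:
            return "duplicate-ignored"
        self._processed.add(idempotency_key)
        return f"processed:{payload}"


def submit_with_retry(service, key, payload, attempts=3):
    """At least once: keep retrying transient failures.
    Idempotency on the server side makes the retries safe."""
    last_err = None
    for _ in range(attempts):
        try:
            return service.submit(key, payload)
        except ConnectionError as err:
            last_err = err
    raise last_err
```

Because the server deduplicates on the key, the client never has to answer the dangerous question "did my first attempt actually go through?" before retrying.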
20. Continuous(ish) delivery of back-end
• Continual deployment to non-prod, sign-off into prod
• Auto build, dockerise, test, scan, deploy < 1h
• Code released to production up to 5 times a day
22. The “rolling” giphy
• Our auditors loved this one
• Yes it’s in our release documentation
• Clear signal in the engineering channel that a release is in progress