http://spr.ly/AA_Utilities - Most companies in the oil and gas (O&G), utilities and chemical process industries benefit significantly from global markets; however, they also face pressures that demand instant response to fast-paced international events. Energy companies are using real-time data and analytics to solve key challenges in hotly competitive global markets.
-Bloomberg Businessweek Research
BIG Data & Hadoop Applications in Finance - Skillspeed
Explore the applications of BIG Data & Hadoop in Finance via Skillspeed.
Big Data and Hadoop are key differentiators in finance, especially for generating greater investment insights. Companies and professionals use them for risk assessment, fraud detection, and forecasting trends in financial markets.
To get more details regarding BIG Data & Hadoop, please visit - www.SkillSpeed.com
Data Mesh in Practice - How Europe's Leading Online Platform for Fashion Goes... - Dr. Arif Wider
A talk presented by Max Schultze from Zalando and Arif Wider from ThoughtWorks at NDC Oslo 2020.
Abstract:
The Data Lake paradigm is often considered the scalable successor of the more curated Data Warehouse approach when it comes to democratization of data. However, many who went out to build a centralized Data Lake came out with a data swamp of unclear responsibilities, a lack of data ownership, and sub-par data availability.
At Zalando - Europe's biggest online fashion retailer - we realised that accessibility and availability at scale can only be guaranteed when moving more responsibilities to those who pick up the data and have the respective domain knowledge - the data owners - while keeping only data governance and metadata information central. Such a decentralized, domain-focused approach has recently been coined a Data Mesh.
The Data Mesh paradigm promotes the concept of Data Products which go beyond sharing of files and towards guarantees of quality and acknowledgement of data ownership.
This talk will take you on a journey of how we went from a centralized Data Lake to embrace a distributed Data Mesh architecture and will outline the ongoing efforts to make creation of data products as simple as applying a template.
This is Part 4 of the GoldenGate series on Data Mesh - a series of webinars helping customers understand how to move off of old-fashioned monolithic data integration architecture and get ready for more agile, cost-effective, event-driven solutions. The Data Mesh is a kind of Data Fabric that emphasizes business-led data products running on event-driven streaming architectures and serverless, microservices-based platforms. These emerging solutions are essential for enterprises that run data-driven services on multi-cloud, multi-vendor ecosystems.
Join this session to get a fresh look at Data Mesh; we'll start with core architecture principles (vendor agnostic) and transition into detailed examples of how Oracle's GoldenGate platform is providing capabilities today. We will discuss essential technical characteristics of a Data Mesh solution, and the benefits that business owners can expect by moving IT in this direction. For more background on Data Mesh, Parts 1, 2, and 3 are on the GoldenGate YouTube channel: https://www.youtube.com/playlist?list=PLbqmhpwYrlZJ-583p3KQGDAd6038i1ywe
Webinar Speaker: Jeff Pollock, VP Product (https://www.linkedin.com/in/jtpollock/)
Mr. Pollock is an expert technology leader for data platforms, big data, data integration and governance. Jeff has been CTO at California startups and a senior exec at Fortune 100 tech vendors. He is currently Oracle VP of Products and Cloud Services for Data Replication, Streaming Data and Database Migrations. While at IBM, he was head of all Information Integration, Replication and Governance products, and previously Jeff was an independent architect for the US Defense Department, VP of Technology at Cerebra and CTO of Modulant – he has been engineering artificial-intelligence-based data platforms since 2001. As a business consultant, Mr. Pollock was a Head Architect at Ernst & Young's Center for Technology Enablement. Jeff is also the author of "Semantic Web for Dummies" and "Adaptive Information," a frequent keynote speaker at industry conferences, an author for books and industry journals, formerly a contributing member of W3C and OASIS, and an engineering instructor with UC Berkeley's Extension for object-oriented systems, software development process and enterprise architecture.
Fighting financial fraud at Danske Bank with artificial intelligence - Ron Bodkin
Danske Bank, the leader in mobile payments in Denmark, is innovating with AI. Danske Bank’s existing fraud detection engine is being enhanced with deep learning algorithms that can analyze potentially tens of thousands of latent features. Danske Bank’s current system is largely based on handcrafted rules created by the business, based on intuition and some light analysis. The system is effective at blocking fraud, but it has a high rate of false positives, which is expensive and inconvenient, and it has proved impractical to update and maintain as fraudsters evolve their capabilities. Moreover, the bank understands that fraud is getting worse in the near- and long-term future due to the increased digitization of banking and the prevalence of mobile banking applications and recognizes the need to use cutting-edge techniques to engage fraudsters not where they are today but where they will be tomorrow.
Application fraud is an important emerging trend, in which machines fill in transaction forms. There is evidence that criminals are employing sophisticated machine-learning techniques to attack, so it’s critical to use sophisticated machine learning to catch fraud in banking and mobile payment transactions.
Ron Bodkin and Nadeem Gulzar explore how Danske Bank uses deep learning for better fraud detection. Danske Bank's multistep program first productionizes "classic" machine learning techniques (boosted decision trees) while in parallel developing deep learning models with TensorFlow as a "challenger" to test. The system was first tested in shadow production and then in full production in a champion-challenger setup against live transactions. Ron and Nadeem explain how the bank is integrating the models with the efforts already running, giving the bank and its investigation team the ability to adapt to new patterns faster than before and to take on complex, highly varying functions not present in the training examples.
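As a rough illustration of the champion-challenger pattern described above, here is a minimal Python sketch. It is not Danske Bank's system: the data is synthetic, and a scikit-learn gradient-boosted classifier and a logistic regression stand in for the bank's boosted-tree champion and deep-learning challenger. Only the champion blocks transactions; the challenger is scored in "shadow mode" for offline comparison.

```python
# Minimal champion-challenger sketch (illustrative only; synthetic data,
# stand-in models -- not Danske Bank's production setup).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 10))  # synthetic transaction features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 2).astype(int)

champion = GradientBoostingClassifier().fit(X, y)          # "classic" boosted trees
challenger = LogisticRegression(max_iter=1000).fit(X, y)   # stand-in for a deep model

shadow_log = []  # challenger scores recorded for offline evaluation

def score_transaction(x):
    """Block based on the champion's decision; log the challenger in shadow mode."""
    x = x.reshape(1, -1)
    p_champ = champion.predict_proba(x)[0, 1]
    p_chall = challenger.predict_proba(x)[0, 1]
    shadow_log.append((p_champ, p_chall))
    return p_champ > 0.5  # only the champion blocks transactions

blocked = sum(score_transaction(x) for x in X[:100])
print(f"blocked {blocked} of 100 transactions; {len(shadow_log)} shadow records")
```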
Data Warehouse Concepts | Data Warehouse Tutorial | Data Warehousing | Edureka - Edureka!
This tutorial on data warehouse concepts covers what you need to know about performing data warehousing and business intelligence. The data warehouse concepts explained in this video are listed below, followed by a small illustrative sketch:
1. What Is Data Warehousing?
2. Data Warehousing Concepts:
i. OLAP (On-Line Analytical Processing)
ii. Types Of OLAP Cubes
iii. Dimensions, Facts & Measures
iv. Data Warehouse Schema
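To make the fact/dimension/measure vocabulary above concrete, here is a small, hypothetical pandas sketch (the table and column names are invented for illustration): a tiny fact table joined to a dimension table and rolled up along two dimensions, which is the kind of aggregation an OLAP cube precomputes.

```python
# Hypothetical star-schema fragment: a sales fact table and a product dimension.
import pandas as pd

fact_sales = pd.DataFrame({
    "product_id": [1, 1, 2, 2, 2],
    "region":     ["EU", "US", "EU", "US", "US"],
    "revenue":    [100.0, 150.0, 80.0, 120.0, 60.0],   # the measure
})
dim_product = pd.DataFrame({
    "product_id": [1, 2],
    "category":   ["Laptops", "Phones"],                # a dimension attribute
})

# Join fact to dimension, then roll up the measure along two dimensions --
# roughly what an OLAP cube precomputes for fast slicing and dicing.
cube = (fact_sales.merge(dim_product, on="product_id")
        .pivot_table(values="revenue", index="category",
                     columns="region", aggfunc="sum", margins=True))
print(cube)
```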
Data Mesh is a new socio-technical approach to data architecture, first described by Zhamak Dehghani and popularised through a guest blog post on Martin Fowler's site.
Since then, community interest has grown, due to Data Mesh's ability to explain and address the frustrations that many organisations are experiencing as they try to get value from their data. The 2022 publication of Zhamak's book on Data Mesh further provoked conversation, as have the growing number of experience reports from companies that have put Data Mesh into practice.
So what's all the fuss about?
On one hand, Data Mesh is a new approach in the field of big data. On the other hand, Data Mesh is an application of the lessons we have learned from domain-driven design and microservices to a data context.
In this talk, Chris and Pablo will explain how Data Mesh relates to current thinking in software architecture and the historical development of data architecture philosophies. They will outline what benefits Data Mesh brings, what trade-offs it comes with and when organisations should and should not consider adopting it.
A Work of Zhamak Dehghani
Principal consultant
ThoughtWorks
https://martinfowler.com/articles/data-monolith-to-mesh.html
How to Move Beyond a Monolithic Data Lake to a Distributed Data Mesh
Many enterprises are investing in their next-generation data lake, with the hope of democratizing data at scale to provide business insights and ultimately make automated intelligent decisions. Data platforms based on the data lake architecture have common failure modes that lead to unfulfilled promises at scale. To address these failure modes we need to shift away from the centralized paradigm of a lake, or its predecessor, the data warehouse. We need to shift to a paradigm that draws from modern distributed architecture: considering domains as the first-class concern, applying platform thinking to create self-serve data infrastructure, and treating data as a product.
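As an illustration of "treating data as a product", one could imagine each domain publishing a small, machine-readable product descriptor alongside its datasets. The fields below are hypothetical, not part of any standard, but they capture the ownership and quality guarantees this abstract refers to.

```python
# Hypothetical data-product descriptor: the fields are illustrative, not a standard.
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str                      # e.g. "orders.completed"
    owning_domain: str             # the team accountable for the data
    owner_contact: str             # who consumers escalate to
    output_port: str               # where consumers read it (table, topic, bucket)
    schema_version: str            # contract consumers can pin to
    freshness_sla_minutes: int     # quality guarantee: maximum staleness
    quality_checks: list = field(default_factory=list)

orders = DataProduct(
    name="orders.completed",
    owning_domain="checkout",
    owner_contact="checkout-data@example.com",
    output_port="s3://example-bucket/checkout/orders_completed/",
    schema_version="2.1.0",
    freshness_sla_minutes=60,
    quality_checks=["non_null(order_id)", "unique(order_id)"],
)
print(orders)
```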
Build Real-Time Applications with Databricks Streaming - Databricks
In this presentation, we will study a use case we implemented recently. In this use case we are working with a large metropolitan fire department. Our company has already created a complete analytics architecture for the department based upon Azure Data Factory, Databricks, Delta Lake, Azure SQL, and SQL Server Analysis Services (SSAS). While this architecture works very well for the department, they would like to add a real-time channel to their reporting infrastructure.
This channel should serve up the following information:
• The most up-to-date locations and status of equipment (fire trucks, ambulances, ladders, etc.)
• The current locations and status of firefighters, EMT personnel, and other relevant fire department employees
• The current list of active incidents within the city
The above information should be visualized through an automatically updating dashboard. The central component of the dashboard will be a map which automatically updates with the locations and incidents. This view should be as close to real time as possible and will be used by the fire chiefs to assist with real-time decision-making on resource and equipment deployments.
In this presentation, we will leverage Databricks, Spark Structured Streaming, Delta Lake and the Azure platform to create this real-time delivery channel.
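A minimal sketch of the streaming leg of such a channel might look like the following. This assumes a Databricks (or Delta Lake-enabled Spark) environment; the source path, schema, and table locations are invented, and the Azure ingestion sources and the dashboard layer are omitted.

```python
# Minimal Structured Streaming sketch (assumes a Spark session with Delta Lake
# available, e.g. Databricks; paths and schema are invented for illustration).
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("fire-dept-realtime").getOrCreate()

schema = (StructType()
          .add("unit_id", StringType())
          .add("status", StringType())
          .add("lat", DoubleType())
          .add("lon", DoubleType())
          .add("event_time", TimestampType()))

# Read a stream of location/status events (here: JSON files landing in a folder).
events = (spark.readStream
          .schema(schema)
          .json("/mnt/landing/unit_events/"))

# Keep only the fields the map needs and append them to a Delta table that the
# dashboard queries for the latest position and status of each unit.
query = (events.select("unit_id", "status", "lat", "lon", "event_time")
         .writeStream
         .format("delta")
         .outputMode("append")
         .option("checkpointLocation", "/mnt/checkpoints/unit_events/")
         .start("/mnt/delta/unit_events/"))

query.awaitTermination()
```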
An introduction to data centers, including a discussion on location criteria, key factors to look for when thinking about establishing a data center in an existing building and case studies on data centers in the Atlanta and Toronto areas.
Big Data Architectural Patterns and Best Practices on AWS - Amazon Web Services
The world is producing an ever increasing volume, velocity, and variety of big data. Consumers and businesses are demanding up-to-the-second (or even millisecond) analytics on their fast-moving data, in addition to classic batch processing. AWS delivers many technologies for solving big data problems. But what services should you use, why, when, and how? In this session, we simplify big data processing as a data bus comprising various stages: ingest, store, process, and visualize. Next, we discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architecture, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
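As a tiny illustration of the "ingest" stage of the data bus described above, here is a hedged boto3 sketch that puts one record onto an Amazon Kinesis data stream; the stream name and payload are invented, and the later store/process/visualize stages are not shown.

```python
# Illustrative ingest step only: put one event onto a Kinesis data stream.
# The stream name and payload are made up; AWS credentials and a region must
# be configured in the environment for this to run.
import json
import boto3

kinesis = boto3.client("kinesis")

event = {"user_id": "u-123", "action": "click", "ts": "2024-01-01T12:00:00Z"}

response = kinesis.put_record(
    StreamName="example-clickstream",        # hypothetical stream name
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["user_id"],           # controls shard assignment
)
print("sequence number:", response["SequenceNumber"])
```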
More and more organizations are moving their ETL workloads to a Hadoop-based ELT grid architecture. Hadoop's inherent capabilities, especially its ability to do late binding, address some of the key challenges with traditional ETL platforms. In this presentation, attendees will learn the key factors, considerations, and lessons around ETL for Hadoop: the pros and cons of different extract and load strategies, the best ways to batch data, buffering and compression considerations, leveraging HCatalog, data transformation, integration with existing data transformations, the advantages of different ways of exchanging data, and leveraging Hadoop as a data integration layer. This is an extremely popular presentation around ETL and Hadoop.
Data Mesh in Practice: How Europe's Leading Online Platform for Fashion Goes... - Databricks
The Data Lake paradigm is often considered the scalable successor of the more curated Data Warehouse approach when it comes to democratization of data. However, many who went out to build a centralized Data Lake came out with a data swamp of unclear responsibilities, a lack of data ownership, and sub-par data availability.
Banking Circle: Money Laundering Beware: A Modern Approach to AML with Machin... - Neo4j
by Ruben Menke, Lead Data Scientist at Banking Circle
In this talk, Banking Circle will show how a modern computational method is essential in the fight against money laundering.
Building a Real-Time Fraud Prevention Engine Using Open Source (Big Data) Sof... - Spark Summit
Fraudsters attempt to pay for goods, flights, hotels – you name it – using stolen credit cards. This hurts both the trust of card holders and the business of vendors around the world. We built a Real-Time Fraud Prevention Engine using Open Source (Big Data) Software: Spark, Spark ML, H2O, Hive, Esper. In my talk I will highlight both the business and the technical challenges that we’ve faced and dealt with.
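The abstract names Spark and Spark ML among the building blocks; a minimal, hypothetical Spark ML pipeline for scoring card transactions might look like the sketch below. The columns, values, and model choice are invented for illustration and are not the speakers' actual features or model.

```python
# Hypothetical fraud-scoring pipeline with Spark ML: synthetic rows and features,
# a gradient-boosted tree classifier, and scoring of the same data for brevity.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import GBTClassifier

spark = SparkSession.builder.appName("fraud-sketch").getOrCreate()

data = spark.createDataFrame(
    [(120.0, 3, 0.1, 0), (9800.0, 1, 0.9, 1), (45.0, 7, 0.2, 0), (7200.0, 2, 0.8, 1)],
    ["amount", "merchant_trust", "geo_risk", "label"],
)

assembler = VectorAssembler(
    inputCols=["amount", "merchant_trust", "geo_risk"], outputCol="features")
gbt = GBTClassifier(labelCol="label", featuresCol="features", maxIter=10)

model = Pipeline(stages=[assembler, gbt]).fit(data)
model.transform(data).select("amount", "label", "prediction").show()
```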
Customer Event Hub - the modern Customer 360° view - Guido Schmutz
Today, companies are using various channels to communicate with their customers. As a consequence, a lot of data is created, more and more of it also outside the traditional IT infrastructure of an enterprise. This data often does not have a common format and is continuously created at ever-increasing volume. With the Internet of Things (IoT) and its sensors, both the volume and the velocity of data get even more extreme.
To achieve a complete and consistent view of a customer, all this customer-related information has to be included in a 360-degree view in a real-time or near-real-time fashion. In this way, the Customer Hub becomes the Customer Event Hub. It constantly shows the current view of a customer across all interaction channels and provides an enterprise the basis for a substantial and effective customer relationship.
This presentation shows the value of such a platform and how it can be implemented.
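One common way to realise such a Customer Event Hub is a log-based broker such as Apache Kafka (the abstract itself does not name a specific technology). The sketch below is a hedged illustration, using the kafka-python client with an invented topic name and event fields, of folding per-channel events into an in-memory 360-degree view keyed by customer.

```python
# Illustrative only: consume customer events from a Kafka topic (invented name
# and fields) and fold them into a per-customer "360 degree" view in memory.
# Requires the kafka-python package and a reachable broker.
import json
from collections import defaultdict
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "customer-events",                       # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

customer_view = defaultdict(lambda: {"last_events": [], "channels": set()})

for message in consumer:
    event = message.value                    # e.g. {"customer_id": "...", "channel": "web", ...}
    view = customer_view[event["customer_id"]]
    view["channels"].add(event.get("channel", "unknown"))
    view["last_events"] = (view["last_events"] + [event])[-10:]  # keep recent events
    print(event["customer_id"], "seen on channels:", view["channels"])
```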
Presentation on Data Mesh: the paradigm shift is a new type of ecosystem architecture, a shift towards a modern distributed architecture that treats domain-specific data and views "data as a product," enabling each domain to handle its own data pipelines.
Always Encrypted is a highly-touted new feature of SQL Server 2016 that promises to make encryption simple to use and transparent to applications while still protecting the data both at rest and in motion, even from high-privilege users such as developers and DBAs. Does that sound too good to be true? It isn’t – Always Encrypted is an incredible feature – but like any new technology, it does have some limitations. In this session, you’ll see how to configure Always Encrypted, and we’ll talk about when you should and shouldn’t use it in your environment.
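Always Encrypted itself is configured on the SQL Server side (column master key, column encryption key, and encrypted columns); from client code the main requirement is a driver that supports it and a connection that opts in. The snippet below is a hedged sketch using pyodbc with the Microsoft ODBC driver's ColumnEncryption connection-string keyword; the server, database, table, and column names are invented for illustration.

```python
# Hedged sketch: reading from a table with Always Encrypted columns via pyodbc.
# Assumes Microsoft ODBC Driver 17+ for SQL Server and access to the column
# master key; server/database/table/column names are invented.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=ExampleDb;"
    "Trusted_Connection=yes;"
    "ColumnEncryption=Enabled;"   # opt in so the driver decrypts protected columns
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # The SSN column is assumed to be protected by Always Encrypted; with
    # ColumnEncryption=Enabled the driver returns plaintext to authorized clients,
    # while a connection without the keyword would only see ciphertext.
    cursor.execute("SELECT TOP 5 CustomerId, SSN FROM dbo.Customers")
    for row in cursor.fetchall():
        print(row.CustomerId, row.SSN)
```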
This is Part 3 of the series on Data Mesh, looking at the intersection of microservices architecture concepts, data integration / replication technologies, and log-based stream integration techniques. This webinar was mostly a demonstration, but several slides used to set up the demo are included here as a PDF for viewers.
Enterprise guide to building a Data Mesh - Sion Smith
Making Data Mesh simple, open source, and available to all: without vendor lock-in, without complex tooling, using an approach centered around 'specifications' and existing tools, and baking in a 'domain' model.
Idera live 2021: Keynote Presentation The Future of Data is The Data Cloud b... - IDERA Software
Join us for an introduction from Idera's CEO Randy Jacops followed by our Keynote Presentation: “The Future of Data is The Data Cloud”; presented by Kent Graziano (AKA The Data Warrior), Chief Technical Evangelist for Snowflake.
Lots has happened at Snowflake in the last few years (including a HUGE IPO!). In this session Kent will give an update on Snowflake’s vision of a world with unlimited access to governed data, enabling every organization to tackle the challenges and opportunities of today and be prepared for the possibilities of tomorrow.
Every company in the world still struggles with how to take all their siloed data and turn it into insight, quickly. The Snowflake Data Cloud enables organizations, in every industry, to democratize their data and become data-driven. This talk will introduce you to The Data Cloud, how it works, and the problems it solves for real companies across the globe and across industries. Kent will also update you on recent governance innovations such as dynamic data masking, tagging, and row access policies that will help you build a robust and secure analytics platform.
About our Keynote Speaker
= = = = = = = = = = = = = = =
Kent Graziano is the Chief Technical Evangelist for Snowflake and an award-winning author, speaker, and thought leader. He is an Oracle ACE Director (Alumni), Knight of the OakTable Network, a certified Data Vault Master and Data Vault 2.0 Practitioner (CDVP2), and an expert solution architect with over 35 years of experience, including more than 25 years of designing advanced data and analytics architectures (in multiple industries). He is an internationally recognized expert in cloud and agile data design. Mr. Graziano has developed and led many successful software and data analytics implementation teams, including multiple agile DW/BI teams. He has written numerous articles, authored 3 Kindle books, co-authored 4 other books (including the 1st Edition of The Data Model Resource Book), and has given hundreds of presentations around the world.
Wonder what this data mesh stuff is all about? What are the principles of data mesh? Can you or should you consider data mesh as the approach for your analytics platform? And most important - how can Snowflake help?
Given in Montreal on 14-Dec-2021
Big Data BlackOut: Are Utilities Powering Up Their Data Analytics? - Capgemini
Analytics is seeing greater recognition amongst utility executives. Our research showed that 80% of utilities consider big data analytics a source of new business opportunities and 75% see it as crucial for future success. Big Data indeed offers an exciting opportunity to transform utility operational effectiveness, while at the same time dealing with the historical problem of low customer satisfaction. Take operational efficiency alone. The annual cost of weather-related power outages to the U.S. economy is estimated at between $18 billion and $33 billion. Organizations can use Big Data analytics to detect operational challenges and prevent outages, substantially reducing costs. Big Data also affords utilities opportunities to invent new business models through the data generated by smart infrastructure.
The analytics opportunity for utilities is clear, but there continues to be a lack of real impetus and value delivery. Only 20% have already implemented big data analytics initiatives. What is putting the brakes on utilities?
In this paper, we highlight the big data opportunities that utilities can leverage and identify the challenges that are currently holding them back. We conclude the paper with concrete recommendations on how to ensure analytics drive business value.
This presentation examines how AMI data, the collection of this data, and the creation of tools to use this data have dramatically changed, and are continuing to change, metering operations. We will look at some of the challenges we are facing as we learn how to do business most effectively with this information and these tools.
IoT / Data Press Review of 26/03/2017 - Romain Bochet
Contents:
- From the Edge To the Enterprise
- The Internet of Energy: Smart Sockets
- Google's big data calculates US rooftop solar potential
- Energy management: Oracle Utilities launches smart grid and IoT device management solution in the cloud
- Are vehicles the mobile sensor beds of the future?
This presentation examines how AMI data, the collection of this data, and the creation of tools to use this data have dramatically changed, and are continuing to change, metering operations. We will look at some of the challenges we are facing as we learn how to do business most effectively with this information and these tools. 05/09/19
Pouring the Foundation: Data Management in the Energy Industry - DataWorks Summit
At CenterPoint Energy, both structured and unstructured data are continuing to grow at a rapid pace. This growth presents many opportunities to deliver business value and many challenges to control costs. To maximize the value of this data while controlling costs, CenterPoint Energy created a data lake using SAP HANA and Hadoop. During this presentation, CenterPoint will discuss their journey of moving smart meter data to Hadoop, how Hadoop is allowing CenterPoint to derive value from big data and their future use case road map.
Drilling Into Operational Metrics in the Energy Industry - SAP Analytics
http://spr.ly/AA_OG - The oil and gas industry has long used analytics to determine where to drill for reserves that are generally 5,000 to 35,000 feet below the earth’s surface. Relatively new, however, is that the “digital oilfield” is now expanding the use of analytics to gain fresh insight into the performance of those wells once they are producing. Using in-memory computing and analytics, Continental Resources makes better decisions to keep its oil wells pumping more efficiently.
-Bloomberg Businessweek Research
The transition from traditional maintenance practices to a reliance on data, using analytics to alter those practices, has the potential to add value while creating new rewards and challenges for the utility world.
IoT / Data Press Review of 19/02/2017 - Romain Bochet
Hello,
Here is the IoT/data/energy press review of 19 February 2017. Contents:
- Why IoT is key to industrial energy efficiency
- How Technology Influences the Future of Energy Management
- Disruption at the Edge: IoT Transforming Energy Grids
- Arkados Partners with SparkFund to Offer Lighting-as-a-Service To Commercial and Industrial Customers
- IoT And AI: Improving Customer Satisfaction
- Flutura raises US$7.5M Series A to provide industrial IoT to engineering, energy firms
- IoT Tech Expo: Convergence of Tech, Business Model Innovation, Collaboration and Smart Cities
- Sigfox to Transform Global Asset Tracking with Spot’it, a Low Cost GPS-free Geolocation Service
Evolving Distribution Grid article in the Electric Perspectives magazine Jan-Feb 2015 edition. The article discusses emerging business opportunities for a utility Distribution Services Provider.
#askSAP Analytics Innovations Community Call – Bridging the Information Gap - SAP Analytics
Imagine your business taking advantage of intelligent decision-making across all aspects of the business, in real time and effortlessly, consistently and reliably, day by day. Now, consider doing this without the need to acquire, set up, or maintain any additional hardware.
In our second #askSAP Community Call of the year, we will take it up a notch. This time around we'll bring together experts you haven't heard from before to bring this vision to life. See in action how one holistic data management framework can enable confident decision making, which includes:
Simplifying access to your data by providing live real-time access to all data sources in a distributed landscape
Providing a single point of access for security and anonymization, while reducing data duplication
Giving you intelligent data that you can rely on by taking advantage of advanced analytics tools to deliver intelligent solutions
Reducing TCO by minimizing management costs and enabling self-service for business users
In addition, Puget Sound Energy, an SAP customer and public-sector utility in Washington State, will share how they're reimagining industry processes for increased employee engagement and organizational success. Learn how they combine the power of SAP SuccessFactors solutions and Labor Relations Software by Sodales, an application extension built on SAP Cloud Platform, to enhance engagement with the unionized workforce. This successful HR transformation is expected to reduce operational expenses and positively impact organizational integrity.
Optimize Business Intelligence Efforts With Embedded, Application-Driven Anal... - SAP Analytics
Want to discover, procure, and provision business intelligence data more effectively? Forrester Consulting has a suggestion: embedded, application-driven analytics.
#askSAP Analytics Innovations Community Call: SAP Analytics 2019 Strategy and... - SAP Analytics
Whether it's growing a loyal customer base, pressure from innovative competitors, or retaining top talent, there is no shortage of strategic decisions for businesses to make today. SAP is tackling these challenges head-on by delivering the Intelligent Enterprise, and SAP Analytics is the glue that connects all the pieces together – from data right through to intelligence. Whether through new cloud technologies, existing on-premise investments, or every possible hybrid combination, SAP Analytics enables businesses to make faster, more confident decisions. In this webinar you will learn about what's ahead in 2019 for SAP Analytics. Visit www.sap.com/asksap to view the recording.
#AskSAP Analytics Innovations Community Call: SAP Analytics Fall 2018 Innovat... - SAP Analytics
Analytics are the window into an Intelligent Enterprise and the landscapes are always changing, evolving, and improving. We’ve gone from ‘governed BI’ to ‘modern and agile BI with data discovery’, to the acceptance of cloud analytics and now emergent ‘smart BI’ using AI and machine learning to provide natural language interfaces. Innovation is moving both technology and the benefits it can offer organizations rapidly forward - on premise, in the cloud, and hybrid.
Discover how we’re bringing analytics together, making analytics easier, and delivering modern innovation in the cloud. Visit www.sap.com/asksap
#askSAP Analytics Innovations Community Call: Become an Intelligent Enterpris... - SAP Analytics
New digital technologies allow companies to make core business processes more intelligent. Companies can automate business processes and augment employees' capabilities to let them do more than they could before, with more focus on strategic tasks based on real-time and forward-looking insights. This shift is accelerating in every industry and compelling every organization to transform into an intelligent enterprise to nimbly adapt to new market realities. SAP S/4HANA is the digital core platform for becoming such an intelligent enterprise. This digital transformation is driving the need for more analytics, and a big opportunity arising from this transformation is to give all employees, partners, and customers immediate and forward-looking insights into what is going on and what is going to happen next.
#askSAP Analytics Innovations Community Call: SAP 2018 strategy and Roadmap f... - SAP Analytics
Today companies need modern analytics capabilities that work together to analyze data wherever it resides, for enterprises of all sizes and across every industry. To help them better understand that data and to help simplify and transform their business, there are different technology options available - on premise, cloud, and hybrid. Hear SAP product experts share SAP's roadmap and vision for business intelligence and data discovery covering these technology options.
#asksap Analytics Innovations Community Call: SAP BW/4HANA - the Big Data War... - SAP Analytics
Learn how SAP BW/4HANA delivers big data warehouse solutions that meet your current and future business analytics needs in a rapidly changing data landscape and increase your organization’s success in the next generation of business.
This presentation by Timo Elliott given at the 2017 SAP Insider event in Amsterdam provides an overview of SAP Leonardo, a "digital innovation system."
#askSAP Analytics Innovations Community Call: Delivering the Intelligent Ente... - SAP Analytics
www.sap.com/asksap - Digitization is turning every company into a software-driven organization that intelligently connects people, things, and businesses together. At the center of digitization is massive data that can be harnessed into unprecedented insights.
Learn more about SAP’s Machine Learning strategy, and portfolio of machine learning driven applications and intelligent services.
Data & Analytics: The Competitive Edge for Small and Midsize Businesses - SAP Analytics
To further explore how small and medium businesses are using analytics for decision making, download the IDC paper, “Analytics for SMBs: Sharpen Operations, Capitalize on Business Opportunities,” sponsored by SAP: http://bit.ly/2qQkWf6.
Data Analytics Help Drive Digital Transformation Infographic - SAP Analytics
Digital transformation has become a board-level initiative at many companies. A key part of this investment is being focused on data and analytics.
For more on how data and analytics is being used in digital transformation, download the IDC paper, “The Value of Data & Analytics in Digital Transformation”, sponsored by SAP. Download here: http://bit.ly/2siS5QY
#askSAP: Journey to the Cloud: SAP Strategy and Roadmap for Cloud and Hybrid ... - SAP Analytics
www.sap.com/businessobjects-cloud. The momentum of customers moving to the SAP BusinessObjects Cloud is rapidly accelerating – and so are the innovations being introduced by SAP. New features and functionality for cloud and on premise with SAP BusinessObjects Enterprise offer hybrid use cases that organizations can take advantage of as they embark on their journey to the cloud. View the webinar replay at http://webinars.sap.com/asksap-webinar-series/en/home#section_3.
Unify Line of Business Data with SAP Digital Boardroom - SAP Analytics
In this sample use case, you can see the power and potential of unifying your business data using SAP Digital Boardroom to gain meaningful insight. Learn more at http://www.sap.com/digital-boardroom
With SAP Digital Boardroom, you’re able to:
• Connect to Cloud data
• Leverage SAP business networks such as Hybris, Fieldglass, Ariba, and SuccessFactors
• Make faster decisions on live data
• Gain actionable insights
• Collaborate seamlessly in real-time
#asksap Analytics Innovations Community Call - Take Action in 2017 with Innov... - SAP Analytics
Learn more about the highly anticipated 2017 release of SAP BusinessObjects Lumira 2.0, the expanded modeling and scalable machine learning capabilities of SAP BusinessObjects Predictive Analytics 3.1, and the extended availability of SAP BusinessObjects Roambi.
Companies using SAP BusinessObjects Analytics solutions have achieved significant improvements and savings. For a deeper look, read the October 2016 Forrester Consulting study, “The Total Economic Impact™ of SAP BusinessObjects Analytics: Cost Savings and Business Benefits,” commissioned by SAP.
For more, please visit: http://bit.ly/2g00ntG
#askSAP EPM Innovations Community Call: How Planning Can Ignite Digital Trans... - SAP Analytics
SAP.com/EPM - With various planning solutions available at SAP, including SAP BusinessObjects Planning and Consolidation, version for SAP NetWeaver, and SAP BusinessObjects Cloud for planning, get a better understanding of how using one or both of these tools can help drive Digital Transformation in the Enterprise.
Learn what the latest releases of these SAP solutions offer and how they can work together to bring further insights to the Finance department.
#askSAP Analytics Innovations Community Call: Reimagine Analytics for the Dig... - SAP Analytics
http://sap.com/predictive - New digital technologies allow companies to reimagine business models, rise to the challenge of disruptive market entrants, and squeeze more productivity from fewer resources. Companies embracing digital transformation by investing in advanced analytics are winning. They create more revenue, greater market valuations, and growing profitability.
SAP BusinessObjects Predictive Analytics is a game changer in the predictive space by helping you create, deploy, and maintain thousands of predictive models that anticipate future outcomes and guide better, more profitable decision-making across your digital enterprise.
#askSAP Analytics Innovations Community Call: Innovation in Core BI Solutions... - SAP Analytics
Where is SAP BusinessObjects BI headed, and how will your organization benefit?
See how SAP is delivering on its commitment to innovate its business intelligence (BI) applications. With SAP BusinessObjects BI 4.2 now available, this is a perfect time to explore what the new release has to offer and hear about announcements made at SAPPHIRE NOW 2016.
#askSAP GRC Innovations Community Call: Cybersecurity Risk and Governance - SAP Analytics
How is your organization tackling ever increasing cybersecurity threats? Do you have the proper structure and methods in place to effectively mitigate this constantly evolving risk?
Get a sneak preview on how SAP is helping companies embrace the age of digital transformation while rethinking their security strategy, especially as it relates to protecting business applications and improving overarching risk and governance programs.
#askSAP EPM Innovations Community Call: Transform Finance into Instant Insight - SAP Analytics
http://bit.ly/askSAP_TransformFinance - What will instant insight bring to the finance picture? Faster and more insightful budgeting and forecasting. The ability to analyze changes to actuals vs. budgets – with no delays. A common financial platform for budgets, forecasts and actuals. And the list goes on…
You won’t want to miss this #askSAP session to get a sneak preview on how SAP is helping the finance sector embrace the digital age with SAP Business Planning and Consolidation, optimized for S/4HANA Finance.
JMeter webinar - integration with InfluxDB and Grafana - RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring of JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
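Between JMeter's Backend Listener (which pushes metrics into InfluxDB) and the Grafana dashboards, it can be handy to query the stored time series directly. The sketch below is a hedged illustration that assumes InfluxDB 1.x, the influxdb Python client, and JMeter's default "jmeter" database and measurement names, all of which may be configured differently in your setup.

```python
# Hedged sketch: query JMeter metrics stored in InfluxDB 1.x by the Backend
# Listener. Assumes the default "jmeter" database/measurement names, which may
# differ in your setup; requires the "influxdb" Python package.
from influxdb import InfluxDBClient

client = InfluxDBClient(host="localhost", port=8086, database="jmeter")

# Average and maximum response time per transaction over the last 5 minutes.
query = (
    'SELECT MEAN("avg") AS mean_ms, MAX("max") AS max_ms '
    'FROM "jmeter" WHERE time > now() - 5m GROUP BY "transaction"'
)
result = client.query(query)

for (measurement, tags), points in result.items():
    for point in points:
        print(tags.get("transaction"), point)
```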
Accelerate your Kubernetes clusters with Varnish Caching - Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti... - Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... - DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
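For a flavour of what the notebook covers, here is a minimal sketch using the pypowsybl Python binding. It assumes `pip install pypowsybl` and uses the bundled IEEE 14-bus test network in place of a real grid model.

```python
# Minimal sketch: load an example network, run an AC power flow,
# and inspect the computed bus voltages with pypowsybl.
import pypowsybl.network as pn
import pypowsybl.loadflow as lf

network = pn.create_ieee14()      # small bundled test network
results = lf.run_ac(network)      # AC load flow
print(results[0].status)          # convergence status of the main component

# Bus voltage magnitudes and angles as a pandas DataFrame
print(network.get_buses()[["v_mag", "v_angle"]])
```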
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open source: how these areas are likely to mature and develop over the short and long term, and how organisations can position themselves to adapt and thrive.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
- UI automation introduction
- UI automation sample
- Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
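To make this concrete, here is a small, hypothetical Python illustration (not taken from the talk) of one calisthenics rule, "wrap all primitives and strings", applied to a tactical DDD value object; the Money type and its invariants are invented for the example.

```python
# Illustrative sketch: a self-validating, immutable value object instead of
# passing bare ints and strings around the domain model.
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    """Value object: immutable, validated on construction, compared by value."""
    amount_cents: int
    currency: str

    def __post_init__(self):
        if self.amount_cents < 0:
            raise ValueError("amount must not be negative")
        if len(self.currency) != 3:
            raise ValueError("currency must be an ISO 4217 code")

    def add(self, other: "Money") -> "Money":
        if other.currency != self.currency:
            raise ValueError("cannot add amounts in different currencies")
        return Money(self.amount_cents + other.amount_cents, self.currency)

# Usage: the domain speaks in Money, not in raw integers.
price = Money(1999, "EUR").add(Money(500, "EUR"))
```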
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Generating a custom Ruby SDK for your web service or Rails API using Smithyg2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology pushes into IT, I found myself wondering, as an “infrastructure container Kubernetes guy”: how does this fancy AI technology get managed from an infrastructure and operations point of view? Is it possible to apply our lovely cloud-native principles here as well? And what benefits could the two technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss which cloud or on-premise strategy may be needed to run AI on our own infrastructure from an enterprise perspective. I will give an overview of infrastructure requirements and technologies, and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I already have working in practice.
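As one concrete illustration of treating an AI workload like any other cloud-native workload, the sketch below uses the official Kubernetes Python client to request a GPU-backed pod. It is a minimal sketch only: the image, names and namespace are hypothetical, and it assumes a cluster that exposes GPUs through the NVIDIA device plugin so that `nvidia.com/gpu` is a schedulable resource.

```python
# Minimal sketch: schedule a GPU-backed inference pod with the Kubernetes
# Python client. Image, names and namespace are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside the cluster
core = client.CoreV1Api()

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="llm-inference", labels={"app": "llm"}),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="server",
                image="registry.example.com/llm-server:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    requests={"cpu": "4", "memory": "16Gi"},
                    limits={"nvidia.com/gpu": "1"},  # one GPU per replica
                ),
            )
        ],
    ),
)

core.create_namespaced_pod(namespace="default", body=pod)
```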
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips and strategies for successful relationship building that leads to closing the deal.
2. RESEARCH UPDATE | Real-Time Analytics Case Studies

CenterPoint Energy’s Big Data Mandate
- Improve employee and customer safety
- Identify issues faster—often before customers realize them
- Use workforce time and resources more efficiently
- Raise customer satisfaction, conserve energy and eliminate “information disconnects”

CenterPoint’s IT Solutions
- Implemented a smart grid to gather information in real time
- Adopted big data analytics and visualization technology to troubleshoot problems faster and improve customer service
- Created the Customer Vision Platform to segment customers better and provide more personalized customer service

“The value of data mining is finding out what you don’t intuitively know.”
— DR. STEVEN PRATT, CENTERPOINT ENERGY

CenterPoint uses in-memory computing, which can aggregate and analyze huge volumes of data from many different sources much more quickly than traditional technology. This computing power, together with visualization and business intelligence software, is necessary for analyzing many complex problems, such as addressing the deceptive nature of power outages. If an area loses power, the assumption might be that the issue is a circuit breaker or transformer. However, a faulty transformer could hide a broader problem.

“When you are looking at this situation manually, without smart devices, you can’t quickly diagnose when a problem is indicative of a bigger issue,” Pratt says. “The value of data mining is finding out what you don’t intuitively know.”

Big data and analytics enable CenterPoint to anticipate problems. By immediately gathering and analyzing power usage information, ultimately the utility may shed electricity load when appropriate, without impacting critical facilities. The smart grid can also alert the utility to small fluctuations that go unnoticed in a person’s house but signal a power loss may be imminent. “Before, we didn’t know about a problem until a customer called,” he says.

Along those lines, the utility is using external information such as weather forecasts to augment its big data collection. This weather data is needed to predict future events; for example, determining the likely magnitude of an outage if a storm occurs based on historical data, the age of the equipment in the area and other factors. “That allows us to have the right resources in place even before the storm comes through,” Pratt says. As a result, the utility is able to ensure service is restored as quickly as possible while using resources more efficiently.

Better Service Through Analytics

As the use of smart grid technology grows, Pratt says the industry will need to focus more on how analytics can improve customer service. “There has been a lot of hype around big data, so you need to identify use cases for your company,” he says. “You may need a different set of technology, applications and capabilities than the company right next door.”

For example, in the past, CenterPoint gauged its level of service using customer surveys, but the response rate for the surveys was so low that it was difficult to get a clear picture. Now, the utility is turning to new tools, like customer sentiment analysis, which enables it to analyze comments from customers on the Web. For example, the company’s Customer Vision Platform—an integrated services application now in development—will pull data from different areas of the business and enable analysis of the customer experience from multiple angles. Ultimately, the platform will streamline customer interactions across all channels in the company’s natural gas distribution, electric transmission and distribution, and Home Service Plus businesses.2

In time, CenterPoint hopes to leverage the Customer Vision Platform to segment customers and market to them in a more personalized way based on their actual needs. “If we can provide specific energy services that will truly make our customers’ lives easier, we know we are using the data to serve them better,” Pratt says. “And that’s what big data and analytics is all about.” •

2. “CenterPoint Energy Orders New Customer Vision Platform.” Transmission & Distribution World, April 10, 2013. http://goo.gl/FqbJ48

Joe Mullich is a freelance business and technology writer based in Sherman Oaks, Calif.
This research project was funded by a grant from SAP.
3. SPONSOR’S STATEMENT

Big Data Analytics Delivers Real-Time Insights for Energy, Process Industries

SAP’s Recipe for Success
- SAP HANA platform handles your big data challenges and delivers real-time insights
- SAP BusinessObjects Business Intelligence suite enables every individual in the organization to make fact-based decisions via easily accessible and relevant information whenever and wherever needed
- SAP Lumira is an agile data visualization solution that helps decision-makers easily discover unique insights
- SAP Predictive Analysis empowers business users with predictive and advanced analytics
- SAP Solutions for Social Media is for analyzing social data and improving business customer experience
- SAP Services help you differentiate your company and make more profitable business decisions

New research, qualitative interviews and the report from Bloomberg Businessweek Research Services clearly show that energy and process industries struggle to leverage big data and analytics today to improve profitability, mitigate risk and increase expectations for returns on investments in these technologies.

Many companies face inflexible legacy systems, a lack of enterprise-wide analytic tools and large volumes of a wide variety of data (structured, semi-structured and unstructured). This creates what is known as the big data problem. SAP’s energy and process industry team discusses how technology innovations and real-time insights can help energy and process industry companies overcome these challenges and drive business results—including better ways to reach customers, better asset maintenance and better risk mitigation.

How do technology innovations solve the challenges faced by most energy and process companies today?

A fast analytical platform that can handle large amounts of transactional and process data quickly and effectively in real time is the foundation solution. The in-memory computing platform called SAP HANA® is built to store and analyze big data from multiple systems. It helps companies achieve a more comprehensive—and instant—view of their assets, customers and business performance. The solution from SAP consists of the SAP BusinessObjects Business Intelligence platform, agile data visualization by SAP Lumira and the ability to leverage sophisticated models and algorithms to predict customer behavior with SAP Predictive Analysis.

The most immediate value SAP HANA and analytics can provide utility companies—independent systems operators (ISOs) that must handle large quantities of data every day—is to keep the power markets functioning, detect fraud and theft, and identify unbilled accounts. Energy and process companies are also starting to realize savings by using analytics to prioritize capital investments in equipment and to integrate sales and operations planning that support real-time what-if simulations and social collaboration. In addition, by analyzing data from smart meters, utilities can gain a better understanding of customers’ consumption behavior.

What can energy and process companies do to derive real-time insights?

To start with, a company needs to assess its current status. For example, are you able to maximize return on capital invested? Are you able to keep up with dynamic markets, customer needs and regulations? Are you able to manage your assets efficiently? Are you able to deliver the right insight to your internal stakeholders to support decision-making and customer service? Second, define your vision for the future. For example, how will your customer needs change? How will your company’s offerings need to change? How will you deliver products and services to your future customers? Finally, consider taking a gradual approach to your transformation.

Select a line of business or a region as a test bed and implement an integrated, real-time reporting and analytics solution. Using SAP HANA, energy and process companies will quickly discover how the sheer speed and flexibility of the platform provide immediate value to business users. Then adopt an enterprise-wide analytics strategy that includes big data. The SAP BusinessObjects Business Intelligence suite will support that strategy. It is a simple, one-stop solution that supports big data, real-time insights, agile visualization and predictive analytics.

For more information please visit this Web site: www.sap.com/appliedanalytics/