Speaker: Vince Leat, Industry Consulting Executive, Teradata
Large enterprises need a partner who has done it before. Teradata has successfully implemented AI across multiple industries, proving the technology as well as producing material business outcomes. Teradata continues to channel IP from successful, field-based AI client engagements into accelerators that lead to faster time to value and reduce the risk of custom AI initiatives. Hear how Teradata helps customers build opportunities derived from AI.
Powering Asurion's Connected Home Platform with Spark Structured Streaming, D...Databricks
Asurion’s Connected Home simplifies the complexities of operation, setup, and management of connected devices and services.
Leveraging the latest advancements in Machine Learning, Big Data, and real-time processing capabilities, we have built a platform capable of keeping the connected world connected and continually learning.
Solving this technical challenge requires running multiple continuous applications capable of transactional data storage, multi-level aggregations, joins, and execution of ML models, with data privacy and security at their core.
Asurion Connected Home platform achieves this goal by using Spark Structured Streaming, Delta Lake and MLflow on Databricks.
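The production pipeline itself is not shown in the abstract; as a flavor of the workload, here is a minimal pure-Python sketch of the kind of stateful tumbling-window aggregation Spark Structured Streaming performs over device events (the event fields and the 60-second window are invented for illustration):

```python
from collections import defaultdict

# Minimal sketch of tumbling-window aggregation, the kind of stateful
# computation Spark Structured Streaming performs over device events.
# Event fields and the 60-second window are illustrative assumptions.

WINDOW_SECONDS = 60

def window_start(ts: int) -> int:
    """Align an event timestamp to the start of its tumbling window."""
    return ts - (ts % WINDOW_SECONDS)

def aggregate(events):
    """Count events per (device_id, window) pair, as a streaming
    engine would do incrementally with managed state."""
    counts = defaultdict(int)
    for ev in events:
        key = (ev["device_id"], window_start(ev["ts"]))
        counts[key] += 1
    return dict(counts)

events = [
    {"device_id": "router-1", "ts": 5},
    {"device_id": "router-1", "ts": 42},
    {"device_id": "cam-7", "ts": 70},
]
print(aggregate(events))
# {('router-1', 0): 2, ('cam-7', 60): 1}
```

In a real deployment the engine also handles late data and fault-tolerant state; this sketch only shows the grouping logic.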
Startup pitch presented by co-founder and CEO Jaco Els. Cubitic offers a predictive analytics platform that allows developers to build custom solutions for analytics and visualisation on top of a machine learning engine.
Data Natives meets DataRobot | "Build and deploy an anti-money laundering mo...Dataconomy Media
Compliance departments within banks and other financial institutions are turning to machine learning to improve their Anti-Money Laundering (AML) compliance activities. Today, the systems that aim to detect potentially suspicious activity are commonly rule-based and suffer from ultra-high false-positive rates. DataRobot will discuss how its Automated Machine Learning platform was successfully used in a real case to reduce false positives and enhance AML activities.
The Fog or Edge Computing model complements Cloud Computing with small, typically sensor-enabled and IoT-connected devices that process distributed data at its source. As this model matures, we see an uptake of a 3-tier architecture with Intelligent Gateways that aggregate sensor input before communicating with data centers or the cloud. Two forces will drive the practice of distributing Intelligence (Understanding/Reasoning/Learning) to the Gateway. The first is the presence of the Gateway itself, which enables a standards-based approach to distributing intelligence and moving it closer to the edge. The second is the trend toward simplifying system requirements by processing training data or model validation with big data prior to deployment, and using small-footprint devices for operational systems.
This webinar will present an overview of the relevant technologies and trends. Participants will learn about the state of the art today, and how to identify apps in their own environment that would be good candidates for Intelligent Edge solutions.
Predictive Analytics for IoT Network Capacity Planning: Spark Summit East tal...Spark Summit
The Internet of Things (IoT) is a growing network supporting a wide variety of service types with specific network requirements that differ from traditional human-type communications. This has led to the emergence of dedicated IoT network standards. To optimize investments in dedicated network infrastructures, we’re investigating a dynamic approach to network capacity planning that accommodates multiple IoT traffic types over a cellular network while maintaining their specific requirements.
We studied models of IoT traffic and used machine learning in prediction and scheduling of future workload under heterogeneous and variable traffic conditions when human-type and machine-type communications are mixed.
An integrated analytics framework including Hadoop and Spark was deployed for experimentation, and a number of capacity planning use cases were implemented to verify the accuracy of the method.
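The talk's models are not reproduced here; as a minimal sketch of one common forecasting step, the snippet below predicts the next interval's traffic with an ordinary least-squares trend fit (the traffic series is invented sample data):

```python
# Forecast the next period's IoT traffic volume with a least-squares
# linear trend, a simple stand-in for the ML prediction step in
# capacity planning. The traffic series below is invented sample data.

def linear_forecast(series):
    """Fit y = a + b*x over x = 0..n-1 and predict the value at x = n."""
    n = len(series)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(series) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a + b * n

# Messages per minute observed over five planning intervals.
traffic = [100, 110, 120, 130, 140]
print(linear_forecast(traffic))  # 150.0
```

A production system would use richer models and features (daily cycles, traffic-type mix), but the plan-ahead shape is the same: fit on observed load, predict the next window, provision for it.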
The edge computing market today includes consumer apps and devices, and the industrial sector, where increasingly powerful CPUs drive everything from wind turbines to autonomous vehicles, robots, drones and equipment. The device market is growing explosively.
These devices gather a wealth of data from a broad array of sensors – and have the potential to optimize efficiency, safety and performance, and revolutionize productivity and user experiences. But to deliver these benefits they need to become truly smart, performing analysis, training and inference on high volumes of sensor data on-the-fly.
There is an urgent need for software that simplifies and automates data analysis and inference at the edge, helping devices and systems learn from and make predictions about their environment: cameras that recognize and track their targets; self-driving cars that choose the least congested routes using real-time predictions for intersections ahead; and drones that dynamically swarm, find their targets and gather intelligence without human oversight.
These examples require each device to make decisions based on a real-time analysis of its own sensor data fused with the analysis and predictions from other systems: Drones in a swarm need to collaborate or they will collide; they must gossip their insights to each other to enable the swarm to perform effectively. Today, the software to enable each of these complex scenarios must be developed from scratch, starting with raw data feeds and network protocols. To unlock the potential of an edge environment rich in sensors and power-efficient computing platforms developers need a simple way to get from vast amounts of raw data to insights and predictions.
What's needed is a new Architecture for the intelligent edge – one that consumes raw data from devices at the edge, and automatically creates a “digital twin” for each real-world system from its data. Digital twins statefully process their own data at the edge, analyzing, learning and predicting in real-time. Digital twins can find anomalies or correlations in their own data, and self-train powerful neural network models that enable them to predict their future performance, then share semantically enriched insights with other digital twins to solve system problems. The architecture helps application developers by dynamically creating digital twins that learn from their own data – automatically building a model of the real world that is always up to date, executes in real-time, and makes accurate predictions of the behavior of complex systems.
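None of this architecture's code is given in the text; the following is a toy sketch of the digital-twin idea, a stateful object that learns from its own device's readings and produces shareable insights (the device id, the 3-sigma anomaly rule and the insight format are illustrative assumptions, not the architecture described above):

```python
import statistics

class DigitalTwin:
    """Toy digital twin: statefully ingests its device's readings at
    the edge and flags anomalies by z-score against its own history.
    The 3-sigma threshold and insight format are illustrative."""

    def __init__(self, device_id, threshold=3.0):
        self.device_id = device_id
        self.threshold = threshold
        self.readings = []
        self.insights = []  # enriched insights a twin would share with peers

    def ingest(self, value):
        history = self.readings[:]
        self.readings.append(value)
        if len(history) >= 3:
            mean = statistics.fmean(history)
            stdev = statistics.pstdev(history)
            if stdev and abs(value - mean) / stdev > self.threshold:
                self.insights.append((self.device_id, "anomaly", value))

twin = DigitalTwin("turbine-42")
for v in [10, 11, 10, 9, 10, 11, 10, 50]:
    twin.ingest(v)
print(twin.insights)  # [('turbine-42', 'anomaly', 50)]
```

The real architecture replaces the z-score rule with self-trained neural models and adds gossiping between twins; the point of the sketch is only the stateful, per-device learning loop.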
The Beauty of (Big) Data Privacy EngineeringDatabricks
Privacy engineering is an emerging discipline within the software and data engineering domains aiming to provide methodologies, tools, and techniques such that the engineered systems provide acceptable levels of privacy. In this talk, I will present our recent work on anonymization and privacy preserving analytics on large scale geo location datasets. In particular, the focus is on how to scale anonymization and geospatial analytics workloads with Spark, maximizing the performance by combining multi-dimensional spatial indexing with Spark in-memory computations.
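The Spark implementation is not included in the abstract; here is a pure-Python sketch of the core anonymization idea, coarsening coordinates to grid cells and suppressing cells with fewer than k points (the cell size, k, and the sample coordinates are assumptions):

```python
# Sketch of location anonymization: snap each point to a coarse grid
# cell, then suppress cells containing fewer than k points so that no
# small group of users remains identifiable. Cell size, k and the
# coordinates are illustrative; the talk scales this approach with
# Spark and multi-dimensional spatial indexing.

from collections import Counter

CELL_DEG = 0.01   # grid cell size in degrees (~1 km at mid latitudes)
K = 2             # minimum points a published cell must contain

def to_cell(lat, lon):
    return (int(lat / CELL_DEG), int(lon / CELL_DEG))

def anonymize(points):
    cells = Counter(to_cell(lat, lon) for lat, lon in points)
    return {cell: n for cell, n in cells.items() if n >= K}

points = [(52.5201, 13.4049), (52.5203, 13.4041), (48.1371, 11.5754)]
print(anonymize(points))  # {(5252, 1340): 2}
```

The grid snap is a stand-in for the spatial index: it turns a nearest-neighbor privacy question into a cheap group-by, which is what makes the workload parallelize well.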
Showing reports of data tells only part of the whole story. To make correct decisions, additional information is needed, but most of it, especially documents and information outside databases, is not captured by BI reports. With the portal we visualize the IoT data with Power BI and add value by presenting reports, documents and further information in one place. Users get a real "single point of information" on the topic. An example with a demo will be shown.
Big Data Experience Sharing: Building Collaborative Data Analytics Platform -...Amazon Web Services
Speaker: Kenny Kwan, Head of Software and Cloud Engineering, Gibson Innovations Limited
Hear from Gibson Innovations' engineering team on how they build, deploy and manage their Data Analytics Platforms and IoT Hub on AWS and derive business insights.
Data Natives Frankfurt v 11.0 | "Competitive advantages with knowledge graphs...Dataconomy Media
The challenges of increasing complexity of organizations, companies and projects are obvious and omnipresent. Everywhere there are connections and dependencies that are often not adequately managed or not considered at all because of a lack of technology or expertise to uncover and leverage the relationships in data and information. In his presentation, Axel Morgner talks about graph technology and knowledge graphs as indispensable building blocks for successful companies.
Cheryl Wiebe - Advanced Analytics in the Industrial WorldRehgan Avon
2018 Women in Analytics Conference
https://www.womeninanalytics.org/
Cheryl will talk about her consulting practice in Industrial Solutions: analytic solutions for industrial IoT-enabled businesses, including the connected factory, connected supply chain, smart mobility and connected assets. Her path to this practice has bounced between hands-on systems development, IT strategy, business process reengineering, supply chain analytics, manufacturing quality analytics, and now Industrial IoT analytics. She spent time working in industry as a developer and as a management consultant, and started and sold a company, before settling in to pursue this topic as a career analytics consultant. Cheryl will shed light on what's happening in industrial companies struggling to make the transition to digital, what that means, and what barriers they face. She'll touch on how and where artificial intelligence, deep learning and machine learning technologies are being used most effectively in industrial companies, and the unique challenges those companies face. Reflecting on what's changed over the years and her journey witnessing it, Cheryl will pose what she considers important ideas for women (and men) pursuing an analytics career successfully and meaningfully.
Solution Architecture Patterns for Digital TransformationWSO2
Digital transformation is a key enabler both for existing enterprises transforming themselves to compete in the modern marketplace and for newer startups trying to break into a business domain. Involving internal and external stakeholders by exposing existing internal services as managed APIs, while carefully governing your data, is a core step in this process. Being able to adopt a lean DevOps process is a bonus.
This webinar serves as a primer for your digital transformation journey. It will discuss:
API management as a coexisting solution within an enterprise
How that API management ties into the management and governance of data
Concepts of lean DevOps and containerization
In this deck, Trish Damkroger from Intel describes Technology Trends Driving HPC.
"HPC is now critical for more use cases, complex workloads, and data-intensive computing than ever before. From AI and visualization to simulation and modeling, Intel provides the advantage of one platform for any workload by integrating world-class compute with powerful fabric, memory, storage, and acceleration. You can move your research and innovation forward faster to solve the world’s most complex challenges."
Watch the video: https://insidehpc.com/2018/08/techtrends/
Learn more: http://intel.com
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
This session shows how you can use Microsoft Azure to build a highly scalable solution for event processing. You can use this approach for classic IoT scenarios or, for example, to capture telemetry data from a widely distributed application, where each application instance sends data to Azure's Event Hub. In this session you will get insights not only into Event Hub but also into Stream Analytics, which is used to aggregate the millions of events coming from Event Hub using a SQL-like syntax. From Stream Analytics the data can be pushed into a database or, for example, into a live dashboard in Microsoft's Power BI.
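The session's actual queries are not available here; the snippet below is a pure-Python model of what a Stream Analytics tumbling-window aggregation computes, with a roughly equivalent SQL-like query shown in the comment (the field names and window size are assumptions):

```python
# Pure-Python model of a Stream Analytics tumbling-window average,
# roughly equivalent to this SQL-like query (fields are illustrative):
#
#   SELECT deviceId, System.Timestamp AS windowEnd, AVG(temperature)
#   FROM eventhub TIMESTAMP BY ts
#   GROUP BY deviceId, TumblingWindow(second, 300)

from collections import defaultdict

WINDOW = 300  # window length in seconds

def tumbling_avg(events):
    buckets = defaultdict(list)
    for ev in events:
        buckets[(ev["deviceId"], ev["ts"] // WINDOW)].append(ev["temperature"])
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

events = [
    {"deviceId": "sensor-a", "ts": 10,  "temperature": 20.0},
    {"deviceId": "sensor-a", "ts": 200, "temperature": 22.0},
    {"deviceId": "sensor-a", "ts": 310, "temperature": 30.0},
]
print(tumbling_avg(events))
# {('sensor-a', 0): 21.0, ('sensor-a', 1): 30.0}
```

In the managed service this aggregation runs continuously over the Event Hub stream and its output sinks directly into a database or a Power BI dashboard.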
Data Natives Munich v 12.0 | "How to be more productive with Autonomous Data ...Dataconomy Media
Every day we are challenged with more data, more use cases, and an ever-increasing demand for analytics. In this talk Bjorn will explain how autonomous data management and machine learning help innovators be more productive, and give examples of how to deliver new data-driven projects with less risk at lower cost.
Ayush Tiwari [PTC] | Unlock IoT Value with PTC’s ThingWorx Platform & InfluxD...InfluxData
PTC enables global manufacturers to realize double-digit impact with software solutions that help them accelerate product and service innovation, improve operational efficiency and increase workforce productivity. For developing IIoT solutions, PTC has partnered with InfluxData to manage time series data at scale. With PTC’s ThingWorx platform capabilities to rapidly build IIoT applications, coupled with InfluxData’s leading time series data storage platform, customers are set on a path to success in their digital transformation journey. Learn how selecting the ThingWorx solution and InfluxDB will unlock your IoT value.
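The ThingWorx integration itself is not detailed in the abstract; as a flavor of what writing time series to InfluxDB involves, here is a small sketch that formats a reading in InfluxDB's line protocol (the measurement, tags and values are invented for illustration):

```python
# Format a sensor reading in InfluxDB line protocol:
#   measurement,tag=value field=value timestamp
# The measurement, tag names and values here are invented examples.

def to_line_protocol(measurement, tags, fields, ts_ns):
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line_protocol(
    "machine_temp",
    {"plant": "linz", "asset": "press-12"},
    {"celsius": 81.5},
    1609459200000000000,
)
print(line)
# machine_temp,asset=press-12,plant=linz celsius=81.5 1609459200000000000
```

A platform like ThingWorx batches many such points per write, which is what makes a purpose-built time series store efficient at IIoT scale.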
Data engineering at the interface of art and analytics: the why, what, and ho...Data Con LA
Abstract: Netflix has a growing presence in Hollywood, with technical teams working on everything from high-speed video editing pipelines to machine learning methods for categorizing films. Data is foundational across these efforts, and in this talk Josh will take a tour through why we invest so much in data about content, what data engineering challenges we tackle, and the style in which we do it.
How Schneider Electric Assures Its Salesforce Lightning Migration with Thousa...ThousandEyes
ThousandEyes webinar from Tuesday, September 17th, 2019, presented by Archana Kesavan, Director of Product Marketing at ThousandEyes, and Anil Sistal, Platform Architect at Schneider Electric, on the topic of Schneider Electric's migration to ThousandEyes Synthetic Monitoring for Salesforce Lightning.
Leveraging the Power of the ServiceNow® Platform with Mainframe and IBM i Sys...Precisely
ServiceNow is a recognized leader transforming the impact, speed and delivery of IT by breaking down silos and providing visibility across the enterprise. Meanwhile, more than 2.5 billion business transactions run on mainframes each day and over 100,000 companies use IBM i technology to run their business. Yet, until recently, these critical systems have been disconnected from the ServiceNow platform – leaving a significant blind spot in the enterprise-wide view of IT infrastructure.
View this webinar on-demand to learn about Syncsort Ironstream, the first product to seamlessly integrate IBM mainframe and IBM i systems into the ServiceNow platform to support IT Operations and Service Management.
Product experts will discuss the value this integration delivers to your business, as well as show how mainframe and IBM i data is used within the ServiceNow platform to deliver high-performance business services.
During this webinar, we cover:
• The benefits – and challenges – of including mainframe and IBM i data in the ServiceNow platform
• How Syncsort Ironstream integrates with ServiceNow Discovery and ServiceNow Event Management
• A demonstration of how mainframe and IBM i data works within ServiceNow to address top ITSM use cases, including change management, incident management and event management
Profit and market value are migrating away from hardware, but few product companies are prepared for and executing the required digital transformation. High-tech companies need to invest in digital growth strategies, reinvigorate business models and create new revenue streams. Find out how to harness disruption to grow your business.
AI now ron tolido, capgemini cwin18_toulouseCapgemini
Now that corporate enthusiasm for Artificial Intelligence is skyrocketing, the challenge for the industry is to deliver on its incredible promise. Although much remains to be worked out around ethics, culture and effects on society, there is no reason to procrastinate: AI can deliver benefits and impact today. Here are three ways to do it.
The Shift Is Here: Artificial Intelligence is at Work for the Construction In...indus.ai
Technology solutions that use Artificial Intelligence (AI) are already reducing costs, managing risk, and increasing safety for the Construction Industry.
This eBook shows industry stakeholders how to take advantage of the benefits of AI – both now and in the future.
Amazon Elasticsearch Service (Amazon ES) is a fully managed service that enables you to deploy, secure, and operate Elasticsearch at scale with zero downtime. Discover best practices to get the most out of search applications.
Hear highlights of how the Municipal Securities Rulemaking Board (MSRB) used Amazon ES to modernize its analytics platform. The MSRB engaged Hitachi Vantara to develop a prototype data analytics platform that enabled natural-language and free-text search. Join us to learn how to leverage the cloud to generate tremendous and transformative value.
Speaker: Santosh Karla, Principal Cloud Architect, Hitachi Vantara
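MSRB's platform is not described in code; the following is a toy sketch of the inverted-index mechanism behind the free-text search that engines like Elasticsearch provide (the documents and the whitespace tokenizer are simplified assumptions):

```python
# Toy inverted index illustrating how free-text search engines such
# as Elasticsearch resolve a query: map each term to the documents
# containing it, then intersect the postings for all query terms.
# The documents and whitespace tokenizer are simplified assumptions.

from collections import defaultdict

def build_index(docs):
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    postings = [index.get(t, set()) for t in query.lower().split()]
    return set.intersection(*postings) if postings else set()

docs = {
    1: "municipal bond trade report",
    2: "bond market analytics",
    3: "trade settlement report",
}
index = build_index(docs)
print(sorted(search(index, "bond report")))  # [1]
```

Real engines add analyzers, relevance scoring and sharding on top, but term-to-postings lookup is still the core of why free-text search is fast.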
2019 technology innovations and investmentsMarko Paris
A look at the technology trends and venture capital investment in 2019. Part of a series that will introduce the audience to AI and related topics and progressively delve into coding AI applications in the Fintech and Digital-health spaces.
The Internet of Things (IoT) keeps evolving, and there’s a critical need for high-speed data processing, analytics, and reduced latency at the edge. Meeting the needs of these systems requires a distributed architecture that brings compute resources to both the edge and the cloud. A cloud-only model might not be applicable for time-sensitive operations or where network connectivity is poor. Also, connecting every device to the cloud and sending raw data over the internet can have privacy, security, and legal implications, especially for sensitive data. Learn how AWS Greengrass extends AWS to devices, so they can act locally on data and use the cloud for management, analytics, and durable storage.
Outage analysis: BGP Routing Errors Ripple Across the Internet (ThousandEyes)
How did Cloudflare, AWS, Discord and a metals manufacturer all end up in the same Internet pile-up? For nearly two hours on June 24, 2019, a BGP routing error from an ISP in Pennsylvania rippled outward across the global Internet, creating havoc and causing Cloudflare, AWS and related services like Discord and Nintendo Life to become unavailable to many users. The worst part is that it was entirely preventable.
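The mechanics behind such a leak come down to longest-prefix matching: routers always prefer the most specific announcement for a destination, so a leaked, more-specific prefix silently captures traffic away from the legitimate route. A minimal sketch of that selection rule (the prefixes and labels below are illustrative, not the actual routes from the incident):

```python
import ipaddress

# Toy routing table: (prefix, origin). BGP prefers the most-specific
# (longest) matching prefix, regardless of who announced it.
routes = [
    (ipaddress.ip_network("104.16.0.0/12"), "legitimate aggregate"),
    (ipaddress.ip_network("104.16.80.0/21"), "leaked more-specific"),
]

def best_route(dst: str):
    """Return the (prefix, origin) chosen by longest-prefix match."""
    dst_ip = ipaddress.ip_address(dst)
    matches = [(p, o) for p, o in routes if dst_ip in p]
    return max(matches, key=lambda m: m[0].prefixlen)

prefix, origin = best_route("104.16.85.1")
print(prefix, origin)  # the leaked /21 beats the legitimate /12
```

This is why route filtering at the leaking ISP's upstreams would have prevented the outage: once the more-specific prefix propagates, every router that accepts it follows it.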
Digital transformation of the manufacturing industry is underway in all aspects of the value chain, and the cloud is at the center. In this session, we discuss how global manufacturing companies are realizing the business value of the cloud. We look at how offerings and tools like AWS Internet-of-Things (IoT) services, high-performance computing (HPC), data lakes, and, ultimately, machine learning and artificial intelligence are impacting all process aspects from design and engineering to production and service operations.
Internet of Things and Machine Learning: the main use cases (Amazon Web Services)
In this session, we take a closer look at the main use cases of organizations and companies that have made the Internet of Things and Machine Learning central to their daily activities and processes. We will see how these companies achieved greater operational efficiency and productivity, analyzing each use case in terms of business challenges, success metrics, return on investment (ROI), resources, and skills.
The Future of Infrastructure: Key Trends to Consider (Capgemini)
Gunnar Menzel Vice President, Chief Architect - Capgemini
Which technologies have made the biggest impact, and which will affect us most in the future? Will technology advances slow down, stay the same, or speed up? Which trends and technologies should I consider? The digital agenda, shifting business models, and the need for speed at lower cost are shaping and forming new technologies, creating new opportunities at an ever-increasing pace. Gunnar will outline the infrastructure-related trends and technologies that are key today, as well as those that will prove significant going forward.
Improving manufacturing operations is everything - MFG401 - Mexico City AWS Summit (Amazon Web Services)
On the road to digitalization, manufacturing companies must increase their operations efficiency and accelerate their innovation rhythms. AWS provides tools and best practices for incorporating digital capabilities to achieve manufacturing business objectives without compromising security and governance and while reducing cost structures. It’s time to drive change and emerge better prepared to face current and future business environments. A strategy based on digital processes, tools, and data-driven decisions that enhances the value for customers is making a difference in the success of manufacturing companies. Learn about best practices, customer cases, and examples of companies that embarked on the digitalization journey and emerged as market leaders.
Workshop: Graph Application Architecture - GraphSummit Paris (Neo4j)
Workshop: Graph Application Architecture
Join this hands-on workshop led by Neo4j experts, who will guide you through contextual intelligence. Using a real dataset, we will build a graph solution step by step, from constructing the graph data model to running queries and visualizing the data. The approach will be applicable to many use cases and industries.
Workshop: Innovating with Generative AI and Knowledge Graphs (Neo4j)
Workshop: Innovating with Generative AI and Knowledge Graphs
Look beyond the AI hype and discover practical techniques for using AI responsibly across your organization's data. Explore how to use knowledge graphs to increase accuracy, transparency, and explainability in generative AI systems. You will leave with hands-on experience combining data relationships and LLMs to bring domain-specific context and improve reasoning.
Bring your laptop and we will walk you through setting up your own generative AI stack, with practical, coded examples to get you started in minutes.
Neo4j - Product Vision and Knowledge Graphs - GraphSummit Paris (Neo4j)
Dr. Jesús Barrasa, Head of Solutions Architecture for EMEA, Neo4j
Discover the latest Neo4j innovations, including the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building applications with interconnected data and generative AI.
SOPRA STERIA - GraphRAG: pushing past the limitations of RAG through the use of… (Neo4j)
Romain CAMPOURCY, Solution Architect, Sopra Steria
Patrick MEYER, Group AI Architect, Sopra Steria
Retrieval-Augmented Generation (RAG) answers user questions about a business domain with the help of large language models. The technique works well when the documentation is simple, but hits its limits as soon as the sources become complex. Drawing on a project we delivered, we will present GraphRAG, a new approach that uses a generated Neo4j database to improve document understanding and information synthesis. This method outperforms the RAG approach by providing more holistic and precise answers.
ADEO - Knowledge Graphs for e-commerce: challenges and opportunities… (Neo4j)
Charles Gouwy, Business Product Leader, Adeo Services (Groupe Leroy Merlin)
With their Knowledge Graph already integrated across all the shopping experiences of their e-commerce platform for more than three years, we will look at the new opportunities and challenges that their use of a graph database and the emergence of AI continue to open up for them.
GraphSummit Paris - The Art of the Possible with Graph Technology (Neo4j)
Sudhir Hasbe, Chief Product Officer, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transformation (Neo4j)
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 (Neo4j)
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
GraphAware - Transforming policing with graph-based intelligence analysis (Neo4j)
Petr Matuska, Sales & Sales Engineering Lead, GraphAware
Western Australia Police Force’s adoption of Neo4j and the GraphAware Hume graph analytics platform marks a significant advancement in data-driven policing. Facing the challenges of growing volumes of valuable data scattered in disconnected silos, the organisation successfully implemented Neo4j database and Hume, consolidating data from various sources into a dynamic knowledge graph. The result was a connected view of intelligence, making it easier for analysts to solve crime faster. The partnership between Neo4j and GraphAware in this project demonstrates the transformative impact of graph technology on law enforcement’s ability to leverage growing volumes of valuable data to prevent crime and protect communities.
GraphSummit Stockholm - Neo4j - Knowledge Graphs and Product Updates (Neo4j)
David Pond, Lead Product Manager, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
Shirley Bacso, Data Architect, Ingka Digital
“Linked Metadata by Design” represents the integration of the outcomes from human collaboration, starting from the design phase of data product development. This knowledge is captured in the Data Knowledge Graph. It not only enables data products to be robust and compliant but also well-understood and effectively utilized.
Your enemies use GenAI too - staying ahead of fraud with Neo4j (Neo4j)
Delivered by Michael Down at Gartner Data & Analytics Summit London 2024 - Your enemies use GenAI too: Staying ahead of fraud with Neo4j.
Fraudsters exploit the latest technologies like generative AI to stay undetected. Static applications can’t adapt quickly enough. Learn why you should build flexible fraud detection apps on Neo4j’s native graph database combined with advanced data science algorithms. Uncover complex fraud patterns in real-time and shut down schemes before they cause damage.
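One common graph pattern behind this kind of detection is finding rings of accounts linked by shared identifiers such as phone numbers or devices. A minimal sketch in plain Python with hypothetical data (a real deployment would express this as a graph query or a graph data science algorithm in Neo4j):

```python
from collections import deque, defaultdict

# Hypothetical edges: two accounts are linked when they share an
# identifier. Fraud rings show up as large connected components.
edges = [("acct1", "acct2"), ("acct2", "acct3"), ("acct4", "acct5")]

def fraud_rings(edges, min_size=3):
    """Return connected components of at least min_size accounts."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, rings = set(), []
    for node in graph:
        if node in seen:
            continue
        # Breadth-first search collects one connected component.
        component, queue = set(), deque([node])
        while queue:
            n = queue.popleft()
            if n in component:
                continue
            component.add(n)
            queue.extend(graph[n] - component)
        seen |= component
        if len(component) >= min_size:
            rings.append(component)
    return rings

print(fraud_rings(edges))  # [{'acct1', 'acct2', 'acct3'}]
```

The two-account pair is below the threshold and ignored; the three-account chain is flagged as a candidate ring for investigation.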
BT & Neo4j: How Knowledge Graphs Help BT Deliver Digital Transformation (Neo4j)
Delivered by Sreenath Gopalakrishna, Director of Software Engineering at BT, and Dr Jim Webber, Chief Scientist at Neo4j, at Gartner Data & Analytics Summit London 2024, this presentation examines how knowledge graphs and GenAI combine in real-world solutions.
BT Group has used the Neo4j graph database to enable impressive digital transformation programs over the last six years. By re-imagining their operational support systems to adopt self-serve and data-led principles, they have substantially reduced the number of applications and the complexity of their operations. The result has been a substantial reduction in risk and costs while improving time to value, innovation, and process automation. Future innovation plans include exploring uses of EKG + Generative AI.
Workshop: Enabling GenAI Breakthroughs with Knowledge Graphs - GraphSummit Milan (Neo4j)
Look beyond the hype and unlock practical techniques to responsibly activate intelligence across your organization’s data with GenAI. Explore how to use knowledge graphs to increase accuracy, transparency, and explainability within generative AI systems. You’ll depart with hands-on experience combining relationships and LLMs for increased domain-specific context and enhanced reasoning.
PHP Frameworks: I want to break free (IPC Berlin 2024) (Ralf Eggert)
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
Slides by me and Rik Marselis at the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We closed with a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
SAP Sapphire 2024 - ASUG301: Building better apps with SAP Fiori (Peter Spielvogel)
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deployment Firewall and DBOM (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or CI/CD process) involves many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to market, combined with traditionally slow and manual security checks, has created gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chains and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms, and is known for his dynamic presentations on CI/CD and on application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Accelerate your Kubernetes clusters with Varnish Caching (Thijs Feryn)
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
UiPath Test Automation using UiPath Test Suite series, part 4 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Attendees can expect to deepen their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Enhancing Performance with Globus and the Science DMZ (Globus)
ESnet has led the way in helping national facilities—and many other institutions in the research community—configure Science DMZs and troubleshoot network issues to maximize data transfer performance. In this talk we will present a summary of approaches and tips for getting the most out of your network infrastructure using Globus Connect Server.
Welcome to the first live UiPath Community Day Dubai! Join us for this unique occasion to meet our local and global UiPath Community and its leaders. You will get a full view of the MEA region's automation landscape and the AI-powered automation capabilities of UiPath. Hosted with our local partner Marc Ellis, you will enjoy a half-day packed with industry insights and networking with automation peers.
📕 Curious on our agenda? Wait no more!
10:00 Welcome note - UiPath Community in Dubai
Lovely Sinha, UiPath Community Chapter Leader, UiPath MVPx3, Hyper-automation Consultant, First Abu Dhabi Bank
10:20 A UiPath cross-region MEA overview
Ashraf El Zarka, VP and Managing Director MEA, UiPath
10:35: Customer Success Journey
Deepthi Deepak, Head of Intelligent Automation CoE, First Abu Dhabi Bank
11:15 The UiPath approach to GenAI with our three principles: improve accuracy, supercharge productivity, and automate more
Boris Krumrey, Global VP, Automation Innovation, UiPath
12:15 Discover how Marc Ellis leverages tech-driven solutions in recruitment and managed services
Brendan Lingam, Director of Sales and Business Development, Marc Ellis
GraphRAG is All You Need? LLMs & Knowledge Graphs (Guy Korland)
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
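The core GraphRAG idea is to retrieve a neighbourhood of connected facts rather than isolated text chunks, then hand those facts to the LLM as grounding context. A toy sketch of that retrieval step (hypothetical triples and helper function, not the Microsoft implementation):

```python
# Tiny in-memory knowledge graph as (subject, predicate, object) triples.
triples = [
    ("Neo4j", "IS_A", "graph database"),
    ("GraphRAG", "USES", "knowledge graph"),
    ("knowledge graph", "STORED_IN", "Neo4j"),
]

def graph_context(entity, hops=2):
    """Collect facts within `hops` relationships of an entity."""
    frontier, facts = {entity}, []
    for _ in range(hops):
        next_frontier = set()
        for s, p, o in triples:
            if s in frontier or o in frontier:
                fact = f"{s} {p} {o}"
                if fact not in facts:
                    facts.append(fact)
                next_frontier.update({s, o})
        frontier |= next_frontier
    return facts

# Render the retrieved neighbourhood as grounding context for a prompt.
prompt = "Context:\n" + "\n".join(graph_context("GraphRAG")) + "\nQuestion: ..."
print(prompt)
```

Because the context follows relationships outward from the entities in the question, the LLM receives connected, domain-specific facts instead of whatever text chunks happen to be lexically similar.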
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl (DanBrown980551)
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
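To give a flavour of what a power-flow computation involves, here is a toy DC power flow on a three-bus network in plain Python. This is an illustrative sketch of the underlying math only, not the PowSyBl or pypowsybl API:

```python
# Toy DC power flow: solve B' * theta = P for non-slack bus angles,
# then compute line flows as flow = b * (theta_from - theta_to).
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 10.0)]  # (from, to, susceptance)
P = {1: -1.0, 2: 0.5}  # injections in p.u.; bus 0 is the slack bus

# Build the reduced susceptance matrix over the non-slack buses 1 and 2.
B = {(i, j): 0.0 for i in (1, 2) for j in (1, 2)}
for f, t, b in lines:
    for i in (f, t):
        if i in (1, 2):
            B[(i, i)] += b
    if f in (1, 2) and t in (1, 2):
        B[(f, t)] -= b
        B[(t, f)] -= b

# Solve the 2x2 linear system with Cramer's rule.
det = B[(1, 1)] * B[(2, 2)] - B[(1, 2)] * B[(2, 1)]
theta = {0: 0.0,
         1: (P[1] * B[(2, 2)] - B[(1, 2)] * P[2]) / det,
         2: (B[(1, 1)] * P[2] - P[1] * B[(2, 1)]) / det}

flows = {(f, t): b * (theta[f] - theta[t]) for f, t, b in lines}
print(flows)  # half the load at bus 1 arrives over line (0, 1)
```

Real tools like PowSyBl handle full AC power flow, network topology, and contingencies, but the linear system above is the simplest useful approximation of the same problem.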
Generative AI Deep Dive: Advancing from Proof of Concept to Production (Aggregage)
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
A tale of scale & speed: How the US Navy is enabling software delivery from l… (sonjaschweigert1)
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATOs (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
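An automated policy check of the kind described can be sketched as a simple gate over a scan report: the pipeline fails when the image's vulnerabilities or package licenses violate policy. The field names and policy shape below are illustrative assumptions, not Anchore's actual schema:

```python
# Hypothetical pipeline gate over a container image scan report.
policy = {"max_severity": "high", "blocked_licenses": {"AGPL-3.0"}}
SEVERITY_RANK = {"low": 0, "medium": 1, "high": 2, "critical": 3}

def evaluate(report):
    """Return a list of policy violations; empty means the gate passes."""
    violations = []
    for vuln in report["vulnerabilities"]:
        if SEVERITY_RANK[vuln["severity"]] > SEVERITY_RANK[policy["max_severity"]]:
            violations.append(f'{vuln["id"]}: severity {vuln["severity"]}')
    for pkg in report["packages"]:
        if pkg["license"] in policy["blocked_licenses"]:
            violations.append(f'{pkg["name"]}: blocked license {pkg["license"]}')
    return violations

report = {
    "vulnerabilities": [{"id": "CVE-2024-0001", "severity": "critical"}],
    "packages": [{"name": "libfoo", "license": "MIT"}],
}
print(evaluate(report))  # ['CVE-2024-0001: severity critical']
```

Running a check like this on every build turns security review from a late manual step into a continuous, automated gate, which is the core of the approach the webinar describes.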
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Climate Impact of Software Testing at Nordic Testing Days (Kari Kakkonen)
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.