Draper Accelerator Talk Slides - covering the convergence of AI and Blockchain and how it addresses challenges for IoT, AI@Edge, data ethics, and user data monetization.
Slides from Talk @ Intel IoT DevFest IV
With Facebook and Google both recently shifting direction towards a "Future is Private" world, learn how you too can train and deploy your AI models in a privacy-preserving way, with Decentralized AI and a combination of AI and Blockchain. These techniques will become even more widespread as we move into a world where users own their own data and companies start using “ethically sourced data”, moving towards a path of Ethical AI for the IoT space.
In this session, you will learn:
- Use cases for Decentralized AI, with combined benefits of AI + Blockchain for IoT applications
- Federated learning & related privacy-preserving AI model training techniques for IoT applications
- How to build Ethical AI solutions for IoT using these techniques
Building AI with Security and Privacy in mind - geetachauhan
The document discusses building AI with security and privacy in mind. It covers privacy challenges in AI, such as the tension between data privacy and model training. It then discusses various privacy-preserving machine learning techniques, including homomorphic encryption, differential privacy, secure multi-party computation, on-device computation, and federated learning, with examples of how each technique works. It concludes by discussing tools and techniques for starting a privacy journey in AI and provides resources to learn more.
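As a flavor of the techniques summarized above, here is a minimal sketch of federated averaging (FedAvg) in plain Python. The two clients, their toy datasets, and the linear model are invented for illustration and are not taken from the deck.

```python
# Minimal federated averaging (FedAvg) sketch: each client takes a
# gradient step on its private data, and only model weights -- never
# the raw data -- are sent to the server for aggregation.

def local_update(weights, data, lr=0.1):
    """One gradient step of linear regression y ~ w0 + w1*x on local data."""
    w0, w1 = weights
    g0 = g1 = 0.0
    for x, y in data:
        err = (w0 + w1 * x) - y
        g0 += err
        g1 += err * x
    n = len(data)
    return (w0 - lr * g0 / n, w1 - lr * g1 / n)

def fed_avg(client_weights, client_sizes):
    """Server side: average client weights, weighted by local sample count."""
    total = sum(client_sizes)
    w0 = sum(w[0] * n for w, n in zip(client_weights, client_sizes)) / total
    w1 = sum(w[1] * n for w, n in zip(client_weights, client_sizes)) / total
    return (w0, w1)

# Two clients hold disjoint private datasets, both drawn from y = 2x.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
global_w = (0.0, 0.0)
for _ in range(1000):  # communication rounds
    local = [local_update(global_w, d) for d in clients]
    global_w = fed_avg(local, [len(d) for d in clients])
# global_w now approximates (0, 2) without any client sharing raw data
```

Real deployments add secure aggregation, client sampling, and multiple local epochs per round; the weighted average itself is the core idea.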
Scaling AI in production using PyTorch - geetachauhan
Slides from my talk at MLOps World '21
Deploying AI models in production and scaling ML services is still a big challenge. In this talk we cover how to deploy your AI models, best practices for common deployment scenarios, and techniques for performance optimization and scaling. Join us to learn how you can jumpstart the journey of taking your PyTorch models from research to production.
Delivering Security Insights with Data Analytics and Visualization - Raffael Marty
It's an interesting exercise to look back to the year 2000 to see how we approached cyber security. We just started to realize that data might be a useful currency, but for the most part, security pursued preventative avenues, such as firewalls, intrusion prevention systems, and anti-virus. With the advent of log management and security incident and event management (SIEM) solutions we started to gather gigabytes of sensor data and correlate data from different sensors to improve on their weaknesses and accelerate their strengths. But fundamentally, such solutions didn't scale that well and struggled to deliver real security insight.
Today, cybersecurity wouldn't work anymore without large scale data analytics and machine learning approaches, especially in the realm of malware classification and threat intelligence. Nonetheless, we are still just scratching the surface and learning where the real challenges are in data analytics for security.
This talk will go on a journey of big data in cybersecurity, exploring where big data has been and where it must go to make a true difference. We will look at the potential of data mining, machine learning, and artificial intelligence, as well as the boundaries of these approaches. We will also look at both the shortcomings and potential of data visualization and the human computer interface. It is critical that today's systems take into account the human expert and, most importantly, provide the right data.
EclipseCon France 2015 - Science Track - Boris Adryan
Software is increasingly playing a big part in scientific research, but in most cases the growth is organic. The lifetime of research software is often as short as the duration of a postdoctoral contract: once the researcher moves on, custom-written niche code is frequently poorly documented, components are not reusable, and the overall development effort is likely lost.
This is a case study of the evolution of research software in the field of genomics within my research group at the Department of Genetics at Cambridge University. As our research questions changed over the past decade, we moved from Perl code and regular expressions to R and statistical analysis, and from there to agent-based simulations in Java. I will discuss not only the languages, tools and processes used and how they have evolved over the years, but also the factors that influence the nature of this growth, such as funding, and how 'open source' as a default has changed our development work. We also take a look into the future to predict how our software usage will grow.
Also, in presenting the problems and discussing possible solutions, this talk will look at the role institutions play in helping address these issues. In particular, the Software Sustainability Institute (SSI, http://software.ac.uk/) works in the UK to promote the development, maintenance and (re)use of research software.
The Eclipse Foundation, with the Science Working Group, works to facilitate software sharing and reuse. How can organisations like the SSI and Eclipse align their strategies and activities for maximum effect?
Mehr und schneller ist nicht automatisch besser ("More and faster is not automatically better") - data2day, 06.10.16 - Boris Adryan
The law of large numbers always holds: statistical certainty increases with the number of data points, provided the data are collected fairly. Unfortunately, collecting data usually costs money, so especially in the sensor domain (keyword: Internet of Things) one is forced to make sensible compromises. In this talk I summarize the findings of a project in which data analytics showed that, going forward, only 60% of the deployed sensors are really needed. Nor does it always have to be real-time analysis: with a data strategy tailored to the business case, unnecessary expenses can be avoided.
Graph enhancements to Artificial Intelligence and Machine Learning are changing the landscape of intelligent applications. Beyond improving accuracy and modeling speed, graph technologies make building AI solutions more accessible. Join us to hear about four areas at the forefront of graph-enhanced AI and ML, and find out which techniques are commonly used today and which hold the potential for disrupting industries. We'll provide examples and specifically look at how:
- Graphs provide better accuracy through connected feature extraction
- Graphs provide better performance through contextual model optimization
- Graphs provide context through knowledge graphs
- Graphs add explainability to neural networks
Speakers: Jake Graham, Alicia Frame
My talk about data and information models for IoT, how ontologies can establish the relationship between IoT devices, and how Eclipse Vorto could accommodate ontological information. Briefly features Eclipse Smarthome.
Just because you can doesn't mean that you should - thingmonk 2016 - Boris Adryan
Big data! Fast data! Real-time analytics! These are buzzwords commonly associated with platform offerings around IoT.
Although the law of large numbers always applies, just because you can deploy more sensors doesn't automatically mean that you should. After all, they cost money and bandwidth, and can be a pain to maintain. Using the example of the Westminster Parking Trial, I'd like to show how analytics on preliminary survey data could have reduced the number of deployed sensors significantly.
The same logic applies to fast and real-time analytics. While these are advertised as killer features, many people new to IoT and analytics are not even aware that they might get away with batch processing. Using the example of flying a drone, I'd like to discuss the use cases for which I'd apply edge processing (on the drone), stream or micro-batch analytics (as data arrives at the platform), or work on batched data (stored in a database).
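The diminishing returns behind "more sensors isn't automatically better" follow from the law of large numbers: the standard error of an estimate shrinks only with the square root of the sample size. A quick illustration with an assumed per-sensor noise level (the numbers below are illustrative, not from the Westminster trial):

```python
import math

def standard_error(sigma, n):
    """Standard error of a mean estimated from n independent sensor readings."""
    return sigma / math.sqrt(n)

sigma = 10.0  # assumed per-sensor measurement noise (illustrative)
for n in (100, 200, 400, 800):
    print(f"{n:4d} sensors -> standard error {standard_error(sigma, n):.2f}")
# Doubling the sensor count buys only a ~29% reduction in uncertainty,
# while roughly doubling hardware and maintenance cost.
```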
This document discusses how security teams are overwhelmed by large volumes of data from security alerts and indicators. It proposes that graph algorithms can help identify related alerts and events that should be investigated together, such as those targeting the same users or part of the same attack. The document provides examples of how community detection, centrality analysis, and other graph algorithms run on preprocessed security data can help prioritize work and generate new threat indicators.
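The alert-grouping idea above can be sketched with a plain union-find over a graph of alerts: any two alerts that share an indicator (user, IP, file hash) land in the same connected component. The alert IDs and indicator values below are made up for illustration.

```python
# Group security alerts that share an indicator into connected
# components, so analysts can triage related alerts together.
from collections import defaultdict

alerts = {
    "a1": {"user:bob", "ip:10.0.0.5"},
    "a2": {"ip:10.0.0.5", "hash:deadbeef"},
    "a3": {"user:alice"},
    "a4": {"hash:deadbeef", "user:bob"},
}

# Union-find over alerts; two alerts are joined if they share any indicator.
parent = {a: a for a in alerts}

def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def union(x, y):
    parent[find(x)] = find(y)

by_indicator = defaultdict(list)
for alert, indicators in alerts.items():
    for ind in indicators:
        by_indicator[ind].append(alert)
for group in by_indicator.values():
    for other in group[1:]:
        union(group[0], other)

components = defaultdict(set)
for a in alerts:
    components[find(a)].add(a)
campaigns = sorted(sorted(c) for c in components.values())
print(campaigns)  # [['a1', 'a2', 'a4'], ['a3']]
```

Production systems would run richer algorithms (community detection, centrality) on much larger graphs, but the grouping principle is the same.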
Grounding Conversational AI in a Knowledge Base - Vaticle
How does a conversational assistant understand questions like “What is the most expensive transaction on food I have made?”, and where does it get the data to answer these kinds of questions correctly? Or imagine a conversation with a bot that helps you manage your bank accounts. A person might ask questions like “On which of those accounts do I have more money?” or “What is the IBAN of the second account you just mentioned?”. How do we give our bot access to the relevant domain knowledge?
In this talk I will explain how I solved this by integrating a conversational assistant, built with Rasa, with a knowledge graph, built with Grakn. Together, these open source libraries help me understand what my bot’s users are talking about.
Industry of Things World - Berlin, 19-09-16 - Boris Adryan
Dr. Boris Adryan gave a talk on the impact of IoT analytics on development budgets. He discussed that IoT data problems are often not as complex as perceived and do not necessarily require "big data" solutions or specialists. Basic data storage and processing can often be done cost-effectively using standard tools. True challenges lie in extracting useful insights, which may require specialized machine learning approaches. Not all analytics need to be real-time. The appropriate solution depends on the use case and desired insights.
Big Brother vs. Big Data: Privacy-Preserving Threat Analytics at Scale - Chester Parrott
The cyber-security threat landscape is a faceless whirlwind of deliberate and persistent attempts to compromise individual and organizational data toward nefarious ends. We might not know the direction of the next threat, but we certainly know it is inevitable. All organizations face the massive problem of evolving threats; however, due to the current shortage of cyber-security and data science professionals, this problem can be prohibitively expensive to solve in isolation. In the absence of shared intelligence, best efforts can lead to ineffective ad-hoc security systems which are only token in nature. The dichotomy is this: hackers and cyber-criminals can freely share intelligence for their purposes; how can we leverage and share intelligence about these common threats without compromising the data integrity of each organization? How do we detect advanced persistent threats without violating the privacy rights of individual users? In this talk, we discuss the state of the science in privacy-preserving threat analytics scaled to massive data sets and propose a solution to this dichotomy.
Data protection is at the center of a mature organizational information security strategy. Encryption plays an important role in that strategy to effectively protect data, even after other lines of defense have been compromised.
Unfortunately, there are many factors complicating the when, where and how of successfully using encryption technologies.
Getting Productive: My Journey with Grakn and Graql - Vaticle
Over many weeks and months, I have been learning about Grakn Labs and its two core components, Grakn and Graql.
As a polyglot developer, specialising in Java/JVM tech, I have explored GraalVM, an enhanced JVM with runtime performance being its primary goal.
My first instinct was to run Grakn on GraalVM and then run GraknLabs’ benchmark suite to measure runtime performance against both the traditional JVM and GraalVM.
We will see how we can use Graql just as we would speak to another human. We will explore English-to-Graql and vice versa, using natural language to communicate with this novel graph engine called Grakn (this all started from a meetup and a few GitHub issues).
The document provides an overview of various digital technologies including AI, IoT, cloud computing, data analytics, and more. It discusses the "apples" or fundamental technologies in these areas like AR, VR, AI, IoT, and cloud computing. It then outlines several learning paths one could take to understand these technologies, beginning with foundations in areas like probability, statistics, computer science, and communications. It provides recommendations for books and courses to learn about each technology from roots to more advanced concepts. Finally, it discusses bringing all the pieces together using design thinking.
Designing Cross-Domain Semantic Web of Things ApplicationsAmélie Gyrard
The document discusses designing cross-domain semantic web of things applications. It introduces challenges including how to interpret IoT data, combine data from different domains, and reuse domain knowledge. The proposed M3 framework addresses these challenges through components like a SWoT generator template, M3 language and ontology, sensor-based linked open rules, and linked open vocabularies for IoT. Evaluations show the framework helps developers build semantic applications and interprets data efficiently while reusing interoperable domain knowledge. The framework has potential applications in domains like health, tourism and transportation.
Graph visualization options and latest developmentsLinkurious
This document discusses graph visualization tools and approaches. It describes two main approaches to graph visualization - the global approach which provides an overview first before allowing zooming and filtering for details, and the centered approach which allows searching first before showing context and expanding on demand. Several graph visualization tools are described and it is noted which approach they best support. Linkurious is presented as a tool that can visualize any graph database and supports the centered approach through search, context, and expansion.
This chapter is devoted to log mining or log knowledge discovery - a different type of log analysis, which does not rely on knowing what to look for. This takes the “high art” of log analysis to the next level by breaking the dependence on the lists of strings or patterns to look for in the logs.
This document provides an overview of Think Big Analytics, an analytics consulting firm. It discusses their services portfolio including data engineering, data science, analytics operations and managed services. It also highlights their global delivery model and successful projects with over 100 clients. The document then discusses their approach to artificial intelligence and deep learning, including applications across industries like banking, connected cars, and automated check processing. It emphasizes the need for a phased implementation approach to AI and challenges around technology, data, and deployment.
In this presentation Raffael Marty, VP of Research and Intelligence, Forcepoint X-Labs, explores the topic of Artificial Intelligence (AI) in cyber security. What is AI, and how do we get to real intelligence in a cyber context? Raffael outlines some of the dangers of the way we are using algorithms (AI, Machine Learning) today and what that leads to. We then explore how we can add real intelligence through expert knowledge to the problem of finding attackers and anomalies in our applications and networks.
Presented at AI 4 Cybersecurity in NYC on April 30, 2019
AI-SDV 2021: Francisco Webber - Efficiency is the New Precision - Dr. Haxel Consult
The global data sphere, consisting of machine data and human data, is growing exponentially reaching the order of zettabytes. In comparison, the processing power of computers has been stagnating for many years. Artificial Intelligence – a newer variant of Machine Learning – bypasses the need to understand a system when modelling it; however, this convenience comes with extremely high energy consumption.
The complexity of language makes statistical Natural Language Understanding (NLU) models particularly energy hungry. Since most of the zettabyte data sphere consists of human data, such as texts or social networks, we face four major obstacles:
1. Findability of Information – when truth is hard to find, fake news rule
2. Von Neumann Gap – when processors cannot process faster, then we need more of them (energy)
3. Stuck in the Average – when statistical models generate a bias toward the majority, innovation has a hard time
4. Privacy – if user profiles are created “passively” on the server side instead of “actively” on the client side, we lose control
The current approach to overcoming these limitations is to use larger and larger data sets on more and more processing nodes for training. AI algorithms should instead be optimized for efficiency rather than precision. By that measure, statistical modelling should be disqualified as a brute-force approach for language applications. When replacing statistical modelling and arithmetic, set theory and geometry seem to be a much better choice, as they allow the direct processing of words instead of their occurrence counts, which is exactly what the human brain does with language – using only 7 watts!
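As a much simplified illustration of the set-theoretic direction the speaker advocates (not the speaker's actual semantic-folding method), texts can be compared directly as sets of words, with no occurrence counting or heavy arithmetic at all:

```python
def jaccard(a, b):
    """Set-theoretic similarity: overlap of word sets, no occurrence counts."""
    wa, wb = set(a.lower().split()), set(wb_text.lower().split()) if False else set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

s = jaccard("the cat sat on the mat", "the cat lay on the mat")
# shared words {the, cat, on, mat} out of 6 distinct words -> 2/3
```

Set intersections and unions of this kind can be computed with cheap bitwise operations over sparse binary vectors, which is the efficiency argument in a nutshell.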
1) LOV4IoT is an extension of the Linked Open Vocabularies (LOV) catalogue that references over 300 ontology-based Internet of Things projects across numerous domains to encourage reuse of existing domain knowledge.
2) LOV4IoT provides an HTML user interface and web services to automatically compute statistics about the projects in its dataset, such as the number per domain.
3) The goal of LOV4IoT is to extract reusable domain knowledge from the referenced ontologies and datasets, such as a dictionary to unify IoT data and rules to interpret sensor data, to help developers design semantic-based IoT applications.
Neo4j GraphTalk Helsinki - Introduction and Graph Use Cases - Neo4j
This document provides an introduction to graphs and Neo4j. It discusses that Neo4j is a native graph database that allows organizations to leverage connections in data in real-time to create value. It then provides information on Neo4j as a company and as a product, including that it is the world's leading graph database. The document goes on to define what graphs are from a data structure perspective and provides examples of famous graphs like social networks. It discusses why graph databases are useful compared to relational databases for representing complex, connected data and provides examples of use cases for Neo4j like recommendations, fraud detection, and network analysis.
This document discusses applying privacy-preserving data mining techniques to code profiling data. Code profiling generates metrics about software attributes and performance. The author applies encryption to profiling data from 140 Java programs to preserve privacy. K-means clustering and k-NN classification are performed on both the actual and the encrypted data, showing similar results while preserving privacy. Correlation analysis identifies weakly correlated attributes that are removed to improve clustering accuracy, though this decreases classifier accuracy. The paper concludes that privacy-preserving data mining of code profiling data is an emerging area that could benefit from additional encryption and classification techniques.
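The key property behind the paper's observation can be seen in a toy sketch: a distance-preserving (orthogonal) transform, standing in here for the paper's actual encryption scheme, leaves every pairwise Euclidean distance, and hence any distance-based clustering or k-NN decision, unchanged. The profiling vectors below are invented.

```python
import math

def rotate(point, theta):
    """Distance-preserving (orthogonal) transform of a 2-D feature vector."""
    x, y = point
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

profiles = [(1.0, 2.0), (1.2, 1.9), (8.0, 8.5)]  # toy code-profiling vectors
masked = [rotate(p, 1.234) for p in profiles]

# Every pairwise distance survives the transform, so k-means / k-NN
# decisions based on those distances are identical on the masked data.
for i in range(len(profiles)):
    for j in range(i + 1, len(profiles)):
        assert abs(dist(profiles[i], profiles[j])
                   - dist(masked[i], masked[j])) < 1e-9
```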
- Davide Mottin is an assistant professor in the Department of Computer Science at Aarhus University who researches graph mining.
- His talk discusses unveiling knowledge in knowledge graphs through personalized summarization techniques. Knowledge graphs contain entities and relationships between them.
- He describes an approach for generating personalized summaries of a knowledge graph based on a user's query history. The algorithm aims to find a subgraph that maximizes the probability of answering future queries, subject to a size limit.
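The size-constrained selection described above can be sketched with a simple greedy heuristic; this is a generic illustration, not the speaker's algorithm, and the entities, edges and query log below are invented.

```python
# Greedy sketch of personalized knowledge-graph summarization: keep the
# edges most likely to answer future queries, under a size budget.
edges = [
    ("paris", "capital_of", "france"),
    ("paris", "population", "2.1M"),
    ("france", "currency", "euro"),
    ("berlin", "capital_of", "germany"),
]
query_log = [("paris", "capital_of"), ("paris", "capital_of"),
             ("france", "currency")]

def utility(edge, log):
    """Estimate P(edge answers a future query) from the user's query history."""
    subj, pred, _ = edge
    return sum(1 for q in log if q == (subj, pred)) / len(log)

def summarize(edges, log, budget):
    """Keep the budget highest-utility edges as the personalized summary."""
    ranked = sorted(edges, key=lambda e: utility(e, log), reverse=True)
    return ranked[:budget]

summary = summarize(edges, query_log, budget=2)
# keeps the paris->france and france->euro edges the user actually queries
```

The actual approach optimizes the subgraph jointly rather than edge by edge, but the objective (maximize expected query coverage under a size limit) is the same shape.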
Decentralized AI: Convergence of AI + Blockchain - geetachauhan
Santa Clara IoT Expo talk slides - covering the convergence of AI and Blockchain and how it addresses challenges for IoT, AI@Edge, data ethics, and user data monetization
Decentralized AI: Convergence of Blockchain + AI - geetachauhan
This document discusses the convergence of blockchain and AI through decentralized AI approaches. It outlines challenges with centralized AI models regarding privacy, influence, economics and transparency. Proposed decentralized solutions include federated learning, blockchain, homomorphic encryption, and data marketplaces. Blockchain provides an open, trustless network to replace centralized authorities and enable applications like data exchanges, AI marketplaces and distributed machine learning across devices. Overall, the goal is to democratize AI and data through user ownership and control.
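One concrete way "distributed machine learning across devices" can avoid a trusted central aggregator is secure aggregation with pairwise additive masks: clients add random masks that cancel in the sum, so the server learns only the aggregate. The sketch below is a minimal illustration, not any particular production protocol, and the gradient values are invented.

```python
import random

def masked_updates(values, seed=42):
    """Each pair of clients (i, j) shares a random mask; client i adds it
    and client j subtracts it, so the masks cancel in the sum and only
    the aggregate of the private values is recoverable."""
    rng = random.Random(seed)  # stands in for pairwise key agreement
    n = len(values)
    masked = list(values)
    for i in range(n):
        for j in range(i + 1, n):
            m = rng.uniform(-100, 100)  # pairwise shared mask
            masked[i] += m
            masked[j] -= m
    return masked

client_grads = [0.5, -1.2, 2.0]   # private local model updates
masked = masked_updates(client_grads)
aggregate = sum(masked)           # equals sum(client_grads): masks cancel
```

Individually, each masked value looks like noise to the server; only their sum is meaningful, which is exactly what a federated-averaging server needs.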
Just because you can doesn't mean that you should - thingmonk 2016Boris Adryan
Big data! Fast data! Real-time analytics! These are buzzwords commonly associated with platform offerings around IoT.
Although the Law of large numbers always applies, just because you can deploy more sensors doesn't automatically mean that you should. After all, they cost money, bandwidth, and can be a pain to maintain. On the example of the Westminster Parking Trial, I'd like to show how analytics on preliminary survey data could have reduced the number of deployed sensors significantly.
A similar logic goes for fast and real-time analytics. While being advertised as killer features, many people new to IoT and analytics are not even aware that they might get away with batch processing. On the example of flying a drone, I'd like to discuss for which use cases I'd apply edge processing (on the drone), stream or micro-batch analytics (when data arrives at the platform) or work on batched data (stored in a database).
This document discusses how security teams are overwhelmed by large volumes of data from security alerts and indicators. It proposes that graph algorithms can help identify related alerts and events that should be investigated together, such as those targeting the same users or part of the same attack. The document provides examples of how community detection, centrality analysis, and other graph algorithms run on preprocessed security data can help prioritize work and generate new threat indicators.
Grounding Conversational AI in a Knowledge BaseVaticle
How does a conversational assistant understand questions like “What is the most expensive transaction on food I have made?” and where does it get the data to answer this kind of questions correctly? Or imagine a conversation with a bot that helps you manage your bank accounts. A person might ask questions like “On which of those accounts do I have more money?” or “What is the IBAN of the second account you just mentioned?”. How do we give our bot access to the relevant domain knowledge?
In this talk I will explain how I solved this by integrating a conversational assistant, built with Rasa, with a knowledge graph, built with Grakn. Together, these open source libraries help me understand what my bot’s users are talking about.
Industry of Things World - Berlin 19-09-16Boris Adryan
Dr. Boris Adryan gave a talk on the impact of IoT analytics on development budgets. He discussed that IoT data problems are often not as complex as perceived and do not necessarily require "big data" solutions or specialists. Basic data storage and processing can often be done cost-effectively using standard tools. True challenges lie in extracting useful insights, which may require specialized machine learning approaches. Not all analytics need to be real-time. The appropriate solution depends on the use case and desired insights.
Big Brother Vs. Big Data: Privacy-Preserving Threat Analytics at ScaleChester Parrott
The cyber-security threat landscape is a faceless whirlwind of deliberate and persistent attempts to compromise individual and organizational data toward nefarious ends. We might not know the direction of the next threat, but we certainly know it is inevitable. All organizations are faced with the massive problem of evolving threat; however, due to the current drought of cyber-security and data science professionals this problem can be prohibitively expensive to solve in isolation. In the absence of shared intelligence, best efforts can lead to ineffective ad-hoc security systems which are only token in nature. The dichotomy is as such: hackers and cyber-criminals can share intelligence for their purposes; how can we leverage and share intelligence about these common threats without compromising the data integrity of each organization? How do we detect advanced persistent threats without violating the privacy rights of individual users? In this talk, we discuss the state of the science in privacy-preserving threat analytics scaled to massive data sets and propose a solution to this dichotomy.
Data protection is at the center of a mature organizational information security strategy. Encryption plays an important role in that strategy to effectively protect data, even after other lines of defense have been compromised.
Unfortunately, many factors complicate the when, where, and how of successfully using encryption technologies.
Getting Productive my Journey with Grakn and GraqlVaticle
Over many weeks and months, I have been learning about Grakn Labs and its two core components, Grakn and Graql.
As a polyglot developer, specialising in Java/JVM tech, I have explored GraalVM, an enhanced JVM with runtime performance being its primary goal.
My first instinct was to run Grakn on GraalVM and then run GraknLabs’ benchmark suite to measure runtime performance against both the traditional JVM and GraalVM.
We will see how we can utilize and speak Graql just as we would speak to another human. We will explore English-to-Graql and vice versa, using natural language to communicate with this novel graph engine called Grakn (this all started from a meetup and a few GitHub issues).
The document provides an overview of various digital technologies including AI, IoT, cloud computing, data analytics, and more. It discusses the "apples" or fundamental technologies in these areas like AR, VR, AI, IoT, and cloud computing. It then outlines several learning paths one could take to understand these technologies, beginning with foundations in areas like probability, statistics, computer science, and communications. It provides recommendations for books and courses to learn about each technology from roots to more advanced concepts. Finally, it discusses bringing all the pieces together using design thinking.
Designing Cross-Domain Semantic Web of Things ApplicationsAmélie Gyrard
The document discusses designing cross-domain semantic web of things applications. It introduces challenges including how to interpret IoT data, combine data from different domains, and reuse domain knowledge. The proposed M3 framework addresses these challenges through components like a SWoT generator template, M3 language and ontology, sensor-based linked open rules, and linked open vocabularies for IoT. Evaluations show the framework helps developers build semantic applications and interprets data efficiently while reusing interoperable domain knowledge. The framework has potential applications in domains like health, tourism and transportation.
Graph visualization options and latest developmentsLinkurious
This document discusses graph visualization tools and approaches. It describes two main approaches to graph visualization - the global approach which provides an overview first before allowing zooming and filtering for details, and the centered approach which allows searching first before showing context and expanding on demand. Several graph visualization tools are described and it is noted which approach they best support. Linkurious is presented as a tool that can visualize any graph database and supports the centered approach through search, context, and expansion.
This chapter is devoted to log mining or log knowledge discovery - a different type of log analysis, which does not rely on knowing what to look for. This takes the “high art” of log analysis to the next level by breaking the dependence on the lists of strings or patterns to look for in the logs.
This document provides an overview of Think Big Analytics, an analytics consulting firm. It discusses their services portfolio including data engineering, data science, analytics operations and managed services. It also highlights their global delivery model and successful projects with over 100 clients. The document then discusses their approach to artificial intelligence and deep learning, including applications across industries like banking, connected cars, and automated check processing. It emphasizes the need for a phased implementation approach to AI and challenges around technology, data, and deployment.
In this presentation Raffael Marty, VP of Research and Intelligence, Forcepoint X-Labs, explores the topic of Artificial Intelligence (AI) in cyber security. What is AI and how do we get to real intelligence in a cyber context? Raffael outlines some of the dangers of the way we are using algorithms (AI, Machine Learning) today and what that leads to. We then explore how we can add real intelligence through expert knowledge to the problem of finding attackers and anomalies in our applications and networks.
Presented at AI 4 Cybersecurity in NYC on April 30, 2019
AI-SDV 2021: Francisco Webber - Efficiency is the New PrecisionDr. Haxel Consult
The global data sphere, consisting of machine data and human data, is growing exponentially, reaching the order of zettabytes. In comparison, the processing power of computers has been stagnating for many years. Machine Learning – the newer, data-driven variant of Artificial Intelligence – bypasses the need to understand a system when modelling it; however, this convenience comes with extremely high energy consumption.
The complexity of language makes statistical Natural Language Understanding (NLU) models particularly energy hungry. Since most of the zettabyte data sphere consists of human data, such as texts or social networks, we face four major obstacles:
1. Findability of Information – when truth is hard to find, fake news rule
2. Von Neumann Gap – when processors cannot process faster, then we need more of them (energy)
3. Stuck in the Average – when statistical models generate a bias toward the majority, innovation has a hard time
4. Privacy – if user profiles are created “passively” on the server side instead of “actively” on the client side, we lose control
The current approach to overcoming these limitations is to train on larger and larger data sets across more and more processing nodes. AI algorithms should instead be optimized for efficiency rather than precision; by that measure, statistical modelling is disqualified as a brute-force approach for language applications. As a replacement for statistical modelling and arithmetic, set theory and geometry seem a much better choice, as they allow the direct processing of words instead of their occurrence counts – which is exactly what the human brain does with language, using only 7 Watts!
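The set-theoretic alternative the abstract advocates can be illustrated with plain Python sets: represent each word by the set of contexts it occurs in, and compare words by overlap rather than by arithmetic on occurrence counts. The words and context sets below are invented for illustration and are not from the talk:

```python
# Each word is represented by the set of contexts it appears in;
# similarity is pure set overlap (Jaccard), no occurrence counting.
contexts = {
    "dog": {"pet", "bark", "leash", "animal", "walk"},
    "cat": {"pet", "purr", "animal", "whiskers"},
    "car": {"engine", "road", "drive", "wheel"},
}

def jaccard(a, b):
    """Set-theoretic similarity: shared contexts over all contexts."""
    return len(a & b) / len(a | b)

sim_dog_cat = jaccard(contexts["dog"], contexts["cat"])  # share contexts
sim_dog_car = jaccard(contexts["dog"], contexts["car"])  # share none
```

Related words end up with larger overlaps, and the comparison is a cheap set intersection rather than a large matrix multiplication.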
1) LOV4IoT is an extension of the Linked Open Vocabularies (LOV) catalogue that references over 300 ontology-based Internet of Things projects across numerous domains to encourage reuse of existing domain knowledge.
2) LOV4IoT provides an HTML user interface and web services to automatically compute statistics about the projects in its dataset, such as the number per domain.
3) The goal of LOV4IoT is to extract reusable domain knowledge from the referenced ontologies and datasets, such as a dictionary to unify IoT data and rules to interpret sensor data, to help developers design semantic-based IoT applications.
Neo4j GraphTalk Helsinki - Introduction and Graph Use CasesNeo4j
This document provides an introduction to graphs and Neo4j. It discusses that Neo4j is a native graph database that allows organizations to leverage connections in data in real-time to create value. It then provides information on Neo4j as a company and as a product, including that it is the world's leading graph database. The document goes on to define what graphs are from a data structure perspective and provides examples of famous graphs like social networks. It discusses why graph databases are useful compared to relational databases for representing complex, connected data and provides examples of use cases for Neo4j like recommendations, fraud detection, and network analysis.
This document discusses applying privacy preserving data mining techniques to code profiling data. Code profiling generates metrics about software attributes and performance. The author applies encryption to code profiling data from 140 Java codes to preserve privacy. K-means clustering and k-NN classification are performed on the actual and encrypted data, showing similar results while preserving privacy. Correlation analysis identifies weakly correlated attributes that are removed to improve clustering accuracy, though this decreases classifier accuracy. The paper concludes privacy preserving data mining of code profiling data is an emerging area that could benefit from additional encryption and classification techniques.
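The paper's finding, that clustering encrypted data can match clustering the plaintext, holds whenever the transform preserves pairwise distances. The toy sketch below uses a rotation as a stand-in for the paper's actual encryption scheme (the points and k-means implementation are invented for illustration):

```python
import math
import random

def rotate(p, theta):
    """Distance-preserving transform standing in for the encryption step:
    rotating every point by the same angle keeps all pairwise distances."""
    x, y = p
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def kmeans(points, k, iters=25, seed=0):
    """A tiny Lloyd's-algorithm k-means returning cluster labels."""
    centers = random.Random(seed).sample(points, k)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: dist2(p, centers[c]))
                  for p in points]
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centers[c] = (sum(x for x, _ in members) / len(members),
                              sum(y for _, y in members) / len(members))
    return labels

plain = [(0, 0), (0.2, 0.1), (5, 5), (5.1, 4.9)]   # two obvious clusters
cipher = [rotate(p, 1.234) for p in plain]          # the "encrypted" view
labels_plain = kmeans(plain, 2)
labels_cipher = kmeans(cipher, 2)
```

Because the rotation changes coordinates but not distances, k-means assigns identical cluster labels to the plaintext and "encrypted" data.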
- Davide Mottin is an assistant professor in the Department of Computer Science at Aarhus University who researches graph mining.
- His talk discusses unveiling knowledge in knowledge graphs through personalized summarization techniques. Knowledge graphs contain entities and relationships between them.
- He describes an approach for generating personalized summaries of a knowledge graph based on a user's query history. The algorithm aims to find a subgraph that maximizes the probability of answering future queries, subject to a size limit.
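A greatly simplified sketch of such query-driven summarization might score each triple by how often its entities appear in the user's query log and keep the top-k. The entities, triples, and utility function below are invented for illustration and are not the paper's algorithm:

```python
from collections import Counter

# A toy knowledge graph as (subject, predicate, object) triples.
triples = [
    ("Aarhus", "locatedIn", "Denmark"),
    ("Mottin", "worksAt", "Aarhus"),
    ("Aarhus", "type", "University"),
    ("Denmark", "type", "Country"),
]

query_log = ["Aarhus", "Aarhus", "Mottin", "Denmark"]  # past query entities

def summarize(triples, query_log, k):
    """Keep the k triples whose subject/object were queried most often,
    a greedy proxy for 'most likely to answer future queries'."""
    freq = Counter(query_log)
    utility = lambda t: freq[t[0]] + freq[t[2]]
    return sorted(triples, key=utility, reverse=True)[:k]

summary = summarize(triples, query_log, k=2)
```

The size limit k plays the role of the subgraph budget described in the talk; a real system would also model query probabilities rather than raw counts.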
Decentralized AI: Convergence of AI + Blockchain geetachauhan
Santa Clara IoT Expo talk slides - covering the convergence of AI and Blockchain and how it solves challenges for IoT, AI@Edge, data ethics, and user data monetization
Decentralized AI: Convergence of Blockchain + AIgeetachauhan
This document discusses the convergence of blockchain and AI through decentralized AI approaches. It outlines challenges with centralized AI models regarding privacy, influence, economics and transparency. Decentralized solutions proposed include federated learning, blockchain, homomorphic encryption, and data marketplaces. Blockchain provides an open, trustless network to replace centralized authorities and enable applications like data exchanges, AI marketplaces and distributed machine learning across devices. Overall the goal is to democratize AI and data through user ownership and control.
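Federated learning, the first of the decentralized techniques listed, can be sketched in a few lines: each device fits a model on its private data, and only the model weights are shared and averaged by the server. The data and the one-parameter model below are purely illustrative, not any framework's implementation:

```python
def local_fit(xs, ys):
    """Least-squares slope through the origin, fit on-device."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

device_data = [
    ([1, 2, 3], [2.1, 3.9, 6.0]),   # device A's private samples
    ([1, 2, 4], [1.9, 4.1, 8.2]),   # device B's private samples
]

# Each device shares only its weight; the server averages them (FedAvg).
local_weights = [local_fit(xs, ys) for xs, ys in device_data]
global_weight = sum(local_weights) / len(local_weights)
```

The raw samples never leave the devices; only the fitted weights travel over the network, which is the privacy property that makes the approach attractive for IoT.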
There is a good chance that you have heard of artificial intelligence, machine learning, blockchain and bots. However, do you know what the implications of each of these technologies are? How it can and will impact your business in the near future? In this talk, we will discuss these technological trends, as well as a few others, that you will need to be familiar with as your association prepares to compete over the next few years. Let's take a peek into the future that is already here!
Blockchain is a promising technology getting a lot of attention these days; however, organizations aren’t entirely sure how it might improve business operations, what the risk implications are, and the security savviness needed to implement securely.
This webcast will address the most pressing issues and misconceptions surrounding Blockchain today, including:
• What is Blockchain?
• What are the new technologies I need to understand?
• Use Cases: where is Blockchain most advantageous?
• Snooze Cases: where/when is Blockchain a bad idea?
• What are the most common pitfalls with Blockchain?
Digital Transformation Major tech trends through the customer lens and relati...Larry Smith
Digital Transformation
Major tech trends through the customer lens and relationships to the Insurance Industry
7 core technology trends: Mobility – Data – Social - Bots – Intelligence – Visualization – Things
The document summarizes 10 key technology trends according to Gartner: 1) Applied AI and machine learning, 2) Intelligent apps, 3) Intelligent things, 4) Virtual and augmented reality, 5) Digital twins, 6) Blockchain, 7) Conversational systems, 8) Mesh app and service architecture, 9) Digital technology platforms, and 10) Adaptive security architecture. These trends are driven by advances in processing power, algorithms, sensors, and other technologies and will transform industries and how people interact with technology through more intelligent, personalized, and immersive experiences.
Algorithm Marketplace and the new "Algorithm Economy"Diego Oppenheimer
Diego Oppenheimer discusses the rise of algorithm marketplaces and the new "algorithm economy". Key points include:
- Advances in machine learning, computer vision, speech recognition and natural language processing are enabling algorithms to interpret unstructured data at scale.
- Algorithm marketplaces allow algorithms to be hosted, discovered, monetized and composed modularly to address a wide range of use cases across many industries.
- The algorithm economy will lower barriers to applying machine intelligence and foster innovation as algorithms become reusable assets that creators and users can both benefit from.
Presentation of Ethereum by Stephan Tual at Mozilla Paris. To view and take part in upcoming Ethereum events:
http://www.meetup.com/Ethereum-Paris
Digital Experiences Using a Conversational InterfaceBala Iyer
The document discusses conversational interfaces and chatbots. It notes that chatbots allow users to interact with businesses through messaging apps using natural language. Chatbots are powered by artificial intelligence to understand users and perform tasks. Examples are given of different types of chatbots and popular platforms. Conversational commerce is emerging as users prefer quick interactions through their preferred messaging channels. Companies are advised to choose a business process to automate, develop on an open platform, collect user and product data, and explore opportunities to improve digital experiences and business models through chatbots.
Digitalization: A Challenge and An Opportunity for BanksJérôme Kehrli
Today’s banking industry era is strongly defined by one word: digital. The urgency to act grows more severe each day. Banks using digital technologies to automate processes, improve regulatory compliance, and transform the customer experience may realize a profit upside of 40% or more, while laggards that resist digital innovation will be punished by customers, financial markets, and regulators, and may see up to 35% of net profit eroded, according to a McKinsey analysis.
The vital question to answer is, do we get digitalization right? Why is it getting extremely urgent to digitize?
Priming Your Enterprise for Digital TransformationWSO2
The role of digital technology is rapidly shifting, from being a driver of marginal efficiency to an enabler of fundamental innovation and disruption, according to a white paper on digital enterprises by the World Economic Forum. The digital economy has changed the world of business, levelling the playground for newer entrants to compete head on with larger traditional enterprises.
In order to be competitive in today’s digital economy, organizations need to take steps to become digitally mature. This can be done both through internal and external digital innovations and transformations including
Transforming existing legacy systems via an integration layer
Building a macro or micro-services layer coupled with leaner devops for faster time-to-market
Enabling API driven stakeholder-inclusive businesses
Identifying new business insights via analytics
This document discusses various emerging technologies including Internet of Things (IoT), digital transformation, big data, data analytics, machine learning, artificial intelligence, blockchain, Ripple, LiFi, and Mitz technologies. It provides overviews and examples of each technology, noting how IoT is bringing more connected devices and creating challenges around data structures, formats, and analytics. Artificial intelligence can help with IoT data preparation, discovery, visualization, prediction, and geospatial analysis. Blockchain provides benefits for tracking connected devices and enabling secure transactions without centralized control.
Digital Transformation and Innovation on http://denreymer.com
- Merging the Real World and the Virtual World
- Intelligence Everywhere
- The New IT Reality Emerges
http://www.gartner.com//it/content/2940400/2940420/january_15_top_10_technology_trends_2015_dcearley.pdf
The document discusses recent trends in information technology including virtual and augmented reality, cloud computing, 5G wireless, the Internet of Things (IoT), and big data analytics. It provides an agenda for the session covering these topics and case studies applying these technologies. Examples of how IoT is enabling industrial automation and transportation are presented. The growth of big data and opportunities it provides are also summarized. The document concludes with a discussion of how information technology is developing through artificial intelligence, machine learning, smart devices, data, and social media.
Introduction to Automatic Machine LearningSri Ambati
How can you bring machine learning to the masses? Machine learning projects struggle with finding talent, the time it takes to build and deploy models, and trusting the models that get built.
How can you enable multiple teams in your organization to create accurate ML models without being experts in data science or machine learning?
Wondering about the different flavors of AutoML?
H2O Driverless AI packages the techniques of expert data scientists into an easy-to-use application that helps scale your data science efforts. Driverless AI lets data scientists work through projects faster by using automation and the cutting-edge computing power of GPUs to complete in minutes tasks that used to take months.
With H2O Driverless AI, everyone, including expert and junior data scientists, domain scientists, and data engineers, can develop trusted machine learning models. This next-generation machine learning platform delivers unique, advanced functionality for data visualization, feature engineering, model interpretability, and low-latency deployment.
H2O Driverless AI provides:
* Automatic data visualization
* Automatic feature engineering at Grandmaster level
* Automatic model selection
* Automatic model tuning and training
* Automatic parallelization across multiple CPUs or GPUs
* Automatic model ensembling
* Automatic machine learning interpretability (MLI)
* Automatic scoring-code generation
Want to try it yourself? You can get a free trial here: H2O Driverless AI trial.
Come to this session and discover how to get started with Automatic Machine Learning using H2O Driverless AI, and build powerful models with just a few clicks.
See you soon!
About H2O.ai
H2O.ai is a visionary Silicon Valley open source software company that created and reimagined what is possible. We are a company of makers that brought new platforms and technologies to market to drive the artificial intelligence movement. We are the makers of H2O, the leading open source data science and machine learning platform, used by nearly half the Fortune 500 and trusted by more than 14,000 organizations and hundreds of thousands of data scientists around the world.
Fintech workshop Part I - Law Society of Hong Kong - XccelerateHenrique Centieiro
What is fintech? What are the technologies leveraging Fintech? How AI, Blockchain, Cloud and Data Analytics are changing the financial world?
Henrique works as Innovation Project Manager implementing Fintech and Blockchain Projects for the Financial Industry
Find me here: linkedin.com/in/henriquecentieiro
Correlation Analysis Modeling Use Case - IBM Power Systems Gautam Siwach
Do people with good financial standing, a higher education level, and a steady job commit fewer crimes? And do uneducated or poor people commit more crime?
Data Source: the Communities and Crime Unnormalized Data Set
Website : http://archive.ics.uci.edu/ml/machine-learning-databases/00211/CommViolPredUnnormalizedData.txt
Total Observations : 2215
Total Variables : 147
Profiling PyTorch for Efficiency & Sustainabilitygeetachauhan
From my talk at the Data & AI Summit - the latest update on the PyTorch Profiler and how you can use it for efficiency optimizations. The talk also dives into the future and what we need to do together as an industry to move towards Sustainable AI
Building AI with Security Privacy in Mindgeetachauhan
The document discusses building AI with security and privacy in mind. It covers privacy challenges in AI like tensions between data privacy and model training. It then discusses various privacy preserving machine learning techniques like homomorphic encryption, differential privacy, secure multi-party computation, on-device computation, and federated learning. The document provides examples of how each technique works. It concludes by discussing tools and techniques for starting a privacy journey in AI and provides resources to learn more.
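Of the techniques listed, differential privacy is the easiest to sketch. A minimal, illustrative example (not the document's own code) is the Laplace mechanism applied to a count query, where the noise scale 1/epsilon reflects that adding or removing one record changes a count by at most 1:

```python
import math
import random

def private_count(values, predicate, epsilon, seed=0):
    """Differentially private count via the Laplace mechanism.
    Sensitivity of a count is 1, so the noise scale is 1/epsilon."""
    true_count = sum(1 for v in values if predicate(v))
    u = random.Random(seed).random() - 0.5      # uniform on [-0.5, 0.5)
    # inverse-CDF sample of a Laplace(0, 1/epsilon) variate
    noise = -(1 / epsilon) * math.copysign(1, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

ages = [23, 37, 41, 52, 29, 61]                 # toy private records
noisy = private_count(ages, lambda a: a > 40, epsilon=1.0)
```

The released value is close to the true count (3 here) but noisy enough that no single record's presence can be inferred; smaller epsilon means more noise and stronger privacy.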
Building Interpretable & Secure AI Systems using PyTorchgeetachauhan
Slides from my talk at Deep Learning World 2020. The talk covered use cases, special challenges and solutions for building Interpretable and Secure AI systems using Pytorch.
- Tools for building Interpretable models
- How to build secure, privacy preserving AI models with Pytorch
- Use cases and insights from the field
Decentralized AI: Convergence of Blockchain + AIgeetachauhan
As we move into a world where users will own their own data, and companies will use "Ethically Sourced Data", there will be a rampant need for Decentralized AI. Combined with Blockchain, one gets viable business models. This talk covers use cases for the convergence of Blockchain and AI.
Talk @ ACM SF Bay Area Chapter on Deep Learning for the medical imaging space.
The talk covers use cases, special challenges and solutions for Deep Learning for Medical Image Analysis using Tensorflow+Keras. You will learn about:
- Use cases for Deep Learning in Medical Image Analysis
- Different DNN architectures used for Medical Image Analysis
- Special purpose compute / accelerators for Deep Learning (in the Cloud / On-prem)
- How to parallelize your models for faster training and serving for inference
- Optimization techniques to get the best performance from your cluster (like Kubernetes/ Apache Mesos / Spark)
- How to build an efficient Data Pipeline for Medical Image Analysis using Deep Learning
- Resources to jump start your journey - like public data sets, common models used in Medical Image Analysis
The document discusses deep learning techniques for financial technology (FinTech) applications. It begins with examples of current deep learning uses in FinTech like trading algorithms, fraud detection, and personal finance assistants. It then covers topics like specialized compute hardware for deep learning training and inference, optimization techniques for CPUs and GPUs, and distributed training approaches. Finally, it discusses emerging areas like FPGA and quantum computing and provides resources for practitioners to start with deep learning for FinTech.
NIPS - Deep learning @ Edge using Intel's NCSgeetachauhan
The document discusses using Intel's Neural Compute Stick for deep learning at the edge. It introduces the Neural Compute Stick, which enables computer vision and AI capabilities in small, low power devices. It then provides an overview of deep learning and discusses how to build IoT applications using the Neural Compute Stick SDK. Examples of use cases for edge intelligence in IoT are also presented.
Best Practices for On-Demand HPC in Enterprisesgeetachauhan
Traditionally HPC has been popular in Scientific domains, but not in most other Enterprises. With the advent of on-demand-HPC in cloud and growing adoption of Deep Learning, HPC should now be a standard platform for any Enterprise leading with AI and Machine Learning. This session will cover the best practices for building your own on-demand HPC cluster for Enterprise workloads along with key use cases where Enterprises will benefit from HPC solution.
Deep learning @ Edge using Intel's Neural Compute Stickgeetachauhan
Talk @ Intel Global IoT DevFest, Nov 2017
The new generation of hardware accelerators are enabling rich AI driven, Intelligent IoT solutions @ the edge.
The talk showcased how to use Intel's latest Neural Compute Stick for accelerating deep learning IoT solutions. It also covered use cases and code details for running Deep Learning models on Intel's Neural Compute Stick.
Distributed deep learning optimizations - AI WithTheBestgeetachauhan
Learn how to optimize Tensorflow for your Intel CPU and techniques for distributed deep learning for both model training and inferencing. Talk @ AI WithTheBest
Distributed deep learning optimizationsgeetachauhan
The document discusses optimizations for distributed deep learning. It covers challenges like latency, cost and power consumption when scaling deep learning models. It then discusses specialized compute like Google TPUs and optimizations for CPU, GPU and inference workloads. Techniques like data parallelism, model parallelism, quantization and clustering are presented. Emerging areas like FPGA, neuromorphic and quantum computing are also mentioned.
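Quantization, one of the inference optimizations mentioned, can be illustrated with a per-tensor int8 scheme. This is a simplified sketch with invented weights, not any framework's actual implementation:

```python
def quantize(weights):
    """Map floats to int8 values with a single per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate floats from the int8 values and the scale."""
    return [v * scale for v in q]

weights = [0.82, -0.41, 0.07, -1.27, 0.5]   # toy weight tensor
q, scale = quantize(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

Storing 8-bit integers instead of 32-bit floats cuts memory traffic roughly 4x, and the rounding error is bounded by half the scale, which is why quantized inference usually loses little accuracy.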
Intel optimized tensorflow, distributed deep learninggeetachauhan
This document discusses optimizations for running TensorFlow on Intel CPUs for deep learning. It outlines techniques for compiling TensorFlow from source with CPU optimizations, using proper data formats and batch sizes, and reading data with queues to leverage multi-core CPUs. It also covers distributed deep learning using TensorFlow Estimators, parameter servers, and model parallelism to distribute graphs across multiple machines. Resources for further information on Intel optimizations, installing libraries, and distributed TensorFlow are provided.
How Deep Learning will change IoT, taking us into a new era of AI-driven smart IoT devices with intelligence at the edge. The talk covers use cases and code details for running Tensorflow models on Intel Edison and Raspberry Pi. Slides from the talk given at the Intel IoT With the Best 2017 conference
Build Secure IOT Solutions using Blockchaingeetachauhan
This document discusses using blockchain technology to build more secure Internet of Things (IoT) solutions. It begins by outlining some of the major security challenges facing IoT, including high-profile hacks that have impacted systems like HVAC and medical devices. It then provides an overview of blockchain technology, explaining how its distributed ledger model can replace middlemen and enable more open, trustworthy and secure digital record keeping through the use of techniques like smart contracts. The document presents several case studies of companies applying blockchain to improve IoT security for applications such as home rentals, solar energy tracking, and drone deliveries. It concludes by recommending some starting points for working with blockchain and IoT security, like the Ethereum platform.
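The tamper-evidence that makes a distributed ledger trustworthy for IoT records comes from hash chaining, which can be sketched with the standard library alone. The device names and readings below are invented for illustration:

```python
import hashlib
import json

def add_block(chain, payload):
    """Append a block whose hash covers the payload and the previous
    hash, so tampering with any earlier record breaks every later link."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    digest = hashlib.sha256(
        json.dumps({"prev": prev, "payload": payload}, sort_keys=True).encode()
    ).hexdigest()
    chain.append({"prev": prev, "payload": payload, "hash": digest})

def verify(chain):
    """Recompute every hash; any altered record makes verification fail."""
    prev = "0" * 64
    for block in chain:
        expected = hashlib.sha256(
            json.dumps({"prev": prev, "payload": block["payload"]},
                       sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected or block["prev"] != prev:
            return False
        prev = block["hash"]
    return True

ledger = []
add_block(ledger, {"device": "thermostat-7", "reading": 21.5})
add_block(ledger, {"device": "lock-3", "event": "opened"})
ok_before = verify(ledger)                    # True
ledger[0]["payload"]["reading"] = 99.9        # tamper with an old record
ok_after = verify(ledger)                     # False
```

A real blockchain adds consensus and replication on top, but this chained-hash structure is the core of the "trustworthy record keeping" the document describes.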
Data Analytics in Real World (May 2016)geetachauhan
This document discusses challenges and solutions for data analytics in the real world. It outlines technological challenges like rapidly evolving technology stacks and shifts to cloud and hybrid models. Organizational challenges include long ROI timelines and a lack of domain expertise. The document then describes architectural patterns for data analytics, including lambda architecture, edge analytics, treating data centers as computers, and using blockchain. It emphasizes skills like continuous learning, experimentation, and using data to drive decisions.
Geeta Chauhan presented on data analytics in the real world. The presentation covered challenges like evolving technology, data cleansing, and cultural adoption of data-driven decision making. Architectural patterns discussed included lambda architecture with real-time and batch layers, edge analytics closer to data sources, and using data centers like distributed computing clusters. Key takeaways emphasized continuous learning, experimentation, and automation to enable rapid iteration in analytics projects.
This document discusses the potential of blockchain technology to revolutionize various industries by creating a decentralized internet of value. It describes how blockchain uses distributed ledgers and cryptography to allow for trustless and transparent transactions without middlemen. Examples are given of how blockchain could transform industries like transportation (Uber), healthcare (electronic medical records), insurance (peer-to-peer models), and more. Challenges around scalability and regulation are also mentioned. The document promotes blockchain as a means to fully democratize the internet through decentralized applications, smart contracts, and new models of value exchange and autonomous organizations.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Digital Marketing Trends in 2024 | Guide for Staying AheadWask
https://www.wask.co/ebooks/digital-marketing-trends-in-2024
Feeling lost in the digital marketing whirlwind of 2024? Technology is changing, consumer habits are evolving, and staying ahead of the curve feels like a never-ending pursuit. This e-book is your compass. Dive into actionable insights to handle the complexities of modern marketing. From hyper-personalization to the power of user-generated content, learn how to build long-term relationships with your audience and unlock the secrets to success in the ever-shifting digital landscape.
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
5th LF Energy Power Grid Model Meet-up SlidesDanBrown980551
5th Power Grid Model Meet-up
It is with great pleasure that we extend to you an invitation to the 5th Power Grid Model Meet-up, scheduled for 6th June 2024. This event will adopt a hybrid format, allowing participants to join us either through an online Mircosoft Teams session or in person at TU/e located at Den Dolech 2, Eindhoven, Netherlands. The meet-up will be hosted by Eindhoven University of Technology (TU/e), a research university specializing in engineering science & technology.
Power Grid Model
The global energy transition is placing new and unprecedented demands on Distribution System Operators (DSOs). Alongside upgrades to grid capacity, processes such as digitization, capacity optimization, and congestion management are becoming vital for delivering reliable services.
Power Grid Model is an open source project from Linux Foundation Energy and provides a calculation engine that is increasingly essential for DSOs. It offers a standards-based foundation enabling real-time power systems analysis, simulations of electrical power grids, and sophisticated what-if analysis. In addition, it enables in-depth studies and analysis of the electrical power grid’s behavior and performance. This comprehensive model incorporates essential factors such as power generation capacity, electrical losses, voltage levels, power flows, and system stability.
Power Grid Model is currently being applied in a wide variety of use cases, including grid planning, expansion, reliability, and congestion studies. It can also help in analyzing the impact of renewable energy integration, assessing the effects of disturbances or faults, and developing strategies for grid control and optimization.
What to expect
For the upcoming meetup we are organizing, we have an exciting lineup of activities planned:
-Insightful presentations covering two practical applications of the Power Grid Model.
-An update on the latest advancements in Power Grid -Model technology during the first and second quarters of 2024.
-An interactive brainstorming session to discuss and propose new feature requests.
-An opportunity to connect with fellow Power Grid Model enthusiasts and users.
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite efforts to optimize hashtables, that go as far as sacrificing core functionality, state-of-the-art designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state-of-the-art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. In a commodity server and a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
Trusted Execution Environment for Decentralized Process MiningLucaBarbaro3
Presentation of the paper "Trusted Execution Environment for Decentralized Process Mining" given during the CAiSE 2024 Conference in Cyprus on June 7, 2024.
Generating privacy-protected synthetic data using Secludy and MilvusZilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
leewayhertz.com-AI in predictive maintenance Use cases technologies benefits ...alexjohnson7307
Predictive maintenance is a proactive approach that anticipates equipment failures before they happen. At the forefront of this innovative strategy is Artificial Intelligence (AI), which brings unprecedented precision and efficiency. AI in predictive maintenance is transforming industries by reducing downtime, minimizing costs, and enhancing productivity.
leewayhertz.com-AI in predictive maintenance Use cases technologies benefits ...
Decentralized AI Draper
1. DECENTRALIZED AI: CONVERGENCE OF BLOCKCHAIN + AI
APRIL 2019
GEETA CHAUHAN, CTO
2. RESILIENT ANTI-FRAGILE PLATFORMS
• Data Center as a Computer
• Microservices
• Elastic Scaling
• Self-healing
• Chaos Monkey
• Canary Deployments
• A/B Testing
• Continuous Learning
• Continuous Monitoring
• Software Defined Networking
• 17 Platforms, Countless Apps, 1B Customers…
• Decentralized Cloud
• Crowd-sourcing Platform
• SaaS for Chip Design
• AI/Deep Learning Pipelines, Recommendation Engines
• Data Platforms for Telecom
• Own Cloud Platform
• IoT Analytics
• Mobile Device Management
• Social Media Monitoring Platform
• Web Monitoring Platform
• Speech Analytics Platform
• Developer Platforms w/ Real-time Debugger
3. CENTRALIZED AI IS LIKE THE CLOSED SOURCE OF THE 1990S
4. CHALLENGES?
Privacy Problem: Can entities train a model without disclosing their data?
Influence Problem: Can 3rd parties contribute to the behavior of an AI model in a way that is quantifiably influential?
Economic Problem: Can 3rd parties be correctly incentivized to contribute to the knowledge & quality of AI models?
Transparency Problem: Can the activity and behavior of an AI model be transparently available to all parties without a trusted middleman?
Latency Problem: Centralized AI is inappropriate for use cases where AI needs to interact in real time with the real world.
6. FEDERATED AI
• Subset of devices selected, each downloads the model
• Train model with local data
• Model updates – gradients – sent back to server
• Server aggregates
• Example: cancer treatment centers jointly training models
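The round described above can be sketched as a minimal federated-averaging loop. The 1-D linear model, learning rate, and toy client datasets are illustrative assumptions, not part of the talk; the point is that raw data never leaves a client, only model weights do.

```python
# Minimal FedAvg sketch: clients fit y = w * x locally on private data,
# the server averages the returned weights each round.

def local_train(w, data, lr=0.01, epochs=20):
    # plain SGD on squared error; the samples stay on the client
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x
            w -= lr * grad
    return w

def federated_round(w_global, client_datasets):
    # each selected client downloads the model, trains locally,
    # and only the updated weight is sent back and averaged
    local_ws = [local_train(w_global, d) for d in client_datasets]
    return sum(local_ws) / len(local_ws)

# three clients each hold private samples of the same trend y = 3x
clients = [[(x, 3.0 * x) for x in (1.0, 2.0)] for _ in range(3)]
w = 0.0
for _ in range(10):
    w = federated_round(w, clients)
print(round(w, 2))  # approaches 3.0 without any client sharing data
```

In a real deployment the updates would also be clipped, secure-aggregated, or noised for differential privacy before the server sees them.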
7. WHAT IS BLOCKCHAIN?
• An immutable record of digital events shared peer to peer between different parties
• Distributed Ledger, Audit Trail
• Open + Trust + Secure
• Smart Contracts → dApps, DAOs
• Non-Fungible Tokens, Security Tokens
• Fully Democratize the Internet: Information Age → Internet of Value
Source: Economist.com
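The "immutable record" property can be illustrated with a toy hash chain (a sketch, not any specific blockchain): each block commits to the previous block's hash, so altering any past event breaks verification from that point onward.

```python
import hashlib
import json

def block_hash(event, prev):
    # hash the block contents together with the previous block's hash
    payload = json.dumps({"event": event, "prev": prev}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append_block(chain, event):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"event": event, "prev": prev,
                  "hash": block_hash(event, prev)})

def verify(chain):
    # recompute every link; tampering anywhere breaks the chain
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or b["hash"] != block_hash(b["event"], b["prev"]):
            return False
        prev = b["hash"]
    return True

chain = []
for e in ["sensor reading A", "sensor reading B", "payment X"]:
    append_block(chain, e)
assert verify(chain)

chain[1]["event"] = "tampered"  # rewrite history
assert not verify(chain)        # immediately detected
```

A real blockchain adds consensus and peer-to-peer replication on top of this tamper-evidence, so no single party can rewrite the ledger.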
8. WHAT IS HOMOMORPHIC ENCRYPTION?
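Homomorphic encryption lets you compute on ciphertexts without decrypting them. As one concrete toy instance (not the scheme any particular product uses), the Paillier cryptosystem is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. A minimal sketch with deliberately tiny, insecure primes — real deployments use 2048-bit moduli:

```python
import random
from math import gcd

def keygen(p, q):
    # toy Paillier keygen; requires gcd(pq, (p-1)(q-1)) == 1
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    g, n2 = n + 1, n * n
    # L(x) = (x - 1) // n; mu = L(g^lam mod n^2)^-1 mod n
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    n2 = n * n
    return (pow(c, lam, n2) - 1) // n * mu % n

pub, priv = keygen(61, 53)  # insecure demo primes
c1, c2 = encrypt(pub, 42), encrypt(pub, 100)
# additive homomorphism: Enc(a) * Enc(b) mod n^2 decrypts to a + b
c_sum = c1 * c2 % (pub[0] ** 2)
assert decrypt(pub, priv, c_sum) == 142
```

The same property gives ciphertext-by-plaintext scaling (`Enc(m)^k` decrypts to `k*m`), which is enough to aggregate encrypted model updates; fully homomorphic schemes extend this to arbitrary computation at much higher cost.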
10. DATA EXCHANGE
• Blockchain for Data Provenance
• User Owned Data
• Time expiry for data
• Ethically Sourced Data
– Transparency
– Fairness
– Privacy
11. ETHICAL AI
• Personal Data Rights & Individual Access Controls
• Well-being Metrics
• Awareness of Misuse
• Respect for Privacy
• Governance of AI Autonomy
• Accountability, Transparency
• IEEE Ethically Aligned Design, EU Trustworthy AI guidelines
• California “Data Dividend” proposal: https://cnn.it/2N9KEay
12. AI MARKETPLACE
• Data competition each week
• Encrypted data released
• Crowdsourced data science models
• Data scientists retain IP – encrypted models
• Participants paid in Bitcoin based on the accuracy of their predictions; payouts to top 60
• Originality paid extra
13. OTHER PLAYERS
SingularityNET: Smart Contracts for Decentralized AI Microservices
Ocean Protocol: Ecosystem for Sharing Data and Services
Effect.AI: Decentralized MTurk, Human-in-the-loop AI
Distributed ML: Blockchain-agnostic Runtime to run ML Models across Devices
14. LATENCY CHALLENGE
• Slow inference problem
• Real-time scenarios: AI needs to interact in real time with the real world
• Need compute on / close to edge devices
Democratizing cloud computing for cloud resource providers and application developers
15. LATENCY CHALLENGE FOR IOT
• Slow inference problem
• Real-time scenarios: AI needs to interact in real time with the real world
• Need compute on / close to edge devices for your apps / microservices