This document discusses AI in the enterprise from past, present, and future perspectives. It provides an overview of the history and recent developments in AI and deep learning, including improved performance on tasks like image recognition. Case studies are presented showing how various large companies have successfully applied deep learning techniques like convolutional neural networks to problems in different industries involving computer vision, predictive maintenance, fraud detection, and more. The importance of data quantity for deep learning performance is highlighted. The final sections discuss challenges in AI adoption and the importance of piloting models before full production deployment.
Automated AI: The Next Frontier in Analytics - StampedeCon AI Summit 2017 (StampedeCon)
This talk will walk through the important building blocks of Automated AI. Rajiv will highlight the current gaps in analytics organizations and how to close those gaps using automated AI. Issues discussed around automated AI include the accuracy of models, the trade-offs around control when using automation, the interpretability of models, and integration with other tools, highlighted with examples of automated analytics in different industries. The talk will end with some examples of how automated AI in the hands of data scientists and business analysts is transforming analytics teams and organizations.
Why is artificial intelligence in business analytics so critical for business... (Countants)
Be it in the form of deep learning technologies, autonomous vehicles, or smart robots, artificial intelligence (or AI) is making its presence felt everywhere in the connected world. With AI-enabled technologies having a prominent place in the Gartner Hype Cycle for Emerging Technologies, this technology is enhancing the capabilities of business analytics and business intelligence.
[AI in finance] AI in regulatory compliance, risk management, and auditing (Natalino Busa)
AI to Improve Regulatory Compliance, Governance & Auditing. How AI identifies and prevents risks, above and beyond traditional methods. Techniques and analytics that protect customers and firms from cyber-attacks and fraud. Using AI to quickly and efficiently provide evidence for auditing requests.
Functionalities in AI Applications and Use Cases (OECD) (AnandSRao1962)
This presentation was given at the OECD Network of AI Specialists (ONE) held in Paris on February 26 and 27. It covers the methodology for assessing AI use cases by technology, value chain, use, business impact, business value, and effort required.
In an increasingly data-centric world, a company that fails to leverage the power of AI-powered business intelligence tools often lags behind. Learn from these slides how these tools are affecting businesses today and why you should choose them.
Talk presented at the Analytics Frontiers Conference in Charlotte on March 21. The presentation evaluates opportunities and risks of AI and how consumers, businesses, society and governments can mitigate some of the risks.
AI continues to expand into different areas like healthcare, agriculture, scientific research and auditing.
AI is still only scratching the surface of its potential applications, especially where it can work with time-series data.
Artificial Intelligence in Project Management by Dr. Khaled A. Hamdy (Agile ME)
A video recording of Dr. Khaled's session can be found at https://youtu.be/TFNhvAXNU5E.
The presentation explores how Artificial Intelligence (AI) can be used in the Project Management field. The origins and history of AI are discussed, followed by a brief, simplified explanation of the theories behind its application. The actual utilization of AI tools in the Project Management domain is then discussed, covering diverse areas such as Engineering Design, Cost Estimating and Bidding, Planning and Scheduling, Risk Management, Performance Prediction, and Project Monitoring and Control. The presentation concludes with a brief discussion of Data Management and Knowledge Engineering and how they are used today to simplify (or complicate) our lives.
Naghi Prasad at AI Frontiers: Building AI systems to automate enterprise proc... (AI Frontiers)
In this talk we will discuss our experience building AI systems for enterprise process automation. Using examples of real-life deployed AI systems in AdTech, customer service, mortgage financing and recruiting, we discuss the lessons and insights we gleaned.
AI can be used to create sophisticated tools to monitor and analyze behavior and activities in real time. Since these systems can adapt to changing risk environments, they continually enhance the organization’s monitoring capabilities in areas such as regulatory compliance and corporate governance.
AI systems:
- can adapt to changing risk environments
- continually enhance the organization’s monitoring capabilities
- better manage regulatory compliance and corporate governance
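As a loose illustration of the "adapt to changing risk environments" idea above (a generic sketch, not any specific vendor's system), here is a minimal adaptive anomaly monitor whose alert threshold tracks a drifting mean and variance:

```python
class AdaptiveMonitor:
    """Flags values far outside an exponentially weighted baseline.

    Because the mean/variance estimates keep updating, the alert
    threshold adapts as the underlying "risk environment" drifts.
    """

    def __init__(self, alpha=0.1, z_threshold=4.0, warmup=5):
        self.alpha = alpha      # how quickly the baseline adapts
        self.z = z_threshold    # alert when this many std-devs from the mean
        self.warmup = warmup    # observations to see before alerting
        self.mean = None
        self.var = 0.0
        self.n = 0

    def observe(self, x):
        """Return True if x looks anomalous, then fold x into the baseline."""
        self.n += 1
        if self.mean is None:
            self.mean = float(x)
            return False
        std = self.var ** 0.5
        anomalous = self.n > self.warmup and std > 0 and abs(x - self.mean) > self.z * std
        # Update the EWMA estimates so the monitor tracks a shifting baseline.
        diff = x - self.mean
        self.mean += self.alpha * diff
        self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
        return anomalous

monitor = AdaptiveMonitor()
normal_traffic = [100, 102, 98, 101, 99, 103, 97, 100, 102, 99]
flags = [monitor.observe(v) for v in normal_traffic]   # no alerts expected
spike_flagged = monitor.observe(5000)                  # clear outlier
```

Production monitoring systems use far richer models, but the principle is the same: the baseline is learned and continuously updated rather than fixed by hand.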
Weiyan Zhao, Nationwide Insurance - A Decade of Data Science. The Nationwide ... (Sri Ambati)
This session was recorded in NYC on October 22nd, 2019 and can be viewed here: https://youtu.be/_RxH-bNRqp0
A Decade of Data Science. The Nationwide Journey
The Nationwide Enterprise Analytics Office (formerly Customer Insights and Analytics) has more than 10 years of experience in end-to-end data product development and system integration. The culture to attract, train and develop talent, the technical advancement to apply new methods, the model factory to productionize models, and responsive processes to measure business impact have all contributed to positive business outcomes as well as the team’s fast growth. In this talk, we will introduce Nationwide’s data science capabilities through case studies of a few data products the team has built and deployed.
Bio: Weiyan Zhao is the Director of Data Science at Nationwide Insurance’s Enterprise Analytics Office. She currently leads a team of data scientists providing enterprise solutions that drive business value and influence decisions through the application of advanced statistical modeling and machine learning techniques. Previously, Weiyan served as an Analytics Manager at Chase, and as a Research Associate at Nationwide Children’s Hospital and at the University of Texas at San Antonio. She received her PhD in Epidemiology and Statistics, and has been passionate about data and analytics throughout her career. She is also a long-term volunteer for different non-profit organizations promoting culture and diversity, and mentors young professionals.
Support for the presentation “Does AI Improve Managerial Decision-Making?” at the International Conference Airport Operational Excellence, Jan. 28-30, 2019
Benefits of AI-enabled project management (Orangescrum)
The adoption of Artificial Intelligence in project and task management tools is helping develop chatbots and add AI-enabled functionality to engage with users.
Artificial Intelligence application in the workplace (Nandini Singh)
Sharing my latest work on #artificialintelligence in #workplacesolutions. I will start sharing such decks more frequently. Please do share your comments.
This presentation was given at the OECD Network of AI Specialists (ONE) held in Paris on February 26 and 27. It was presented at the joint session of ONE and the Digital Council. It covers some of the key trends and developments in AI, including operationalizing AI, responsible AI, and National AI Strategies.
In this new Accenture Finance & Risk presentation we explore machine learning as a solution to some of the most important challenges faced by the banking sector today. To learn more, read our blog on Machine Learning in Banking: https://accntu.re/2oTVJiX
Frontiers in Alternative Data: Techniques and Use Cases (QuantUniversity)
QuantUniversity Summer School 2020 (https://qusummerschool.splashthat.com/)
https://quspeakerseries10.splashthat.com/
Lecture 1: Alexander Denev
In this talk, Alexander will introduce Alternative Data and discuss its uses, drawing from his book, The Book of Alternative Data:
- What is alternative data?
- Adoption of alternative data
- Information value chain
- Risks associated with alternative data
- Processes required to develop signals
- Valuation of alternative data
Lecture 2: Saeed Amen
In this talk, Saeed will discuss use cases in Alternative Data:
- Deciphering Federal Reserve communications
- Using CLS flow data to trade FX
- Geospatial Insight satellite data to estimate retailers' EPS
- Saving "alpha" with transaction cost analysis
- Using Bloomberg News data to trade FX
Types of Blockchain - permissioned vs. permissionless platforms
Types of AI - Unsupervised, Supervised and Reinforcement Learning, Deep Learning
Future of Blockchain and AI
Foundations of Machine Learning - StampedeCon AI Summit 2017 (StampedeCon)
This presentation will cover all aspects of modeling, from preparing data to training and evaluating the results. It will describe the mainline ML methods, including neural nets, SVMs, boosting, bagging, trees, forests, and deep learning. The common problems of overfitting and dimensionality will be covered, with a discussion of modeling best practices. Other topics will include field standardization, encoding categorical variables, and feature creation and selection. It will be a soup-to-nuts overview of all the procedures necessary for building state-of-the-art predictive models.
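Two of the preprocessing steps this abstract names, encoding categorical variables and holding out data so overfitting can be detected, can be sketched in a few lines of plain Python (illustrative only, not material from the talk):

```python
import random

def one_hot(values):
    """Encode a list of category labels as one-hot vectors."""
    categories = sorted(set(values))
    index = {c: i for i, c in enumerate(categories)}
    return [[1 if index[v] == i else 0 for i in range(len(categories))]
            for v in values], categories

def train_val_split(rows, val_fraction=0.25, seed=42):
    """Shuffle and hold out a validation set. Comparing train vs.
    validation error of a fitted model is the basic overfitting check."""
    rows = list(rows)
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - val_fraction))
    return rows[:cut], rows[cut:]

encoded, cats = one_hot(["red", "green", "blue", "green"])
train, val = train_val_split(range(100))
```

A model that scores much better on `train` than on `val` is memorizing rather than generalizing, which is exactly the failure mode the held-out split exists to expose.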
Building an AI Startup: Realities & Tactics (Matt Turck)
AI is all the rage in tech circles, and the press is awash in tales of AI entrepreneurs striking it rich after being acquired by one of the giants. As always, the realities of building a startup are different, and the path to success requires not just technical prowess but also thoughtful market positioning and business excellence.
In a talk of interest to anyone building or implementing an AI product, Matt Turck and Peter Brodsky leverage hundreds of conversations with AI (and big data) founders and hard-learned lessons building companies from the ground up to highlight successful strategies and tactics.
Topics include:
Successful data acquisition strategies
Data network effects
Competing with the giants
A pragmatic approach to building an AI team
Why social engineering is just as important to success as groundbreaking AI technology
Driven by the rapid progress in Artificial Intelligence (AI) research, intelligent machines are gaining the ability to learn, improve and make calculated decisions in ways that will enable them to perform tasks previously thought to rely solely on human experience, creativity, and ingenuity. As a result, we will in the near future see large parts of our lives influenced by AI.
AI innovation will also be central to the achievement of the United Nations' Sustainable Development Goals (SDGs) and will help solve humanity's grand challenges by capitalizing on the unprecedented quantities of data now being generated on sentiment, behavior, human health, commerce, communications, migration and more.
With large parts of our lives being influenced by AI, it is critical that government, industry, academia and civil society work together to evaluate the opportunities presented by AI, ensuring that AI benefits all of humanity. Responding to this critical issue, ITU and the XPRIZE Foundation organized the AI for Good Global Summit in Geneva, 7-9 June 2017, in partnership with a number of UN sister agencies. The Summit aimed to accelerate and advance the development and democratization of AI solutions that can address specific global challenges related to poverty, hunger, health, education, the environment, and others.
The Summit provided a neutral platform for government officials, UN agencies, NGOs, industry leaders, and AI experts to discuss the ethical, technical, societal and policy issues related to AI, offer recommendations and guidance, and promote international dialogue and cooperation in support of AI innovation.
Please visit the AI for Good Global Summit page for more resources: https://www.itu.int/en/ITU-T/AI/Pages/201706-default.aspx
If you would like to speak, partner or sponsor the 2018 edition of the summit, please contact: ai@itu.int
Artificial intelligence (AI) is everywhere, promising self-driving cars, medical breakthroughs, and new ways of working. But how do you separate hype from reality? How can your company apply AI to solve real business problems?
Here’s what AI learnings your business should keep in mind for 2017.
Deep Learning - The Past, Present and Future of Artificial Intelligence (Lukas Masuch)
In the last couple of years, deep learning techniques have transformed the world of artificial intelligence. One by one, the abilities and techniques that humans once imagined were uniquely our own have begun to fall to the onslaught of ever more powerful machines. Deep neural networks are now better than humans at tasks such as face recognition and object recognition. They’ve mastered the ancient game of Go and thrashed the best human players. “The pace of progress in artificial general intelligence is incredibly fast” (Elon Musk, CEO of Tesla & SpaceX), leading to an AI that “would be either the best or the worst thing ever to happen to humanity” (Stephen Hawking, physicist).
What sparked this new hype? How is Deep Learning different from previous approaches? Let’s look behind the curtain and unravel the reality. This talk will introduce the core concept of deep learning, explore why Sundar Pichai (CEO Google) recently announced that “machine learning is a core transformative way by which Google is rethinking everything they are doing” and explain why “deep learning is probably one of the most exciting things that is happening in the computer industry“ (Jen-Hsun Huang – CEO NVIDIA).
TEDx Manchester: AI & The Future of Work (Volker Hirsch)
TEDx Manchester talk on artificial intelligence (AI) and how the ascent of AI and robotics impacts our future work environments.
The video of the talk is now also available here: https://youtu.be/dRw4d2Si8LA
What is Artificial Intelligence | Artificial Intelligence Tutorial For Beginn... (Edureka!)
** Machine Learning Engineer Masters Program: https://www.edureka.co/masters-program/machine-learning-engineer-training **
This tutorial on Artificial Intelligence gives you a brief introduction to AI, discussing how it can be both a threat and a benefit. This tutorial covers the following topics:
1. AI as a threat
2. What is AI?
3. History of AI
4. Machine Learning & Deep Learning examples
5. Dependency on AI
6. Applications of AI
7. AI Course at Edureka - https://goo.gl/VWNeAu
For more information, please write back to us at sales@edureka.co
Call us at IN: 9606058406 / US: 18338555775
Facebook: https://www.facebook.com/edurekaIN/
Twitter: https://twitter.com/edurekain
LinkedIn: https://www.linkedin.com/company/edureka
Top 5 Deep Learning and AI Stories - October 6, 2017 (NVIDIA)
Read this week's top 5 news updates in deep learning and AI: Gartner predicts top 10 strategic technology trends for 2018; Oracle adds GPU Accelerated Computing to Oracle Cloud Infrastructure; chemistry and physics Nobel Prizes are awarded to teams supported by GPUs; MIT uses deep learning to help guide decisions in ICU; and portfolio management firms are using AI to seek alpha.
AI and Machine Learning Demystified by Carol Smith at Midwest UX 2017 (Carol Smith)
What is machine learning? Is UX relevant in the age of artificial intelligence (AI)? How can I take advantage of cognitive computing? Get answers to these questions and learn about the implications for your work in this session. Carol will help you understand at a basic level how these systems are built and what is required to get insights from them. Carol will present examples of how machine learning is already being used and explore the ethical challenges inherent in creating AI. You will walk away with an awareness of the weaknesses of AI and the knowledge of how these systems work.
As the adoption of AI technologies increases and matures, the focus will shift from exploration to time-to-market, productivity, and integration with existing workflows. Governing enterprise data, scaling AI model development, and selecting a complete, collaborative hybrid platform and tools for rapid solution deployment are key focus areas for growing data science teams tasked with responding to business challenges. This talk will cover the challenges and innovations for AI at scale for the enterprise, focusing on the modernization of data analytics, the AI ladder and AI life cycle, and infrastructure architecture considerations. We will conclude by viewing the benefits and innovation of running modern AI and data analytics applications such as SAS Viya and SAP HANA on IBM Power Systems and IBM Storage in hybrid cloud environments.
Advanced Analytics and Data Science Expertise (SoftServe)
An overview of SoftServe's Data Science service line.
- Data Science Group
- Data Science Offerings for Business
- Machine Learning Overview
- AI & Deep Learning Case Studies
- Big Data & Analytics Case Studies
Visit our website to learn more: http://www.softserveinc.com/en-us/
Looks at the different AI approaches and provides some practical categorisation and case studies. Then talks about the data fabric you need to put in place to improve model accuracy and deployment. Covers: supervised, unsupervised, machine learning, deep learning, RPA, etc. Finishes with how to create successful AI projects.
Presentation that I delivered at "Accelerate AI, Europe 2018" in London on Sept 19, 2018. My focus is on socio-cultural perspective as well as proving information about various tools, vendors and partners available to help companies get started using AI.
This presentation provides an objective approach to make your legacy and custom-built applications agile and infused with intelligence. This allows your apps to utilize new and more substantial data sets as well as apply artificial intelligence and machine learning to take in-the-moment actions.
IBM's Watson is a machine-learning platform that’s been built to mirror the same learning process that humans have: Observe, Interpret, Evaluate and Decide. Through the use of this cognitive framework, Watson can search through a database of information and pull out key insights to bridge gaps in human knowledge. It’s expertise scaling for enterprise.
Watson has already helped businesses across a variety of industries increase their customer engagement, data discovery and informed decision making abilities. Is your business next?
Infopulse AI, Data Science & RPA Managed Services (Infopulse)
Infopulse applies methods and tools such as artificial intelligence, machine learning and deep learning, computer vision, natural language processing, predictive analytics, data mining, and robotic process automation to help our clients create customized tools that automate their operations, drive innovation and business growth, and improve customer service. Learn more about our expertise.
Big Data World Singapore 2017 - Moving Towards Digitization & Artificial Inte... (Garrett Teoh Hor Keong)
Presentation at Big Data World Asia Singapore 2017. A brief introduction to strategies for digitization transformation and introduction to Artificial Intelligence.
Watch here: https://bit.ly/3i2iJbu
You will often hear that "data is the new gold". In this context, data management is one of the areas that has received the most attention from the software community in recent years. From Artificial Intelligence and Machine Learning to new ways to store and process data, the landscape for data management is in constant evolution. From the privileged perspective of an enterprise middleware platform, we at Denodo have the advantage of seeing many of these changes happen.
Join us for an exciting session that will cover:
- The most interesting trends in data management.
- Our predictions on how those trends will change the data management world.
- How these trends are shaping the future of data virtualization and our own software.
Correlation Analysis Modeling Use Case - IBM Power Systems (Gautam Siwach)
Do people with good financial standing, a higher education level, and a steady job commit fewer crimes? And do less-educated or poorer people commit more crime?
Data source: the Communities and Crime Unnormalized Data Set
Website: http://archive.ics.uci.edu/ml/machine-learning-databases/00211/CommViolPredUnnormalizedData.txt
Total observations: 2215
Total variables: 147
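As a toy illustration of the correlation question above (with made-up numbers and hypothetical variable names, not the actual UCI data), a Pearson correlation can be computed directly:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical community-level figures (NOT the real dataset): in this
# toy sample, crime rate falls as median income rises.
median_income = [25, 32, 41, 48, 55, 63, 70]
crime_rate = [90, 75, 70, 52, 45, 38, 30]
r = pearson(median_income, crime_rate)   # strongly negative in this sample
```

On the real data the analysis would compute such coefficients across all 147 variables, remembering that correlation alone does not establish causation.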
Once you’ve made the decision to leverage AI and/or machine learning, you need to figure out how you will source the training data necessary for a fully functioning algorithm. Depending on your use case, you might need a significant amount of training data, and you’ll want to consider how it is labeled and annotated too.
View Applause's webinar with Cognilytica principal analysts Ronald Schmelzer and Kathleen Walch, alongside Kristin Simonini, Applause’s Vice President of Product, as they tackle the modern challenges that today’s companies face with sourcing training data.
Data is both our most valuable asset and our biggest ongoing challenge. As data grows in volume, variety and complexity, across applications, clouds and siloed systems, traditional ways of working with data no longer work.
Unlike traditional databases, which arrange data in rows, columns and tables, Neo4j has a flexible structure defined by stored relationships between data records.
We'll discuss:
- the primary use cases for graph databases
- the properties of Neo4j that make those use cases possible
- visualisation of graphs
- how to write queries
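The relationship-first storage model described above can be illustrated with a toy sketch in plain Python (this is not the Neo4j API; it only shows why storing relationships with the records makes traversals cheap):

```python
# Records with their relationships stored alongside them, graph-style:
# person -> list of people they KNOW.
knows = {
    "Alice": ["Bob", "Carol"],
    "Bob": ["Dave"],
    "Carol": ["Dave"],
    "Dave": [],
}

def friends_of_friends(graph, person):
    """Two-hop traversal: follow stored references directly instead of
    scanning table rows for matching foreign keys."""
    direct = set(graph.get(person, []))
    result = set()
    for friend in direct:
        result.update(graph.get(friend, []))
    return result - direct - {person}

fof = friends_of_friends(knows, "Alice")   # Dave, reachable via Bob and Carol
```

In Cypher, Neo4j's query language, the equivalent two-hop query would be roughly `MATCH (p:Person {name: 'Alice'})-[:KNOWS]->()-[:KNOWS]->(fof) RETURN DISTINCT fof` (the `Person` label and `KNOWS` relationship name here are hypothetical).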
Webinar, 23 July 2020
AI for Manufacturing (Machine Vision, Edge AI, Federated Learning) (byteLAKE)
This is the extended presentation about byteLAKE's and Lenovo's Artificial Intelligence solutions for Manufacturing.
Topics covered: AI strategy for manufacturing, Edge AI, Federated Learning and Machine Vision.
It's the first publication in the upcoming series: AI for Manufacturing. Highlights: AI-assisted quality monitoring automation, AI-assisted production line monitoring and issues detection, AI-assisted measurements, Intelligent Cameras and many more. Reach out to us to learn more: welcome@byteLAKE.com.
Presented during the world's first Federated Learning conference (Jun'20). Recording: https://youtu.be/IMqRIi45dDA
Related articles:
- Revolution in factories: Industry 4.0.
https://medium.com/@marcrojek/revolution-in-factories-industry-4-0-conference-made-in-wroclaw-2020-translation-ae96e5e14d55
- Cognitive Automation helps where RPAs fall short.
https://medium.com/@marcrojek/cognitive-automation-helps-where-rpas-fall-short-a1c5a01a66f8
- Machine Vision, how AI brings value to industries.
https://medium.com/@marcrojek/machine-vision-how-ai-brings-value-to-industries-e6a4f8e56f42
Learn more:
- https://www.bytelake.com/en/cognitive-services/
- https://www.lenovo.com/ai
- https://federatedlearningconference.com/
Algorithm Marketplace and the new "Algorithm Economy" (Diego Oppenheimer)
Talk by Diego Oppenheimer, CEO of Algorithmia.com, at Data Day Texas 2016.
Peter Sondergaard, VP of Research at Gartner, recently said the next digital gold rush is "how we do something with data, not just what you do with it". During this talk we will cover a brief history of the different algorithmic advances in computer vision, natural language processing, machine learning and general AI, and how they are being applied to Big Data today. From there we will talk about how algorithms are playing a crucial part in the next Big Data revolution, the new opportunities that are opening up for startups and large companies alike, as well as a first look into the role Algorithm Marketplaces will play in this space.
Similar to AI in the Enterprise: Past, Present & Future - StampedeCon AI Summit 2017:
Why Should We Trust You-Interpretability of Deep Neural Networks - StampedeCo...
Despite widespread adoption and success, most machine learning models remain black boxes. Many times users and practitioners are asked to implicitly trust the results. However, understanding the reasons behind predictions is critical in assessing trust, which is fundamental if one is asked to take action based on such models, or even to compare two similar models. In this talk I will (1) formulate the notion of interpretability of models, (2) review various attempts and research initiatives to solve this very important problem, and (3) demonstrate real industry use-cases and results, focusing primarily on Deep Neural Networks.
The Search for a New Visual Search Beyond Language - StampedeCon AI Summit 2017
Words are no longer sufficient in delivering the search results users are looking for, particularly in relation to image search. Text and languages pose many challenges in describing visual details and providing the necessary context for optimal results. Machine Learning technology opens a new world of search innovation that has yet to be applied by businesses.
In this session, Mike Ranzinger of Shutterstock will share a technical presentation detailing his research on composition aware search. He will also demonstrate how the research led to the launch of AI technology allowing users to more precisely find the image they need within Shutterstock’s collection of more than 150 million images. While the company released a number of AI search enabled tools in 2016, this new technology allows users to search for items in an image and specify where they should be located within the image. The research identifies the networks that localize and describe regions of an image as well as the relationships between things. The goal of this research was to improve the future of search using visual data, contextual search functions, and AI. A combination of multiple machine learning technologies led to this breakthrough.
Predicting Outcomes When Your Outcomes are Graphs - StampedeCon AI Summit 2017
In many modern applications data are collected in unusual form. Connectome or brain imaging data are graphs. Wearable devices measuring activity are functions over time. In many cases these objects are collected for each individual or transaction leaving the statistician with the challenge of analyzing populations of data not in classical numeric and categorical formats in big spreadsheets. In this talk I introduce object oriented data analysis with an application we recently developed for regression analysis. This talk will be aimed at the general data scientist and emphasis on the concepts and not mathematical detail. The take home message is how can we use covariates (i.e., meta-data) to predict what the structure of a brain image graph will be.
Novel Semi-supervised Probabilistic ML Approach to SNP Variant Calling - Stam...
This talk dives into the technical details of machine learning model development and implementation, and the value it brings to the Monsanto breeding pipeline. We genotype over 100 million seeds a year in order to save field resources and product development cycle time. Automation and high-throughput production from the lab becomes key to R&D success. In-house predictive model development incorporated a random forest ensemble based approach with additional features derived from a Gaussian mixture model. The results show over 95% accuracy with less than 1% false positives/negatives. The model is highly generalizable, with over 10 million data points being trained and tested on. The model also offers a probabilistic approach to present genotypes in a more meaningful way and help enhance downstream genomics analyses. The talk targets an audience in breeding, genetics, and molecular biology, and data scientists who are interested in practical applications.
How to Talk about AI to Non-analysts - StampedeCon AI Summit 2017
While artificial intelligence for self-driving cars and virtual assistants gets a lot of attention, communicating the needs, effectiveness and measurements of analytics is complicated when speaking "geek"! The work of an analyst does not just involve conducting data analysis, but communicating, championing and speaking simply when talking to the organization, clients and management.
Getting Started with Keras and TensorFlow - StampedeCon AI Summit 2017
This technical session provides a hands-on introduction to TensorFlow using Keras in the Python programming language. TensorFlow is Google's scalable, distributed, GPU-powered compute graph engine that machine learning practitioners use for deep learning. Keras provides a Python-based API that makes it easy to create well-known types of neural networks in TensorFlow. Deep learning is a group of exciting new technologies for neural networks. Through a combination of advanced training techniques and neural network architectural components, it is now possible to train neural networks of much greater complexity. Deep learning allows a model to learn hierarchies of information in a way that is similar to the function of the human brain.
Don't Start from Scratch: Transfer Learning for Novel Computer Vision Problem...
In this session, we’ll discuss approaches for applying convolutional neural networks to novel computer vision problems, even without having millions of images of your own. Pretrained models and generic image data sets from Google, Kaggle, universities, and other places can be leveraged and adapted to solve industry and business specific problems. We’ll discuss the approaches of transfer learning and fine tuning to help anyone get started on using deep learning to get cutting edge results on their computer vision problems.
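The frozen-base idea above can be sketched in miniature: below, a fixed random projection stands in for a pretrained convolutional base, and only a new linear "head" is fit to the task data (all sizes, names, and the synthetic data are illustrative, not a real pretrained model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "pretrained" feature extractor: a fixed random projection
# plus ReLU, standing in for a frozen convolutional base.
W_frozen = rng.normal(size=(64, 16))

def extract_features(x):
    # Frozen layer: never updated during fine-tuning
    return np.maximum(x @ W_frozen, 0)

# Small task-specific dataset: 100 "images" flattened to 64 dims, 2 classes
X = rng.normal(size=(100, 64))
y = (X[:, 0] > 0).astype(float)

# Fine-tune only a new linear "head" on top of the frozen features
F = extract_features(X)
head, *_ = np.linalg.lstsq(F, y, rcond=None)

preds = (F @ head > 0.5).astype(float)
accuracy = (preds == y).mean()
```

With real pretrained models the principle is the same: keep the base weights fixed (or nearly fixed) and train only the small task-specific layer on your limited data.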
Bringing the Whole Elephant Into View Can Cognitive Systems Bring Real Soluti...
Like the story of the six blind men trying to explain the nature of an elephant, current research in cognitive computational systems attempts to identify the nature of an illness, human behavior, or socio-economic phenomenon, each from its own perspective.
At present, there is no agreed upon definition for cognitive systems. One large communication corporation defines cognitive systems as a category of technology that uses artificial intelligence, machine learning and reasoning, to enable people and machines to interact more naturally. It also extends and magnifies human expertise and cognition to enable accurate decisions on time. Two of the most famous risk and financial advisory firms agree with that interpretation. A different large corporation, however, considers “cognitive systems” as merely marketing jargon.
If cognitive systems are going to help us solve challenging problems in medicine, economics, or other fields, three aspects must be considered in order to reveal the “true nature of the elephant”.
• All facets of the problem must be addressed, like the main parts of the elephant had to be touched by the men.
• These facets must be properly assembled, like the men needed to join hands around the elephant in order to understand what it was.
• This assembly must be completed within sufficient time to anticipate future decisions, just like the men needed to know what an elephant is before the next one charges them.
This talk will explain how agnostic (unsupervised, blinded) machine learning findings can be assembled using multiobjective and multimodal optimization techniques to uncover a multifaceted view of the "elephant", in this case the human being (e.g., genomic variants, personality traits, brain images). It will also give real-world examples of how this knowledge can "extend the human capabilities" by achieving an integrative assessment of the whole person in relation to their risk, allowing professionals to generate accurate person-centered policies: from personalized diagnoses and business opportunities to the prevention of outbreaks.
A Different Data Science Approach - StampedeCon AI Summit 2017
This session will focus on how to execute Data Science caliber efforts by creating teams with the attributes of Data Science to deliver meaningful results. As Data Scientists are harder to find and keep, this session should appeal to anyone who is either seeking an alternative approach to executing Data Science delivery or augmenting their current Data Science model with additional options.
Graph in Customer 360 - StampedeCon Big Data Conference 2017
Enterprises typically have many data silos of partial customer data and a common theme in big data projects to use big data tools and pipelines to unify all siloed customer data into a single, queryable, platform for improving all future customer interactions. This data often comes from billing, website traffic, logistics, and marketing; all in different formats with different properties. Graph provides a way to unify all of the data into a single place for use in tracking the flow of a user through the various silos. Graph can also be used for visualizations and analytics that are difficult in other systems.
In this talk we will explore the ways in which Graph can be leveraged in a customer 360 use case. What it can add to a more conventional system and what the approach to developing a graph based Customer 360 system should be.
End-to-end Big Data Projects with Python - StampedeCon Big Data Conference 2017
This talk will go over how to build an end-to-end data processing system in Python, from data ingest, to data analytics, to machine learning, to user presentation. Developments in old and new tools have made this particularly possible today. The talk in particular will talk about Airflow for process workflows, PySpark for data processing, Python data science libraries for machine learning and advanced analytics, and building agile microservices in Python.
System architects, software engineers, data scientists, and business leaders can all benefit from attending the talk. They should learn how to build more agile data processing systems and take away some ideas on how their data systems could be simpler and more powerful.
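As a toy stand-in for the Airflow/PySpark stack described above, the ingest → process → analyze flow can be sketched with plain Python generators (the stage names and records are invented for illustration):

```python
# Each stage is a generator, so records stream through the pipeline
# lazily, mirroring how a task in a workflow engine hands data onward.
def ingest(lines):
    for line in lines:                 # e.g., read from a file or queue
        yield line.strip()

def process(records):
    for rec in records:
        user, value = rec.split(",")   # parse CSV-ish records
        yield user, float(value)

def analyze(pairs):
    totals = {}
    for user, value in pairs:          # aggregate per user
        totals[user] = totals.get(user, 0.0) + value
    return totals

raw = ["alice,3.0", "bob,1.5", "alice,2.0"]
totals = analyze(process(ingest(raw)))
print(totals)  # {'alice': 5.0, 'bob': 1.5}
```

In a real system each stage would be an Airflow task or a PySpark transformation, but the composition pattern is the same.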
Doing Big Data Using Amazon's Analogs - StampedeCon Big Data Conference 2017
Big Data doesn’t have to just mean Hadoop any more. Big Data can be done in the cloud, using tools developed by the Cloud providers. This session will cover using Amazon AWS services to implement a Big Data application. We will compare and contrast different services from Amazon with the Hadoop equivalents.
Enabling New Business Capabilities with Cloud-based Streaming Data Architectu...
Using big data isn’t about doing the same things we’ve always done just with different technologies. The technology advances that we’ve chosen to label as big data create the opportunity for wholly new kinds of solutions. Two of the key advances that are enabling new business capabilities are cloud-based data management platforms and streaming data processing and analytics.
In this session, Paul Boal will drill into the cloud-based streaming data architecture that has made possible EVŌ, a new breakthrough health and wellness platform. EVŌ uses a game-changing approach that leverages over 60 billion data points and a predictive analytics engine to intervene BEFORE someone becomes critically ill. All of this is possible by leveraging data from smartphones and wearable fitness devices along with advanced analytics which then help users develop and sustain positive behaviors. Attendees will learn how to create a cloud- based architecture that can receive data, apply multiple layers of dynamic business rules, and drive alerts and decisions through real-time stream processing using technologies including web services, Amazon DynamoDB and Kinesis, Drools, and Apache Spark.
Big Data Meets IoT: Lessons From the Cloud on Polling, Collecting, and Analyz...
The collection and use of Big Data has become an important part of modern business practice. The Internet of Things (IoT) movement promises to provide new opportunities for businesses interested in the intersection of people and technology. It is also wrought with pitfalls for practitioners and researchers who struggle to make sense of an increasing cacophony of signals. How should they poll and collect data from millions of signals in a way that is manageable, scalable, and statistically valid? How should they analyze and predict using these data? This presentation will discuss these challenges with applied examples from monitoring and managing one of the world’s largest computers.
Innovation in the Data Warehouse - StampedeCon 2016
Enterprise Holdings first started with Hadoop as a POC in 2013. Today, we have clusters on premises and in the cloud. This talk will explore our experience with Big Data and outline three common big data architectures (batch, lambda, and kappa). Then, we'll dive into the decision points necessary for your own cluster, for example: cloud vs on premises, physical vs virtual, workload, and security. These decisions will help you understand what direction to take. Finally, we'll share some lessons learned about which pieces of our architecture worked well and rant about those which didn't. No deep Hadoop knowledge is necessary; the talk is aimed at the architect or executive level.
Creating a Data Driven Organization - StampedeCon 2016
Companies today are all focused on finding new consumption models to better utilize the data they produce. This presentation will provide insights and best practices for creating the organization and sponsorship necessary to set the foundation for success.
For this session, Dan will provide an overview of the process and methodologies he employs to establish and sustain a Data Driven Culture. Key topics will include:
Data Driven Culture
Executive Sponsorship
Organizational Structure – Collaboration Hubs and Bi-Modal Analytics
Role of Hadoop and Big Data as Part of Data Driven Culture
Using The Internet of Things for Population Health Management - StampedeCon 2016
The Internet of (Human) Things is just beginning to take shape. The human body is an inexhaustible source of data about personal health, and the healthcare industry is just beginning to scratch the surface of the potential insights and value that will come from that data. While much of healthcare traditionally focuses on the episodic delivery of services, the Affordable Care Act is pushing healthcare providers, payers, and self-funded employer groups to look at ways to proactively encourage healthy behaviors. Providing personal health devices as a way to promote individual health is one way that healthcare is beginning to take advantage of IoT technologies. This session provides insight into how IoT is being leveraged in population health management through a solution jointly delivered by Amitech Solutions and Big Cloud Analytics. Attendees will learn how Hadoop is being used to gather personal device data from various vendors, integrate and analyze that information, differentiate trends across regional and cultural diversity, and provide personal recommendations and insights into health risks. This session presents one important way the healthcare industry is leveraging IoT.
Turn Data Into Actionable Insights - StampedeCon 2016
At Monsanto, emerging technologies such as IoT, advanced imaging and geo-spatial platforms; molecular breeding, ancestry and genomics data sets have made us rethink how we approach developing, deploying, scaling and distributing our software to accelerate predictive and prescriptive decisions. We created a Cloud based Data Science platform for the enterprise to address this need. Our primary goals were to perform analytics@scale and integrate analytics with our core product platforms.
As part of this talk, we will be sharing our journey of transformation showing how we enabled: a collaborative discovery analytics environment for data science teams to perform model development, provisioning data through APIs, streams and deploying models to production through our auto-scaling big-data compute in the cloud to perform streaming, cognitive, predictive, prescriptive, historical and batch analytics@scale, integrating analytics with our core product platforms to turn data into actionable insights.
The Big Data Journey – How Companies Adopt Hadoop - StampedeCon 2016
Hadoop adoption is a journey. Depending on the business the process can take weeks, months, or even years. Hadoop is a transformative technology so the challenges have less to do with the technology and more to do with how a company adapts itself to a new way of thinking about data. There are challenges for companies who have lived with an application driven business for the last two decades to suddenly become data driven. Companies need to begin thinking less in terms of single, silo’d servers and more about “the cluster”.
The concept of the cluster becomes the center of data gravity drawing all the applications to it. Companies, especially the IT organizations, embark on a process of understanding how to maintain and operationalize this environment and provide the data lake as a service to the businesses. They must empower the business by providing the resources for the use cases which drive both renovation and innovation. IT needs to adopt new technologies and new methodologies which enable the solutions. This is not technology for technology sake. Hadoop is a data platform servicing and enabling all facets of an organization. Building out and expanding this platform is the ongoing journey as word gets out to businesses that they can have any data they want and any time. Success is what drives the journey.
The length of the journey varies from company to company. Sometimes the challenges are based on the size of the company but many times the challenges are based on the difficulty of unseating established IT processes companies have adopted without forethought for the past two decades. Companies must navigate through the noise. Sifting through the noise to find those solutions which bring real value takes time. As the platform matures and becomes mainstream, more and more companies are finding it easier to adopt Hadoop. Hundreds of companies have already taken many steps; hundreds more have already taken the first step. As the wave of successful Hadoop adoption continues, more and more companies will see the value in starting the journey and paving the way for others.
Visualizing Big Data – The Fundamentals
This session will touch upon two visual languages, one to describe the context around what is being asked from the data, and the other, to describe what is quantifiable. From these two visual constructs we will go specifically into the following topics: Grids, Balance, Proximity, Contextual Kernels and Hierarchy.
As Europe's leading economic powerhouse and the fourth-largest economy globally, Germany stands at the forefront of innovation and industrial might. Renowned for its precision engineering and high-tech sectors, Germany's economic structure is heavily supported by a robust service industry, accounting for approximately 68% of its GDP. This economic clout and strategic geopolitical stance position Germany as a focal point in the global cyber threat landscape.
In the face of escalating global tensions, particularly those emanating from geopolitical disputes with nations like Russia and China, Germany has witnessed a significant uptick in targeted cyber operations. Our analysis indicates a marked increase in cyberattack sophistication aimed at critical infrastructure and key industrial sectors. These attacks range from ransomware campaigns to Advanced Persistent Threats (APTs), threatening national security and business integrity.
🔑 Key findings include:
🔍 Increased frequency and complexity of cyber threats.
🔍 Escalation of state-sponsored and criminally motivated cyber operations.
🔍 Active dark web exchanges of malicious tools and tactics.
Our comprehensive report delves into these challenges, using a blend of open-source and proprietary data collection techniques. By monitoring activity on critical networks and analyzing attack patterns, our team provides a detailed overview of the threats facing German entities.
This report aims to equip stakeholders across public and private sectors with the knowledge to enhance their defensive strategies, reduce exposure to cyber risks, and reinforce Germany's resilience against cyber threats.
Adjusting primitives for graph: SHORT REPORT / NOTES – Subhajit Sahu
Notes on adjusting primitives for graph algorithms such as PageRank. Compressed Sparse Row (CSR) is an adjacency-list based graph representation.
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Granular Access Controls: Provide users with self-service tools tailored to their specific needs, preventing data overload and misuse.
Data Catalogs: Implement robust data catalogs for easy discovery and understanding of available data sources.
2. Foster Collaboration with Clear Roles:
Data Mesh Architecture: Break down data silos by creating a distributed data ownership model with clearly assigned owners and responsibilities.
Collaborative Workspaces: Utilize interactive platforms where data scientists, analysts, and domain experts can work seamlessly together.
3. Leverage Advanced Analytics Strategically:
AI-powered Automation: Automate repetitive tasks like data cleaning and feature engineering, freeing up data talent for higher-level analysis.
Right-Tool Selection: Strategically choose the most effective advanced analytics techniques (e.g., AI, ML) based on specific business problems.
4. Prioritize Data Quality with Automation:
Automated Data Validation: Implement automated data quality checks to identify and rectify errors at the source, minimizing downstream issues.
Data Lineage Tracking: Track the flow of data throughout the ecosystem, ensuring transparency and facilitating root cause analysis for errors.
5. Cultivate a Data-Driven Mindset:
Metrics-Driven Performance Management: Align KPIs and performance metrics with data-driven insights to ensure actionable decision making.
Data Storytelling Workshops: Equip stakeholders with the skills to translate complex data findings into compelling narratives that drive action.
Benefits of a Precise Ecosystem:
Sharpened Focus: Precise access and clear roles ensure everyone works with the most relevant data, maximizing efficiency.
Actionable Insights: Strategic analytics and automated quality checks lead to more reliable and actionable data insights.
Continuous Improvement: Data-driven performance management fosters a culture of learning and continuous improvement.
Sustainable Growth: Empowered by data, organizations can make informed decisions to drive sustainable growth and innovation.
By focusing on these precise actions, organizations can create an empowered data analytics ecosystem that delivers real value by driving data-driven decisions and maximizing the return on their data investment.
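The "automated data validation" point above can be made concrete with a minimal rule-driven check that collects violations at the source (the field names, rules, and thresholds here are hypothetical):

```python
# Minimal rule-driven data quality check: each rule maps a field to a
# predicate; violations are collected instead of silently propagating.
def validate(records, rules):
    errors = []
    for i, rec in enumerate(records):
        for field, (predicate, message) in rules.items():
            if field not in rec or not predicate(rec[field]):
                errors.append((i, field, message))
    return errors

# Hypothetical order records and rules
rules = {
    "order_id": (lambda v: isinstance(v, int) and v > 0, "must be a positive int"),
    "amount":   (lambda v: isinstance(v, (int, float)) and 0 <= v < 1e6, "out of range"),
    "country":  (lambda v: v in {"DE", "US", "GB"}, "unknown country code"),
}

records = [
    {"order_id": 1, "amount": 99.5, "country": "DE"},
    {"order_id": -4, "amount": 2.0, "country": "FR"},   # two violations
    {"order_id": 3, "country": "US"},                   # missing amount
]

issues = validate(records, rules)
```

Run at ingestion time, a check like this feeds the lineage tracking described above: every downstream consumer can see which records failed which rule and why.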
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23... – John Andrews
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. for more details visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
AI in the Enterprise: Past, Present & Future - StampedeCon AI Summit 2017
1. AI in the Enterprise: Past, Present & Future
Paul Huibers, Think Big Analytics
2. 2
“By 2020 AI will be a top five investment priority for more than 30% of CIOs.” — Gartner BI Summit, February 2017
“The Resurgence of AI: By 2019, deep learning will provide best-in-class performance for demand, fraud, and failure prediction.” — Gartner
4. 4
Agenda
• Introduction to AI/DL
• AI in Industry
• Case Study: Financial Fraud
• Pilot to Production
5. 5
What is AI?
Artificial Intelligence is usually defined as the science of making computers do things that require intelligence when done by humans.
6. 6
AI: A Brief History… …and now, Deep Learning!
• 1940s – early concepts developed
• 1980s – more concepts
– copy the brain, neurons, perceptron
– backpropagation for training
• 1990s – LeCun handwriting reader
• AI winter
• 2009 – Netflix Prize ($1M)
• 2010 – first ImageNet competition
• 2012 – AI/deep learning comes of age
• ImageNet classification error (1000 categories of images):
– 2011: 25% using traditional methods
– 2012: 16% achieved by a ConvNet
– 2013: 11%
– 2014: 6.7%
– 2015: 3.6%
– 2016: < 3%
• ~3% is the human error rate (expert group); ~0.3% of images are mislabeled
• What changed since the 1990s?
– 10,000X computing power, GPUs
– massive labeled datasets
8. 8
ImageNet
1.2 million images, 1000 categories, lots of animals…
“…a jungle of viewpoints, lighting conditions, and variations of all imaginable types.” – Karpathy
9. 9
What is Deep Learning?
• A machine learning method that involves learning data representations rather than task-specific algorithms
• Deep Neural Networks – an artificial neural network with multiple hidden layers of “neurons” between the input and the output
• Artificial Neural Networks – computing systems inspired by biological neural networks, involving a collection of connected units, with learned weights and activation functions between the units
How is it achieved?
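One way to make "how is it achieved" concrete: a tiny network with one hidden layer of "neurons", learned weights, and non-linear activations, trained by backpropagation on XOR using only NumPy (the layer size, learning rate, and iteration count are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(42)

# XOR: the classic task a single-layer network cannot represent
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of "neurons" with learned weights
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    # forward pass through non-linear activations
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: cross-entropy + sigmoid gives the simple error term
    d_out = out - y
    d_h = (d_out @ W2.T) * (1 - h ** 2)   # derivative of tanh
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

preds = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(float)
accuracy = (preds == y).mean()
```

The hidden layer is what makes this "deep-ish": it learns an intermediate representation in which XOR becomes linearly separable, the same principle deep networks apply layer after layer.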
10. 10
Deep Neural Networks
How are they different?
• Multiple hidden layers in the neural network, with intermediate data representations to facilitate dimensional reduction
• Interpret non-linear relationships in the data through activation functions
• Derive patterns from data with very high dimensionality
Why do we care?
• Ability to create value with little or no domain knowledge required
• Ability to incorporate data from across multiple, seemingly unrelated sources
• Ability to tolerate very noisy data
11. 11
Data Quantity Drives Deep Learning Performance
(Chart, after Andrew Ng: model performance vs. amount of labeled data. In the small-training-set regime of the 1990s all approaches perform similarly; as labeled data grows, traditional ML plateaus while small, medium, and large neural networks continue to improve, with larger networks benefiting most.)
12. 12
Deep Learning Architectures
Convolutional Neural Network (ConvNet or CNN)
• CNN = Convolution + Pooling + ReLU + Fully Connected
• Convolution layers are composable, so they can be chained
• Primary use: any problem that has a high-dimensional input (e.g., image labeling)
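A minimal NumPy sketch of the Convolution + Pooling + ReLU + Fully Connected recipe (forward pass only; the random image, kernel, and layer sizes are illustrative):

```python
import numpy as np

def conv2d(x, k):
    # "valid" convolution (cross-correlation, as in most DL frameworks)
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x):
    return np.maximum(x, 0)

def max_pool(x, s=2):
    # non-overlapping s x s max pooling
    H, W = x.shape
    return x[:H - H % s, :W - W % s].reshape(H // s, s, W // s, s).max(axis=(1, 3))

rng = np.random.default_rng(0)
img = rng.random((28, 28))        # toy grayscale "image"
kernel = rng.normal(size=(3, 3))  # one learned filter, here random

feat = max_pool(relu(conv2d(img, kernel)))   # Convolution + ReLU + Pooling
w = rng.normal(size=(feat.size, 10))         # fully connected layer (10 classes)
logits = feat.ravel() @ w
print(feat.shape, logits.shape)              # (13, 13) (10,)
```

Real CNNs stack many such filters and layers and learn the kernel weights by backpropagation, but the four building blocks compose exactly as shown.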
13. 13
AI Framework Landscape
Specialized APIs (Vision, Language, Speech):
• Pretrained (fast)
• Public
• Google/Microsoft/Amazon
General Purpose Frameworks (e.g., Keras):
• Need to be trained (expensive)
• Private
• Fully customizable
14. 14
Touched by AI…
• Cognitive successes
• Siri, Alexa, OK Google!
– Understanding words
– Understanding context
– Language translation
• Face detection in images
• Recommender systems
• How about some practical examples from industry?
15. 15
Agenda
• Introduction to AI/DL
• AI in Industry
• Case Study: Financial Fraud
• Pilot to Production
16. 16
Proven applications of Deep Learning
• Anomaly Detection – enables real-time detection of abnormal patterns of data, usually time-series events
• Predictive Maintenance – improves preventative measures & performance with greater accuracy at the asset & component level
• Recommender Systems – enable more effective search rankings based on context, in accordance with a particular objective such as purchase or click-through
• Speech Recognition – enables capture of voice to text with higher fidelity of speech transcription and improved precision of speaker identification
• Computer Vision – enables dramatically more accurate visual recognition tasks that include image classification, detection and localization
• Document Automation – enables automation of manual, paper-based processes that are human-intensive, with higher speed, accuracy and fidelity
17. 17
Industry Specific Use Cases
High-Dimensional Data
Image
Video
Audio
Time Series
Text
• Many already have working solutions using non-DL Machine Learning Techniques
• Deep Learning is delivering improvement in performance on complex problems
Automotive
• Navigation, Guidance, Assistance
• Predictive Maintenance
Retail
• Visual Search
• Recommendation
• Text Analytics
• Assistants
• Brand Analytics
Manufacturing & High-Tech
• Image/Audio/Video
• Reinforcement Learning – Systems Optimization
• Plant Operations Optimization
Health Care
• Image-based Analysis
• Drug Discovery
Financial Services & Insurance
• Anti-Fraud
• Portfolio Optimization
• Damage Assessment
Cross-Industry
• Cyber Security
• Call Center Audio
18. 18
Large European Logistics Provider
• Increasing use of plastic bags in shipping
• Challenges with existing package sorting and identification system
• Use Deep Learning image analytics to improve identification and sorting
• Tools: TensorFlow, Hadoop
• Techniques:
– Deep Learning: Convolutional Neural Network
19. 19
• Road objects, traffic and accident
events are manually reported or not
at all
• Automated object detection and
scene labeling system from car
camera feed to improve navigation
and traffic
• Tools: Darknet, Caffe, TensorFlow
• Techniques:
– Object Detection: Single Shot MultiBox
Detector (SSD), You Only Look Once
(YOLO)
– Scene Labeling: Convolutional Neural
Network
Large Auto Parts Manufacturer Use Case
Pipeline: real-time streaming from car cameras → Darknet/Darkflow (object detection) and TensorFlow (scene labeling) → streaming results feed a traffic data service and navigation updates. Training runs on cloud GPUs; inference runs on cloud GPUs behind TF Serving, which receives the model updates.
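The TF Serving hop in the pipeline above is an HTTP call; a sketch of building the request for its REST predict endpoint (the host, port, model name "scene_labeler", and frame values are all hypothetical; TF Serving's REST API expects a JSON body with an "instances" list):

```python
import json

SERVER = "http://localhost:8501"   # TF Serving's default REST port
MODEL = "scene_labeler"            # hypothetical model name
url = f"{SERVER}/v1/models/{MODEL}:predict"

frame = [[0.1, 0.5], [0.3, 0.9]]   # stand-in for a preprocessed camera frame
payload = json.dumps({"instances": [frame]})

# An HTTP client would POST `payload` to `url`; the response JSON carries
# a "predictions" list with one entry per instance.
print(url)
```

Keeping inference behind a serving endpoint like this is what allows model updates to be pushed without touching the streaming clients.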
20. 20
• Handwritten check volume is decreasing; however, processing checks carries many fixed costs
• Handwriting recognition reduces manual processing and fraud examination, resulting in cost savings
• Tools: Spark, Hadoop, TensorFlow
• Techniques:
– Convolutional Neural Network
– Image Processing
Large US Multinational Bank
Pipeline: check images land in Hadoop → ImageMagick processing → handwriting recognition → fraud detection
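The image-processing step ahead of handwriting recognition typically normalizes and binarizes the scan; a minimal sketch of binarization in pure Python (the pixel grid and threshold are made up; the bank's actual ImageMagick pipeline is not described in the source):

```python
def binarize(gray_image, threshold=128):
    """Map grayscale pixels (0-255) to binary ink(1)/background(0) before OCR."""
    return [[1 if px < threshold else 0 for px in row] for row in gray_image]

# 0 = black ink, 255 = white paper (values made up)
check_patch = [[250, 30, 245],
               [40, 25, 240],
               [248, 35, 250]]
print(binarize(check_patch))  # → [[0, 1, 0], [1, 1, 0], [0, 1, 0]]
```

The CNN then consumes these cleaned binary patches, which is far easier than learning to ignore paper texture and scanner noise.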
21. 21
• Predict failure of pistons on large container ships to reduce unplanned and costly maintenance
• Utilized sensor data to predict piston wear with 70–80% accuracy
• Failures are extremely infrequent, so there is a risk of overfitting
• Tools: R, Hadoop, Spark, AWS
• Techniques:
– ROC curve
– Internet of Things data
– Methods to prevent overfitting
Large Container Shipping Company
Pipeline: container ship sensors → predict failures
[Chart: probability of abnormal cylinder behaviour (0.0–0.8) over one month (December 2015), with lead time and port stays marked. Each point combines selected sensor data for a specific cylinder; everything above the threshold triggers an alarm. The cylinder whose worn piston ring was changed rises above the threshold; the other cylinders stay below it.]
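The alarm threshold in the chart is exactly what an ROC analysis tunes; a sketch of computing one ROC operating point in pure Python (the per-cylinder scores, labels, and threshold are hypothetical; sweeping the threshold traces the full ROC curve):

```python
def roc_point(scores, labels, threshold):
    """True/false positive rates for an alarm fired when score > threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s > threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s > threshold and y == 0)
    pos = sum(labels)
    neg = len(labels) - pos
    return tp / pos, fp / neg

# Hypothetical abnormality scores; label 1 = cylinder with a worn piston ring
scores = [0.1, 0.2, 0.75, 0.3, 0.9, 0.15]
labels = [0,   0,   1,    0,   1,   0]
tpr, fpr = roc_point(scores, labels, threshold=0.6)
print(tpr, fpr)  # → 1.0 0.0
```

With failures this infrequent, the curve should be estimated on held-out voyages, since a threshold tuned on the training data is one of the overfitting risks the slide mentions.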
22. 22
Large European Railway
• Detecting rail switch failures
• Allows switches to be fixed ahead of time, avoiding train delays
• Tools: R (Shiny and RStudio), Hadoop, Spark
• Techniques:
– Survival Analysis
– Machine Learning
– Internet of Things data
Pipeline: railway switch sensor data → visualize failures and act
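The survival-analysis technique listed above is commonly started with a Kaplan-Meier estimate of how long a component survives; a minimal pure-Python sketch (the switch lifetimes and censoring flags are made up; the railway's actual model is not detailed in the source):

```python
def kaplan_meier(durations, failed):
    """Kaplan-Meier survival curve; failed[i]=False means censored
    (the switch was still working when observation ended)."""
    event_times = sorted(set(d for d, f in zip(durations, failed) if f))
    surv, curve = 1.0, []
    for t in event_times:
        at_risk = sum(1 for d in durations if d >= t)
        deaths = sum(1 for d, f in zip(durations, failed) if d == t and f)
        surv *= 1 - deaths / at_risk
        curve.append((t, surv))
    return curve

# Hypothetical switch lifetimes in months
durations = [5, 8, 8, 12, 16]
failed = [True, True, False, True, False]
for t, s in kaplan_meier(durations, failed):
    print(f"P(survive past {t} months) = {s:.2f}")
```

Reading the curve tells maintenance planners when survival probability drops below an acceptable level, which is when a switch gets fixed "ahead of time".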
23. 23
• Fraud detection across products
• Trends:
– Mobile payments exploding
– Fraud evolving rapidly, with increasing sophistication
• Significant improvements over traditional rules-based techniques
• Tools: Spark, Hadoop, TensorFlow
• Techniques:
– Boosted Decision Trees
– Convolutional Neural Network
Large European Bank
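The core idea behind boosted decision trees is reweighting the examples each weak learner gets wrong; a self-contained AdaBoost sketch with one-feature decision stumps (the transaction amounts, labels, and round count are toy values, not the bank's model):

```python
import math

def stump_predict(x, threshold, polarity):
    """Weak learner: ±1 depending on which side of the threshold x falls."""
    return polarity if x > threshold else -polarity

def train_adaboost(xs, ys, rounds=3):
    """AdaBoost: each round fits the best stump on reweighted examples."""
    n = len(xs)
    weights = [1 / n] * n
    ensemble = []
    for _ in range(rounds):
        best = None
        for threshold in xs:
            for polarity in (1, -1):
                err = sum(w for x, y, w in zip(xs, ys, weights)
                          if stump_predict(x, threshold, polarity) != y)
                if best is None or err < best[0]:
                    best = (err, threshold, polarity)
        err, threshold, polarity = best
        err = max(err, 1e-10)                       # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)     # stump's vote weight
        ensemble.append((alpha, threshold, polarity))
        # Upweight misclassified points so the next stump focuses on them
        weights = [w * math.exp(-alpha * y * stump_predict(x, threshold, polarity))
                   for x, y, w in zip(xs, ys, weights)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of all stumps."""
    score = sum(a * stump_predict(x, t, p) for a, t, p in ensemble)
    return 1 if score > 0 else -1

# Toy data: feature = transaction amount, +1 = fraud (made up)
xs = [10, 20, 30, 80, 90, 100]
ys = [-1, -1, -1, 1, 1, 1]
model = train_adaboost(xs, ys)
print([predict(model, x) for x in [15, 95]])  # → [-1, 1]
```

Production systems use gradient-boosted trees over hundreds of features, but the reweighting loop above is the mechanism that lets boosting beat static rules as fraud patterns shift.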
24. 24
State of AI in Industry
Successes
• Computer vision (e.g., ImageNet)
• Speech & NLP
• Simplification of general-purpose ML (e.g., recommendation)
• Rapid advance of the state of the art; growth of expertise and applications
• Major investment programs in industry
Challenges
• Research-driven; fundamentals still changing
• Mostly empirical, little theory
• Complexity in solution design
• Limited access to talent
• AI/DL still requires governed data and Analytics Ops integration
• Gaps in enterprise deployment beyond lock-in clouds
25. 25
• Introduction to AI/DL
• AI in Industry
• Case Study: Financial Fraud
• Pilot to Production
Agenda
38. 38
• Introduction to AI/DL
• AI in Industry
• Case Study: Financial Fraud
• Pilot to Production
Agenda
39. 39
Operationalization is Hard
“We evaluated some of the new methods offline but the additional accuracy gains that we measured did not seem to justify the engineering effort needed to bring them into a production environment.”
- Netflix, 2012
40. 40
Focus First on Pilot into Production
Sets up Phase Two: Scale COE, Standardize Capabilities
Phase flow (cross-functional teams throughout): Investigate → Test → Engineer → Simulate → Integration → Analyze Data → Go Live → Handover → Validate

Assessment
Activities: Define business opportunity, understand data available, test model approaches, potentially generate data
Outcome: Proposed solution approach

Discovery/Insights
Activities: Architecture selection, software engineering of model and simulation
Outcome: Predicted impact of model

Live Test
Activities: Integration into live business process (Champion/Challenger), analysis, iteration
Outcome: Benefit measurement, live learnings, improvement

Production
Activities: Go Live, Analytics Ops integration, hand over
Outcome: System scaled, application teams and ops trained and operating
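The Champion/Challenger step in Live Test needs a stable traffic split so each user's metrics stay attributable to one model; a minimal hash-based sketch (the function name, share value, and user IDs are illustrative):

```python
import hashlib

def assign_variant(user_id, challenger_share=0.1):
    """Deterministically route a user to the champion or challenger model."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF   # stable value in [0, 1]
    return "challenger" if bucket < challenger_share else "champion"

# The same user always gets the same model, so benefit measurement is clean
print(assign_variant("customer-42") == assign_variant("customer-42"))  # → True
```

Hashing the ID (rather than random assignment per request) is the design choice that makes the later benefit measurement valid: a user never sees a mix of both models' decisions.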
41. 41
For more information, please contact:
Paul.Huibers@ThinkBigAnalytics.com
603-395-6567
Thank You StampedeCon!
stampedecon.com/ai-summit-2017-st-louis/
42. 42
The Future… and more…
• Architectural innovations: RNN, LSTM, GAN, and more
• Better training through new optimization methods, new activation functions, and more
• Transfer learning, pre-training, and more
• Theory catching up with practice (Tishby's information bottleneck: relevant information)
• A 10,000x speed improvement would make many things possible (Moore's Law)
• Unsupervised learning
• Learning with few samples
• AGI: artificial general intelligence
• Singularity