It presents:
1. Introduction to Artificial Intelligence
2. History and Evolution
3. Speech Synthesis
4. Robots and Image Processing
5. Sensor Fusion
6. Innovation in Artificial Intelligence
7. Conclusion
Artificial Intelligence (AI) is one of the hottest topics in the tech and startup world at the moment. The field of AI and its associated technologies present a range of opportunities – as well as challenges – for corporates. Learn more about what Artificial Intelligence means for your organization.
Mr. Koushal Kumar holds an M.Tech degree in Computer Science and Engineering from Lovely Professional University, Jalandhar, India. He obtained his B.Sc. and M.Sc. in Computer Science from D.A.V College, Amritsar, Punjab. His research interests lie in Artificial Neural Networks, Soft Computing, Computer Networks, Grid Computing, and Database Management Systems.
Past, Present and Future of AI: a Fascinating Journey - Ramon Lopez de Mantaras - PAPIs.io
Possibly the most important lesson we have learned after 60 years of AI research is that what seemed very difficult to achieve, such as accurate medical diagnosis or playing chess at the level of a Grand Master, turned out to be relatively easy, whereas what seemed easy, such as visual object recognition or deep language understanding, turned out to be extremely difficult. In my talk I will try to explain the reasons for this apparent contradiction by briefly reviewing the past and present of AI and projecting it into the near future.
Ramon Lopez de Mantaras is Research Professor of the Spanish National Research Council (CSIC) and Director of the Artificial Intelligence Research Institute of the CSIC. Technical Engineer EE (Electrical Engineering) from the Technical Engineering School of Mondragón (Spain) in 1973. Master of Science in Automatic Control from the University of Toulouse III (France) in 1974. Ph.D. in Physics from the University of Toulouse III (France) in 1977, with a thesis in Robotics (done at LAAS, CNRS). Master of Science in Engineering (Computer Science) from the University of California at Berkeley (USA) in 1979. Ph.D. in Computer Science from the Technical University of Catalonia, Barcelona (Spain) in 1981.
The State of Artificial Intelligence in 2018: A Good Old Fashioned Report - Nathan Benaich
Artificial intelligence (AI) is a multidisciplinary field of science whose goal is to create intelligent machines.
We believe that AI will be a force multiplier on technological progress in our increasingly digital, data-driven world.
This is because everything around us today, ranging from culture to consumer products, is a product of intelligence.
In this report, we set out to capture a snapshot of the exponential progress in AI with a focus on developments in the past 12 months. Consider this report a compilation of the most interesting things we’ve seen that seeks to trigger informed conversation about the state of AI and its implications for the future.
We consider the following key dimensions in our report:
Research: Technology breakthroughs and their capabilities.
Talent: Supply, demand and concentration of talent working in the field.
Industry: Large platforms, financings and areas of application for AI-driven innovation today and tomorrow.
Politics: Public opinion of AI, economic implications and the emerging geopolitics of AI.
Collaboratively produced in East London, UK by:
- Nathan Benaich, Founder of Air Street Capital (www.airstreet.com) and RAAIS (www.raais.co).
- Ian Hogarth, Visiting Professor at UCL's IIPP (https://www.twitter.com/IIPP_UCL) and angel investor.
Artificial Intelligence Research Topics for PhD Manuscripts 2021 - PhD Assistance
Imagine a world where knowledge isn’t limited to humans: a world in which computers think and collaborate with humans to create a more exciting universe. Although this future is still a long way off, Artificial Intelligence has made significant progress in recent years. There is a great deal of research going on in almost every area of AI, such as quantum computing, healthcare, autonomous vehicles, the Internet of Things, and robotics. So much so that the number of research papers published annually on Artificial Intelligence has increased by 90% since 1996.
Ph.D. Assistance serves as an external mentor to brainstorm your idea and translate it into a research model. Hiring a mentor or tutor is common practice, so let your research committee know about it. We do not offer any writing services without the involvement of the researcher.
Learn More: https://bit.ly/2Sdlfn4
Contact Us:
Website: https://www.phdassistance.com/
UK No: +44-1143520021
India No: +91-4448137070
WhatsApp No: +91 91769 66446
Email: info@phdassistance.com
The Foundations of Artificial Intelligence, the History of Artificial Intelligence, and the State of the Art. Intelligent Agents: Introduction, How Agents Should Act, Structure of Intelligent Agents, Environments. Solving Problems by Searching: Problem-Solving Agents, Formulating Problems, Example Problems, Searching for Solutions, Search Strategies, Avoiding Repeated States, and Constraint Satisfaction Search. Informed Search Methods: Best-First Search, Heuristic Functions, Memory-Bounded Search, and Iterative Improvement Algorithms.
Applications of artificial intelligence techniques to combating cyber crimes - ijaia
With the advances in information technology (IT), criminals are using cyberspace to commit numerous cyber crimes. Cyber infrastructures are highly vulnerable to intrusions and other threats. Physical devices and human intervention are not sufficient for monitoring and protecting these infrastructures; hence, there is a need for more sophisticated cyber defense systems that are flexible, adaptable, and robust, and able to detect a wide variety of threats and make intelligent real-time decisions. Numerous bio-inspired computing methods of Artificial Intelligence have been playing an increasingly important role in cyber crime detection and prevention. The purpose of this study is to present the advances made so far in the field of applying AI techniques to combating cyber crimes, to demonstrate how these techniques can be an effective tool for the detection and prevention of cyber attacks, and to outline the scope for future work.
AI&BigData Lab. Artem Chernodub, "Image Recognition with the Lazy Deep Learning Method" - GeeksLab Odessa
23.05.15, Odessa, Impact Hub Odessa. AI&BigData Lab conference.
Artem Chernodub (Computer Vision Team, ZZ Wolf)
"Image Recognition with the Lazy Deep Learning Method in the ZZ Photo Organizer"
The talk addresses the problem of image recognition using computer vision methods. It gives a brief overview of the existing subtasks in this area (object detection, scene classification, associative search in image databases, face recognition, etc.) and of modern methods for solving them, with an emphasis on Deep Learning.
More details:
http://geekslab.co/
https://www.facebook.com/GeeksLab.co
https://www.youtube.com/user/GeeksLabVideo
The field of Artificial Intelligence (AI) has been revitalized in this decade, primarily due to the large-scale application of Deep Learning (DL) and other Machine Learning (ML) algorithms. This has been most evident in applications like computer vision, natural language processing, and game bots. However, extraordinary successes within a short period of time have also had the unintended consequence of causing a sharp difference of opinion in research and industrial communities regarding the capabilities and limitations of deep learning. A few questions you might have heard being asked (or asked yourself) include:
a. We don’t know how Deep Neural Networks make decisions, so can we trust them?
b. Can Deep Learning deal with highly non-linear continuous systems with millions of variables?
c. Can Deep Learning solve the Artificial General Intelligence problem?
The goal of this seminar is to provide a 1,000-foot view of Deep Learning and hopefully answer the questions above. The seminar will touch upon the evolution, current state of the art, and peculiarities of Deep Learning, and share thoughts on using Deep Learning as a tool for developing power system solutions.
Deep learning is a family of machine learning algorithms that use multiple layers to progressively extract higher-level features from raw data. In image processing, for example, lower layers may recognize edges, while higher layers may recognize concepts meaningful to humans, such as digits, letters, or faces. In this paper, we survey a number of other papers to understand how useful Deep Learning is and how other Artificial Intelligence tasks can be approached with Deep Learning. Anirban Chakraborty, "A Study of Deep Learning Applications", published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN 2456-6470, Volume 4, Issue 4, June 2020. URL: https://www.ijtsrd.com/papers/ijtsrd31629.pdf Paper URL: https://www.ijtsrd.com/computer-science/artificial-intelligence/31629/a-study-of-deep-learning-applications/anirban-chakraborty
Recurrent Neural Networks (RNNs) are the reference class of Deep Learning models for learning from sequential data. Despite their widespread success, a major downside of RNNs and their commonly derived 'gating' variants (LSTM, GRU) is the high cost of the training algorithms involved. In this context, an increasingly popular alternative is the Reservoir Computing (RC) approach, which limits the training algorithm to operating only on a restricted set of (output) parameters. RC is appealing for several reasons, including its amenability to implementation on low-power edge devices, enabling adaptation and personalization in IoT and cyber-physical systems applications.
This webinar will introduce Reservoir Computing from scratch, covering all the fundamental design topics as well as good practices. It is targeted at both researchers and practitioners interested in setting up fast-to-train Deep Learning models for sequential data.
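The RC idea described above can be made concrete with a minimal Echo State Network sketch in plain NumPy. This is not material from the webinar; the reservoir size, spectral radius, ridge penalty, and toy sine-prediction task are all illustrative assumptions. The point is that only the readout weights `W_out` are trained, while the input and recurrent weights stay fixed and random.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: fixed random weights; only the readout W_out is trained.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)  # fixed, untrained dynamics
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20, 500)
u = np.sin(t).reshape(-1, 1)
X = run_reservoir(u[:-1])
y = u[1:, 0]

# Training touches only the readout: ridge regression on reservoir states.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
print("train MSE:", np.mean((pred - y) ** 2))
```

Because the fit reduces to a single linear solve, training cost is tiny compared to backpropagation through time, which is exactly the appeal for edge devices mentioned above.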
Why do neurons have thousands of synapses? A model of sequence memory in the brain - Numenta
Presentation given by Yuwei Cui, Numenta Research Engineer at Beijing Normal University. December 2015.
Collaborators: Jeff Hawkins, Subutai Ahmad, Chetan Surpur
In this deck from the Perth HPC Conference, Rob Farber from TechEnablement presents: AI is Impacting HPC Everywhere.
"The convergence of AI and HPC has created a fertile venue that is ripe for imaginative researchers — versed in AI technology — to make a big impact in a variety of scientific fields. From new hardware to new computational approaches, the true impact of deep- and machine learning on HPC is, in a word, “everywhere”. Just as technology changes in the personal computer market brought about a revolution in the design and implementation of the systems and algorithms used in high performance computing (HPC), so are recent technology changes in machine learning bringing about an AI revolution in the HPC community. Expect new HPC analytic techniques including the use of GANs (Generative Adversarial Networks) in physics-based modeling and simulation, as well as reduced precision math libraries such as NLAFET and HiCMA to revolutionize many fields of research. Other benefits of the convergence of AI and HPC include the physical instantiation of data flow architectures in FPGAs and ASICs, plus the development of powerful data analytic services."
Learn more: http://www.techenablement.com/
and
http://hpcadvisorycouncil.com/events/2019/australia-conference/agenda.php
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
Tifinagh handwritten character recognition using optimized convolutional neural network - IJECEIAES
Tifinagh handwritten character recognition has been a challenging problem due to the similarity and variability of its alphabets. This paper proposes an optimized convolutional neural network (CNN) architecture for handwritten character recognition. The suggested CNN model is a multi-layer feedforward neural network that extracts features and properties directly from the input image data. It is based on the Keras open-source deep learning Python library. The novelty of the model is to optimize the optical character recognition (OCR) system in order to obtain the best performance in terms of accuracy and execution time. The new OCR system is tested on a customized dataset generated from the Amazigh handwritten character database. Experimental results show a good system accuracy (99.27%) with an optimal classification execution time compared to previous works.
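The paper's exact Keras architecture is not reproduced here, but the two core operations a CNN layer performs, convolution and pooling, can be sketched in plain NumPy. The 8x8 "glyph" and the vertical-edge kernel below are illustrative assumptions, not the Tifinagh data:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: keeps the strongest response per patch."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

# Toy "glyph": a vertical stroke, detected by a vertical-edge kernel.
img = np.zeros((8, 8))
img[:, 3] = 1.0
edge = np.array([[-1.0, 1.0], [-1.0, 1.0]])
fmap = np.maximum(conv2d(img, edge), 0.0)  # ReLU keeps positive responses
pooled = max_pool(fmap)
print(pooled.shape)  # (3, 3)
```

Stacking several such convolution + pooling stages, followed by dense layers, is what the feedforward CNN in the paper does; frameworks like Keras simply provide optimized versions of these operations.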
These are the tutorial materials presented at PyCon Korea 2019. This is the Part 1 deck, in which Professor Jaesik Choi explains what explainable AI is. Event details are available via the links below.
http://xai.unist.ac.kr/Tutorial/2018/
https://github.com/OpenXAIProject/PyConKorea2019-Tutorials
Part 1: https://www.slideshare.net/OpenXAI/2019-part-1
Part 2: https://www.slideshare.net/OpenXAI/2019-lrp-part-2
Part 3: https://www.slideshare.net/OpenXAI/2019-shap-part-3
Similar to Paper sharing_deep learning for smart manufacturing methods and applications:
Paper sharing_Patient health locus of control the design of information syste... - YOU SHENG CHEN
From European Journal of Information Systems
Authors: James Wallace, Matthew T. Mullarkey & Alan Hevner (2023)
Presenter: CHEN, YOU-SHENG (Shane), 2023/02/08
Paper sharing_An assisted approach to business process redesign - YOU SHENG CHEN
Tobias Fehrer, Dominik A. Fischer, Sander J.J. Leemans, Maximilian Röglinger, Moe T. Wynn. An assisted approach to business process redesign. Decision Support Systems, Volume 156, 2022, 113749, ISSN 0167-9236, https://doi.org/10.1016/j.dss.2022.113749
Opendatabay - Open Data Marketplace.pptx - Opendatabay
Opendatabay.com unlocks the power of data for everyone. Open Data Marketplace fosters a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
First ever open hub for data enthusiasts to collaborate and innovate. A platform to explore, share, and contribute to a vast collection of datasets. Through robust quality control and innovative technologies like blockchain verification, opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence. Leverage cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay's AI-driven features streamline the data workflow. Finding the data you need shouldn't be complex. Opendatabay simplifies the data acquisition process with an intuitive interface and robust search tools. Effortlessly explore, discover, and access the data you need, allowing you to focus on extracting valuable insights. Opendatabay breaks new ground with dedicated, AI-generated synthetic datasets.
Leverage these privacy-preserving datasets for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By leveraging a powerful combination of distributed ledger technology and rigorous third-party audits Opendatabay ensures the authenticity and reliability of every dataset. Security is at the core of Opendatabay. Marketplace implements stringent security measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
Levelwise PageRank with Loop-Based Dead End Handling Strategy: SHORT REPORT - Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components and processes them in topological order, one level at a time. This enables ranks to be calculated in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It does, however, come with a precondition: the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. The slowdown on the GPU is likely caused by a large submission of small workloads, and is expected to be a non-issue when the computation is performed on massive graphs.
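For reference, the baseline the report compares against, standard "Monolithic" PageRank by power iteration, can be sketched in a few lines of Python. The damping factor and the toy 4-vertex graph are illustrative choices; the dead-end handling shown is the common uniform-teleport convention, i.e. exactly the case Levelwise PageRank must preprocess away:

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10, max_iter=100):
    """Monolithic PageRank by power iteration.

    adj[u] lists the vertices u links to. Dead ends (no out-links)
    redistribute their rank mass uniformly across all vertices.
    """
    n = len(adj)
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        nxt = np.full(n, (1.0 - d) / n)   # teleportation term
        dead = 0.0
        for u, outs in enumerate(adj):
            if outs:
                for v in outs:
                    nxt[v] += d * r[u] / len(outs)
            else:
                dead += d * r[u]          # collect dead-end mass
        nxt += dead / n                   # spread it uniformly
        if np.abs(nxt - r).sum() < tol:   # L1 convergence check
            return nxt
        r = nxt
    return r

# Toy graph: 0 -> 1, 0 -> 3; 1 -> 2; 2 -> 0; vertex 3 is a dead end.
ranks = pagerank([[1, 3], [2], [0], []])
print(ranks, ranks.sum())
```

Every vertex is touched in every iteration here; the Levelwise variant instead processes one strongly connected component level at a time, which is what removes the per-iteration communication.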
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. for more details visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23... - John Andrews
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Paper sharing_deep learning for smart manufacturing methods and applications
1. Deep learning for smart manufacturing:
Methods and applications
From Journal of Manufacturing Systems
Jinjiang Wang, Yulin Ma, Laibin Zhang, Robert X. Gao, Dazhong Wu (2018)
Presenter: Chen You-Sheng
2021/08/06
4. Vocabularies 2/3 (slide 3/34)
P.   English                              Chinese
146  contour                              輪廓
146  spurious redundancy                  干擾冗餘
147  sensory data                         感知資料
147  aggregated                           聚合
147  prescriptive analytics               指示性分析
147  up to date                           最新的
147  building blocks                      基石
148  backpropagation                      反向傳播算法
149  contractive                          收縮的
149  principal component analysis (PCA)   主成分分析
149  stochastic gradient descent (SGD)    隨機梯度下降法
149  denoising                            去噪
149  Gaussian noise                       高斯噪聲
149  resistant                            抵抗
149  perturbation                         擾動
150  vanishing                            消失
150  forget gate                          遺忘閥
150  prognostics                          預測
151  arbitrary                            任意
152  thresholding                         門檻
5. Vocabularies 3/3
P.   English                              Chinese
152  assessment                           判定
152  fracture                             斷裂
152  incipient                            期初的
152  spectrum                             幅度
152  vibration                            震動
152  permutation                          排列
152  energy operator                      能量算子
152  planetary gearbox                    行星齒輪變速箱
152  corruption                           腐壞
152  wind turbine                         風力渦輪機
152  rolling bearing                      滾動軸承
152  propagation                          傳播
152  remaining useful life (RUL)          剩餘使用壽命
152  turbofan                             渦輪發動機
152  polishing                            拋光
152  semiconductor                        半導體
152  ceramic bearing                      陶瓷軸承
153  matter                               問題
153  curse of dimensionality              維度災難
153  fusion                               融合
6. Content
1. Introduction
2. Overview of data-driven intelligence
3. Deep learning for smart manufacturing
4. Applications to smart manufacturing
5. Discussions and outlook
7. Introduction
• Various countries have developed strategic roadmaps to transform manufacturing to take advantage of the emerging infrastructure:
  - Germany (2010): Industry 4.0
  - United States (2011): created a systematic framework
  - China (2015): Manufacturing 2025
• According to an SMLC survey, 82% of the companies using smart manufacturing technologies have experienced increased efficiency, and 45% have experienced increased customer satisfaction
8. Introduction
• The massive data in smart manufacturing imposes a variety of challenges
• Data-driven intelligence needs to extract more actionable and insightful information
• Deep learning moves towards highly nonlinear and complex feature abstraction
10. The evolution of data-driven artificial intelligence 1/8
Infancy period (1940s):
- MP model: McCulloch WS, Pitts WH. A logical calculus of the ideas immanent in nervous activity. Bull Math Biophys 1943;5(4):115–33.
- Hebb rule: Samuel AL. Some studies in machine learning using the game of checkers II—recent progress. Annu Rev Autom Program 2010;44(1–2):206–26.
• The MP model (1943) and the Hebb rule (1949) were proposed to discuss how neurons work in the human brain
• Significant artificial intelligence capabilities, such as playing chess and solving simple logic problems, were developed
11. The evolution of data-driven artificial intelligence 2/8
First upsurge period (1960s):
- Perceptron: Rosenblatt F. Perceptron simulation experiments. Proc IRE 1960;48(3):301–9.
- Adaptive Linear Unit: Widrow B, Hoff ME. Adaptive switching circuits. Cambridge: MIT Press; 1960.
• The Perceptron (1956) was proposed to simulate the learning of the human nervous system with linear optimization
• The Adaptive Linear Unit (1959) was successfully used in practical applications
• Both were criticized for their difficulty in handling non-linear problems, such as XOR (or XNOR) classification
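The XOR criticism in the last bullet is easy to demonstrate. Below is a minimal sketch of a Rosenblatt-style perceptron in Python (the epoch count and unit learning rate are arbitrary choices, not from the deck): it converges on the linearly separable AND function, but no setting of its weights can classify all four XOR cases correctly.

```python
import numpy as np

def train_perceptron(X, y, epochs=100):
    """Classic Rosenblatt perceptron: predict 1 iff w.x + b > 0."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if w @ xi + b > 0 else 0
            w += (yi - pred) * xi   # update only on mistakes
            b += (yi - pred)
    return w, b

def accuracy(w, b, X, y):
    preds = [(1 if w @ xi + b > 0 else 0) for xi in X]
    return np.mean([p == t for p, t in zip(preds, y)])

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_and = np.array([0, 0, 0, 1])
y_xor = np.array([0, 1, 1, 0])

and_acc = accuracy(*train_perceptron(X, y_and), X, y_and)
xor_acc = accuracy(*train_perceptron(X, y_xor), X, y_xor)
print(and_acc, xor_acc)  # AND is learned perfectly; XOR cannot be
```

This limitation is exactly what later multi-layer networks with non-linear hidden units, trained by backpropagation, overcame.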
12. The evolution of data-driven artificial intelligence 3/8
Second upsurge period (1980s):
- Hopfield network circuit: Tank DW, Hopfield JJ. Neural computation by concentrating information in time. Proc Natl Acad Sci USA 1987;84(7):1896.
- Back Propagation: Werbos PJ. Backpropagation through time: what it does and how to do it. Proc IEEE 1990;78(10):1550–60.
- Boltzmann Machine: Sussmann HJ. Learning algorithms for Boltzmann machines. 27th IEEE Conference on Decision and Control 1988;1:786–91.
- Support Vector Machine: Vapnik VN. An overview of statistical learning theory. IEEE Trans Neural Netw 1998;10(5):988–99.
• Hopfield networks (1982) serve as associative memory systems with binary threshold nodes
• The BP (1974) algorithm was proposed to solve non-linear problems in complex neural networks, and led to the Boltzmann Machine (BM, 1985)
• SVM (1997) showed decent performance on classification and regression
13. The evolution of data-driven artificial intelligence 4/8
Second upsurge period (1980s):
- Restricted Boltzmann Machine: Smolensky P. Information processing in dynamical systems: foundations of harmony theory. Parallel distributed processing: explorations in the microstructure of cognition. Cambridge: MIT Press; 1986.
- Auto Encoder: Rumelhart DE, Hinton GE, Williams RJ. Learning representations by back-propagating errors. Nature 1986;323(6088):533–6.
• The traditional machine learning techniques discussed above require human expertise for feature extraction and rely heavily on engineered features; RBM and AE are deep learning models that use data representation learning
• RBM (1986) was developed by obtaining the probability distribution of the Boltzmann Machine
• AE (1986) was proposed using the layer-by-layer greedy learning algorithm to minimize the loss function
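The reconstruction-driven training in the last bullet can be illustrated with a minimal single-layer autoencoder in NumPy. This is a sketch, not the deck's setup: the linear activations, toy rank-2 data, learning rate, and iteration count are all illustrative assumptions. The network is trained purely to minimize the reconstruction loss, so the hidden layer is forced to learn a compressed representation of the input.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 4-D inputs that really live on a 2-D subspace.
Z = rng.normal(size=(200, 2))
X = Z @ rng.normal(size=(2, 4))

# One hidden layer of 2 units: encoder W1, decoder W2, squared loss.
W1 = rng.normal(scale=0.5, size=(4, 2))
W2 = rng.normal(scale=0.5, size=(2, 4))
lr = 0.02
for _ in range(5000):
    H = X @ W1                       # encode (linear, for simplicity)
    R = H @ W2                       # decode / reconstruct
    E = R - X                        # reconstruction error
    W2 -= lr * H.T @ E / len(X)      # gradient of mean squared error
    W1 -= lr * X.T @ (E @ W2.T) / len(X)

loss = np.mean((X @ W1 @ W2 - X) ** 2)
print("reconstruction MSE:", loss)
```

With 2 hidden units and rank-2 data, the reconstruction error can be driven close to zero; stacking such layers and pre-training them one at a time is the greedy strategy the bullet refers to.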
14. The evolution of data-driven artificial intelligence 5/8
Third boom period (after 2000s):
- Recurrent Neural Network: Hihi SE, Bengio Y. Hierarchical recurrent neural networks for long-term dependencies. Adv Neural Inf Process Syst 1995;8:493–9.
- Long Short-Term Memory: Hochreiter S, Schmidhuber J. Long short-term memory. Neural Comput 1997;9(8):1735.
- Convolutional Neural Network: LeCun Y, Bottou L, Bengio Y, Haffner P. Gradient-based learning applied to document recognition. Proc IEEE 1998;86(11):2278–324.
• RNN (1995) was proposed for feature learning from sequence data
• LSTM (1997) was proposed to tackle the vanishing gradient problem and deal with complex time sequence data
• CNN (1998) was put forward to handle two-dimensional inputs
• Despite many attempts, no satisfactory performance was reported before 2006
15. The evolution of data-driven artificial intelligence 6/8
/34
14
Timeline Proposed models Reference
Third boom
period
(after 2000s)
Deep Belief Network
Hinton GE, Salakhutdinov RR. Reducing the dimensionality of data withneural
network. Science 2006;313(5786):504–7. Hinton GE, Osindero S, Teh YW. A fast
learning algorithm for deep beliefnets. Neural Comput 2014;18(7):1527–54.
Deep Auto Encoder
Deng L, Seltzer M, Yu D, Acero A, Mohamed A, Hinton GE. Binary coding of speech spectrograms using a deep auto-encoder. Proceedings of the 11th annual conference of the international speech communication association 2010;3:1692–5.
Sparse Auto Encoder
Ranzato M, Poultney C, Chopra S, LeCun Y. Efficient learning of sparse representations with an energy-based model. Proceedings of advances in neural information processing systems 2006:1137–44.
Ranzato MA, Boureau YL, LeCun Y. Sparse feature learning for deep belief networks. Proceedings of international conference on neural information processing systems 2007;20:1185–92.
• DBN (2006) was proposed with reduced computational complexity, and its parameters were successfully learned through layer-wise pre-training and fine-tuning
• Deep Auto Encoder (2005) was proposed by adding more hidden layers to deal with highly nonlinear input
• SAE (2006) was put forward to reduce dimensionality and learn sparse representations
• Deep learning gained increasing popularity
16. The evolution of data-driven artificial intelligence 7/8
Timeline Proposed models Reference
Third boom
period
(after 2000s)
Deep Boltzmann Machine
Salakhutdinov RR, Hinton GE. Deep Boltzmann machines. J Mach Learn Res 2009;5(2):1967–2006.
Denoising Auto Encoder
Larochelle H, Lajoie I, Bengio Y, Manzagol PA. Stacked denoising autoencoders: learning useful representations in a deep network with a local denoising criterion. J Mach Learn Res 2010;11(12):3371–408.
Deep Convolutional
Neural Network
Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. International conference on neural information processing systems 2012;25:1097–105.
• DBM (2009) was proposed to learn from ambiguous input data robustly, and the model parameters were optimized using layer-wise pre-training
• DAE (2010) was presented to reconstruct stochastically corrupted input data, forcing the hidden layer to discover more robust features
• DCNN (2012) was introduced as a deep-structured Convolutional Neural Network and showed superior performance in image recognition
17. The evolution of data-driven artificial intelligence 8/8
Timeline Proposed models Reference
Third boom
period
(after 2000s)
Generative Adversarial
Network
Goodfellow IJ, Pouget-Abadie J, Mirza M, Xu B, Warde-Farley D, Ozair S, et al. Generative adversarial nets. Int Conf Neural Inf Process Syst 2014;3:2672–80.
Attention-based LSTM
Wang Y, Huang M, Zhao L, Zhu X. Attention-based LSTM for aspect-level sentiment classification. Proceedings of conference on empirical methods in natural language processing 2016:606–15.
• GAN (2014) consists of two independent models acting as adversaries
• The Attention-based LSTM model (2016) was proposed by integrating an attention mechanism with LSTM
• Nowadays, new models are being developed almost every week
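The adversarial setup in a GAN can be made concrete by evaluating its value function V(D, G) = E[log D(x)] + E[log(1 − D(G(z)))] on toy data; the Gaussian samples and the fixed logistic discriminator below are illustrative assumptions, not from the slides.

```python
import numpy as np

# Toy evaluation of the GAN value function
#   V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))]
# The generator tries to minimize V; the discriminator maximizes it.
rng = np.random.default_rng(1)
real = rng.normal(loc=2.0, size=1000)        # "real" data samples
fake_bad = rng.normal(loc=-2.0, size=1000)   # a poor generator's output
fake_good = rng.normal(loc=2.0, size=1000)   # a generator matching the data

def D(x):
    """Fixed logistic discriminator: probability that x is real."""
    return 1.0 / (1.0 + np.exp(-x))

def value(real, fake):
    return np.mean(np.log(D(real))) + np.mean(np.log(1 - D(fake)))

v_bad = value(real, fake_bad)    # easy for D: fakes sit far from the data
v_good = value(real, fake_good)  # fakes fool this D, so V drops sharply
```

In training, D and G are updated alternately; at the equilibrium where G matches the data distribution, the optimal D outputs 1/2 everywhere and V equals 2·log(1/2) ≈ −1.39.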
19. Comparison between deep learning and traditional
machine learning
• Deep learning models nonlinear relationships more easily, using compositional functions
• The high-level abstract representations in feature learning make deep learning more flexible and adaptable to data variety
• The ability to avoid feature engineering is regarded as a great advantage in smart manufacturing
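The first bullet can be illustrated concretely: a stack of purely linear layers collapses into a single linear map, so it is the compositional, nonlinear structure of deep models that lets them fit nonlinear relationships. The tiny matrices below are arbitrary illustrative values.

```python
import numpy as np

# Two linear layers collapse into one linear map...
W1 = np.array([[1., 2.],
               [3., 4.]])
W2 = np.array([[0., 1.],
               [1., 0.]])
x = np.array([1., -1.])

linear_deep = W2 @ (W1 @ x)      # layer-by-layer application
linear_flat = (W2 @ W1) @ x      # single collapsed layer: identical result

# ...but a nonlinearity between them yields a compositional function
# that no single linear layer can reproduce.
relu = lambda v: np.maximum(v, 0.0)
nonlinear = W2 @ relu(W1 @ x)    # differs from the linear result
```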
21. Deep learning for smart manufacturing
• Data modelling
• Analysis
• Supporting real-time data processing
22. Convolutional neural network
CNN is a multi-layer feed-forward artificial neural network that was first proposed for two-dimensional image processing
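A minimal sketch of the building block this refers to: a single filter slid over a two-dimensional input, computing one feature map (a "valid" convolution with stride 1). The kernel and input below are illustrative, not from the slides.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation), stride 1."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # each output unit sees only a local receptive field,
            # and the same weights are shared across all positions
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(36, dtype=float).reshape(6, 6)
edge = np.array([[1., 0., -1.]] * 3)       # simple vertical-edge filter
feat = conv2d(image, edge)                 # feature map, shape (4, 4)
```

Stacking such convolutional layers with sampling (pooling) layers yields the abstracted features referred to in the model comparison later in the deck.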
23. Restricted Boltzmann machine and its variant
RBM is a two-layer neural network consisting of a visible and a hidden layer
• Deep Belief Network (DBN): the highest layers are undirected, while the lower layers are directed
• Deep Boltzmann Machine (DBM): the hidden units are grouped into a hierarchy of layers
24. Auto encoder and its variants
AE is an unsupervised learning algorithm that extracts features from input data without requiring label information
• DAE (Denoising): denoises the input and discovers more robust features
• SAE (Sparse): imposes sparsity constraints
• CAE (Contractive): forces the model to be resistant to small perturbations
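The DAE idea above can be sketched as a corruption step: the input is stochastically corrupted (masking noise here, an illustrative choice) and the network is then trained to reconstruct the clean input, which forces it to learn robust features.

```python
import numpy as np

# Corruption step behind a denoising auto encoder (DAE).
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 10))               # clean inputs

def corrupt(x, p=0.3, rng=rng):
    """Randomly zero out a fraction p of the input entries."""
    mask = rng.random(x.shape) >= p        # keep each entry with prob 1-p
    return x * mask

x_tilde = corrupt(x)
# Training target: minimize ||decode(encode(x_tilde)) - x||^2, i.e.
# reconstruct the clean x from the corrupted x_tilde.
```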
25. Recurrent neural network and its variants
RNN has a unique topology in which connections between neurons form directed cycles, making it suitable for sequence data
• RNN: difficulty in dealing with long-term sequence data
• LSTM: allows information to flow down with linear interactions
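The recurrent topology can be sketched as follows: the same weights are applied at every time step and the hidden state carries information forward, which is also where the vanishing-gradient difficulty with long sequences originates. All sizes and weights are illustrative.

```python
import numpy as np

# Forward pass of a plain (vanilla) RNN over a short sequence.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
Wx = rng.normal(scale=0.5, size=(n_in, n_hidden))
Wh = rng.normal(scale=0.5, size=(n_hidden, n_hidden))

def rnn_forward(seq):
    h = np.zeros(n_hidden)                 # initial hidden state
    states = []
    for x_t in seq:                        # directed cycle, unrolled in time
        h = np.tanh(x_t @ Wx + h @ Wh)     # h_t depends on x_t and h_{t-1}
        states.append(h)
    return np.array(states)

seq = rng.normal(size=(7, n_in))           # a sequence of 7 time steps
H = rnn_forward(seq)                       # hidden states, shape (7, 4)
```

LSTM replaces the single tanh update with gated cells so that information can flow across many steps with mostly linear interactions.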
26. Model comparison
CNN
Principle: abstracted features are learned by stacked convolutional and sampling layers.
Pros: reduced parameter number; invariance to shift, scale and distortion.
Cons: high computational complexity for training highly hierarchical models.
RNN
Principle: temporal patterns are stored in the recurrent neuron connections and in distributed hidden states for time-series data.
Pros: short-term information is retained and temporal correlations in sequence data are captured.
Cons: difficult to train the model and to preserve long-term dependencies.
RBM
Principle: the hidden layer describes variable dependencies and connections between input or output layers as representative features.
Pros: robust to ambiguous input; training labels are not required in the pre-training stage.
Cons: time-consuming joint parameter optimization.
AE
Principle: unsupervised feature learning and data dimensionality reduction are achieved through encoding.
Pros: irrelevance in the input is eliminated and meaningful information is preserved.
Cons: errors propagate layer by layer, and sparse representations are not guaranteed.
Table 3. Comparison between different deep learning models
28. Applications to smart manufacturing
• Computational intelligence is an essential part of smart manufacturing, enabling accurate insights for better decision making
• Product quality inspection, fault diagnosis, and defect prognosis have been investigated for a wide range of manufacturing systems recently
CNN
• Surface integration inspection (1/4) [72–75]
• Machinery fault diagnosis (8/8) [77–84]
DBN
• Machinery fault diagnosis (8/8) [85–92]
• Predictive analytics & defect prognosis (2/4) [109–112]
AE
• Machinery fault diagnosis (3/11) [93–103]
RNN
• Predictive analytics & defect prognosis (4/5) [104–108]
Table 5. A list of deep learning models with applications
[Reference; (number of applications of this kind) / (total number of applications)]
29. Descriptive analytics for product quality inspection
Products are inspected employing machine vision and image processing techniques to detect surface defects for enhanced product quality in manufacturing
Deep learning (CNN in particular) has been investigated to learn high-level generic features and has been applied to a wide range of textures and difficult-to-detect defect cases
30. Diagnostic analytics for fault assessment
Monitor machinery conditions, identify incipient defects, diagnose the root causes of failures, and then incorporate this information into manufacturing production and control
Deep learning models outperform traditional machine learning techniques in terms of classification accuracy
• CNN: bearing, gearbox, wind generator and rotor
• DBN: aircraft engine, chemical process, reciprocating compressor, rolling element bearing, high-speed train and wind turbine
• AE: planetary gearbox, wind turbine and rolling bearing
31. Predictive analytics for defect prognosis
Develop and implement an intelligent maintenance strategy that allows manufacturers to determine the condition of in-service systems in order to predict when maintenance should be performed
Deep learning has been presented for anomaly prediction in machines, optimizing job schedules and balancing the computational load
• DBN: material removal rate, chemical mechanical polishing and ceramic bearing
• RNNs (LSTM): rolling bearing, machine health monitoring, tool wear prediction and aircraft turbofan engine
33. Discussions and outlook
Most companies do not know what to do with the data they have, and they lack the software and modelling capability to interpret and analyze them
To address these challenges, deep learning for smart manufacturing is discussed in terms of data matters, model selection, model visualization, generic models, and incremental learning
• Improved data collection
• Use and sharing
• Predictive model design
• Generalized predictive models
• Connected factories and control processes
Five gaps are identified in smart manufacturing. From: Kusiak A. Smart manufacturing must embrace big data. Nature 2017;544(7648):23–5.
34. Discussions and outlook
Data matter
Description: (1) models heavily depend on the scale and quality of datasets; (2) the class-imbalance problem
Solution: (1) extract the relevant data and apply it to the appropriate task; (2) appropriate measures and the integration of bootstrapping
Model selection
Description: complexity of the manufacturing process (different problems)
Solution: supervised learning for data-rich but knowledge-sparse problems, i.e. where labelled data are available
Model visualization
Description: models need to be understood by manufacturing engineers
Solution: visualization and fusion may contribute to a more effective model
Generic model
Description: models should not be bound to specific machines
Solution: (1) increase the model's width or depth; (2) apply it to large-scale and real-time analytics using GPUs; (3) choose appropriate models
Incremental learning
Description: learning algorithms are not fundamentally built to learn incrementally and are therefore susceptible to data-velocity issues
Solution: necessary to enable deep learning with incremental learning capabilities
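The incremental-learning capability in the last row can be sketched as online updating: a model that improves from each new mini-batch of streaming data without retraining on everything seen so far. The logistic model and the synthetic data stream below are illustrative assumptions, not from the slides.

```python
import numpy as np

# Incremental (online) learning: one gradient update per arriving batch.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -1.0])             # hidden "ground truth" rule
w = np.zeros(2)                            # model weights, learned online

def sgd_step(w, X, y, lr=0.1):
    """One incremental update from a fresh mini-batch (X, y)."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))     # predicted P(y = 1)
    return w - lr * X.T @ (p - y) / len(y) # logistic-loss gradient step

def accuracy(w, X, y):
    return np.mean((X @ w > 0) == (y == 1))

X_test = rng.normal(size=(200, 2))
y_test = (X_test @ w_true > 0).astype(float)
acc_before = accuracy(w, X_test, y_test)

for _ in range(50):                        # 50 mini-batches arrive over time
    Xb = rng.normal(size=(16, 2))
    yb = (Xb @ w_true > 0).astype(float)
    w = sgd_step(w, Xb, yb)                # model keeps learning; no replay

acc_after = accuracy(w, X_test, y_test)
```

Each batch is seen once and discarded, which is what makes the scheme robust to data-velocity issues.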
35. Conclusions
Deep learning provides advanced analytics and offers great potential to smart manufacturing in the age of big data
The emerging research effort on deep learning in manufacturing applications is also summarized
Deep learning may be pushed into the cloud, enabling more convenient and on-demand computing services for smart manufacturing