Machine Learning on Your Hand - Introduction to TensorFlow Lite Preview (Modulabs)
TF Dev Summit × Modulabs : Learn by Run!
Machine Learning on Your Hand - Introduction to TensorFlow Lite Preview (Presenter: 강재욱)
※ Modulabs page: https://www.facebook.com/lab4all/
※ Modulabs community group: https://www.facebook.com/groups/modulabs
TensorRT is an NVIDIA tool that optimizes and accelerates deep learning models for production deployment. It performs optimizations like layer fusion, reduced precision from FP32 to FP16 and INT8, kernel auto-tuning, and multi-stream execution. These optimizations reduce latency and increase throughput. TensorRT automatically optimizes models by taking in a graph, performing optimizations, and outputting an optimized runtime engine.
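The effect of the reduced-precision optimizations can be sketched without TensorRT itself. The snippet below is an illustration in plain NumPy, not TensorRT code: it casts FP32 weights to FP16 and applies symmetric per-tensor INT8 quantization, showing the memory savings and the bounded error these precision modes introduce.

```python
import numpy as np

# Illustrative sketch only (not the TensorRT API): show what
# FP32 -> FP16 and FP32 -> INT8 reduced precision does to weights.
rng = np.random.default_rng(0)
w_fp32 = rng.standard_normal(1024).astype(np.float32)

# FP16: a straight cast, halving the memory footprint.
w_fp16 = w_fp32.astype(np.float16)

# INT8: symmetric quantization with a single scale for the tensor.
scale = np.abs(w_fp32).max() / 127.0
w_int8 = np.clip(np.round(w_fp32 / scale), -127, 127).astype(np.int8)
w_dequant = w_int8.astype(np.float32) * scale

print(w_fp32.nbytes, w_fp16.nbytes, w_int8.nbytes)  # 4096 2048 1024
# Rounding error of symmetric quantization is at most half a step.
print(np.abs(w_fp32 - w_dequant).max() <= scale / 2 + 1e-6)
```

In TensorRT proper, INT8 additionally requires a calibration step to pick good scales per layer; the single max-based scale above is the simplest possible choice.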
This document summarizes recent advances in single image super-resolution (SISR) using deep learning methods. It discusses early SISR networks like SRCNN, VDSR and ESPCN. SRResNet is presented as a baseline method, incorporating residual blocks and pixel shuffle upsampling. SRGAN and EDSR are also introduced, with EDSR achieving state-of-the-art PSNR results. The relationship between reconstruction loss, perceptual quality and distortion is examined. While PSNR improves yearly, a perception-distortion tradeoff remains. Developments are ongoing to produce outputs that are both accurately restored and naturally perceived.
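The pixel-shuffle upsampling mentioned above (the sub-pixel convolution layer introduced in ESPCN and reused in SRResNet) is just a tensor rearrangement. A minimal NumPy re-implementation for illustration, not library code:

```python
import numpy as np

def pixel_shuffle(x: np.ndarray, r: int) -> np.ndarray:
    """Rearrange a (C*r*r, H, W) feature map into (C, H*r, W*r),
    as in ESPCN's sub-pixel convolution upsampling."""
    c_r2, h, w = x.shape
    c = c_r2 // (r * r)
    x = x.reshape(c, r, r, h, w)
    x = x.transpose(0, 3, 1, 4, 2)  # -> (c, h, r, w, r)
    return x.reshape(c, h * r, w * r)

# For r=2, 4 channels at 2x2 become 1 channel at 4x4: each output 2x2
# block interleaves one pixel from each of the 4 input channels.
feat = np.arange(16).reshape(4, 2, 2)
img = pixel_shuffle(feat, 2)
print(img.shape)  # (1, 4, 4)
```

Because the layer is pure data movement, the convolution that precedes it can operate entirely at low resolution, which is what makes ESPCN-style upsampling cheap.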
Hands-on tutorial of deep learning (Keras), by Chun-Min Chang
Summary
# Fundamentals of deep learning
--- Selection of activation function
--- Selection of loss function
--- Selection of optimizer
--- Effect of learning rate
# How to prevent overfitting
--- Regularization
--- Dropout
--- Early stopping
--- Batch normalization
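Of the overfitting remedies listed, dropout is the easiest to demystify in a few lines. The following is a minimal "inverted dropout" sketch in NumPy (Keras applies the same idea inside its `Dropout` layer):

```python
import numpy as np

def dropout(x, rate, rng, training=True):
    """Inverted dropout: during training, zero each unit with probability
    `rate` and rescale survivors by 1/(1-rate) so the expected activation
    is unchanged; at inference, pass activations through untouched."""
    if not training or rate == 0.0:
        return x
    keep_prob = 1.0 - rate
    mask = rng.random(x.shape) < keep_prob
    return x * mask / keep_prob

rng = np.random.default_rng(42)
acts = np.ones(100_000)
out = dropout(acts, rate=0.5, rng=rng)
print(abs(out.mean() - 1.0) < 0.02)  # expectation is preserved
```

The rescaling during training is what lets inference skip dropout entirely; without it, activations at test time would be systematically larger than those seen in training.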
Similar to Convolutional Neural Networks (CNN) — 卷積神經網路的前世今生
A short talk for a forum held by Taiwan Association for Human Rights: https://www.tahr.org.tw/event/2670
Video: https://youtu.be/-hYQRHqyR9g (28:10 - 50:35)
Lecture for Neural Networks study group held on February 8, 2020.
Reference book: http://hagan.okstate.edu/nnd.html
Video: https://youtu.be/TyyoPU13ME0
Python demo codes: https://bit.ly/3893GHB
Initiated by Taiwan AI Group (https://www.facebook.com/groups/Taiwan.AI.Group/permalink/2017771298545301/)
Lecture for Neural Networks study group held on January 11, 2020.
Reference book: http://hagan.okstate.edu/nnd.html
Video: https://youtu.be/H4NKgliTFUw
Initiated by Taiwan AI Group (https://www.facebook.com/groups/Taiwan.AI.Group/permalink/2017771298545301/)
Introduction to Spiking Neural Networks: From a Computational Neuroscience pe..., by Jason Tsai
This document provides an introduction to spiking neural networks (SNNs) through a presentation given by Jason Tsai. It begins with an overview of the characteristics and advantages of SNNs. It then covers relevant neuroscience concepts like neurons, synapses, action potentials, Hebb's rule, and spike-timing dependent plasticity. Learning algorithms like backpropagation and STDP are introduced. Common neuron models and coding schemes are described. Finally, several neuromorphic computing platforms are discussed.
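To make the spike-timing dependent plasticity (STDP) rule mentioned above concrete, here is a minimal pair-based STDP weight update in NumPy. The parameter values are illustrative choices, not taken from the slides:

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP with dt = t_post - t_pre (ms).
    Pre-before-post (dt > 0) potentiates; post-before-pre (dt <= 0)
    depresses. The weight is clipped to [0, 1]."""
    if dt > 0:
        dw = a_plus * np.exp(-dt / tau)    # long-term potentiation
    else:
        dw = -a_minus * np.exp(dt / tau)   # long-term depression
    return float(np.clip(w + dw, 0.0, 1.0))

print(stdp_update(0.5, dt=10.0) > 0.5)    # causal pair strengthens
print(stdp_update(0.5, dt=-10.0) < 0.5)   # anti-causal pair weakens
```

The exponential windows mean that only spike pairs close in time change the weight appreciably, which is the Hebbian "fire together, wire together" intuition made temporal.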
Introduction to Spiking Neural Networks: From a Computational Neuroscience pe..., by Jason Tsai
The document provides an introduction to spiking neural networks (SNNs) and neuromorphic computing. It discusses the characteristics and advantages of SNNs, including their spatio-temporal nature, asynchronous processing, sparsity, and energy efficiency. It also covers basic neuroscience concepts like neurons, action potentials, synaptic plasticity, and learning rules like STDP. Common SNN models and neural encoding schemes are described. Examples of SNN applications in visual processing and pattern generation are presented. Finally, neuromorphic hardware platforms like Intel's Loihi chip are introduced.
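Among the common neuron models covered, the leaky integrate-and-fire (LIF) neuron is the simplest. The sketch below is illustrative (forward-Euler integration, arbitrary units): it integrates an input current, fires when the membrane potential crosses threshold, and resets.

```python
import numpy as np

def simulate_lif(current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron via forward-Euler integration:
    tau * dv/dt = -(v - v_rest) + I(t); emit a spike and reset when v
    crosses threshold. Returns a 0/1 spike train as long as `current`."""
    v = v_rest
    spikes = []
    for i in current:
        v += dt / tau * (-(v - v_rest) + i)
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes

# A constant supra-threshold current makes the neuron fire regularly;
# zero input produces no spikes at all.
train = simulate_lif(np.full(200, 2.0))
print(sum(train) > 0, sum(simulate_lif(np.zeros(200))) == 0)
```

This sparsity, output only when the membrane actually crosses threshold, is exactly the property the summary credits for the energy efficiency of neuromorphic hardware.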
Introduction to Spiking Neural Networks: From a Computational Neuroscience pe..., by Jason Tsai
Abstract:
Being the third generation of neural network models, the study of spiking neural networks is an interdisciplinary field spanning brain science, theoretical neuroscience, and artificial neural network research. It has recently been gaining attention and momentum, especially in neuromorphic device design for real-time machine learning. Some of you might have heard of it, but its underlying principles probably remain unknown to most of you. In this talk, I will briefly illustrate the basic building blocks of this emerging architecture and technology.
Lecture for Reinforcement Learning study group held on August 19th, 2017.
Reference book: http://incompleteideas.net/book/the-book.html
Video: https://youtu.be/xv5ZsOSf6ZQ
Initiated by Taiwan AI Group (https://www.facebook.com/groups/Taiwan.AI.Group/permalink/1796526840669749/)
Deep Learning: Chapter 11 Practical Methodology, by Jason Tsai
Lecture for Deep Learning 101 study group to be held on June 9th, 2017.
Reference book: https://www.deeplearningbook.org/
Past video archives: https://goo.gl/hxermB
Initiated by Taiwan AI Group (https://www.facebook.com/groups/Taiwan.AI.Group/)
Deep Learning: Introduction & Chapter 5 Machine Learning Basics, by Jason Tsai
Lecture given for the Deep Learning 101 study group with Frank Wu on Dec. 9th, 2016.
Reference: https://www.deeplearningbook.org/
Initiated by Taiwan AI Group (https://www.facebook.com/groups/Taiwan.AI.Group/)
Convolutional Neural Networks (CNN) — 卷積神經網路的前世今生
1. Jason Tsai (蔡志順) 2019.10.24
Deep01 (愛因斯坦人工智慧)
Convolutional Neural Networks (CNN)
卷積神經網路的前世今生 (The Past and Present of Convolutional Neural Networks)
*Picture adapted from https://bit.ly/2o1OKct
2. Copyright Notice:
All figures in this presentation are taken from miscellaneous sources, and their copyrights belong to the original authors. This presentation itself is released under a Creative Commons license.