This document provides an overview of a Deep Learning course, including:
- The course code, pre-requisites, title, semester, and class schedule.
- Course delivery: offline classes, labs, assignments, and a group case study.
- The course objective: to introduce students to deep neural network architectures, training strategies, challenges, and tools for designing solutions to engineering problems.
- The syllabus, covering topics such as neural networks, convolutional neural networks, and recurrent neural networks.
1. 21AI637/21CS716
Deep Learning
M.Tech CSE & AI - II Sem - Elective
AMRITA VISHWA VIDYAPEETHAM
LECTURE 1
DR. SIKHA O K
REFERENCES: TOWARDS DATA SCIENCE, MACHINE LEARNING MASTERY, TEXTBOOKS
2. Welcome To The Course
Course code: 21AI637/21CS716
Pre-Requisite(s): Computational Linear Algebra, Computational Methods for Optimization
Title: Deep Learning
Semester: 2
Batch: M.Tech CSE/AI
Slots:
Monday: Slot 2 (9:35-10:25 AM)
Tuesday: Slot 5 (12:20-1:10 PM)
Thursday: Slot 3 (10:30-11:20 AM)
Thursday: Slots 6-7 (2:00-4:00 PM)
3. Course Delivery
Offline Classes (AB3 D206) - Theory
Lab (AB3 F204) - 2 hrs per week
Assignments (problems and programming)
Case Study (group)
Course Repository – AUMS
4. Course Objective
To introduce students to different deep neural network architectures, training
strategies and algorithms, common challenges, and the tools and techniques available
for designing and deploying solutions to practical engineering problems.
5. Syllabus
21AI637/21CS716 Deep Learning L-T-P-C: 3-0-2-4
Unit 1
Neural Networks Basics – Linearly Separable Problems and the Perceptron – Multi-layer Neural Networks and
Backpropagation, Practical Aspects of Deep Learning: Train/Dev/Test Sets, Bias/Variance, Vanishing/Exploding
Gradients, Gradient Checking, Hyperparameter Tuning
Unit 2
Convolutional Neural Networks – Basics and Evolution of Popular CNN Architectures – Transfer Learning –
Applications: Object Detection and Localization, Face Recognition, Neural Style Transfer; Recurrent Neural
Networks – GRU – LSTM – NLP – Word Embeddings – Transfer Learning – Attention Models – Applications: Sentiment
Classification, Speech Recognition, Action Recognition
Unit 3
Restricted Boltzmann Machines – Deep Belief Networks – Autoencoders – Applications: Semi-Supervised
Classification, Noise Reduction, Non-linear Dimensionality Reduction; Goal-Oriented Decision Making – Policy and
Target Networks – Deep Q-Networks for Reinforcement Learning; Introduction to GANs – Encoder/Decoder,
Generator/Discriminator Architectures; Challenges in NN Training – Data Augmentation – Hyperparameter
Settings – Transfer Learning – Developing and Deploying ML Models (e.g., MATLAB/TensorFlow/PyTorch)
6. Course Outcome
COs | Course Outcome | Bloom's Taxonomy Level
CO 1 | Be able to design, train, and deploy neural networks for solving different practical/engineering problems, and analyze and report their efficacy. | L4
CO 2 | Have a good level of knowledge (both conceptual and mathematical) of different neural network settings to pursue research in this field. | L3
CO 3 | Build skills in using established ML tools/libraries and develop self-learning skills in the field. | L3