Course Code: CSC701 | Course Title: Deep Learning | Credits: 3
Prerequisites: Basic mathematics and statistical concepts, linear algebra, machine learning
Course Objectives:
1 To learn the fundamentals of Neural Networks.
2 To gain an in-depth understanding of training Deep Neural Networks.
3 To acquire knowledge of advanced concepts of Convolutional Neural Networks,
Autoencoders and Recurrent Neural Networks.
4 To become familiar with recent trends in Deep Learning.
Course Outcomes:
1 Gain basic knowledge of Neural Networks.
2 Acquire an in-depth understanding of training Deep Neural Networks.
3 Design appropriate DNN models for supervised, unsupervised and sequence learning
applications.
4 Gain familiarity with recent trends and applications of Deep Learning.
Module | Content | Hrs (Total: 39)
1 Fundamentals of Neural Networks 4
1.1 History of Deep Learning, Deep Learning Success Stories, Multilayer
Perceptrons (MLPs), Representation Power of MLPs, Sigmoid Neurons,
Gradient Descent, Feedforward Neural Networks, Representation Power
of Feedforward Neural Networks
1.2 Deep Networks: Three Classes of Deep Learning, Basic Terminologies
of Deep Learning
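Illustrative note: the sigmoid neuron and gradient descent listed in 1.1 can be sketched in a few lines of NumPy. This is a minimal sketch under assumed toy data and an assumed learning rate, not prescribed material:

import numpy as np

def sigmoid(z):
    # Logistic activation: squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 samples, 2 features, binary targets (AND pattern; illustrative only)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 0., 0., 1.])

w = np.zeros(2)          # weights of a single sigmoid neuron
b = 0.0                  # bias
lr = 0.5                 # learning rate (arbitrary choice)

for epoch in range(1000):
    y_hat = sigmoid(X @ w + b)          # forward pass
    error = y_hat - y                   # gradient of cross-entropy loss w.r.t. the pre-activation
    grad_w = X.T @ error / len(y)       # average gradient w.r.t. weights
    grad_b = error.mean()               # average gradient w.r.t. bias
    w -= lr * grad_w                    # gradient descent update
    b -= lr * grad_b

print(np.round(sigmoid(X @ w + b), 2))  # moves toward the AND targets [0, 0, 0, 1]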
2 Training, Optimization and Regularization of Deep Neural Networks 10
2.1 Training Feedforward DNN
Multi-layered Feedforward Neural Network, Learning Factors;
Activation functions: Tanh, Logistic, Linear, Softmax, ReLU, Leaky
ReLU; Loss functions: Squared Error Loss, Cross Entropy; Choosing
the output function and loss function
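Illustrative note: the activation and loss functions listed in 2.1 are one-line formulas. A minimal NumPy sketch (function names and the example logits are assumptions for illustration):

import numpy as np

def relu(z):        return np.maximum(0.0, z)
def leaky_relu(z):  return np.where(z > 0, z, 0.01 * z)
def tanh(z):        return np.tanh(z)
def logistic(z):    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Subtract the row maximum for numerical stability; each row sums to 1
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def squared_error(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def cross_entropy(y_true_onehot, y_prob):
    # y_prob comes from softmax; clip to avoid log(0)
    return -np.mean(np.sum(y_true_onehot * np.log(np.clip(y_prob, 1e-12, 1.0)), axis=-1))

logits = np.array([[2.0, 1.0, 0.1]])
target = np.array([[1.0, 0.0, 0.0]])
print(softmax(logits), cross_entropy(target, softmax(logits)))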
2.2 Optimization
Learning with backpropagation, Learning Parameters: Gradient
Descent (GD), Stochastic and Mini Batch GD, Momentum Based GD,
Nesterov Accelerated GD, AdaGrad, Adam, RMSProp
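Illustrative note: the update rules in 2.2 differ only in how the raw gradient is turned into a parameter step. A minimal sketch of plain GD, momentum-based GD and Adam, written against an assumed toy gradient function:

import numpy as np

def grad(w):
    # Assumed gradient of the quadratic loss 0.5 * ||w||^2 (illustration only)
    return w

w = np.array([5.0, -3.0])

# 1) Plain gradient descent
v = w.copy()
for _ in range(100):
    v -= 0.1 * grad(v)

# 2) Momentum-based gradient descent
m_w, velocity = w.copy(), np.zeros_like(w)
for _ in range(100):
    velocity = 0.9 * velocity + 0.1 * grad(m_w)
    m_w -= velocity

# 3) Adam: per-parameter first/second moment estimates with bias correction
a_w = w.copy()
m, s = np.zeros_like(w), np.zeros_like(w)
beta1, beta2, lr, eps = 0.9, 0.999, 0.1, 1e-8
for t in range(1, 101):
    g = grad(a_w)
    m = beta1 * m + (1 - beta1) * g
    s = beta2 * s + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    a_w -= lr * m_hat / (np.sqrt(s_hat) + eps)

print(v, m_w, a_w)   # all three approach the minimum at [0, 0]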
2.3 Regularization
Overview of Overfitting, Types of Biases, Bias-Variance Tradeoff;
Regularization Methods: L1 and L2 regularization, Parameter Sharing,
Dropout, Weight Decay, Batch Normalization, Early Stopping, Data
Augmentation, Adding Noise to Input and Output
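Illustrative note: two of the regularizers in 2.3, L2 regularization and (inverted) dropout, are small enough to sketch directly. The surrounding training loop is assumed, not shown:

import numpy as np

rng = np.random.default_rng(0)

def l2_penalty(weights, lam=1e-3):
    # L2 regularization adds lam * ||W||^2 to the loss (acts as weight decay in the update)
    return lam * sum(np.sum(W ** 2) for W in weights)

def dropout(activations, p_drop=0.5, training=True):
    # Inverted dropout: randomly zero units during training and rescale the rest,
    # so nothing needs to change at test time
    if not training:
        return activations
    mask = (rng.random(activations.shape) >= p_drop) / (1.0 - p_drop)
    return activations * mask

h = rng.standard_normal((4, 8))       # a batch of hidden activations (toy example)
print(dropout(h).round(2))
print(l2_penalty([rng.standard_normal((8, 8))]))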
3 Autoencoders: Unsupervised Learning 6
3.1 Introduction, Linear Autoencoder, Undercomplete Autoencoder,
Overcomplete Autoencoders, Regularization in Autoencoders
3.2 Denoising Autoencoders, Sparse Autoencoders, Contractive
Autoencoders
3.3 Application of Autoencoders: Image Compression
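Illustrative note: an undercomplete linear autoencoder (encode to a smaller code, decode back, minimise reconstruction error) underlies the image-compression application in 3.3. A minimal sketch with assumed toy data, dimensions and learning rate:

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))        # toy "images": 200 samples of 64 pixels

d_in, d_code = 64, 8                      # undercomplete: the code is smaller than the input
W_enc = rng.standard_normal((d_in, d_code)) * 0.1
W_dec = rng.standard_normal((d_code, d_in)) * 0.1
lr = 1e-3

for _ in range(500):
    code = X @ W_enc                      # encoder (linear)
    X_hat = code @ W_dec                  # decoder (linear)
    err = X_hat - X                       # reconstruction error
    # Gradients of the mean squared reconstruction loss
    grad_dec = code.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

print(np.mean((X - (X @ W_enc) @ W_dec) ** 2))   # reconstruction MSE decreases over training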
4 Convolutional Neural Networks (CNN): Supervised Learning 7
4.1 Convolution operation, Padding, Stride, Relation between input, output
and filter size, CNN architecture: Convolution layer, Pooling Layer,
Weight Sharing in CNN, Fully Connected NN vs CNN, Variants of
basic Convolution function, Multichannel convolution operation, 2D
convolution.
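Illustrative note: the relation between input, output and filter size in 4.1 is output = floor((input + 2*padding - filter) / stride) + 1. A minimal sketch that checks this formula against a naive single-channel 2D convolution (cross-correlation convention, as commonly implemented):

import numpy as np

def conv2d_output_size(n_in, k, padding=0, stride=1):
    # Relation between input size, filter size, padding and stride
    return (n_in + 2 * padding - k) // stride + 1

def conv2d(x, w, padding=0, stride=1):
    # Naive single-channel 2D convolution (cross-correlation convention)
    x = np.pad(x, padding)
    k = w.shape[0]
    n_out = (x.shape[0] - k) // stride + 1
    out = np.zeros((n_out, n_out))
    for i in range(n_out):
        for j in range(n_out):
            patch = x[i * stride:i * stride + k, j * stride:j * stride + k]
            out[i, j] = np.sum(patch * w)
    return out

x = np.arange(49, dtype=float).reshape(7, 7)            # 7x7 input
w = np.ones((3, 3))                                     # 3x3 filter
print(conv2d_output_size(7, 3, padding=1, stride=2))    # -> 4
print(conv2d(x, w, padding=1, stride=2).shape)          # -> (4, 4)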
4.2 Modern Deep Learning Architectures:
LeNet: Architecture, AlexNet: Architecture, ResNet: Architecture
5 Recurrent Neural Networks (RNN) 8
5.1 Sequence Learning Problem, Unfolding Computational graphs,
Recurrent Neural Network, Bidirectional RNN, Backpropagation
Through Time (BPTT), Limitations of vanilla RNN: Vanishing and
Exploding Gradients, Truncated BPTT
5.2 Long Short-Term Memory (LSTM): Selective Read, Selective Write,
Selective Forget, Gated Recurrent Unit (GRU)
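Illustrative note: one time step of a vanilla RNN and of an LSTM cell (selective write, read and forget via the input, output and forget gates) can be written out directly. A minimal sketch with assumed small dimensions and randomly initialised weights:

import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

d_x, d_h = 3, 4                          # input and hidden sizes (arbitrary)
x_t = rng.standard_normal(d_x)           # input at time t
h_prev = np.zeros(d_h)                   # previous hidden state
c_prev = np.zeros(d_h)                   # previous cell state (LSTM only)

# Vanilla RNN step: h_t = tanh(W x_t + U h_{t-1} + b)
W, U, b = rng.standard_normal((d_h, d_x)), rng.standard_normal((d_h, d_h)), np.zeros(d_h)
h_rnn = np.tanh(W @ x_t + U @ h_prev + b)

# LSTM step: forget gate (selective forget), input gate (selective write), output gate (selective read)
def gate_params():
    return rng.standard_normal((d_h, d_x)), rng.standard_normal((d_h, d_h)), np.zeros(d_h)

(Wf, Uf, bf), (Wi, Ui, bi), (Wo, Uo, bo), (Wc, Uc, bc) = (gate_params() for _ in range(4))

f = sigmoid(Wf @ x_t + Uf @ h_prev + bf)         # forget gate
i = sigmoid(Wi @ x_t + Ui @ h_prev + bi)         # input gate
o = sigmoid(Wo @ x_t + Uo @ h_prev + bo)         # output gate
c_tilde = np.tanh(Wc @ x_t + Uc @ h_prev + bc)   # candidate cell state
c_t = f * c_prev + i * c_tilde                   # selectively forget old / write new content
h_t = o * np.tanh(c_t)                           # selectively read the cell state

print(h_rnn.round(2), h_t.round(2))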
6 Recent Trends and Applications 4
6.1 Generative Adversarial Network (GAN): Architecture
6.2 Applications: Image Generation, DeepFake
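Illustrative note: the GAN of 6.1 is two networks trained adversarially; the discriminator maximises log D(x) + log(1 - D(G(z))) while the generator is trained to fool it. A loss-level sketch only, with both networks stubbed out as simple assumed functions:

import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def generator(z, theta_g):
    # Stub generator: maps noise to "fake samples" (a real GAN uses a deep network here)
    return np.tanh(z @ theta_g)

def discriminator(x, theta_d):
    # Stub discriminator: outputs the probability that x is real
    return sigmoid(x @ theta_d)

theta_g = rng.standard_normal((8, 2)) * 0.1
theta_d = rng.standard_normal(2) * 0.1

x_real = rng.standard_normal((16, 2)) + 3.0       # toy "real" data
z = rng.standard_normal((16, 8))                  # noise fed to the generator
x_fake = generator(z, theta_g)

eps = 1e-12
d_loss = -np.mean(np.log(discriminator(x_real, theta_d) + eps)
                  + np.log(1 - discriminator(x_fake, theta_d) + eps))
g_loss = -np.mean(np.log(discriminator(x_fake, theta_d) + eps))   # non-saturating generator loss

print(d_loss, g_loss)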
Textbooks:
1 Ian Goodfellow, Yoshua Bengio, Aaron Courville, "Deep Learning", MIT Press, 2016.
2 Li Deng and Dong Yu, "Deep Learning: Methods and Applications", Now Publishers Inc.
3 Satish Kumar, "Neural Networks: A Classroom Approach", Tata McGraw-Hill.
4 J. M. Zurada, "Introduction to Artificial Neural Systems", Jaico Publishing House.
5 M. J. Kochenderfer, Tim A. Wheeler, "Algorithms for Optimization", MIT Press.
References:
1 Seth Weidman, "Deep Learning from Scratch: Building with Python from First Principles",
O'Reilly.
2 François Chollet, "Deep Learning with Python", Manning, New York, 2018.
3 Douwe Osinga, "Deep Learning Cookbook", SPD Publishers, Delhi.
4 Simon Haykin, "Neural Networks: A Comprehensive Foundation", Prentice Hall
International, Inc.
5 S. N. Sivanandam and S. N. Deepa, "Principles of Soft Computing", Wiley India.
Assessment:
Internal Assessment:
The assessment consists of two class tests of 20 marks each. The first class test is to be
conducted when approximately 40% of the syllabus is completed, and the second class test when
an additional 40% of the syllabus is completed. The duration of each test shall be one hour.
End Semester Theory Examination:
1 Question paper will comprise a total of six questions.
2 All questions carry equal marks.
3 Question 1 and Question 6 will have sub-questions from all modules. The remaining four
questions will be based on the remaining four modules.
4 Only four questions need to be solved.
5 In the question paper, the weightage of each module will be proportional to the number of
lecture hours assigned to it in the syllabus.
Useful Links
1 http://www.cse.iitm.ac.in/~miteshk/CS6910.html
2 https://nptel.ac.in/courses/106/106/106106184/
3 https://www.deeplearningbook.org/