R22 B.Tech. CSE (AI and ML) Syllabus, JNTU Hyderabad
AM701PC: DEEP LEARNING
B.Tech. IV Year I Sem.
Course Objectives:
• To understand deep learning algorithms and their applications in real-world data
Course Outcomes:
• Understand machine learning basics and neural networks
• Understand optimal usage of data for training deep models
• Apply CNN and RNN models for real-world data
• Evaluate deep models
• Develop deep models for real-world problems
UNIT - I
Machine Learning Basics
Learning Algorithms, Capacity, Overfitting and Underfitting, Hyperparameters and Validation Sets, Estimators, Bias and Variance, Maximum Likelihood Estimation, Bayesian Statistics, Supervised Learning Algorithms, Unsupervised Learning Algorithms, Stochastic Gradient Descent, Building a Machine Learning Algorithm, Challenges Motivating Deep Learning
Deep Feedforward Networks: Learning XOR, Gradient-Based Learning, Hidden Units, Architecture Design, Back-Propagation and Other Differentiation Algorithms
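As a minimal sketch of the Unit I topics "Learning XOR", "Gradient-Based Learning", and "Back-Propagation" (assuming NumPy; the layer sizes, learning rate, and iteration count are illustrative choices, not prescribed by the syllabus), a two-layer feedforward network can be trained on XOR like this:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # XOR inputs
y = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)     # hidden layer, 4 sigmoid units
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)     # sigmoid output unit

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)           # hidden activations
    return h, sigmoid(h @ W2 + b2)     # output probabilities

def bce(p, y):                         # binary cross-entropy loss
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

_, p0 = forward(X)
loss_init = bce(p0, y)

lr = 1.0
for _ in range(5000):
    h, p = forward(X)
    dz2 = (p - y) / len(X)             # gradient of BCE w.r.t. output pre-activation
    gW2, gb2 = h.T @ dz2, dz2.sum(0)
    dz1 = (dz2 @ W2.T) * h * (1 - h)   # back-propagate through the hidden sigmoids
    gW1, gb1 = X.T @ dz1, dz1.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1     # gradient-descent parameter update
    W2 -= lr * gW2; b2 -= lr * gb2

_, p = forward(X)
loss_final = bce(p, y)
print(loss_init, "->", loss_final)     # loss drops as the network learns XOR
```

Note that XOR is not linearly separable, which is exactly why at least one hidden layer is required here.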
UNIT - II
Regularization for Deep Learning
Parameter Norm Penalties, Norm Penalties as Constrained Optimization, Regularization and Under-Constrained Problems, Dataset Augmentation, Noise Robustness, Semi-Supervised Learning, Multi-Task Learning, Early Stopping, Parameter Tying and Parameter Sharing, Sparse Representations, Bagging and Other Ensemble Methods, Dropout, Adversarial Training, Tangent Distance, Tangent Prop, and Manifold Tangent Classifier
Optimization for Training Deep Models: Learning vs Pure Optimization, Challenges in Neural Network Optimization, Basic Algorithms, Parameter Initialization Strategies, Algorithms with Adaptive Learning Rates
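As a small illustration of the Unit II topic "Dropout" (a NumPy sketch; the drop probability and array sizes are arbitrary examples), inverted dropout zeroes each activation with probability p at training time and rescales the survivors so the expected activation matches test time:

```python
import numpy as np

def dropout(h, p, rng):
    """Apply inverted dropout with drop probability p to activations h."""
    mask = (rng.random(h.shape) >= p).astype(h.dtype)  # keep with probability 1-p
    return h * mask / (1.0 - p)                        # rescale so E[output] == h

rng = np.random.default_rng(0)
h = np.ones((1000, 100))
out = dropout(h, p=0.5, rng=rng)
# Roughly half the activations are zeroed; survivors are scaled to 2.0,
# so the mean stays close to the original value of 1.0.
print(out.mean())
```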
UNIT - III
Convolutional Networks
The Convolution Operation, Motivation, Pooling, Convolution and Pooling as an Infinitely Strong Prior, Variants of the Basic Convolution Function, Structured Outputs, Data Types, Efficient Convolution Algorithms, Random or Unsupervised Features
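The Unit III topic "The Convolution Operation" can be sketched as a naive sliding dot product of a kernel over an image (a NumPy illustration; like most deep learning libraries, this actually computes cross-correlation, i.e. the kernel is not flipped):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation: slide the kernel over the image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1       # valid output height
    ow = image.shape[1] - kw + 1       # valid output width
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)  # each row increases by 1
edge = np.array([[1.0, -1.0]])                    # horizontal difference kernel
out = conv2d(image, edge)
print(out)  # every entry is -1.0, since each pixel is 1 less than its right neighbor
```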
UNIT - IV
Recurrent and Recursive Nets
Unfolding Computational Graphs, Recurrent Neural Networks, Bidirectional RNNs, Encoder-Decoder Sequence-to-Sequence Architectures, Deep Recurrent Networks, Recursive Neural Networks, The Challenge of Long-Term Dependencies, Echo State Networks, Leaky Units and Other Strategies for Multiple Time Scales, The Long Short-Term Memory and Other Gated RNNs, Optimization for Long-Term Dependencies, Explicit Memory
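The Unit IV topics "Unfolding Computational Graphs" and "Recurrent Neural Networks" amount to applying the same parameters at every time step while a hidden state carries information forward. A minimal NumPy sketch (the dimensions and random weights are illustrative, and no training is shown):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 5
W_xh = rng.normal(0.0, 0.5, (n_in, n_hidden))     # input-to-hidden weights
W_hh = rng.normal(0.0, 0.5, (n_hidden, n_hidden)) # hidden-to-hidden (recurrent) weights
b_h = np.zeros(n_hidden)

def rnn_forward(xs):
    """Unfold the RNN over a sequence xs of shape (T, n_in); return all hidden states."""
    h = np.zeros(n_hidden)
    hs = []
    for x in xs:                                   # the SAME weights are reused each step
        h = np.tanh(x @ W_xh + h @ W_hh + b_h)
        hs.append(h)
    return np.stack(hs)

xs = rng.normal(size=(7, n_in))                    # a sequence of 7 input vectors
hs = rnn_forward(xs)
print(hs.shape)                                    # (7, 5): one hidden state per time step
```

Training such a network by back-propagation through this unfolded graph is exactly where the "Challenge of Long-Term Dependencies" arises, motivating the gated architectures (LSTM) listed above.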
UNIT - V
Practical Methodology: Performance Metrics, Default Baseline Models, Determining Whether to Gather More Data, Selecting Hyperparameters, Debugging Strategies, Example: Multi-Digit Number Recognition
Applications: Large-Scale Deep Learning, Computer Vision, Speech Recognition, Natural Language
Processing, Other Applications.
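As a small example of the Unit V topic "Performance Metrics" (plain Python; the sample labels are made up for illustration), precision, recall, and F1 for a binary classifier can be computed directly from counts:

```python
def precision_recall_f1(y_true, y_pred):
    """Compute precision, recall, and F1 from binary labels and predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

y_true = [1, 1, 1, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0]
scores = precision_recall_f1(y_true, y_pred)
print(scores)  # precision 2/3, recall 2/3, F1 2/3
```

Accuracy alone can mislead on imbalanced data, which is why metrics like these matter when choosing default baselines and debugging models.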
TEXT BOOK:
1. Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press.
REFERENCE BOOKS:
1. The Elements of Statistical Learning, T. Hastie, R. Tibshirani, and J. Friedman, Springer.
2. Probabilistic Graphical Models, D. Koller and N. Friedman, MIT Press.
3. Pattern Recognition and Machine Learning, C. M. Bishop, Springer, 2006.
4. Artificial Neural Networks, B. Yegnanarayana, PHI Learning Pvt. Ltd., 2009.
5. Matrix Computations, G. H. Golub and C. F. Van Loan, JHU Press, 2013.
6. Neural Networks: A Classroom Approach, Satish Kumar, Tata McGraw-Hill Education, 2004.