CS590 DEEP LEARNING 3-0-0-6

Pre-requisites : NIL

Syllabus :
Machine Learning: Fundamentals; Neural Networks: Perceptrons, Backpropagation, Overfitting, Regularization. Deep Networks: Definition, Motivation, Applications; Principal Component Analysis; Restricted Boltzmann Machine; Sparse Autoencoder; Deep Belief Net; Hidden Markov Model. Convolutional Neural Network (CNN): Basic architecture, Activation functions, Pooling, Handling the vanishing gradient problem, Dropout, Greedy layer-wise pre-training, Weight initialization methods, Batch Normalization; Different CNN Models: AlexNet, VGGNet, GoogLeNet, ResNet, DenseNet, MIL, Highway Network, Fractal Network, Siamese Network; Graphical Models: Bayes Net, Variational Autoencoders. Sequence Learning: 1D CNN, Recurrent Neural Network (RNN), Gated RNN, Long Short-Term Memory (LSTM). Generative Modeling: Generative Adversarial Network (GAN). Zero-Shot Learning. Applications.

Texts :

References :
1. Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning, MIT Press, 2016
2. Michael A. Nielsen, Neural Networks and Deep Learning, Determination Press, 2015
3. Yoshua Bengio, Learning Deep Architectures for AI, now Publishers Inc., 2009