Preamble

Over the last decade, the availability of cheap computation has allowed several neural network approaches to advance the state of the art on many visual recognition problems, such as image search, image understanding, medical applications, and autonomous vehicles including drones and self-driving cars. All of these problems rely on efficient, accurate and robust solutions to basic vision tasks such as image classification, localization and detection. This course exposes students to the details of neural networks and deep learning architectures, and to the development of end-to-end models for such tasks. Students will learn to implement, train and debug their own neural networks. This is a project-oriented practical course in which every student has to develop a complete working model that solves a real-world problem.


Course Content

Basics of artificial neural networks (ANN):

  1. Artificial neurons, Computational models of neurons, Structure of neural networks. [2 Hrs]
  2. Functional units of ANN for pattern recognition tasks. [2 Hrs]
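The computational model of a single neuron listed above can be sketched in a few lines: a weighted sum of inputs plus a bias, passed through an activation function. This is an illustrative sketch (the function name and the choice of sigmoid activation are assumptions, not part of the syllabus):

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus bias,
    passed through a sigmoid activation (one common choice)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# With zero weights and zero bias, sigmoid(0) = 0.5.
print(neuron([1.0, 2.0], [0.0, 0.0], 0.0))  # → 0.5
```

A network is then just layers of such units, with the weights and biases learned from data.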

Feedforward neural networks:

  1. Pattern classification using perceptron, Multilayer feedforward neural networks (MLFFNNs), Backpropagation learning, Empirical risk minimization, Regularization, Autoencoders [6 Hrs]
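As a taste of the perceptron topic above, the classic Rosenblatt learning rule can be written in a few lines. This is a hedged sketch for a two-class problem with labels in {0, 1}; the function name and hyperparameters are illustrative:

```python
def perceptron_train(samples, labels, lr=0.1, epochs=20):
    """Rosenblatt perceptron learning rule: for each misclassified
    sample, nudge the weights toward (or away from) that sample."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # 0 when correct, +/-1 when wrong
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b
```

On a linearly separable problem such as logical AND, this converges to a separating hyperplane; MLFFNNs with backpropagation extend the idea to non-separable problems.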

Deep neural networks (DNNs):

  1. Difficulty of training DNNs, Greedy layerwise training, Optimization for training DNNs, Newer optimization methods for neural networks (AdaGrad, RMSProp, Adam), Second order methods for training, Regularization methods (dropout, drop connect, batch normalization) [12 Hrs]
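Of the optimizers named above, Adam is a representative example: it keeps running estimates of the first and second moments of the gradient and applies a bias correction to each. A minimal sketch for a single scalar parameter (the function name and the toy objective are assumptions for illustration; the update follows the standard Adam formulas):

```python
import math

def adam_step(w, g, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: m, v are running first/second moment
    estimates of the gradient g; t is the 1-based step count."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)  # bias correction
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# Minimize f(w) = w^2 (gradient 2w), starting from w = 1.0.
w, m, v = 1.0, 0.0, 0.0
for t in range(1, 501):
    w, m, v = adam_step(w, 2 * w, m, v, t)
```

AdaGrad and RMSProp can be seen as special cases that drop the first-moment estimate and change how the second moment is accumulated.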

Convolutional neural networks (CNNs):

  1. Introduction to CNNs - convolution, pooling, Deep CNNs, Different deep CNN architectures - LeNet, AlexNet, VGG, PlacesNet, Training a CNN: weight initialization, batch normalization, hyperparameter optimization, Understanding and visualizing CNNs. [12 Hrs]
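The two building blocks named first above, convolution and pooling, can be sketched directly on nested lists. A hedged illustration (function names are assumptions; as in most deep learning frameworks, "convolution" here is really cross-correlation, i.e. the kernel is not flipped):

```python
def conv2d(image, kernel):
    """'Valid' 2-D convolution: slide the kernel over the image and
    sum the elementwise products at each position."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(iw - kw + 1)]
            for i in range(ih - kh + 1)]

def max_pool2x2(fmap):
    """2x2 max pooling with stride 2: keep the largest value in each window."""
    return [[max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]
```

A deep CNN such as LeNet or AlexNet stacks many such convolution/pooling stages, with learned kernels, before a final classifier.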

Recurrent neural networks (RNNs):

  1. Sequence modeling using RNNs, Backpropagation through time, Long Short Term Memory (LSTM), Bidirectional RNNs, Bidirectional LSTMs, Gated RNN architectures [8 Hrs]
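The core recurrence that backpropagation through time differentiates is one line of math: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b). A minimal sketch of that single step, using plain lists for vectors and matrices (names and layout are illustrative, not an efficient implementation):

```python
import math

def rnn_step(x, h, W_xh, W_hh, b):
    """One step of a vanilla RNN: h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b).
    W_xh is hidden x input, W_hh is hidden x hidden; h is the previous state."""
    n = len(h)
    return [math.tanh(sum(W_xh[i][j] * x[j] for j in range(len(x)))
                      + sum(W_hh[i][j] * h[j] for j in range(n))
                      + b[i])
            for i in range(n)]
```

LSTMs and gated architectures replace this single tanh update with several gated updates, which is what mitigates vanishing gradients over long sequences.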

Generative models:

  1. Restricted Boltzmann Machines (RBMs), Stacking RBMs, Belief nets, Learning sigmoid belief nets, Deep belief nets [8 Hrs]
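The central operation when training or stacking RBMs is the block Gibbs step: given the visible units, each hidden unit is on with probability sigmoid(sum_i W[i][j] v_i + b_h[j]), independently of the others. A hedged sketch of that step (function names and the list-based layout are assumptions for illustration):

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sample_hidden(v, W, b_h, rng):
    """Block Gibbs step of an RBM: compute p(h_j = 1 | v) for each
    hidden unit, then sample each unit independently."""
    probs = [sigmoid(sum(W[i][j] * v[i] for i in range(len(v))) + b_h[j])
             for j in range(len(b_h))]
    return [1 if rng.random() < p else 0 for p in probs], probs
```

The symmetric step samples the visible units given the hidden ones; alternating the two gives the Gibbs chain used in contrastive divergence, and stacking trained RBMs yields a deep belief net.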

Applications:

  1. Applications in vision, speech and natural language processing [6 Hrs]

References:

  1. S. Haykin, Neural Networks and Learning Machines, Prentice Hall of India, 2010

Textbooks

  1. Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning, in preparation for MIT Press; available online: http://www.deeplearningbook.org
  2. Satish Kumar, Neural Networks - A Class Room Approach, Second Edition, Tata McGraw-Hill, 2013
  3. B. Yegnanarayana, Artificial Neural Networks, Prentice-Hall of India, 1999
  4. C.M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006