DA672: Data Driven System Theory

Open to: B.Tech. (3rd and 4th Year)/M.Tech./Ph.D.

Course Content/ Syllabus:

Review of dynamical system properties along with differential equations, bifurcation and chaotic systems; Non-linear time series analysis; Review of machine learning methods; Data-driven modeling and dynamical systems (dynamic mode decomposition, Koopman operators, ODEs, PDEs); Construction of dynamical equations from time series data; State space reconstruction by machine learning methods, examples from model reduction (ODEs, PDEs), chaotic time series analysis.
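
As a minimal sketch of the data-driven modeling theme above: dynamic mode decomposition reduces, in the scalar case, to a least-squares fit of the linear operator mapping each snapshot to the next. The snapshot data here are synthetic, and the scalar setting is an illustrative simplification of the full matrix algorithm.

```python
# Scalar analogue of dynamic mode decomposition (DMD): given snapshots
# of x_{k+1} = a * x_k, recover the linear operator a by least squares,
# a_hat = (sum of x_{k+1} * x_k) / (sum of x_k^2).

def fit_linear_operator(snapshots):
    """Least-squares estimate of a in x_{k+1} ~ a * x_k."""
    num = sum(x1 * x0 for x0, x1 in zip(snapshots, snapshots[1:]))
    den = sum(x0 * x0 for x0 in snapshots[:-1])
    return num / den

# Generate noise-free data from x_{k+1} = 0.9 * x_k and recover a.
data = [1.0]
for _ in range(20):
    data.append(0.9 * data[-1])

a_hat = fit_linear_operator(data)
print(a_hat)  # recovers a value very close to 0.9
```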

Texts:

1. S.L. Brunton and J.N. Kutz, Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control, 1st Edition, Cambridge University Press, 2019.

References:

1. C. Novara and S. Formentin, Data-Driven Modeling, Filtering and Control: Methods and Applications, 1st Edition, Institution of Engineering and Technology, 2019.
2. S.H. Strogatz, Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering, 1st Edition, Perseus Books Publishing, 1994.
3. J.N. Kutz, B.W. Brunton, J.L. Proctor and S.L. Brunton, Dynamic Mode Decomposition: Data-Driven Modeling of Complex Systems, 1st Edition, Society for Industrial and Applied Mathematics, 2016.
4. G. Strang, Linear Algebra and Learning from Data, 1st Edition, Wellesley-Cambridge Press, 2019.

 

DA547: Introduction to Mathematical Biology

Open to: B.Tech.(3rd and 4th Year)/M.Sc./M.Tech./Ph.D.

Course Content/ Syllabus:

Review of linear systems of ODEs; Phase space analysis; Stability analysis of linear and non-linear systems; Concepts of bifurcation; Dynamical models in ecology: population dynamics, single species, interacting species; Dynamical models for diseases: infectious diseases, SIR epidemics, SIS endemics; Dynamical models in cell biology: gene regulation, cellular differentiation; Spatially structured models, tumor modeling, models for drug delivery; Stochastic processes in biology: Markov chains, birth and death processes, branching processes, Gillespie algorithm.
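
One of the dynamical disease models listed above, sketched numerically. The parameter values (beta, gamma) and the step size are illustrative assumptions, not course-prescribed values.

```python
# A minimal SIR epidemic model integrated with forward Euler:
# dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I.

def sir_euler(s0, i0, r0, beta, gamma, dt, steps):
    s, i, r = s0, i0, r0
    trajectory = [(s, i, r)]
    for _ in range(steps):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
        trajectory.append((s, i, r))
    return trajectory

traj = sir_euler(s0=0.99, i0=0.01, r0=0.0, beta=0.5, gamma=0.1, dt=0.1, steps=1000)
s_end, i_end, r_end = traj[-1]
print(s_end + i_end + r_end)  # the total population fraction is conserved
```

Because the three right-hand sides sum to zero, S + I + R stays constant along the trajectory, a useful sanity check on any implementation.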

Texts:

1. N.F. Britton, Essential Mathematical Biology, Springer Science & Business Media, 2012.
2. M. Kot, Elements of Mathematical Ecology, Cambridge University Press, 2001.
3. L.J.S. Allen, An Introduction to Stochastic Processes with Applications to Biology, 2nd Edition, CRC Press, 2011.

References:
1. W.E. Boyce, R.C. DiPrima and D.B. Meade, Elementary Differential Equations, John Wiley & Sons, 2017.
2. J.D. Murray, Mathematical Biology I: An Introduction, Interdisciplinary Applied Mathematics, 3rd Edition, Springer, 2007.
3. J.D. Murray, Mathematical Biology II: Spatial Models and Biomedical Applications, Interdisciplinary Applied Mathematics, 3rd Edition, Springer, 2003.

 

DA641: Nonlinear Regression

Open to: B.Tech. (4th Year)/M.Tech./Ph.D.

Course Content/ Syllabus:
Moving beyond linearity: Polynomial regression, step functions, basis functions, regression splines, smoothing splines, local regression, generalized additive models; A general nonlinear model, nonlinear least squares, local and global minima, contour plots, linear approximation, asymptotic theory of nonlinear least squares; Numerical methods, starting values, non-iterative algorithms: Grid search, random search; Iterative algorithms: Gauss-Newton, Newton-Raphson, genetic algorithms, simulated annealing; Generalized least squares estimators, maximum likelihood estimators, robust estimation: Least absolute deviation estimation, M-estimation; Bayesian estimation, examples of nonlinear regression models; Maximal margin classifier, support vector classifiers, support vector machines; Deep learning: Single layer neural networks, multilayer neural networks, convolutional neural networks, document classification, recurrent neural networks, fitting a neural network.
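
The Gauss-Newton iteration named above, sketched for a one-parameter model y = exp(theta * x). The data and the deliberately poor starting value are illustrative.

```python
import math

# Gauss-Newton for one-parameter nonlinear least squares: linearize the
# residual r_i(theta) = y_i - exp(theta * x_i) and take a Newton-like step
# theta <- theta - (J'J)^{-1} J'r, which is scalar here.

def gauss_newton(xs, ys, theta, iters=20):
    for _ in range(iters):
        residuals = [y - math.exp(theta * x) for x, y in zip(xs, ys)]
        # Jacobian of the residual vector with respect to theta.
        jac = [-x * math.exp(theta * x) for x in xs]
        step = sum(j * r for j, r in zip(jac, residuals)) / sum(j * j for j in jac)
        theta -= step
    return theta

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.exp(0.5 * x) for x in xs]          # noise-free data, true theta = 0.5
theta_hat = gauss_newton(xs, ys, theta=1.0)   # start far from the solution
print(theta_hat)  # converges to the true value 0.5
```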

 

Texts:
1. T. Hastie, R. Tibshirani and J. Friedman, The Elements of Statistical Learning, 2nd Edition, Springer, 2017.
2. D.M. Bates and D.G. Watts, Nonlinear Regression Analysis and its Applications, John Wiley & Sons, 2007.

References:
1. G. James, D. Witten, T. Hastie and R. Tibshirani, An Introduction to Statistical Learning, 2nd Edition, Springer, 2021.
2. A.R. Gallant, Nonlinear Statistical Models, John Wiley & Sons, 2009.
3. G.A.F. Seber and C.J. Wild, Nonlinear Regression, John Wiley & Sons, 2003.

 

DA621: Deep Learning for Computer Vision

Open to: B.Tech. (4th Year)/M.Tech./Ph.D.

Course Content/ Syllabus:

Deep learning review: perceptron, Multi-Layer Perceptron (MLP), backpropagation, PyTorch for deep learning; Convolutional Neural Networks (CNNs): introduction, evolution of architectures, AlexNet, VGG, Inception, ResNet; Understanding CNNs: visualizing filters, gradient with respect to input, style transfer; CNNs for vision: Recognition, detection, segmentation, optical flow, depth estimation; Recurrent Neural Networks (RNNs): review, CNN+RNN models for video understanding; Attention Models: vision and language, image captioning and Visual Question Answering (VQA), Transformer networks; Deep Generative models: review of generative models, Generative Adversarial Network (GAN), Variational Autoencoder (VAE), PixelRNN, image-to-image translation applications, inpainting, super-resolution; Recent trends: zero-shot, few-shot, self-supervised learning, meta-learning in vision.
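
A single-hidden-layer MLP forward pass written out by hand, as background for the deep learning review topic above. The weights are arbitrary illustrative values, not a trained network.

```python
# Forward pass of a 2-input, 2-hidden-unit, 1-output MLP with ReLU,
# computed explicitly as h = relu(W1 x + b1), y = W2 h + b2.

def relu(v):
    return [max(0.0, x) for x in v]

def linear(weights, bias, x):
    """weights is a list of rows; returns W @ x + b."""
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, bias)]

def mlp_forward(x):
    w1 = [[1.0, -1.0], [0.5, 0.5]]   # hidden layer: 2 units
    b1 = [0.0, -0.25]
    w2 = [[1.0, 2.0]]                # output layer: 1 unit
    b2 = [0.0]
    h = relu(linear(w1, b1, x))
    return linear(w2, b2, h)[0]

print(mlp_forward([1.0, 0.5]))  # 1.5 for these hand-picked weights
```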

Texts:

1. I. Goodfellow, Y. Bengio and A. Courville, Deep Learning, MIT Press, 2016.
 

References:
1. S.J.D. Prince, Computer Vision: Models, Learning, and Inference, 1st Edition, Cambridge University Press, 2012.
2. M. Nielsen, Neural Networks and Deep Learning, 2016.

 

DA622: Robustness and Interpretability in Machine Learning

Open to: B.Tech. (4th Year)/M.Tech./Ph.D.

Course Content/ Syllabus:

Review of machine learning and deep learning. Robustness: Introduction to adversarial examples, different types of adversarial attacks (FGSM, CW, PGD, universal); Empirical defenses; Different hypotheses surrounding adversarial examples; Backdoor attacks, data poisoning attacks; Recent advances and competitions/challenges. Interpretability: Understanding and evaluating interpretability; Accuracy-interpretability tradeoff in machine learning; Different types of interpretability approaches: rule-based, prototype-based, feature importance-based, post-hoc explanations, human-in-the-loop based; Advanced topics: relation to debugging and fairness in machine learning.
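
The FGSM attack listed above perturbs the input along the sign of the input gradient of the loss. For logistic regression that gradient has the closed form (sigma(w.x) - y) * w, which makes a self-contained sketch possible; the weights and epsilon here are illustrative.

```python
import math

# Fast gradient sign method (FGSM) on a toy logistic-regression
# classifier: x_adv = x + eps * sign(d loss / d x).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fgsm(x, y, w, eps):
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    grad = [(p - y) * wi for wi in w]          # input gradient of the loss
    sign = lambda g: (g > 0) - (g < 0)
    return [xi + eps * sign(g) for xi, g in zip(x, grad)]

w = [2.0, -1.0]
x = [1.0, 0.5]                  # clean score w.x = 1.5, i.e. confidently positive
x_adv = fgsm(x, y=1, w=w, eps=0.5)
score = sum(wi * xi for wi, xi in zip(w, x_adv))
print(score)  # the perturbation pushes the score toward the decision boundary
```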

Texts:

1. I. Goodfellow, Y. Bengio and A. Courville, Deep Learning, MIT Press, 2016.

References:

1. C. Bishop, Pattern Recognition and Machine Learning (Information Science and Statistics), 1st Edition, Springer, 2006.

 

DA771: Machine Learning in Quantum Physics

Open to: Ph.D.

Course Content/ Syllabus:

Fundamentals: Introduction to Material Modeling, Kernel Methods for Quantum Chemistry, and Introduction to Neural Networks; Incorporating Prior Knowledge: Invariances, Symmetries, Conservation Laws: Building Nonparametric n-Body Force Fields Using Gaussian Process Regression, Machine-Learning of Atomic-Scale Properties Based on Physical Principles, Accurate Molecular Dynamics Enabled by Efficient Physically Constrained Machine Learning Approaches, Quantum Machine Learning with Response Operators in Chemical Compound Space, Physical Extrapolation of Quantum Observables; Deep Learning of Atomistic Representations: Message Passing Neural Networks, Learning Representations of Molecules and Materials with Atomistic Neural Networks; Atomistic Simulations: High-Dimensional Neural Network Potentials for Atomistic Simulations, Construction of Machine Learned Force Fields with Quantum Chemical Accuracy: Applications and Chemical Insights, Active Learning and Uncertainty Estimation, Machine Learning for Molecular Dynamics on Long Timescales; Discovery and Design: Database-Driven High-Throughput Calculations and Machine Learning Models for Materials Design, Bayesian Optimization in Materials Science.
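
Gaussian process regression, which underlies the nonparametric force-field topic above, in a bare-bones one-dimensional sketch. The training data, RBF kernel, and hyperparameters are all illustrative assumptions.

```python
import math

# GP regression posterior mean: with kernel matrix K on the training
# inputs, solve K alpha = y, then predict m(x*) = sum_i alpha_i k(x_i, x*).

def rbf(a, b, length=1.0):
    return math.exp(-0.5 * (a - b) ** 2 / length ** 2)

def solve(mat, rhs):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(rhs)
    m = [row[:] + [r] for row, r in zip(mat, rhs)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def gp_predict(xs, ys, x_star, jitter=1e-10):
    k_mat = [[rbf(a, b) + (jitter if i == j else 0.0)
              for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(k_mat, ys)
    return sum(a * rbf(x, x_star) for a, x in zip(alpha, xs))

xs = [0.0, 1.0, 2.0]
ys = [math.sin(x) for x in xs]
print(gp_predict(xs, ys, 1.0))  # noise-free GP interpolates: ~ sin(1)
```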

Texts:

1. K.T. Schütt, S. Chmiela, O.A. von Lilienfeld, A. Tkatchenko, K. Tsuda and K.-R. Müller (Editors), Machine Learning Meets Quantum Physics, Lecture Notes in Physics, 1st Edition, Springer, 2020.

 

DA642: Time Series Analysis

Open to: B.Tech. (4th Year)/M.Tech./Ph.D.

Course Content/ Syllabus:

Time series and their characteristics; Linear time series analysis and its applications: MA, AR, ARMA, ARIMA; Spectral analysis: population spectrum, sample periodogram, use of spectral analysis; Estimation and order detection in time series; Conditional heteroscedastic models: ARCH, GARCH; Nonlinear models and their applications: TAR, STAR; High frequency data analysis and market microstructure: Models for price change and durations; Multivariate time series and their applications: VAR, cointegrated VAR models; State space models and Kalman filter.
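
The AR estimation topic above in miniature: conditional least squares for an AR(1) model on simulated data. The true coefficient, noise level, and sample size are simulation choices.

```python
import random

# Fit x_t = phi * x_{t-1} + eps_t by regressing x_t on x_{t-1}
# (the conditional least-squares / conditional MLE estimator).

def simulate_ar1(phi, n, seed=0):
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, 1.0))
    return x

def fit_ar1(x):
    num = sum(b * a for a, b in zip(x, x[1:]))
    den = sum(a * a for a in x[:-1])
    return num / den

series = simulate_ar1(phi=0.7, n=2000)
phi_hat = fit_ar1(series)
print(phi_hat)  # close to the true value 0.7
```

The estimator's standard error is roughly sqrt((1 - phi^2)/n), so with n = 2000 the estimate should land within a few hundredths of the truth.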

Texts:

1. R.S. Tsay, Analysis of Financial Time Series, 2nd Edition, Wiley-Interscience, 2005.
2. R.H. Shumway and D.S. Stoffer, Time Series Analysis and Its Applications: With R Examples, 2nd Edition, Springer, 2006.

References:

1. C. Chatfield, The Analysis of Time Series: An Introduction, 6th Edition, CRC Press, 2016.
2. G.E.P. Box, G. M. Jenkins and G.C. Reinsel, Time Series Analysis, 4th Edition, John Wiley & Sons, 2008.
3. J.D. Hamilton, Time Series Analysis, Princeton University Press, 2020.

 

DA671: Introduction to Reinforcement Learning

Open to: Ph.D./M.Tech./B.Tech.

Course Content/ Syllabus:
Introduction to Reinforcement Learning (RL); Markov Chains: Discrete-time Markov chains, Stationary Distribution, Continuous-time Markov Chains; Markov Decision Process (MDP): Terminologies & Fundamentals; Finite Horizon MDP: Dynamic Programming (DP), Bellman Equation; Infinite Horizon Discounted Cost Problems: DP Equation, Fixed Point & Contraction Mapping, Value Iteration (VI), Policy Iteration (PI), Linear Programming formulation; RL Techniques: Multi-armed Bandit, Exploration & Exploitation, Optimism in the Face of Uncertainty, Upper Confidence Bound, Thompson Sampling; Monte-Carlo Methods: First-visit and Every-visit, Monte-Carlo Control, Off-policy Monte-Carlo Control; Temporal Difference Learning: TD Prediction, Optimality of TD(0), SARSA, Q-learning, n-step TD Prediction, TD(lambda) algorithm, Linear Function Approximation, Linear TD(lambda); Policy Search Methods: Policy Gradient (PG) Theorem, REINFORCE, Variance Reduction in PG Methods, Actor-critic Methods, Batch RL.
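
Value iteration, as listed above, on a made-up two-state, two-action MDP with discount 0.9 (the rewards and transitions are purely illustrative).

```python
# Value iteration: repeatedly apply the Bellman optimality operator
# v(s) <- max_a [ R(s,a) + gamma * v(next(s,a)) ] until it converges.
# P[s][a] = deterministic next state, R[s][a] = immediate reward.

P = {0: {0: 0, 1: 1}, 1: {0: 0, 1: 1}}
R = {0: {0: 0.0, 1: 5.0}, 1: {0: 1.0, 1: 2.0}}
GAMMA = 0.9

def value_iteration(tol=1e-10):
    v = {0: 0.0, 1: 0.0}
    while True:
        v_new = {s: max(R[s][a] + GAMMA * v[P[s][a]] for a in (0, 1))
                 for s in (0, 1)}
        if max(abs(v_new[s] - v[s]) for s in v) < tol:
            return v_new
        v = v_new

v_star = value_iteration()
print(v_star)  # fixed point of the Bellman equation
```

For this MDP the optimal policy cycles between the states (action 1 in state 0, action 0 in state 1), giving V*(0) = 5.9/0.19 from solving the two Bellman equations by hand.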

Texts:

  1. M.L. Puterman, Markov Decision Processes: Discrete Stochastic Dynamic Programming, 1st Edition, Wiley, 2005.

  2. R.S. Sutton and A.G. Barto, Reinforcement Learning: An introduction, 2nd Edition, MIT Press, 2018.

References:

  1. D.P. Bertsekas, Dynamic Programming and Optimal Control, Vol. I, 3rd Edition, Athena Scientific, 2005.

  2. D.P. Bertsekas, Dynamic Programming and Optimal Control, Vol. II: Approximate Dynamic Programming, 4th Edition, Athena Scientific, 2012.

  3. T. Lattimore and C. Szepesvári, Bandit Algorithms, 1st Edition, Cambridge University Press, 2020.

  4. M. Harchol-Balter, Performance Modeling and Design of Computer Systems: Queueing Theory in Action, 1st Edition, Cambridge University Press, 2013.

 

DA526: Image Processing with Machine Learning

Open to: Ph.D. only

Course Content/ Syllabus:
Fundamentals of machine learning: dataset generation, augmentation, standardization, train/validation/test set preparation, cross-validation, model training and evaluation; Supervised and unsupervised learning, regression and classification, artificial neural networks, deep architectures. Image processing with Machine Learning: Introduction to image processing; Machine learning workflow for image processing; Introduction to software tools for image processing and machine learning; Elements of visual perception, imaging geometry; Image acquisition: depth of field, auto exposure, high dynamic range imaging; Image processing in spatial and frequency domains, super-resolution; Image restoration: deblurring, dehazing, inpainting; Image segmentation: semantic segmentation; Color image processing, pseudo coloring; Image representation and image descriptors; Image recognition: localization and classification; Machine learning in video processing.
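
A 3x3 mean (box) filter, an example of the spatial-domain processing listed above; the image values are arbitrary.

```python
# Smooth the interior of a small grayscale image by replacing each
# pixel with the mean of its 3x3 neighbourhood (borders left unchanged).

def mean_filter(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = sum(img[i + di][j + dj]
                            for di in (-1, 0, 1)
                            for dj in (-1, 0, 1)) / 9.0
    return out

img = [[0, 0, 0, 0],
       [0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
smoothed = mean_filter(img)
print(smoothed[1][1])  # 36 / 9 = 4.0
```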

Texts:

  1. R.C. Gonzalez and R.E. Woods, Digital Image Processing, 3rd Edition, Pearson Education, 2018.

  2. E. Alpaydin, Introduction to Machine Learning, 4th Edition, MIT Press, 2020.

References:

  1. A.K. Jain, Fundamentals of Digital Image Processing, 1st Edition, Pearson Education, 2015.

  2. C. Bishop, Pattern Recognition and Machine Learning (Information Science and Statistics), 1st Edition, Springer, 2006.

  3. I. Goodfellow, Y. Bengio and A. Courville, Deep Learning, MIT Press, 2016.

  4. R.O. Duda, P.E. Hart and D.G. Stork, Pattern Classification, 2nd Edition, John Wiley & Sons, 2007.

 

DA546: Introduction to Statistical Learning

Open to: Ph.D. only

Course Content/ Syllabus:

Correlation: the scatter diagram, the correlation coefficient, potential problems with the correlation coefficient: outliers, non-linear association, association and causation. Simple and multiple linear regression: least squares fitting, computational issues in fitting the model, hypothesis testing on the coefficients, interval estimation, prediction of new observations, assessing the accuracy of the model, detection of outliers, multicollinearity, variable selection, predictions. Classification problems: an overview, logistic regression, Bayes classifier, linear discriminant analysis, quadratic discriminant analysis, naive Bayes, k-nearest neighbours, support vector machines. Resampling methods for model assessment: testing and training data, cross validation, leave-one-out cross validation, k-fold cross validation, bias-variance trade-off; bootstrapping, jackknife. Tree-based methods for regression and classification: an overview, basics of decision trees, regression trees, classification trees.
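
The k-nearest-neighbour classifier from the syllabus, written from scratch on a toy 2-D dataset (the points and labels are illustrative).

```python
from collections import Counter

# k-NN: label a query point by a majority vote among the k training
# points closest to it (squared Euclidean distance; no square root needed
# since it preserves the ordering).

def knn_predict(train, labels, query, k=3):
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(p, query)), y)
        for p, y in zip(train, labels))
    top = [y for _, y in dists[:k]]
    return Counter(top).most_common(1)[0][0]

train = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (5.0, 5.0), (5.0, 6.0), (6.0, 5.0)]
labels = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(train, labels, (0.5, 0.5)))  # "a"
print(knn_predict(train, labels, (5.5, 5.5)))  # "b"
```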

Texts:

  1. G. James, D. Witten, T. Hastie and R. Tibshirani, An Introduction to Statistical Learning, 2nd Edition, Springer, 2021.

  2. T. Hastie, R. Tibshirani and J. Friedman, The Elements of Statistical Learning, 2nd Edition (corrected 9th printing), Springer, 2017.

References:

  1. D.C. Montgomery, E.A. Peck and G.G. Vining, Introduction to Linear Regression Analysis, 6th Edition, Wiley, 2021.

  2. D. Freedman, R. Pisani, R. Purves and A. Adhikari, Statistics, 4th Edition, Viva Books, 2011.

 

DA673: Neural Data Analysis

Open to: Ph.D./M.Tech./B.Tech.

Course Content/ Syllabus:
Peripheral sensory processing, hearing and vision, sensory transduction, spiking neurons, cortical organization; Leaky integrate-and-fire (LIF) neuron, McCulloch-Pitts model, perceptron, artificial neural networks; Model types; Neural code, information theory, efficient coding hypotheses; Neuroimaging via EEG, physiological basis of EEG, EEG measurement; Preprocessing EEG, signal and noise, spectral analysis, artifact identification, bad channel identification, application of ICA, Kalman filtering; Analyzing EEG signals, event-related potentials (ERPs), filtering, Hilbert transform, topographic maps, time-frequency analysis; Classification, single-trial classification, common spatial patterns (CSP), stimulus reconstruction approaches; Other neuroimaging modalities, reaction times, pupillometry, MEG, fMRI.
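
The event-related potential (ERP) analysis listed above rests on trial averaging: the stimulus-locked component survives while zero-mean noise shrinks like 1/sqrt(n_trials). A sketch with a synthetic evoked response (signal shape and noise level are illustrative):

```python
import math
import random

# Average many noisy, stimulus-locked trials to recover the evoked
# response buried in per-trial noise.

def make_trial(rng, n_samples=100):
    # A fixed "evoked response" (one sine cycle) plus Gaussian noise.
    return [math.sin(2 * math.pi * t / n_samples) + rng.gauss(0.0, 1.0)
            for t in range(n_samples)]

def average_trials(trials):
    n = len(trials)
    return [sum(tr[t] for tr in trials) / n for t in range(len(trials[0]))]

rng = random.Random(42)
trials = [make_trial(rng) for _ in range(400)]
erp = average_trials(trials)

# RMS error versus the noise-free response drops to about 1/sqrt(400) = 0.05.
err = math.sqrt(sum((erp[t] - math.sin(2 * math.pi * t / 100)) ** 2
                    for t in range(100)) / 100)
print(err)
```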

Texts:

    1. M.X. Cohen, Analyzing Neural Time Series Data: Theory and Practice, The MIT Press, 2014.

    2. C. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.

References:

    1. G. Buzsaki, Rhythms of the Brain. Oxford University Press, 2006.

    2. A.V. Oppenheim, R. Schafer, Discrete-time Signal Processing, Pearson, 2015.

    3. G. Lindsay, Models of the Mind: How Physics, Engineering and Mathematics Have Shaped Our Understanding of the Brain, Bloomsbury Publishing, 2021.

 

DA623: Computing with Signals

Open to: Ph.D./M.Tech./B.Tech.

Course Content/ Syllabus:
Introduction to signals, sensory signals, atmospheric signals, chemical signals; Sensory processing; Signal representations, continuous and discrete-time, basis functions, Fourier series, Taylor series, linear algebra fundamentals, LTI systems, convolution, sampling, DFT, DCT, time-frequency analysis; Dimensionality reduction, PCA, dictionary learning, compressive sensing; Spectral estimation; Filtering and filtering artifacts; Kalman filtering; Modelling, model fitting, bias-variance trade-off, regularization, cross-validation; Generative models, Gaussians, GMM, HMM, maximum likelihood, expectation-maximization, Bayesian estimation; Classifiers, neural networks; Applications in audio, image, EEG, weather monitoring.
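
A direct DFT used to locate the dominant frequency of a sampled sinusoid, tying together the sampling and DFT topics above (the signal parameters are illustrative).

```python
import cmath
import math

# Direct (O(N^2)) DFT: X[k] = sum_t x[t] * exp(-2*pi*j*k*t/N).

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

N = 64
signal = [math.cos(2 * math.pi * 5 * t / N) for t in range(N)]  # 5 cycles
spectrum = dft(signal)

# Search only the first half: a real signal has a conjugate-symmetric DFT.
peak = max(range(N // 2), key=lambda k: abs(spectrum[k]))
print(peak)  # 5
```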

Texts:

   1. M. Vetterli, J. Kovacevic, V. K. Goyal, Foundations of Signal Processing, Cambridge University Press, 2014.

   2. C. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.

References:

   1. A.V. Oppenheim, R. Schafer, Discrete-time Signal Processing, Pearson, 2015.