Instructor:  

Shayan Srinivasa Garani

Lectures: Thursday and Friday, 6:00 pm – 7:30 pm, LH9


Course Syllabus: 

  • Introduction to neural networks: neuron models motivated by biology, feedback and other network architectures, knowledge representation and artificial intelligence.
  • Learning processes: memory-based learning, error-correction learning, Hebbian learning, competitive learning, Boltzmann learning, supervised and unsupervised learning methods, memory and adaptation, statistical nature of the learning process.
  • Multilayer perceptrons: the perceptron, the perceptron convergence theorem, the backpropagation algorithm and applications, the XOR problem, function approximation and the curse of dimensionality (a minimal perceptron sketch follows this list).
  • Radial basis function networks: Cover's theorem on the separability of patterns, regularization theory and networks, approximation properties of RBF networks, kernel regression and learning strategies, applications.
  • Support vector machines: the optimal hyperplane for linearly separable and nonseparable patterns, SVMs for pattern recognition, the epsilon-insensitive loss function, SVMs for nonlinear regression, applications.
  • Committee machines: the associative Gaussian mixture model, the hierarchical mixture of experts (HME) model, the EM algorithm, application of the EM algorithm to HME (see the EM sketch after this list).
  • Principal component analysis: the eigenstructure of PCA, the Hebbian-based maximum eigenfilter, Hebbian-based PCA, adaptive PCA using lateral inhibition (APEX), PCA based on neural networks (reestimation and decorrelating algorithms), kernel PCA, applications (see the Oja's-rule sketch below).
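
To give a flavor of the multilayer perceptron unit, here is a minimal sketch of the perceptron learning rule on the linearly separable AND problem. This is illustrative code, not course material; the data, learning rate, and epoch count are my own choices (Python/NumPy):

    import numpy as np

    # AND problem: four points, linearly separable, targets in {-1, +1}
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([-1, -1, -1, 1])

    w = np.zeros(2)  # weight vector
    b = 0.0          # bias
    eta = 1.0        # learning rate (illustrative choice)

    for epoch in range(10):
        for xi, ti in zip(X, y):
            if ti * (np.dot(w, xi) + b) <= 0:  # misclassified point
                w += eta * ti * xi             # perceptron update rule
                b += eta * ti

    print(w, b)  # parameters of a separating hyperplane for AND

By the perceptron convergence theorem, this loop terminates with a separating hyperplane whenever the data are linearly separable; on XOR targets it never converges, which is precisely what motivates multilayer networks and backpropagation.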
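In the same spirit, a sketch of the EM algorithm for a two-component, one-dimensional Gaussian mixture; the synthetic data, initialization, and iteration count are assumptions:

    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic mixture: 300 points from N(-2, 1), 200 from N(3, 1)
    data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 200)])

    mix, mu, var = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
    for _ in range(50):
        # E-step: responsibility of component 1 for each point
        p0 = (1 - mix) * np.exp(-(data - mu[0])**2 / (2 * var[0])) / np.sqrt(2 * np.pi * var[0])
        p1 = mix * np.exp(-(data - mu[1])**2 / (2 * var[1])) / np.sqrt(2 * np.pi * var[1])
        r = p1 / (p0 + p1)
        # M-step: reestimate mixing weight, means, and variances
        mix = r.mean()
        mu = np.array([((1 - r) * data).sum() / (1 - r).sum(),
                       (r * data).sum() / r.sum()])
        var = np.array([((1 - r) * (data - mu[0])**2).sum() / (1 - r).sum(),
                        (r * (data - mu[1])**2).sum() / r.sum()])

    print(mix, mu, var)  # should recover roughly 0.4, (-2, 3), (1, 1)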
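Finally, a sketch of Oja's rule, the Hebbian update with built-in weight decay that converges to the first principal component (the Hebbian-based maximum eigenfilter); the data, step size, and single training pass are assumptions:

    import numpy as np

    rng = np.random.default_rng(0)
    # Zero-mean, correlated 2-D data
    X = rng.normal(size=(1000, 2)) @ np.array([[2.0, 0.0], [1.0, 0.5]])
    X -= X.mean(axis=0)

    w = rng.normal(size=2)  # random initial weight vector
    eta = 0.01              # step size (illustrative choice)
    for x in X:
        y = w @ x
        w += eta * y * (x - y * w)  # Hebbian term y*x minus decay y^2*w

    w /= np.linalg.norm(w)
    # Compare against the leading eigenvector of the sample covariance
    evals, evecs = np.linalg.eigh(np.cov(X.T))
    print(w, evecs[:, -1])  # should roughly agree, up to sign

The decay term is what keeps the weight vector bounded; plain Hebbian learning grows without limit, which is why Oja's modification matters for the PCA unit.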

Reference Books:

  • Neural Networks: A Comprehensive Foundation, Simon Haykin, Prentice Hall of India. (required)
  • Machine Learning: A Probabilistic Perspective, Kevin P. Murphy, MIT Press. (optional)
  • Class notes

Grading Policy:

Homeworks: 50%
Project: 25%
Final exam: 25%


Course Announcements:

  • Project report due date: 7th May 2017.
  • Project presentations and demos: 9th May 2017.
  • Final exam: 11th May 2017.