Lectures & Notes

Introduction and background
  1. Introduction to Optimization — 27 July

  2. Review of linear algebra & note on condition number — 28 July

  3. Review of analysis — 29 July

  4. Review of convexity and continuity — 02 Aug

  5. Review of calculus — 03 Aug, 04 Aug

Unconstrained optimization & Line Search methods
  1. Introduction, Taylor's theorem, first- and second-order conditions for a stationary point — 05 Aug

  2. Overview of line search methods, definition of descent directions (including steepest descent and Newton directions) — 12 Aug

  3. Matlab code showing gradients and function contours; trust region methods and line search considerations — 16 Aug

  4. Wolfe conditions for inexact line search and backtracking line search — 17 Aug

  5. Proof of convergence of descent methods — 18 Aug, 19 Aug
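
  A minimal sketch of the backtracking line search covered above, enforcing the sufficient-decrease (first Wolfe/Armijo) condition. Python/NumPy is used here for illustration (the course demos use Matlab); the quadratic example and all parameter values are illustrative choices, not from the lectures.

  ```python
  import numpy as np

  def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
      """Shrink the step until the sufficient-decrease condition holds:
      f(x + alpha*p) <= f(x) + c*alpha*grad_f(x).T @ p, with p a descent direction."""
      alpha = alpha0
      fx, gx = f(x), grad_f(x)
      while f(x + alpha * p) > fx + c * alpha * (gx @ p):
          alpha *= rho
      return alpha

  # Example: convex quadratic f(x) = 0.5 x'Ax, with the steepest-descent direction
  A = np.array([[2.0, 0.0], [0.0, 10.0]])
  f = lambda x: 0.5 * x @ A @ x
  grad_f = lambda x: A @ x
  x = np.array([1.0, 1.0])
  p = -grad_f(x)            # negative gradient is always a descent direction
  alpha = backtracking_line_search(f, grad_f, x, p)
  ```

  Since `p` is a descent direction, `gx @ p < 0`, so the loop terminates for a small enough step and the accepted `alpha` satisfies the sufficient-decrease inequality.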

Conjugate gradient method
  1. Introduction, visualizing quadratic forms, and property of conjugacy — 23 Aug

  2. Method of conjugate directions, and properties of conjugate vectors — 24 Aug

  3. Expanding subspace theorem for conjugate directions method — 25 Aug

  4. Formulation of the conjugate gradient method — 26 Aug

  5. Matlab demo of steepest descent vs. conjugate gradient (link) — 01 Sep

  6. Preconditioned CG method, code for playing around with condition number — 06 Sep, 07 Sep

  7. Nonlinear CG — 07 Sep, 13 Sep
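
  A compact sketch of the (linear) conjugate gradient method formulated in these lectures, for a symmetric positive definite system Ax = b. Python/NumPy is used for illustration (the course demos use Matlab); the 2x2 example system is an illustrative choice.

  ```python
  import numpy as np

  def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
      """Solve Ax = b for symmetric positive definite A.
      Successive directions p are A-conjugate: p_i' A p_j = 0 for i != j,
      so in exact arithmetic the method finishes in at most n steps."""
      n = len(b)
      x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
      r = b - A @ x              # residual = negative gradient of 0.5 x'Ax - b'x
      p = r.copy()
      rs_old = r @ r
      for _ in range(max_iter or n):
          Ap = A @ p
          alpha = rs_old / (p @ Ap)       # exact minimizer along p
          x += alpha * p
          r -= alpha * Ap
          rs_new = r @ r
          if np.sqrt(rs_new) < tol:
              break
          p = r + (rs_new / rs_old) * p   # new direction, conjugate to the old ones
          rs_old = rs_new
      return x

  A = np.array([[4.0, 1.0], [1.0, 3.0]])
  b = np.array([1.0, 2.0])
  x = conjugate_gradient(A, b)
  ```

  For this 2x2 system the method converges in two iterations, consistent with the expanding subspace theorem.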

Newton and Newton-like methods
  1. Introduction and proof of convergence — 14 Sep

  2. Hessian modification and Gershgorin disk theorem for eigenvalue bounds — 15 Sep

  3. Quasi-Newton methods, intuition, and the BFGS formulation — 16 Sep

  4. Least squares problems — 21 Sep

  5. Linear least squares — 22 Sep

  6. Nonlinear least squares, Gauss-Newton method — 23 Sep
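
  A minimal sketch of the plain Newton iteration from these lectures: solve H(x) p = -grad f(x) for the Newton direction and take a full step. Python/NumPy is used for illustration (the course demos use Matlab); the example function is an illustrative choice.

  ```python
  import numpy as np

  def newton_method(grad_f, hess_f, x0, tol=1e-10, max_iter=50):
      """Newton iteration: x <- x + p where H(x) p = -grad_f(x).
      Converges quadratically near a minimizer with positive definite Hessian."""
      x = np.asarray(x0, dtype=float)
      for _ in range(max_iter):
          g = grad_f(x)
          if np.linalg.norm(g) < tol:
              break
          p = np.linalg.solve(hess_f(x), -g)   # Newton direction
          x = x + p
      return x

  # Example: f(x) = x1^4 + x2^2, minimizer at the origin
  grad_f = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
  hess_f = lambda x: np.array([[12 * x[0]**2, 0.0], [0.0, 2.0]])
  x_star = newton_method(grad_f, hess_f, np.array([1.0, 1.0]))
  ```

  Note the quadratic coordinate is solved exactly in one step, while the quartic coordinate (singular Hessian at the minimizer) converges more slowly; this is one motivation for the Hessian modifications discussed above.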

Constrained Optimization
  1. Introduction, definitions and a look at equality constraints — 07 Oct

  2. Single equality and inequality constraints, Lagrangian — 11 Oct

  3. Linearized feasibility directions, example with multiple constraints — 12 Oct

  4. Geometry vs algebra — tangent cone and linear independence constraint qualification (ref) — 13 Oct

  5. First order necessary conditions for a minimum & the KKT conditions — 14 Oct

  6. Projected gradient method (ref1, ref2, showing descent of iterates) — 19 Oct

  7. Examples of projection operations (L2 ball), concept of subgradients (ref) — 20 Oct

  8. Projection onto the L1 ball (ref) — 21 Oct

  9. KKT and duality — 26 Oct

  10. Geometric interpretation of duality (ref, author unknown) — 27 Oct

  11. Weak duality and an example of applying KKT/duality — 28 Oct
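
  A minimal sketch of the projected gradient method with the L2-ball projection covered above. Python/NumPy is used for illustration (the course demos use Matlab); the objective and step size are illustrative choices.

  ```python
  import numpy as np

  def project_l2_ball(y, radius=1.0):
      """Euclidean projection onto {x : ||x||_2 <= radius}: rescale if outside."""
      norm = np.linalg.norm(y)
      return y if norm <= radius else (radius / norm) * y

  def projected_gradient(grad_f, project, x0, step=0.1, n_iter=200):
      """Iterate x_{k+1} = P(x_k - step * grad_f(x_k)),
      where P is the projection onto the feasible set."""
      x = np.asarray(x0, dtype=float)
      for _ in range(n_iter):
          x = project(x - step * grad_f(x))
      return x

  # Example: minimize ||x - c||^2 over the unit L2 ball, with c outside the ball;
  # the minimizer is the projection of c onto the ball, i.e. c / ||c||.
  c = np.array([3.0, 4.0])
  grad_f = lambda x: 2 * (x - c)
  x_star = projected_gradient(grad_f, project_l2_ball, np.zeros(2))
  ```

  Here c = (3, 4) has norm 5, so the iterates converge to c/5 = (0.6, 0.8), the projection of c onto the unit ball.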

Notes
  1. Introduction to Optimization

  2. Summary of background material

  3. Unconstrained optimization

  4. Line search methods

  5. Conjugate gradient methods

  6. Newton & quasi Newton methods

  7. Least squares problems

  8. Constrained optimization — first order

Course eval & important dates
  • Tutorials:

    #   Practice     Test         Weightage   Mean      Stdev
    1   Wed 10 Aug   Thu 11 Aug   08%         10.1/15   2.9
    2   Tue 30 Aug   Fri 02 Sep   14%         8.8/15    2.2
    3   Tue 28 Sep   Fri 30 Sep   14%         10.8/15   2.5
    4   Tue 01 Nov   Fri 04 Nov   14%

  • Poster day, Fri 28 Oct or Sat 29 Oct — 20%

    1. Group formation date: Thu 15 Sep

    2. Title and 1-page abstract submission date: Tue 06 Oct (Sample template file — copy/paste into Overleaf)

    3. Poster submission date (Sample posters): Wed 28 Oct

  • Endsem, Mon 21 Nov — 30%

    TAs
    1. Chandan (ee16d209)

    2. Aravint (ee18b125)

    3. Srivatsan (ee18b033)

    4. Anirudh (ee18b103)

    5. Sayan (ee18b156)

Course Flyer

Prerequisite

Linear algebra

Broad course contents
  1. Review: linear algebra, analysis, and calculus

  2. Unconstrained optimization — descent directions, line search methods, Wolfe conditions, the steepest descent method and its analysis, the conjugate gradient method and its analysis, the preconditioned conjugate gradient method, extension of the conjugate gradient method to its nonlinear versions, introduction to Newton and quasi-Newton methods, discussion of linear and nonlinear least squares problems

  3. Constrained optimization — first order necessary conditions, Karush–Kuhn–Tucker (KKT) conditions with proof, the projected gradient method and some examples of projection operations, weak and strong duality, second order necessary conditions (time permitting)

  4. Programming implementations of the above methods

Reference books
  1. Numerical Optimization by Jorge Nocedal and Stephen J. Wright, Springer, 2006. Available online. Keyword: NW

  2. Convex Optimization by Stephen Boyd and Lieven Vandenberghe, Cambridge University Press, 2004. Available online. Keyword: BL

Back to Home