Lectures & Notes
Course to be administered via Google Classroom — ask us to join
Unconstrained optimization & line search methods

Introduction, Taylor's theorem, first- and second-order conditions for a stationary point — 10, 11 Aug — 5, 6

Properties of descent directions, MATLAB visualization of gradients — 16 Aug — 6

Line search algorithms, Wolfe conditions, backtracking algorithm — 17 Aug — 7
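The backtracking algorithm covered in this lecture can be sketched as follows. This is a minimal illustrative Python version (the course material itself uses MATLAB); the function names, default parameters, and the quadratic example are this sketch's own choices, not taken from the lecture notes.

```python
import numpy as np

def backtracking(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Backtracking line search: shrink alpha by a factor rho until the
    Armijo (sufficient decrease) condition holds:
        f(x + alpha*p) <= f(x) + c * alpha * grad_f(x)^T p
    p is assumed to be a descent direction (grad_f(x)^T p < 0)."""
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ p  # directional derivative; negative for a descent direction
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

# Example: quadratic f(x) = x^T x, steepest-descent direction p = -grad f(x)
f = lambda x: x @ x
g = lambda x: 2 * x
x = np.array([1.0, -2.0])
alpha = backtracking(f, g, x, -g(x))
```

Backtracking enforces only the sufficient-decrease (Armijo) half of the Wolfe conditions; the curvature condition is satisfied implicitly because alpha is never made too small.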

Line search analysis, convergence — 23 Aug — 8
Notes: Unconstrained optimization and Line search methods
Newton and quasi-Newton methods, and least squares problems
Notes: Newton & quasi-Newton methods and least squares problems
Constrained optimization

Terminology, feasible set, active set — 12 Oct — 18b

Equality constrained optimization — 16 Oct — 19

Inequality constrained optimization, linearized feasible directions — 18 Oct — 20

Constraint qualification and first-order necessary conditions (KKT) — 19 Oct — 21

Proof sketch of KKT conditions — 25 Oct — 22

Projected gradient descent algorithm — 26 Oct — 23, extra ref
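The projected gradient descent iteration from this lecture can be written compactly as x_{k+1} = P_C(x_k − η ∇f(x_k)). Below is a minimal Python sketch (names, step size, and the box-constrained example are this sketch's assumptions, not the lecture's):

```python
import numpy as np

def projected_gradient_descent(grad_f, proj, x0, step=0.1, iters=200):
    """Iterate x <- P_C(x - step * grad_f(x)), where proj implements
    the Euclidean projection P_C onto the convex feasible set C."""
    x = x0
    for _ in range(iters):
        x = proj(x - step * grad_f(x))
    return x

# Example: minimize ||x - a||^2 over the box C = [0, 1]^2,
# whose projection is simply coordinate-wise clipping.
a = np.array([2.0, -0.5])
grad = lambda x: 2 * (x - a)
proj = lambda x: np.clip(x, 0.0, 1.0)
x_star = projected_gradient_descent(grad, proj, np.array([0.5, 0.5]))
```

For this example the iterates converge to the projection of a onto the box, here (1, 0); for a fixed step the usual requirement is step < 1/L with L the Lipschitz constant of the gradient.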

Subgradients and the projection operator onto the L1 ball — 01 Nov — 24, extra ref1, ref2
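The projection onto the L1 ball discussed here reduces to soft-thresholding with a data-dependent threshold. A minimal Python sketch of the standard sort-based O(n log n) method (the function name and the radius parameter are this sketch's own; the lecture's references give the full derivation):

```python
import numpy as np

def project_l1_ball(v, radius=1.0):
    """Euclidean projection of v onto {x : ||x||_1 <= radius}.
    If v is already inside the ball, it is its own projection; otherwise
    soft-threshold |v| by theta, where theta is found by sorting."""
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # |v| sorted in decreasing order
    css = np.cumsum(u)
    # rho = largest index with u[rho]*(rho+1) > css[rho] - radius
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)
```

For example, projecting (3, 1) onto the unit L1 ball gives (1, 0): the threshold works out to theta = 2, which zeroes the smaller coordinate.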

KKT and duality, geometric interpretation (ref, author unknown) — 02 Nov — 25

Properties of the Lagrangian dual function, solving an example — 06 Nov — 26
Notes: Constrained optimization — first-order methods
Course structure

Evaluations — quiz 1 (20%), quiz 2 (20%), project (25%), endsem (35%); all exams as per the Institute schedule.

First day of classes — 02 Aug

Tutorial 1 — 21 and 28 Aug 2023 during class

Quiz 1 — 30 Aug 2023 from 2:00–3:15 pm in ESB127/128. One A4-size cheat sheet allowed.

Tutorial 2 — 11 Sep 2023 from 5–6 pm in ESB242.

Tutorial 3 — 05 Oct 2023 from 3:30 pm in CRC, and 09 Oct from 5 pm in ESB242.

Quiz 2 — 11 Oct 2023 from 2:00–3:15 pm in ESB127. One A4-size cheat sheet allowed.

Tutorial 4 — 30 Nov from 5–6 pm in ESB242.

Endsem — 21 Nov from 2–5 pm, two-sided A4 cheat sheet allowed (reuse of older sheets not allowed).
TAs

Sai Dinesh ee20d401

Sai Sanjay Narayanan ep20b031

Anant Goyal ee21d202

Kunchakara Alekhya ee21d006

Aaditya Kumar ee21d411
Course Flyer
Prerequisite
Linear algebra
Broad course contents

Review: linear algebra, analysis, and calculus

Unconstrained optimization — descent directions, line search methods, Wolfe conditions, the steepest descent method and its analysis, the conjugate gradient method and its analysis, the preconditioned conjugate gradient method, extensions of the conjugate gradient method to its nonlinear versions, introduction to Newton and quasi-Newton methods, discussion of linear and nonlinear least squares problems

Constrained optimization — first-order necessary conditions, Karush–Kuhn–Tucker (KKT) conditions with proof, the projected gradient method, subdifferentials and some examples of projection operators, weak and strong duality, second-order necessary conditions (time permitting)

Programming implementations of the above methods