EE5120 Linear Algebra (July-Dec 2018), Instructor: Dr Uday Khankhoje
Lectures (J slot): Mo 4:50-5:40p, We 2-3:15p, Th 3:25-4:40p. All in ESB 128.
News
  1. If you have taken any linear algebra course previously (especially the Math dept version), you will automatically be dropped from the course.
  2. The mid-sem exam will be held on Wed 19 Sept, 2-4pm. Closed book; a one-sided A4 cheat sheet is allowed.
  3. The end-sem exam will be held on Tue 27 Nov, 1-4pm. Closed book; a two-sided A4 cheat sheet is allowed.
Tutorial dates:
  Tutorial #   Date     Quiz date   Quiz avg   Links
  1            20/8     27/8        5.5/10     sol, Q
  2            29/8     05/9        7.2/10     sol, Q
  3            10/9     12/9        6.4/10     sol, Q
  4            01/10    04/10       7.5/10     sol, Q
  5            15/10    18/10       7.3/10     sol, Q
  6            25/10    05/11       5/10       sol, Q
  7            8/11     12/11                  sol, Q

Resources
  1. Gilbert Strang's website.
  2. A popular YouTube channel for visualizations in linear algebra.
  3. On using the open source software SAGE to do linear algebra: tutorial.
Lecture Topics
  1. Introduction and solving a linear system of equations, Ax=b (Ch 1 of GS) [4 lectures]
    1. Geometric (row) and algebraic (column) picture of matrix equations. Lecture 1, 31 Jul
    2. Refresher of Gaussian elimination. Lecture 2, 06 Aug
    3. Gaussian elimination as matrix multiplications: LU decomposition (sample code; see also the Python sketch after this list), pivoting, round-off errors. Lecture 3, 09 Aug
    4. Pivoting, matrix inverse and transpose; finite-difference matrices: tridiagonal matrices and their LU decomposition. Lecture 4, 13 Aug
  2. Vector spaces (Ch 2 of GS) [7 lectures]
    1. Definitions of vector spaces and sub-spaces, column and null space of a matrix with examples. Lecture 5, 15 Aug
    2. Echelon and row-reduced echelon forms of a matrix; matrix rank and the dimensions of the column space and null space. Lectures 6,7, 16,23 Aug
    3. Span of a vector space, basis, dimension. Lecture 8, 27 Aug
    4. Four fundamental subspaces related to a matrix, Inverses of rectangular matrices. Lecture 9, 29 Aug
    5. Linear transformations. Why are matrix computations preferred? Discussion here. Lectures 10,11, 03,05 Sept
  3. Orthogonality (Ch 3 of GS) [6 lectures]
    1. Orthogonality of vectors and subspaces, the notion of the orthogonal complement of a subspace, and orthogonality relations between the four fundamental subspaces of a matrix. Lecture 12, 06 Sept
    2. Solutions to least-squares problems, and the connection to the pseudo-inverse (see the Python sketch after this list). Lecture 13, 12 Sept

    3. --- Mid Sem ---

    4. Projection onto a vector space as a matrix operation, projection onto a line. Minimum norm solution in the under-determined case, and connection to pseudo-inverse. Lecture 14, 20 Sept
    5. Orthogonal vectors and matrices. Lecture 15, 24 Sept
    6. Gram-Schmidt process of orthonormalization, QR decomposition of a matrix (see the Python sketch after this list). Lecture 16, 26 Sept
    7. Hilbert spaces, function spaces, and the concept of orthogonality in these spaces. Lecture 17, 27 Sept
  4. Special lecture on compressive sensing by Yash Sanghvi. Lecture 19, 03 Oct
  5. Determinants in brief (Ch 4 of GS): properties of determinants (sec 4.2 of GS); geometrical interpretation of determinants; determinant of the Jacobian. Lecture 20, 04 Oct
  6. Eigenvalues and eigenvectors (Ch 5 of GS) [6 lectures]
    1. Definition and a few properties of the matrix eigenvalue problem. Lecture 21, 08 Oct
    2. Algebraic and geometric multiplicity of an eigenvalue, some properties; proof regarding multiplicity. Lecture 22, 10 Oct
    3. Diagonalization of a matrix (also called its eigendecomposition); its use to compute powers of a matrix (see the Python sketch after this list). Lecture 23, 11 Oct
    4. Powers of a matrix: application based on Fibonacci numbers, Hermitian matrices and their properties, spectral theorem, unitary matrices. Lecture 24, 17 Oct
    5. Change of basis and similarity transforms. Lecture 25, 18 Oct
    6. Schur decomposition of a matrix (instructor notes, extra notes). Lecture 25, 22 Oct
  7. Positive definite matrices and the SVD (Ch 6 of GS) [4 lectures]
    1. The idea of optimization, quadratic forms, definition of and tests for positive definite matrices, geometric interpretations (see the Python sketch after this list). Lecture 26, 24 Oct
    2. Proof of the singular value decomposition. Lecture 27, 29 Oct
    3. Properties of the singular value decomposition (resources) and applications to image compression (sample code, image, output; see also the Python sketch after this list). Lecture 28, 31 Oct
    4. SVD and matrix computations: pseudo-inverses, condition number, regularization (truncated SVD and Tikhonov). Lecture 29, 1 Nov
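The short Python/NumPy sketches that follow illustrate a few of the topics above. They are rough illustrations written for this page, with made-up test matrices; they are not the sample code linked in the list.

LU decomposition with partial pivoting (Lectures 3-4): a minimal sketch that records Gaussian elimination as the factorization PA = LU, swapping rows so that each pivot is the largest available entry.

  import numpy as np

  def lu_partial_pivot(A):
      """Return P, L, U with P @ A = L @ U, using partial pivoting."""
      n = A.shape[0]
      U = A.astype(float).copy()
      L = np.eye(n)
      P = np.eye(n)
      for k in range(n - 1):
          # Bring the largest-magnitude entry of column k (rows k..n-1) to the pivot position.
          p = k + np.argmax(np.abs(U[k:, k]))
          if p != k:
              U[[k, p], :] = U[[p, k], :]
              P[[k, p], :] = P[[p, k], :]
              L[[k, p], :k] = L[[p, k], :k]
          # Eliminate below the pivot, storing the multipliers in L.
          for i in range(k + 1, n):
              L[i, k] = U[i, k] / U[k, k]
              U[i, k:] -= L[i, k] * U[k, k:]
      return P, L, U

  A = np.array([[2.0, 1.0, 1.0],
                [4.0, 3.0, 3.0],
                [8.0, 7.0, 9.0]])
  P, L, U = lu_partial_pivot(A)
  print(np.allclose(P @ A, L @ U))   # True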
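Least squares and the pseudo-inverse (Lecture 13): a minimal sketch, using an arbitrary overdetermined system, showing that the normal equations, the pseudo-inverse, and NumPy's built-in least-squares solver return the same minimizer of ||Ax - b||.

  import numpy as np

  A = np.array([[1.0, 1.0],
                [1.0, 2.0],
                [1.0, 3.0]])                     # tall matrix: more equations than unknowns
  b = np.array([1.0, 2.0, 2.0])

  x_normal = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations: A^T A x = A^T b
  x_pinv   = np.linalg.pinv(A) @ b               # x = A^+ b (pseudo-inverse)
  x_lstsq  = np.linalg.lstsq(A, b, rcond=None)[0]

  print(np.allclose(x_normal, x_pinv), np.allclose(x_pinv, x_lstsq))   # True True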
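Gram-Schmidt and QR (Lecture 16): a minimal sketch of classical Gram-Schmidt producing A = QR with orthonormal columns in Q. In practice the modified variant or Householder reflections (what np.linalg.qr uses) are preferred for numerical stability.

  import numpy as np

  def gram_schmidt_qr(A):
      """Return Q with orthonormal columns and upper-triangular R such that A = Q @ R."""
      m, n = A.shape
      Q = np.zeros((m, n))
      R = np.zeros((n, n))
      for j in range(n):
          v = A[:, j].astype(float)
          for i in range(j):
              R[i, j] = Q[:, i] @ A[:, j]        # component of column j along q_i
              v = v - R[i, j] * Q[:, i]          # subtract it off
          R[j, j] = np.linalg.norm(v)
          Q[:, j] = v / R[j, j]                  # normalize what is left
      return Q, R

  A = np.array([[1.0, 1.0],
                [1.0, 0.0],
                [0.0, 1.0]])
  Q, R = gram_schmidt_qr(A)
  print(np.allclose(A, Q @ R), np.allclose(Q.T @ Q, np.eye(2)))   # True True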
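Diagonalization and matrix powers (Lectures 23-24): a minimal sketch computing A^n through the eigendecomposition A = S Lambda S^{-1}, using the 2x2 Fibonacci matrix so that the top-right entry of A^n is the n-th Fibonacci number.

  import numpy as np

  A = np.array([[1.0, 1.0],
                [1.0, 0.0]])       # maps (F_n, F_{n-1}) to (F_{n+1}, F_n)

  lam, S = np.linalg.eig(A)        # eigenvalues: the golden ratio and its conjugate
  n = 10
  A_pow = S @ np.diag(lam**n) @ np.linalg.inv(S)           # A^n = S Lambda^n S^{-1}

  print(int(round(A_pow[0, 1])))                           # 55 = F_10
  print(np.allclose(A_pow, np.linalg.matrix_power(A, n)))  # True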
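Tests for positive definiteness (Lecture 26): a minimal sketch of two standard checks on the symmetric tridiagonal second-difference matrix: all eigenvalues positive, and the existence of a Cholesky factorization A = LL^T.

  import numpy as np

  A = np.array([[ 2.0, -1.0,  0.0],
                [-1.0,  2.0, -1.0],
                [ 0.0, -1.0,  2.0]])       # second-difference (finite-difference) matrix

  print(np.all(np.linalg.eigvalsh(A) > 0))   # True: every eigenvalue is positive
  L = np.linalg.cholesky(A)                  # succeeds only when A is positive definite
  print(np.allclose(L @ L.T, A))             # True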
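Low-rank approximation via the SVD (Lecture 28): a minimal sketch of the idea behind the image-compression demo, with a random matrix of rank at most 80 standing in for an image. Keeping the k largest singular values gives the best rank-k approximation (Eckart-Young).

  import numpy as np

  rng = np.random.default_rng(0)
  X = rng.standard_normal((100, 80)) @ rng.standard_normal((80, 120))   # stand-in for an image

  U, s, Vt = np.linalg.svd(X, full_matrices=False)

  k = 20                                        # keep only the k largest singular values
  X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation of X

  rel_err = np.linalg.norm(X - X_k) / np.linalg.norm(X)
  print(X_k.shape, round(rel_err, 3))           # storage drops to k*(100 + 120 + 1) numbers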

Course Project: Explaining concepts in Linear Algebra as used in modern research

  1. Deadlines (the default is 2359 hrs on the mentioned dates):
    1. Group formation (28 Sept): Form groups of two; the first 5 characters of your roll numbers must be different within a group.
    2. Title submission (12 Oct): Give a title and a link to a reference paper, which must be approved by me (I mark approval in the sheet itself). The paper must have a nontrivial linear algebra component (which you must explain) and must have been published after 2000 in a "respectable" journal/conference. The title of the paper and the title of your project cannot be the same.
    3. Video submission (30 Nov): Update the sheet with the appropriate YouTube links, and check that the video is accessible to anyone.
    View-only link to all projects/links/marks/comments.
  2. Guidelines and tips
    • Time budget for your video (approximate guideline): the first 15% lays out the problem from a "40,000-foot" view, the next 60-70% picks out the linear algebra aspects and explains them, and the final 15-20% connects those aspects back to the original problem and shows how it is solved. Please identify the relevant linear algebra aspects very clearly.
    • Where to look for possible topics? Many IEEE societies publish magazines that explain topics at a high level; these can be good starting points. Examples: IEEE Signal Processing Magazine, IEEE Communications Magazine, IEEE Antennas and Propagation Magazine. Some more elementary resources: applications of linear algebra (url1, url2).
    • Some areas that extensively use linear algebra: compressive sensing, image and signal processing, information and coding theory, quantum computing, computer graphics and vision, graphs and networks, numerical linear algebra, numerical physics.
    • Your choice of topic must be approved by the instructor (see the spreadsheet).
    • Technical details of the video: duration of 7 minutes, and no robot voices.
    • Some tips on how to make a video: url1, url2.
    • Do not share long, unwieldy links; use tinyurl to shorten them.
    • See last year's website for examples; however, you cannot repeat any of the papers used on that website.
  3. Evaluation criteria
    • 5 points: technical depth (self-explanatory)
    • 4 points: clarity of presentation (how accessible your video is to a non-expert, how creatively you have used the visual medium to convey your ideas, etc.)
    • 1 point: something extra (e.g. you wrote your own code to simulate something, did some original work, etc.)
    Total: 10 points. Reviews will be shared with you in the same sheet.
Course flyer
  • Evaluation: 30% mid-sem, 25% tutorial quizzes (best n out of n+1), 10% research paper summary, 35% end-sem
  • Refer to the academic course listing for syllabus. In short, we will study most of the topics in the textbook, with an inclination towards numerical linear algebra where necessary.
  • Textbook: Linear Algebra and Its Applications, Gilbert Strang, 4th ed. (GS).

Policies
  • As per institute rules, 85% attendance (minimum) is mandatory and will be enforced.
  • Academic misconduct: There will be zero tolerance towards any unethical means, such as plagiarism (COPYING, in plain and simple terms) or proxy attendance. Read these links to familiarize yourself; there will be no excuse for ignorance: URL1, URL2. Penalties include: receiving a zero in a particular assignment/examination, receiving a fail grade for the entire course, having a note placed in your permanent academic record, suspension, or all of the above.

