Matrix Methods in Data Analysis, Signal Processing, and Machine Learning (Spring 2018)

Massachusetts Institute of Technology via MIT OpenCourseWare

Overview

Course Features
  • Video lectures
  • Captions/transcript
  • Assignments: problem sets (no solutions)
Educator Features
  • Instructor insights
  • Podcast - audio
Course Description

Linear algebra concepts are key to understanding and creating machine learning algorithms, especially as applied to deep learning and neural networks. This course reviews linear algebra with applications to probability and statistics and to optimization; above all, it gives a full explanation of deep learning.
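
As a small taste of the matrix methods in the syllabus below (in particular the SVD of Lecture 6 and the Eckart-Young theorem of Lecture 7), here is a minimal NumPy sketch; the random matrix and the target rank are illustrative choices, not taken from the course materials.

    # Illustrative sketch (not from the course): Eckart-Young says the closest
    # rank-k matrix to A, in the spectral or Frobenius norm, is the truncated SVD.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 4))      # arbitrary example matrix

    # Reduced SVD: A = U @ diag(s) @ Vt
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    k = 2                                # illustrative target rank
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # The spectral-norm error equals the first discarded singular value.
    print(np.linalg.norm(A - A_k, 2), s[k])   # these two numbers agree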

Syllabus

Course Introduction of 18.065 by Professor Strang.
An Interview with Gilbert Strang on Teaching Matrix Methods in Data Analysis, Signal Processing, and Machine Learning.
1. The Column Space of A Contains All Vectors Ax.
2. Multiplying and Factoring Matrices.
3. Orthonormal Columns in Q Give Q'Q = I.
4. Eigenvalues and Eigenvectors.
5. Positive Definite and Semidefinite Matrices.
6. Singular Value Decomposition (SVD).
7. Eckart-Young: The Closest Rank k Matrix to A.
8. Norms of Vectors and Matrices.
9. Four Ways to Solve Least Squares Problems.
10. Survey of Difficulties with Ax = b.
11. Minimizing ‖x‖ Subject to Ax = b.
12. Computing Eigenvalues and Singular Values.
13. Randomized Matrix Multiplication.
14. Low Rank Changes in A and Its Inverse.
15. Matrices A(t) Depending on t, Derivative = dA/dt.
16. Derivatives of Inverse and Singular Values.
17. Rapidly Decreasing Singular Values.
18. Counting Parameters in SVD, LU, QR, Saddle Points.
19. Saddle Points Continued, Maxmin Principle.
20. Definitions and Inequalities.
21. Minimizing a Function Step by Step.
22. Gradient Descent: Downhill to a Minimum.
23. Accelerating Gradient Descent (Use Momentum).
24. Linear Programming and Two-Person Games.
25. Stochastic Gradient Descent.
26. Structure of Neural Nets for Deep Learning.
27. Backpropagation: Find Partial Derivatives.
30. Completing a Rank-One Matrix, Circulants!
31. Eigenvectors of Circulant Matrices: Fourier Matrix.
32. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule.
33. Neural Nets and the Learning Function.
34. Distance Matrices, Procrustes Problem.
35. Finding Clusters in Graphs.
36. Alan Edelman and Julia Language.
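
To connect two of the lectures above (22 on gradient descent and 23 on momentum) to code, here is a minimal Python sketch on an ill-conditioned quadratic; the matrix, step size, and momentum coefficient are illustrative assumptions, not the course's own examples.

    # Illustrative sketch (not from the course): plain gradient descent vs.
    # heavy-ball momentum on f(x) = 1/2 x^T S x with an ill-conditioned S.
    import numpy as np

    S = np.diag([1.0, 100.0])            # positive definite, condition number 100

    def grad(x):
        return S @ x                     # gradient of 1/2 x^T S x

    x_gd = np.array([1.0, 1.0])          # plain gradient descent iterate
    x_mo = x_gd.copy()                   # momentum iterate
    v = np.zeros(2)                      # momentum "velocity"
    lr, beta = 0.009, 0.9                # illustrative step size and momentum

    for _ in range(200):
        x_gd = x_gd - lr * grad(x_gd)    # plain gradient descent step
        v = beta * v + grad(x_mo)        # heavy-ball momentum update
        x_mo = x_mo - lr * v

    # Momentum reaches the minimum at 0 far faster on this problem.
    print(np.linalg.norm(x_gd), np.linalg.norm(x_mo))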

Taught by

Prof. Gilbert Strang

Reviews

5.0 rating, based on 3 Class Central reviews


  • R Akila Visali
    It is very useful to learn new things in the field of algebra. The explanation was very beautiful, and the depth of the sessions was nice.
  • Farhan Rahman Farabi
    Good course; please join and upgrade our knowledge. It helps us to renew our minds and learn strategies that we may use in teaching mathematics.
  • Fulufhelo Veronica Masia
    Good course; please join and upgrade our knowledge. It helps us to renew our minds and learn strategies that we may use in teaching mathematics.
