Class Central is learner-supported. When you buy through links on our site, we may earn an affiliate commission.

Imperial College London

Mathematics for Machine Learning: PCA

Imperial College London via Coursera

Overview

This intermediate-level course introduces the mathematical foundations needed to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction technique. We'll cover some basic statistics of data sets, such as mean values and variances; we'll compute distances and angles between vectors using inner products; and we'll derive orthogonal projections of data onto lower-dimensional subspaces. Using all these tools, we'll then derive PCA as a method that minimizes the average squared reconstruction error between data points and their reconstructions.

At the end of this course, you'll be familiar with important mathematical concepts and you'll be able to implement PCA all by yourself. If you're struggling, you'll find a set of Jupyter notebooks that will let you explore properties of the techniques and walk you through what you need to do to get on track. If you are already an expert, this course may refresh some of your knowledge.

The lectures, examples and exercises require:
1. Some capacity for abstract thinking
2. Good background in linear algebra (e.g., matrix and vector algebra, linear independence, basis)
3. Basic background in multivariate calculus (e.g., partial derivatives, basic optimization)
4. Basic knowledge of Python programming and NumPy

Disclaimer: This course is substantially more abstract and requires more programming than the other two courses of the specialization. However, this type of abstract thinking, algebraic manipulation and programming is necessary if you want to understand and develop machine learning algorithms.

Syllabus

  • Statistics of Datasets
    • Principal Component Analysis (PCA) is one of the most important dimensionality reduction algorithms in machine learning. In this course, we lay the mathematical foundations to derive and understand PCA from a geometric point of view. In this module, we learn how to summarize datasets (e.g., images) using basic statistics, such as the mean and the variance. We also look at how the mean and the variance behave when we shift or scale the original data set (see the short NumPy sketch after this syllabus). We will provide mathematical intuition as well as the skills to derive the results. We will also implement our results in code (Jupyter notebooks), which will allow us to practice our mathematical understanding by computing averages of image data sets. Therefore, some Python/NumPy background will be necessary to get through this course.

      Note: If you have taken the other two courses of this specialization, this one will be harder (mostly because of the programming assignments). However, if you make it through the first week of this course, you will make it through the full course with high probability.
  • Inner Products
    • Data can be interpreted as vectors. Vectors allow us to talk about geometric concepts, such as lengths, distances and angles, to characterize similarity between vectors. This will become important later in the course when we discuss PCA. In this module, we will introduce and practice the concept of an inner product. Inner products allow us to talk about geometric concepts in vector spaces. More specifically, we will start with the dot product (which we may still know from school) as a special case of an inner product, and then move toward a more general concept of an inner product, which plays an integral part in some areas of machine learning, such as kernel machines (this includes support vector machines and Gaussian processes). We have a lot of exercises in this module to practice and understand the concept of inner products (a code sketch follows this syllabus).
  • Orthogonal Projections
    • In this module, we will look at orthogonal projections of vectors, which live in a high-dimensional vector space, onto lower-dimensional subspaces. This will play an important role in the next module when we derive PCA. We will start off with a geometric motivation of what an orthogonal projection is and work our way through the corresponding derivation. We will end up with a single equation that allows us to project any vector onto a lower-dimensional subspace (sketched in code after this syllabus). However, we will also understand how this equation came about. As in the other modules, we will have both pen-and-paper practice and a small programming example with a Jupyter notebook.
  • Principal Component Analysis
    • We can think of dimensionality reduction as a way of compressing data with some loss, similar to JPEG or MP3. Principal Component Analysis (PCA) is one of the most fundamental dimensionality reduction techniques used in machine learning. In this module, we use the results from the first three modules of this course and derive PCA from a geometric point of view. Within this course, this module is the most challenging one, and we will go through an explicit derivation of PCA plus some coding exercises that will make us proficient users of PCA (a sketch of the end result follows this syllabus).
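
For the Statistics of Datasets module, here is a minimal NumPy sketch of the ideas described above; the data values are made up purely for illustration:

    import numpy as np

    # A small made-up dataset: 3 points in 2 dimensions.
    X = np.array([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])

    mean = X.mean(axis=0)  # per-dimension mean
    var = X.var(axis=0)    # per-dimension variance

    # Shifting every point moves the mean but leaves the variance
    # unchanged; scaling by a factor a scales the variance by a**2.
    assert np.allclose((X + 10.0).mean(axis=0), mean + 10.0)
    assert np.allclose((X + 10.0).var(axis=0), var)
    assert np.allclose((2.0 * X).var(axis=0), 4.0 * var)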
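
For the Inner Products module, a sketch of lengths, distances and angles computed from a general inner product <x, y> = x^T A y; the vectors and the symmetric positive-definite matrix A below are illustrative assumptions (A = I recovers the familiar dot product):

    import numpy as np

    def inner(x, y, A):
        # General inner product <x, y> = x^T A y.
        return x @ A @ y

    x = np.array([1.0, 1.0])
    y = np.array([-1.0, 1.0])
    A = np.array([[2.0, 0.0],
                  [0.0, 1.0]])  # symmetric positive definite

    length_x = np.sqrt(inner(x, x, A))         # length of x
    dist_xy = np.sqrt(inner(x - y, x - y, A))  # distance between x and y
    cos_angle = inner(x, y, A) / (length_x * np.sqrt(inner(y, y, A)))
    angle = np.arccos(cos_angle)               # angle between x and y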
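
For the Orthogonal Projections module, a sketch of the projection equation pi(x) = B (B^T B)^(-1) B^T x that the module derives; the subspace basis B and the vector x are made up for illustration:

    import numpy as np

    B = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])  # basis of a 2-D subspace of R^3
    x = np.array([1.0, 2.0, 3.0])

    P = B @ np.linalg.inv(B.T @ B) @ B.T  # projection matrix
    x_proj = P @ x                        # orthogonal projection of x

    # The residual is orthogonal to the subspace, and projecting
    # twice changes nothing (P is idempotent).
    assert np.allclose(B.T @ (x - x_proj), 0.0)
    assert np.allclose(P @ P, P)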
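
For the Principal Component Analysis module, a sketch of PCA via the eigendecomposition of the data covariance matrix; the random data and the target dimensionality k = 2 are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))  # 100 made-up points in 5 dimensions
    k = 2                          # target dimensionality

    mu = X.mean(axis=0)
    Xc = X - mu                           # center the data
    C = Xc.T @ Xc / len(X)                # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)  # eigenvalues in ascending order
    B = eigvecs[:, -k:]                   # top-k principal directions

    codes = Xc @ B            # low-dimensional representation
    X_rec = codes @ B.T + mu  # reconstruction in the original space

    # PCA chooses B to minimize this average squared reconstruction error.
    err = np.mean(np.sum((X - X_rec) ** 2, axis=1))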

Taught by

Marc P. Deisenroth

Reviews

2.0 rating, based on 3 Class Central reviews

4.0 rating at Coursera, based on 3,039 ratings


  • DietCoke
    I have completed the first 2 courses in the specialization, and this is the 3rd and last course in the specialization. Everything was very easy until the last week of the last course. But when the hard part comes, the lecturer does not give proofs/explanations in detail, and the questions remain unanswered in the forum for months. Generally speaking, if you know the subject before the course, you will learn nothing; if you do not know the subject before taking the course, you won't understand it by taking the course either unless you do research yourself; you will remember the conclusion but not how to derive it, which I believe is undesirable in a MATHEMATICS course. I don't recommend devoting either your time or money to it.

  • Sagar Ladhwani
    Although the topics and the lecturer's delivery were nice, compared to the two previous courses of the specialization this one doesn't fare well. The content in the video lessons and the content in the notebooks were not well matched in scope. A participant who isn't already familiar with these concepts would struggle a lot. It would have been better if the reading material, video content and notebook assignments had been designed with that in mind. Apart from that, it was a good course. For some additional resources refer to this link:

    https://www.linkedin.com/posts/sagar-ladhwani-713b96112_datascience-machinelearning-mathematics-activity-6647131251033628672-6Qkm
