
Machine Learning: Regression

University of Washington via Coursera

19 Reviews 7731 students interested
  • Provider Coursera
  • Cost Free Online Course (Audit)
  • Session Upcoming
  • Language English
  • Certificate Paid Certificate Available
  • Start Date
  • Duration 6 weeks long
  • Learn more about MOOCs



Case Study - Predicting Housing Prices

In our first case study, predicting house prices, you will create models that predict a continuous value (price) from input features (square footage, number of bedrooms and bathrooms, and so on). This is just one of the many places where regression can be applied. Other applications range from predicting health outcomes in medicine, stock prices in finance, and power usage in high-performance computing to analyzing which regulators are important for gene expression.

In this course, you will explore regularized linear regression models for the task of prediction and feature selection. You will be able to handle very large sets of features and select between models of various complexity. You will also analyze the impact of aspects of your data -- such as outliers -- on your selected models and predictions. To fit these models, you will implement optimization algorithms that scale to large datasets.

Learning Outcomes: By the end of this course, you will be able to:
-Describe the input and output of a regression model.
-Compare and contrast bias and variance when modeling data.
-Estimate model parameters using optimization algorithms.
-Tune parameters with cross validation.
-Analyze the performance of the model.
-Describe the notion of sparsity and how LASSO leads to sparse solutions.
-Deploy methods to select between models.
-Exploit the model to form predictions.
-Build a regression model to predict prices using a housing dataset.
-Implement these techniques in Python.


-Regression is one of the most important and broadly used machine learning and statistics tools out there. It allows you to make predictions from data by learning the relationship between features of your data and some observed, continuous-valued response. Regression is used in a massive number of applications ranging from predicting stock prices to understanding gene regulatory networks.

This introduction to the course provides you with an overview of the topics we will cover and the background knowledge and resources we assume you have.

Simple Linear Regression
-Our course starts from the most basic regression model: Just fitting a line to data. This simple model for forming predictions from a single, univariate feature of the data is appropriately called "simple linear regression".

In this module, we describe the high-level regression task and then specialize these concepts to the simple linear regression case. You will learn how to formulate a simple regression model and fit the model to data using both a closed-form solution as well as an iterative optimization algorithm called gradient descent. Based on this fitted function, you will interpret the estimated model parameters and form predictions. You will also analyze the sensitivity of your fit to outlying observations.

You will examine all of these concepts in the context of a case study of predicting house prices from the square feet of the house.
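The two fitting approaches this module describes, a closed-form least-squares solution and gradient descent, can be sketched in a few lines of NumPy. This is a minimal illustration on made-up square-feet/price numbers, not the course's assignment code; the function names and toy data are invented for the example.

```python
import numpy as np

def fit_simple_ols(x, y):
    """Closed-form least squares for y ~ w0 + w1 * x."""
    x_mean, y_mean = x.mean(), y.mean()
    w1 = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
    w0 = y_mean - w1 * x_mean
    return w0, w1

def fit_simple_gd(x, y, lr=0.05, iters=20000):
    """Iterative alternative: gradient descent on the mean squared error."""
    w0 = w1 = 0.0
    n = len(x)
    for _ in range(iters):
        err = (w0 + w1 * x) - y          # residuals at current parameters
        w0 -= lr * 2 * err.sum() / n     # step along the negative gradient
        w1 -= lr * 2 * (err * x).sum() / n
    return w0, w1

# Toy data: square feet (thousands) vs price ($1000s), made up for illustration.
sqft = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
price = np.array([300.0, 350.0, 480.0, 500.0, 620.0])
print(fit_simple_ols(sqft, price))
```

With a small enough step size and enough iterations, both routines converge to the same line; the closed-form version is exact in one shot.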

Multiple Regression
-The next step in moving beyond simple linear regression is to consider "multiple regression" where multiple features of the data are used to form predictions.

More specifically, in this module, you will learn how to build models of more complex relationships between a single variable (e.g., 'square feet') and the observed response (like 'house sales price'). This includes things like fitting a polynomial to your data, or capturing seasonal changes in the response value. You will also learn how to incorporate multiple input variables (e.g., 'square feet', '# bedrooms', '# bathrooms'). You will then be able to describe how all of these models can still be cast within the linear regression framework, but now using multiple "features". Within this multiple regression framework, you will fit models to data, interpret estimated coefficients, and form predictions.

Here, you will also implement a gradient descent algorithm for fitting a multiple regression model.
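A gradient descent fit for multiple regression looks much like the simple case, just with a matrix of features. The sketch below is a generic version under assumed conventions (a prepended column of ones for the intercept, mean-squared-error objective); it is not the course's graded implementation.

```python
import numpy as np

def multiple_regression_gd(X, y, lr=0.1, iters=20000):
    """Gradient descent for linear regression with several input features.

    X: (n, d) feature matrix. A column of ones is prepended so the first
    returned weight is the intercept.
    """
    H = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(H.shape[1])
    n = len(y)
    for _ in range(iters):
        grad = 2.0 / n * H.T @ (H @ w - y)   # gradient of mean squared error
        w -= lr * grad
    return w
```

For example, with hypothetical 'square feet' and '# bedrooms' columns in `X`, the returned vector would be `[intercept, weight_sqft, weight_bedrooms]`.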

Assessing Performance
-Having learned about linear regression models and algorithms for estimating the parameters of such models, you are now ready to assess how well your considered method should perform in predicting new data. You are also ready to select amongst possible models to choose the best-performing one.

This module is all about these important topics of model selection and assessment. You will examine both theoretical and practical aspects of such analyses. You will first explore the concept of measuring the "loss" of your predictions, and use this to define training, test, and generalization error. For these measures of error, you will analyze how they vary with model complexity and how they might be utilized to form a valid assessment of predictive performance. This leads directly to an important conversation about the bias-variance tradeoff, which is fundamental to machine learning. Finally, you will devise a method to first select amongst models and then assess the performance of the selected model.

The concepts described in this module are key to all machine learning problems, well beyond the regression setting addressed in this course.
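The train-vs-test distinction above can be demonstrated with a tiny experiment: fit polynomials of rising degree and measure error on held-out points. This is an illustrative sketch on synthetic data (the target function, noise level, and split rule are all invented); training error can only fall as the model grows, while test error need not.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-1, 1, 40))
y = np.sin(3 * x) + rng.normal(0, 0.2, size=len(x))   # noisy target

# Hold out every third point as a test set.
test = np.zeros(len(x), dtype=bool)
test[::3] = True

results = {}
for degree in (1, 3, 12):
    coeffs = np.polyfit(x[~test], y[~test], degree)    # least-squares fit
    train_rmse = np.sqrt(np.mean((np.polyval(coeffs, x[~test]) - y[~test]) ** 2))
    test_rmse = np.sqrt(np.mean((np.polyval(coeffs, x[test]) - y[test]) ** 2))
    results[degree] = (train_rmse, test_rmse)
    print(degree, round(train_rmse, 3), round(test_rmse, 3))
```

Because the models are nested, training RMSE is non-increasing in the degree; comparing it against test RMSE at each degree is exactly the kind of assessment this module formalizes.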

Ridge Regression
-You have examined how the performance of a model varies with increasing model complexity, and can describe the potential pitfall of complex models becoming overfit to the training data. In this module, you will explore a very simple, but extremely effective technique for automatically coping with this issue. This method is called "ridge regression". You start out with a complex model, but now fit the model in a manner that not only incorporates a measure of fit to the training data, but also a term that biases the solution away from overfitted functions. To this end, you will explore symptoms of overfitted functions and use this to define a quantitative measure to use in your revised optimization objective. You will derive both a closed-form and gradient descent algorithm for fitting the ridge regression objective; these forms are small modifications from the original algorithms you derived for multiple regression. To select the strength of the bias away from overfitting, you will explore a general-purpose method called "cross validation".

You will implement both cross-validation and gradient descent to fit a ridge regression model and select the regularization constant.
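The closed-form ridge solution and a cross-validated choice of the penalty strength can be sketched compactly. This is one common formulation (intercept left unpenalized, simple modulo-k folds), assumed for illustration rather than taken from the course materials.

```python
import numpy as np

def ridge_closed_form(X, y, l2_penalty):
    """Closed-form ridge: solve (H'H + lambda * I) w = H'y.

    The intercept column is not penalized, a common convention."""
    H = np.column_stack([np.ones(len(X)), X])
    I = np.eye(H.shape[1])
    I[0, 0] = 0.0                      # leave the intercept unregularized
    return np.linalg.solve(H.T @ H + l2_penalty * I, H.T @ y)

def best_l2_by_cv(X, y, penalties, k=5):
    """Pick the penalty with lowest k-fold cross-validation error."""
    folds = np.arange(len(y)) % k      # simple deterministic fold assignment

    def cv_error(l2):
        errs = []
        for f in range(k):
            w = ridge_closed_form(X[folds != f], y[folds != f], l2)
            H_val = np.column_stack([np.ones((folds == f).sum()), X[folds == f]])
            errs.append(np.mean((H_val @ w - y[folds == f]) ** 2))
        return np.mean(errs)

    return min(penalties, key=cv_error)
```

Setting the penalty to zero recovers ordinary least squares; a very large penalty drives the non-intercept weights toward zero, which is the "bias away from overfitting" the module describes.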

Feature Selection & Lasso
-A fundamental machine learning task is to select amongst a set of features to include in a model. In this module, you will explore this idea in the context of multiple regression, and describe how such feature selection is important for both interpretability and efficiency of forming predictions.

To start, you will examine methods that search over an enumeration of models including different subsets of features. You will analyze both exhaustive search and greedy algorithms. Then, instead of an explicit enumeration, we turn to Lasso regression, which implicitly performs feature selection in a manner akin to ridge regression: A complex model is fit based on a measure of fit to the training data plus a measure of overfitting different than that used in ridge. This lasso method has had impact in numerous applied domains, and the ideas behind the method have fundamentally changed machine learning and statistics. You will also implement a coordinate descent algorithm for fitting a Lasso model.

Coordinate descent is another, general, optimization technique, which is useful in many areas of machine learning.
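A coordinate descent pass for lasso updates one weight at a time via soft-thresholding, which is what drives some weights to exactly zero. The sketch below uses one common formulation (thresholds at half the L1 penalty, unpenalized intercept, unnormalized features); it is an assumed variant for illustration, not necessarily the exact one graded in the course.

```python
import numpy as np

def lasso_coordinate_descent(X, y, l1_penalty, iters=500):
    """Cyclic coordinate descent for lasso with an unpenalized intercept."""
    H = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(H.shape[1])
    z = (H ** 2).sum(axis=0)                  # per-coordinate normalizers
    for _ in range(iters):
        for j in range(len(w)):
            # Residual with feature j's current contribution added back in.
            resid = y - H @ w + H[:, j] * w[j]
            rho = H[:, j] @ resid
            if j == 0:                        # intercept: no shrinkage
                w[j] = rho / z[j]
            elif rho < -l1_penalty / 2:       # soft-threshold from below
                w[j] = (rho + l1_penalty / 2) / z[j]
            elif rho > l1_penalty / 2:        # soft-threshold from above
                w[j] = (rho - l1_penalty / 2) / z[j]
            else:
                w[j] = 0.0                    # shrunk exactly to zero
    return w
```

With the penalty set to zero this reduces to ordinary least squares; with a positive penalty, weights on weak features land at exactly zero, which is the sparsity property the module emphasizes.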

Nearest Neighbors & Kernel Regression
-Up to this point, we have focused on methods that fit parametric functions---like polynomials and hyperplanes---to the entire dataset. In this module, we instead turn our attention to a class of "nonparametric" methods. These methods allow the complexity of the model to increase as more data are observed, and result in fits that adapt locally to the observations.

We start by considering the simple and intuitive example of nonparametric methods, nearest neighbor regression: The prediction for a query point is based on the outputs of the most related observations in the training set. This approach is extremely simple, but can provide excellent predictions, especially for large datasets. You will deploy algorithms to search for the nearest neighbors and form predictions based on the discovered neighbors. Building on this idea, we turn to kernel regression. Instead of forming predictions based on a small set of neighboring observations, kernel regression uses all observations in the dataset, but the impact of these observations on the predicted value is weighted by their similarity to the query point. You will analyze the theoretical performance of these methods in the limit of infinite training data, and explore the scenarios in which these methods work well versus struggle. You will also implement these techniques and observe their practical behavior.
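Both nonparametric ideas fit in a few lines for one-dimensional features. The sketch below is a minimal illustration (brute-force neighbor search, Gaussian kernel), not the course's implementation; the bandwidth and k values are arbitrary.

```python
import numpy as np

def knn_predict(x_train, y_train, x_query, k=3):
    """Average the targets of the k nearest training points."""
    nearest = np.argsort(np.abs(x_train - x_query))[:k]
    return y_train[nearest].mean()

def kernel_regression(x_train, y_train, x_query, bandwidth=0.5):
    """Nadaraya-Watson: every point votes, weighted by a Gaussian kernel
    on its distance to the query point."""
    weights = np.exp(-((x_train - x_query) ** 2) / (2 * bandwidth ** 2))
    return (weights @ y_train) / weights.sum()
```

Shrinking k or the bandwidth makes the fit more local (lower bias, higher variance); growing them smooths the prediction toward a global average, mirroring the complexity trade-offs discussed earlier in the course.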

Closing Remarks
-In the conclusion of the course, we will recap what we have covered. This represents both techniques specific to regression, as well as foundational machine learning concepts that will appear throughout the specialization. We also briefly discuss some important regression techniques we did not cover in this course.

We conclude with an overview of what's in store for you in the rest of the specialization.

Taught by

Carlos Guestrin and Emily Fox


Reviews for Coursera's Machine Learning: Regression
4.6 Based on 19 reviews

  • 5 stars 74%
  • 4 stars 16%
  • 3 stars 5%
  • 2 stars 5%
  • 1 star 0%

Gregory S
5.0 4 years ago
by Gregory completed this course and found the course difficulty to be medium.
Machine Learning: Regression is the second course in the 6-part Machine Learning specialization offered by the University of Washington on Coursera. The 6-week course builds from simple linear regression with one input feature in the first week to ridge regression, the lasso and kernel regression. Week 3 also takes a detour to discuss important machine learning topics like the bias/variance trade-off, overfitting and validation to motivate ridge and lasso regression. Like the first course in the specialization, "Regression" uses GraphLab Create, a Python package that will only run on the 64-bi…
10 people found this review helpful.
Norman B
5.0 4 years ago
by Norman completed this course, spending 9 hours a week on it and found the course difficulty to be hard.
This course delves into regression in a big way. You start off fairly simply, with a simple linear model on some housing data (this should be pretty familiar if you took the case study class that is prerequisite to this one), and the course works through the concepts at a good pace. You will be surprised by how much you can learn just by following along in the IPython notebook assignments. The lectures are laid out in a logical order of progression, and go at a pace that is slow enough to fully grasp the concepts. I recommend this course to anybody who wishes to learn about regression from an ML standpoint. …
4 people found this review helpful.
Saransh A
5.0 3 years ago
by Saransh completed this course.
This is perhaps one of the best courses I could have taken on regression. Each and every aspect was thoroughly discussed, and the assignments were good; in fact, the programming assignments were built with the learning part kept in mind, not to trap students in the programming part of it.

The course is heavy in comparison to other MOOCs.

God, this would have been perfect had it been in Scikit-Learn, but then again it might have been asking too much of it

Also, I suggest that people who complete the course go to Kaggle and try a couple of problems using these techniques. It would definitely help you cement your understanding.

All in all this course was a total 5/5

Definitely continuing with the specialization
Jason C
5.0 4 years ago
by Jason completed this course, spending 8 hours a week on it and found the course difficulty to be very hard.
This is one of the most informative and useful online classes I've taken to date. The material covered is detailed and applicable broadly. It is also exceptionally hard! The assignments are very challenging and extremely precise.

I struggled frequently and it ended up taking a significant amount of time, but it was extremely well worth it in the end. I'm very excited for the next class in the specialization!
3 people found this review helpful.
Steve S
4.0 3 years ago
by Steve completed this course, spending 10 hours a week on it and found the course difficulty to be hard.
Just finished the class. It's not easy and I definitely learned a lot. My only complaints might be that if you're taking this through Coursera, you're pretty much on your own if you get stuck on something. There aren't many students taking it, and there don't seem to be any mentors to answer questions. It's also one of those theoretical classes where you don't really know how to apply the concepts after you finish.
Daniel R
5.0 4 years ago
by Daniel completed this course, spending 6 hours a week on it and found the course difficulty to be hard.
It is just excellent!

At the end of the course you should have your own toolbox to create regression models without needing any license or support.

It is hard, but it's worth it!
4 people found this review helpful.
Y. N
2.0 3 years ago
by Y. completed this course, spending 10 hours a week on it and found the course difficulty to be medium.
The course is "chapter 2" of the Machine Learning certification from this university. The start of this course was interesting. The videos are great, and the iPython assignments may prove difficult. But all in all I found this course much less interesting than the "Foundations" course (chapter 1 of the specialization). It loses its objectives very fast, and basically what you will learn is to code "gradient descent" algorithms on and on, after listening to hours of videos that will have no use in your daily activities. Quite disappointed, especially now that I know that the GraphLab library used for the course is not a free package and the home company was acquired by Apple. I intended to do the whole Machine Learning specialization of the University of Washington on Coursera, but actually I won't.
Dietcoke D
5.0 a year ago
Dietcoke completed this course, spending 3 hours a week on it and found the course difficulty to be medium.
The professors introduce many advanced topics in a smooth way that you can understand easily. It covers more details than most other ML MOOCs, since it spends a whole class talking about regression, while other courses may spend only one or two. I suggest that people with some basic background in stats/regression take this course; you will learn more advanced topics such as ridge/lasso and how to implement them in Python. I plan to take the whole specialization and then go on to more advanced courses such as Neural Networks offered by Toronto and Learning from Data offered by Caltech.

W
5.0 3 years ago
by W completed this course, spending 1 hour a week on it and found the course difficulty to be easy.
Very fun! The professors are very informative and explain things clearly, the course videos are great quality, and the slides are full of color and pictures that keep the course from being boring. There is also commentary from the instructors, so you can read along without watching the videos and still understand. I really want to give 10 stars here.
Raphael F
4.0 3 years ago
Raphael completed this course.
Dhawal S
5.0 3 years ago
by Dhawal completed this course.
Abhilash V
5.0 3 years ago
by Abhilash completed this course.
Colin K
5.0 4 years ago
by Colin completed this course.
Fagner S
5.0 3 years ago
by Fagner completed this course.
Jinwook J
5.0 4 years ago
by Jinwook completed this course.
Gerhard G
5.0 4 years ago
Gerhard completed this course.
Vikram P
5.0 4 years ago
Vikram completed this course.
Alex I
3.0 3 years ago
Alex completed this course.
Zhen J
4.0 3 years ago
Zhen completed this course.