Probability Theory: Foundation for Data Science
University of Colorado Boulder via Coursera

Overview
Understand the foundations of probability and its relationship to statistics and data science. We’ll learn what it means to calculate a probability, independent and dependent outcomes, and conditional events. We’ll study discrete and continuous random variables and see how these fit with data collection. We’ll end the course with Gaussian (normal) random variables and the Central Limit Theorem, and understand its fundamental importance for all of statistics and data science.
This course can be taken for academic credit as part of CU Boulder’s Master of Science in Data Science (MSDS) degree offered on the Coursera platform. The MSDS is an interdisciplinary degree that brings together faculty from CU Boulder’s departments of Applied Mathematics, Computer Science, Information Science, and others. With performance-based admissions and no application process, the MSDS is ideal for individuals with a broad range of undergraduate education and/or professional experience in computer science, information science, mathematics, and statistics. Learn more about the MSDS program at https://www.coursera.org/degrees/masterofsciencedatascienceboulder
Logo adapted from photo by Christopher Burns on Unsplash.
Syllabus
 Start Here!
 Welcome to the course! This module contains logistical information to get you started!
 Descriptive Statistics and the Axioms of Probability
 Understand the foundations of probability and its relationship to statistics and data science. We’ll learn what it means to calculate a probability, independent and dependent outcomes, and conditional events. We’ll study discrete and continuous random variables and see how these fit with data collection. We’ll end the course with Gaussian (normal) random variables and the Central Limit Theorem, and understand its fundamental importance for all of statistics and data science.
 Conditional Probability
 The notion of “conditional probability” is a very useful concept from probability theory, and in this module we introduce the idea of “conditioning” and Bayes’ Formula. The fundamental concept of an “independent event” then naturally arises from the notion of conditioning. Conditional and independent events are fundamental concepts in understanding statistical results.
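As a quick illustration (not part of the course materials), Bayes' Formula P(A|B) = P(B|A)·P(A)/P(B) can be sketched in a few lines of Python. The disease-screening numbers below are made up for illustration:

```python
def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """Bayes' Formula: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical screening test: 99% sensitivity, 5% false-positive rate,
# applied to a condition with 1% prevalence.
p_a = 0.01              # P(disease)
p_b_given_a = 0.99      # P(positive | disease)
p_b_given_not_a = 0.05  # P(positive | no disease)

# Law of total probability: P(positive) = P(pos|A)P(A) + P(pos|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

posterior = bayes(p_b_given_a, p_a, p_b)
print(round(posterior, 4))  # 0.1667
```

Even a fairly accurate test yields only about a 1-in-6 chance of disease given a positive result, because the condition is rare; this kind of counterintuitive conclusion is exactly what conditioning makes precise.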
 Discrete Random Variables
 The concept of a “random variable” (r.v.) is fundamental and often used in statistics. In this module we’ll study various named discrete random variables. We’ll learn some of their properties and why they are important. We’ll also calculate the expectation and variance for these random variables.
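To make the expectation and variance computations concrete, here is a minimal sketch (not course code) that evaluates both directly from a probability mass function:

```python
def expectation(pmf: dict) -> float:
    """E[X] = sum of x * P(X = x) over all values x."""
    return sum(x * p for x, p in pmf.items())

def variance(pmf: dict) -> float:
    """Var(X) = E[(X - mu)^2], computed directly from the pmf."""
    mu = expectation(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

# Example: X ~ Binomial(n=2, p=0.5) has pmf {0: 0.25, 1: 0.5, 2: 0.25}.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}
print(expectation(pmf))  # 1.0  (matches n*p)
print(variance(pmf))     # 0.5  (matches n*p*(1-p))
```

The named discrete random variables studied in this module (binomial, geometric, Poisson, and so on) all have closed-form expectations and variances that this direct summation recovers.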
 Continuous Random Variables
 In this module, we’ll extend our definition of random variables to include continuous random variables. The concepts in this unit are crucial since a substantial portion of statistics deals with the analysis of continuous random variables. We’ll begin with uniform and exponential random variables and then study Gaussian, or normal, random variables.
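As a rough illustration (not course code), Python's standard library can draw from each of these continuous distributions, and large-sample means line up with their theoretical expectations:

```python
import random

random.seed(0)
n = 100_000

# Uniform on [0, 1]: E[X] = 0.5
unif_mean = sum(random.random() for _ in range(n)) / n

# Exponential with rate lambda = 2: E[X] = 1/lambda = 0.5
exp_mean = sum(random.expovariate(2.0) for _ in range(n)) / n

# Gaussian (normal) with mu = 1, sigma = 2: E[X] = 1
norm_mean = sum(random.gauss(1.0, 2.0) for _ in range(n)) / n

print(unif_mean)  # close to 0.5
print(exp_mean)   # close to 0.5
print(norm_mean)  # close to 1.0
```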
 Joint Distributions and Covariance
 The power of statistics lies in being able to study the outcomes and effects of multiple random variables (sometimes referred to as “data”). Thus, in this module, we’ll learn about the concept of a “joint distribution,” which allows us to generalize probability theory to the multivariate case.
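A small sketch (illustrative only) of the covariance computation for paired data, using the definition Cov(X, Y) = E[(X − E[X])(Y − E[Y])]:

```python
def covariance(xs: list, ys: list) -> float:
    """Population covariance: mean of (x - x_bar)(y - y_bar) over the pairs."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    return sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / n

# Perfectly linearly related data: y = 2x, so Cov(X, Y) = 2 * Var(X).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
print(covariance(xs, ys))  # 2.5 = 2 * Var(X), where Var(X) = 1.25
```

A positive covariance indicates the variables tend to move together; covariance (and its normalized form, correlation) is the basic tool for quantifying that joint behavior.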
 The Central Limit Theorem
 The Central Limit Theorem (CLT) is a crucial result used in the analysis of data. In this module, we’ll introduce the CLT and its applications, such as characterizing the distribution of the mean of a large data set. This will set the stage for the next course.
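The CLT's claim can be checked empirically with a short simulation (a sketch, not course code): means of samples drawn from a non-normal distribution, here uniform on [0, 1], cluster around the true mean 0.5 with standard deviation close to the theoretical sigma/sqrt(n) = sqrt(1/12)/sqrt(n):

```python
import random
import statistics

random.seed(42)
n, trials = 50, 2000

# Each trial: the mean of n uniform(0, 1) draws.
means = [statistics.fmean(random.random() for _ in range(n))
         for _ in range(trials)]

print(statistics.fmean(means))  # close to 0.5
print(statistics.stdev(means))  # close to (1/12)**0.5 / 50**0.5, about 0.041
```

Plotting a histogram of `means` would show the characteristic bell shape, even though the underlying uniform distribution is flat.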
Taught by
Anne Dougherty