

MIT RES.6-012 Introduction to Probability, Spring 2018

Massachusetts Institute of Technology via YouTube

Overview

This course teaches the tools of probability theory and statistical inference needed to analyze and interpret data. Topics include sample spaces, probability axioms, conditional probability, independence of events, random variables, expectation, variance, probability density functions, and joint probability distributions. The teaching method combines lectures, worked examples, and supplemental materials. The course is intended for anyone seeking a foundational understanding of probability and its applications in fields such as science, engineering, and management.
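As a taste of the sample-space reasoning the course opens with, here is a minimal Python sketch (illustrative only, not part of the course materials): it estimates by simulation the probability that two fair dice sum to 7, and compares the result with the exact value obtained by counting outcomes in the sample space.

```python
import random

def estimate_p_sum_is_7(trials=100_000, seed=0):
    """Monte Carlo estimate of P(sum of two fair dice = 7)."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if rng.randint(1, 6) + rng.randint(1, 6) == 7
    )
    return hits / trials

# Exact value: 6 favorable pairs (1,6), (2,5), ..., (6,1)
# out of 36 equally likely outcomes.
exact = 6 / 36
estimate = estimate_p_sum_is_7()
print(f"exact = {exact:.4f}, estimate = {estimate:.4f}")
```

With enough trials, the empirical frequency settles near the exact value 1/6, which is the connection between relative frequency and probability that the early lectures and the law of large numbers (Lecture 18) make precise.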

Syllabus

L01.1 Lecture Overview.
L01.2 Sample Space.
L01.3 Sample Space Examples.
L01.4 Probability Axioms.
L01.5 Simple Properties of Probabilities.
L01.6 More Properties of Probabilities.
L01.7 A Discrete Example.
L01.8 A Continuous Example.
L01.9 Countable Additivity.
L01.10 Interpretations & Uses of Probabilities.
S01.0 Mathematical Background Overview.
S01.1 Sets.
S01.2 De Morgan's Laws.
S01.3 Sequences and their Limits.
S01.4 When Does a Sequence Converge.
S01.5 Infinite Series.
S01.6 The Geometric Series.
S01.7 About the Order of Summation in Series with Multiple Indices.
S01.8 Countable and Uncountable Sets.
S01.9 Proof That a Set of Real Numbers is Uncountable.
S01.10 Bonferroni's Inequality.
L02.1 Lecture Overview.
L02.2 Conditional Probabilities.
L02.3 A Die Roll Example.
L02.4 Conditional Probabilities Obey the Same Axioms.
L02.5 A Radar Example and Three Basic Tools.
L02.6 The Multiplication Rule.
L02.7 Total Probability Theorem.
L02.8 Bayes' Rule.
L03.1 Lecture Overview.
L03.2 A Coin Tossing Example.
L03.3 Independence of Two Events.
L03.4 Independence of Event Complements.
L03.5 Conditional Independence.
L03.6 Independence Versus Conditional Independence.
L03.7 Independence of a Collection of Events.
L03.8 Independence Versus Pairwise Independence.
L03.9 Reliability.
L03.10 The King's Sibling.
L04.1 Lecture Overview.
L04.2 The Counting Principle.
L04.3 Die Roll Example.
L04.4 Combinations.
L04.5 Binomial Probabilities.
L04.6 A Coin Tossing Example.
L04.7 Partitions.
L04.8 Each Person Gets An Ace.
L04.9 Multinomial Probabilities.
L05.1 Lecture Overview.
L05.2 Definition of Random Variables.
L05.3 Probability Mass Functions.
L05.4 Bernoulli & Indicator Random Variables.
L05.5 Uniform Random Variables.
L05.6 Binomial Random Variables.
L05.7 Geometric Random Variables.
L05.8 Expectation.
L05.9 Elementary Properties of Expectation.
L05.10 The Expected Value Rule.
L05.11 Linearity of Expectations.
S05.1 Supplement: Functions.
L06.1 Lecture Overview.
L06.2 Variance.
L06.3 The Variance of the Bernoulli & The Uniform.
L06.4 Conditional PMFs & Expectations Given an Event.
L06.5 Total Expectation Theorem.
L06.6 Geometric PMF Memorylessness & Expectation.
L06.7 Joint PMFs and the Expected Value Rule.
L06.8 Linearity of Expectations & The Mean of the Binomial.
L07.1 Lecture Overview.
L07.2 Conditional PMFs.
L07.3 Conditional Expectation & the Total Expectation Theorem.
L07.4 Independence of Random Variables.
L07.5 Example.
L07.6 Independence & Expectations.
L07.7 Independence, Variances & the Binomial Variance.
L07.8 The Hat Problem.
S07.1 The Inclusion-Exclusion Formula.
S07.2 The Variance of the Geometric.
S07.3 Independence of Random Variables Versus Independence of Events.
L08.1 Lecture Overview.
L08.2 Probability Density Functions.
L08.3 Uniform & Piecewise Constant PDFs.
L08.4 Means & Variances.
L08.5 Mean & Variance of the Uniform.
L08.6 Exponential Random Variables.
L08.7 Cumulative Distribution Functions.
L08.8 Normal Random Variables.
L08.9 Calculation of Normal Probabilities.
L09.1 Lecture Overview.
L09.2 Conditioning A Continuous Random Variable on an Event.
L09.3 Conditioning Example.
L09.4 Memorylessness of the Exponential PDF.
L09.5 Total Probability & Expectation Theorems.
L09.6 Mixed Random Variables.
L09.7 Joint PDFs.
L09.8 From The Joint to the Marginal.
L09.9 Continuous Analogs of Various Properties.
L09.10 Joint CDFs.
S09.1 Buffon's Needle & Monte Carlo Simulation.
L10.1 Lecture Overview.
L10.2 Conditional PDFs.
L10.3 Comments on Conditional PDFs.
L10.4 Total Probability & Total Expectation Theorems.
L10.5 Independence.
L10.6 Stick-Breaking Example.
L10.7 Independent Normals.
L10.8 Bayes Rule Variations.
L10.9 Mixed Bayes Rule.
L10.10 Detection of a Binary Signal.
L10.11 Inference of the Bias of a Coin.
L11.1 Lecture Overview.
L11.2 The PMF of a Function of a Discrete Random Variable.
L11.3 A Linear Function of a Continuous Random Variable.
L11.4 A Linear Function of a Normal Random Variable.
L11.5 The PDF of a General Function.
L11.6 The Monotonic Case.
L11.7 The Intuition for the Monotonic Case.
L11.8 A Nonmonotonic Example.
L11.9 The PDF of a Function of Multiple Random Variables.
S11.1 Simulation.
L12.1 Lecture Overview.
L12.2 The Sum of Independent Discrete Random Variables.
L12.3 The Sum of Independent Continuous Random Variables.
L12.4 The Sum of Independent Normal Random Variables.
L12.5 Covariance.
L12.6 Covariance Properties.
L12.7 The Variance of the Sum of Random Variables.
L12.8 The Correlation Coefficient.
L12.9 Proof of Key Properties of the Correlation Coefficient.
L12.10 Interpreting the Correlation Coefficient.
L12.11 Correlations Matter.
L13.1 Lecture Overview.
L13.2 Conditional Expectation as a Random Variable.
L13.3 The Law of Iterated Expectations.
L13.4 Stick-Breaking Revisited.
L13.5 Forecast Revisions.
L13.6 The Conditional Variance.
L13.7 Derivation of the Law of Total Variance.
L13.8 A Simple Example.
L13.9 Section Means and Variances.
L13.10 Mean of the Sum of a Random Number of Random Variables.
L13.11 Variance of the Sum of a Random Number of Random Variables.
S13.1 Conditional Expectation Properties.
L14.1 Lecture Overview.
L14.2 Overview of Some Application Domains.
L14.3 Types of Inference Problems.
L14.4 The Bayesian Inference Framework.
L14.5 Discrete Parameter, Discrete Observation.
L14.6 Discrete Parameter, Continuous Observation.
L14.7 Continuous Parameter, Continuous Observation.
L14.8 Inferring the Unknown Bias of a Coin and the Beta Distribution.
L14.9 Inferring the Unknown Bias of a Coin - Point Estimates.
L14.10 Summary.
S14.1 The Beta Formula.
L15.1 Lecture Overview.
L15.2 Recognizing Normal PDFs.
L15.3 Estimating a Normal Random Variable in the Presence of Additive Noise.
L15.4 The Case of Multiple Observations.
L15.5 The Mean Squared Error.
L15.6 Multiple Parameters; Trajectory Estimation.
L15.7 Linear Normal Models.
L15.8 Trajectory Estimation Illustration.
L16.1 Lecture Overview.
L16.2 LMS Estimation in the Absence of Observations.
L16.3 LMS Estimation of One Random Variable Based on Another.
L16.4 LMS Performance Evaluation.
L16.5 Example: The LMS Estimate.
L16.6 Example Continued: LMS Performance Evaluation.
L16.7 LMS Estimation with Multiple Observations or Unknowns.
L16.8 Properties of the LMS Estimation Error.
L17.1 Lecture Overview.
L17.2 LLMS Formulation.
L17.3 Solution to the LLMS Problem.
L17.4 Remarks on the LLMS Solution and on the Error Variance.
L17.5 LLMS Example.
L17.6 LLMS for Inferring the Parameter of a Coin.
L17.7 LLMS with Multiple Observations.
L17.8 The Simplest LLMS Example with Multiple Observations.
L17.9 The Representation of the Data Matters in LLMS.
L18.1 Lecture Overview.
L18.2 The Markov Inequality.
L18.3 The Chebyshev Inequality.
L18.4 The Weak Law of Large Numbers.
L18.5 Polling.
L18.6 Convergence in Probability.
L18.7 Convergence in Probability Examples.
L18.8 Related Topics.
S18.1 Convergence in Probability of the Sum of Two Random Variables.
S18.2 Jensen's Inequality.
S18.3 Hoeffding's Inequality.
L19.1 Lecture Overview.
L19.2 The Central Limit Theorem.
L19.3 Discussion of the CLT.
L19.4 Illustration of the CLT.
L19.5 CLT Examples.
L19.6 Normal Approximation to the Binomial.
L19.7 Polling Revisited.
L20.1 Lecture Overview.
L20.2 Overview of the Classical Statistical Framework.
L20.3 The Sample Mean and Some Terminology.
L20.4 On the Mean Squared Error of an Estimator.
L20.5 Confidence Intervals.
L20.6 Confidence Intervals for the Estimation of the Mean.
L20.7 Confidence Intervals for the Mean, When the Variance is Unknown.
L20.8 Other Natural Estimators.
L20.9 Maximum Likelihood Estimation.
L20.10 Maximum Likelihood Estimation Examples.
L21.1 Lecture Overview.
L21.2 The Bernoulli Process.
L21.3 Stochastic Processes.
L21.4 Review of Known Properties of the Bernoulli Process.
L21.5 The Fresh Start Property.
L21.6 Example: The Distribution of a Busy Period.
L21.7 The Time of the K-th Arrival.
L21.8 Merging of Bernoulli Processes.
L21.9 Splitting a Bernoulli Process.
L21.10 The Poisson Approximation to the Binomial.
L22.1 Lecture Overview.
L22.2 Definition of the Poisson Process.
L22.3 Applications of the Poisson Process.
L22.4 The Poisson PMF for the Number of Arrivals.
L22.5 The Mean and Variance of the Number of Arrivals.
L22.6 A Simple Example.
L22.7 Time of the K-th Arrival.
L22.8 The Fresh Start Property and Its Implications.
L22.9 Summary of Results.
L22.10 An Example.
L23.1 Lecture Overview.
L23.2 The Sum of Independent Poisson Random Variables.
L23.3 Merging Independent Poisson Processes.
L23.4 Where is an Arrival of the Merged Process Coming From?
L23.5 The Time Until the First (or last) Lightbulb Burns Out.
L23.6 Splitting a Poisson Process.
L23.7 Random Incidence in the Poisson Process.
L23.8 Random Incidence in a Non-Poisson Process.
L23.9 Different Sampling Methods can Give Different Results.
S23.1 Poisson Versus Normal Approximations to the Binomial.
S23.2 Poisson Arrivals During an Exponential Interval.
L24.1 Lecture Overview.
L24.2 Introduction to Markov Processes.
L24.3 Checkout Counter Example.
L24.4 Discrete-Time Finite-State Markov Chains.
L24.5 N-Step Transition Probabilities.
L24.6 A Numerical Example - Part I.
L24.7 Generic Convergence Questions.
L24.8 Recurrent and Transient States.
L25.1 Brief Introduction (RES.6-012 Introduction to Probability).
L25.2 Lecture Overview.
L25.3 Markov Chain Review.
L25.4 The Probability of a Path.
L25.5 Recurrent and Transient States: Review.
L25.6 Periodic States.
L25.7 Steady-State Probabilities and Convergence.
L25.8 A Numerical Example - Part II.
L25.9 Visit Frequency Interpretation of Steady-State Probabilities.
L25.10 Birth-Death Processes - Part I.
L25.11 Birth-Death Processes - Part II.
L26.1 Brief Introduction (RES.6-012 Introduction to Probability).
L26.2 Lecture Overview.
L26.3 Review of Steady-State Behavior.
L26.4 A Numerical Example - Part III.
L26.5 Design of a Phone System.
L26.6 Absorption Probabilities.
L26.7 Expected Time to Absorption.
L26.8 Mean First Passage Time.
L26.9 Gambler's Ruin.

Taught by

MIT OpenCourseWare
