This course introduces students to the modeling, quantification, and analysis of uncertainty. The tools of probability theory, and of the related field of statistical inference, are the keys for being able to analyze and make sense of data. These tools underlie important advances in many fields, from the basic sciences to engineering and management.
This course has been designed for independent study. It provides everything you will need to understand the concepts covered in the course. The materials include:
Lecture Videos by MIT Professor John Tsitsiklis
Lecture Slides and Readings
Recitation Problems and Solutions
Recitation Help Videos by MIT Teaching Assistants
Tutorial Problems and Solutions
Tutorial Help Videos by MIT Teaching Assistants
Problem Sets with Solutions
Exams with Solutions
A complementary resource, Introduction to Probability, is provided by the videos developed for an edX version of 6.041. These videos cover roughly the same content as the videotaped live lectures, in a somewhat different order and in somewhat more detail.
The course covers the following lecture topics, each listed with its associated recitation and tutorial problems:

1. Probability Models and Axioms. The Probability of the Difference of Two Events. Geniuses and Chocolates. Uniform Probabilities on a Square.
2. Conditioning and Bayes' Rule. A Coin Tossing Puzzle. Conditional Probability Example. The Monty Hall Problem.
3. Independence. A Random Walker. Communication over a Noisy Channel. Network Reliability. A Chess Tournament Problem.
4. Counting. Rooks on a Chessboard. Hypergeometric Probabilities.
5. Discrete Random Variables I. Sampling People on Buses. PMF of a Function of a Random Variable.
6. Discrete Random Variables II. Flipping a Coin a Random Number of Times. Joint Probability Mass Function (PMF) Drill 1. The Coupon Collector Problem.
7. Discrete Random Variables III. Joint Probability Mass Function (PMF) Drill 2.
8. Continuous Random Variables. Calculating a Cumulative Distribution Function (CDF). A Mixed Distribution Example. Mean & Variance of the Exponential. Normal Probability Calculation.
9. Multiple Continuous Random Variables. Uniform Probabilities on a Triangle. Probability that Three Pieces Form a Triangle. The Absent Minded Professor.
10. Continuous Bayes' Rule; Derived Distributions. Inferring a Discrete Random Variable from a Continuous Measurement. Inferring a Continuous Random Variable from a Discrete Measurement. A Derived Distribution Example. The Probability Distribution Function (PDF) of [X]. Ambulance Travel Time.
11. Derived Distributions (ctd.); Covariance. The Difference of Two Independent Exponential Random Variables. The Sum of Discrete and Continuous Random Variables.
12. Iterated Expectations. The Variance in the Stick Breaking Problem. Widgets and Crates. Using the Conditional Expectation and Variance. A Random Number of Coin Flips. A Coin with Random Bias.
13. Bernoulli Process. Bernoulli Process Practice.
14. Poisson Process I. Competing Exponentials.
15. Poisson Process II. Random Incidence Under Erlang Arrivals.
16. Markov Chains I. Setting Up a Markov Chain. Markov Chain Practice 1.
17. Markov Chains II.
18. Markov Chains III. Mean First Passage and Recurrence Times.
19. Weak Law of Large Numbers. Convergence in Probability and in the Mean Part 1. Convergence in Probability and in the Mean Part 2. Convergence in Probability Example.
20. Central Limit Theorem. Probability Bounds. Using the Central Limit Theorem.
21. Bayesian Statistical Inference I.
22. Bayesian Statistical Inference II. Inferring a Parameter of Uniform Part 1. Inferring a Parameter of Uniform Part 2. An Inference Example.
23. Classical Statistical Inference I.
24. Classical Inference II.
25. Classical Inference III.
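As a taste of the kind of problem covered early in the course (the Monty Hall Problem under topic 2), here is a minimal Monte Carlo sketch, not part of the official course materials, that estimates the win probabilities for the "switch" and "stay" strategies by simulation. The function names and trial count are illustrative choices, not anything from the course.

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one round of the Monty Hall game; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def win_rate(switch: bool, trials: int = 100_000) -> float:
    """Estimate the win probability for a given strategy by simulation."""
    return sum(monty_hall_trial(switch) for _ in range(trials)) / trials

# Switching wins about 2/3 of the time; staying wins only about 1/3,
# matching the conditional-probability analysis done in the course.
```

The simulation agrees with the Bayes' rule argument: the initial pick is right with probability 1/3, so switching succeeds exactly when the initial pick was wrong, i.e. with probability 2/3.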