In this course you will learn a whole lot of modern physics (classical and quantum) from basic computer programs that you will download, generalize, or write from scratch, discuss, and then hand in. Join in if you are curious (but not necessarily knowledgeable) about algorithms, and about the deep insights into science that you can obtain by the algorithmic approach.
Monte Carlo algorithms (Direct sampling, Markov-chain sampling)
Welcome to the first week of Statistical Mechanics: Algorithms and Computations!
Here are a few details about the structure of the course: for each week, lecture and tutorial videos will be presented, together with a downloadable copy of all the relevant Python programs mentioned in the videos. Some in-video questions and practice quizzes will help you to review the material, with no effect on the final grade. For Weeks 1 to 9, there is also a mandatory peer-graded assignment, which expands on the lectures' topics and lets you reach a deeper understanding. The nine peer-graded assignments will make up 50% of the grade, while the other half will come from a final exam after the last lecture.
In this first week, we will learn about algorithms by playing with a pebble on the Monte Carlo beach and at the Monaco heliport. In the tutorial we will use the 3x3 pebble game to understand the essential concepts of Monte Carlo techniques (detailed balance, irreducibility, and aperiodicity), and meet the celebrated Metropolis algorithm. Finally, the homework session will let you understand some useful aspects of Markov-chain Monte Carlo related to convergence and error estimation.
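To make the pebble picture concrete, here is a minimal sketch of the week's two sampling strategies applied to estimating pi. This is illustrative code, not one of the course's downloadable programs, and the function names are my own:

```python
import random

def direct_pi(n_trials, rng=random):
    """Direct sampling: throw pebbles uniformly into the square
    [-1, 1]^2 and count those landing inside the unit circle."""
    n_hits = 0
    for _ in range(n_trials):
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if x * x + y * y < 1.0:
            n_hits += 1
    return 4.0 * n_hits / n_trials

def markov_pi(n_trials, delta=0.3, rng=random):
    """Markov-chain sampling: the pebble takes small random steps;
    steps leaving the square are rejected and the pebble stays put."""
    x, y = 1.0, 1.0  # start in a corner, like at the Monaco heliport
    n_hits = 0
    for _ in range(n_trials):
        dx, dy = rng.uniform(-delta, delta), rng.uniform(-delta, delta)
        if abs(x + dx) < 1.0 and abs(y + dy) < 1.0:
            x, y = x + dx, y + dy
        if x * x + y * y < 1.0:
            n_hits += 1
    return 4.0 * n_hits / n_trials
```

Both estimators converge to pi, but the Markov chain's successive samples are correlated, so its error shrinks much more slowly — exactly the convergence and error-estimation issues studied in the homework.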
Hard disks: From Classical Mechanics to Statistical Mechanics
In Week 2, you will get in touch with the hard-disk model, which was first simulated by Molecular Dynamics in the 1950s. We will describe the difference between direct sampling and Markov-chain sampling, and also study the connection between Monte Carlo and Molecular Dynamics algorithms, that is, the interface between Newtonian mechanics and statistical mechanics. The tutorial covers classical concepts from statistical physics (partition function, virial expansion, ...), and the homework session will show that the equiprobability principle might be more subtle than expected.
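Markov-chain sampling for hard disks can be sketched as follows (a simplified illustration in the spirit of the lecture's programs, not the course code itself): one Metropolis move displaces a random disk a little and rejects the move on any overlap.

```python
import random

def markov_disks(disks, sigma, n_steps, delta, rng=random):
    """Markov-chain moves for hard disks of radius sigma in the unit
    box with hard walls: propose a small displacement of a random disk
    and reject it on any disk-disk or disk-wall overlap."""
    for _ in range(n_steps):
        a = rng.choice(disks)
        bx = a[0] + rng.uniform(-delta, delta)
        by = a[1] + rng.uniform(-delta, delta)
        if not (sigma < bx < 1.0 - sigma and sigma < by < 1.0 - sigma):
            continue  # disk would stick out of the box
        if any((bx - c[0]) ** 2 + (by - c[1]) ** 2 < 4.0 * sigma ** 2
               for c in disks if c is not a):
            continue  # disk would overlap another disk
        a[0], a[1] = bx, by
    return disks
```

Started from any legal configuration (e.g. four disks on a square grid), repeated moves sample the uniform distribution over all legal configurations — the same distribution a Molecular Dynamics trajectory visits in time.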
Entropic interactions and phase transitions
After the hard disks of Week 2, in Week 3 we switch to clothes-pins aligned on a washing line. This is a great model for learning about entropic interactions, which arise from statistical-mechanics considerations alone. In the tutorial you will see an example of a typical situation: having an exact solution often corresponds to finding a perfect algorithm to sample configurations. Finally, in the homework session we will go back to hard disks, and find simple evidence of the transition between a liquid and a solid in a two-dimensional system.
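For pins on a line, such a perfect sampling algorithm can be sketched in a few lines (a hypothetical `direct_pins` helper, assuming pins of radius sigma whose centers lie on a segment of length L): shrink each pin to a point, sample the points uniformly, sort them, and re-inflate.

```python
import random

def direct_pins(N, sigma, L, rng=random):
    """Rejection-free direct sampling of N non-overlapping pins of
    radius sigma with centers on [sigma, L - sigma]: sample shrunken
    positions uniformly, sort them, then shift the k-th pin by the
    total width of the k pins to its left."""
    if L < 2.0 * N * sigma:
        raise ValueError("the pins do not fit on the line")
    y = sorted(rng.uniform(sigma, L - sigma - 2.0 * sigma * (N - 1))
               for _ in range(N))
    return [y[k] + 2.0 * sigma * k for k in range(N)]
```

Every call returns an independent legal configuration with the correct statistical weight — no Markov chain and no rejections are needed.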
Sampling and integration
In Week 4 we will deepen our understanding of sampling and its connection with integration, which will allow us to introduce another pillar of statistical mechanics (after the equiprobability principle): the Maxwell and Boltzmann distributions of velocities and energies. In the homework session, we will push the limits of sampling until we can compute the volume of a sphere... in 200 dimensions!
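The direct-sampling approach to this integral can be sketched in low dimension (illustrative code, not the course's program):

```python
import random

def sphere_volume_fraction(d, n_trials, rng=random):
    """Direct sampling: fraction of points, drawn uniformly from the
    cube [-1, 1]^d, that fall inside the unit sphere. Multiplying by
    the cube volume 2^d gives an estimate of the sphere volume."""
    n_hits = 0
    for _ in range(n_trials):
        if sum(rng.uniform(-1.0, 1.0) ** 2 for _ in range(d)) < 1.0:
            n_hits += 1
    return n_hits / n_trials
```

In d = 4 the exact sphere volume is pi^2/2, so the fraction comes out near pi^2/32 ≈ 0.308. In 200 dimensions, however, essentially no sample ever lands inside the sphere — which is exactly why the homework needs cleverer sampling strategies.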
Density matrices and Path integrals (Quantum Statistical mechanics 1/3)
Week 5 is the first episode of a three-week journey through quantum statistical mechanics. We will start by learning about density matrices and path integrals, fascinating tools for studying quantum systems. In many cases, the Trotter approximation will be useful to treat non-trivial systems, and also to follow the time evolution of a system. All these topics, including the matrix-squaring technique, will be reviewed in detail in the homework session, where you will also study the anharmonic potential.
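Matrix squaring can be sketched for the harmonic oscillator (a simplified illustration with hbar = m = 1 and V(x) = x^2/2; the grid and the starting temperature are arbitrary choices of mine): start from the Trotter approximation of the density matrix at high temperature, then square repeatedly, each squaring doubling beta.

```python
import math

def rho_trotter(x, xp, beta):
    """High-temperature (Trotter) approximation of the density matrix
    for the harmonic oscillator, V(x) = x^2 / 2 (hbar = m = 1)."""
    free = math.exp(-(x - xp) ** 2 / (2.0 * beta)) / math.sqrt(2.0 * math.pi * beta)
    return free * math.exp(-beta * (x ** 2 + xp ** 2) / 4.0)

def matrix_square(rho, dx):
    """One squaring: rho(x, x', 2 beta) = int dx'' rho(x, x'', beta) rho(x'', x', beta),
    with the integral discretized on the grid of spacing dx."""
    n = len(rho)
    return [[dx * sum(rho[i][k] * rho[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

# Trotter density matrix at beta = 1/16 on a grid, squared six times -> beta = 4.
grid = [-4.0 + 0.1 * i for i in range(81)]
rho = [[rho_trotter(x, xp, 0.0625) for xp in grid] for x in grid]
for _ in range(6):
    rho = matrix_square(rho, 0.1)
Z = 0.1 * sum(rho[i][i] for i in range(81))  # partition function at beta = 4
```

The trace gives the partition function, which can be checked against the exact harmonic-oscillator result Z = 1 / (2 sinh(beta / 2)).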
Note that previous knowledge of quantum mechanics is not really necessary to get through the next three weeks. Follow us in our journey through algorithms and physics, and don't forget to ask on the forum if you have any doubts!
Lévy Quantum Paths (Quantum Statistical mechanics 2/3)
In Week 6, the second quantum week, we will introduce the properties of bosons, indistinguishable particles with peculiar statistics. At the same time, we will go further by learning a powerful sampling algorithm, the Lévy construction, and in the homework session you will thoroughly compare it with standard sampling techniques.
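The Lévy construction can be sketched for a free particle (illustrative code with hbar = m = 1): a path between fixed endpoints is built slice by slice, each intermediate point drawn from its exact Gaussian conditional distribution.

```python
import math
import random

def levy_free_path(x_start, x_end, beta, N, rng=random):
    """Levy construction: sample a free-particle path from x_start to
    x_end over imaginary time beta, cut into N slices of width
    beta / N. Each intermediate point is drawn from its exact Gaussian
    conditional, given the previous point and the endpoint."""
    dtau = beta / N
    path = [x_start]
    for k in range(1, N):
        tau_rest = (N - k) * dtau  # imaginary time left to reach x_end
        mean = (tau_rest * path[k - 1] + dtau * x_end) / (tau_rest + dtau)
        sigma = math.sqrt(dtau * tau_rest / (dtau + tau_rest))
        path.append(rng.gauss(mean, sigma))
    path.append(x_end)
    return path
```

Because every point is drawn from the correct distribution, the construction never rejects a path — the key advantage over naive local Metropolis moves that you will quantify in the homework.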
Bose-Einstein condensation (Quantum Statistical mechanics 3/3)
At the end of our quantum journey, in Week 7, we discuss the Bose-Einstein condensation phenomenon, theoretically predicted in the 1920s and observed in the 1990s in experiments with ultracold atoms. In the path-integral framework, an elegant description of this phenomenon is in terms of permutation cycles, which also lead to a great sampling algorithm, to be discussed in the homework session.
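The cycle picture yields, for ideal bosons, a simple recursion for the N-particle partition function. A sketch, assuming for illustration particles in a 3D harmonic trap with unit frequency (hbar = 1):

```python
import math

def z_single(k, beta):
    """Single-particle partition function of the 3D harmonic trap at
    inverse temperature k * beta (assumed model for illustration)."""
    return (math.exp(-k * beta / 2.0) / (1.0 - math.exp(-k * beta))) ** 3

def boson_partition(N, beta):
    """Recursion for ideal bosons: Z_N = (1/N) sum_{k=1}^{N} z(k beta) Z_{N-k},
    summing over the length k of the permutation cycle that contains a
    given particle (with Z_0 = 1). Returns the list [Z_0, ..., Z_N]."""
    Z = [1.0]
    for n in range(1, N + 1):
        Z.append(sum(z_single(k, beta) * Z[n - k]
                     for k in range(1, n + 1)) / n)
    return Z
```

The same weights z(k beta) Z_{N-k} that enter the recursion give the probability of each cycle length, and sampling cycle lengths from them is the starting point of the algorithm discussed in the homework session.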
Ising model - Enumerations and Monte Carlo algorithms
In Week 8 we come back to classical physics, and in particular to the Ising model, which captures the essential physics of a set of magnetic spins. It is also a fundamental model for the development of sampling algorithms, and we will see different approaches at work: a local algorithm, the very efficient cluster algorithms, and the heat-bath algorithm with its connection to coupling. All of these will be revisited in the homework session, where you will gain precise control over the transition between ordered and disordered states.
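The local algorithm can be sketched as a standard single-spin-flip Metropolis program (an illustrative version, with coupling J = 1 and k_B = 1):

```python
import math
import random

def metropolis_ising(L, T, n_sweeps, rng=random):
    """Local Metropolis algorithm for the 2D Ising model on an L x L
    lattice with periodic boundary conditions (J = 1, k_B = 1)."""
    spins = [[1] * L for _ in range(L)]  # start from the all-up state
    for _ in range(n_sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        h = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
             + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * spins[i][j] * h  # energy change of flipping spin (i, j)
        if dE <= 0.0 or rng.random() < math.exp(-dE / T):
            spins[i][j] *= -1
    return spins
```

Near the critical temperature this local algorithm slows down dramatically (critical slowing down); the cluster algorithms of the lecture avoid this by flipping whole clusters of spins at once.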
Dynamic Monte Carlo, simulated annealing
Continuing with simple models for spins, in Week 9 we start by learning about a dynamic Monte Carlo algorithm that runs faster than the clock. This is easily devised for a single-spin system, and can also be generalized to the full Ising model of Week 8. In the tutorial we move on to the simulated-annealing technique, a physics-inspired optimization method with very broad applicability. You will revisit this in the homework session, and apply it to the sphere-packing and traveling-salesman problems.
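The faster-than-clock idea can be sketched for a single spin in a magnetic field (illustrative code, with field h = 1 and k_B = 1): rather than simulating many rejected flip attempts one time step at a time, draw the waiting time until the next accepted flip from the geometric distribution in one go.

```python
import math
import random

def dynamic_single_spin(T, n_flips, rng=random):
    """Faster-than-clock dynamics for one spin in a field h = 1: in the
    low-energy state, the waiting time until the next accepted flip is
    geometrically distributed, so it can be sampled directly instead of
    attempting (and mostly rejecting) one flip per time step."""
    sigma, t = 1, 0  # start in the low-energy state at time t = 0
    history = []
    for _ in range(n_flips):
        if sigma == 1:
            p = math.exp(-2.0 / T)  # Metropolis acceptance of an uphill flip
            u = 1.0 - rng.random()  # uniform in (0, 1]
            wait = int(math.log(u) / math.log(1.0 - p)) + 1
        else:
            wait = 1  # downhill flips are always accepted
        t += wait
        sigma = -sigma
        history.append((t, sigma))
    return history
```

At low temperature the acceptance probability p is tiny and the mean waiting time 1/p is huge, so this rejection-free bookkeeping outruns the naive simulation by an arbitrarily large factor.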
The Alpha and the Omega of Monte Carlo, Review, Party
The lecture of Week 10 includes the alpha and the omega of our course. First we repeat the experiment of Buffon's needle, already performed in the 18th century, and then we touch on the sophisticated theory of Lévy stable distributions and their connection with the central limit theorem. In the tutorial there will be time for a review of the entire course material, and then a little party is due, to celebrate the end of the course!
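Buffon's experiment is easy to repeat on a computer (a naive sketch; note that sampling the needle's angle uniformly already presupposes the value of pi, a subtlety the lecture addresses):

```python
import math
import random

def buffon(n_trials, a=1.0, b=1.0, rng=random):
    """Buffon's needle: drop a needle of length a on a floor with
    parallel cracks a distance b apart (a <= b). The probability of
    crossing a crack is 2a / (pi b), so counting hits estimates pi."""
    n_hits = 0
    for _ in range(n_trials):
        x_center = rng.uniform(0.0, b / 2.0)   # center's distance to nearest crack
        phi = rng.uniform(0.0, math.pi / 2.0)  # angle between needle and cracks
        if x_center < (a / 2.0) * math.sin(phi):
            n_hits += 1
    return n_hits / n_trials
```

With a = b, the hit fraction converges to 2/pi, and pi is estimated as 2 divided by that fraction.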
(There is no homework session for Week 10, but don't forget that the final exam is still there!)
Anonymous completed this course.
Excellent course overall.
Friendly and carefully crafted lectures, approx 1h / week. Expect to spend time on homework, but it's clearly worth it as it goes way beyond ticking boxes and provides deeper understanding (and fun). This sets the bar higher but the reward is much more significant.
All in all the level was quite high, audience was great in the forums, teaching staff present and responsive during the course. Minor glitches here and there including delays for certificates but nothing that affected the substance of it.
Jiting Tian completed this course, spending 10 hours a week on it and found the course difficulty to be medium.
This is a graduate- or advanced-undergraduate-level class on statistical physics, focusing on the computational tools (MC and MD). The materials are organized very well and the concepts are illustrated in a clear way. A lot of Python examples are provided to help students master the contents. The homework and exam are not hard, as most of the code is already provided by the teachers, and students only need to fill in the blanks or make small changes. It's not difficult to go through this course and pass the exam, but it's truly difficult to deeply understand all the materials. Still, for those who love statistical mechanics, this course deserves your efforts.
Mauro Lacy completed this course, spending 12 hours a week on it and found the course difficulty to be hard.
If you want to learn some Statistical Physics from one of the masters, and end up with some working and useful code, plus a lot of questions, more things to learn, and tons of problems to solve, go for it. It's amazing how W. Krauth moves back and forth between the mathematical notation and the physical notions and clear, concise algorithms, always taking into account and explaining the subtleties of the process.
The best course I've taken. Simply excellent.