Classical Machine Learning refers to well-established techniques for making inferences from data. This course introduces a systematic approach (the “Recipe for Machine Learning”) and the tools needed to accomplish this task. In addition to the models and algorithms typically taught (e.g., Linear and Logistic Regression), the course emphasizes the whole life cycle of the process, from dataset acquisition and cleaning to error analysis, all in the service of an iterative process for improving inference.
Our belief is that Machine Learning is an experimental process, and thus most learning will be achieved by “doing”. We will jump-start your experimentation: engineering first, then math. Early lectures are a “sprint” to get you programming and experimenting; we will subsequently revisit the same topics with greater mathematical rigor.
Week 1: Classical Machine Learning: Overview
What is Machine Learning (ML)?
ML and Finance, not ML for Finance
Classical Machine Learning: Introduction
Our first predictor
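As a taste of “Our first predictor”, the sketch below fits a simple classifier end to end. It assumes scikit-learn and uses a toy dataset (Iris) and model (k-Nearest Neighbors) chosen purely for illustration; the course's actual first predictor may differ.

```python
# A minimal "first predictor": fit a k-Nearest Neighbors classifier on a toy dataset.
# Illustrative sketch only; dataset and model choice are assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)                            # train on the training split
print("Test accuracy:", model.score(X_test, y_test))   # evaluate out of sample
```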
Week 2: Linear Regression, the Recipe for Machine Learning
The Recipe for Machine Learning
The Regression Loss Function
Bias and Variance
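The following sketch ties the Week 2 topics together: it fits a linear regression and evaluates it with the mean squared error, the standard regression loss. The synthetic data and scikit-learn usage are illustrative assumptions, not the course's materials.

```python
# Linear regression scored with mean squared error (MSE), the regression loss.
# Sketch only: the data is synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=1.0, size=100)   # y = 3x + 2 + noise

model = LinearRegression().fit(X, y)
y_hat = model.predict(X)
print("Fitted slope, intercept:", model.coef_[0], model.intercept_)
print("Training MSE:", mean_squared_error(y, y_hat))        # average squared residual
```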
Week 3: Transformations, Classification
Data Transformations: Introduction and mechanics
Non-numeric variables: text, images
The Classification Loss Function
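A rough sketch of Week 3's pipeline idea: transform raw columns (scaling a numeric feature, one-hot encoding a categorical one), feed them to a classifier, and score it with the log loss, a common classification loss. The tiny hand-made DataFrame and the particular scikit-learn components are assumptions for illustration only.

```python
# A transformation pipeline feeding a classifier, scored with the log loss.
# Sketch under assumed data: one numeric and one categorical feature.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "income": [30, 60, 45, 80, 20, 90],
    "sector": ["tech", "finance", "tech", "finance", "retail", "tech"],
    "default": [1, 0, 1, 0, 1, 0],
})
X, y = df[["income", "sector"]], df["default"]

pre = ColumnTransformer([
    ("scale", StandardScaler(), ["income"]),    # scale the numeric feature
    ("onehot", OneHotEncoder(), ["sector"]),    # encode the categorical feature
])
clf = Pipeline([("prep", pre), ("model", LogisticRegression())]).fit(X, y)
print("Log loss:", log_loss(y, clf.predict_proba(X)))   # classification loss on training data
```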
Week 4: Classification (continued), Error Analysis
The Dummy Variable Trap
Loss functions: mathematics
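To make the Dummy Variable Trap concrete: the one-hot columns for a categorical feature always sum to one, which is perfectly collinear with a model's intercept; dropping one level removes the redundancy. A minimal sketch, assuming scikit-learn and made-up category labels:

```python
# The dummy variable trap: full one-hot encoding is collinear with the intercept.
# Dropping one level makes that level the implicit baseline.
import numpy as np
from sklearn.preprocessing import OneHotEncoder

sectors = np.array([["tech"], ["finance"], ["retail"], ["tech"]])

full = OneHotEncoder().fit_transform(sectors).toarray()
dropped = OneHotEncoder(drop="first").fit_transform(sectors).toarray()

print(full)      # 3 columns; each row sums to 1 (collinear with an intercept column)
print(dropped)   # 2 columns; the dropped level becomes the baseline
```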
Week 5: More Models: Trees, Forests, Naive Bayes
Entropy, Cross Entropy, KL Divergence
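The Week 5 information-theoretic quantities can be computed directly. A minimal sketch, with two arbitrary example distributions p and q chosen purely for illustration:

```python
# Entropy, cross-entropy, and KL divergence for two discrete distributions.
import numpy as np

p = np.array([0.7, 0.2, 0.1])   # "true" distribution
q = np.array([0.5, 0.3, 0.2])   # model / approximating distribution

entropy = -np.sum(p * np.log(p))         # H(p)
cross_entropy = -np.sum(p * np.log(q))   # H(p, q)
kl = np.sum(p * np.log(p / q))           # KL(p || q) = H(p, q) - H(p)

print(f"H(p)     = {entropy:.4f}")
print(f"H(p, q)  = {cross_entropy:.4f}")
print(f"KL(p||q) = {kl:.4f}  (check: H(p, q) - H(p) = {cross_entropy - entropy:.4f})")
```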
Week 6: Support Vector Machines, Gradient Descent, Interpretation
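As a preview of gradient descent, the sketch below minimizes the MSE of a one-feature linear model by repeatedly stepping against the gradient. The learning rate, iteration count, and synthetic data are illustrative assumptions, not course settings.

```python
# Plain (batch) gradient descent minimizing MSE for a one-feature linear model.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=200)
y = 4.0 * x - 1.0 + rng.normal(scale=0.1, size=200)   # y = 4x - 1 + noise

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    y_hat = w * x + b
    grad_w = 2 * np.mean((y_hat - y) * x)   # d(MSE)/dw
    grad_b = 2 * np.mean(y_hat - y)         # d(MSE)/db
    w -= lr * grad_w
    b -= lr * grad_b

print("Recovered slope and intercept:", w, b)   # should approach 4 and -1
```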