
Long Short-Term Memory with PyTorch + Lightning

StatQuest with Josh Starmer via YouTube

Overview

Learn how to code a Long Short-Term Memory (LSTM) unit from scratch and train it, and how to use PyTorch's built-in nn.LSTM module. Discover two useful tricks Lightning offers: adding more training epochs without restarting from scratch, and visualizing training results efficiently. The course covers creating and initializing tensors, the LSTM math, configuring optimizers, calculating the loss, and evaluating training with TensorBoard. Intended for anyone interested in deep learning, PyTorch, and LSTM networks.
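As a taste of what the nn.LSTM portion covers, here is a minimal sketch of calling PyTorch's nn.LSTM on a short sequence. The shapes and values here are illustrative assumptions, not the course's exact code:

```python
import torch
import torch.nn as nn

# One LSTM layer: 1 input feature per time step, 1 hidden unit
# (illustrative sizes, not necessarily those used in the course).
lstm = nn.LSTM(input_size=1, hidden_size=1)

# Input shape expected by nn.LSTM (by default): (seq_len, batch, input_size).
# Here: a sequence of 4 time steps, batch of 1, 1 feature each.
x = torch.tensor([[[0.0]], [[0.5]], [[0.25]], [[1.0]]])

output, (h_n, c_n) = lstm(x)
print(output.shape)  # one hidden state per time step: (4, 1, 1)
print(h_n.shape)     # final short-term memory (hidden state): (1, 1, 1)
print(c_n.shape)     # final long-term memory (cell state): (1, 1, 1)
```

Because the weights are randomly initialized, the numbers themselves are meaningless until training; the point is the input/output shapes.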

Syllabus

Awesome song and introduction
Importing the modules
An outline of an LSTM class
init: Creating and initializing the tensors
lstm_unit: Doing the LSTM math
forward: Make a forward pass through an unrolled LSTM
configure_optimizers: Configure the...optimizers.
training_step: Calculate the loss and log progress
Using and training our homemade LSTM
Evaluating training with TensorBoard
Adding more epochs to training
Using and training PyTorch's nn.LSTM
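The lstm_unit step in the syllabus boils down to the standard LSTM gate equations. Here is a minimal plain-PyTorch sketch of that math using scalar weights, as an illustration of the gates rather than a copy of the course's code (the weight names and dict layout are assumptions):

```python
import torch

def lstm_unit(x, h, c, w):
    # One LSTM step with scalar weights (w is a hypothetical dict of
    # input-weight, hidden-weight, and bias per gate).
    f = torch.sigmoid(w["wf"] * x + w["uf"] * h + w["bf"])  # forget gate
    i = torch.sigmoid(w["wi"] * x + w["ui"] * h + w["bi"])  # input gate
    g = torch.tanh(w["wg"] * x + w["ug"] * h + w["bg"])     # candidate value
    c_new = f * c + i * g                                   # update long-term memory (cell state)
    o = torch.sigmoid(w["wo"] * x + w["uo"] * h + w["bo"])  # output gate
    h_new = o * torch.tanh(c_new)                           # update short-term memory (hidden state)
    return h_new, c_new

# Hypothetical weights, just to run one step; training would learn these.
w = {k: torch.tensor(1.0) for k in
     ["wf", "uf", "bf", "wi", "ui", "bi", "wg", "ug", "bg", "wo", "uo", "bo"]}
h, c = torch.tensor(0.0), torch.tensor(0.0)
h, c = lstm_unit(torch.tensor(1.0), h, c, w)
```

An unrolled LSTM, as in the forward step, simply applies lstm_unit once per time step, carrying h and c forward.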

Taught by

StatQuest with Josh Starmer

