
Pluralsight

How Neural Networks Learn: Exploring Architecture, Gradient Descent, and Backpropagation

via Pluralsight

Overview

So, you understand neural networks conceptually—what they are and generally how they work. But you might still be wondering about the details that actually make them work. In this course, How Neural Networks Learn: Exploring Architecture, Gradient Descent, and Backpropagation, you’ll gain an understanding of the details required to build and train a neural network. First, you’ll explore network architecture—made up of layers, nodes, and activation functions—and compare architecture types. Next, you’ll discover how neural networks adjust and learn, using backpropagation, gradient descent, loss functions, and learning rates. Finally, you’ll learn how to implement backpropagation and gradient descent using Python. When you’re finished with this course, you’ll have the skills and knowledge of neural network architectures and learning needed to build and train a neural network.
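
To give a flavor of those final topics, below is a minimal NumPy sketch of backpropagation and gradient descent. It is not taken from the course itself; the network size, learning rate, and iteration count are illustrative assumptions. A two-layer network is trained on the XOR problem by backpropagating a mean-squared-error loss through sigmoid activations and stepping each weight against its gradient.

# Minimal sketch (not course material): a two-layer network trained on XOR
# with hand-written backpropagation and gradient descent in NumPy.
# Layer sizes, learning rate, and iteration count are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Architecture: 2 inputs -> 4 hidden nodes (sigmoid) -> 1 output (sigmoid)
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 0.5
for step in range(10000):
    # Forward pass through both layers
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)

    # Mean squared error loss
    loss = np.mean((y_hat - y) ** 2)

    # Backpropagation: apply the chain rule from the loss back to each weight
    d_y_hat = 2 * (y_hat - y) / len(X)      # dL/dy_hat
    d_z2 = d_y_hat * y_hat * (1 - y_hat)    # through the output sigmoid
    d_W2 = h.T @ d_z2
    d_b2 = d_z2.sum(axis=0, keepdims=True)
    d_h = d_z2 @ W2.T
    d_z1 = d_h * h * (1 - h)                # through the hidden sigmoid
    d_W1 = X.T @ d_z1
    d_b1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient descent: move each parameter opposite its gradient
    W1 -= learning_rate * d_W1
    b1 -= learning_rate * d_b1
    W2 -= learning_rate * d_W2
    b2 -= learning_rate * d_b2

print("final loss:", loss)
print("predictions:", y_hat.round(3).ravel())

The same loop structure carries over to larger networks: a forward pass, a loss, gradients computed layer by layer in reverse, and a small learning-rate-scaled update to every parameter.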

Syllabus

  • Course Overview 1min
  • Network Architecture: Layers, Nodes, and Activation Functions 11mins
  • Backpropagation and Learning: How Neural Networks Adjust and Learn 19mins

Taught by

Amber Israelsen

Reviews

4.7 rating at Pluralsight based on 17 ratings

