Inference with Torch-TensorRT Deep Learning Prediction for Beginners - CPU vs CUDA vs TensorRT

Python Simplified via YouTube

Class Central Classrooms beta

YouTube playlists curated by Class Central.

Classroom Contents


  1. Intro
  2. Clone Torch-TensorRT
  3. Install and set up Docker
  4. Install NVIDIA Container Toolkit & NVIDIA Docker 2
  5. Torch-TensorRT container, option #1
  6. Torch-TensorRT NVIDIA NGC container, option #2
  7. Import PyTorch
  8. Load ResNet50
  9. Load sample image
  10. Sample image transforms
  11. Batch size
  12. Prediction with ResNet50
  13. Softmax function
  14. ImageNet class number to name mapping
  15. Predict top 5 classes of sample image (topk)
  16. Speed test benchmark function
  17. CPU benchmarks
  18. CUDA benchmarks
  19. Trace model
  20. Convert traced model into a Torch-TensorRT model
  21. TensorRT benchmarks
  22. Download Jupyter Notebook
  23. HOW DID I MISS THIS???
  24. Thanks for watching!
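The benchmark chapters above (speed test function, then CPU, CUDA, and TensorRT runs) all rely on the same idea: warm the model up, time repeated inference calls, and compare mean latencies. Below is a minimal sketch of such a timing helper. The function name `benchmark` and the `warmup`/`runs` parameters are illustrative assumptions, not the video's exact code; the real notebook times `model(input_batch)` with PyTorch (calling `torch.cuda.synchronize()` after each CUDA run), while this sketch uses a cheap stand-in workload so it runs anywhere.

```python
import time
import statistics

def benchmark(fn, warmup=3, runs=10):
    """Warm up a callable, then return its mean latency in milliseconds.

    In the tutorial, `fn` would wrap one forward pass of ResNet50 on
    CPU, on CUDA, or through the Torch-TensorRT-compiled model; for
    GPU timing you must also synchronize after each call so the timer
    measures the actual kernel work.
    """
    for _ in range(warmup):          # warm-up runs are discarded
        fn()
    timings = []
    for _ in range(runs):            # timed runs
        start = time.perf_counter()
        fn()
        timings.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(timings)

# Stand-in workload; the notebook benchmarks ResNet50 inference instead.
mean_ms = benchmark(lambda: sum(i * i for i in range(10_000)))
print(f"mean latency: {mean_ms:.3f} ms")
```

Comparing the numbers this helper prints for the CPU model, the CUDA model, and the traced-then-converted Torch-TensorRT model is exactly the CPU vs CUDA vs TensorRT comparison the video's title promises.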
