Overview
This course covers the fundamentals of transformers and BERT in natural language processing (NLP). By the end of the course, learners will understand how transformers and BERT work and where they are applied. The course includes hands-on activities in PyTorch that demonstrate these models in practice. It is intended for anyone interested in NLP, machine learning, and deep learning.
Syllabus
Introduction
Sequence-to-Sequence Models
Introduction to Transformers in NLP
Text Summarization using PyTorch
Introduction to BERT
Use Case for BERT
Summary
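The transformer topics listed above all build on scaled dot-product attention. As a taste of what the hands-on sessions cover, here is a minimal sketch of that operation; NumPy is used for brevity, though the course itself works in PyTorch, and the matrix sizes are illustrative assumptions only.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core attention step inside a transformer layer."""
    d_k = Q.shape[-1]
    # Similarity of each query position to each key position,
    # scaled by sqrt(d_k) to keep scores in a stable range.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key dimension turns scores into weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ V

# Illustrative shapes: 3 positions, embedding dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

When all attention scores are equal (e.g. zero queries and keys), the softmax weights become uniform and the output reduces to the plain average of the value vectors, which is a handy sanity check.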
Taught by
Great Learning