Overview
This course aims to help learners understand the internals and technology behind large language models through prompt engineering with ChatGPT. By the end of the course, students will be able to grasp concepts such as the Transformer architecture, keyword generation, text embeddings, encoders and decoders, self-attention, multi-head self-attention, scaled dot-product attention, and other key deep learning methods. The teaching method is a video presentation with a detailed timeline covering topics related to large language models. This course is intended for individuals interested in artificial intelligence, deep learning, natural language processing, and the workings of large language models like ChatGPT and DALL-E.
Syllabus
Content Intro
ChatGPT
Transformer architecture
Keywords Generation
Text embedding
Encoder and Decoder
Self-attention
Multi-head self-attention
PyTorch Code: Multi-head self-attention
Scaled Dot-Product Attention
Key deep learning methods
Large language models
LLM Parameter Count Python Code
DALL-E large language model
Key Differences DALL-E & ChatGPT
List all Prompts
ChatGPT Session Summary
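The attention topics in the syllabus above (scaled dot-product attention and multi-head self-attention) can be sketched compactly. The course itself walks through PyTorch code; the following is a minimal standalone NumPy sketch of the same equations, where all function names, weight matrices, and dimensions are illustrative rather than taken from the course materials:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # rows sum to 1
    return weights @ V, weights

def multi_head_attention(X, W_q, W_k, W_v, W_o, n_heads):
    # project X into per-head query/key/value sub-spaces,
    # attend in each head independently, then concatenate and project
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    Q = (X @ W_q).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    K = (X @ W_k).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    V = (X @ W_v).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    out, _ = scaled_dot_product_attention(Q, K, V)
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ W_o

# illustrative dimensions: 4 tokens, model width 8, 2 heads
rng = np.random.default_rng(0)
seq_len, d_model, n_heads = 4, 8, 2
X = rng.standard_normal((seq_len, d_model))
W_q, W_k, W_v, W_o = (rng.standard_normal((d_model, d_model)) for _ in range(4))
Y = multi_head_attention(X, W_q, W_k, W_v, W_o, n_heads)
print(Y.shape)  # (4, 8): one d_model-sized output vector per token
```

Splitting `d_model` across heads keeps the total computation comparable to single-head attention while letting each head learn a different attention pattern, which is the design choice the multi-head sections of the course cover.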
Taught by
Prodramp