
Building AI Applications Locally with Ollama - From Setup to Advanced Projects

via freeCodeCamp

Overview

Master building powerful AI applications locally in this comprehensive 3-hour course covering the essential aspects of local Large Language Model deployment and integration. Starting with fundamental setup and model management, progress through hands-on exercises pulling and customizing models, implementing REST APIs, and creating Python integrations. Develop practical skills through real-world projects, including a Grocery List Organizer, a RAG (Retrieval-Augmented Generation) system, and an AI Recruiter Agency. Explore advanced topics like LLM parameters, model benchmarks, CLI commands, and multimodal capabilities with the LLaVA model. Gain expertise in vectorstore implementations, embeddings, and document ingestion while building a complete PDF RAG system. Access included resources like code templates, cheat sheets, and prompt guides to accelerate development. Perfect for developers and AI enthusiasts seeking to create sophisticated AI applications with local processing capabilities.
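To give a flavor of the REST API and Python integration work covered here, below is a minimal sketch of querying a locally running Ollama model over its HTTP API. It assumes the Ollama server is running on its default port (11434) and that a model such as `llama3` has already been pulled; the helper names are illustrative, not from the course.

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Construct the JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama model and return the response text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (requires a running Ollama server):
#   generate("llama3", "Summarize RAG in one sentence.")
```

Setting `"stream": False` returns one complete JSON object; with streaming enabled, the server instead emits a sequence of partial responses, which the course's streaming chat example builds on.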

Syllabus

⌨️ Intro
⌨️ What Is this course about?
⌨️ Course Prerequisites
⌨️ Development Environment Setup
⌨️ Ollama Deep Dive
⌨️ Ollama Key Features
⌨️ Ollama Setup
⌨️ Download Ollama Locally
⌨️ Ollama Models - How to Pull Different Ollama Models Locally
⌨️ LLM Parameters Deep Dive
⌨️ Understanding Model Benchmarks
⌨️ Ollama Basic CLI Commands - Pull and Testing Models
⌨️ Pull in the LLaVA Multimodal Model and Caption an Image - Hands-on
⌨️ Summarization and Sentiment Analysis, and Customizing Models with the Modelfile
⌨️ Ollama REST API
⌨️ Ollama REST API - Request JSON
⌨️ Ollama Models Support Different Tasks - Summary
⌨️ Different Ways to Interact with Ollama Models - Overview
⌨️ Ollama Model Running Under Msty App - Frontend Tool - RAG Hands-on
⌨️ Introduction to Python Library for Building LLM Applications Locally
⌨️ Interact with Llama3 in Python using Ollama REST API
⌨️ Ollama Python Library Chatting with our Model
⌨️ Chat Example with Streaming
⌨️ Using Ollama Show Function
⌨️ Create a Custom Model in Code
⌨️ Build a Real-world Use case Application - Introduction
⌨️ Build an LLM App - Grocery List Organizer
⌨️ Building RAG Systems with Ollama - Overview of RAG Systems and Langchain Crash Course
⌨️ Deep Dive into Vectorstore and Embeddings - the Whole Picture - Crash Course
⌨️ Overview of Our PDF RAG System We will be Building
⌨️ Set up our RAG System - Document Ingestion and Vector DB Creation and Embeddings
⌨️ RAG System - Retrieval and Querying - Final
⌨️ RAG System - Cleaner Code - Code Refactoring
⌨️ RAG System - Streamlit Version
⌨️ BONUS for YOU!
⌨️ Introduction to the Next Application - AI Recruiter Agency
⌨️ Building the AI Recruiter Agency
⌨️ Outro - Final Thoughts and Your Bonus - Thank you!
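The RAG lessons above center on embeddings, vectorstores, and retrieval. Stripped of any framework, the retrieval step reduces to nearest-neighbor search over embedding vectors, which can be illustrated in plain Python. The three-dimensional vectors below are made up for the example; in the course they come from a real embedding model.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec: list[float], store: list[tuple[str, list[float]]], k: int = 2) -> list[str]:
    """Return the k document chunks whose embeddings are closest to the query."""
    ranked = sorted(
        store,
        key=lambda item: cosine_similarity(query_vec, item[1]),
        reverse=True,
    )
    return [text for text, _ in ranked[:k]]

# Toy "vectorstore": (chunk text, embedding) pairs with made-up 3-d vectors.
store = [
    ("Ollama runs LLMs locally.", [0.9, 0.1, 0.0]),
    ("Streamlit builds simple UIs.", [0.1, 0.9, 0.0]),
    ("Embeddings map text to vectors.", [0.2, 0.1, 0.9]),
]

# Pretend this vector embeds the question "How do I run a model locally?"
query = [0.8, 0.2, 0.1]
top = retrieve(query, store, k=1)
```

A production RAG system swaps in a real embedding model and a persistent vector database, but the ranking logic is the same: embed the query, score it against every stored chunk, and pass the top matches to the LLM as context.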

Taught by

freeCodeCamp.org

