Building and Serving Local LLM Agents with LangGraph and LangServe
The Machine Learning Engineer via YouTube
Overview
Learn how to serve a local LLM agent in this 51-minute technical video, which demonstrates integrating LangGraph with the Chroma vector store. Explore the implementation using Nomic.ai embeddings and the Llama 3.2 8B instruct model in GGUF int8 format, and master the use of LangServe for creating inference endpoints and deploying the agent. The complete implementation is available through the provided GitHub repository, which includes detailed notebooks and code examples. The video builds on a prerequisite video covering local agent construction with Llama 3.2 8B, Ollama, and Chroma, deepening your understanding of deploying and serving machine learning agents.
Syllabus
RAG: LangGraph. Serving Local LLM agents with LangServe #datascience #machinelearning
Taught by
The Machine Learning Engineer