Stop Hallucinations and Half-Truths in Generative Search - Haystack US 2023

OpenSource Connections via YouTube

Overview

Explore strategies to mitigate hallucinations and inaccuracies in generative search systems during this 43-minute conference talk from Haystack US 2023. Dive into the challenges of integrating Large Language Models (LLMs) into search systems and learn proven solutions to maintain user trust. Discover techniques such as reranking, user warnings, fact-checking systems, and effective LLM usage patterns, prompting, and fine-tuning. Gain insights from Colin Harman, Head of Technology at Nesh, as he shares his expertise in combining cutting-edge NLP technologies with deep domain understanding to solve complex problems in heavy industries.
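One of the mitigations the talk covers is reranking retrieved passages before they reach the LLM, combined with prompts that constrain the model to the retrieved context. The sketch below is a minimal, hypothetical illustration of that pattern; the token-overlap scorer and the function names (`rerank`, `build_prompt`) are stand-ins, not anything from the talk itself (a production system would use a trained cross-encoder reranker).

```python
import re

def rerank(query: str, passages: list[str], top_k: int = 2) -> list[str]:
    """Order passages by token overlap with the query (illustrative stand-in
    for a real relevance model such as a cross-encoder)."""
    q_tokens = set(re.findall(r"\w+", query.lower()))
    scored = sorted(
        passages,
        key=lambda p: len(q_tokens & set(re.findall(r"\w+", p.lower()))),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Constrain the LLM to the retrieved context and give it an explicit
    way to decline, reducing the incentive to hallucinate."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer ONLY from the context below. "
        "If the answer is not in the context, say 'I don't know.'\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

# Toy corpus: the reranker should surface the passage most related
# to the query before the prompt is assembled.
passages = [
    "LLMs can hallucinate facts not present in source documents.",
    "Haystack is a conference about search relevance.",
    "Reranking orders retrieved passages by relevance to the query.",
]
query = "How does reranking reduce hallucinations in LLMs?"
top = rerank(query, passages)
prompt = build_prompt(query, top)
```

Keeping only the highest-scoring passages in the prompt, and telling the model it may answer "I don't know," are two of the simpler usage patterns for keeping generated answers grounded in the search results.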

Syllabus

Haystack US 2023 - Colin Harman: Stop Hallucinations and Half-Truths in Generative Search

Taught by

OpenSource Connections
