Stop Hallucinations and Half-Truths in Generative Search - Haystack US 2023
OpenSource Connections via YouTube
Overview
Explore strategies to mitigate hallucinations and inaccuracies in generative search systems in this 43-minute conference talk from Haystack US 2023. Dive into the challenges of integrating Large Language Models (LLMs) into search systems and learn practical solutions for maintaining user trust, including reranking, user warnings, fact-checking systems, and effective LLM usage patterns, prompting, and fine-tuning. Colin Harman, Head of Technology at Nesh, shares his experience combining cutting-edge NLP technologies with deep domain understanding to solve complex problems in heavy industries.
Syllabus
Haystack US 2023 - Colin Harman: Stop Hallucinations and Half-Truths in Generative Search
Taught by
OpenSource Connections