Mayo Clinic combats AI hallucinations with “reverse RAG” technique

Mayo Clinic has developed a “reverse RAG” technique to combat AI hallucinations in healthcare, meticulously tracing every piece of information in a model’s output back to its source. The method has virtually eliminated retrieval-based hallucinations in non-diagnostic applications, allowing the hospital to deploy AI across its clinical practice while maintaining the strict accuracy standards essential in medical settings.

The big picture: Mayo Clinic has tackled the persistent problem of AI hallucinations by implementing what amounts to a backward version of retrieval-augmented generation (RAG), linking every data point back to its original source to ensure accuracy.

  • This approach has effectively eliminated nearly all data-retrieval-based hallucinations in non-diagnostic use cases, enabling Mayo to confidently deploy AI across its clinical practice.
  • The technique pairs the Clustering Using Representatives (CURE) algorithm with LLMs and vector databases to create a robust verification system; a rough sketch of the source-linked retrieval side follows below.
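
Mayo has not published implementation details, so the following is only a broad-strokes sketch of the retrieval side of such a system: every chunk stored in the vector database keeps an explicit link to its source record, so anything retrieved later can be traced back. The Chunk, embed, build_index, and retrieve names are hypothetical, the embedding function is a runnable placeholder rather than a real model, and the CURE clustering step the article mentions is omitted for brevity.

```python
# Illustrative sketch only; Mayo's actual pipeline is not public.
# embed() is a runnable placeholder for a real embedding model, and the
# CURE clustering step described in the article is omitted for brevity.
from dataclasses import dataclass
import numpy as np


@dataclass
class Chunk:
    text: str
    source_id: str          # link back to the originating record
    embedding: np.ndarray


def embed(text: str) -> np.ndarray:
    """Placeholder embedding; a production system would call a real model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)


def build_index(documents: dict[str, str]) -> list[Chunk]:
    """Store every chunk with its source_id so retrieved text stays traceable."""
    index = []
    for source_id, text in documents.items():
        for paragraph in text.split("\n\n"):
            index.append(Chunk(paragraph, source_id, embed(paragraph)))
    return index


def retrieve(query: str, index: list[Chunk], k: int = 3) -> list[Chunk]:
    """Cosine-similarity retrieval; each result still carries its source link."""
    q = embed(query)

    def similarity(chunk: Chunk) -> float:
        return float(np.dot(q, chunk.embedding) /
                     (np.linalg.norm(q) * np.linalg.norm(chunk.embedding)))

    return sorted(index, key=similarity, reverse=True)[:k]
```

The essential design choice is that the source link travels with the text through retrieval, which is what makes the later fact-by-fact verification possible.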

How it works: The system splits AI-generated summaries into individual facts and methodically matches each one back to its source documents for verification.

  • A second LLM then evaluates and scores how well these facts align with their cited sources, creating an additional layer of verification.
  • This double-check mechanism ensures that information provided by the AI system is firmly grounded in legitimate medical documentation rather than fabricated; a rough sketch of the verification loop follows below.
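
Mayo’s exact prompts, models, and scoring rubric are not public, so the sketch below only illustrates the reported loop: split the generated summary into individual facts, score each fact against the cited sources with a second model, and flag anything unsupported. The split_into_facts, llm_score, and verify_summary functions are hypothetical, and llm_score substitutes crude token overlap for the second-LLM judge so the example runs on its own.

```python
# Illustrative "reverse RAG" verification loop; not Mayo Clinic's actual code.
# llm_score() stands in for a second LLM that grades fact/source alignment.
import re


def split_into_facts(summary: str) -> list[str]:
    """Naive fact splitter: one sentence per fact. A production system would
    use an LLM or a dedicated claim-extraction step instead."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", summary) if s.strip()]


def llm_score(fact: str, source_text: str) -> float:
    """Placeholder for the second-LLM judge: returns a 0-1 support score.
    Crude token overlap is used here so the sketch runs end to end."""
    fact_tokens = set(fact.lower().split())
    source_tokens = set(source_text.lower().split())
    return len(fact_tokens & source_tokens) / max(len(fact_tokens), 1)


def verify_summary(summary: str, sources: dict[str, str],
                   threshold: float = 0.5) -> list[dict]:
    """Match every extracted fact against the cited sources and flag any
    fact whose best support score falls below the threshold."""
    report = []
    for fact in split_into_facts(summary):
        best_id, best_score = max(
            ((sid, llm_score(fact, text)) for sid, text in sources.items()),
            key=lambda pair: pair[1],
        )
        report.append({
            "fact": fact,
            "best_source": best_id,
            "score": best_score,
            "supported": best_score >= threshold,
        })
    return report
```

In this framing the check runs “in reverse”: it starts from the generated output and works back to the evidence, and any fact that no source supports above the threshold can be dropped or sent back for regeneration.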

What they’re saying: “With this approach of referencing source information through links, extraction of this data is no longer a problem,” Matthew Callstrom, Mayo’s medical director for strategy and chair of radiology, told VentureBeat.

  • Callstrom emphasized the transformative potential while acknowledging the need for caution: “We recognize the incredible capability of these [models] to actually transform how we care for patients and diagnose in a meaningful way, to have more patient-centric or patient-specific care versus standard therapy.”

Future applications: While currently focused on non-diagnostic applications, Mayo Clinic envisions expanding this approach to more ambitious medical AI use cases.

  • Potential applications include genomic models for treatment prediction, medical image analysis, comprehensive patient record synthesis, and personalized medicine approaches.
  • Callstrom stressed that diagnosis-related AI applications will still require extensive validation and careful testing before clinical implementation.

Why this matters: In healthcare, AI hallucinations aren’t just embarrassing mistakes—they could potentially lead to harmful medical decisions affecting patient outcomes.

  • Mayo Clinic’s innovative solution represents a significant advance in making AI systems trustworthy enough for healthcare settings where accuracy is literally a matter of life and death.