Hallucinations in AI medical transcription tools may put patients at risk
CIO
OCTOBER 28, 2024
Whisper, an AI-powered transcription tool widely used in the medical field, has been found to hallucinate text, posing potential risks to patient safety, according to a recent academic study. Although Whisper’s creators have claimed that the tool possesses “human-level robustness and accuracy,” multiple studies have shown otherwise.