Part 4/10:
Recent research underscores how prevalent these hallucinations are. In a University of Michigan study, hallucinations appeared in 80% of the public meeting transcriptions examined, and another analysis found hallucinated content in nearly half of more than 100 hours of Whisper-generated transcripts. Since even well-recorded audio isn't immune to the problem, reliance on AI transcription in healthcare settings has become a pressing concern.