Overworked physicians have increasingly adopted AI medical scribes, tools that automatically generate structured clinical notes from patient-doctor conversations, capturing diagnoses and care decisions. However, a recent audit by the Auditor General of Ontario has exposed significant risks associated with these tools.
The audit, detailed in the 2024 report *Use of Artificial Intelligence in the Ontario Government*, evaluated transcription accuracy across 20 AI scribe vendors pre-approved by the provincial government for use by healthcare providers. The results were alarming:
- All 20 vendors demonstrated accuracy or completeness issues in at least one test scenario.
- Nine vendors hallucinated patient information, fabricating details that were never discussed in the conversations.
- Twelve vendors recorded information incorrectly, including medication names and test referrals.
- Seventeen vendors missed key details, particularly regarding mental health discussions.
The Auditor General highlighted several examples of errors that could directly compromise patient care:
- AI scribes fabricated nonexistent referrals for blood tests or therapy sessions.
- Prescription medication names were transcribed incorrectly.
- Critical details about mental health issues discussed during visits were omitted.
These inaccuracies raise concerns that clinicians relying on flawed notes could produce inadequate or harmful treatment plans, with direct consequences for patient health outcomes. The audit underscores the need for stricter oversight and validation of AI tools in healthcare settings.