
Dr. Chad Okay
NHS Resident Doctor & Physician-Technologist
A GP Registrar's Journey Implementing Heidi AI Scribe in NHS Primary Care: Part 1 - The Starting Line
Vision, Vacuum, and the 'Hidden Workload'
March 2025. Standing in my North Central London practice during my lunch break, staring at another mountain of patient notes that needed correcting, more that needed adding, and telephone consultations that still needed transcribing, I made a decision that would fundamentally change how we deliver care and reveal just how unprepared our healthcare system is for the AI revolution.
The statistics are damning, and I was living them. Research consistently links general practice with high rates of minor psychiatric disorder, with most GPs reporting mild to severe exhaustion. I was part of those statistics, spending hours each day on documentation instead of patient care, burning out alongside my colleagues. Despite all the technological advancements, medicine at times still seems to be living in the dark ages. It was clear to me that something had to change.
That's when I encountered Heidi AI Scribe. The promise was compelling: real-time consultation transcription, automated clinical note generation, and the ability to focus entirely on my patients instead of typing while they spoke. But as I would soon discover, the path from vision to implementation was filled with regulatory requirements that no one had adequately prepared us for.
The Regulatory Vacuum of March 2025
In March 2025, we existed in what I can only describe as a regulatory vacuum. The technology was ready, our patients needed better care, and GPs were drowning in administrative work. But formal guidance from NCL ICB (North Central London Integrated Care Board) wasn't yet fully developed. The comprehensive 11-step guidance that practices now rely on was still months away from publication.
This left us, like many early adopter practices, navigating uncharted waters. We knew we needed to implement AI scribes responsibly, but the question was: how do you comply with regulations that haven't been clearly articulated for your specific use case?
I reached out directly to Clinical Safety Officers across our region, trying to understand the scope of what we were taking on. What I discovered was that while NCL ICB had Clinical Safety Officers like Dr Ode Omohwo and Sithabile Tshabalala who could provide support, the specific processes and templates for AI scribe implementations weren't yet established. We were all learning together.
The Compliance Reality Check
The first shock came when I learned about DCB0160 – the Clinical Risk Management standard that's mandatory under the Health and Social Care Act 2012. This wasn't optional guidance; it was law. Every healthcare organisation deploying digital health technology must comply with DCB0160, which means:
- Working with a qualified Clinical Safety Officer (fortunately, NCL ICB provided this support, though many practices didn't initially know it was available)
- Conducting comprehensive clinical risk assessments
- Maintaining hazard logs throughout the technology's lifecycle
- Producing clinical safety case reports
- Establishing ongoing clinical risk management processes
Understanding how to access ICB Clinical Safety Officer support became our first challenge. These professionals were in high demand, and the process for engaging with them wasn't yet streamlined.
The DPIA Discovery
Then came the Data Protection Impact Assessment (DPIA) requirement. Under UK GDPR, AI systems processing special category data (like health information) on a large scale require a comprehensive DPIA before implementation. This isn't a one-time tick-box exercise. It's a living document that must be maintained throughout the AI system's deployment.
The DPIA process revealed the complexity of what we were undertaking:
- Detailed analysis of the purpose and methods of data processing
- Assessment of necessity and proportionality
- Identification of risks to patients' rights and freedoms
- Documentation of mitigation measures
- Consultation with relevant stakeholders
What should have been a straightforward software deployment suddenly required legal, technical, and clinical expertise. While NCL ICB's Data Protection Officer Steve Durbin would eventually provide templates and support, in March 2025 these resources were still being developed.
DTAC: The Hidden Compliance Layer
The third major requirement was Digital Technology Assessment Criteria (DTAC) compliance. While not legally mandated, DTAC has become the de facto standard for any digital health technology used in the NHS. The assessment covers five critical areas:
- Clinical Safety (including DCB0129/0160 compliance)
- Data Protection (GDPR compliance and security measures)
- Technical Security (including Cyber Essentials certification)
- Interoperability (system integration capabilities)
- Usability and Accessibility (ensuring the technology works for all users)
Each area requires detailed evidence and documentation. For a small GP practice, assembling this portfolio of evidence represents months of work across multiple domains of expertise.
The Unfunded Mandate Reality
Here's the uncomfortable truth that no one talks about openly: implementing AI scribes responsibly represents a massive unfunded mandate. The NHS expects practices to adopt innovative technologies to improve efficiency and patient care, but provides no additional funding for the substantial compliance work required.
The Hidden Workload
Beyond the direct costs lies what I call the 'hidden workload'. This is the ongoing administrative burden that compliance creates:
- Every system update requires clinical safety review
- Every process change needs DPIA amendment
- Every incident needs documentation and reporting through multiple channels
What was supposed to reduce administrative burden initially created more of it.
Yet despite these challenges, I would make the same decision again. The benefits, such as more engaged consultations, improved documentation quality, and reduced GP burnout, justify the compliance investment. But we must be honest about what implementing AI responsibly actually requires.
Looking Forward
As I write this in August 2025, NCL ICB has since published comprehensive guidance, creating a clearer pathway for practices wanting to implement AI technologies. The 11-step framework now available would have saved us months of work and uncertainty.
But our experience as early adopters has taught me valuable lessons about the reality of healthcare innovation in 2025. The technology is ready, the clinical need is urgent, but our regulatory and funding structures haven't caught up with the pace of change.
The starting line for AI in primary care isn't just about choosing the right technology. It's about accepting that responsible implementation requires significant investment in compliance infrastructure that the system doesn't yet adequately support or fund.
This is Part 1 of a 6-part series documenting the implementation of Heidi AI Scribe in NHS primary care. Continue to Part 2: Deep Dive into Digital Clinical Safety →

Dr. Chad Okay
I am a London‑based NHS Resident Doctor with 8+ years' experience in primary care, emergency and intensive care medicine. I'm developing an AI‑native wearable to tackle metabolic disease. I combine bedside insight with end‑to‑end tech skills, from sensor integration to data visualisation, to deliver practical tools that extend healthy years.