Benchmarking and Accountability
Compare DHCW's performance against England, Scotland, and Northern Ireland — and publish the results. Introduce personal accountability for directors and give the Senedd the tools to scrutinise digital health properly.
DHCW operates in isolation. It is the only national digital health body in the UK that is not systematically benchmarked against its peers. NHS England's digital performance is scrutinised by CDDO, NAO, and the Public Accounts Committee. Scotland's NES Digital Service reports against defined digital maturity indicators. Northern Ireland's BSO ITS is subject to regular review. Wales has no equivalent scrutiny mechanism, no published benchmarking, and no systematic comparison against the nations it should be learning from.
The result is that DHCW can claim progress while falling further behind. Without external reference points, any internal metric can be made to look positive. With them, the gap between DHCW and comparable organisations becomes visible, measurable, and politically impossible to ignore.
1. Annual Four-Nations Digital Health Comparison
The problem: DHCW does not publish — and is not required to publish — any comparison of its performance, spending, or outcomes against NHS England, Scotland, or Northern Ireland. When the CEO claims progress, there is no public reference point against which to test the claim. When DHCW reports that 7% of Welsh GP practices use electronic prescriptions, there is no automatic comparison to England (where the equivalent programme was completed a decade ago). When DHCW announces a "digital strategy," there is no assessment of whether it meets the standards adopted by every other UK nation.
The proposal: Commission an annual Four-Nations Digital Health Comparison, published by an independent body (Audit Wales is the natural candidate), covering:
Metrics for comparison:
- Digital maturity: Adoption rates for key digital capabilities across primary care, secondary care, and community settings. Measured using a standardised instrument (see below).
- Programme delivery: Number of national digital programmes delivered to production in the past 12 months, compared to the number initiated. Time from initiation to live service. Cost variance against the original business case.
- Interoperability: Percentage of national systems compliant with HL7 FHIR (or equivalent national standard). Availability of published APIs. Existence of a national data exchange layer.
- Spending efficiency: Total digital spend per capita. Digital spend as a percentage of total health expenditure. Ratio of permanent to contractor staff. Proportion of spend on legacy maintenance vs. new development.
- Clinical impact: Clinician-reported digital satisfaction (using a standardised instrument). Measurable clinical outcomes attributable to digital interventions (e.g., reduction in referral processing times, reduction in medication errors, availability of electronic records at point of care).
- Transparency: Score against a transparency checklist covering programme cost publication, contract register, off-payroll data, whistleblowing data, board paper publication, and performance dashboard availability.
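The spending-efficiency and programme-delivery metrics above reduce to simple, auditable arithmetic. A minimal sketch of how one nation's entry in the comparison might be structured — every figure below is invented for illustration, not actual DHCW data:

```python
from dataclasses import dataclass

@dataclass
class NationDigitalMetrics:
    """One nation's entry in a hypothetical four-nations comparison."""
    nation: str
    total_digital_spend_gbp: float   # annual digital health spend
    population: int
    approved_budget_gbp: float       # sum of approved business cases
    actual_cost_gbp: float           # sum of actual programme costs

    @property
    def spend_per_capita(self) -> float:
        return self.total_digital_spend_gbp / self.population

    @property
    def cost_variance_pct(self) -> float:
        # Positive values mean overspend against the approved business case.
        return 100 * (self.actual_cost_gbp - self.approved_budget_gbp) / self.approved_budget_gbp

# Illustrative (invented) figures only:
wales = NationDigitalMetrics("Wales", 120_000_000, 3_100_000,
                             40_000_000, 52_000_000)
print(f"{wales.nation}: £{wales.spend_per_capita:.2f} per head, "
      f"{wales.cost_variance_pct:+.0f}% cost variance")
```

The point of the sketch is that none of these metrics requires new data collection machinery — only that the underlying figures be published on a consistent basis across the four nations.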
Publication: The comparison is published annually, presented to the Senedd, and made available to the Welsh public. It uses consistent methodology year-on-year, enabling trend analysis.
Why this matters: Benchmarking makes mediocrity visible. When the people of Wales can see — in simple, comparative terms — that their digital health services are a decade behind England's, that Scotland delivers shared components Wales cannot build, and that DHCW spends more per capita for worse outcomes, the political case for reform becomes unanswerable.
Comparator: The UK Government publishes cross-departmental digital performance comparisons through CDDO. The OECD publishes international digital government benchmarks. NHS England trusts are benchmarked against each other through the Model Hospital dashboard. Wales is the only UK nation without systematic digital health benchmarking.
2. "What Good Looks Like" (WGLL) Digital Maturity Assessment
The problem: Digital maturity — the extent to which an organisation has adopted, integrated, and optimised digital capabilities — is a well-defined concept with validated assessment frameworks. NHS England uses the HIMSS Electronic Medical Record Adoption Model (EMRAM) and its own What Good Looks Like (WGLL) framework to assess digital maturity across trusts. These assessments identify capability gaps, prioritise investment, and track progress over time.
DHCW has never been subject to an independent digital maturity assessment. Neither has any Welsh Health Board. The result: nobody knows, with any rigour, where Wales stands on digital maturity — and without a baseline, progress cannot be measured.
The proposal: Adopt the NHS England "What Good Looks Like" (WGLL) digital maturity framework — adapted for Wales — and conduct a baseline assessment of DHCW and all Welsh Health Boards within 12 months.
The WGLL framework covers seven success measures:
- Well-led: Digital leadership, strategy, and governance
- Ensure smart foundations: Infrastructure, hosting, network, devices
- Safe practice: Cyber security, clinical safety, data protection
- Support people: Digital skills, training, user support
- Empower citizens: Patient-facing digital services
- Improve care: Clinical digital systems, decision support, analytics
- Healthy populations: Population health management, data-driven prevention
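The seven success measures lend themselves to a simple published scorecard, with remediation triggered mechanically rather than by discretion. A hedged sketch — the ordinal maturity scale below is assumed for illustration and is not taken from the official WGLL framework, which defines its own descriptors:

```python
from enum import IntEnum

class Maturity(IntEnum):
    # Ordinal scale assumed for illustration only.
    EMERGING = 1
    DEVELOPING = 2
    MATURING = 3
    THRIVING = 4

WGLL_MEASURES = [
    "Well-led", "Ensure smart foundations", "Safe practice",
    "Support people", "Empower citizens", "Improve care",
    "Healthy populations",
]

def needs_remediation(scores: dict) -> list:
    """Return the success measures scoring below 'Developing',
    which would trigger a remediation plan under this proposal."""
    return [m for m in WGLL_MEASURES if scores[m] < Maturity.DEVELOPING]

# Invented example scores for one organisation:
example = {m: Maturity.DEVELOPING for m in WGLL_MEASURES}
example["Empower citizens"] = Maturity.EMERGING
print(needs_remediation(example))  # ['Empower citizens']
```

Publishing scores at this level of granularity is what makes cross-organisation comparison, and the remediation trigger in the implementation plan below, possible.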
Implementation:
- Assessment conducted by an independent body — not DHCW itself. Audit Wales, working with digital health assessment specialists, is the appropriate organisation.
- Baseline assessment of DHCW and all Health Boards within 12 months.
- Results published at organisational level, enabling comparison across Welsh Health Boards and against NHS England trusts that have undergone the same assessment.
- Annual reassessment using the same framework, tracking progress.
- Remediation plans required for any organisation scoring below "Developing" on any success measure, with progress monitored by the Independent Technical Advisory Panel.
Comparator: Over 200 NHS England organisations have been assessed against the WGLL framework. The data enables targeted investment, peer learning, and accountability. Wales has no equivalent dataset.
3. Personal Accountability Statements for Directors
The problem: Collective board responsibility sounds democratic but in practice enables evasion. When all directors are collectively responsible for everything, no individual director is accountable for anything. This is how DHCW operates: when programmes fail, the failure is attributed to "the organisation" — never to the specific director whose portfolio it falls under, whose decisions shaped its direction, and whose oversight was supposed to prevent failure.
The proposal: As outlined in the Governance section, every DHCW executive director must publish a Personal Accountability Statement (PAS). The benchmarking dimension adds a critical layer: accountability is measured against external standards, not internal ones.
How benchmarking connects to accountability:
- Each director's PAS includes specific, measurable commitments aligned to the four-nations comparison metrics and the WGLL framework.
- Example: The Director of Programmes commits to "deliver at least 2 programmes to live service within 12 months, each meeting the NHS Service Standard, with cost variance no greater than 15% against the approved business case."
- Example: The Chief Technology Officer commits to "achieve FHIR UK Core compliance for at least 3 national systems within 24 months, as verified by the Independent Technical Advisory Panel."
- Performance is assessed against these commitments annually by the ITAP, with results published.
- Consequences are real: Two consecutive years of failure to meet PAS commitments triggers an independent capability review (see Governance section).
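What distinguishes a PAS commitment from a conventional objective is that it can be verified mechanically. The cost-variance commitment in the Director of Programmes example is a case in point — a sketch, with the threshold taken from the example above and the programme data invented for illustration:

```python
def meets_cost_commitment(approved_gbp: float, actual_gbp: float,
                          max_variance_pct: float = 15.0) -> bool:
    """Check a delivered programme against a PAS cost-variance
    commitment: actual cost within max_variance_pct of the
    approved business case."""
    variance_pct = 100 * (actual_gbp - approved_gbp) / approved_gbp
    return variance_pct <= max_variance_pct

# Hypothetical programmes delivered in the review year:
programmes = {
    "e-prescribing pilot": (10_000_000, 11_200_000),  # +12%: within commitment
    "records portal":      (8_000_000, 10_000_000),   # +25%: breach
}
for name, (approved, actual) in programmes.items():
    print(name, "meets commitment:", meets_cost_commitment(approved, actual))
```

Because the check is binary and the inputs are published, neither the director nor the ITAP has any room to argue about whether the commitment was met.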
Why benchmarked accountability matters: A director who commits to "improve programme delivery" can always claim progress by adjusting the baseline or redefining success. A director who commits to "match NHS England's Transformation Directorate on the percentage of programmes passing service assessment" cannot hide behind ambiguity. External benchmarks create accountability that internal targets never can.
Comparator: The UK financial services Senior Managers and Certification Regime (SM&CR) requires Senior Managers to have documented, specific responsibilities. NHS England's executive performance frameworks use benchmarked metrics. DHCW has no equivalent structure.
4. Standing Senedd Digital Scrutiny Panel
The problem: Senedd scrutiny of DHCW is sporadic, reactive, and dependent on committee schedules that are crowded with competing priorities. The Health and Social Care Committee and the Public Accounts Committee have both examined DHCW — but neither has sustained, systematic digital expertise or the capacity for ongoing scrutiny. The result: DHCW receives intense scrutiny during crises (the 2018 PAC inquiry, the 2025 OpenEyes inquiry) but no continuous oversight between crises. This allows problems to develop, worsen, and reach crisis point before they attract attention.
The proposal: Establish a Standing Senedd Digital Scrutiny Panel with a dedicated focus on DHCW and NHS Wales digital services.
How it would work:
- Composition: A cross-party panel of 5-7 Senedd Members drawn from the Health and Social Care Committee and the Public Accounts and Public Administration Committee, supported by an independent digital health adviser (external, not from DHCW or the Welsh Government digital team).
- Remit:
- Quarterly evidence sessions with DHCW's CEO and executive directors — not just during crises, but as routine scrutiny
- Annual review of the four-nations comparison and WGLL assessment, with published conclusions
- Power to call evidence from the Independent Technical Advisory Panel, the Freedom to Speak Up Guardian, and the Staff Digital Council
- Annual published report on the state of NHS Wales digital services, with recommendations to the Welsh Government
- Standing powers: The panel operates year-round, not just during formal committee inquiries. It can request information from DHCW at any time and receive it within 14 days.
- Transparency: All evidence sessions are live-streamed and transcribed. All correspondence between the panel and DHCW is published.
Why standing scrutiny matters: DHCW's problems did not develop overnight. They developed over years, during which no one was looking. The PAC warned in 2018 of a toxic culture — and then no one followed up until the problems became undeniable. Standing scrutiny prevents the long silences in which failure compounds unseen.
Comparator: The UK Parliament's Science, Innovation and Technology Committee provides ongoing scrutiny of government digital programmes. NHS England's digital spending is subject to continuous scrutiny by CDDO and the NAO. Wales has no equivalent standing scrutiny mechanism for digital health.
These four reforms — four-nations benchmarking, digital maturity assessment, personal accountability statements, and standing Senedd scrutiny — create a system where DHCW's performance is continuously measured against external standards, individual directors are accountable for specific outcomes, and elected representatives have the tools and information to hold the organisation to account. Without benchmarking, DHCW will continue to grade its own homework. Without scrutiny, it will continue to set the exam.