Every major programme DHCW is responsible for is under the Welsh Government's highest tier of intervention — simultaneously. This is not a streak of bad luck. It is the inevitable result of an organisation that runs too many programmes with too little discipline, no external assessment, no standard for what "good" looks like, and no mechanism for killing programmes that are failing.

Reform requires adopting proven programme delivery standards, rationalising the portfolio, introducing automatic review mechanisms for stalled work, reforming procurement, and establishing a framework for measuring whether programmes actually deliver benefits.

1. Adopt the NHS Service Standard

The problem: DHCW has no published standard for what a good digital service looks like. There is no equivalent of the GDS Service Standard's 14 points (now 17 in NHS England's adaptation). Programmes proceed through phases without external assessment. There are no defined quality gates. Self-assessment by the programme team is the only mechanism — and it has consistently failed to identify the problems that the Welsh Government later found.

The proposal: Adopt the NHS Service Standard in full. The standard — adapted from the GDS Service Standard for healthcare contexts — sets 17 requirements that every digital service must meet:

  1. Understand users and their needs in the context of health and care
  2. Work towards solving a whole problem for users
  3. Provide a joined up experience across all channels
  4. Make the service simple to use
  5. Make sure everyone can use the service (accessibility)
  6. Create a team that includes multidisciplinary skills
  7. Use agile ways of working
  8. Iterate and improve frequently
  9. Respect and protect users' confidentiality and privacy
  10. Define what success looks like and publish performance data
  11. Choose the right tools and technology
  12. Make new source code open
  13. Use and contribute to open standards, common components and patterns
  14. Operate a reliable service
  15. Support a culture of care
  16. Make your service clinically safe
  17. Make your service interoperable

Implementation:

  • All new DHCW programmes must be designed against the NHS Service Standard from inception.
  • All existing programmes must undergo a retrospective assessment against the standard within 12 months.
  • Assessments are conducted by the Independent Technical Advisory Panel (see Governance), not by DHCW staff.
  • Assessment reports are published in full.
  • Programmes that fail assessment at any phase gate cannot proceed until deficiencies are remediated and re-assessed.
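The phase-gate rule in the bullets above is mechanical enough to sketch. This is an illustrative sketch only: the field names (`assessor`, `points_met`, `points_failed`) and the `independent_panel` label are hypothetical, not part of any DHCW or NHS England process.

```python
# Illustrative sketch of the phase-gate rule described above: a programme
# advances only when an independent assessment finds all 17 NHS Service
# Standard points met. Field names here are hypothetical.

STANDARD_POINTS = 17  # points in the NHS Service Standard

def gate_passes(assessment: dict) -> bool:
    """A gate passes only if the assessor is independent of the programme
    (not DHCW self-assessment) and every standard point is met."""
    return (
        assessment["assessor"] == "independent_panel"
        and len(assessment["points_met"]) == STANDARD_POINTS
        and not assessment["points_failed"]
    )
```

The point of the sketch is the conjunction: a self-assessment that meets all 17 points still fails the gate, because independence is itself a gating condition.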

Comparator: Every NHS England digital service must pass a service assessment. The assessment process is transparent, the reports are published, and services that fail are not allowed to proceed. The result is a dramatically higher standard of digital service delivery. DHCW has no equivalent process.

2. Portfolio Rationalisation

The problem: DHCW is simultaneously running nine major programmes, all of which are failing. This is beyond any organisation's capacity — let alone one that has never successfully delivered a single programme to completion on time and on budget. The portfolio is too large, too unfocused, and spread across too many competing priorities.

The proposal: Impose a hard limit of 3-4 active major programmes at any time, with strict prioritisation based on patient impact and clinical urgency.

How it would work:

  • Immediate triage: The Independent Technical Advisory Panel conducts a portfolio review within 90 days, classifying each programme as:

    • Continue — clear clinical need, viable delivery plan, team in place
    • Pause — valuable but not deliverable alongside higher priorities; placed in structured hibernation with a defined restart trigger
    • Terminate — no viable path to delivery, benefits case collapsed, or superseded by other approaches

  • Prioritisation criteria: Patient safety impact (weighted highest), clinical user demand, cost-benefit ratio, delivery confidence, and strategic alignment. Subjective "strategic importance" claims by DHCW leadership are excluded — the assessment is based on evidence, not advocacy.

  • Active portfolio cap: DHCW may run a maximum of 4 major programmes (defined as >£2M total cost) concurrently. Additional programmes may only enter the active portfolio when an existing programme completes or is terminated.
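The triage and cap rules above can be sketched as a scoring function. The weights below are illustrative assumptions (the text specifies only that patient safety is weighted highest), and the programme data shapes are hypothetical:

```python
# Illustrative sketch of portfolio triage: score each viable programme on
# the five criteria, then Continue the top scorers up to the cap, Pause the
# rest, and Terminate non-viable programmes outright. The weights are
# hypothetical; the proposal requires only that patient safety ranks highest.

WEIGHTS = {
    "patient_safety": 0.35,      # weighted highest, per the criteria above
    "clinical_demand": 0.25,
    "cost_benefit": 0.15,
    "delivery_confidence": 0.15,
    "strategic_alignment": 0.10,
}

PORTFOLIO_CAP = 4  # maximum concurrent major programmes (>£2M total cost)

def score(programme: dict) -> float:
    """Weighted evidence score; each criterion rated 0-10 by the panel."""
    return sum(WEIGHTS[k] * programme["ratings"][k] for k in WEIGHTS)

def triage(programmes: list[dict]) -> dict:
    """Classify each programme as Continue, Pause, or Terminate."""
    result = {p["name"]: "Terminate" for p in programmes if not p["viable"]}
    viable = [p for p in programmes if p["viable"]]
    for i, p in enumerate(sorted(viable, key=score, reverse=True)):
        result[p["name"]] = "Continue" if i < PORTFOLIO_CAP else "Pause"
    return result
```

Note how the cap enforces one-in-one-out: a paused programme can only enter the active portfolio when a slot opens through completion or termination.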

What this means in practice: Some programmes will be paused or killed. This is not failure — it is the responsible management of finite resources. Continuing to run nine failing programmes simultaneously is failure.

Comparator: GDS enforces strict portfolio discipline through spend controls and the Government Major Projects Portfolio (GMPP) process. NHS England's Transformation Directorate prioritises ruthlessly, focusing resources on programmes with the highest probability of delivery and impact. DHCW does neither.

3. The Sunlight Rule

The problem: DHCW programmes persist for years — sometimes a decade — without delivering usable services. The laboratory system (LINC/LIMS) has been in progress for eight years with one lab partially live. The eye care system (OpenEyes) has consumed £8.5 million over seven years and missed two national deadlines. The social care system (WCCIS) has spent over £42 million with organisations trying to leave. There is no automatic mechanism for reviewing programmes that exceed their planned timelines, no escalation trigger, and no consequence for indefinite delay.

The proposal: Introduce a "Sunlight Rule" — any programme that has not reached public beta within 3 years of Discovery commencement is subject to automatic, independent review.

How it would work:

  • At the 3-year mark, the Independent Technical Advisory Panel conducts a mandatory review, answering three questions:

    1. Is the programme still viable? (Has the original business case survived contact with reality?)
    2. Is the current approach working? (Have the team, the technology, and the delivery method demonstrated progress?)
    3. Would starting again be faster? (Are the accumulated technical debt, contractual obligations, and compromises in approach so severe that a fresh start would deliver sooner?)

  • The review report is published. The panel recommends one of: Continue (with specified conditions), Restructure (new team/approach, same goal), or Terminate.

  • The DHCW board must respond publicly within 30 days.

Why "Sunlight Rule": Because sunlight is the best disinfectant. The mere existence of the rule changes behaviour — programme teams know that extended delay will trigger external scrutiny, which creates an incentive to deliver rather than to manage expectations indefinitely.

Comparator: The UK Government's Major Projects Authority (now Infrastructure and Projects Authority) conducts Gateway Reviews at defined points. Projects that consistently receive "Red" ratings face intervention. DHCW programmes have persisted for years in what would be rated "Red" anywhere else, without any external review.

4. Procurement Reform

The problem: DHCW's procurement practices have contributed directly to programme failures. Major contracts are awarded without published values. Supplier relationships persist for decades without competitive re-tender. The organisation lacks the commercial expertise to negotiate effectively with major technology vendors. And procurement decisions are made without reference to open source alternatives, open standards compliance, or exit strategy requirements.

The proposal: Four procurement reforms:

a) Publish all contracts above £500,000

Every DHCW contract above £500,000 must be published in full on the Contracts Finder equivalent for Wales, including: total contract value, duration, supplier, whether sole-sourced or competed, evaluation criteria, and named DHCW contract owner. This is already required under Welsh procurement rules — DHCW must actually comply.

b) Open source by default

All new software development commissioned or undertaken by DHCW must be open source unless a specific, published exemption is granted by the Independent Technical Advisory Panel. Exemptions are limited to genuine security concerns (not commercial preference) and must be renewed annually.

Rationale: Open source code can be inspected, audited, reused, and maintained independently of the original supplier. It eliminates vendor lock-in, reduces long-term costs, and enables other nations and NHS organisations to benefit from Welsh investment. The UK Government has operated an "open source by default" policy since 2012.

c) Mandatory exit strategies

Every contract above £1 million must include a published exit strategy covering: data portability requirements, source code escrow or open source provisions, knowledge transfer obligations, and a maximum exit timeline. No contract should create a dependency that DHCW cannot exit within 12 months.

d) Disaggregation

Following GDS procurement guidance, DHCW must disaggregate large contracts wherever possible — breaking monolithic procurements into smaller lots that can be competed separately, delivered incrementally, and replaced independently. The era of single-supplier, multi-year, multi-million-pound contracts for entire systems must end.

Comparator: GDS transformed UK Government procurement through open source by default, disaggregation, the Digital Marketplace (now Digital Outcomes and Specialists), and mandatory exit strategies. The result was dramatically improved competition, lower costs, and better outcomes. DHCW operates as if none of this happened.

5. Benefits Realisation Framework

The problem: DHCW's CEO admitted publicly that the organisation cannot demonstrate return on investment for its spending. When challenged, she compared measuring digital ROI to measuring the value of electricity — arguing that accountability for outcomes is inherently impossible. This is not true. It is an excuse for not doing the work.

The proposal: Adopt a mandatory Benefits Realisation Framework for all programmes, aligned with the HM Treasury Green Book methodology.

Requirements:

  • Benefits case at Discovery: Every programme must define specific, measurable benefits at the outset — not vague aspirations, but quantified outcomes. Examples: "Reduce GP referral processing time from 5 days to same-day," "Eliminate 80% of paper-based lab requests," "Reduce patient administration errors by 50%."
  • Benefits tracking at Beta: Baseline measurement before the new service is introduced. Ongoing measurement against defined metrics during beta and after go-live.
  • Benefits report at 12 months post-live: A published report comparing actual outcomes to predicted benefits, with an explanation of any variance and corrective actions.
  • Independent review at 24 months: The Independent Technical Advisory Panel assesses whether the programme has delivered its stated benefits. If not, a remediation plan is required or the programme is classified as a failure — with consequences for the responsible director under the Personal Accountability Statement regime.
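The variance step above can be sketched using the worked examples from the Discovery requirement (5-day referrals reduced to same-day; 80% of paper lab requests eliminated). The function and field names are illustrative assumptions, not part of the Green Book methodology:

```python
# Sketch of benefits tracking: for each metric, compare the baseline, the
# predicted target, and the actual outcome at 12 months, and report what
# fraction of the predicted improvement was realised. Metrics here assume
# "lower is better" (days, error counts, percentages to eliminate).

def benefits_report(metrics: dict) -> dict:
    """Each entry: {"baseline": x, "target": y, "actual": z}."""
    report = {}
    for name, m in metrics.items():
        predicted_gain = m["baseline"] - m["target"]
        actual_gain = m["baseline"] - m["actual"]
        realised = actual_gain / predicted_gain if predicted_gain else 0.0
        report[name] = {
            "realised_fraction": round(realised, 2),
            "delivered": actual_gain >= predicted_gain,
        }
    return report
```

A programme that cut referral processing from 5 days to 1 day against a same-day target would show 80% of the predicted benefit realised: real progress, but a variance that must be explained, not a result to be waved away as unmeasurable.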

Comparator: HM Treasury requires benefits realisation for all major government programmes. NHS England tracks benefits for digital programmes through its Benefits Realisation Framework. DHCW does not track benefits at all — its CEO has said so, publicly, on the record.


These five reforms — service standards, portfolio rationalisation, sunlight rules, procurement reform, and benefits realisation — would transform DHCW from an organisation that runs too many programmes with no standards and no accountability into one that runs fewer programmes to higher standards with measurable outcomes. The tools exist. The frameworks are proven. What has been lacking is the will to use them.