
Change Programme Performance Index 2024

What separates the 30% of transformation programmes that deliver their projected benefits from the 70% that don't. Analysis of 140 programmes across UK public sector, financial services, pharma and technology.

Sample
140 programmes
Period
2019–2024
Sectors
Public sector, FS, Pharma, Tech
Publisher
Cairn Novaris

Methodology: Data drawn from CN practitioner engagements, client programme reviews and publicly available programme outcomes data from government and regulated industries. Programmes were included where month 12 or equivalent end-state outcomes data was available. The sample is weighted toward large-scale transformation (£5m+ programme investment). Full methodology available on request.

Headline findings
70%
fail to deliver projected benefits
At month 12, 70% of programmes in the sample had delivered less than 60% of the benefits projected in their business case.
83%
of failures cite people factors as primary cause
Technology problems, process failures and governance issues were present in many programmes. But in 83% of underperforming programmes, the primary cause of the shortfall was a failure of people adoption.
4.2x
higher change investment in top-quartile programmes
Programmes in the top quartile for benefits delivery allocated 4.2x as large a share of their total programme budget to change management as programmes in the bottom quartile.

"The data is not ambiguous. Programmes that under-invest in change management do not achieve their projected benefits. The difference is not marginal — it is the difference between delivery and failure. And in 73% of cases, the change management budget was reduced after programme approval."

CN Research, 2024 — based on the 140-programme sample
What separates the 30%

Six factors. Each was present in at least two-thirds of top-quartile programmes, and absent from most bottom-quartile programmes.

Change management embedded from day one
Present in 89% of top quartile

Top-quartile programmes had change practitioners involved in programme design from the first week. Bottom-quartile programmes brought change management in after the operating model was designed — on average, in week 14 of delivery.

Informal influence network mapped before communications
Present in 82% of top quartile

Top-quartile programmes explicitly mapped informal influence structures and designed engagement specifically for informal leaders. Bottom-quartile programmes engaged formal stakeholders by seniority. 78% of them did not identify informal leaders at all.

Manager enablement programme (not briefing)
Present in 79% of top quartile

Top-quartile programmes ran structured manager enablement programmes — including scenario practice for difficult conversations. 91% of bottom-quartile programmes ran manager briefings only.

Benefits baseline established before programme start
Present in 91% of top quartile

91% of top-quartile programmes established a measurable baseline before any programme activity changed the thing being measured. 67% of bottom-quartile programmes had no baseline at all — making benefit verification impossible.

Embedding programme resourced through month 3
Present in 76% of top quartile

Top-quartile programmes maintained active embedding resource for at least 12 weeks post go-live. 84% of bottom-quartile programmes stood down change resource at or before go-live.

Month 12 verification built into SoW
Present in 68% of top quartile

Top-quartile programmes planned and resourced a formal month 12 benefits review before delivery began. Only 12% of bottom-quartile programmes conducted any form of month 12 verification — and of those, 71% conducted it informally without reference to a baseline.
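The baseline and month 12 verification factors above reduce to one calculation: benefit delivered is the improvement over the pre-programme baseline, expressed against the projected improvement. Without a baseline, neither term is recoverable. A hypothetical sketch (all figures invented for illustration, not drawn from the sample):

```python
# Hypothetical sketch: why a pre-programme baseline is required to
# verify benefits at month 12. All figures are invented.

def benefits_delivered_pct(baseline: float,
                           month_12_actual: float,
                           projected_improvement: float) -> float:
    """% of the projected benefit actually delivered at month 12,
    measured as improvement over the pre-programme baseline."""
    actual_improvement = month_12_actual - baseline
    return 100 * actual_improvement / projected_improvement

# e.g. a metric baselined at 100, projected to improve by 40,
# actually standing at 130 at month 12:
pct = benefits_delivered_pct(baseline=100, month_12_actual=130,
                             projected_improvement=40)
# 75.0, which meets the report's >=75% delivery threshold
```

Without the `baseline=100` figure, the 130 observed at month 12 cannot be attributed to the programme at all, which is the sense in which the report calls verification "impossible".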

The budget cut finding

73% of programmes in the sample had their change management budget reduced after initial approval.

73%
had change management budget reduced post-approval
The average reduction was 41% of the originally approved change management budget. The reductions occurred on average in month 4 of delivery — after discovery, before full delivery.
3.1x
higher cost of remediation vs original change investment
In programmes where the change management budget was reduced and adoption subsequently failed, the cost of remediation — additional change work in year two — averaged 3.1x the amount that was cut from the original budget.

"Cutting the change management budget is not a cost saving. It is a deferred cost — typically at a multiple of three or more. The programmes that cut change management investment in months three to six are the programmes that come back for remediation in year two. The remediation costs more than the original change programme would have."

This finding has a specific implication for programme governance. Business cases that are approved with adequate change management investment and then reduced in delivery are not saving money — they are taking on a liability. Programme boards and finance leads who approve mid-programme reductions to the people workstream should understand what the historical evidence says about the likely outcome.
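The liability framing above can be made concrete with the sample averages already quoted (a 41% average cut and a 3.1x remediation multiple). A hypothetical sketch, with the programme budget figure invented for illustration:

```python
# Hypothetical illustration of the budget-cut arithmetic. The 41% cut
# and 3.1x remediation multiple are the sample averages quoted in the
# report; the approved budget figure is invented.

def deferred_cost_of_cut(approved_cm_budget: float,
                         cut_fraction: float = 0.41,
                         remediation_multiple: float = 3.1) -> dict:
    """Compare the in-year 'saving' from cutting the change management
    budget with the expected year-two remediation cost."""
    amount_cut = approved_cm_budget * cut_fraction
    remediation = amount_cut * remediation_multiple
    return {
        "amount_cut": amount_cut,        # booked as a saving in-year
        "remediation": remediation,      # expected year-two cost
        "net_liability": remediation - amount_cut,
    }

# Example: a £1.0m approved change management budget.
# A £410k cut carries an expected £1.271m remediation bill,
# a net liability of £861k.
result = deferred_cost_of_cut(1_000_000)
```

On these averages, every pound cut from the people workstream returns as roughly three pounds of remediation, which is the multiple the pull quote refers to.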

Sector breakdown

Benefit delivery rate by sector at month 12.

Technology: 44%
Highest delivery rate. Strong change management culture and programme discipline.
Financial Services: 32%
Strong governance but high complexity. Regulatory change driving programme congestion.
Pharma & Life Sciences: 31%
High science capability, variable change management maturity. Programme governance strong; people engagement weaker.
Government & Public Sector: 24%
Lowest delivery rate. Large programme scale, high complexity, significant change fatigue across the sector. Change management investment most frequently reduced post-approval.

Note: delivery rate is defined as the % of programmes delivering ≥75% of projected benefits at month 12 — a count of programmes meeting the threshold, not the % of benefit value delivered.
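The distinction in the note matters because the two measures can diverge sharply. A hypothetical sketch (programme figures invented for illustration):

```python
# Hypothetical sketch of the note's definition: delivery rate counts
# programmes meeting the >=75% threshold; it is not the share of
# benefit value delivered. Programme figures are invented.

def delivery_rate(benefit_pcts: list, threshold: float = 75.0) -> float:
    """% of programmes whose delivered-benefit % meets the threshold."""
    meeting = sum(1 for p in benefit_pcts if p >= threshold)
    return 100 * meeting / len(benefit_pcts)

# Five invented programmes delivering 90%, 80%, 60%, 40% and 20%
# of projected benefits:
rate = delivery_rate([90, 80, 60, 40, 20])
# delivery rate = 40.0 (2 of 5 meet the threshold), even though the
# mean benefit value delivered across the five is 58%
```

A sector can therefore deliver a substantial share of total benefit value while still posting a low delivery rate, if the value is concentrated in a few programmes.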

Implications

What the data says to programme boards, finance leads and senior sponsors.

The business case must include a change management investment adequate for the scale of the change

Boards that approve business cases with inadequate change management investment are approving programmes that are statistically unlikely to deliver their projected benefits. The data on this is not ambiguous.

Post-approval reductions to the people workstream should require board-level sign-off with explicit recognition of the risk

73% of programmes in this sample had their change management budget reduced after approval. This decision was made at programme director level in most cases — without explicit board recognition that it was a programme risk decision, not a cost-saving decision.

Month 12 verification should be standard in every programme SoW

Only 29% of programmes in the sample included any form of month 12 verification. The 71% that did not cannot demonstrate whether they delivered. They cannot learn from what went wrong. And they cannot hold their delivery partners accountable for outcomes rather than activities.

Discuss this research

If this data reflects what you're seeing in your organisation, we should talk.

Start a conversation