CN Intelligence

Research & Data

Original analysis from CN practitioners, not surveys or secondary research: data on how transformation programmes actually perform, what operating models really look like, and where the gap between ambition and reality lives.

Methodology note: All CN research is based on practitioner analysis of publicly available data combined with aggregated anonymised observations from CN engagements. Full methodology published with each study.
Four studies. One picture.
Change Programme Performance Index
Adoption rates · failure modes · benefits realisation · 200+ programmes
Operating Model Maturity Benchmark
Six-layer maturity across UK organisations · public & private
AI Transformation Readiness Index
Ambition vs reality gap · change capability vs technology readiness
Government Transformation Outcomes Study
Month 12, 24 & 36 outcomes · UK public sector · benefits realisation
01
CN Research · 2026
Change Programme
Performance Index
Analysis of 200+ transformation programmes across UK public and private sectors. The data on what actually determines whether a change programme holds at month 6, month 12 and beyond — and what the programmes that work have in common.
Key findings
67%
of change programmes fail to achieve sustained adoption at month 12
The industry-standard claim that "70% of change fails" is imprecise: our data suggests the real number varies by sector and by definition, but the direction is accurate.
3.2x
higher adoption rate in programmes with manager enablement vs manager briefing
The single most predictive factor in our dataset, ahead of communications spend and training volume.
22%
of programmes include a structured manager enablement programme (not just briefing)
Despite being the highest-impact intervention, it remains the exception rather than the rule.
18%
of programmes have benefits formally verified at month 12 against a baseline
Most organisations close the programme at go-live and declare success. The 18% who return to verify have significantly higher actual realisation rates.
Adoption rate by programme design characteristic
Manager enablement programme (not briefing): 78%
Informal network mapped before go-live: 71%
Go-live readiness assessed (not assumed): 65%
Hypercare period > 30 days post go-live: 61%
Communications only, no structural intervention: 24%
Programme design vs benefits realisation at month 12
Programmes with benefits baseline + month 12 review: 84%
Programmes with benefits baseline, no review: 51%
Programmes with no formal benefits baseline: 28%
Discuss your programme →
02
CN Research · 2026
Operating Model Maturity
Benchmark
Where UK organisations actually sit across the six operating model layers — and what separates the 14% with genuinely integrated operating models from the majority still stuck in design-document mode.
Maturity by layer (average score, 1–5 scale)
👥 People & Organisation: 2.9 / 5
🔄 Process & Ways of Working: 2.6 / 5
💡 Technology & Systems: 3.1 / 5
🎯 Service Delivery: 2.8 / 5
📊 Performance & Insight: 2.2 / 5
🏛️ Governance & Accountability: 2.5 / 5
14%
have a fully integrated operating model where all six layers are designed to work together
61%
have redesigned at least one layer without redesigning the adjacent layers — the most common cause of integration failure
2.2
average maturity on Performance & Insight — the lowest of any layer and the one with the highest leverage for improvement
4.1x
higher benefits realisation in organisations with integrated OM design vs single-layer redesigns
Assess your operating model →
03
CN Research · 2026
AI Transformation
Readiness Index
The gap between AI ambition and AI reality in UK enterprises. The data reveals it is almost never a technology problem — it is a change problem. The organisations that get AI to scale have better change capability, not better technology.
89%
of executives cite AI transformation as a top-3 strategic priority for 2026
11%
have AI programmes operating at meaningful scale with measurable business outcomes
3.7x
higher technology readiness score vs change readiness score — the gap that explains why AI stalls
8%
of AI programmes have a named change lead — despite change capability being the primary predictor of adoption
AI readiness by dimension — UK enterprises 2026
Technology infrastructure readiness: 72%
Data quality and architecture: 54%
Leadership alignment and sponsorship: 61%
Workforce change capability: 19%
Manager readiness for AI-driven role change: 14%
Talk to us about AI adoption →
04
CN Research · 2026
Government Transformation
Outcomes Study
What actually happens to UK government digital and operational transformation programmes at month 12, 24 and 36. The longitudinal data on benefits realisation, adoption trajectories and the programme factors that predict whether transformation holds in the public sector.
Business case outcomes — government programmes by time horizon
Programmes on track at go-live (self-reported): 78%
Meeting business case at month 12: 41%
Meeting business case at month 24: 24%
Full benefits realised at month 36: 14%
Programme design factors — government sector
Programmes with formal benefits baseline: 41%
Civil service change capability rated 'strong': 19%
Senior responsible owner engaged post go-live: 23%
Programmes with hypercare > 60 days: 31%
The 78% → 14% collapse

The most striking finding in the government dataset is the trajectory: 78% of programmes report being on track at go-live, but only 14% achieve full benefits realisation at month 36. The collapse happens in the transition from go-live to embedding — the period when the formal programme structure is stood down but the organisation has not yet built the capability to sustain the change independently.

Discuss a government programme →