OM Layer 05 — Operating Model

Performance, Insight & Data.

Management information designed around decisions, not available data. KPI frameworks that measure what matters. Performance culture that treats problems as information.

Full practitioner methodology in the CN Portal →
The key principles

How CN approaches this work.

01
Start with the decision, not the data
The question for MI design is not "what data do we have?" but "what decisions do managers need to make, and what information do they need to make them well?" MI that is designed forward from available data produces reports that describe what happened. MI designed backward from decisions produces information that changes what happens next.
02
Connect strategic metrics to operational ones
A performance framework in which the board measures one set of outcomes and front-line managers measure a completely different set of activities is a performance framework that does not connect. The chain from strategic objective to operational metric must be explicit and understood at every level.
03
Measure what matters, not what is easy to count
Organisations accumulate metrics over time — each initiative adding its own KPIs, none ever removed. The result is a reporting suite that consumes resource and influences no decisions. CN conducts a rigorous review: what is measured now, why, by whom, and what decisions it influences. Metrics that influence no decisions are candidates for removal.
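The review described above can be sketched as a simple audit: map each metric in the reporting suite to the decisions it actually informs, then flag any metric that informs none as a candidate for removal. This is an illustrative sketch only; the metric and decision names are hypothetical, not drawn from the CN methodology itself.

```python
# Minimal sketch of a metric rationalisation review: each metric is mapped
# to the decisions it informs; metrics informing no decision are flagged.
# All metric and decision names below are illustrative placeholders.

def review_metrics(metric_decisions: dict[str, list[str]]) -> dict[str, list[str]]:
    """Split a metric-to-decisions map into kept metrics and removal candidates."""
    kept = [m for m, decisions in metric_decisions.items() if decisions]
    candidates = [m for m, decisions in metric_decisions.items() if not decisions]
    return {"keep": kept, "review_for_removal": candidates}

# Hypothetical reporting suite: what decision does each metric influence?
suite = {
    "calls_answered_in_30s": ["adjust daily staffing levels"],
    "complaints_per_1000_orders": ["prioritise process fixes"],
    "monthly_page_views": [],        # reported for years, informs nothing
    "legacy_system_uptime": [],      # added by an initiative, never removed
}

result = review_metrics(suite)
print(result["review_for_removal"])  # metrics influencing no decisions
```

The value of even a crude exercise like this is that it forces the question "what decision?" to be answered metric by metric, rather than suite by suite.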
04
Data quality is a design requirement, not a data team problem
MI is only as good as the data it is built on. Data quality problems that are known but not addressed produce MI that is distrusted and therefore ignored. Addressing data quality before designing the MI framework prevents the rework of building on unreliable foundations.
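One way to make "data quality is a design requirement" concrete is to run basic completeness and validity checks on the source records before any MI is built on them. The sketch below assumes a simple record structure with illustrative field names and rules; it is not a prescription from the methodology.

```python
# Minimal sketch of a pre-build data quality check: count records that are
# incomplete (required fields missing) or invalid (fail a validity rule).
# Field names and rules are illustrative placeholders.

def quality_report(records: list[dict], required: list[str],
                   rules: dict) -> dict[str, int]:
    """Count records failing completeness or validity checks."""
    incomplete = sum(1 for r in records
                     if any(r.get(f) in (None, "") for f in required))
    invalid = sum(1 for r in records
                  if any(f in r and r[f] not in (None, "") and not ok(r[f])
                         for f, ok in rules.items()))
    return {"total": len(records), "incomplete": incomplete, "invalid": invalid}

records = [
    {"order_id": "A1", "value": 120.0, "region": "North"},
    {"order_id": "A2", "value": -5.0, "region": "North"},   # invalid value
    {"order_id": "A3", "value": 80.0, "region": ""},        # missing region
]
report = quality_report(records,
                        required=["order_id", "value", "region"],
                        rules={"value": lambda v: v >= 0})
print(report)  # {'total': 3, 'incomplete': 1, 'invalid': 1}
```

A report like this, produced before the MI framework is designed, turns "known but not addressed" quality problems into a visible, countable backlog.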
05
Performance culture is designed, not assumed
A technically excellent MI framework will not improve performance in an organisation that treats performance data as a threat rather than information. The culture — how reviews are conducted, how targets are set, how problems are surfaced — is a design decision as much as the framework itself.
What good looks like
  • KPIs designed backward from decisions, not forward from available data
  • Clear line of sight from strategic objectives to operational metrics
  • MI rationalised — every report connected to a decision
  • Data quality assessed and addressed before MI framework is built
  • Performance reviews designed to surface problems, not explain them away
  • Reporting frequency matched to the cadence of decision-making
Warning signs
  • KPIs accumulated over time, never rationalised
  • Dashboard exists but behaviour does not change when numbers are red
  • Strategic metrics disconnected from what front-line managers measure
  • Data quality problems known but not addressed
  • Performance reviews focused on explanation and mitigation, not improvement
Diagnostic questions

Use these in client conversations or team reviews to quickly surface where the real issues are.

Q: For each report produced last month: what decision was made differently because of it?
Q: Can a front-line manager draw a direct line between their daily operational metrics and the strategic objectives of the organisation?
Q: When was the last time a performance review produced a decision to do something differently — rather than an explanation of why the number was what it was?
Full Practitioner Guide

The complete methodology is in the CN Portal.

The full guide covers: decision-first MI design methodology, KPI framework construction, data quality assessment, reporting rationalisation, performance review design, data architecture for MI, and the approach to building a performance culture alongside the framework.
