How Do We See What Happened in the Business?
Most finance teams produce reports. The question is whether anyone trusts them enough to act.
Not “do we have a dashboard?” but “do we have decision-ready facts, on time, every time?”
Mid-market teams rarely fail because they can’t calculate. The pack closes late. The numbers shift after publication. Sales and finance argue over “the real revenue.” Two weeks of close leaves no capacity for the analysis the business actually needs.
What Good Reporting Produces
- Trust: The first reaction to any number is “what do we do?” — not “let me check this first.”
- Cadence: Decision-ready information at the right level, on an agreed schedule, every cycle without exception.
- Decision focus: Information structured around what needs to be decided — not around what data happens to be available.
Key Business Questions
- Do we trust the numbers we see? If the first reaction is re-verification, reporting has failed its purpose.
- Does everyone look at the same information? Parallel spreadsheets mean competing computation paths, not one version of the truth.
- How quickly can we understand deviations? A variance explained in five days is a lesson. In five weeks, it is history.
- How much effort does reporting consume? If month-end close absorbs the team for two weeks, there is no capacity left for analysis.
- Are results explained, or just presented? Numbers without context are data. Numbers with structure, thresholds, and ownership are insight.
The Reporting Control System
Reporting is not a collection of outputs. It is a control system — six mutually reinforcing components that produce reliable, decision-ready information every cycle. A gap in any one undermines the rest.
1) KPI framework: governed definitions and computation paths
Each KPI has one definition, one owner, one computation path — established in Governance & Data Trust and maintained through its documented change control process. When three departments calculate “revenue” differently, the board doesn’t have a KPI problem — it has a trust problem. Fewer KPIs with rules beat more KPIs without them.
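The rule above — one definition, one owner, one computation path, changes gated behind a documented step — can be sketched as a registry plus a gate. A minimal illustration only: the registry shape, the `net_revenue` entry, and the `propose_change` helper are assumptions for this sketch, not a prescribed schema.

```python
# Hypothetical KPI registry: one definition, one owner, one computation
# path per metric. Versioning makes every definition change visible.
KPI_REGISTRY = {
    "net_revenue": {
        "owner": "Head of FP&A",
        "definition": "Invoiced revenue less credit notes and rebates",
        "computation": "SUM(invoices.net) - SUM(credit_notes.net)",
        "version": 3,
    },
}

def propose_change(metric, field, new_value, approved_by):
    """Apply a definition change only when it carries a named approver —
    the minimal form of a documented change-control step."""
    if not approved_by:
        raise ValueError(f"Change to {metric}.{field} requires an approver")
    entry = KPI_REGISTRY[metric]
    entry[field] = new_value
    entry["version"] += 1
    return entry
```

The point of the sketch is the gate, not the storage: an unsanctioned edit raises immediately, so “three versions of revenue” cannot accumulate silently.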
2) Structure: packs by decision
Audiences, levels, standard packs, exception views — defined once, used every cycle. The board pack is not the management pack. Each pack is designed around what a specific audience needs to decide — not around what the system can export.
3) Cadence: a closed cycle with deadlines
What is available on Day 1, Day 3, Day 5. Which numbers are preliminary vs final. When leadership reviews, and what decisions are taken. Companies that discipline their close cycle often move from two-week closes to three-to-four-day closes — freeing weeks for analysis instead of preparation.
4) Data: single computation path per metric
One agreed definition and one approved computation path per metric, with controlled exceptions documented. One reconciled set of actuals. One hierarchy for customers, products, regions. Not “the finance version” — the company version.
5) Design: exceptions-first information architecture
What changed? Where? How big? Who owns it? The same 40-page pack lands on every desk, and nobody reads past page three. Reports that work surface deviations first, set thresholds, and assign accountability: direction, not information volume. Each view earns its place by triggering a decision: if a variance exceeds its threshold, it requires an owner response. No breach — no page required.
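The “no breach, no page” rule is mechanical enough to sketch. A minimal illustration, assuming hypothetical `Variance` fields and threshold values (none of this is a prescribed data model):

```python
from dataclasses import dataclass

@dataclass
class Variance:
    metric: str        # KPI name
    actual: float      # reported value
    plan: float        # agreed baseline
    threshold: float   # absolute breach threshold
    owner: str         # named variance owner

def exception_views(variances):
    """Return only the variances that breach their threshold.

    Each surviving entry earns its page: it names the deviation,
    its size, and the owner who must respond. Everything else
    stays out of the pack.
    """
    breaches = []
    for v in variances:
        delta = v.actual - v.plan
        if abs(delta) > v.threshold:
            breaches.append({"metric": v.metric, "delta": delta, "owner": v.owner})
    return breaches

pack = exception_views([
    Variance("Revenue", 980_000, 1_050_000, 50_000, "Sales Director"),
    Variance("Opex", 312_000, 300_000, 25_000, "COO"),
])
# Only Revenue breaches its threshold; Opex stays off the page.
```

Inverting the default — suppress unless breached — is what turns a 40-page pack into three pages that each demand an owner response.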
6) Quality: accuracy, traceability, reconciliation
Every number is accurate (ties to source records), traceable (path from source to pack is documented), reconciled (sub-ledgers align to ledger), and consistent (definitions do not shift by user). Not aesthetic perfection — provable accuracy.
Cadence Blueprint
Cadence is not about speed for its own sake. It is about decision windows — defining what truth looks like at each point in the close cycle, then running it reliably.
- Day 0 (close night): Data ingestion rules applied, cutoffs confirmed, ledger locked
- Day 1: Flash view — material movements, early exceptions, preliminary revenue
- Day 3: Reconciled P&L, revenue bridge, working capital view — reviewed by finance leadership
- Day 5: Final management pack — narrative, variance owners, actions logged
Different organisations choose different rhythms. The principle is fixed: define the schedule, assign what is available at each point, hold it.
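Expressed as data, a close calendar like the one above becomes something the team can check deliveries against rather than a slide. A sketch under assumptions — the deliverable names mirror the blueprint bullets, and the `overdue` helper is an invention for illustration:

```python
# Hypothetical close calendar: day of cycle -> expected deliverables.
CLOSE_CALENDAR = {
    0: ["cutoffs confirmed", "ledger locked"],
    1: ["flash view"],
    3: ["reconciled P&L", "revenue bridge", "working capital view"],
    5: ["final management pack"],
}

def overdue(deliveries, today):
    """Return deliverables whose day has passed but which have
    not been logged as delivered."""
    missing = []
    for day, items in CLOSE_CALENDAR.items():
        if day <= today:
            missing += [i for i in items if i not in deliveries]
    return missing

# On Day 3, the flash went out but the reconciled P&L is late:
late = overdue({"cutoffs confirmed", "ledger locked", "flash view",
                "revenue bridge", "working capital view"}, today=3)
```

Whatever rhythm an organisation picks, the same shape holds: the schedule is explicit, each point has named deliverables, and a miss is detectable on the day it happens.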
Reporting ownership at a glance:
- Metric owner: defines the KPI, controls its definition and computation path
- Data owner: ensures source data arrives accurately and on time
- Finance (reconcile / release): validates, reconciles, and publishes the pack
- Business (commentary): explains variances against agreed deadlines
- Decision / action owner: converts pack findings into a logged management decision
Reporting Health: Quality Metrics
A reporting control system is only as strong as its weakest component. Six metrics indicate whether it is performing.
- Timeliness: Pack delivered within the agreed close schedule — Day 1 flash, Day 5 final. Recurring delays signal a cadence or data dependency failure.
- Stability: Frequency of post-publication corrections. More than two restatements per quarter is a data quality signal, not an exception.
- Reconciliation coverage: Percentage of reported figures tied to sub-ledger records. Target: 100% for P&L and balance sheet. Any gap is a quality control failure.
- Definition compliance: Percentage of reported KPIs with a documented definition and named owner. A gap means any user can recalculate independently — and will.
- Effort ratio: Share of reporting cycle time spent on data preparation vs analysis. Above 70% preparation is a structural problem, not a headcount problem.
- Decision usage: Whether leadership decisions reference the standard pack — or route around it to shadow reports and verbal updates.
Measuring these requires no new tool. The close log, restatement history, and pack distribution records contain the evidence.
Together, they protect the baseline — when any one degrades, the disciplines built on top inherit the instability.
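Two of these indicators — effort ratio and reconciliation coverage — reduce to simple arithmetic over records most teams already keep. A minimal sketch, assuming hypothetical log shapes (a list of activity/hours pairs, and figures flagged with a `tied_to_subledger` boolean):

```python
def effort_ratio(close_log):
    """Share of cycle hours spent on preparation vs analysis.

    close_log: list of (activity, hours) pairs, where activity is
    "preparation" or "analysis". Above 0.70 is the structural-problem line.
    """
    prep = sum(h for a, h in close_log if a == "preparation")
    total = sum(h for _, h in close_log)
    return prep / total if total else 0.0

def reconciliation_coverage(figures):
    """Fraction of reported figures tied to sub-ledger records.

    figures: list of dicts with a boolean 'tied_to_subledger' flag.
    Target is 1.0 for P&L and balance sheet lines.
    """
    if not figures:
        return 0.0
    tied = sum(1 for f in figures if f["tied_to_subledger"])
    return tied / len(figures)

ratio = effort_ratio([("preparation", 56), ("analysis", 24)])
# 56 of 80 hours on preparation: exactly at the 70% line.
```

Neither function needs a new tool — the inputs are the close log and the pack's line-item list, both of which already exist.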
Reporting Areas
Management Reporting
The gap is rarely a missing report. It is a report that arrives too late, uses inconsistent definitions, or requires manual assembly every month. Most mid-market companies have the data — they lack the framework to consolidate it into decision-ready information without heroic manual effort.
→ Management Reporting Framework · Building Effective Management Reports · Reporting Frequency and Cadence
KPIs and Dashboards
A dashboard without governed KPI definitions is a liability — the illusion of insight over inconsistencies underneath. Who owns the metric? How is it calculated? Where does the data come from? Without answers, dashboards become contested territory rather than decision aids.
→ Designing Effective KPIs · KPI Hierarchies · Dashboard Design
Automation and Efficiency
Most finance teams spend 80% of reporting time on data preparation and 20% on analysis. The goal is to invert that ratio through automation of the repeatable — extraction, consolidation, validation, formatting — not through tools alone, but through governed, repeatable pipelines.
→ Reporting Automation Fundamentals · Reducing Manual Reporting Effort · Self-Service Reporting
Quality and Governance
Reporting quality is not an internal finance concern. It is enterprise risk. In many sectors and jurisdictions, audit processes increasingly include full-population testing through automated analytics. Inconsistent definitions, untraceable adjustments, and missing reconciliation trails are increasingly treated as findings rather than inconveniences.
→ Reporting Accuracy and Data Quality · Reconciliation · Audit Trail and Traceability
Inputs, Controls, Outputs, Decisions
- Inputs: Source system records, ERP extracts, sub-ledger data — governed by definitions and computation paths controlled in Governance
- Controls: Single computation path per metric, reconciliation to sub-ledger, threshold-based exception design — each view triggers a decision or is not required
- Outputs: Management packs, flash views, KPI dashboards — published on an agreed cadence with ownership and variance narrative attached
- Decisions enabled: Period review, variance owner response, resource reallocation — logged with owner and due date in each reporting cycle
What Reporting Is Not
Reporting is overloaded. Boundaries matter.
- Why did it happen? — that is Performance Analysis.
- What will happen? — that is Planning & Projections.
- Can we trust it? — that is Governance & Data Trust.
Reporting answers one question: what happened? It is the reliable baseline. Other disciplines build on it.
Why Reporting Is the Foundation
Without reliable reporting, every other finance capability degrades:
- Performance analysis cannot identify true drivers when underlying numbers are inconsistent.
- Planning disconnects from reality when actuals are delayed or disputed.
- Governance becomes policy without enforcement when data lacks traceability.
Governed definitions and single computation paths from Governance are what make consistent reconciliation possible — reporting can only hold its cadence when the inputs it receives are controlled and versioned.
Reconciled actuals and stable definitions from Reporting are the uncontested baseline Performance Analysis depends on: analysts decompose variances by driver instead of first reconstructing whether the numbers are right.
Strong reporting is the foundation of one control system. Every analytical, planning, and governance capability becomes more effective when built on trusted, timely information.
→ Why Reporting Matters for Mid-Market Companies — the full argument
Typical Situations
- Month-end close runs to day 12, so the management meeting happens when most of the corrective-action window has already closed
- Finance and the commercial team run separate revenue trackers with different adjustment logic, so the board meeting opens with a 20-minute reconciliation discussion before the agenda begins
- A deviation surfaces in the board meeting that is not in the pack — two team members spend the following three days rebuilding the attribution from source records rather than acting on the finding
- An acquisition adds a new entity to the consolidation, and the close extends by five days each month because the chart of accounts does not map — not a one-time effort but a structural addition to every future cycle
- An investor requests a restated prior-quarter figure, and the restatement requires re-extracting source data because the adjustment made at the time was not documented in the published pack
Next Steps
- Explore reporting topics in depth — Knowledge Hub
- See how organisations apply reporting capability — Use Cases
- Discuss your situation — Contact