Financial data quality has five measurable dimensions — accuracy, completeness, consistency, timeliness, and validity — and each requires specific assessment questions, not generic checklists. ACCA’s Global Survey 2024 found 62% of finance professionals spend significant time fixing data errors rather than analysing results, while Hackett Group benchmarks show top-quartile finance organisations spend 30% less time on data reconciliation than peers. The assumption that “the ERP ensures data quality” is false — ERP systems enforce transaction integrity, not data correctness; wrong data entered correctly is still wrong. Data quality is not a one-time cleanup project but a continuous discipline; organisations that treat it as a project see quality degrade back to baseline within months. A 20-question diagnostic across the five dimensions maps each answer to four levels of data governance maturity, giving CFOs a practical maturity score and a prioritised action list for closing the gaps that matter most.
Data quality in finance is not an abstract concept — it is the difference between a board meeting that debates decisions and one that debates numbers. When leadership asks “is this number right?” before every discussion, the organisation has a data quality problem. When leadership acts on numbers without that question, it has decision-grade data — data reliable enough for management decisions without additional verification.
ACCA’s Global Survey 2024 found that 62% of finance professionals spend significant time fixing data errors rather than analysing results. The Hackett Group benchmarks show that top-quartile finance organisations spend 30% less time on data reconciliation than their peers. The difference is not technology — it is discipline: systematic quality checks applied at the right points in the data lifecycle.
This article provides a diagnostic checklist — 20 specific questions across five quality dimensions — that any CFO or controller can use to assess where their organisation stands and what to fix first.
Five Dimensions of Financial Data Quality
Data quality frameworks in the academic literature typically list six to twelve dimensions. For financial data in mid-market organisations, five dimensions capture the practical reality:
| Dimension | Definition | What It Means in Finance |
|---|---|---|
| Accuracy | Data correctly represents the real-world event it records | The invoice amount in the system matches the actual invoice; the cost centre allocation reflects where the cost was incurred |
| Completeness | All required data is present — no gaps, no missing records | Every transaction is recorded; every entity is included in the consolidation; every period is closed |
| Consistency | The same data yields the same answer regardless of where it is accessed | Revenue in the management report matches revenue in the board pack matches revenue in the statutory accounts (or differences are documented and explainable) |
| Timeliness | Data is available when the decision it informs needs to be made | Monthly numbers are available by working day 5, not working day 15; cash position is updated daily, not weekly |
| Validity | Data conforms to defined rules, formats, and ranges | Account codes exist in the chart of accounts; posting dates fall within open periods; currency codes are valid ISO codes |
Each dimension fails independently. Data can be accurate but late. It can be timely but incomplete. It can be complete but inconsistent across reports. The checklist assesses each dimension separately because the remediation differs for each.
The 20-Question Diagnostic
The following questions are designed for finance leaders at mid-market companies (£1–50M revenue). For each question, score your organisation on a four-point scale:
- 1 — Not at all: This does not happen or does not exist
- 2 — Partially: This happens sometimes or for some data, but not systematically
- 3 — Mostly: This is standard practice with occasional gaps
- 4 — Fully: This is systematic, documented, and consistently applied
Accuracy (Questions 1–4)
1. Do you have documented data validation checks that run before the monthly report is distributed? Validation checks — control totals, balance checks, reasonableness tests — catch errors before they reach decision-makers. Without them, the board meeting becomes the quality control step.
2. When was the last time a material error was found in a distributed report? If the answer is “within the last three months,” the validation process is insufficient. Material errors in distributed reports destroy trust and take months to rebuild.
3. Can you trace any reported number back to its source transaction within 30 minutes? An audit trail is not a compliance luxury — it is the mechanism that allows errors to be identified, traced, and corrected. If tracing a number requires asking three people and opening four spreadsheets, the audit trail is broken.
4. Are intercompany transactions reconciled and eliminated systematically before consolidation? Unreconciled intercompany balances are the single most common source of material errors in multi-entity reporting. If elimination is manual and ad hoc, accuracy is dependent on the skill of one individual rather than the robustness of a process.
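The pre-distribution checks in question 1 can be sketched in a few lines. This is an illustrative sketch only, assuming transactions arrive as plain dicts with hypothetical `debit`, `credit`, and `account` fields; a real implementation would read from the ERP extract and carry account-specific tolerances.

```python
def validate_before_distribution(transactions, prior_period_totals, tolerance=0.25):
    """Run balance, control-total, and reasonableness checks; return failures."""
    failures = []

    # Balance check: debits must equal credits across the trial balance.
    total_debits = sum(t["debit"] for t in transactions)
    total_credits = sum(t["credit"] for t in transactions)
    if round(total_debits - total_credits, 2) != 0:
        failures.append(
            f"Trial balance out of balance by {total_debits - total_credits:.2f}"
        )

    # Reasonableness test: net movement per account compared with the prior
    # period; swings larger than the tolerance are flagged for review.
    by_account = {}
    for t in transactions:
        by_account[t["account"]] = (
            by_account.get(t["account"], 0.0) + t["debit"] - t["credit"]
        )
    for account, total in by_account.items():
        prior = prior_period_totals.get(account)
        if prior and abs(total - prior) / abs(prior) > tolerance:
            failures.append(
                f"Account {account}: {total:.2f} vs prior {prior:.2f} "
                f"(>{tolerance:.0%} swing)"
            )

    return failures
```

If this function returns an empty list, the report can be distributed; anything else goes back to the preparer before the board sees it.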
Completeness (Questions 5–8)
5. Does a documented checklist confirm all entities, cost centres, and accounts are included before the month-end close is finalised? Missing entities or incomplete postings are not always visible in the finished report. A completeness checklist catches what is absent — which is harder to spot than what is wrong.
6. Are accruals and provisions reviewed systematically each period, or are they carried forward by default? Stale accruals distort the P&L silently. A provision booked six months ago and never reviewed is not conservative accounting — it is incomplete data management.
7. Do you have a process to identify transactions that failed to post or were parked without resolution? Parked, held, or failed transactions represent completeness gaps that are invisible in standard reports. Without a systematic process to identify and resolve them, the reported numbers understate or misstate reality.
8. Is non-financial data (headcount, FTEs, units sold) captured with the same discipline as financial data? KPIs that combine financial and non-financial data — revenue per employee, cost per unit, margin per client — are only as reliable as the weakest input. If headcount data comes from a spreadsheet updated quarterly while financial data is monthly, the resulting KPI is structurally flawed.
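A minimal version of the completeness checklist in questions 5 and 6 can be automated. The sketch below uses hypothetical entity codes and accrual records with a `last_reviewed` date; the point is that absence must be checked against a master list, because what is missing is invisible in the postings themselves.

```python
from datetime import date

EXPECTED_ENTITIES = {"UK01", "DE01", "US01"}  # hypothetical entity codes

def completeness_gaps(posted_entities, accruals, period_end, max_age_months=3):
    """Flag entities with no postings and accruals not reviewed recently."""
    issues = [f"No postings for entity {e}"
              for e in sorted(EXPECTED_ENTITIES - set(posted_entities))]
    for a in accruals:
        # Months since the accrual was last reviewed.
        age = (period_end.year - a["last_reviewed"].year) * 12 \
              + (period_end.month - a["last_reviewed"].month)
        if age > max_age_months:
            issues.append(f"Accrual {a['id']} not reviewed for {age} months")
    return issues
```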
Consistency (Questions 9–12)
9. If you asked three people in the company “what was last month’s revenue?” would they give the same answer? This is the definitive consistency test. If the sales director, the controller, and the CEO cite different numbers, the company has multiple sources of truth. Gartner estimates this is the norm — three to five competing “truths” in the average organisation.
10. Are key metric definitions documented and accessible to everyone who uses them? A single source of truth begins with agreed definitions. If “revenue” means booked deals in the CRM and invoiced amounts in the ERP, the company does not have one metric — it has two. The KPI framework article covers this in depth.
11. When you compare the management report to the statutory accounts, can you explain every difference? Differences between management and statutory reports are normal (different classification, different treatment of certain items). Unexplained differences are a governance failure.
12. Do all entities in your group use the same chart of accounts structure? Inconsistent chart of accounts architecture is the structural root of most consistency problems. If each entity has a different account structure, consolidation requires manual mapping — and every mapping introduces potential inconsistency.
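The "three people, one number" test in question 9 can also run as a reconciliation check. This is a sketch under assumptions: hypothetical source names, and a relative tolerance below which differences are treated as rounding rather than a competing truth.

```python
def consistency_check(figures, tolerance=0.005):
    """figures: {source_name: reported_value}. Flag sources that diverge
    from the first-listed source by more than the relative tolerance."""
    items = list(figures.items())
    baseline_name, baseline = items[0]
    return [f"{name}: {value:,.0f} differs from {baseline_name} ({baseline:,.0f})"
            for name, value in items[1:]
            if abs(value - baseline) / abs(baseline) > tolerance]
```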
Timeliness (Questions 13–16)
13. By which working day after month-end are final management accounts available? Top-quartile companies close by working day 3–5. If your close takes more than 10 working days, the data is potentially outdated by the time leadership sees it.
14. Is cash position data available daily? Cash is the one metric that cannot wait for the monthly close. If the CFO must ask the treasury team or log into multiple banking portals to determine the current cash position, the process is too slow for the decision it informs.
15. When a data error is identified, how long does it take to correct and redistribute? Error correction speed is a proxy for governance maturity. If correcting one number requires re-running five spreadsheets and redistributing three reports, the data pipeline has too many manual dependencies.
16. Are rolling forecasts updated with current-period actuals within five working days of month-end? A forecast that uses stale actuals is a forecast built on yesterday’s assumptions. The value of a rolling forecast depends on its currency.
Validity (Questions 17–20)
17. Are new accounts, cost centres, or dimension values created through a documented approval process? Ungoverned master data creation is a primary cause of invalid postings. When anyone can create a new cost centre or account code without review, the structure degrades with every new entry.
18. Does the system prevent postings to closed periods, inactive accounts, or invalid dimension combinations? System-enforced validity checks are the baseline. If the ERP allows postings to closed periods or non-existent cost centres, every report must be manually validated for invalid entries.
19. Are duplicate records (suppliers, customers, accounts) identified and resolved systematically? Duplicate master records produce inconsistent reporting, overstated balances, and reconciliation failures. A quarterly duplicate review is the minimum for maintaining validity.
20. Do you have automated alerts for transactions that exceed defined thresholds or violate business rules? Threshold-based alerts catch outliers before they propagate into reports. A single misposted invoice of £500,000 distorts the entire P&L if caught at the board meeting rather than at the point of entry.
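The threshold alerting in question 20 reduces to a simple filter. The thresholds below are hypothetical; a real implementation would hook into the ERP's posting workflow or a nightly extract, and the limits would come from a governed configuration rather than a hard-coded dict.

```python
DEFAULT_THRESHOLD = 100_000
ACCOUNT_THRESHOLDS = {"7500": 10_000}  # e.g. a tighter limit on travel costs

def postings_to_review(postings):
    """Return postings whose absolute amount exceeds the account's threshold."""
    return [p for p in postings
            if abs(p["amount"]) > ACCOUNT_THRESHOLDS.get(p["account"],
                                                         DEFAULT_THRESHOLD)]
```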
Scoring and Interpretation
Total your score across all 20 questions (minimum 20, maximum 80).
| Score Range | Maturity Level | Interpretation |
|---|---|---|
| 20–35 | Level 1 — Ad-hoc and undocumented | Data quality is ad hoc. Errors are found in distributed reports. Leadership debates numbers rather than decisions. Immediate governance intervention required. |
| 36–50 | Level 2 — Defined processes with basic validation | Basic controls exist but are inconsistently applied. Quality depends on key individuals rather than systematic processes. Focus on documentation and validation cadence. |
| 51–65 | Level 3 — Integrated and systematically reconciled | Quality processes are systematic and documented. Most errors are caught before distribution. Focus on automation and cross-source consistency. |
| 66–80 | Level 4 — Governed, monitored, and audit-ready | Data is governed, validated, and trusted. Leadership acts on numbers without questioning their accuracy. Focus on continuous improvement and AI readiness. |
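The scoring logic above is simple enough to sketch directly: sum the 20 answers (each 1–4) and map the total to a maturity band.

```python
LEVELS = [
    (20, 35, "Level 1 - Ad-hoc and undocumented"),
    (36, 50, "Level 2 - Defined processes with basic validation"),
    (51, 65, "Level 3 - Integrated and systematically reconciled"),
    (66, 80, "Level 4 - Governed, monitored, and audit-ready"),
]

def maturity_level(answers):
    """answers: list of 20 scores, each between 1 and 4."""
    if len(answers) != 20 or not all(1 <= a <= 4 for a in answers):
        raise ValueError("Expected 20 answers scored 1-4")
    total = sum(answers)
    for low, high, label in LEVELS:
        if low <= total <= high:
            return total, label
```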
Interpreting Dimension Scores
Beyond the total, examine which dimensions score lowest. The remediation priority differs:
| Lowest Dimension | Typical Root Cause | First Action |
|---|---|---|
| Accuracy | Insufficient validation, no reconciliation cadence | Implement pre-distribution validation checklist |
| Completeness | No close checklist, stale accruals, missing entities | Document month-end completeness checklist |
| Consistency | Undefined metrics, inconsistent CoA, multiple truths | Document definitions for five key metrics |
| Timeliness | Manual processes, sequential dependencies, late source data | Map the close process and identify bottlenecks |
| Validity | Ungoverned master data, no system-enforced controls | Implement approval process for master data changes |
Common Pitfalls
The One-Time Cleanup
A data quality project cleans up the general ledger, resolves duplicates, and standardises naming. Six months later, quality has degraded back to its prior state. Without ongoing governance — validation cadence, ownership, change control — a cleanup is a temporary fix. McKinsey research suggests data quality degrades at 2–3% per month without active governance.
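Taken at face value, a 2–3% monthly rate compounds quickly. Illustrative arithmetic only, assuming the midpoint (2.5% per month) and a simple exponential decay:

```python
def remaining_quality(monthly_decay: float, months: int) -> float:
    """Fraction of a cleanup's quality gain remaining after compounding decay."""
    return (1 - monthly_decay) ** months

# At 2.5%/month, roughly 86% of the gain remains after 6 months
# and roughly 74% after 12 - which is why a one-time cleanup reads
# as "degraded back to baseline" within the year.
```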
The ERP Equals Quality Assumption
“We have an ERP, so our data is good.” This assumption is pervasive and false. An ERP enforces transaction integrity — debits equal credits, periods are controlled, access is restricted. It does not enforce data correctness. A transaction posted to the wrong cost centre, with the wrong description, in the wrong amount is processed by the ERP without complaint. The ERP ensures the data is structurally valid. It does not ensure the data is right.
Fixing Symptoms, Not Causes
The report shows an error. Someone corrects the report. The underlying data is not investigated. Next month, the same error recurs. Symptom-fixing is the most expensive quality approach because it consumes time every period without reducing the error rate. Root cause analysis — tracing the error back to its origin and fixing the process that allowed it — is the only approach that improves quality over time.
Measuring Quality Without Acting on Results
Some organisations implement quality metrics — error rates, reconciliation variances, correction counts — but do not act on them. Measurement without consequence is theatre. Every quality metric must have an owner, a threshold, and a defined response when the threshold is breached.
Building a Continuous Quality Discipline
| Frequency | Activities |
|---|---|
| Daily | Cash position reconciliation; automated validation alerts reviewed and resolved |
| Weekly | Parked and failed transaction review; intercompany balance check (multi-entity) |
| Monthly | Completeness checklist (all entities, cost centres, material accruals); accuracy validation (control totals, period-over-period variances, cross-source reconciliation); consistency check (management figures to trial balance) |
| Quarterly | Master data review (duplicate detection, dormant cleanup); definition review; dimension review (cost centres, profit centres, projects) |
| Annually | Full chart of accounts review; quality trend analysis; control framework review |
Frequently Asked Questions
How long does this assessment take? A first pass takes 60–90 minutes with the CFO or controller. The value is not in the scoring precision — it is in the conversations the questions provoke. Questions that cannot be answered confidently identify governance gaps more reliably than any audit.
Should I involve IT in the assessment? For questions about system-enforced controls (validity dimension), yes. For questions about definitions, accuracy, and consistency, the assessment should be led by finance. The distinction matters: IT can tell you whether the system prevents invalid postings; only finance can tell you whether the posted data is correct.
What if we score low — does that mean we need to start a major project? No. The financial data governance framework article describes a minimum viable approach: start with five key metrics, document definitions, assign owners, introduce a validation checklist. This moves the organisation from Level 1 to Level 2 within 60–90 days without a major project.
How often should we repeat the assessment? Quarterly for the first year, then semi-annually. The assessment tracks progress and identifies regressions before they become systemic.
Can this checklist be used for audit preparation? It is not a substitute for an audit programme, but it identifies the data quality risks that auditors will also identify. Organisations that score above 50 on this checklist typically experience fewer audit findings related to data quality.
Related Reading
- The Financial Data Governance Framework — the governance framework that this checklist diagnoses
- Financial Data Governance — Why It Is the Foundation of Trustworthy Reporting — the foundational case for governance
- Chart of Accounts Architecture — structural layer that determines what quality is possible
- Single Source of Truth in Finance — achieving consistency across reports
- Data Ownership Framework — assigning accountability for quality
- Variance Analysis — A Practical Guide — analysis that depends on quality data
- FP&A Maturity Framework — broader maturity assessment for finance functions
Sources
- ACCA Global Survey 2024 — 62% of finance professionals spend significant time fixing data errors
- The Hackett Group — top-quartile finance organisations spend 30% less time on reconciliation
- McKinsey — “The Data-Driven Enterprise” 2024 — poor data quality costs 15–25% of revenue; 2–3% monthly quality degradation without governance
- EDM Council — only 12% of organisations score above “managed” on data governance maturity
- Gartner — average organisation maintains 3–5 sources of truth for the same financial data
- Deloitte CFO Signals Q4 2025 — 54% of CFOs cite data quality as a barrier
- BDO Mid-Market Report 2025 — 68% of mid-market CFOs lack confidence in data consistency
Martin Duben is the founder of Onetribe, where he helps mid-market CFOs build the financial data infrastructure that turns reporting from a reconciliation exercise into a decision-making system. His work focuses on the intersection of financial governance, reporting architecture, and AI readiness for companies with £1–50M revenue.