“By the time the report is ready, it is already outdated.” That complaint is real in many mid-market finance teams — but the diagnosis that follows is usually wrong. Leadership hears “outdated” and concludes the organisation needs real-time data. The finance team, already struggling to produce accurate monthly numbers within two weeks, receives a mandate to deliver information in real time.
The problem is rarely data latency. It is process latency. The report is outdated not because data cannot move faster, but because the close takes fifteen days, the reconciliation is manual, and the consolidation depends on three spreadsheets and one person’s availability. This article examines when real-time reporting genuinely adds value, when it is an expensive distraction, and what most mid-market companies actually need instead.
What real-time reporting means — and does not mean
Real-time reporting is the delivery of data and insights with minimal latency between an event occurring and its appearance in a report. But “minimal latency” spans a wide range, and the distinctions matter:
| Category | Latency | Example | Typical use |
|---|---|---|---|
| Real-time | Seconds to minutes | A transaction appears in a dashboard within moments of being processed | Trading floors, production line monitoring |
| Near-real-time | Minutes to hours | An hourly data refresh updates the operational dashboard | Inventory management, daily cash position |
| Batch | Hours to days | The nightly ETL process loads yesterday’s data into the reporting layer | Management accounts, monthly reporting |
| Periodic | Days to weeks | The monthly close process produces the management pack | Board reporting, statutory accounts |
The right choice depends on the decision the data informs, not on what is technically possible. A board that meets monthly does not need real-time revenue data — it needs accurate, reconciled revenue figures produced reliably within five days of month-end. A warehouse manager deciding whether to reorder stock may need hourly inventory counts. A trader executing positions needs sub-second pricing.
Most financial reporting decisions fall into the batch or periodic categories. The aspiration for “real-time” often reflects frustration with a slow close process, not a genuine need for sub-minute data delivery.
The corporate-practitioner gap
There is a revealing disconnect between what leadership requests and what the finance team actually needs. Research across mid-market companies consistently surfaces this pattern:
- Leadership says: “We need real-time insights into business performance.”
- The finance team says: “We would settle for accurate numbers by day ten.”
This gap is not about ambition — it is about diagnosis. Leadership experiences report staleness and prescribes real-time as the remedy. The finance team knows that the staleness comes from the close process, not from data infrastructure. Closing the books faster — from fifteen days to five — delivers more freshness than any real-time streaming architecture.
FloQast (2025) data quantifies the gap: the average mid-market close takes 10–15 working days, while best practice is five or fewer. At the ICV Congress (Poland), a case study demonstrated a reduction from ten days to one day using BI automation of the close process — not real-time streaming, but automated data collection, reconciliation, and report generation.
The lesson: the freshness problem most organisations face is solved by process automation, not by real-time infrastructure.
When real-time is justified
Real-time reporting earns its cost when three conditions are met simultaneously:
- The decision window is short — minutes, not days. If a decision can wait until tomorrow, daily data is sufficient.
- The cost of delay is high — acting on yesterday’s data produces a materially worse outcome than acting on today’s data. If the cost difference is negligible, real-time adds complexity without value.
- The data is trustworthy at speed — real-time data that has not been validated, reconciled, or quality-checked may be fast but unreliable. Speed without accuracy is not an improvement.
Genuine real-time use cases in a business context:
| Scenario | Decision window | Cost of delay | Real-time justified? |
|---|---|---|---|
| Production line quality monitoring | Minutes | Defective output, waste | Yes |
| Cash position for intraday treasury | Hours | Missed investment window, overdraft | Near-real-time |
| Daily sales tracking for retail | End of day | Suboptimal replenishment | Near-real-time |
| Monthly management accounts | Days | Delayed decisions | No — faster close, not real-time |
| Quarterly board reporting | Weeks | None — decision cycle is quarterly | No |
| Annual budget vs. actual | Months | None | No |
For most mid-market financial reporting, the honest answer is: real-time is not justified. Near-real-time (daily automated refresh) or faster batch processing (closing in five days instead of fifteen) addresses the actual need.
The latency-needs decision framework
Before investing in faster data delivery, ask three questions:
1. What decision does this data inform?
Map each report or dashboard to the decision it serves. Monthly management accounts inform resource allocation and performance assessment — decisions made weekly or monthly. A cash position report informs treasury management — decisions made daily. A production dashboard informs quality intervention — decisions made in minutes.
2. What is the decision cycle?
If the decision is made monthly, data that is five days old at the point of decision is entirely adequate. If the decision is made hourly, yesterday’s data is insufficient. Match latency to the cycle.
3. What is the cost of the current latency versus the cost of reducing it?
Reducing the close from fifteen days to five might require process documentation, automation of data collection, and standardised reconciliation — a moderate investment with broad benefits. Reducing data delivery from daily to real-time might require streaming infrastructure, in-memory databases, and continuous data quality monitoring — a significant investment with narrow benefits.
Most mid-market companies will find that the first investment (faster close) delivers ten times the value of the second (real-time infrastructure) at a fraction of the cost.
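To make the framework concrete, the matching logic in questions 1 and 2 can be written down as a short decision rule. The Python sketch below is illustrative only; the function name and thresholds are assumptions mapping onto the latency categories in the earlier table, and the real judgment in question 3 (cost versus benefit) still has to be made by people.

```python
# Minimal sketch: map a decision cycle to the cheapest adequate latency
# category from the table above. Thresholds are illustrative judgment calls,
# not fixed rules; adjust them to your own decision cycles.

def adequate_latency(decision_cycle_hours: float) -> str:
    """Return the cheapest data-latency category that still serves the decision."""
    if decision_cycle_hours < 1:
        return "real-time (seconds to minutes)"
    if decision_cycle_hours <= 24:
        return "near-real-time (minutes to hours)"
    if decision_cycle_hours <= 31 * 24:
        return "batch (hours to days)"
    return "periodic (days to weeks)"

print(adequate_latency(0.2))      # production quality intervention -> real-time
print(adequate_latency(8))        # intraday treasury decision -> near-real-time
print(adequate_latency(30 * 24))  # monthly management accounts -> batch
print(adequate_latency(90 * 24))  # quarterly board reporting -> periodic
```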
The three tiers of data freshness
Rather than a binary choice between batch and real-time, consider three tiers matched to different needs:
Tier 1: Faster batch refresh
Automate the daily or hourly refresh of data from source systems into the reporting layer. This eliminates manual extraction, reduces the close timeline, and resolves the majority of “outdated report” complaints.
What it addresses: The fifteen-day close. The manual data scavenger hunt. The copy-paste from ERP to Excel to report.
Evidence: The ICV Congress case study showed a reduction from ten days to one day using BI automation of the batch process. KPMG + ACCA (PL, 2024) found that only 7% of CFOs use AI despite 41% seeing automation as a top-three opportunity — indicating that foundational batch automation remains the priority for most organisations.
Cost: Moderate. Requires data pipeline configuration and process standardisation.
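To show how little machinery Tier 1 needs, here is a minimal sketch of a nightly batch refresh using only the Python standard library. The export path, database path, and column names are assumptions; most ERPs can produce a comparable CSV export on a schedule, and a cron or Task Scheduler entry runs the script each night.

```python
# Minimal Tier 1 sketch: reload a reporting table from a nightly ERP export.
import csv
import sqlite3
from pathlib import Path

EXPORT = Path("/data/erp/gl_transactions.csv")   # hypothetical nightly ERP export
DB = Path("/data/reporting/reporting.db")        # hypothetical reporting store

def refresh_reporting_layer() -> int:
    """Reload the reporting table from the latest export; return the row count."""
    con = sqlite3.connect(DB)
    con.execute("""CREATE TABLE IF NOT EXISTS gl_transactions (
                       posting_date TEXT, account TEXT, amount REAL)""")
    con.execute("DELETE FROM gl_transactions")  # full reload keeps the sketch simple
    with EXPORT.open(newline="") as f:
        rows = [(r["posting_date"], r["account"], float(r["amount"]))
                for r in csv.DictReader(f)]
    con.executemany("INSERT INTO gl_transactions VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()
    return len(rows)

if __name__ == "__main__":
    print(f"Loaded {refresh_reporting_layer()} rows into the reporting layer")
```

A full reload is used for simplicity; incremental loads only become necessary once data volumes demand them.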
Tier 2: Near-real-time (event-triggered updates)
Data refreshes when a triggering event occurs — a transaction is posted, an invoice is approved, a shipment is confirmed. The dashboard reflects the change within minutes to hours.
What it addresses: Operational visibility — cash position, inventory levels, order status. Decisions with daily or intraday cycles.
Cost: Higher. Requires event-driven architecture, change data capture, and continuous validation.
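As a sketch of what “event-triggered” means in practice: the source system calls a small webhook when, say, an invoice is approved, and the handler updates the reporting store within seconds. This example uses Flask for brevity; the endpoint path, payload fields, and table are assumptions, not a prescribed design.

```python
# Minimal Tier 2 sketch: a webhook that updates one record when the source
# system reports an event. Payload fields ("invoice_id", "status") are
# hypothetical; a real deployment also needs authentication and retries.
import sqlite3
from flask import Flask, request

app = Flask(__name__)
DB = "/data/reporting/reporting.db"  # hypothetical reporting store

@app.post("/events/invoice-approved")            # hypothetical endpoint
def invoice_approved():
    payload = request.get_json(force=True)       # e.g. {"invoice_id": "INV-17", "status": "approved"}
    con = sqlite3.connect(DB)
    con.execute("UPDATE invoices SET status = ? WHERE invoice_id = ?",
                (payload["status"], payload["invoice_id"]))
    con.commit()
    con.close()
    return {"ok": True}

if __name__ == "__main__":
    app.run(port=8080)
```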
Tier 3: True real-time streaming
Sub-second data delivery. Every transaction, every sensor reading, every market price appears in the dashboard as it happens.
What it addresses: Trading, production monitoring, fraud detection. Decisions with minute-level cycles.
Cost: Significantly higher. Requires streaming infrastructure, in-memory processing, and continuous quality assurance.
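For completeness, here is what the consumption side of a streaming pipeline can look like, sketched with the kafka-python client against a hypothetical broker and topic. Note that this loop is the easy part; the Tier 3 cost sits in operating the brokers, producers, in-memory stores, and in-flight quality checks around it.

```python
# Minimal Tier 3 sketch: consume transaction events from a stream as they
# arrive. Broker address and topic name are hypothetical.
import json
from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "gl-transactions",                         # hypothetical topic
    bootstrap_servers="broker.internal:9092",  # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for event in consumer:
    txn = event.value
    # A real pipeline would validate, deduplicate, and reconcile in flight
    # before pushing to the dashboard's in-memory store.
    print(f"{txn['account']}: {txn['amount']}")
```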
For most mid-market companies, Tier 1 alone is sufficient to resolve the freshness complaint. Tier 2 is justified for specific operational use cases. Tier 3 is rarely justified outside manufacturing process control or financial trading.
Common pitfalls
Investing in real-time while the close takes fifteen days
This is the most common and most expensive mistake. The organisation builds real-time dashboards that stream data from source systems — but the monthly close still takes two weeks because reconciliation is manual, consolidation is spreadsheet-based, and the chart of accounts has not been standardised. The result: real-time views of data that has not been reconciled, validated, or closed. Fast and wrong is not an improvement over slow and right.
Confusing corporate demand with practitioner need
When leadership says “we need real-time,” the appropriate response is not to build real-time infrastructure. It is to ask: “What decision would you make differently if you had the data sooner?” Often, the answer reveals that the real need is reliable data within a reasonable timeframe — not sub-second delivery.
Real-time dashboards nobody watches
A dashboard that updates every thirty seconds is pointless if nobody looks at it more than once a day. Real-time infrastructure carries ongoing cost — compute, storage, maintenance, monitoring. That cost is justified only if someone is genuinely making decisions based on minute-level data changes. If the dashboard is checked daily, daily refresh is sufficient and far cheaper.
Ignoring data quality at speed
Data quality is harder to maintain at speed. Batch processing allows validation rules, reconciliation checks, and human review before data reaches reports. Real-time processing must validate data in flight — and the consequences of a validation gap appear in dashboards immediately, in front of the users who were promised trustworthy data.
A real-time dashboard showing an obviously wrong number does more damage to trust than a batch report delivered a day later with the right number.
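One mitigation is to gate every record behind explicit validation rules before it reaches the dashboard, quarantining failures into a review queue instead of displaying them. A minimal sketch, with illustrative field names and thresholds:

```python
# Minimal sketch of in-flight validation: every record must pass explicit
# rules before publication. Rules below are illustrative; real ones come
# from your reconciliation checklist.
from typing import Callable

Rule = Callable[[dict], bool]

RULES: list[tuple[str, Rule]] = [
    ("amount is numeric",
     lambda r: isinstance(r.get("amount"), (int, float))),
    ("account code present",
     lambda r: bool(r.get("account"))),
    ("amount within sanity bound",
     lambda r: isinstance(r.get("amount"), (int, float)) and abs(r["amount"]) < 1_000_000),
]

def validate(record: dict) -> list[str]:
    """Return names of failed rules; an empty list means the record is publishable."""
    return [name for name, rule in RULES if not rule(record)]

record = {"account": "4010", "amount": 12_500.0}
failures = validate(record)
if failures:
    print("Quarantined for review:", failures)  # route to a queue, not the dashboard
else:
    print("Publish to dashboard")
```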
Building real-time before automating batch
SolveXia/Gartner research shows that 98% of CFOs have invested in digitisation, yet 41% say fewer than 25% of their finance processes are automated. Rossum (DAT25) found that 49% of finance departments operate with zero automation. For these organisations, real-time is several maturity levels away. The sequence matters: automate batch first, then accelerate to near-real-time where justified, then consider true real-time only for specific use cases with proven need.
Technology perspective
The technology choice follows the tier:
| Tier | Technology approach | Mid-market suitability |
|---|---|---|
| Tier 1 | Scheduled ETL, automated data pipelines, BI refresh schedules | High — mature, well-understood, moderate cost |
| Tier 2 | Change data capture, event-driven architecture, webhook-triggered refresh | Moderate — requires technical capability and ongoing maintenance |
| Tier 3 | Stream processing, in-memory databases, real-time BI connectors | Low — high cost, high complexity, narrow use cases |
For mid-market companies, Tier 1 technology — automated data pipelines with scheduled refresh — addresses the vast majority of reporting freshness needs. The ICV Congress evidence (ten days reduced to one) was achieved entirely within Tier 1.
The critical technology decision is not “which streaming platform?” but “how do we automate the data collection, transformation, and validation steps that currently take our team ten days?”
Frequently asked questions
Do we need real-time reporting? Probably not for financial reporting. Ask what decision would change if data arrived minutes faster. For most management reporting, the answer is none — the constraint is the close process and reconciliation, not data delivery speed. Focus on closing faster and automating data collection first.
What is the difference between real-time and near-real-time? Real-time delivers data in seconds to minutes — the dashboard reflects an event almost as it happens. Near-real-time delivers data in minutes to hours — an hourly or event-triggered refresh that keeps operational dashboards current. For most business purposes, near-real-time provides adequate freshness at far lower cost and complexity.
How do we know if our close process is the real bottleneck? If your monthly numbers are available more than five working days after month-end, the close process is almost certainly the bottleneck. Map the close timeline: how many days for data collection, reconciliation, consolidation, review, and distribution? The largest block is where to focus — and it is rarely data delivery speed.
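A back-of-the-envelope sketch of that mapping, with placeholder durations:

```python
# Map the close timeline and find the largest block.
# Durations are placeholders; replace them with your own estimates (in days).
close_phases = {
    "data collection": 4,
    "reconciliation": 5,
    "consolidation": 3,
    "review": 2,
    "distribution": 1,
}
bottleneck = max(close_phases, key=close_phases.get)
print(f"Close takes {sum(close_phases.values())} days; "
      f"largest block is '{bottleneck}' at {close_phases[bottleneck]} days")
```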
Can we achieve near-real-time with our existing systems? Often, yes. Most modern ERPs and accounting systems can export data on a schedule. A BI environment that refreshes hourly or daily from these exports provides near-real-time operational visibility. The investment is in configuring the pipeline and maintaining data quality, not in new infrastructure.
What should we do first? Automate your data collection and close process. Reduce the close from fifteen days to five. Then assess whether specific operational decisions need faster data. Start with Tier 1 (faster batch) and only move to Tier 2 (near-real-time) when you have evidence that daily data is insufficient for a specific use case.
Related Reading
- Reporting Automation Fundamentals — the automation foundation that precedes real-time
- Management Reporting Frequency and Cadence — matching reporting frequency to decision cycles
- Management Dashboard Design — dashboard design for different refresh rates
- Business Intelligence Reporting — BI as the infrastructure layer for data delivery
- Self-Service Reporting — self-service that depends on governed, timely data
- Month-End Close Best Practices — the close process that determines actual report freshness
- Reducing Manual Reporting Effort — manual effort as the true cause of report staleness
- Financial Dashboards for Executive Decisions — dashboards matched to decision cycles
- Glossary: Dashboard | KPI | Reporting Frequency
Sources
- FloQast — 2025 Controller’s Guidebook — average mid-market close 10–15 days; best practice 5 or fewer
- ICV Congress (Poland) — case study: 10 days reduced to 1 day with BI automation
- KPMG + ACCA (PL, 2024) — only 7% of CFOs use AI; 41% see automation as top-3 opportunity
- SolveXia/Gartner — 98% of CFOs invested in digitisation; 41% have fewer than 25% of processes automated
- Rossum DAT25 — 49% of finance departments operate with zero automation
Martin Duben is a finance and reporting specialist at Onetribe. He works with mid-market companies across Central Europe to build reporting infrastructure that delivers the right data at the right time — matching freshness to the decisions that matter.