Recent discussions with senior data, privacy, and technology leaders highlighted a blunt reality: only 20 to 30% of the data organisations create is actually used. That leaves 70 to 80% consuming storage, engineering time, governance effort, and risk capacity without delivering measurable business value.
For vendors selling data platforms, analytics, governance, privacy, and AI capabilities into enterprise IT, this matters right now. Buyers are no longer funding “more reporting” by default. They are rewriting budgets around a tighter question:
Which decisions will improve, which outcomes will change, and how quickly will it pay back?
This article unpacks what those discussions revealed about the new logic driving data investment, why the dashboard era is breaking down, and how vendors can align to what enterprise IT buyers are prioritising without making the conversation feel sales-led.
The 70% waste problem is not a storage problem
When only 20 to 30% of data is used, the symptoms look like a tooling gap, but the root cause is almost always trust and usability.
Enterprises typically see the same pattern:
- Dashboards multiply, but executives still ask for manual extracts because the numbers are not trusted.
- Data exists, but users cannot find it, or they find multiple versions and cannot tell which is safe.
- Lineage is unclear, so when metrics change, teams cannot explain why quickly.
- Ownership is fuzzy, so issues bounce across teams and take too long to resolve.
- AI initiatives expose quality, classification, and access gaps immediately.
The point leaders kept returning to is simple: you cannot analyse your way out of low trust. Until people believe the data, they will keep verifying it manually, which makes analytics slower, riskier, and more expensive.
Why IT leaders are rewriting data budgets now
Three forces came through clearly in recent discussions. Together, they are shifting enterprise spend away from outputs and towards outcomes.
1) Dashboards are no longer the finish line
Buyers are looking for high-quality data that users can trust, delivered through reusable data products rather than one-off extracts. Reporting still matters, but reporting is not enough.
The budget shift is towards:
- Confidence signals (quality, lineage, auditability)
- Reuse (data products that can be consumed repeatedly)
- Outcome impact (decisions improved, time saved, risk reduced)
If your value story starts and ends with “better visibility”, you are competing in a shrinking category.
2) AI has exposed every weakness in quality, classification, and governance
AI adoption is forcing long-standing data problems into the spotlight. As organisations trial assistants and models, they immediately hit issues like:
- Data that is incomplete, inconsistent, or outdated
- Documents and datasets that are not classified reliably
- Access models that are too open for sensitive contexts
- Governance that exists on paper but does not work as a workflow
This is why data budgets are increasingly attached to “AI readiness”, not because AI is a buzzword, but because it is the fastest way executives discover the cost of weak foundations.
3) Risk capacity has become a budget constraint
A striking point raised in the discussions is that 96% of breaches were traced back to individual errors, not system failures.
This changes how IT leaders think about data investment:
- If a tool increases human error risk, it struggles to get funded.
- If a tool reduces human error risk, it becomes easier to justify.
- If a tool makes the safe path the easiest path, adoption improves, and risk exposure drops.
For vendors, this is a crucial shift. Buyers are not only buying capability. They are buying behaviour change at scale.
Centralised vs decentralised data is not a debate anymore
Several leaders described organisations oscillating between centralised and decentralised models. They centralise to improve governance and scale, then reintroduce decentralised elements to keep delivery close to the business.
Examples raised included:
- Moving from a federated model towards greater centralisation to support globalisation and AI initiatives.
- Unifying multiple legacy asset systems into one to achieve standardisation and cost savings.
- Navigating stakeholder resistance when autonomy changes as control moves to the centre.
The takeaway for vendors is that “pick one model” positioning is increasingly unhelpful. What buyers want is:
Centralised governance and standards, decentralised delivery and ownership close to outcomes.
If you can support that hybrid reality, you become easier to buy.
Where the new data spend is going
The discussions highlighted a clear trend: budgets are shifting towards making the right data trustworthy, findable, reusable, and safe, rather than simply increasing data volume.
Metadata, cataloguing, and lineage that work in a legacy world
Persistent cataloguing and lineage issues across systems remain a major barrier. Many platforms provide internal catalogues, but building a unified master catalogue is still difficult, especially where legacy systems cannot be scanned reliably.
Leaders discussed using AI to accelerate metadata creation and cataloguing. The nuance is important. The goal is not to automate governance away. The goal is to generate a usable first draft at scale, then refine and approve it through a publishing function so it stays current.
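That draft-then-approve flow can be sketched in a few lines. This is a minimal illustration, not a prescribed implementation: the `CatalogEntry` fields, status values, and `publish` function are assumptions chosen to show the workflow, where an AI-suggested description stays a draft until a human steward approves it.

```python
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    """Catalogue entry whose description starts life as an AI-generated draft."""
    dataset: str
    description: str
    status: str = "draft"  # becomes "published" only after human review

def publish(entry: CatalogEntry, approver: str) -> CatalogEntry:
    """Publishing function: a named human approves the draft before it is authoritative."""
    if entry.status != "draft":
        raise ValueError("only drafts can be published")
    return CatalogEntry(entry.dataset, entry.description,
                        status=f"published (approved by {approver})")

# AI generates the first-draft description at scale; a steward refines and approves it.
draft = CatalogEntry("customer_orders", "Order-level sales records, refreshed daily.")
approved = publish(draft, approver="data-steward")
print(approved.status)
```

The design point is the asymmetry: generation is cheap and automated, but nothing reaches users as authoritative without an explicit approval step, which is what keeps the catalogue current without automating governance away.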
Data products, trust scoring, and marketplace thinking
Participants emphasised that businesses need trusted data products rather than one-off work. A marketplace model with trust scoring was discussed as a way to help users select rated data products that include metadata tagging, governance, and quality signals.
That approach is appealing because it tackles the real adoption issue:
People do not reuse data when they are uncertain about it.
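A marketplace trust score of the kind described could be composed from the signals a data product already carries. The sketch below is illustrative only: the four signals and their weights are assumptions, not a standard model, and any real scheme would be tuned per organisation.

```python
from dataclasses import dataclass

@dataclass
class TrustSignals:
    """Illustrative trust signals for a published data product, each in 0.0-1.0."""
    quality: float    # share of automated quality checks passing
    lineage: float    # proportion of upstream sources with documented lineage
    ownership: float  # 1.0 if a named owner and SLA exist, lower otherwise
    freshness: float  # decays as the data ages past its refresh SLA

def trust_score(s: TrustSignals) -> float:
    """Weighted blend of signals; the weights here are assumptions to tune."""
    return round(0.35 * s.quality
                 + 0.25 * s.lineage
                 + 0.20 * s.ownership
                 + 0.20 * s.freshness, 2)

# Strong quality but patchy lineage still produces a visibly imperfect rating,
# which is exactly the signal an uncertain user needs before reusing the data.
print(trust_score(TrustSignals(quality=0.9, lineage=0.5, ownership=1.0, freshness=0.8)))
```

The value is less in the arithmetic than in surfacing the rating next to the product, so users can see at a glance whether it is safe to reuse instead of verifying it manually.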
Data curation and communication as a trust mechanism
Data curation was positioned as a practical way to reduce verification cycles, alongside the need to translate technical concepts into business language. A data product is not complete until users understand:
- What it means
- What it does not mean
- When it is safe to use
- When it is risky
This is where many initiatives fail. The dataset exists, but its meaning is not operationalised.
Data contracts and accountability, staged to maturity
Data contracts came up as a concept gaining traction, but leaders were realistic about where most organisations are today. Some described formal contracts as an aspiration for 2026, with lightweight documentation as a more immediate step.
The intent is accountability:
- Who owns the dataset?
- Who owns the SLA?
- Who communicates changes?
- Who fixes issues when something breaks?
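The "lightweight documentation" step can be as small as a versioned record that answers those four questions. A minimal sketch, with field names and example values that are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class DataContract:
    """Pre-formal data contract: just enough to answer the four accountability questions."""
    dataset: str
    owner: str             # who owns the dataset
    sla: str               # who owns the SLA and what it promises
    change_contact: str    # who communicates changes
    incident_contact: str  # who fixes issues when something breaks
    version: str = "0.1"

contract = DataContract(
    dataset="customer_orders",
    owner="commerce-data-team",
    sla="refreshed daily by 06:00 UTC, owned by commerce-data-team",
    change_contact="#data-changes announcements channel",
    incident_contact="data-oncall@example.com",
)
print(contract.owner)
```

Starting this small makes the staged path credible: the record can be versioned alongside the dataset today and hardened into a formal, enforced contract as maturity grows.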
Observability and automated testing without overhead
Leaders highlighted automated testing and data observability for critical processes, while also warning that advanced pipeline visualisation can create overhead if not designed carefully.
Buyers want reliability, but they do not want another operational burden. The winning approach is:
- Automate checks where they matter most
- Keep workflows simple
- Prioritise exception handling and fast root-cause analysis
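The "automate checks where they matter most" idea can be sketched as a couple of narrow tests that stay silent when data is healthy and emit exceptions when it is not. The thresholds, field names, and check selection below are illustrative assumptions:

```python
from datetime import datetime, timedelta, timezone

def check_null_rate(rows: list[dict], column: str, max_null_rate: float) -> list[str]:
    """Flag a column whose null rate exceeds the agreed threshold."""
    if not rows:
        return [f"{column}: dataset is empty"]
    nulls = sum(1 for r in rows if r.get(column) is None)
    rate = nulls / len(rows)
    return [f"{column}: null rate {rate:.0%} exceeds {max_null_rate:.0%}"] if rate > max_null_rate else []

def check_freshness(last_loaded: datetime, max_age: timedelta) -> list[str]:
    """Flag data that has aged past its refresh SLA."""
    age = datetime.now(timezone.utc) - last_loaded
    return [f"data is {age} old, SLA is {max_age}"] if age > max_age else []

# Run only the checks that matter for this critical process; report exceptions only.
rows = [{"order_id": 1, "amount": 10.0}, {"order_id": 2, "amount": None}]
exceptions = check_null_rate(rows, "amount", max_null_rate=0.1)
exceptions += check_freshness(datetime.now(timezone.utc) - timedelta(hours=2),
                              max_age=timedelta(hours=24))
print(exceptions)  # only the null-rate breach is reported; freshness is within SLA
```

Exception-only output is the point: teams see a short list of breaches to investigate, not another dashboard of green ticks to maintain.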
Governance that accelerates the business
Governance was described as essential for AI adoption, but leaders warned that too much governance too early can stifle innovation. The preference is light-touch early, stronger controls as value and risk increase.
Practices such as privacy by design and techniques like differential privacy were discussed as ways to keep governance practical and scalable, rather than purely policy-driven.
The adoption problem vendors underestimate
Across the discussions, the same barrier kept appearing: people.
Even when technical solutions exist, adoption does not happen by default.
Leaders talked about:
- The need for training, encouragement, and communication that fits busy stakeholder realities.
- The challenge of scaling capability-building across a workforce of 120,000 people, where traditional presentations do not reach enough employees.
- The value of targeted training for senior leaders to build steering groups and programme support.
- The reality that storytelling and practical communication can land better than abstract “data literacy” narratives.
This matters because adoption intersects directly with privacy and risk. Leaders described shifting from open access to more controlled approaches when AI initiatives revealed classification gaps. That is exactly how data exposure happens in modern organisations: access is too permissive, and employees do not understand boundaries.
Vendors that support behaviour change through guidance, transparency, and simple controls become easier to buy and harder to replace.
AI in production has changed what gets funded
Recent discussions on AI implementation reinforced a consistent message: enterprises can experiment, but production requires discipline.
Leaders referenced:
- The need for robust monitoring and optimisation
- The importance of learning from failed implementations
- The reality that value improves after addressing underlying data quality and enriching training data
- The need to manage drift and hallucinations through ongoing control, not one-time governance
One example described a secure internal generative assistant associated with $46 million in savings, with the emphasis placed on controlled environments, clear guidelines, protocols, and monitoring.
For vendors, the implication is straightforward:
Buyers fund AI when it reduces uncertainty in production.
Stats table vendors can use in enterprise conversations
| What leaders described | Stat or figure | Why it matters for enterprise IT | What vendors should align to |
|---|---|---|---|
| Created data actually used | 20 to 30% used | Adoption and trust are the bottleneck, not storage | Data products, trust signals, reuse |
| Implied unused data | 70 to 80% unused | Waste is now a budget target | Waste reduction tied to outcomes |
| Breach root cause | 96% individual errors | Human risk is the dominant risk | Usable controls and training built in |
| AI value when controlled | $46 million savings example | AI budgets unlock when controls are real | Monitoring, policies, safe environments |
| QA discipline trend | 100% of calls reviewed against 12 criteria example | Buyers are moving towards measurable quality systems | Quality workflows and evidence trails |
| Predictive performance reality | 67% recall example plus human judgement needed | Accuracy alone does not remove the need for oversight | Human review paths and thresholds |
| Transformation progress visibility | 80+ of 93 assets migrated example | Buyers want measurable milestones | Phased delivery and measurable checkpoints |
| Capability-building scale | 120,000 employees referenced | Adoption requires a system, not occasional comms | Enablement design for scale |
| Privacy certification shift | ISO/IEC 27701 possible without ISO/IEC 27001 example | Privacy posture is becoming more formalised | Audit-ready privacy outcomes |
| Data contracts maturity | Exploration for 2026 | Accountability is being formalised | Staged contract approach for key data |
Graph: what the enterprise data usage gap looks like
```
Data created: 100%    |████████████████████████████████████████████|
Data used:    20–30%  |███████████                                 |
Data unused:  70–80%  |██████████████████████████████████          |
```
The point of the graph is not to shame data teams. It is to clarify why budgets are shifting. If 70 to 80% of data is unused, buyers will prioritise making the most valuable data usable before funding new collection and new dashboards.
What this means for vendors selling the “data factor” in enterprise IT
Enterprise IT leaders are buying data capability as an operating model, not a toolkit.
Your positioning and proof should map to the priorities raised in the discussions:
Reposition from reporting to decision quality
Buyers are moving from analytics outputs to decision inputs.
Translate your offer into outcomes that IT leaders can defend:
- Trust: reduce verification cycles with lineage, quality signals, and clear ownership.
- Reuse: publish data products that business teams can consume repeatedly without bespoke engineering.
- Risk reduction: reduce human error exposure through usable controls and embedded enablement.
- Reliability: detect issues early and shorten root cause cycles without adding operational overhead.
Sell an operating model pathway, not a platform claim
Stakeholder resistance is often an operating model issue, not a tooling issue. Vendors that win help buyers define:
- What gets standardised centrally and what stays close to the business
- How ownership is assigned for critical datasets
- How data products are published, rated, and deprecated
- How governance strengthens as value and risk increase
This is especially important when buyers are shifting between centralisation and decentralisation. Your offer should support the hybrid reality rather than arguing for a single ideology.
Bring enterprise-grade proof points that match buyer scepticism
The discussions showed that buyers are not naive about AI and automation. They know:
- Accuracy is not perfect (the 67% recall example matters here)
- Oversight remains necessary
- Quality systems are becoming more rigorous (the 100% call QA example is a signal of that direction)
Proof that resonates includes:
- How you surface lineage and trust signals to users
- How you reduce human error risk through design
- How you handle exceptions and out-of-policy behaviour
- How you monitor, measure, and improve over time
Use discovery questions that match real enterprise priorities
These questions tend to create more useful conversations:
- Which datasets drive the decisions leaders disagree about most?
- Where does verification happen today, and what does it cost in time and risk?
- Which systems cannot be scanned reliably, and how do you handle lineage in those environments?
- Where would controlled access improve safety without killing innovation?
- What would it take to publish three trusted data products that business teams will reuse?
This positions your solution as a way to reduce waste and improve outcomes, not as another tool to manage.
How The Leadership Board helps vendors meet their ideal clients
Enterprise IT and data leaders are rewriting budgets around trust, governance, safe AI readiness, and adoption at scale. The hardest part for vendors is getting time with the right senior decision-makers early enough, before budgets are committed and preferences harden.
The Leadership Board helps vendors by enabling:
- Better conversations grounded in what enterprise IT leaders are prioritising now
- More precise access to the right senior roles aligned to your ideal customer profile
- Stronger proof-led messaging that resonates with both technology and risk stakeholders