From data chaos to AI-ready: what IT teams now demand from governance, lineage and metadata

Recent discussions with senior data and technology leaders indicated that “AI readiness” is increasingly a governance, lineage, and metadata problem, not a model problem.

Leaders repeatedly returned to two hard signals:

  • Only 20 to 30% of created data is actually used, contributing to a “death of dashboards” dynamic where reporting grows but confidence and adoption do not.
  • Most breaches (96%) were attributed to individual errors rather than system failures, making “human-safe” controls a buying requirement, not an afterthought.

For vendors selling into enterprise IT, this changes the buyer’s definition of value. Tools that create more outputs are easier to deprioritise. Tools that make data findable, trustworthy, auditable, and safe to use in AI-enabled workflows are easier to fund.

This blog explains what IT teams are now demanding across governance, lineage, and metadata, and how vendors can position solutions in ways that match what senior buyers are prioritising.

Why governance, lineage and metadata just moved up the budget stack

Enterprise IT teams are under pressure from multiple directions:

  • AI initiatives are forcing organisations to expose how data is classified, where it flows, and who can access it.
  • Leaders are increasingly unwilling to scale AI without clear oversight, monitoring, and operational guardrails.
  • Risk capacity is tightening, especially as security teams emphasise that human behaviour is a dominant driver of incidents.

In the discussions, leaders highlighted that even where predictive systems show promise, human judgement remains crucial. One example described an attendance prediction system achieving 67% recall when flagging absences, but still requiring proactive human outreach to prevent negative outcomes. That is a useful analogy for enterprise AI more broadly:

Accuracy helps, but governance and oversight determine whether organisations will use it at scale.

The new “data factor” inside enterprise IT

Enterprise buyers are shifting from a dashboard-first view of data to a production-first view:

  • Data as a product that can be reused, trusted, and governed.
  • Metadata and lineage as the control plane that makes reuse safe.
  • Observability and testing as the reliability layer that prevents drift and silent failures.
  • Privacy and access controls as operational workflows, not policy documents.

This shift is visible in how leaders described their priorities:

  • Building high-performing data products users can trust, even while balancing structured and unstructured legacy data.
  • Improving maturity fast enough to keep up with growth and acquisition-driven complexity.
  • Managing vendor risk and compliance where external data sources and tooling are involved.
  • Monitoring AI systems for drift and hallucinations in production settings, with observability treated as essential.

For vendors, the implication is straightforward: your solution is increasingly evaluated on whether it strengthens this control plane, not on whether it adds another analytics interface.

What IT teams are demanding in 2026

1) A usable catalogue that works in messy, legacy environments

Leaders described cataloguing and lineage challenges across systems, especially where legacy estates make scanning and standardisation difficult. Buyers are looking for approaches that recognise the reality:

  • Some systems are modern and instrumented.
  • Some systems are legacy, brittle, and poorly documented.
  • Some data is structured, some unstructured, and both must be governed.

What IT teams demand is not “a catalogue”. It is a usable discovery experience where teams can:

  • Find the right data quickly
  • See what it means and how it is used
  • Understand trust signals (quality, freshness, ownership)
  • Confirm access and permitted usage

Vendor positioning that lands: reduce time-to-find and time-to-trust, not “one catalogue to rule everything”.

2) Metadata that scales without collapsing into manual overhead

The discussions included interest in using AI to accelerate metadata creation and documentation. The key is how it is governed.

What buyers want:

  • Fast metadata generation as a starting point
  • A clear approval and publishing workflow
  • Ownership so metadata stays current
  • Controls so sensitive classification does not drift

The goal is a “minimum viable metadata” standard that can be applied widely, then hardened over time for critical assets.
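One way to make a “minimum viable metadata” standard concrete is a small publishing gate: AI can draft metadata quickly, but nothing is published until the required fields are complete. The sketch below is hypothetical and the field names are illustrative, not a reference to any specific catalogue tool.

```python
from dataclasses import dataclass

# Hypothetical "minimum viable metadata" record; field names are illustrative.
@dataclass
class AssetMetadata:
    name: str
    description: str = ""
    owner: str = ""           # accountable person or team
    classification: str = ""  # e.g. "public", "internal", "restricted"
    status: str = "draft"     # AI-generated metadata starts as a draft

REQUIRED_FIELDS = ("description", "owner", "classification")

def ready_to_publish(meta: AssetMetadata) -> list:
    """Return the list of missing required fields; empty means publishable."""
    return [f for f in REQUIRED_FIELDS if not getattr(meta, f).strip()]

# Usage: fast AI-drafted metadata, with the approval gate enforcing the standard.
draft = AssetMetadata(name="orders", description="Daily order snapshots")
print(ready_to_publish(draft))  # → ['owner', 'classification']
```

The same check can be hardened over time for critical assets, for example by adding freshness or ownership-review fields to the required set.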

3) Lineage that answers real enterprise questions

Lineage only matters if it resolves enterprise pain quickly. Leaders want lineage that answers questions like:

  • Why did this metric change this month?
  • What downstream reports or models depend on this dataset?
  • If we change a field, what will break?
  • Is this dataset safe to use for AI workflows and assistants?

Lineage is now being evaluated as a risk control as much as an engineering convenience. If your lineage story cannot be explained clearly to non-technical stakeholders, it will struggle in cross-functional review.
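The “what will break?” question is, at its core, a reachability query over a dependency graph. A minimal sketch, assuming lineage edges have already been extracted into a dataset-to-consumer mapping (all dataset names here are invented for illustration):

```python
from collections import deque

# Hypothetical lineage edges: dataset -> things that read from it.
DOWNSTREAM = {
    "raw.orders": ["staging.orders_clean"],
    "staging.orders_clean": ["mart.revenue", "ml.churn_features"],
    "mart.revenue": ["dashboard.exec_summary"],
}

def impacted_by(dataset: str) -> set:
    """Everything downstream of `dataset`, found by breadth-first traversal."""
    seen, queue = set(), deque([dataset])
    while queue:
        node = queue.popleft()
        for child in DOWNSTREAM.get(node, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

print(sorted(impacted_by("raw.orders")))
# → ['dashboard.exec_summary', 'mart.revenue', 'ml.churn_features', 'staging.orders_clean']
```

The value for a non-technical review is the output, not the traversal: a change to one raw dataset can be presented as a named list of affected reports and models.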

4) Governance that behaves like workflow, not documentation

Several leaders highlighted governance requirements that include experimentation and empowerment, but also guardrails. The point is balance:

  • Too much governance too early can stifle innovation.
  • Too little governance makes scale unsafe and slows adoption later.

One practical approach discussed was to categorise governance decisions: determine which can be safely automated and which require human oversight, especially critical policy decisions.

This is a strong operating model for enterprise IT because it avoids a binary stance. It turns governance into a decision framework with clear thresholds.

5) Observability and monitoring as a default production requirement

In production AI discussions, leaders described controlled approaches including sandbox environments, clear guidelines and protocols, and the need for observability and monitoring to prevent drift and hallucinations.

The lesson for vendors selling governance and metadata capabilities is that buyers increasingly want:

  • Monitoring that flags issues early
  • A measurable feedback loop that improves quality over time
  • Evidence trails that support operational confidence

This is part of why “dashboard solutions” are losing share of budget. Monitoring and trust systems are becoming the new table stakes.

6) Quality control that is measurable, not assumed

One life insurance example described using AI to enhance QA processes by covering 100% of calls against 12 criteria, while maintaining human oversight for key elements.

That is a clear signal of where enterprise quality systems are going:

  • Greater coverage
  • Explicit criteria
  • Ongoing oversight
  • Exceptions handled in workflow

For vendors in data and governance, the direct parallel is pipeline and data product quality:

  • Define criteria for “trusted”
  • Increase automated coverage where feasible
  • Keep human review where risk is high
  • Make exceptions visible and actionable
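As a sketch of what “explicit criteria, automated coverage, visible exceptions” can look like for a single data product. The criteria and thresholds below are invented for illustration; real ones would come from the dataset's owners.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical trust criteria for one dataset; thresholds are illustrative.
def check_trusted(row_count: int, null_rate: float, last_updated: datetime) -> dict:
    now = datetime.now(timezone.utc)
    results = {
        "has_rows": row_count > 0,
        "null_rate_ok": null_rate <= 0.05,              # at most 5% nulls in key fields
        "fresh": now - last_updated <= timedelta(hours=24),
    }
    results["trusted"] = all(results.values())
    return results

# Failed criteria become visible, actionable exceptions rather than silent drift.
report = check_trusted(row_count=10_000, null_rate=0.12,
                       last_updated=datetime.now(timezone.utc) - timedelta(hours=2))
print([name for name, ok in report.items() if not ok])  # → ['null_rate_ok', 'trusted']
```

The design point mirrors the QA example above: every dataset is checked against the same explicit criteria, and only the failures are routed to humans.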

7) Clear ownership and a path toward data contracts

Data contracts were discussed as an emerging direction, with some leaders viewing 2026 as the horizon for broader adoption. The emphasis was accountability and clarity, not bureaucracy.

Buyers want:

  • Ownership for critical datasets
  • Basic SLAs for availability, quality, and change communication
  • A lightweight contract approach for key data products first, then expansion
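A “lightweight contract” for a critical dataset can start as little more than a declared owner plus a few SLAs that are checked in code. The structure below is a hypothetical sketch, not a standard format; every name and threshold is illustrative.

```python
# Hypothetical lightweight data contract for one critical data product.
CONTRACT = {
    "dataset": "mart.revenue",
    "owner": "finance-data-team",
    "sla": {
        "availability_pct": 99.5,    # monthly availability target
        "max_staleness_hours": 24,   # freshness commitment
        "change_notice_days": 10,    # notice before breaking schema changes
    },
    "schema": {"order_id": "string", "amount": "decimal", "booked_at": "timestamp"},
}

def breaking_changes(contract: dict, proposed_schema: dict) -> list:
    """Fields removed or retyped relative to the contract: these need notice."""
    current = contract["schema"]
    return [f for f, t in current.items() if proposed_schema.get(f) != t]

# A proposed change that drops `booked_at` is flagged before it ships.
print(breaking_changes(CONTRACT, {"order_id": "string", "amount": "decimal"}))
# → ['booked_at']
```

Starting with one contract like this for a handful of critical decision datasets, then expanding, matches the staged adoption path leaders described.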

Vendors who present data contracts as an immediate, heavyweight transformation tend to trigger resistance. Vendors who present a staged approach aligned to critical decision datasets tend to be welcomed.

8) Vendor and third-party risk considerations are now part of the data story

One discussion referenced scraping social media data via external vendors while maintaining compliance, with the legal team reviewing potential vendors. The broader point is that governance is not limited to internal systems.

Enterprise buyers increasingly want:

  • Third-party risk visibility for data sources
  • Clear compliance alignment
  • Controls that prevent unapproved tools from creating data leakage pathways

A practical model: the enterprise trust stack

A useful way to describe what buyers are building is a “trust stack” for data in IT. This is not a tool list. It is an operating model.

  1. Discovery: can users find the right data?
  2. Understanding: do users know what it means and when it is safe to use?
  3. Permission: are access and usage rules clear and enforceable?
  4. Quality: is there evidence the data is reliable and current?
  5. Lineage: can teams trace upstream and downstream dependencies quickly?
  6. Monitoring: can teams detect drift, anomalies, and failures early?
  7. Accountability: is ownership clear and are changes communicated?
  8. Enablement: do users behave safely and consistently at scale?

Adoption rises when the trust stack is complete. Enterprises are funding trust stack components because they reduce waste and enable AI safely.

The stats table buyers will repeat internally

What leaders described | Stat or figure | What it signals to enterprise IT | How vendors should respond
--- | --- | --- | ---
Created data actually used | 20 to 30% used | Trust and usability are the bottleneck | Prioritise time-to-find and time-to-trust improvements
Breaches tied to behaviour | 96% individual errors | Human-safe controls are essential | Embed guardrails, training and workflow-based controls
Predictive performance reality | 67% recall example, with human judgement required | Oversight remains necessary | Build thresholds, review paths, and explainability into workflows
AI value when controlled | $46 million savings example from a secure approach | Production controls unlock funding | Lead with monitoring, protocols, and safe environments
Quality systems becoming explicit | 100% of calls checked against 12 criteria example | Buyers want measurable quality evidence | Define quality criteria and increase automated coverage
Complex migrations still dominate | 80 out of 93 assets migrated in one transformation | Buyers need phased, trackable progress | Show milestones, cutover planning, and governance alignment
Enablement at scale is difficult | 120,000 employees referenced in one organisation | Adoption needs a system | Provide enablement frameworks, not one-off training
Governance maturity varies | Smaller organisations struggle to establish governance programmes | Buyers want pragmatic steps | Offer staged governance roadmaps, not “big bang” programmes

What this means for vendors selling into enterprise IT

The same themes will show up in procurement, architecture reviews, and executive discussions. Vendors that align to them tend to shorten sales cycles because they reduce uncertainty.

Shift your messaging from “capability” to “confidence”

Enterprise IT is prioritising solutions that create confidence at scale:

  • Data that can be trusted
  • Controls that can be audited
  • Workflows that can be governed
  • Monitoring that prevents silent failure
  • Enablement that reduces human error

A practical message direction:

  • “We reduce the time it takes for teams to find trusted data and use it safely, including for AI initiatives.”

Sell the outcome: time-to-trust

A strong commercial framing that fits these discussions is a metric vendors can own:

  • Time-to-trust: the time from “I need data” to “I have trusted, governed data I can act on”.

You can influence time-to-trust through:

  • Better discovery and metadata
  • Clearer lineage and ownership
  • Quality evidence and monitoring
  • Embedded governance workflows
  • Enablement and behaviour change mechanisms

It is a measurable, executive-friendly framing that maps to the reality that 70 to 80% of created data goes unused, without making the conversation about blame.

Package an “enterprise governance and lineage readiness” offer

A deal-winning approach is to bring a clear, reviewable package that risk, security, and IT can evaluate quickly.

Include:

  • A catalogue and metadata acceleration plan (including approval workflows)
  • A lineage coverage matrix (what is covered first and why)
  • A classification and access model that supports AI usage safely
  • A quality and observability model (criteria, monitoring, exception handling)
  • An ownership and accountability approach (including a staged path to data contracts)
  • An enablement plan designed for scale, not small teams

This makes the buyer feel that implementation risk is understood and controlled.

Prove you reduce human error risk

Given the 96% individual error signal, buyers will increasingly ask:

  • “How does this prevent mistakes?”
  • “How does it stop unsafe access or unsafe usage?”
  • “How does it make the safe path easy?”

Vendors who can answer that clearly gain an advantage, especially in regulated or high-sensitivity environments.

How The Leadership Board helps vendors meet ideal clients

Recent discussions with senior data and technology leaders show a clear shift: budgets are moving to governance, lineage, metadata, monitoring, and enablement because these capabilities determine whether enterprises can use data and AI safely at scale.

The Leadership Board helps vendors by enabling:

  • Access to senior enterprise IT and data decision-makers working through these exact priorities
  • Better insight into what buyers will approve, fund, and scale
  • Clearer positioning grounded in what leaders are actively prioritising, including trust, oversight, and measurable quality systems