Data Governance in Financial Services:
How to Turn Compliance Into Your AI Advantage

Two leaders of data governance in financial services reveal why the infrastructure required for BCBS 239 compliance is the same foundation that makes AI possible—and how to build it strategically rather than reactively.

Your board wants AI results yesterday. Your regulators demand pristine data governance. Most banks approach these as separate initiatives requiring separate investments.

Increasingly, financial services data leaders are discovering they’re actually the same project: the infrastructure that satisfies BCBS 239 audits is the same foundation that makes AI possible. It’s not about choosing between compliance and innovation—it’s about building one system that serves both.

The timing of this convergence couldn’t be more critical. Financial institutions dedicated 61% more employee hours to compliance between 2016 and 2023, even as total employee hours grew only 20%, according to the Bank Policy Institute. Meanwhile, 88% of financial services firms report governance and security challenges with AI adoption.

The opportunity? Both problems require the same solution.

At CONTEXT: The Metadata & AI Summit, Ravi Josyula, Head of Enterprise Data Governance at Webster Bank, and Sid Narayan, Head of Data Governance at Valley National Bank, shared how they transformed episodic compliance exercises into continuous competitive infrastructure.

Their strategic framework addresses the dual mandate facing financial institutions today: satisfy regulatory requirements while building the data foundations that AI initiatives need to succeed.

You’ll learn why they see compliance as a strategic asset rather than a cost center, their risk-based prioritization method that focuses governance efforts where they matter most, how building governance into systems from the start prevents expensive remediation cycles, and why AI Data Catalogs have become essential infrastructure for executing these strategies at scale.

Watch the full session on-demand → 

The proactive advantage: Building for business value, not just compliance

Both Ravi and Sid emphasized building governance proactively rather than reactively. Not because exams are imminent threats, but because reactive approaches cost more and deliver less.

Ravi recounted a regulatory exam scenario: “If there is an exam and they ask you, ‘Show me in this cell, where does this data come from? What are the systems that this data originated from?’ If you can’t answer that question by the end of the day, you can give yourself a few MRAs at that point.”

Matters Requiring Attention (MRAs) trigger extensive remediation, follow-up exams, and board-level reporting. But the deeper problem isn't the regulatory consequences. It's that an organization that fails basic lineage questions can't possibly support the data foundation that AI requires.

Don’t wait for an exam. Do it for the AI agent and you’re already halfway there to pass your regulatory exam.

Sid Narayan

Head of Data Governance, Valley Bank

To take a proactive approach with your data governance, don’t just build for your next regulatory exam. Build because:

  • Your business needs good data to make informed decisions regardless of regulatory pressure. Even without compliance mandates, you still need reliable data to run operations and make sound business decisions.
  • AI models require the same rigor that regulators demand. There’s no separate infrastructure for AI readiness and compliance. They’re the same capabilities serving different stakeholders.
  • Proactive governance costs less than reactive remediation. Failing to answer basic questions during exams results in MRAs that trigger extensive remediation while regular compliance obligations continue unchanged.
  • Automated approaches remain current as systems evolve. Manual documentation becomes obsolete immediately; modern catalogs adapt continuously as pipelines change.

This reframing transforms expensive episodic exercises into continuous competitive infrastructure. Whether stakeholders are auditors, business analysts, or AI systems, they all need the same thing: trusted, traceable, high-quality data.

How BCBS 239 principles set the stage for AI innovation

In 2013, the Basel Committee on Banking Supervision established 14 principles for risk data aggregation and reporting, covering architecture, completeness, accuracy, timeliness, clarity, and distribution. These principles were designed to prevent the kind of data failures that contributed to the 2008 financial crisis.

Presciently, these same principles from over a decade ago describe exactly what AI models demand today: traceable data provenance, validated quality and freshness, clear ownership, and systematic guardrails.

Sid elaborated on how these requirements have expanded beyond their original scope: “Just the practical use of [BCBS 239] has extended to more than risk reporting. It also applies to your regulatory reporting, your financial reporting, and any of your critical AI models that are making decisions based on your firm’s data.”

This isn’t a coincidence. Both regulatory compliance and AI innovation require the same fundamental capabilities:

  • Complete lineage showing where every data point originated and how it was transformed
  • Continuous quality monitoring that catches issues before they cascade downstream
  • Clear ownership and accountability for each dataset
  • Automated validation that data meets defined standards
  • Impact analysis showing what breaks when systems change
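To make the overlap concrete, the capabilities above can be sketched as a toy in-memory metadata model. This is an illustrative sketch only, not a real catalog API: the `Dataset` class, the dataset names, and the `full_lineage` helper are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    name: str
    owner: str  # clear ownership and accountability for each dataset
    upstreams: list = field(default_factory=list)  # lineage edges

def full_lineage(ds: Dataset) -> list:
    """Walk upstream edges to show where every data point originated
    and which systems it passed through on the way."""
    seen = []
    for up in ds.upstreams:
        seen.append(up.name)
        seen.extend(full_lineage(up))
    return seen

# Hypothetical data flow: source system -> risk aggregation -> board report
core_banking = Dataset("core_banking.transactions", owner="ops-data")
risk_agg = Dataset("risk.aggregated_exposures", owner="risk-data",
                   upstreams=[core_banking])
board_report = Dataset("reports.board_risk_pack", owner="finance",
                       upstreams=[risk_agg])

# The exam question "where does this data come from?" becomes a traversal:
print(full_lineage(board_report))
# → ['risk.aggregated_exposures', 'core_banking.transactions']
```

The same traversal that answers a regulator's provenance question is what an AI pipeline would consult to verify its training data's origins, which is the convergence both speakers describe.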

Download our guide BCBS 239 Compliance and Beyond to see how DataHub helps you meet regulatory requirements today and build AI-ready foundations for tomorrow.

Why do boards suddenly care about data governance?

For years, data governance leaders advocated for better metadata management and quality controls. The C-Suite nodded politely and approved minimal budgets. Just enough to avoid regulatory trouble.

AI changed everything.

Now the board and the shareholders are excited about AI. They were never excited about the regulators. They were only doing it to stay in business. But now, to be competitive in today’s market, you have to be a leader in AI.

Sid Narayan

Head of Data Governance, Valley Bank

A 2025 survey by EY found that 72% of organizations have already integrated AI into their initiatives, and 99% are at least in the process of doing so. But the same survey revealed that governance is lagging behind, with half of organizations now making significant investments in governance frameworks to mitigate AI risks.

This lag creates a strategic advantage for institutions that made early investments. Organizations that built robust data governance capabilities to satisfy BCBS 239 and similar regulations now find themselves unexpectedly well-positioned for AI. They have the lineage, quality controls, and metadata management that AI projects require.

Meanwhile, institutions that took shortcuts on compliance fundamentals must now backfill governance capabilities before advancing their AI strategies. They’re laying the foundation while competitors are already constructing the building.

The competitive advantage lies in having AI ambitions accompanied by the data governance infrastructure to execute them safely and at scale. As both leaders emphasized, this infrastructure isn’t separate from compliance capabilities—it’s the same infrastructure serving a new, more strategic purpose.

From principles to practice: Where to actually start with data governance?

Understanding that BCBS 239 principles apply to AI is one thing. Implementing them across your entire data estate is another. The question both leaders addressed head-on: if these principles apply to everything from regulatory reporting to AI models, where do you actually start?

Start with what matters: Risk-based prioritization

A consistent theme throughout the discussion was the impossibility of trying to govern everything at once.

“You cannot boil the ocean,” Ravi emphasized. “What we’ve done [in our organization] is we’ve said, ‘We’ll go after our most critical high priority reports that go to the regulators, that go to the board, and senior management committees of the board. Let’s make sure we have that under good governance before we tackle more things.’”

Sid echoed this practical approach: “You can only govern your critical data elements. You can’t govern your entire data estate.”

This isn’t a compromise—it’s a strategic necessity. Attempting to govern everything equally dilutes resources and fails to protect what matters most. The organizations that succeed concentrate their initial efforts on the highest-risk data flows, then expand systematically as their governance infrastructure matures.

Build governance into systems and workflows

Both leaders emphasized embedding data governance into development processes rather than treating it as a validation gate at the end.

“Data governance needs to be embedded in any of your projects,” Ravi stressed. “If you’re redoing your data pipelines, you wanna make sure data governance is in there, and it’s not, ‘Oh, let’s finish the implementation, and then we’ll just run it by data governance people.’”

Sid highlighted why this matters structurally: “It’s usually not data governance people either. It’s a data governance person in an organization. They expect one or two people to govern the entire organization’s data.”

The shift from governance-as-afterthought to governance-by-design prevents expensive remediation cycles. As Sid explained, there’s been a necessary evolution toward thinking “about data quality as a requirement before you actually build.” Traditionally, IT teams faced time pressures “to get the data in, not thinking about the quality while they’re building the system.”

When governance becomes a project requirement rather than a post-implementation validation step, quality controls get built into pipelines, lineage gets captured automatically, and teams don’t face expensive rework when audits reveal gaps.

Why manual approaches can’t keep pace

These principles sound straightforward. The challenge is executing them at scale as systems continuously evolve.

Both leaders spoke candidly about the limitations of manual approaches. Before modern metadata platforms, banks spent millions on consultants to manually document lineage for regulatory exams—only to have that documentation become obsolete the moment systems changed.

Manual lineage is a waste of money. If you’re gonna do it, do it automated. And you might as well do it at an element level because the AI agent of the future will go to that level of granularity.

Sid Narayan

Head of Data Governance, Valley Bank

The transformation from manual to automated isn't just about cost; it's about sustainability. Valley National Bank's enterprise data warehouse migration required consistent management of critical data elements across multiple years and iterations. Manual lineage documentation could not have kept pace with that evolution.

As Ravi noted: “By the time you’re done with one [manual documentation exercise], those ETLs have changed.” The documentation you paid consultants to create six months ago no longer reflects your current systems. During regulatory exams, outdated documentation becomes worse than useless. It becomes evidence that your governance processes can’t keep up with operational reality.

This sustainability gap is where infrastructure choice becomes critical.

How AI Data Catalogs enable sustainable governance

Building governance into systems from the start requires infrastructure that can keep pace with continuous change. Manual documentation and periodic audits can’t support the governance-by-design approach that Ravi and Sid advocate.

What’s needed is continuous, automated visibility that captures lineage and quality as part of normal operations. Not through separate documentation exercises that immediately fall out of date.

What do AI Data Catalogs actually do?

AI Data Catalogs like DataHub represent a fundamental shift from earlier generations of data governance tools. Instead of requiring teams to manually document what they’ve built, these platforms automatically ingest metadata from systems themselves:

  • Automated cross-platform lineage that traces data across databases, pipelines, transformations, and BI tools to provide the column-level precision that answers both regulatory questions and AI implementation needs
  • Continuous data quality monitoring that catches freshness delays, schema changes, and quality degradation before they cascade downstream to regulatory reports or AI models
  • Impact analysis before changes showing exactly which downstream reports, dashboards, and models depend on each dataset, preventing the breaking changes that crash both regulatory reporting and production AI systems
  • Self-service governance workflows that eliminate bottlenecks where lean governance teams can’t review every access request, while maintaining full audit trails
  • Natural language interfaces enabling conversational search for both human analysts and AI agents

When developers deploy a new pipeline, the catalog captures its lineage automatically. When analysts create a dashboard, dependencies get mapped in real-time. When schemas evolve, impact analysis updates in seconds.
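The impact-analysis capability in particular can be sketched as a reverse walk over a lineage graph: given a dataset about to change, find everything downstream that depends on it. This is a minimal illustration, not DataHub's actual API; the edge map and dataset names are invented for the example.

```python
# Hypothetical lineage graph: upstream dataset -> downstream dependents.
edges = {
    "core_banking.transactions": ["risk.aggregated_exposures"],
    "risk.aggregated_exposures": ["reports.board_risk_pack",
                                  "models.credit_default_ai"],
}

def impacted(dataset: str) -> set:
    """Collect every downstream asset (transitively) that would break
    if `dataset` changed its schema or semantics."""
    result = set()
    for child in edges.get(dataset, []):
        result.add(child)
        result |= impacted(child)
    return result

# Before altering the source transactions table, see what it would break:
print(sorted(impacted("core_banking.transactions")))
# → ['models.credit_default_ai', 'reports.board_risk_pack',
#    'risk.aggregated_exposures']
```

Note that the blast radius includes both a regulatory report and an AI model: one analysis protects both stakeholders, which is the point of treating them as a single infrastructure.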

This solves the sustainability problem that manual documentation cannot. More importantly, the same infrastructure serves compliance and AI readiness simultaneously.

Five priorities for sustainable data governance in financial services

These implementation priorities from Ravi and Sid’s session represent the next steps for any financial institution building governance that satisfies both compliance mandates and AI ambitions:

1. Reframe compliance as business enablement, not cost

BCBS 239 and similar regulations establish the data quality, lineage, and governance standards that make AI possible. Elevate compliance from an obligation to a strategic foundation.

2. Treat governance as a marathon instead of a sprint

The most critical shift in mindset is recognizing that data governance is not a project to be completed, but a capability to be continuously refined. This journey starts by focusing on core requirements, then iterating over time. By treating governance as a system—not a task—financial institutions ensure their foundation remains current, auditable, and ready to support the next wave of AI innovation.

3. Integrate governance into everyday processes

Effective governance boosts overall productivity, but wherever it introduces friction, it will meet resistance. Successful governance requires seamless integration with everyday processes so that compliance becomes a frictionless part of the job.

4. Automate what consultants used to document manually

Manual lineage exercises cost millions while providing limited operational value between exams. Automated approaches provide continuous visibility for a fraction of recurring consultant costs.

5. Invest in an AI Data Catalog to build beyond compliance

AI Data Catalogs transform governance from periodic compliance exercises into always-on infrastructure that serves regulators, data teams, and AI systems simultaneously.

The bottom line

Data governance in financial services is evolving from a regulatory obligation into the foundation for next-generation AI strategies and a decisive competitive advantage.

Organizations have a clear path to sustainable success by recognizing this convergence and choosing to invest proactively, automate manual processes, prioritize strategically, and build data systems for long-term business value rather than just regulatory demands. 

The opportunity lies in modernizing your metadata management and data governance capabilities into a unified system that serves every stakeholder and positions your firm for sustainable AI innovation and market leadership.

Learn more

Watch the full session

Hear the complete insights from Ravi Josyula (Webster Bank), Sid Narayan (Valley National Bank), and Stephen Goldbaum (DataHub) on building sustainable data governance for compliance and AI readiness.
Watch the session on demand → 

Download: BCBS 239 Compliance and Beyond

Learn how leading financial institutions are implementing BCBS 239 compliance while building AI-ready data foundations with DataHub.
Download the free guide
