Data Governance Innovation in Financial Services
Dan Power, managing director of data governance at State Street, joins BigIDeas on the Go to talk about AI and ML in data governance programs, the data management landscape in financial services, and why automation is necessary.
During his tenure in the data governance world, Dan Power has served several Fortune 500 firms—first on the corporate side, later as a consultant under his own firm, and now as managing director of data governance at State Street Bank in Boston.
Reflecting on his current position, Power says: “I really liked the idea that with financial services there’s a lot of complexity, a lot of regulations, a lot of systems—I didn’t think it would get boring anytime soon, which it hasn’t. I’ve been really excited to be a part of [State Street’s] digital transformation.”
Using AI in Master Data Management (MDM)
What does a digital transformation entail for a large-scale financial incumbent like State Street? A close examination of its current master data management practices, for starters. “We should have a single source of truth for things like customer data, legal entities, vendors, and product items,” Power explains.
“We have a big chunk of our processes that are still manual, but the problem is the people doing those processes are eventually going away and we have to put either a program or a robot [in]. We’re in the process of upgrading a big chunk of our technology stack around data management and BigID is a part of that.”
Eliminating outdated processes and replacing them with ones that provide clear value has been a guiding principle of State Street’s transformation. “Nowadays,” Power says, “we’re trying to automate all that so the digging is automated and everything kind of shows up on the data steward’s desk, and then they have to curate it and put it together.”
“What we’re finding is we’re starting to move away from that manual process where there’s a lot of reverse engineering and even the reading of code.”
Reconciling Data Quality with Automation
Power describes himself as a “data quality junkie.” He and his team have been changing their approach to data quality and how it impacts profitability within their organization.
“Even with the legacy systems, you don’t really know [what] the data was—whether it was bad or not inside the application, but sure enough when you bring it out it’s pretty bad; it’s missing things or coded wrong.” The ability to harmonize data between systems has been a crucial component of streamlining their operations.
Power believes data quality is an inherent function, so instead of having application owners and IT teams develop and maintain data quality capabilities from scratch, he has implemented systems to assist.
“It’s essentially a step change from the kind of very distributed way that we’re doing data quality today (which works, but it’s a lot of effort) to a more centralized, data quality tool-driven approach, where we will have the ability to kind of manage that catalog of data quality checks and controls.”
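To make the idea of a centrally managed catalog of checks a little more concrete, here is a minimal, purely illustrative sketch in Python. It is not State Street’s or BigID’s implementation; every name in it is hypothetical. It simply shows reusable checks registered in one place and applied to records from any source system, rather than being rebuilt inside each application.

```python
# Illustrative sketch only: a tiny, centrally managed catalog of data quality
# checks. All names are hypothetical, not drawn from any vendor's product.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Check:
    name: str                        # e.g. "country_code_valid"
    field: str                       # field the rule applies to
    rule: Callable[[object], bool]   # returns True if the value passes


class CheckCatalog:
    """Central registry of reusable checks, instead of per-application logic."""

    def __init__(self) -> None:
        self._checks: List[Check] = []

    def register(self, check: Check) -> None:
        self._checks.append(check)

    def run(self, record: Dict[str, object]) -> List[str]:
        """Return the names of the checks this record fails."""
        failures = []
        for check in self._checks:
            if not check.rule(record.get(check.field)):
                failures.append(check.name)
        return failures


# Example checks for the kinds of problems Power mentions:
# values that are missing or coded wrong.
catalog = CheckCatalog()
catalog.register(Check("legal_entity_name_present", "legal_entity_name",
                       lambda v: bool(v)))
catalog.register(Check("country_code_valid", "country_code",
                       lambda v: v in {"US", "GB", "DE", "JP"}))

record = {"legal_entity_name": "", "country_code": "XX"}
print(catalog.run(record))  # ['legal_entity_name_present', 'country_code_valid']
```

The point of the sketch is the design choice, not the code itself: the checks live in one catalog that data stewards can manage, while each source system only supplies records to be evaluated.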
Listen to the full podcast to hear more about Dan’s insights on the financial data landscape and the innovations changing the industry.