EY CTO outlines data governance challenges
Multinational professional services firm EY has taken a strategic view of how to manage and use data in a federated approach powered by a trusted data fabric.
Managing multiple sources of data across a global organization is no easy task.
For EY, data governance, data quality and data lineage are paramount for itself and its customers.
EY is a multinational professional services firm operating in 150 countries, across multiple lines of business including tax, audit, consulting and strategy.
The firm collects and uses data from its customers to help them to make informed decisions, as well as for regulatory and compliance purposes related to taxation and audit. With employees and customers around the world, EY has built out a strategic function for its data management efforts to enable a unified, optimized approach to data.
In this Q&A, EY's global CTO Nicola Morini Bianzino outlines how his organization manages data and provides insight into the challenges and opportunities of data quality.
What is the EY data environment and how does it enable proper data governance?
Nicola Morini Bianzino: We are effectively like a gigantic data processing company. What we do is we consume data from our clients' environment. Then we apply rules, processes, regulations and human insights to those data sets.
Then, on the other end, we produce the right deliverable from data. That could be a tax return. It could be an audit opinion, or advice that we give to a client such as due diligence for merger and acquisition types of activity. A lot of our operating model is actually centered on the use of data.
For some of our businesses, it's absolutely critical to know where a piece of data is coming from and when it was extracted. We need to keep track of data for regulatory purposes as well. We need to be able to, for example, in an audit go back and say, 'We got this data from this client at this particular point in time for this purpose.'
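The audit requirement Bianzino describes — knowing which client a piece of data came from, when it was extracted, and for what purpose — is essentially a lineage record attached to every data set. A minimal sketch of such a record might look like the following; the class and field names are illustrative assumptions, not EY's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical lineage record capturing the three facts an auditor
# needs to reconstruct: which client, when, and for what purpose.
@dataclass(frozen=True)
class LineageRecord:
    client_id: str       # which client the data came from
    source_system: str   # where the data was extracted from
    purpose: str         # e.g. "audit", "tax-return", "due-diligence"
    extracted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def describe(self) -> str:
        """Human-readable statement for regulatory review."""
        return (f"Data from client {self.client_id} "
                f"(source: {self.source_system}) extracted at "
                f"{self.extracted_at.isoformat()} for {self.purpose}")

record = LineageRecord("client-42", "erp-export", "audit")
print(record.describe())
```

Making the record immutable (`frozen=True`) mirrors the regulatory need: once captured, the provenance of a data set should not be silently rewritten.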
We use a combination of tools, but we had to actually custom build a lot of the technology ourselves, because there are not, yet, out-of-the-box software systems that provide the level of accuracy that we need.
What is your view on using a data lake as a basis for data management and data governance?
Bianzino: I am not a big believer in data lakes. We use a different concept that we call the trusted data fabric.
We're now in a space where we know we want to load the data once in our system and use it in a way that is abstracted from the application. So, the trusted data fabric is a data layer for all our platforms. Applications consume the data from the data fabric as a service.
I don't believe in the concept of taking data and dumping that somewhere. I think you need to have a strategy where you create an abstraction layer, and then build products that can use the data fabric as a source of data.
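The load-once, consume-as-a-service pattern Bianzino describes can be sketched as a thin abstraction layer: applications ask the fabric for a named data set rather than connecting to underlying stores, and the fabric loads each source a single time. This is a minimal illustration under assumed names, not EY's implementation.

```python
# Sketch of a "data as a service" abstraction layer: applications
# request data sets by name, abstracted from the storage behind them.
class TrustedDataFabric:
    def __init__(self):
        self._sources = {}   # data set name -> loader function
        self._cache = {}     # data set name -> loaded data

    def register(self, name, loader):
        """Register a source; actual loading is deferred."""
        self._sources[name] = loader

    def get(self, name):
        """Applications consume data here; each source loads once."""
        if name not in self._cache:
            self._cache[name] = self._sources[name]()
        return self._cache[name]

fabric = TrustedDataFabric()
fabric.register("client_invoices", lambda: [{"id": 1, "amount": 100}])

# Two consuming applications share the same loaded data set,
# rather than each dumping its own copy somewhere.
invoices_a = fabric.get("client_invoices")
invoices_b = fabric.get("client_invoices")
assert invoices_a is invoices_b
```

The point of the abstraction is the contrast with a data lake: products are built against `get()`, so the storage behind a source can change without touching the applications.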
What do you see as the real rules organizations need to think about regarding data governance?
Bianzino: At EY, we cannot impose data rules that apply globally. But at the same time, we can't have the Wild West where everyone does anything they want.
That's why we have focused on a data abstraction layer where all the data policies can be configured. It's one place where all the policy configurations, the data lineage, the master data management and the data quality can be defined.
So, it's just one cross-platform capability, but then, effectively, we leave it to the different businesses to define their own specific data policies. For a business like ours, working in 150 countries with all these diverse types of clients, it becomes a process to set the right policies. So, we work in a federated model versus a decentralized model.
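The federated model described above — one shared capability for defining policies, with each business line supplying its own rules — can be sketched as a central policy registry. The names and rules below are hypothetical examples, not EY's actual policies.

```python
from typing import Callable, Dict, List

# A policy is any check on a data record: record -> allowed?
Policy = Callable[[dict], bool]

class PolicyRegistry:
    """Single cross-platform capability for policy configuration."""
    def __init__(self):
        self._policies: Dict[str, List[Policy]] = {}

    def add_policy(self, business_line: str, policy: Policy) -> None:
        """Each business line registers its own rules in one place."""
        self._policies.setdefault(business_line, []).append(policy)

    def is_allowed(self, business_line: str, record: dict) -> bool:
        """A record passes only if every policy for that line allows it."""
        return all(p(record) for p in self._policies.get(business_line, []))

registry = PolicyRegistry()
# Illustrative rules: audit requires lineage; tax restricts jurisdictions.
registry.add_policy("audit", lambda r: r.get("lineage") is not None)
registry.add_policy("tax", lambda r: r.get("country") in {"US", "UK"})

print(registry.is_allowed("audit", {"lineage": "client-42"}))  # True
print(registry.is_allowed("audit", {}))                        # False
```

The federation is in the split of responsibilities: the registry (the cross-platform capability) owns how policies are stored and evaluated, while each business line owns what its policies say.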
What do you see as the biggest challenge organizations face in getting value from their data?
Bianzino: I think a challenge with data is that we all tend to think about data in one dimension -- be it from a risk perspective or from a quality perspective.
Sometimes, you don't have perfect data quality, but it's better to do some analytics on the data, because what you can gain are insights about a trend without needing absolute perfection from the data.
Sure, there are specific use cases where, because of regulations you need to strive for perfection. But I think there is a lot of value in data, even if the data is not absolutely pristine.
Editor's note: This interview has been edited for clarity and conciseness.