
GoodData says it is CCPA compliant

GoodData says it is complying with the California Consumer Privacy Act. Meanwhile, founder and CEO Roman Stanek talks about why complexity is a key challenge for data analytics.

On Monday, GoodData said its platform is CCPA compliant and that it is helping organizations use data in a way that protects privacy.

A key challenge that organizations face with data is complying with an expanding universe of data privacy and security regulations, according to Roman Stanek, the San Francisco-based vendor's founder and CEO.

In this interview, Stanek discusses CCPA compliance and goes beyond the California Consumer Privacy Act to touch on how making data available for analytics is a key part of modern data management.

GoodData's approach -- and that of other analytics vendors -- is to try to make data and data analytics as easy as possible for end users.

Another challenge is keeping that data private.

How does GoodData handle privacy and compliance for data?


Roman Stanek: In one sentence, our vision is to make analytics available everywhere and to everyone. That addresses the notion that analytics was previously available only to a few analysts; now we want to make data available to everyone.

So governance, visibility and access rights are a huge part of our story, since we are dealing with not just one or two analysts, but potentially thousands of people in any organization.

We started working on GDPR [General Data Protection Regulation] many years ago, and security layers are part of our infrastructure. So for us, a new regulation like CCPA is just a matter of mapping it to our processes, reports and data protection. You know, we can expect dozens of similar regulations around the world, so we have to be ready for that level of complexity.
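In rough terms, mapping a regulation like CCPA to a concrete process might look like the following Python sketch of a consumer deletion request. The names and data stores here are hypothetical illustrations, not GoodData's actual implementation.

# A hedged sketch of one CCPA-style process: the right to deletion.
# user_store and audit_log are hypothetical stand-ins for real systems.
user_store = {"u42": {"email": "jane@example.com", "last_order": "2019-12-01"}}
audit_log = []

def handle_deletion_request(user_id: str) -> None:
    """Remove a consumer's personal data and record the action for auditors."""
    user_store.pop(user_id, None)
    audit_log.append({"action": "ccpa_delete", "user": user_id})

handle_deletion_request("u42")
print(user_store, audit_log)  # the record is gone; the action is logged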

Over the past decade various trends in data management have come and gone. What trends have remained constant?

Stanek: Let me start with the negative one, and that is that people always underestimate the complexity of their data. That's why you see these kinds of fads where someone says, well, data is complex, but we will put it all in Hadoop and somehow, magically, it will be simpler.

People always underestimate the complexity of their data.
Roman Stanek, founder and CEO, GoodData

It's interesting that there's always a new shiny object that people think will make data less difficult and easier to manage.

The complexity of the data actually comes from the complexity of the business. As the business is evolving, as the business is getting more and more complex, data challenges are not getting smaller, and there's no shortcut. If your business is changing, your data is changing.

For most companies, managing data is difficult, and most are not good at it. Everyone's looking for a shortcut, but the complexity is not in the rules; it's in the business itself.

How are data formats still a problem for enterprises?

Stanek: Data formats, schemas and normalization are absolutely important.

But if you go to any company, you might assume its Salesforce implementation is standard. In reality, everyone has dozens of custom schemas, custom objects, custom mappings and so on. So formats and schemas are just the basics.

You have to understand how the business is evolving and how to actually present information. Everything that makes data more automated, more structured and more defined helps. But again, there's no shortcut.
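As an illustration of the custom mappings Stanek describes, each customer's Salesforce fields might be translated to one canonical schema. The tenant and field names below are hypothetical; real Salesforce orgs each define their own custom fields, conventionally suffixed "__c".

# A minimal sketch of per-tenant schema normalization.
# FIELD_MAPS and all field names are made up for illustration.
FIELD_MAPS = {
    "acme": {"Deal_Size__c": "amount", "Region__c": "region"},
    "globex": {"Contract_Value__c": "amount", "Territory__c": "region"},
}

def normalize(tenant: str, record: dict) -> dict:
    """Translate a tenant-specific record into the canonical schema."""
    mapping = FIELD_MAPS[tenant]
    return {canonical: record[raw] for raw, canonical in mapping.items()}

print(normalize("acme", {"Deal_Size__c": 50000, "Region__c": "EMEA"}))
print(normalize("globex", {"Contract_Value__c": 72000, "Territory__c": "APAC"}))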

What's your view on data virtualization and do you use it at GoodData?

Stanek: I'm not a big believer in data virtualization. Over the last 20 years, I have seen many failed attempts at it.

The biggest problem with data virtualization is that there is no guaranteed performance. Data virtualization is good for getting some sort of overview of the data -- what do I have, where is it, and so on.

But in any kind of predictable business process, it's very difficult to rely on data sources that are outside of my control. So at GoodData, we typically copy the data into our in-memory store so that we have performance and scalability.
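The copy-then-query pattern Stanek contrasts with virtualization can be sketched in a few lines of Python. Here sqlite3's in-memory database stands in for a real analytics engine, and the remote source is stubbed with static rows; none of this reflects GoodData's actual internals.

import sqlite3

def fetch_from_remote_source() -> list:
    # In practice this would call an API or database outside our control;
    # here it is stubbed so the example is self-contained.
    return [("EMEA", 120), ("APAC", 95), ("EMEA", 60)]

# Copy the data once into a local in-memory store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", fetch_from_remote_source())

# Queries now run against the local copy, not the remote system,
# so their performance is predictable.
for row in conn.execute("SELECT region, SUM(revenue) FROM sales GROUP BY region"):
    print(row)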

Why do some organizations tend to fall short when it comes to data analytics projects?

Stanek: They underestimate the fact that analytics is not a one-and-done approach.

Most failed projects don't fail on day one, they just fail over the course of time as the data doesn't reflect the business and people go back to spreadsheets.

Most companies don't really invest enough. Maybe they invested initially in infrastructure build-up. But -- no pun intended -- they just don't have good data evolution strategies to keep investing in data analytics so that it doesn't become stale and continues to reflect their business.

Editor's note: This interview was edited for clarity and conciseness.

