It's GDPR Day. Let the privacy regulation games begin!

May 25, 2018 was “GDPR Day”: the day enforcement of the European Union’s new General Data Protection Regulation began; the day many information security professionals had spent the past two years preparing for; the day so many have been anticipating and fearing.

Many have treated GDPR Day as a deadline for complying with an entirely new privacy regulation, with woe to all who were not ready in time.

However, GDPR Day is not a deadline — it’s a starting date.

If you’re new to the GDPR game, last Friday was the first day the new regulation could be enforced in the EU against any organization collecting personal data and failing to comply with the new rules.

Max Schrems, the Austrian attorney and privacy activist who helped bring down the long-established Safe Harbor framework governing trans-Atlantic data flows over privacy concerns in 2015, is on the job now as well. His group, NOYB (“None of Your Business”) filed the first complaints under GDPR, alleging that Facebook and its Instagram and WhatsApp services, as well as Google, were attempting to do an end-run around GDPR consent policies by “forcing” consent: telling users there is a new privacy policy, but giving them no way to opt out of sharing other than to stop using the service entirely.

And, anyone who imagined Facebook and Google would be the only companies facing this type of charge was simply wrong.

Monday morning after GDPR Day brought more complaints: seven claims by the French digital rights group La Quadrature du Net against Facebook, Google (in three separate complaints targeting Gmail, YouTube and Search), Apple, Amazon and LinkedIn. The group had originally intended to target a dozen services but held back complaints against WhatsApp, Instagram, Android, Outlook and Skype in order to avoid overwhelming the system.

Forced consent is not OK under GDPR

The intent of the GDPR is to return control of personal data to EU data subjects. Up until now, companies like Facebook and the rest have been gathering data about their users and then finding ways to turn that data into revenue, for example, through targeted advertisements. There have been no significant obstacles to those big data companies sharing or reselling some or all of the personal data they collect, and users have had little to no recourse to prevent any of it. At best, services buried the controls for opting out of targeted advertising deep in their settings; at worst, even leaving (or never joining) a service altogether might not stop the data collection and sale, as was the case with Facebook’s “shadow profiles.”

In the run-up to GDPR Day, the big data companies rolled out “opt-in” consent policies that effectively force consent from users. This forced consent is not just a bad look on the part of these big corporations; as NOYB put it in its statement, it is in fact illegal under the new rules.

Schrems said in a statement that when Facebook blocked accounts of users who withheld consent, “that’s not a free choice, it more reminds of a North Korean election process.”

NOYB pointed out that Article 7(4) of the GDPR prohibits “such forced consent and any form of bundling a service with the requirement to consent” — and Schrems said that “this annoying way of pushing people to consent is actually forbidden under GDPR in most cases.”

Schrems and NOYB also note that the GDPR doesn’t mean companies can’t collect any data from their users, because there are some pieces of information that they need in order to provide their services. “The GDPR explicitly allows any data processing that is strictly necessary for the service – but using the data additionally for advertisement or to sell it on needs the users’ free opt-in consent.”

In other words, if the data is required for the service provider to be able to provide the service, consent is no longer required — but for any other use, the users must be given a real choice.

So, who should be worried about GDPR enforcement?

In the days since GDPR Day and the start of enforcement, it is clear that companies that have failed in some way to comply with the new rules — especially those that have attempted to comply in a way that circumvents the consumer protections provided by GDPR — should be worried.

If your organization has taken the steps necessary to comply, in good faith, with the GDPR, it is probably safe. If your organization safeguards the personally identifying data of its customers, employees and anyone else whose data it collects, it is also probably safe.

However, if your company is making an effort to appear to be in compliance with GDPR, but in a way that attempts to subvert the privacy regulation, you should worry.
