The explosion of IoT devices has created an equally explosive need for improved privacy and security measures. But meeting that need can be challenging, especially when innovations in technology outpace both regulatory efforts and the security capabilities of the IT workforce. To safely and efficiently access the IoT-connected world, people must be able to trust that their privacy matters.
In order to secure identity-enabled experiences and prove they take user privacy seriously, organizations must embrace the challenges that land in their laps. Implementing data privacy regulations, such as GDPR or the California Consumer Privacy Act (CCPA), should be viewed as an opportunity, not a hindrance.
Eve Maler, interim CTO at identity and access management platform ForgeRock, is a progressive proponent of privacy, user consent and IAM innovation. She directs the User-Managed Access (UMA) standards initiative and co-invented the SAML and XML standards. Her passion for privacy is apparent in her four-step strategy to improve and uphold data privacy and protection.
Here, Maler explains how organizations can use the opportunity to approach privacy in a holistic way to the benefit of the user, the institution and the future of user data privacy in the digital age.
Editor's note: This interview has been edited for length and clarity.
How should organizations define privacy today?
Eve Maler: In the modern era, we've got a new view on what data privacy needs to be -- it's a far cry from the EU Data Protection Directive of 1995. Today, it involves building a pyramid of privacy requirements. Data protection is the first layer of this pyramid -- the baseline. The next layer is data transparency -- enterprises are expected to tell people why they want their data. The third and final layer of data privacy is data control. This is where the business model comes into play: the business model needs to give people control over their own lives.
How should organizations approach user data privacy and protection?
Maler: With this data privacy pyramid in mind, organizations should consider the following four steps to improve user data privacy and consent management:
- Identify where digital transformation opportunities and user trust risks or gaps intersect. With all this data, users are more cynical and skeptical about their privacy. We've seen cases where new digital products and smart devices have created trust risks. For example, a mother using an IoT device in the home realized, 'Oh my goodness, you're listening to my children.' These trust risks make for terrible interactions, but organizations can use such unfortunate cases as a way to discover new data privacy opportunities.
- Consider personal data as a joint asset. Organizations understand their chief privacy officer and data protection officer are incentivized by regulations. According to regulations, users own that data, and this is to the data subject's benefit. But there are people in the enterprise who may feel differently about it. This is because data can be so valuable to the business model. Getting all the stakeholders together and defining the total data privacy proposition is important. Individuals and departments need to think about it as a joint effort between the organization and the customer.
- Embrace user consent. Regulations like GDPR and CCPA require companies to think about consent. Some businesses have a choice about whether to offer content to an end user in exchange for data versus simply taking the data without being transparent. There are benefits to giving users a choice. It is tough, but that's what the top of the privacy pyramid is about: giving authority and control to the end user.
- Build and market trust. Once consent management is treated as something valuable to the organization, privacy becomes clear. The value of consumer IAM in building trust with end users will also become clear.
How can organizations be incentivized to improve customer data privacy?
Maler: They must present themselves as trustworthy. This can be accomplished by extending authority to individual users in appropriate ways and by reinforcing transparency through protecting the data. Open banking is one area where user consent and access management have been innovated. Open banking in the U.K. specifically pioneered what it calls 'strong customer authentication,' which requires strong patterns of user authentication -- beyond just the password.
Give legitimate customers an experience with the appropriate amount of friction. For example, a mobile app may ask for Touch ID if you have an iPhone. But, periodically, an app might prompt you to reconfirm with a PIN. The question is: How often should it reconfirm with the PIN?
The enterprise must design the right kind of access journey for the user. For example, it can use context -- recognizing that you are connected to a familiar Wi-Fi network. But when you connect to the hairdresser's likely insecure Wi-Fi, it is best to reconfirm authentication. This is how an enterprise can become a partner in the security of its own users. Strong customer authentication can be used for mutual assurance that a company is doing the right thing.
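The contextual step-up logic described here can be sketched in a few lines. This is a minimal illustration under assumed rules -- the names (`SessionContext`, `required_step`) and the once-a-day PIN interval are hypothetical, not part of any real IAM product's API:

```python
# Hypothetical sketch of contextual step-up authentication:
# familiar network and a recent PIN -> low-friction biometric;
# unfamiliar network or stale PIN -> reconfirm with a PIN.
from dataclasses import dataclass

@dataclass
class SessionContext:
    network_id: str         # identifier of the Wi-Fi network in use
    known_networks: set     # networks previously seen for this user
    minutes_since_pin: int  # time since the user last confirmed a PIN

def required_step(ctx: SessionContext, pin_interval: int = 60 * 24) -> str:
    """Pick the authentication step for this request."""
    if ctx.network_id not in ctx.known_networks:
        return "pin"        # e.g. the hairdresser's insecure guest Wi-Fi
    if ctx.minutes_since_pin > pin_interval:
        return "pin"        # periodic reconfirmation, once a day here
    return "biometric"      # low-friction path, e.g. Touch ID

home = SessionContext("home-wifi", {"home-wifi", "office-wifi"}, 30)
salon = SessionContext("salon-guest", {"home-wifi", "office-wifi"}, 30)
print(required_step(home))   # biometric
print(required_step(salon))  # pin
```

A production policy engine would weigh many more signals (device posture, geolocation, behavioral patterns), but the shape is the same: context in, required friction out.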
How do you communicate the importance of privacy and user consent in cases where organizations experience dismay or fatigue about new privacy laws?
Maler: Maybe not dismay -- more like a big sigh. There is a sense that previously collected consent may no longer be good. Among IT professionals, this is viewed as another challenge we must accept. One response has been to build in the ability to attach version numbers to terms and conditions that could be accepted across different application versions and different countries.
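The versioned-consent idea Maler mentions can be illustrated with a small sketch. Everything here is hypothetical -- the record fields, the per-country version table and the validity rule -- and it only shows why attaching a version number to each stored consent makes staleness easy to detect:

```python
# Hypothetical sketch: consent records that carry the version of the
# terms and conditions accepted, so older consent can be flagged stale.
from dataclasses import dataclass

@dataclass(frozen=True)
class ConsentRecord:
    user_id: str
    terms_version: str  # version of the terms the user accepted
    country: str        # jurisdiction whose terms applied

# Current terms version per jurisdiction (illustrative values only).
CURRENT_TERMS = {"DE": "3.1", "US-CA": "2.0", "GB": "2.4"}

def consent_is_current(record: ConsentRecord) -> bool:
    """Consent against anything but the current version must be re-collected."""
    return record.terms_version == CURRENT_TERMS.get(record.country)

old = ConsentRecord("u42", "2.9", "DE")
new = ConsentRecord("u42", "3.1", "DE")
print(consent_is_current(old))  # False -- DE terms moved on to 3.1
print(consent_is_current(new))  # True
```

When a jurisdiction publishes new terms, bumping its entry in the table automatically invalidates older consent records, prompting re-collection.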
These are the vicissitudes of regulatory changes -- complications come with the territory. I think there is pressure on us to federalize privacy laws, as opposed to state-by-state laws. GDPR was the result of trying to create a digital single market. It sought to increase individual empowerment. Perhaps the best success of GDPR was that enterprises started to feel an imperative to do data inventories and improve data hygiene controls.
I think we will start to see the same thing in the U.S. Because we lack a digital single market around privacy laws in the U.S., we are suffering. We may be used to doing things differently for California markets -- gasoline is one example -- but now that CCPA and GDPR have been implemented, handling personal data differently per jurisdiction is seen as inefficient. I think we are really under pressure to create better regulatory efficiencies.
You've identified yourself as an UMAnitarian -- do you find your work with User-Managed Access and data privacy fulfilling?
Maler: Those of us who have worked on standards in this area for a long time tend to be passionate about digital identity in general. It is hard not to be passionate about the digital capture of a part of you. In the area of IAM, there is a passionate core of people who care. There is even a bunch of people who have a rock-and-roll band together!
In IAM, there is always some work that is in flux. We can absolutely do better and actualize more mature standards. I'm excited about what we can do better.
When it comes to the future of data privacy, what are you excited about in particular -- new attitudes, new technology, new regulation?
Maler: I don't know many people who are necessarily excited about new regulation. We are investigating an area that is the talk of the IAM industry: decentralized identity. It has yet to prove itself, but it's something we are keeping an eye on.
We also believe identity relationship management is crucially important to the future of data privacy. For example, in the healthcare industry, regulations are increasing pressure on organizations to reduce what they call 'information-blocking' or 'data-blocking maneuvers.' There are attempts to ensure healthcare providers do not stop patients from accessing their healthcare information. Enabling people to share information with other parties is important in healthcare. The future of identity is about relationships.