What is emotional data and what are the related privacy risks?

SearchSecurity talks with UC Berkeley professor Steven Weber about the concept of emotional data, where it comes from and how it can potentially be used -- and abused.

Personal data can reveal users' health and financial states, but can it betray users' emotional states as well?

Steven Weber believes the answer is yes. Weber is a professor at the University of California, Berkeley School of Information and director of the university's Center for Long Term Cybersecurity. He has been studying a variety of security issues, and one of them is the concept of emotional data: analysis derived from biometric data and other methods that indicates whether users are happy, sad, stressed or in some other emotional state. SearchSecurity spoke with Weber earlier this year and asked him about the concept of emotional data, the value it may provide to users and the privacy risks if the data is abused or falls into the wrong hands. Here is his answer:

Steven Weber: One of the areas that we've been interested in [at the Center for Long Term Cybersecurity] is data that is broadly about emotional states. If you look around at the devices we're building and the ability to do various kinds of biometrics, you can see there's a whole class of data around people's emotional states that marketing people are very aware of and interested in. It's a little bit clunky to collect in many respects, because at this point it's hard to measure people's pupil dilation unless they want you to. Heart rate variability and data like that are not easy to sense from across the room, but you can see those technologies coming online in the next couple of years with devices like the Apple Watch and beyond.

One of the things I'm intrigued by is whether this [emotional data], as a class of data, is fundamentally different in some way from other classes, mainly because it's information that we don't know about ourselves. You might not remember your previous address history, but you could find it out. But if I asked you, "What were the three things that happened last year that made you really unhappy?" you might have a memory of that, but you might also get it wrong. And so the person who had that data about you might know more about your emotional profile than you do. To me, that's like another level of experience.
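To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of analysis Weber is describing: turning a biometric stream into a rough emotional-state signal. It is not Weber's or any vendor's method; the inter-beat intervals, baseline value and threshold are invented for the example, and it uses RMSSD, a standard heart rate variability measure that is only a coarse proxy for stress.

```python
from math import sqrt

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between heartbeats.

    Lower RMSSD (reduced heart rate variability) is commonly associated
    with physiological stress; it is only a coarse proxy for emotion.
    """
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_flag(rr_intervals_ms, personal_baseline_ms=45.0):
    # Hypothetical rule: flag "possibly stressed" when HRV drops well below
    # the wearer's own baseline. Both the baseline and the 0.7 factor are
    # arbitrary, illustrative values.
    return rmssd(rr_intervals_ms) < 0.7 * personal_baseline_ms

# Example: a short run of inter-beat (RR) intervals, in milliseconds,
# of the sort a wearable device might report.
sample = [812, 790, 805, 798, 801, 795, 799, 802]
print(rmssd(sample), stress_flag(sample))
```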
 
And it could be manipulative and invasive in ways, too. If I were sitting in the intelligence community, this would be a goldmine for me. If I were sitting in a marketing department, it would be a goldmine as well, although it probably wouldn't need to be quite so intrusive on people. The question is, how do you value that emotional data, and how will people value that data about themselves?

There's so much good that can come of it as well. One of my favorite studies that [Princeton University professor] Daniel Kahneman did some years ago was on what's called "experience sampling." The concept is that you don't ask people a day later, "What did you like about yesterday?" You ask them over the course of the day, "How are you feeling right now?" And you get much better data. And so he did this study and ranked the 15 most unpleasant and the 15 most pleasant activities that people experience throughout their day. It's really interesting. And on average, what did people like the least? Their morning commute. Knowing that granular level of joy and pain and personal experience about other people -- in a way that they themselves don't know -- is really interesting. Think of all the wonderful things you could do for people in an effort to make them happier. It can be abused, too. You can do good things with it, and you can do bad things with it. That's true of every piece of data I've ever seen.
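As an illustration of how experience-sampling responses become the kind of ranking Weber mentions, the sketch below aggregates hypothetical in-the-moment answers ("what are you doing, and how do you feel right now?") into an average affect score per activity and orders activities from least to most pleasant. This is not Kahneman's actual methodology; the responses, rating scale and field names are invented for the example.

```python
from collections import defaultdict

# Hypothetical experience-sampling prompts: each response records the
# current activity and a momentary affect rating (1 = very unpleasant,
# 10 = very pleasant), captured in the moment rather than recalled later.
responses = [
    ("morning commute", 3), ("morning commute", 2),
    ("lunch with friends", 8), ("email", 4),
    ("email", 5), ("lunch with friends", 9),
]

def rank_activities(samples):
    by_activity = defaultdict(list)
    for activity, rating in samples:
        by_activity[activity].append(rating)
    # Average momentary affect per activity, least pleasant first.
    averages = {a: sum(r) / len(r) for a, r in by_activity.items()}
    return sorted(averages.items(), key=lambda item: item[1])

for activity, score in rank_activities(responses):
    print(f"{activity}: {score:.1f}")
```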
