Data Privacy Takes on Emerging Technologies

In BigIDeas on the Go, Kurt Opsahl, Deputy Executive Director and General Counsel of the Electronic Frontier Foundation (EFF), shares his insights about how privacy law can adapt to changing technologies to “preserve a private space” for individuals.

Privacy Is the “Core” of Your Other Rights

“We’re trying to make sure we are going toward a world we would want to live in — one that’s not too dystopian,” says Opsahl of his work at the EFF, a nonprofit he describes as “dedicated to defending rights online and fighting for free speech, fair use, innovation, and privacy.”

Once termed a “rabid dog” by the Department of Justice for his work responding to government subpoenas, Opsahl has watched the state of privacy evolve right along with the internet over his three-decade career.

A lot of the privacy issues we face today have their roots in the 1990s, says Opsahl, “as more and more things went online, as commerce moved online, and some of the user-generated content sites began.”

Privacy, which Opsahl describes as “the right to be left alone, to have some autonomy in what you are doing, not having somebody looking over your shoulder,” is very important to society — and core to a lot of other rights.

“If you’re going to have freedom of expression, you can’t also have it that the government is looking at everything you write … If you’re going to be able to exercise rights to organize and assemble, to have some privacy in how you’re able to organize them is important.”

The Inherent Privacy Challenge of Technology

New technologies present further challenges to privacy rights and regulations. Many technological advances go hand-in-hand with privacy risks. Calls and data sent to or from your phone reveal its location, for example. The same is true of browser requests to and from your IP address. Your Fitbit logs health data about your exercise, activity, heart rate, sleep patterns, and so on.

Most likely, there are some things you do want to share with the world and some things you don’t — and you want that information to go to the right people, not the wrong ones. “We want to have a world in which you can take advantage of those technologies without unduly sacrificing the privacy that is inherent to what makes people who they are,” says Opsahl.

People sharing information about themselves online has been happening for a long time, and this data becomes “a public conversation,” says Opsahl. “There was an interesting time period when people had a different attitude about things. There was a cartoon that was around in the ’90s showing a dog in front of a computer, and it said, ‘On the internet, no one knows that you’re a dog.’

“Those were some of the early attitudes, that the internet gave you a pseudo-anonymity … What we’ve seen is an evolution where they definitely know that you’re a dog. They know what kind of dog you are — and what your favorite dog food is.”

Part of the challenge involves a disconnect between what people think their privacy settings are, and what they actually are — and this is growing increasingly complex.

“Most of the privacy scandals that we’ve seen over the last years can be boiled down to: people had an expectation of what was happening, and it wasn’t what was actually happening … If people think they’re more [protected] than they actually are, that’s when you’ll have a privacy scandal.”

What’s Changed Since GDPR

For a long time, the core principle of consumer privacy regulation was “honesty.” If a company stated in its privacy policy, “you have no privacy,” that was still considered “honest” — and therefore permissible.

The EU’s General Data Protection Regulation (GDPR) changed all that by imposing minimum standards and prioritizing user consent — how companies get it, and that it can be withdrawn. “We’ve seen more of a shift to that, and the GDPR has had a worldwide effect and has inspired things like the CCPA (California Consumer Privacy Act).”

Over the past 20 years, we’ve seen more technologies that collect information become connected and networked. That means that the amount of personal and sensitive information out there will increase as people want to enjoy the advantages new technologies offer.

If we “make technology that is collecting just the minimum [information] that is necessary to provide the function, storing that data for the minimum time necessary to do it, and then getting rid of it, plus having legal protections to protect people from unauthorized attempts to get that information — whether it be from the government or from bad actors,” we’ll be headed toward a future that preserves a private space.
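The data-minimization and retention principles Opsahl describes can be sketched in code. This is an illustrative toy, not anything from the podcast: the field names, the 30-day retention window, and the helper functions are all assumptions chosen to make the idea concrete.

```python
from datetime import datetime, timedelta, timezone

# Fields actually needed to provide the service (illustrative assumption).
REQUIRED_FIELDS = {"user_id", "email"}

# Keep records only as long as necessary (illustrative: 30 days).
RETENTION = timedelta(days=30)

def minimize(record: dict) -> dict:
    """Collect just the minimum fields necessary for the function."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

def purge_expired(store: list, now: datetime) -> list:
    """Get rid of records held longer than the retention window."""
    return [r for r in store if now - r["stored_at"] < RETENTION]

# Example: an incoming record carries more than the service needs.
raw = {"user_id": 1, "email": "a@example.com", "location": "...", "heart_rate": 72}
kept = minimize(raw)
print(kept)  # location and heart rate are never stored
```

The point of the sketch is that minimization happens at collection time — sensitive extras like location are dropped before they ever reach storage — and deletion is a routine function of the system, not an afterthought.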

“People want to share their information. I really reject the notion that in order to have privacy, you have to live alone in the woods and cut yourself off from the internet. But it also means that we have to take some precautions.

“There’s a future where we can get to a state of privacy — if we work hard.” Listen to the full podcast to learn more about how Opsahl sees us getting there.