How to Win the Privacy War With Facebook

Privacy on the internet has never been anything more than a vague concept. And when it comes to social media, which by definition involves an open exchange of ideas and information (not to mention memes, embarrassing photos, cat videos and fake news), "privacy" becomes all but unrecognizable.

It's only now, with the news that a data firm gathered and used information on millions of Facebook users without their knowledge — perhaps to nefarious ends involving the 2016 presidential election — that people are starting to grasp that. Only now are some users, finally, beginning to balk at the notion that social media is, by and large, a safe place that we control.

In the wake of the Cambridge Analytica scandal, calls to #DeleteFacebook are suddenly swamping the internet. Facebook's market worth has plunged by tens of billions of dollars. Its founder, Mark Zuckerberg, has been forced to apologize. Congress may call him to the woodshed on Capitol Hill.

Even Elon Musk, a fellow visionary with whom Zuckerberg has jousted at times, deleted the Facebook pages for Tesla and SpaceX.

A Reckoning for Facebook

This may be the first true large-scale reckoning for the information age, a 21st-century problem screaming for an immediate answer. Woodrow Hartzog, a professor of law and computer science at Northeastern University and an affiliate scholar at the Center for Internet and Society at Stanford Law School, has a suggestion.

The idea, Hartzog says, is to carefully rethink and redo the fundamental agreement between social media users and platforms like Facebook. Any new arrangement, he says, should include a binding legal commitment that those platforms won't — we'll use some internet parlance here — screw you.

"When someone solicits our personal information, which is exactly what social media platforms do," Hartzog says, "then we trust them with our information. And they should be required to keep that trust."

We should trust Facebook not to screw us? Since when?

"They should be required to be discreet with our information. They should be required to be honest with us about what they're doing with that info — and that means more than just baring it in the fine print or on some privacy switch that no one's ever going to find," he says. "It means being very honest and making sure to dispel any misperceptions that we might have. They owe it to us when they solicit our trust to protect our information, to make sure that it doesn't get hacked, to follow up when they share information with third parties to make sure that they're treating it appropriately. If it's been de-identified, they need to keep it de-identified.

"And then, most importantly, in the modern data age, I think that when companies ask for our trust, they should be loyal to us," Hartzog adds. "In other words, they should not elevate their own interests, or the interests of a third party, over our interests. Because they're the ones that asked for our information."

That, of course, is all very noble, quite consumer-friendly, probably even the right thing to do. But Facebook is a business, and for a business, trust can be hard to monetize. Such businesses might be even harder to regulate, as Hartzog suggests.

"Trust isn't a concept that is foreign to the law. We have regulatory regimes that exist like this," he insists. "Your accountant owes you those sorts of duties of trust. Your doctor owes you those duties of trust. The mass collection of information is becoming such a powder keg that I think that platforms should be required to give that same duty of trust to us."

Cambridge Analytica Fallout

Facebook found itself in this quagmire after journalists discovered that a Cambridge University researcher, Aleksandr Kogan, developed an app for Facebook — a "personality quiz" — then shared the information gathered from that app (against Facebook policy) with the data firm Cambridge Analytica. The quiz harvested information not only from the people who downloaded the app, but also from those users' Facebook friends, who never did. In all, Cambridge Analytica mined data on some 50 million Facebook users, many of whom had no idea it was happening.

The worst part, according to news reports, including reporting from the U.K.'s Guardian: That data — which, according to Hartzog, can include such seemingly esoteric minutiae as what a user buys online, which sites they visit, even how long a cursor hovers over a certain link — was used to create psychological profiles of millions of Americans and then target those users with political ads designed to influence their votes. Among Cambridge Analytica's board members at the time was Steve Bannon, the one-time chief strategist to now-President Donald Trump.

In short: Information on millions of Facebook users was pilfered without their knowledge and then used to try to get them to vote a certain way.

Those of us still naive enough might ask how such an invasion of privacy could happen. The answer lies, among other places, in all those boxes left unchecked (or checked) when you download an app, in the 50th paragraph of every end-user agreement that goes unread but gets accepted, and in every mistaken assumption that these companies — including platforms like Facebook — won't screw you.

We're led, by these companies, to believe that we have control over what information we share.

In short again: Ha!

Privacy Shouldn't Be Just an Illusion

"The problem is that the control that people are given is an illusion, either because they are made to think that they have more control than in reality they have," says Hartzog, whose book "Privacy's Blueprint: The Battle to Control the Design of New Technologies," comes out later in 2018, "[or] the other part of the illusion is that sometimes we're given so much control we drown in it."

Pushing that illusion, of course, are platforms like Facebook, which do so for the very simple reason that your information is valuable. Cambridge Analytica proved that.

"They have every incentive to exploit as much data about you as possible," Hartzog says.

Facebook's Zuckerberg addressed the uproar in a post on Facebook using — perhaps strangely, maybe encouragingly — much the same language that Hartzog uses:

"This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that."

The ultimate answer — perhaps the only one — may lie in the regulation that Hartzog and others are calling for: laws that strictly govern how these sites gather and distribute the information they solicit from users, and that require them to be transparent about it. With proper regulation, social media users may regain some sort of control over their personal data. They can reclaim, perhaps, some of that privacy that everybody loves but few have.

And, maybe, we can all start to trust the internet.
