Worthwhile Explainers On the Facebook/Cambridge Analytica "Fiasco"

I put fiasco in quotes because this situation is all about context. Political campaigns have been using variations of this strategy for a while. By "this strategy," I mean mining social media accounts and using that information to influence voters. (But no, the Obama campaign did not deceptively ask people to take a quiz and then use that information.)

The sad fact is that privacy in the digital world is a complicated, nuanced issue. Many of the services you have come to rely on to get through the day depend on your personal information -- where you're going, what you're cooking for dinner, how much you ate during the day, how much you exercised.

To protect your privacy, and to have the right expectations, you need to understand how each app works, how it uses your data, and how it secures that data. You need to be mindful of your settings. You need to be told what might happen if you do take a quiz on Facebook, for example, or that if you leave certain Facebook settings in place, your travels across the web will be tracked by Facebook. At the same time, if every app you use disclosed everything at all times, we might end up in a situation like California's, where signs warning that various things may cause cancer are posted everywhere -- and I end up ignoring them. I suppose the difference is that I can't change "the settings" on the paint regime at a car dealership, for example.

Institutions such as Stanford Law School, the Federal Trade Commission, or the Electronic Frontier Foundation should make a concerted effort to establish a privacy literacy initiative that explains the basics of how various apps work. Companies should join that effort to re-establish trust and boost confidence in their products.

In the meantime, Daphne Keller, Director of Intermediary Liability at the Stanford Center for Internet and Society, has this useful Q&A about what the issues are in the Facebook/Cambridge Analytica debacle.

While I'm on the subject, John Battelle has this excellent explainer on Facebook's business model and what's wrong with it. What's even better is that he has a solution that proposes to restore the values of the original open Web to the platform. The key is giving individuals more control over their own data. Others have been proposing their own solutions to this issue for a while.

As my former editor Micah Sifry puts it, action needs to be taken to change the system because:

"Privacy, as Edward Snowden has eloquently argued, is the 'fountainhead of all other rights.' It is 'the right to a self [and] what gives you the ability to share with the world who you are on your own terms.'

"If we don’t insist on a digital public sphere that treats the information of individuals as private by default, we will just be rats in a maze built and owned by a few digital wizards and their investors. If we want a way out of this mess, it starts by recognizing that we have to remake the Internet back into a public square owned by us."

"Fake" News

There's a real obsession out there with "fake" news.

Here are the two best thought pieces that I've read on the matter:

  • MIT Media Lab's Ethan Zuckerman: "Stop Saying 'Fake News.' It's Not Helping."

    "Immediately after the US election, “fake news” emerged as a major story, a partial explanation for Trump’s surprise electoral victory. Within a week, I’d been invited to four different conferences, brainstorms or hackathons to combat fake news, done a dozen media interviews and briefed the heads of two major progressive foundations on the issue. Fake news was a problem for American democracy and progressive leaders were on it! ... "

    Zuckerman argues that we should build strength into our institutions, such as the mainstream media -- that is, when mainstream outlets make mistakes, those mistakes should be corrected. And we need more trustworthy, diverse voices.

    I would argue that we need a better educated public as well. But then danah boyd argues that "media literacy" educational efforts might have backfired.

  • NYU's danah boyd's "Did Media Literacy Backfire?"

    "Addressing so-called fake news is going to require a lot more than labeling. It’s going to require a cultural change about how we make sense of information, whom we trust, and how we understand our own role in grappling with information. Quick and easy solutions may make the controversy go away, but they won’t address the underlying problems."