Mark Zuckerberg’s keynote kicking off Facebook’s F8 developer conference was all about privacy. More than a little ironic, all things considered. For a company that has been the focus of one scandal, privacy issue, data breach, investigation, and fine after another over the course of the last year alone, a pivot to privacy is, well, interesting.
In fact, there were some downright awkward moments for Facebook employees and developers when Zuckerberg attempted to joke about privacy. Like this line: “I know that we don’t exactly have the strongest reputation on privacy right now, to put it lightly,” which was met with crickets from the audience. But really, what do you say?
But the privacy talk is just a shiny object, while the fundamentals of Facebook remain the same. Facebook and Mark Zuckerberg’s idea of privacy may not be the same as everyone else’s.
Business Insider summed up Facebook’s privacy changes using a great analogy:
“Imagine you’re having a chat with your closest friend, while a businessman sits in the room wearing earplugs. The man assures you that he can’t hear what you two are saying — but he is remembering everything about where you are, who you’re talking with, what you’re wearing, and more, and then using this information to try to sell you stuff for weeks afterwards. That’s not privacy in any traditional sense of the word.”
Analyst Take: Zuckerberg’s central theme at Facebook’s F8 conference was a “privacy-built platform.” That was a mistake. We should have been talking about a *trust-built* platform instead.
There’s no privacy without trust. And so far, Facebook hasn’t done anything to earn trust. “Privacy” features without trust mean that the next 20 Facebook data/privacy scandals will be just like the previous 20.
“Privacy” was a cheap pivot for Facebook, perhaps even a lame attempt to deflect attention. The platform’s privacy problem is a symptom of a bigger cultural problem at the company, not its core problem.
Privacy has nothing to do with fake news, for instance, or with fraudulent ads, hate speech, misinformation, and hostile influence operations. A focus on privacy doesn’t fix any of that.
In fact, an emphasis on “privacy” risks making all of those problems worse for users of the platform: instead of taking place out in the open, a “private” (opaque) social network will make those activities more difficult to spot and root out. Facebook appears to be moving the user experience in a more dangerous direction—one in which interactions on the platform will be more hidden and opaque, and in which Facebook will give itself license to be less transparent and accountable outside its ecosystem of walled gardens.
There was also no concrete evidence that Facebook is effectively catching and removing fraudulent ads, fake news, and misinformation from its platform.
I feel even less comfortable about Facebook’s direction today than I did yesterday, and that isn’t a good thing.
Futurum Research provides industry research and analysis. These columns are for educational purposes only and should not be considered in any way investment advice.