Many of you have asked for my opinion on the Facebook and Cambridge Analytica data scandal and what it means for the future of Facebook, so here it is:
No company can survive very long without its customers’ (or users’) trust. Banks, insurance companies, retailers, hotels… it doesn’t matter: Lose your customers’ trust, and they will take their business elsewhere. Facebook is no different.
If you are somehow unaware of the growing scandal involving Facebook, Cambridge Analytica, and how the personal data of 50 million Facebook users was allegedly harvested by Cambridge Analytica, either with Facebook’s help or under Facebook’s nose, here are a few articles that will help you get caught up:
(Just remember to come back once you’re done reading.)
Facebook’s troubling history of data privacy failures –
Now let’s take a stroll down memory lane, thanks to Vanity Fair’s Nick Bilton:
Back in his Harvard days, Zuckerberg told a friend that he could use The Facebook (as it was called back then) to find out anything on anyone. “I have over 4,000 emails, pictures, addresses, SNS,” Zuckerberg proudly wrote to his friend. “People just submitted it. I don’t know why. They ‘trust me.’ Dumb fucks.”
To be fair, Mark Zuckerberg isn’t a teenager anymore. He is now in his thirties, and people tend to mature as they age. Unfortunately, though, I see little indication that his attitude toward Facebook users’ expectation of privacy has changed since his Harvard days. For example, in 2011, Facebook’s behavior forced the Federal Trade Commission to go after the company for allegedly engaging in deceptive practices with respect to behind-the-curtain access to user data. Facebook and the FTC agreed to settle, but the FTC obtained some concessions from Facebook. Per the FTC’s statement of 29 November 2011:
“The social networking service Facebook has agreed to settle Federal Trade Commission charges that it deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public.
“The proposed settlement requires Facebook to take several steps to make sure it lives up to its promises in the future, including giving consumers clear and prominent notice and obtaining consumers’ express consent before their information is shared beyond the privacy settings they have established.”
Specifically, the FTC complaint listed seven promises that Facebook allegedly made to users but failed to keep:
- In December 2009, Facebook changed its website so certain information that users may have designated as private – such as their Friends List – was made public. Facebook didn’t warn users that this change was coming or get their approval in advance.
- Facebook represented that third-party apps that users installed would have access only to the user information they needed to operate. In fact, the apps could access nearly all of users’ personal data – data the apps didn’t need.
- Facebook told users they could restrict sharing of data to limited audiences – for example with “Friends Only.” In fact, selecting “Friends Only” did not prevent their information from being shared with third-party applications their friends used.
- Facebook had a “Verified Apps” program & claimed it certified the security of participating apps. It didn’t.
- Facebook promised users that it would not share their personal information with advertisers. It did.
- Facebook claimed that when users deactivated or deleted their accounts, their photos and videos would be inaccessible. But Facebook allowed access to the content, even after users had deactivated or deleted their accounts.
- Facebook claimed that it complied with the U.S.- EU Safe Harbor Framework that governs data transfer between the U.S. and the European Union. It didn’t.
Per the settlement, Facebook was barred from “making any further deceptive privacy claims,” and was required to obtain its users’ consent before changing the way it shares their data. The settlement also required Facebook to obtain “periodic assessments of its privacy practices by independent, third-party auditors” for the next 20 years (until 2031).
Let’s play a timeline game:
Step one – Draw a straight line between Mark Zuckerberg’s “I don’t know why they ‘trust me.’ Dumb fucks” Harvard days in (presumably) 2004 and the Facebook settlement with the FTC in 2011.
Step two – Now draw another straight line between Facebook’s settlement with the FTC in 2011 and the current (2016-2018) scandal involving the misuse of user data by Facebook and/or a third party (Cambridge Analytica).
Look at your timeline. Notice how the straight line connects 2004 to 2011, and 2011 to 2018? The social network that Mark Zuckerberg built still appears to have the same problematic attitude toward its users’ data after 14 years. Why should a reasonable person expect Mark Zuckerberg or Facebook to change on this issue now?
So here we are: No company can survive very long without its customers’ (or users’) trust. Banks, insurance companies, retailers, hotels… it doesn’t matter: Lose your customers’ trust, and they will take their business elsewhere. Facebook is no different.
A quick recap, as I see it:
- Facebook has shattered consumer trust for going on ten years.
- Mark Zuckerberg has undermined that trust for going on fourteen years.
- Facebook will not get that trust back unless it takes drastic measures to earn it again.
How Facebook may yet survive this in the long term –
The only two things that may be keeping Facebook from collapsing under the weight of a scandal like this one are 1) the absence of an alternative social network for fed-up Facebook users to flock to, and 2) habit. This leaves Facebook vulnerable to collapse via mass user exodus the moment a reasonable alternative becomes available to Facebook users (provided Facebook continues to fail its users with regard to their data and privacy). Any way you look at it, the clock is ticking: Whatever Facebook intends to do, it needs to do it fast, before an enterprising startup gains the initiative and capitalizes on Facebook’s failures.
At this juncture, the only things that may yet save Facebook before someone comes along to scoop up its users are:
- A Zuckerberg exit.
- A complete rebuild of Facebook’s data security apparatus.
- The introduction of a user data ‘bill of rights’ not only as a cosmetic mantelpiece but as a company core value.
Why a Zuckerberg exit? Because he may be at the heart of the problem. A CEO who has demonstrated an unwillingness and/or inability to take user data privacy seriously for the company’s entire history probably cannot be trusted to ever create a company culture that will prioritize user data privacy. Translation: If he stays, Facebook is unlikely to change the way that it needs to. It’s a matter of faith, I suppose: Do I have faith in Mark Zuckerberg’s ability to change with regard to this issue? No. I don’t. Here’s why:
- The consistent pattern of behavior.
- If Cambridge Analytica hadn’t gotten caught, and if the scandal hadn’t splashed into Facebook’s pond, would Mark Zuckerberg be speaking about what happened and what he knew?
- It took five days from when the scandal first broke for Mark Zuckerberg to finally speak up about Facebook’s role in it. Count them: Five.
So no, I have no faith whatsoever in Mark Zuckerberg’s ability to lead Facebook in its efforts to adequately protect user data.
Why a complete rebuild of Facebook’s data security apparatus? Because, best case scenario, per Mark Zuckerberg’s own version of events, it is clearly not robust enough to adequately protect user data from third-party access and abuse.
Why a data bill of rights? Because without it, Facebook and its users will never have a clear understanding of what to expect and how to address data privacy investments and policies. Just changing the TOS isn’t enough. Facebook needs to make this a cornerstone of its business model.
Hypothetically, could Facebook get away with delivering on the second and third measures but not the first? Maybe… but I don’t think Facebook can fix its problem without delivering on all three, and it must solve this problem if it is to go on.
So what is next for Facebook? –
Is Facebook dead? No. And despite the current #DeleteFacebook campaign, it won’t die anytime soon either. But the writing is on the wall: Facebook must either get its house in order or suffer the fate of every company that fails to read its surroundings and adapt to changing conditions. Unless Facebook makes fundamental changes to its leadership, culture, and business model, its demise isn’t a question of if but of when.
And if you don’t believe me, here are five indicators that Facebook’s future may not be as bright as Mark Zuckerberg’s VR avatar may want you to think:
1. 24-and-under user erosion: Facebook is bleeding younger users (a vital demographic for advertisers). New York-based eMarketer reports that fewer than half of Americans aged 12-17 will use Facebook even twice per month in 2018. Furthermore, eMarketer predicts that Facebook will lose 2 million users under the age of 25 in 2018. (They are moving to Snapchat.) Facebook’s monthly user growth is coming from older users rather than from younger users. In other words, Facebook is no longer cool, and that is never a good sign of things to come.
2. Facebook stock’s relative fragility: In spite of having done fairly well in the past few years, Facebook’s stock took a significant hit this week, and some analysts are starting to ask themselves what it may mean for Facebook for the rest of the year. (Read William Watts’ take over at MarketWatch for a quick but thorough analysis.) Points of fact: Facebook’s stock dropped 9% in just a couple of days (through Tuesday) following weekend media reports about the Cambridge Analytica scandal, wiping out roughly $50 billion in market value since Friday. Shares of Facebook did rise 2.5% Wednesday, so the bleeding may have stopped for now… at least until the next scandal involving Facebook’s inability or unwillingness to protect its users’ data and privacy.
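For the numerically curious, those two figures hang together: a back-of-the-envelope calculation shows what pre-drop market cap a 9% decline erasing ~$50 billion implies. (The implied market-cap number below is derived from the article’s two reported figures, not an official valuation.)

```python
# Back-of-the-envelope check: if a ~9% drop wiped out roughly $50 billion,
# the implied pre-drop market capitalization is value_lost / drop_fraction.
value_lost = 50e9      # ~$50 billion in market value erased (reported)
drop_fraction = 0.09   # ~9% decline through Tuesday (reported)

implied_market_cap = value_lost / drop_fraction
print(f"Implied pre-drop market cap: ${implied_market_cap / 1e9:.0f}B")  # ≈ $556B
```

The result, in the mid-$500-billion range, is consistent with Facebook’s valuation at the time, so the two reported numbers pass the sanity check.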
3. The FTC is getting involved again. As expected, the FTC is back. As noted earlier, the agency already had to deal with Facebook’s mismanagement of its users’ data in 2011. I don’t see how the FTC having to step in a second time can possibly end well for Facebook. Will Facebook play ball? Yes. Sort of. But we’ve been here before. How did that work out?
4. European regulators are getting involved as well. If you are a regular Futurum Insights visitor, you may have come across our coverage of European regulators slapping hefty fines on US tech companies whenever the opportunity arises. Facebook just sent them an open invitation to do just that. It has only been a few days, but we are already seeing signs of movement:
- Per US News and World Report: “Facebook’s lead regulator in the European Union, the Irish Data Protection Commissioner, is ‘following up’ with the U.S. internet giant to ensure its oversight of app developers’ use of its data is effective, her office said on Tuesday.”
- This case is sure to exacerbate Facebook’s ongoing problems with German and French regulators specifically, and their scrutiny of Facebook’s data practices.
- Giovanni Buttarelli, the EU’s data protection supervisor, just called on authorities from across the EU to form a joint task force to look into the possibility that Facebook and Cambridge Analytica broke EU data protection laws.
- The timing couldn’t possibly be worse as EU leaders are having a little get-together this week.
- This issue is going to snowball into a major set of problems for Facebook.
5. Facebook’s ambitions in China will be impacted by this scandal. Let’s leave that discussion for another day, but here’s some light reading for you from Forbes, the Wall Street Journal, and the New York Times in the meantime.
However you look at it, 2018 is going to be an interesting year for Facebook.
Look for follow-ups to this story in the coming weeks. Something tells me we’ve only scratched the surface.