Germany goes up 1-nil over Facebook in the User Privacy Game–Futurum Tech Podcast Episode 031
by Daniel Newman | February 8, 2019

On this edition of the Futurum Tech Podcast, German courts push back against Facebook. Will privacy protections finally begin to find fertile ground in 2019? Sprint sues AT&T over its questionable use of the term 5G. A recap of Twitter’s quarterly numbers. This week’s IoT fail of the week. Jeff Bezos’ very strange week, and the future of privacy. Those stories and more coming up on this episode of FTP.

Our Main Dive

Germany’s Federal Cartel Office (FCO) has taken aim at Facebook’s business model, using antitrust regulations to prohibit the social-media giant from forcing users to grant unrestricted access to their online data (including Facebook linking users to 3rd-party data) in order to use the service. Is the “data for value, users are the product” business model on the line?

Our Fast Five

We dig into this week’s interesting and noteworthy news:

  • Twitter’s Q4 Revenue Up, User Growth Down
  • IBM’s Think 2019 Preview
  • Remotely Defrosting IoT Refrigerators via Default Passwords
  • Why AT&T’s 5G Isn’t
  • Rumor Alert: Amazon’s HQ2 Plans for NYC are at risk

Tech Bites

Jeff Bezos and a serious case of oversharing.

Crystal Ball: Future-um Predictions and Guesses

How real is the “transparency in tech” move, and can the US ever catch up to the EU’s user-first approach to data?


Daniel Newman: Welcome to this week’s edition of FTP, the Futurum Tech Podcast. I’m Dan Newman, and I will be your host today. I’m joined by the one and only Olivier Blanchard. Olivier, welcome back to the show.

Olivier Blanchard: It’s good to be back.

Daniel Newman: He gets a laugh out of me when I do my radio voice. Fred McClimans won’t be with us today. Fred is head down buried in a deadline and hopefully he’s kicking its butt. Olivier and I are here and we’re excited to be here on this and every Friday as Futurum does our best to tell the world what’s going on in the high tech industry and cover the stories that you maybe heard about and maybe didn’t, finding some you know and some you don’t.

Just so you know, we are going to be talking about a whole bunch of companies that are publicly traded. While this show does cover those companies and sometimes even talks about their earnings and stocks, we are not a show designed to provide any sort of guidance for you to make financial investment decisions. So don’t do it.

All right, moving on. This week’s main topic, which is near and dear to my heart as it’s something I find myself constantly responding to and constantly tweeting about, is the subject of data privacy. Big news came out this week: Facebook in Germany has hit a major wall. Basically, their whole ad model is getting shut down, as it has been decided that the data they are requiring people to provide is out of bounds and not within what Germany considers an acceptable exchange of data for service.

Olivier, I know you dove deep into this, and the second the story came out, you dropped into our Cisco Teams room and you said something like, “If this isn’t the main dive, I don’t know what is.”

Olivier Blanchard: Yeah.

Daniel Newman: While, yes, we do cover a lot of tech in the industry that goes beyond those – what do you call it? – FANG companies, these companies just consistently, every week, suck us back in by having the big headlines about them. So first of all, Olivier, what are your thoughts on this whole thing? I mean, over the last several weeks we have been talking so much about how regulation has got to come around on this and how these companies are out of control.

Our main topic last week was almost that as a whole. How much power do these companies actually have?

Well, here we go. Government’s taking its power back. Is this what you were talking about? Is this what you were thinking? Is this fair?

Olivier Blanchard: Yeah, yeah. Pretty much. Yes, yes, and yes. So we’re also going to, in this segment, I think address the flip side of that coin where companies actually get organized to help establish frameworks and guidelines to regulate certain aspects of their industry and don’t wait for governments to intervene. So that’s … we’re going to get into that in a few seconds with AWS and Google I believe as one example.

In this particular case, I think that Europe, and particularly Germany, is kind of pointing the way, first with GDPR, which established a pretty clear framework of consumer protections and data privacy protections in Europe. But also, within the context of GDPR, this ruling brings us two connected but different dimensions to the Facebook problem.

The first one, in my interpretation of the court’s ruling, is that it establishes that Facebook has tremendous market power, with, I think, roughly 80% of the market share for social platform usage or engagement in Germany, and that Facebook is different from YouTube or Twitter, et cetera, in that it’s almost a necessity nowadays for people to use it to share pictures, to stay in touch with each other, to be able to interact with one another.

Facebook, in the court’s opinion I think, used an unfair or advantageous market position to require users to use its platform – or rather, to agree to its terms in order to be able to use its platform – and the court deemed that unfair. It deemed that to be an abuse of its market power. So there is that: it’s not necessarily just an attack on the model itself, but an attack on the fact that there’s not really much choice for consumers if the only choice is either do what we tell you or don’t use our platform at all. So that was the first thing they aimed to address.

The second element of this, the second dimension, is that the court doesn’t really agree with Facebook that the cross-platform bundling of data is necessary for Facebook, the platform itself, to function properly for users. The kind of cross-platform data collection that it engages in is not in consumers’ best interest. So those are the two core decisions that make up what happened in Germany and that are causing Facebook some pretty serious problems now.

Daniel Newman: Yeah. So you have these particular rulings. We’ll see if Facebook, like Apple, just does what they want to do anyway after a ruling is made that’s not necessarily in their favor. I mean, these things are also not simple, right? You can’t really just be like, “Oh, okay. We’re just going to suddenly turn off Germany.” It’s complex and there’s coding involved and there’s design involved and there’s experience involved.

See, Facebook’s built a platform that’s somewhat universal. So to suddenly … basically you’re going to have to – what – remove the “create an ad” button altogether just for anybody who’s operating within a certain location? Then you still have issues with VPN access, because people could just be using a VPN but still be in the country, and then you’ve got all kinds of creator issues. So if someone creates an ad using a VPN that says they’re in England but they’re actually in Germany, then whose fault is that?

So it’s going to be really tough to enforce this en masse, but the fact that they’ve attempted to enforce it I think is a great first step, because I don’t think what they’re ultimately trying to do, Olivier, is get Facebook to stop doing ads. I think they’re trying to say, “You need to start to think about how to make it accessible for people in markets where privacy and data are more sensitive, and give a way to create ads with less info. Can you minimize the information required to something more like a typical advertisement placement, like it would have been in the age of the yellow pages or the billboard, where you don’t necessarily have to give them the names of all your children, your firstborn, your listening data, your messages, all the other things that Facebook wants from every single person that uses the platform?”

I don’t know if you saw this, too, but I believe Germany – or was it another EMEA country? – was also trying to propose some type of injunction against the combination of data between Instagram and Facebook. Was that Germany as well?

Olivier Blanchard: Yeah, yeah.

Daniel Newman: So Germany is having a full on … they’re unleashing a full-on war against Facebook this week.

Olivier Blanchard: There are different layers to this. So the easiest one for Facebook to address is the disclosure aspect of it, right? It’s basically letting customers know, “Look, this is the type of data that we collect on you,” and making it, I would assume for the Germans, making it more conspicuous and clear for consumers to be able to access that where it’s not buried in fine print and 35 pages of terms of service.

So the disclosure aspect is probably something that Facebook could address and fix relatively quickly. Giving consumers a choice once they’re given this information, and allowing them to opt out of certain types of data collection, is the next step. That would be more difficult, I think, for Facebook to enact, and definitely something they would fight in the courts a little bit harder.

Then the third aspect of that is the possibility that Facebook may have to kind of completely recode its advertising model because it may no longer be allowed to basically take data from different collection points and bring them together under one roof. The German courts may require Facebook essentially to stop bundling that data collection under one roof. That would be a technical problem for them as well as a huge revenue, I would say, impediment for the future of Facebook.

Daniel Newman: Well, think about it, Olivier. They would basically be eliminating the data that allows their whole advertising system to work. So like I said, even if they could recreate it, they would still have a challenge because all these advertisers, especially the local, right? It’s a restaurant. It’s a bar. It’s a grocery store. It’s a retail apparel provider. It’s at the mall, wherever. In Germany, so much of why it worked was because they could get so local. They could understand behaviors and they could actually target people.

Well, that’s kind of predicated on being able to track all that behavior and get all that data and then target to those people. So the disclosure sort of leads to … it almost forces companies to go back to do some marketing on their own instead of just buying the audience all the time and buying that hyper targeted audience. I just wonder how the market is going to respond, because like you said, it’s going to be a huge revenue impediment, because if people don’t feel they can … I can see you shrugging at me, but if people don’t feel like they can actually reach the people they want to reach, are they going to continue to pay the money to try to reach them?

Olivier Blanchard: I don’t think … I mean, I think it would be unfortunate if in an attempt to kind of protect consumers, the German courts actually ended up harming consumers. Here’s what I mean by that. You know, you and I have discussed many times this notion of big brother, big mother, and big butler, right? We talk about … actually, we have a new book coming out later this year that focuses on this that’s on the future of human-machine partnerships.

I was just writing about this issue this morning and yesterday, and essentially it just boils down to this. The same technology can be used in a big brother mode where it’s used against you. It’s invasive. It’s used to spy on you and surveil you and collect data on you and you don’t really have a lot of control over that.

The same exact technology can also be used for the benefit of mankind in a big butler capacity, where essentially, by collecting that data, platforms like Facebook or Google or whomever can help consumers create efficiencies in terms of time and search, and essentially steer them automatically towards the types of products that they actually want and away from the products that they don’t want – showing them ads and offers when they are most likely to be receptive, as opposed to bombarding them with spam and ads and marketing that they don’t want.

So there’s actually a potential benefit to all of this data collection all of the time and being able to get local with it. So we have to kind of strike a balance there, and I think that the right balance, whether the Germans will be able to achieve it or not, is first of all transparency and choice. It’s giving consumers all of the tools and all the transparency they need to make the decisions they want.

Then the second part of that is giving consumers an opt-in and opt-out capability at all times that gives them agency and control over, A, the data that’s collected on them, and B, when and how that data can be used. In that way, they can essentially create, using the same platform, an environment in which these technologies and this data collection help them improve their lives as opposed to being exploited by advertisers.

So we have to find that right balance, which is why I think it’s so important for tech companies to be involved in this discussion, and consumer protection groups to be involved in this discussion and not leave it simply to the courts and to legislators who may not be as well versed in all of the minutia of these types of problems as they should be.

Daniel Newman: Yeah. I mean, I think in the end, people will always exchange data for value. So that’s the problem. The big question mark really comes back to transparency. It’s not so much about privacy, because a lot of people are willing to trade it. It’s just about the fact that companies have become very, very good at masking what they’re doing in the name of gaining more control and getting more data than people think they’re giving.

Like I said, people don’t really care. I think simple terms are possible. This is also a problem with lawyers. I mean, let’s face it. The lawyers advise the companies on what to do here. They write it into the Ts and Cs, and they know that if they write a 400-page terms and conditions document, no one’s going to read it. So even people who are very, very pro-privacy tend to read very few of the terms and conditions for apps they sign up for. Who has the time for that?

Olivier Blanchard: Well, yeah, but that’s … you know, it’s kind of a cart and horse issue. Why do the terms of service have to be so nebulous? Why do these business practices have to be so hidden and opaque and shady? If you’re not doing anything that could possibly harm or exploit your customers or consumers … let’s remember that users of Facebook are not the customers of Facebook. The advertisers are the customers of Facebook, not the users. The users are actually the product. Any business model that tries to hide what it’s doing or conceal in some way or misrepresent what it’s doing for its users, to its users, about its users is by definition deliberately kind of engaged in a process of hiding something. That’s not good.

So I’ll give you an example that we discussed earlier. Apple, we like to bash Apple a lot these days because we’re disappointed by a lot of the things they do, but in this instance, Apple actually did something good this week. They’re telling app developers to remove or properly disclose their use of analytics code that allows them to record how a user interacts with their phone app.

So I don’t know if you’re aware of this, but a few days ago we found out that certain apps – like FaceTime, for instance – were able to kind of access your phone without necessarily your knowledge, whether it’s your camera or your microphone, and other apps, while they were on, were also giving access to your screen to third parties.

So Apple has decided, proactively, to tell app developers to either remove those features or disclose them to their users. That’s exactly what we’re talking about. Here’s a tech company that’s saying, “Okay, you use our products. This is what we’re going to do. We’re going to be transparent about it, or you’re not going to be playing in this field.” I think that’s the right spirit. It’s the right direction that I’d like to see Facebook take. So far I have not seen Facebook take it. I think that’s why the German courts are being so harsh on Facebook, and why Google should also be careful, because they could very well be next.

Daniel Newman: Yeah. It definitely seems like there are three or four constant batters coming up and getting balls chucked at them. It’s the Facebooks, Googles, Amazons. They’re all just … because what they do is like you said. It just borders on invasion of privacy to an extent that isn’t going to fly in places where that is a priority to people.

I do think one thing to point out about the Apple news, like you said, which again I thought was positive as well: they actually had apps that allowed for … I think it’s called Glassbox, I believe. It actually allowed them to capture the screen of the app you were in. It didn’t just …

So I wrote a while ago about an Uber instance where they had a “God mode” and could actually capture people’s screens in and out of the app, which was just an unbelievably bad decision by Uber. This particular thing was more limited: they could see what you were clicking and how you were using the application, which again is quite a bit of privacy invasion, but you know what? If it’s in the app, as Fred – who’s not on the show – said to me so wisely, “Well, they could easily just overlay it. As long as they’re capturing your touches, which they’re totally able to do, they can overlay the screen and see it anyway.”

The problem, and why Apple got really hot about this, was that the app everyone was using wasn’t necessarily covering up sensitive data like passport numbers and credit card numbers. So what happened was, I think, Air Canada got hacked, and they were using that app. Then all of a sudden, in the hack, there was a whole bunch of that data available.

So just kind of a note of how this stuff all ties together. It starts off with something fairly nebulous and ends up with something that’s super sensitive data being stolen, but this is all really interesting. I want to end this segment with Cisco.

So the CEO this week. Actually, I tweeted about it earlier, but Chuck Robbins basically came out with a tweet this morning that said, “Cisco calls for comprehensive US federal privacy legislation that aligns us with other data protection systems around the world.” So as you can see, tech companies are starting to take a stand, Cisco being more of a B2B company and not one that’s necessarily as actively involved in using data to drive user growth or selling that data. You’re seeing the technology ecosystem step up, and furthermore, I believe Microsoft and Amazon have now both joined in some sort of request for more legislation and governance around facial recognition. Given that both of those companies are involved in technologies that can create and utilize facial recognition for things like law enforcement, it’s good to see even the companies who stand to benefit from privacy being bought and sold – is that a fair assessment? – getting involved and saying, “Hey, let’s regulate that a little bit more.”

So there’s some encouraging news out there. Apple is taking a little bit of a stand with its threat to pull apps. Cisco is stepping up. So if nothing else, they’re giving it lip service. Now the real question is going to be action. Let me come back to that later.

We got to move to our Fast Five. Olivier, it’s you and me today, so we’re going to have to do a little extra work.

Olivier Blanchard: I know.

Daniel Newman: Go ahead, though. Give me your first.

Olivier Blanchard: Well, my first is about Twitter, and it’s kind of a dual Fast Five. The first part of it is that Twitter just reported their Q4 revenues, and they were awesome: $909 million, up 24% from a year ago. So that’s pretty excellent.

User growth is a problem, though, and I’ve identified a new trend in tech companies’ quarterly reports: if there’s a data point that used to be relevant (and I believe is still relevant today) but is not trending well, they just stop reporting it altogether.

So we saw Apple stop reporting on the number of iPhone shipments. They don’t count those anymore. They don’t report the actual numbers. They just report the dollar amounts now. Twitter has decided to stop reporting monthly active users because that number has been in kind of a free fall for several months now.

What they will do, however, is continue to report daily active users. But the monthly active user number being as bad as it is, they just stopped reporting it. It’s something that I’m seeing across the board, whether it’s Twitter, Apple, Facebook, Google. Just something to look out for, for your own amusement: when data points that are not trending well start disappearing altogether.

Daniel Newman: Yeah. You’re seeing that, right. It’s kind of like taking Edward Scissorhands to our financial reports: we’re going to shape and mold this thing, and we’re only going to give the story that we think the investors want to hear, because they’re such a fickle group, aren’t they? I’ve watched Apple’s market cap grow $100 billion over a few weeks when nothing has actually changed. They’ve done nothing to materially improve or hurt their situation from the 35% or whatever that it fell from the trillion, but it’s just like, “Okay, investors are back on board,” I mean, just for no reason.

So we’re seeing some definite landscaping, manscaping going on with some of these numbers. Next week I’m heading out to IBM Think 2019. Think 2019 is in San Francisco this year. It is IBM’s biggest marquee event. They used to have lots of different events for partners, for users, and for technologies, but now they’ve really boiled it down to this one massive event.

It’s everything. So it’s blockchain, cloud, mobile, services. I’m really interested this year in a couple of things, though. I want to hear, one, about Red Hat. I want to hear what’s going on after making the $20+ billion acquisition of Red Hat. I am really interested in what’s going on with IBM Cloud. I follow Azure, AWS, and Google Cloud very closely, and we’re not hearing a lot about IBM Cloud. So what’s going on there? Is it something that is making a play, or are they going to settle into a niche? Is that more what their thing is?

The third one, which is a little bit of a smaller group, but a lot of people don’t realize that IBM’s mainframe business is actually the largest profit generator in the company. It’s a huge profit center: IBM Z. They had a little bit of a fall at the end of last year in their numbers, so what I’m really interested in is what’s happening with it going forward. No doubt, especially with all the privacy and data security issues that people are having right now, mainframes are coming back in vogue. So while people don’t think about it a lot – they think about the cloud – it’s still a big thing.

Then the final one: services have always been at the core of IBM. I’m kind of interested in what they are doing, how their services are expanding, things like AI and blockchain and Watson. What’s going on with these particular services? Where are the real-life business cases really taking off? We heard about Watson going back to the Jeopardy days, but I still think, while they put so much energy behind the brand, we’ve still only seen it reach the beginnings of the potential that it has, especially considering the investment that they made.

So Olivier, I’m going to punt it back your way. I believe you have an interesting IoT story.

Olivier Blanchard: Yeah, I think this might be the IoT fail of the week. TechCrunch reported earlier today that thousands of industrial refrigerators – the refrigerators used by restaurants and hotels in commercial applications – can be remotely defrosted thanks to default passwords. So essentially, all you needed was a URL and you could basically just go in, select particular connected refrigerators, and turn them off or play with their thermostats.

The reason why is, first of all, because the system wasn’t super well protected, but also because all of the units shipped with default passwords, and a lot of the operators of these refrigerators failed, for whatever reason, to change them. So a chunk of these refrigerators, because of these connected systems, were still set to the default passwords and therefore vulnerable to a very, very simple, low-tech hack of the refrigeration.

So that’s just kind of … it’s kind of funny. At the same time, it’s a pretty big deal because of the potential scale of the problem, and it also helps highlight that in a day of increasing connectivity and Internet of Things integration, we need to be a little bit more diligent about the security of the items that we are responsible for.

Daniel Newman: Well, it’s a good thing they didn’t have a centralized network of all of them so if you got to one you could get to all of them. I mean, it sounds like any sort of damages that were really created would have been somewhat in isolation because …

Olivier Blanchard: No, no. You kind of could. No, you could basically go to a certain website that essentially is kind of like the portal for these systems, and just, you know, find the devices and select them and do whatever you wanted with them.

Daniel Newman: That’s hilarious. Well, would you be surprised if a bunch of restaurateurs didn’t realize that their refrigerators were IoT-enabled and remotely controllable, and didn’t have IT? I don’t know. I think that’s one of those things: when you sell somebody a very, very expensive refrigeration system, you should probably give them a little run-through of some of its capabilities.

Olivier Blanchard: One of the problems with this is, first of all, you could obviously hurt specific restaurants and create some widespread foodborne illnesses. But at the same time, some of these refrigerators are used by laboratories, not just restaurants. So imagine a scenario in which a lab that’s developing cultures or keeping DNA samples for crime unit processing might have been affected by this. So everybody really needs to go into their refrigerator settings if they have a connected refrigerator, for any reason – restaurants, laboratories, hospitals, whomever – and change their password or create one.

Daniel Newman: Yeah, absolutely. So I got one here, and this one’s been on my mind for a while. I’ve been giving AT&T a raft of crap about their 5G Evolution network, because I think that they’re completely manipulating unknowing customers into believing that they have some sort of superior solution for the market that the rest of the carriers do not. It turns out the rest of the carriers agree with me on this.

Over the last few months, T-Mobile mocked AT&T by putting a physical sticker on top of their phones to pretend they are 5G, and Verizon wrote a letter to AT&T. But this week, Sprint decided to sue AT&T in federal court. My opinion, which in the Fast Five we sometimes don’t give, but I’m going to give it here: good. Because 5G is a standard that is not yet available in any sort of wide offering to consumers, and people who are buying phones from AT&T believe that they are getting some sort of better solution, and it’s just pure manipulation. It is a reason that the FTC exists. Is that the FTC or the FCC? It’s a reason one of those two exists: to make sure this stuff does not happen.

Olivier Blanchard: Both of them.

Daniel Newman: Yeah, I guess both of them have a play in this. Furthermore, I believe they even came to an agreement to put that 5G Evolution indicator on Apple phones, which, as you and I both know intimately, do not support 5G, will not support 5G, and do not even have the technical capability to support an actual 5G network. So this is going to be a really interesting one to follow. AT&T is digging in. They’re saying they think they branded it well enough to differentiate themselves from what regular 5G is. They’re talking about a movement, but I think putting 5G really big with an E next to it does nothing to help the average person, who does not understand telecommunication standards, understand that what they are getting is not any better than the enhanced LTE networks being offered by all of their competitors. AT&T, you almost made our Tech Bites, but I had one that bit just a little bit more.

Olivier Blanchard: Yeah, and let’s also just bring up that in its claim, Sprint pointed out that they did a survey about this. They found that over half – 54% of consumers – believe that 5G E is either the same as or better than 5G, even though it has nothing to do with 5G whatsoever. So their argument will probably be that AT&T is deliberately trying to mislead consumers into buying products and services as if they were 5G or even better than standard 5G products and services, when in fact they are not. So I think they have a pretty good chance of winning that court case.

My last Fast Five for today is a rumor. So it’s not necessarily an actual news item, but the opposition and disgruntlement over Amazon HQ2 in New York is kind of reaching a boiling point, and it’s getting so bad that apparently Amazon may be reconsidering its decision to build a major headquarters in New York. They have not as of yet purchased any land. They have not made a decision one way or the other. This is purely conjecture, but the Washington Post is reporting that there are internal talks to maybe or maybe not build that headquarters in New York after all. So we will see.

Daniel Newman: Wow. Well, that would be crazy after all that time and talk. Who knows? Amazon got a lot of good PR out of that, so why not do it one more time, right?

Olivier Blanchard: Yeah, it’s still better than the Foxconn plant in Wisconsin that’s being built and not really being built, and maybe it’s not a plant. Maybe it’s a plant.

Daniel Newman: It’s a little bit like the wall. I think it’s already done.

Olivier Blanchard: Right?

Daniel Newman: Who knows.

Olivier Blanchard: It’s like, yeah. Just don’t build it. Tell people it was built and they’ll believe it.

Daniel Newman: It depends which channel you listen to and where it’s actually at, since we have no more media consensus. That’s for certain.

Speaking of media consensus: Jeff Bezos. Wow. Where do I start? Amazon typically gets a pass on most things as being cool, upfront, progressive. We’ve forgotten their Fire Phone, because that was a massive fail, but otherwise: the Amazon Go stores, the Whole Foods acquisition, the largest eCommerce company on the planet, momentarily the most valuable company on the planet.

Pics of his wiener and other intimate photos that he was sharing with his mistress or one of his mistresses, now on the heels of …

Olivier Blanchard: Allegedly. Allegedly. Allegedly sharing.

Daniel Newman: Yeah, we should say allegedly. Allegedly. But this goes back to a squabble between Bezos, who now owns the Washington Post, and the National Enquirer. We don’t have enough time on this show to possibly dig into all of the back and forth there, but it’s essentially conservative media versus liberal media perceptions of those two outlets. It’s two powerful CEOs, plus the president, at odds over things. But when I say Tech Bites, I guess it’s sort of when it’s someone that you really admired. It’s a little bit like, Olivier, you and I having Tesla’s CEO in Future Proof as a future-proof CEO, and then you hear one thing after another and you go, “It’s really hard to put him in that bucket of people who follow all those concepts that we spent so much time investing in and sharing and building great culture.”

Well, Bezos has been someone who’s always been in that column, too. He’s someone who started an office in his tiny garage, ran every package to the UPS store himself, and turned that into a fortune of many billions. Look, people are human, and kind of like Elizabeth Warren making a mistake 40 years ago on some paperwork, we can only hold it against someone for so long, but I did. I said this to you in a message last night. I’m like, “But how successful do you have to be to realize there is never a good time to send a picture of your privates?”

Olivier Blanchard: Between 3:30 am and 5:00 am.

Daniel Newman: What do you think about this, man? Because it really bites to think of this happening, but at the same time, like I said … I also thought, by the way, it was really brave and smart that he came out, because you can’t blackmail somebody that puts it all out there. If there’s nothing to keep, you can’t blackmail them, which is essentially what the Enquirer was trying to do.

Olivier Blanchard: Right. Yeah, I mean, his reaction to it, now that the damage is done and the deed has been performed and clicked and sent, obviously he made the right move. It’s great to essentially have the, first of all, the freedom and the courage to go ahead and say, “No, blackmailer, I’m not going to let you have any power over me. So I’m going to go out and get ahead of this and share it on my terms.”

I think, yeah, it’s a little bit ironic. I mean, on the one hand, yes, it is disappointing when you look at somebody like Jeff Bezos who has a lot of admirable traits and accomplishments that we try to, I guess, kind of lionize on this show and use as an example. It’s disappointing to see somebody like that fall from grace a little bit.

At the same time, it’s kind of nice to know that they actually are human and they do make stupid mistakes like this. It’s especially ironic, when we talk about privacy and technology and the role that Amazon itself plays in these discussions, right, whether it’s AWS or anything else that we’re talking about technology and privacy and consumer rights and the struggle that we have to maintain that privacy 24/7 when we’re surrounded by technology and technology embeds itself into everything that we do.

Here we have the most powerful, the richest man in the world, in charge of the most successful technology company in the world, getting caught by traditional media. Mind you, the Enquirer has been around a lot longer than the internet. Sending dick pics, excuse my French, but that’s the technical term these days, to his alleged girlfriend. Now he’s been exposed; his privacy has been shown.

Daniel Newman: Literally.

Olivier Blanchard: Yeah. Literally. So I don’t know. I haven’t seen any photos and I don’t know that they’ve been released. I haven’t followed the story.

Daniel Newman: I don’t know if I would look at them if they were to be honest.

Olivier Blanchard: Right.

Daniel Newman: I’m not that interested.

Olivier Blanchard: So there’s another aspect of privacy, which is you don’t have to look at the pictures when they’re released, when they’re published, either. That’s a choice that you can make as well, but I think it speaks to the vulnerability that we all experience, whether we deliberately send nude photos of ourselves to other people or whether those pictures are taken of us without our consent or without our knowledge through various devices.

If Jeff Bezos cannot protect himself, whether it’s because of bad decisions or bad IT security, then we’re all vulnerable. That’s a discussion that we should also have, I think.

Daniel Newman: Yeah, I just have to say, as a whole, probably what bites most for me is thinking that there’s nobody who is above making horrible decisions when it comes to their love lives, no matter how smart they are, and there’s no amount of technology … there needs to be an app for that. You know, it’s kind of like the ideas I’ve heard about a drunk texting app. You breathe into the phone, the phone somehow knows you’ve been drinking, and then it asks, “Are you sure you want to send that? Are you really sure? How sure are you?” Before it lets you send a picture, machine learning could be like, “Yep, that’s definitely a dick picture. All right, let’s prompt a few more questions before you hit send. Do you realize that 80% of the time, this ends really badly?”

Olivier Blanchard: I think that’s a million-dollar idea. Let’s do it.

Daniel Newman: I just feel like …

Olivier Blanchard: Let’s create a GoFundMe page and create that app.

Daniel Newman: God, I can’t believe it hasn’t been created. It’s just one of those areas where humans are so deficient. For the sake of our book, Human/Machine, this is an area where machines can help. There’s a lot of things like … I watched the human-machine comparison of Messi playing soccer and a robot trying to go four yards dribbling the ball and then kicking it into the goal. You realize humans do a lot of things that robots cannot do. But a robot could detect that that is a bad idea.

Olivier Blanchard: I may insert that into our book.

Daniel Newman: It may at least need to be like a shadow box story, like an app idea. We need to trademark the idea, because you and I don’t have the intelligence to build the app.

Anyway, so let’s go on to the Crystal Ball. We talked a lot about data privacy and transparency on today’s show. This is a little bit broader and more subjective, but in a few minutes at most, Olivier, how real do you think this whole movement for transparency really is? So look in your crystal ball: when do you think the US and broader global legislatures and governing bodies are going to really get behind this, if ever? Or are these moves really more just posturing and power and pendulum? Talk about that.

Olivier Blanchard: You know, I think that Europe has much stronger consumer protection laws, and even just culturally speaking, because of the structures of government. It’s not a question of socialism; it’s just that the structures of government are so much more embedded in rules and regulations that govern pretty much everything in people’s lives. I think that Europeans expect more of their government in terms of regulation. They expect better controls. They expect more safety guidelines for this kind of stuff to kind of force companies to stay in line.

In the United States, we have the very opposite culture, legislatively and in terms of our, I guess, fervor for all freedoms at all times no matter what. That precludes us from looking to government for solutions. So there’s always going to be resistance on the business front, and even on the public front, to the risk that government will overregulate something or set too many rules because they’re anti-business and anti-progress. So I think that we can have realistic expectations of strong privacy and strong transparency laws regarding technology companies in Europe in the next decade.

I think that in the US, it’s going to be much harder going, much slower. We’ll make progress, but we’ll continue to struggle with this 10 to 15 years down the road. I think we’ll still be talking about this long after Facebook is no longer a successful company.

Daniel Newman: Maybe they will or won’t be. That could be a Crystal Ball question for another day. Will Facebook always be successful or when does that come to an end?

Now I’m going to chime in on one thing. I think transparency and privacy are going to be enterprises’ next corporate social responsibility. It’s a little like how sustainability became a big thing: “We’re a sustainable company. We’re a global company. We are a giving company, a charitable company.” Right now it’s, “We’re a data protection company. We value our customers’ privacy,” and I think that’s going to be a selling point. I think governments are going to use it to some extent, depending on which side of the pendulum, from liberal to conservative, people stand and the parties tend to stand …

I think one party in each case, like the US being a two-party system, will really grab onto it. Right now, I can see that more likely being the Democrats, just because of their youth and interest in serving the more common people. Again, that’s not me taking a political stand. I’m just saying, I haven’t really ever heard the Republican Party do much outwardly about their tech interests other than protecting the national interest of companies here. So they don’t talk much about broader technological issues, and this is kind of more global than local.

As a whole, my gut feel is it’s a corporate social responsibility story. That’s what it’s going to be. As for whether they actually care or believe it, I have my doubts. I do believe it’s going to be a very popular trend over the next two to five years, and we’re going to see it become a very hotly debated topic. You’ll even hear more about it in the next round of elections. We’re going to hear more about it in the news, in the press, and in the media, because companies are all starting to come out and take a stand on data privacy, which you’d think would have happened a long time ago, but really, right now it just seems to be coming to a boil.

Olivier Blanchard: I think that the discussion may begin to diverge or change a little bit from just data privacy to privacy on one hand and trust on the other. Privacy is very subjective. What we consider private is not so much a matter of what a company determines is private. It’s what we determine is private. Take our nudity, for example.

So let’s take Jeff Bezos. What is private 99% of the time might, by the user’s own decision, not be private for 30 seconds, or not be private between him and a certain number of friends. So if privacy is essentially a decision made by the user and not the company, then in terms of policy for a company, for an industry, we’re not really talking about privacy anymore. We’re talking about, A, data protection, and, B, trust.

I think that trust is a much stronger way of communicating what we’re talking about for companies. It’s a value proposition. It’s a brand attribute. At some point, because emotions work better than logic, and they’re simpler to understand and get behind, I think that what you will see is companies not so much talk about the privacy element, which is a technology element, but about trust. Can you trust us? Are we a trustworthy company? We want you to trust us.

Right now, the difference between a Facebook and a Google and an Amazon is not so much the legislation, obviously, and it’s not the technology. It’s how much we trust, or don’t trust, these separate companies in different ways.

Daniel Newman: Well, you know what they say: don’t trust your eyes and ears anymore, just trust us. Listen, in this age, in this day, in this time, trust is everything, but it’s always about finding those tangential relationships. In this case, privacy and transparency. But you made a good point there as you stared deeply into the eyes of our crystal ball.

Olivier Blanchard, thank you so much for joining me today. I want to thank everybody out there for jumping into this week’s edition of Futurum Tech Podcast. We appreciate you staying with us, listening, subscribing, joining our community, talking to us on Twitter, and following all of our research at

For Daniel Newman and Mr. Blanchard, we are out of here. We’ll see you next week.

Disclaimer: The Futurum Tech Podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such. 

About the Author

Daniel Newman is the Principal Analyst of Futurum Research and the CEO of Broadsuite Media Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.