
Always On Surveillance By Way Of That Phone In Your Pocket–Futurum Tech Podcast

This week on the Futurum Tech Podcast, we tackle the subject of data privacy, data security, and surveillance. Are we drifting towards a dystopian future in which privacy is dead? Or do we correct course and make data work for us again? This and more on this week’s FTP.

Our Main Dive

For the main dive in this week’s episode of the Futurum Tech Podcast, we talk about always-on surveillance by way of the mobile device that’s in your pocket. The New York Times, in an investigation by The Times Opinion staff called The Privacy Project, dug into the smartphone tracking industry, and the results are, well, alarming to say the least. The reporting team of Stuart Thompson and Charlie Warzel obtained a file that contains over 50 billion location pings from the phones of more than 12 million Americans as they went about the ordinary business of living their lives. The file can pinpoint the exact location of a single smartphone over a period spanning 2016 and 2017. The reporting team spent months going through this data, tracking the movements of individuals in key U.S. cities, and speaking with attorneys, academics, and technology experts who study this field, and, of course, with data companies.

If you live in one of the cities this dataset covers, and if you use apps that share your location (a local news app, a weather app, Google, Foursquare, Facebook, or maybe even a coupon-saver app), you could be in this treasure trove of personal information, too.

The reporters describe this data set like this: “If you could see the full trove [of data], you might never use your phone in the same way again.”

The data came from a location data company and was provided to The Times Opinion staff by anonymous sources who said they were concerned about how this information might be abused. They felt compelled to inform both lawmakers and the public about this massive personal privacy breach and the challenges it presents for society as a whole. The sources wanted to remain anonymous because they didn’t have authorization to share the information and, of course, there might also be penalties (like lost jobs) for doing so. Our conversation in the podcast revolves around this data set, whether it matters, and how privacy is (maybe) becoming a thing of the past—for all of us. You can find the original article by The Times Opinion staff here, and it’s definitely worth your time to read further when you have an opportunity: Twelve Million Phones, One Dataset, Zero Privacy.

Our Fast Five

We dig into this week’s interesting and noteworthy news:

  • Another Facebook data leak. 267 million users, mostly in the US. My colleague Olivier Blanchard discusses yet another Facebook data leak that has quite likely affected just about everyone. Or as he says: there are only 219 million adults 18+ in the US, so when you do the math, basically everyone’s data was exposed. Here are the specifics:
    1) 267,140,436 records were exposed and shared on the dark web
    2) The database included IDs, phone numbers, and full names of the users, mostly based in the US
    3) The database was live on the web for a two-week period before it was shut down
    4) The risk to unwitting Facebook users is that this data can easily be used in phishing or spam messages
    Do we even notice anymore? Will Facebook ever be held accountable for negligence?
  • Apple, Google, and Amazon partner on a project, which doesn’t happen every day. Fred McClimans shares news of a partnership between Apple, Google, and Amazon and their efforts toward creating an open-source smart home standard for connected home technology. Not only is this standard intended to ensure that devices work together across a smart home environment, it’s also intended to make the development of new devices easier and, most importantly, to keep everything secure in the process. For consumers and developers, as well as the smart home industry in general, that’s a good thing.
  • A surveillance net blankets China’s cities, giving police vast powers. I’m fascinated by China’s ongoing efforts to monitor, track, and control its citizens. Using a combination of facial recognition scanners, fingerprint databases, phone scanners, and 24/7 video monitoring, Chinese authorities are keeping close tabs on the country’s 1.4 billion citizens. And privacy? That’s laughable. The country’s surveillance networks are controlled by local police, who in many instances are parking data on servers that are wholly unprotected. In some instances, personal data gathered by these tools is accessible to private contractors and other middlemen, and not only can it be used by the government to track and monitor citizens, it can also be used by large companies. Over a four-day period in April, systems installed in one apartment complex gathered data on the complex’s inhabitants and visitors, a perfect example of what life in China is like. The article, A Surveillance Net Blankets China’s Cities, Giving Police Vast Powers, is fascinating. If you’re intrigued as a result of our conversation during the podcast, definitely make time to read further.
  • Is bias inherently a given when it comes to facial recognition algorithms? You bet it is. Go figure. It appears we inadvertently landed on a theme for this week’s podcast: surveillance, privacy (or the lack thereof), and the far-ranging implications they present. Futurum’s Fred McClimans shares a new federal study by the National Institute of Standards and Technology (NIST), which revealed that algorithms currently sold on the market can misidentify members of some groups up to 100 times more frequently than others. Bottom line, even the most “advanced” facial recognition systems can’t (yet) be trusted. Read more at The Verge: A federal study of top facial recognition algorithms finds ‘empirical evidence’ of bias.
  • Apple has a secret team working on satellite technology that the iPhone maker could use to beam internet services directly to devices, bypassing wireless networks, according to people familiar with the work. Futurum’s Olivier Blanchard reports that Apple has a whopping twelve (or so) folks working on the satellite technology project with the goal of deploying their results within five years. Could it be that data will be beamed directly to your device instead of routing through carriers? Could it mean more precise location tracking and other feature improvements? Olivier, as always, has thoughts on this one.

Tech Bites

How easy is it to fool facial recognition software? Easier than you might think. Researchers with an AI firm were able to fool facial recognition software at mobile payment kiosks (here’s looking at you, WeChat and Alipay), as well as at a Chinese border patrol checkpoint and at a passport control gate at Amsterdam’s Schiphol Airport. That said, they were unable to fool Apple’s and Huawei’s facial recognition systems.

Crystal Ball: Future-um Predictions and Guesses

What’s ahead for consumers as devices and location tracking technology become more sophisticated and more widely used? Tune in to the podcast to hear what we think the future holds.

Transcript:

Shelly Kramer: Welcome to this week’s episode of the Futurum Tech Podcast. I’m your host Shelly Kramer, and I’m joined today by my fellow analysts Olivier Blanchard and Fred McClimans. Hello, gentlemen.

Fred McClimans: Hello, Shelly.

Olivier Blanchard: Hey.

Shelly Kramer: Welcome. So before we get-

Fred McClimans: Hey? That was it, Olivier? Hey?

Olivier Blanchard: That’s it. That’s it.

Shelly Kramer: Hey.

Fred McClimans: Okay.

Shelly Kramer: He’s succinct. I like it.

Olivier Blanchard: It’s Friday. It’s the last Friday of the year sort of, almost, not quite.

Shelly Kramer: Before we get started, I need to share a disclaimer that this podcast is intended for informational purposes, sometimes educational purposes. We almost always talk about companies, sometimes very large publicly traded companies. We are not giving investment advice. So please listen to this podcast with that in mind, and off we go. So we are going to talk today about some kind of creepy topics, and for the first one, the main dive, we’re going to focus on an article that came out in the New York Times talking about always-on surveillance by way of the mobile device that’s in your pocket. In brief, in an investigation called The Privacy Project, a team of reporters at the New York Times dug into the smartphone tracking industry. And to say that the results are alarming is really kind of an understatement.

The reporting team of Stuart Thompson and Charlie Warzel obtained a file that has over 50 billion location pings from the phones of more than 12 million Americans as they really went about the ordinary business of living their lives. By the way, they focused on some key major cities, as you might imagine: New York City, Washington D.C., some other large cities. I will tell you we’ll hyperlink their report in the show notes to this podcast. But really, as I was reading this and as I was preparing for this show, watching some of the visualization data here, just seeing where some of these pings are and how they’re able to track people in sort of a visual way, is really more than a little alarming. So back to the back story here. This data file can pinpoint the exact location of a single smartphone, and this particular report covered a period spanning 2016 and 2017. The reporting team spent months going through this data, tracking the movements of individuals in key cities and then also talking with attorneys and academics and technology experts who study this field. And of course they spoke with data companies.

By the way, if you happen to live in one of the cities this data set covers, and if you use apps that share your location, say for instance your local news app, a weather app, who doesn’t use a weather app? Google. Foursquare. Facebook. Or maybe even your favorite coupon saver app. You could actually be in this treasure trove of personal information.

The reporters described this data set like this: “If you could see the full trove of data, you might never use your phone in the same way again.” The last point I wanted to share on this before I hand it over to Fred and Olivier to talk a little bit more about is: Where did this data come from? It came from a data location company, and it’s probably a company that most of us haven’t ever heard of.

It was provided to the Times Opinion staff by anonymous sources who said that they were concerned about how this information might be abused. They felt compelled to inform both lawmakers and the public about what they felt was a massive personal privacy breach, and about the challenges it presents for society as a whole. These sources wanted to remain anonymous because they didn’t have the authorization to share this information and, of course, I’m sure there could well be penalties like lost jobs for sharing it. So anyway, with that, what do you think, Olivier?

Olivier Blanchard: I think it’s the new normal. We can look at this as kind of a dystopian, sci-fi adjacent future or we can look at it the way I look at everything technology-wise which is: Is it big brother? Big mother? Big butler? There’s always that prism that you can look at it through. The big brother version is the truly dystopian and scary version in which we’re living in a surveillance state. All of our moves and all of our actions are tracked by a government or corporate entities that are not super friendly, like for instance China. What’s happening with this technology in China is really frightening. And you’ll see the same sort of thing happening in Saudi Arabia and other places where the government isn’t necessarily of the people, for the people, by the people.

Shelly Kramer: Shutting off the internet.

Olivier Blanchard: Right, right. Exactly. But at the same time I can put on my marketer hat and think, “Wow, this is the kind of information that, as a benevolent marketer with the right kind of ecosystem, the right kind of apps and permissions, would be very useful in terms of understanding consumer behaviors; and even specifically for clients with particular companies, whether they’re Walmart or Apple or whomever else, understanding specific customer behavior and being able to track them individually through a mall or through a city center to see how and when and how often they visit certain stores.” Likewise, I think it’s really important when it comes to smart cities and smart infrastructure.

We have this very old tool called the census where we kind of figure out, “Okay, there’s X amount of people and these are what these people kind of fit into these categories, so let’s assign a certain amount of resources to serve them.”

But I think that this kind of technology, this kind of surveillance, can also be used to micro-target areas and actually become more specific and more efficient with city planning and resource planning. To me, it’s easy to fall into that “this is scary and this bothers me” reaction, but at the same time, there are other dimensions to this. As voters and citizens and denizens of a still semi-free democracy/republic, the more we have these discussions, the better. We could still steer this in the right direction to where it doesn’t become exploitative and where it actually can be very beneficial to us.

Shelly Kramer: Well, I think that’s really it. Having investigations like this and having conversations like this are the only way we get there.

Olivier Blanchard: Yeah. And also I think it needs to be opt-in. Right now it’s not. It’s kind of like we have to suffer it. It’s happening in the background. We’re not aware of it. And in a really free market, you have really good information flow, and we obviously don’t, because this stuff is hidden behind a curtain. So this doesn’t feel like a free market. It doesn’t really feel like we have a choice. That’s one of the problems I have with it right now.

Shelly Kramer: Fred, what do you think? I know you have interesting thoughts.

Fred McClimans: Well, I do. I’m going to put those aside for a moment and talk about this subject here. This is, I think, so far beyond the concept of Big Brother that 1984, the fantastic novel and film and so forth, was based on. This is something where, to borrow a Happy Days Fonzie phrase, we’ve jumped the shark.

We know that companies and organizations cannot protect user data. They simply don’t have the ability to do that effectively today. Part of that revolves around their desire to keep data tied to a particular user. That’s where it has value. If it’s completely anonymous, it loses some value. So companies are not anonymizing data. They’re not necessarily encrypting data. They’re basing too many things on, “Well, we secure our data. Oh, yeah, but the user that is creating this data, their password and their user ID and their email address may be hacked over here and linked to us there. But that’s not our issue.” We saw that recently with Ring, where a particular household had an outsider accessing their Ring camera, and Amazon, parent of Ring, said, “Hey look, that’s not our bad. Everything is secure. Their email and their password were hacked at some other site, not us.” But you get into this whole cycle where the system is fundamentally broken. Like I said, we know organizations can’t protect our data. We also know that companies often act in an unethical way.

We recently did a survey here at Futurum of 2,000 global organizations. These are organizations, companies, brands, governments. We asked a really simple question: Do you ever implement a technology knowing that it’s not really safe, that the user data isn’t really protected? A surprising number said, “Yes, we do, for various reasons: a competitive edge, time to market, revenue goals.” They push that boundary. They push that limit. We know that they can do that successfully and still be viable companies because the penalties, the fines, the slap on the wrist, are simply not adequate to act as an incentive to behave in an ethical manner. It’s almost to the point where it’s a given, particularly in the western markets and parts of Europe to an extent, but certainly in North America, that companies, even when you pay them for a service, feel that they have the legal right to your user data, to monetize it. I have no problem with data monetization, provided it’s done ethically. But they assume that fundamental right to it.

At a certain point, I think we all need to collectively stand up and say: if you, the organization, the company, the brand, or the government, feel that you have the right to that data, well then we as consumers should be able to say, “Hey, we have the right to your data: how you’re using that data, and what are the nodes out there in the network that are providing this technology, this feature, this service, this value to us?” It has to be a bidirectional exchange if they’re going to do it at all. I’d love to think that we can get to the point where data can be effectively monetized and consumers can benefit from that, maybe financially.

There are a number of ways that blockchain and certain other technologies could actually provide some way of tracking your user data: who is using it, how it’s being used, that audit trail of data ownership. Right now, we are so far off the rails as a global society that I really do have some major concerns about how this gets abused, particularly when they may say, “Well, look. It’s just a dot on the screen.” Shelly, to your point, in the New York Times article they had some great visualizations. Here are all the thousands of people around this particular location. But they isolated one dot in Central Park and said, “Okay, now we’re going to track that one dot.” And they tracked it all over the place. With that tracking information, it’s pretty easy to figure out who that person is.

Shelly Kramer: Well, they did.

Fred McClimans: And they did.

Shelly Kramer: They figured out who a lot of those people are.

Fred McClimans: And when you start adding that to facial recognition tools, when you start adding it to all the data that’s out there from various sources and the data aggregators and the data markets that are reselling all of this data, all of a sudden it becomes painfully clear that not only have we traded privacy for value of some sort, but that exchange is incredibly imbalanced and, I think, fundamentally pretty unethical.

Shelly Kramer: Well, absolutely so. I’m relatively certain that most consumers have no idea. For instance, location data has been sold for years to third parties by companies like Verizon, AT&T, The Weather Channel. When you stop and think, well, who would want that? How would that be valuable? This data is being sold because it provides tremendously critical intelligence for a lot of different businesses. It might be used by hedge funds. It might be used by real estate investment companies. There are so many other applications. Financial institutions buy this kind of data all the time. Really, they might pay more than a million dollars for a data set like this. So companies are selling this data, and other companies are buying and using this data and tracking us. By the way, this data can be taken further than these reporters took it: it can actually be tied to user IDs associated with a mobile phone.

There’s supposedly an anonymous identifier, about 30 digits long, that allows advertisers and other businesses to tie activity together across apps. It can be combined with other information like your name, your home address, your email, your phone number. It’s crazy. I’m usually the person saying what you’re saying, Olivier: “Get over it. It’s here. It’s happening. It’s nothing new.”

Olivier Blanchard: I didn’t quite say that.

Shelly Kramer: Well, encapsulated.

Olivier Blanchard: I’m about to, but yeah. Okay.

Shelly Kramer: But I do think it is … One of the things you see in this article is that they were able to identify a Microsoft engineer who one day made a visit to Seattle and happened upon the Amazon campus. The next month, he started a new job at Amazon. He was interviewed for this article. He said, “I’m not really surprised by this. But knowing that you can get all of this data and you can find this out about me, it’s a little weird.” Anyway, it’s a really interesting topic, and one that I’m not really sure what we can do about.
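(A quick editorial aside for technically minded readers: the “anonymous” identifier Shelly mentions is most likely the mobile advertising ID that iOS and Android attach to each device. Here is a minimal sketch of the kind of join that turns an anonymous movement trail into a named person; every identifier and record below is invented for illustration.)

```python
# Dataset one: "anonymous" location pings keyed by a mobile advertising ID.
location_pings = [
    {"ad_id": "AAAA-1111-hypothetical", "place": "office campus", "ts": "2017-02-03 08:14"},
    {"ad_id": "AAAA-1111-hypothetical", "place": "school parking lot", "ts": "2017-02-03 15:40"},
]

# Dataset two: another broker's profile file, keyed on the very same
# identifier but carrying personally identifying details.
profiles = {
    "AAAA-1111-hypothetical": {"name": "Jane Doe", "home": "123 Main St"},
}

# The join is a single dictionary lookup: the shared ID links the two
# datasets, and the movement trail now belongs to a named person.
for ping in location_pings:
    person = profiles.get(ping["ad_id"])
    if person:
        print(person["name"], "was at the", ping["place"], "at", ping["ts"])
```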

Fred McClimans: I think there are a number of things. And Olivier, I’ll abandon my soap box here in a moment.

Olivier Blanchard: Go for it. Go for it.

Fred McClimans: First off, there needs to be a change in the mindset. If consumers generate data, consumers should own the data. If they then want to sell that data, then whoever is buying it has to be in a position where the consumer feels that, yes, this organization is A) secure, B) going to use it ethically, and C) giving the consumer control over that transaction, not just on a once-sold data system … I was going to say database. No pun intended.

But on a data basis perhaps or a singular sale basis. If you think about the way the data is sold, there’s no litmus test for a consumer of that data, a consumer here being some business or organization. There’s nothing that says, “Hey, I’ve got my million dollars here. I want this data. Or maybe I’ve got my thousand dollars or my hundred dollars. I want this data.” There’s no checkbox somewhere that somebody has to actually say, “I’m going to use this data ethically.” There’s nothing that says they can’t resell that data and combine it with more. That’s sort of the data creep that comes into play here.
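(Another editorial aside: Fred’s earlier mention of blockchain and an audit trail of data ownership can be made concrete. Below is a minimal sketch of a tamper-evident access log, a hash chain in the spirit of a blockchain but assuming no particular platform; the accessor names, purposes, and record IDs are invented.)

```python
import hashlib
import json
import time

def make_entry(prev_hash, accessor, purpose, record_id):
    """One append-only audit record; it commits to the previous entry's
    hash, so rewriting history invalidates every later link."""
    body = {
        "ts": time.time(),
        "accessor": accessor,    # who read the data
        "purpose": purpose,      # the declared use
        "record_id": record_id,  # which consumer record was touched
        "prev": prev_hash,
    }
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {"body": body, "hash": digest}

def verify(chain):
    """Recompute every link; any edit to an earlier entry breaks the chain."""
    prev = "genesis"
    for entry in chain:
        recomputed = hashlib.sha256(
            json.dumps(entry["body"], sort_keys=True).encode()
        ).hexdigest()
        if entry["body"]["prev"] != prev or recomputed != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

# Two hypothetical reads of one consumer's record, then an integrity check.
log = [make_entry("genesis", "ad-network-a", "segment modeling", "user-123")]
log.append(make_entry(log[-1]["hash"], "hedge-fund-b", "foot traffic analysis", "user-123"))
print(verify(log))  # True; change any logged field and this flips to False
```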

I think there are things that we can do from a regulatory perspective that kind of lock that down and say, “Look, yes, we know Facebook, Google, YouTube, all these companies and a lot of other companies out there are going to take a major financial hit.” But the reality is we can’t go any further down this path. We have to make sure that we have the right regulations in place, that we have the right penalties for people that abuse that data or use it in a way that is not completely disclosed and understood by the consumer up front. We hear a lot of talk, especially in politics in the US today, about how the revolution is coming. We need a data revolution. We need a consumer-first revolution that puts the consumer at the front of this and in control. Otherwise, think about our kids. Think about our grandkids. Where do we draw a line here and say, “We as a civilization do not want to go down this path. We’d rather go down the safer path”?

Shelly Kramer: Yeah. One of the things that concerns me about this as it relates to the government is actually twofold. One is I’d be interested to hear how you two feel about your level of confidence in the powers that be as it relates to wading through a complex issue like this. I know that Senator Josh Hawley has kind of taken up arms in focusing on these companies. But in a lot of things that I see at a government level, I’m not convinced that there’s the technical expertise, the knowledge, the understanding of how pervasive some of these problems are or what could potentially be done about them. The other thing that crosses my mind on this topic is that we are living in a time where we are very pro big business and not at all pro consumer protections, across a wide … and that is not intended to be a political statement. That is simply an observation. You can make your own political assumptions. But this requires a people-first, protectionist mindset as it relates to consumers. I don’t feel like we live in a time where our government is really thinking that way or acting that way. Do you have a thought on that one, guys?

Fred McClimans: It’s funny, because at this point everybody’s data has been hacked somewhere. In an odd way … there was an article that I just read earlier this week about the number of TV viewers in the UK that are still watching on, as they put it, black and white tellies. It’s something like 7,000 people in the UK still using them. When I read that, my first thought was, “There’s somebody that’s not having their viewing habits tracked.” It’s sad to think that that was the first thing that came to mind.

Shelly Kramer: Maybe we need to go back to that, huh? Oh, no. Olivier, what do you think?

Olivier Blanchard: To go back to your question earlier about how much faith do I have in the powers that be to tackle these complex technology issues, unfortunately zero because for whatever reason, we seem almost entirely focused on I want to say cultural issues like abortion, marriage equality, what have you as opposed to focusing on building the infrastructure and the kind of economy that we need to build in the next 20 to 30 years. We’re not even thinking tactically. We’re not thinking about managing and building. We’re still having these stupid arguments about culture. Which, I mean they’re stupid in the sense that they’re not particularly useful. They’re not stupid in the absolute. I think they’re important and interesting debates to have, but they’re not what we should be focusing on for the most part.

Fred McClimans: Well, they’re not existential threats like this is.

Olivier Blanchard: I guess. But people act like they’re the most important thing in the world, and unfortunately we elect our representatives based on these kinds of entrenched, single-issue identity politics as opposed to looking for people who 1) understand the challenges that we face and 2) offer solutions and are having debates and discussions about how to tackle this. So for instance, especially in the United States … the whole world is like this, but the United States especially is such a pro big business country that you could argue America isn’t really a country. It is a business. It’s really up to the governments and to us, to our representatives, to take up that mantle and say, “Look, we want you to be super successful businesses of America, but at the same time, the government’s job is not to make it easier for you to be successful. The government’s job is to make sure that there is balance, so that you have all of the room, a huge sandbox in which to be successful, but at the same time the society that we build, the country that we build around you, is very functional and allows people to develop and grow and feed into that success even more.”

We’ve seen examples around the world like GDPR in Europe, which is I think a really good first step towards at least recognizing and trying to start enforcing data and user privacy laws. But I think that with the US, I’m not as scared and annoyed with this massive data collection scheme as you guys are. My main issue is that 1) we don’t have transparency, which is a very big issue, and 2) that we don’t have rules and I guess mechanisms in place to make sure that there is a balance that you were talking about between the benefits to companies as opposed to the benefits to people. Maybe it’s because some of the technology companies that I’ve worked with in recent years are so focused on using data for benevolence and really useful use cases. It’s really truly the smart city using these dots, this kind of pedestrian data and auto data to figure out how much traffic is where when and how to broaden sidewalks to make sure that they’re wide enough for the type of traffic that they get, that there are enough public bathrooms in a particular part of town or in a particular neighborhood to accommodate that many pedestrians, that sort of thing. I’ve seen that data and those huge data sets used so intelligently and so benevolently that I’m not as scared and as freaked out by it. I don’t care if my dot is followed.

In fact, when I’m in certain sensitive areas that could be terrorist targets, I don’t mind that the technology exists to be able to track five or six phones or five or six individuals through facial recognition in that crowd who might become a problem at some point. I’m willing to entertain that balance. But you guys are right in that the way that it’s done now, without these conversations, without this public discourse, without the public holding the government accountable for setting rules and kind of a framework and not having this in place is dangerous. We could very easily tip into a China scenario, which would be very, very unfortunate and very hard to get away from. The time is now, but nobody in politics is really having discussions. You’ve got Andrew Yang, who on occasion mentions things that are kind of tech forward. But for the most part, he’s the exception. He’s definitely not the rule.

Fred McClimans: He is. Olivier, we’re not far apart at all on this. I see the value of data. And I think there are ways that organizations can leverage that data and monetize it, either indirectly for internal performance or in a smart city mode as you mentioned, the sidewalks, the traffic lights, traffic flow, trash collection, lighting, all these things. It can be used effectively for that. But that almost requires a mode that we step into that says, “Collect it. Use it. Delete it.”

Olivier Blanchard: Finland mode. That’s what I call it.

Fred McClimans: Finland mode. Sure. And we’re just not there yet. We’re in that, “Collect it. Use it. Sell it. Store it. Put it somewhere where it’s not secure.”

Shelly Kramer: Not secure. And let anybody get.

Olivier Blanchard: We’re going to talk about that some more later. Yeah.

Fred McClimans: Yeah. Yeah.

Shelly Kramer: Well, a fascinating topic to be sure, and one that I am sure we are far from done talking about and thinking about. But with that, we are going to move on. Speaking of data leaks, Olivier, tell us.

Olivier Blanchard: Yeah. Another data leak. This time I think we can thank Facebook again. Apparently, this security analyst firm did a little bit of a study and found an unsecured database of Facebook user data. Again, unsecured database. Focus on unsecured. This one had approximately 267 million records.

Shelly Kramer: I think you did some math on this.

Olivier Blanchard: Yes. It was kind of interesting. I said 267,140,000 records, mostly in the US. The thing is, there are only 209 million adults 18 and over in the United States, and for Facebook accounts I think you have to be 13 or something. It seems that that number is pretty close to every single person in the US who’s on Facebook, give or take maybe ten or 12 million. It’s huge. The database included IDs, phone numbers, full names. It’s been shut down. It’s okay. But it was live for about two or three weeks or something.

So again, the question that I have is: on the one hand you have companies that collect all of this data, and that’s fine. I’m okay with it, again. But there’s a piece of this that just seems so grossly negligent, that every week or every couple of weeks there’s a massive data breach, and it’s usually the same few companies that are supposed to be the experts in data collection and data security. They’re always the ones who drop the ball, and there’s no accountability whatsoever. And it’s gotten to the point now where that story, which maybe two years ago would have been front page for the day, is page 16 now. We don’t care. It’s like, “Oh, another data breach. Blah, blah, blah.” It’s almost like we’ve just resigned ourselves to accept this. I think it’s unfortunate that we don’t hold these companies accountable for the negligence that exposes our data, and we don’t even ask our courts and governments, state, federal, and otherwise, to step in and create rules that would punish or at least penalize in some way companies that just don’t seem to care.

Shelly Kramer: The reality of it is I think that so many of these companies, they don’t really care about a hit to the reputation because this is in the news one day, out of the news the next day. So many companies from a security standpoint are protecting themselves with insurance policies that cover any financial ramifications so it doesn’t really matter. It’s too bad. Okay. On that really enlightening note, we are going to talk about Apple and Google and Amazon partnering on a project. Fred?

Fred McClimans: Yes. So in a subject that actually touches directly onto this data privacy issue-

Shelly Kramer: We have a theme.

Fred McClimans: We have a theme, yes. Apple, Google, and Amazon are teaming up, along with the other members of the Project Connected Home over IP initiative, which includes everybody from Philips with its smart home devices to a number of other companies, to basically come up with an interoperability standard for your smart home devices. You’ve got your Siri, Google Home, Alexa, and many other platforms around the world that are pretty much standalone islands. They’re silos unto themselves today. But they want to erase that. They want to make it easier for companies to develop systems that can work with any of these products and really open up the marketplace. They’re moving forward on this particular initiative, and they’re hoping to have a draft standard out for some kind of preliminary review sometime around the end of 2020. The idea behind this, I think, is a good one.

The one thing that I haven’t seen addressed in a lot of this is that aspect of security. They can talk about, “Well, we’re going to make the communication between the devices more secure,” but that still doesn’t address the underlying threat of somebody that happens to use the same email account for multiple things. It gets hacked. And somebody finds their way into the system. On the one hand, this could definitely improve the ability of smart home devices, the cameras, smart microwaves, et cetera to really jump to the forefront very quickly. But it could also be a potential security threat. It was kind of interesting that I was reading the article about this on The Verge. When you get to the end, there’s this nice little disclaimer that says, “By the way, The Verge collects your data about what you’re reading and all this nifty sort of stuff here.”

Ironically, one of the first areas that they’re looking to target in the standards is actually things like home cameras and home door locks. Just the idea of people having their door locks networked and whatnot, yeah, a little bit of an uneasy feeling there. But at least this is, I think, a good step in the right direction to kind of bring these companies together and to adopt some formal standards about how these systems work, how they interoperate, because the future of the home is a lot of these monitoring devices, a lot of these customer engagement devices, information portals, and security systems. So, a good step, but we’re still, from my perspective, waiting to see how they actually implement the security aspect of all of this.
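(An editorial aside on why an interoperability standard matters to developers: the sketch below shows the general idea of a shared device interface that any assistant could drive. The class and method names are hypothetical, not taken from the actual Project Connected Home over IP specification.)

```python
from abc import ABC, abstractmethod

class SmartLock(ABC):
    """A shared interface: controllers talk to this, never to vendor code."""
    @abstractmethod
    def lock(self) -> None: ...
    @abstractmethod
    def is_locked(self) -> bool: ...

class VendorALock(SmartLock):
    def __init__(self):
        self._locked = False
    def lock(self):
        self._locked = True   # would wrap vendor A's own radio protocol
    def is_locked(self):
        return self._locked

class VendorBLock(SmartLock):
    def __init__(self):
        self._state = "open"
    def lock(self):
        self._state = "shut"  # would wrap vendor B's cloud API
    def is_locked(self):
        return self._state == "shut"

def lock_up(devices):
    """Any assistant (Siri, Alexa, Google Home) could drive this loop,
    because it depends only on the shared interface."""
    for device in devices:
        device.lock()

doors = [VendorALock(), VendorBLock()]
lock_up(doors)
print(all(door.is_locked() for door in doors))  # True
```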

Shelly Kramer: Yeah. It’ll be really interesting to see for sure. It’s funny, for somebody who is so immersed in technology, I really don’t have a lot of smart home devices. I don’t have smart lights, and I don’t have a Nest, and I don’t have a Ring doorbell. I want people to just leave me alone. I don’t need all that stuff. So on the topic of spying or surveillance, I’m going to stay with our theme here. I’m going to talk about China and what China is doing from a surveillance standpoint. There are about 1.4 billion people who live in China. Of course, it’s a communist country, and we’re used to the reality that communist countries control their citizens. But what China is doing is kind of weaving together both old-school and state-of-the-art technologies, things like phone scanners, facial recognition cameras, face and fingerprint databases, all different kinds of things, really, to help them have complete control over their citizens. They’re using these to identify not just terrorists or people doing bad things. They want to use them to track everybody. The reality of life in China is that you don’t really have any personal privacy.

What’s even more interesting is that an investigation found that Chinese authorities actually put this personal data of millions of people on servers unprotected by even the most basic of security measures. That really gives a person lots of peace of mind, too. And here’s the thing: even though what China does from a surveillance-state standpoint is light years ahead of what we do and see or know about in the United States, they’re really kind of at the beginning stages of this, and what they’re doing is really kind of scary.

The article I read gave an example: the police arrived one day in April at an apartment complex in an industrial city in central China. Over the course of a couple days, they installed cameras and two small white boxes at the gates of the complex. Once they were activated, the system began to sniff around for personal data.

These boxes are phone scanners. They’re called IMSI catchers. They’re very widely used in the west. They collect identification codes from mobile phones. The cameras recorded faces. On the back end, this system then attempted to tie that data together. If a face and a phone appeared at the same time in the same place, the system then knew what belonged to who.

Over the course of just four days, those boxes identified more than 67,000 phones. The cameras captured more than 23,000 images. And from that, about 8,700 unique faces were derived. This is just one system, one of hundreds that are part of city-wide and country-wide surveillance networks. Basically, there’s nothing that happens in China that the government doesn’t have access to, that the government can’t see.
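(An editorial aside: the Times describes this matching step only at a high level. A minimal sketch of how face-to-phone co-occurrence matching could work is below; the window data, IDs, and threshold are all invented, and this makes no claim about the actual systems deployed in China.)

```python
from collections import defaultdict

# Each entry is one time window at one gate: the faces the cameras saw
# and the phone identifiers the scanner boxes heard.
windows = [
    {"faces": {"face-17", "face-42"}, "phones": {"imsi-900", "imsi-311"}},
    {"faces": {"face-17"},            "phones": {"imsi-900"}},
    {"faces": {"face-17", "face-99"}, "phones": {"imsi-900", "imsi-555"}},
]

# Count how often each (face, phone) pair shows up in the same window.
pair_counts = defaultdict(int)
for window in windows:
    for face in window["faces"]:
        for phone in window["phones"]:
            pair_counts[(face, phone)] += 1

# A face and a phone that keep appearing together are inferred to
# belong to the same person; everything else is coincidental noise.
THRESHOLD = 3
matches = [pair for pair, count in pair_counts.items() if count >= THRESHOLD]
print(matches)  # [('face-17', 'imsi-900')]
```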

By the way, you are now required in China to register your mobile devices. So if you buy a mobile device, you have to register it with the government so that they know that Shelly Kramer belongs to this phone. I just think it’s really interesting, and part of why this fascinates me is A) I pay a lot of attention to what’s going on in China because I think it’s fascinating, but also B) when we talk about, “Oh, we’ve jumped the shark,” and “Oh, this is just happening,” and “Oh, there’s good in it,” it’s super easy to turn the corner from the good to the really messed-up surveillance state. That’s what concerns me. Again, on that really wonderfully light note, I’m going to hand it off to you, Fred, to talk about the federal study of facial recognition systems.

Fred McClimans: Yes. So again with our theme today, we’re going to talk about some of the issues with surveillance technology, and in this case facial recognition technology. We’ve all heard a lot in the news over the past year, two years, about how there may be bias in some of the facial recognition systems that are out there. In fact, from a larger perspective, facial recognition technology and its processing are very much based on some of the emerging AI, or artificial intelligence, technologies. We know that there are different ways that bias can creep into an artificial intelligence system. You can have the bias that’s injected by the developer, in the way it’s designed. You can have bias that is created in the tool itself by the data that is used to train the system. Then, of course, there’s bias in the way it’s applied or misapplied. But in this particular case, we have had some very notable instances of facial recognition systems from very notable companies that have not worked as they were advertised, Google and Microsoft being a couple of examples there.

In this case, NIST, the National Institute of Standards and Technology, set out to figure out just how well these facial recognition systems perform and whether there is bias in there. They tested 189 algorithms from 99 organizations. Notably, they did not test Amazon’s Rekognition (recognition with a K) software tool, which has been critiqued by many people previously. But they looked at this data, and they tried to figure out: is there an issue with the misrepresentation or the inability to correctly identify a particular person or a particular ethnicity? And it turns out, surprise, surprise, there is. There’s a huge issue. In some instances they cited Asian and African American people being misidentified as much as 100 times more than white men. In fact, the highest accuracy rates? Go ahead, take a guess. Middle-aged white men.

We have a serious issue here. They found that there were difficulties with age as well: younger people and older people being consistently misidentified. When we start to talk about how facial recognition tools are used, whether it’s in a surveillance capacity or whether it’s the facial ID that you use to unlock your smartphone, there are some serious issues here about just how secure and just how accurate these systems are. I think we actually have a long way to go before these systems get to the point where somebody could say, “Yes, with 100% accuracy, this is that person.” I can’t wait to see how these start to work their way into court cases. Will a facial ID tool identify a person, and will it maybe have misidentified them?

Of course, it’s a big issue now for a lot of international travelers, like us here. TSA is increasingly using the facial recognition tools to track people in and out of the country here in the US. We’ve got a ways to go, but I was just impressed by the thoroughness of this particular study and the fact that we’re actually starting to get some empirical data behind what people have suspected for a long time.

Shelly Kramer: Yeah, I read this a few days ago as well, and I thought it was really interesting, especially the fact that African American women were inaccurately identified most frequently in one-to-many searches. It was really kind of terrifying. So now we’re going to move on. Olivier, you’re going to tell us about something that Apple is working on.

Olivier Blanchard: Yeah. It’s totally different, so we’re actually kind of deviating a little bit from our theme, but we’ll get back to it in a second. I have a feeling. I have this really interesting story that popped up in my feed this morning, and it’s about Apple going to space evidently. There’s this report, and I don’t know how much credence to assign to it. Bloomberg and Fortune and everybody is reporting on this rumor that Apple has “a secret team” of allegedly as many as 12 people, 12 engineers.

Shelly Kramer: That many?

Olivier Blanchard: I know. It’s a huge team.

Shelly Kramer: Gosh.

Olivier Blanchard: That’s a huge investment on Apple’s part.

Fred McClimans: Is that a bushel? A dozen apples? A bushel or a peck? Or-

Olivier Blanchard: I think my local Golden Corral has a bigger staff than that, and they’re not putting anybody in space for sure-

Shelly Kramer: And you go to Golden Corral pretty often apparently.

Olivier Blanchard: I can neither confirm nor deny. But I hear that the macaroni and cheese is pretty awesome on Tuesdays.

Fred McClimans: Actually, the right way to say that is I will not confirm or deny because you can confirm or deny. You’re just choosing not to.

Olivier Blanchard: Correct. Correct.

Shelly Kramer: Mr. Golden Corral.

Olivier Blanchard: I stand corrected. Yeah, Ryan’s Steakhouse is more my speed, but that was 20 years ago. Okay, so anyway, Apple has a secret team allegedly working on developing satellites to beam data to devices, according to Bloomberg specifically. Essentially the idea is that they are working on putting wireless satellites in space to bring connectivity to the masses. I thought that was kind of funny because … all of my Apple fans in my Facebook feed, anyway, are fawning over this. It’s like the most amazing thing. Apple is so innovative. But this isn’t exactly new.

What I was telling people is Apple is not a dollar short, but they’re definitely a day late, because Facebook has already been working on this for I think two years now. The focus for Facebook, I think, initially was bringing the internet to Africa and really remote areas where it would make sense to do it this way rather than install a lot of expensive base stations. Elon Musk’s SpaceX, if I recall correctly, just got a license to put as many as 12,000 satellites in space to do this. OneWeb, which has a lot of money behind it … I think it’s SoftBank money … is also doing this.

There’s a bunch of startups that are also putting these micro satellites everywhere that will probably end up creating this network of internet connectivity from space or low orbit. The point is that kudos to Apple for finally getting into that game. I’m glad that they have a dozen engineers working on this. I guess if all of them put together a thousand satellites, they’ll be catching up to SpaceX in the next five years. But, whatever. I just thought it was kind of funny and interesting. Good luck to Apple in that endeavor if indeed it takes off. Pun intended.

Shelly Kramer: Maybe a nod to Bloomberg: if you’re going to write a story on this topic, I don’t know, maybe research a little more deeply.

Olivier Blanchard: They did. They mentioned some of the other stuff. It seemed to be a little bit of a PR thing because everybody-

Shelly Kramer: Picked up the story.

Olivier Blanchard: Not a lot of people were very critical of it. It was kind of like, “Oh, Apple is getting into this.” It just didn’t really seem like Apple really is getting into it yet. But, again, whatever.

Shelly Kramer: All right, with that we are going to step into the Tech Bites section of our podcast. And we’re going to talk about … on theme, on trend … how easy it is to fool facial recognition at airports and border crossings and all that sort of thing. We know that facial recognition is being widely embraced. Law enforcement is using it. And sometimes even when you least expect it. I think I mentioned before on this show that my daughter and I were traveling from Dallas to Puerto Vallarta recently. As we were getting ready to board the plane, the flight attendant on Southwest said, “Oh, by the way, we use facial recognition. Just stop when you’re boarding and pause for a minute and we’ll take an image of your face. And oh, isn’t that cool.” I was thinking, “No. It’s not really cool, because I really am not interested in opting into that.” But anyway, it is remarkably easy to fool these systems using masks. What do you think about that?

Fred McClimans: At the risk of sounding dumb, it’s like, “Duh.” I hate to sound crude like that, but these systems can very easily be fooled. I know Apple’s facial recognition is perhaps a bit more advanced in the way it looks at contours and so forth. But even with fingerprint sensors, there’s still a ways to go before they’re foolproof, and then there’s the idea that you can just use a mask. Think about all the Mission Impossible people who could go out and use your phones.

But even beyond that, not all facial recognition tools have traditionally required your eyes to actually be open. In fact, I can’t remember, was it Google that ran into that issue with their phone? Unlike Apple’s phones, where the eyes do need to be open, with Google you could actually hold the phone over the face of somebody who’s asleep and it would work. That’s sort of a scary thing, because we know that people have taken photographs of people’s hands in public before and used that photograph to create a fingerprint map that was then used to unlock a mobile device. The ability to mask a person, or to put on a mask and impersonate somebody, I think is ridiculously obvious and a bit comical. I don’t know why we’re putting so much trust in these systems.

Olivier Blanchard: Well, two-dimensional sensors, whether it’s a camera that takes a picture or a sensor on the screen of your phone, as long as it’s two-dimensional and it takes a picture of you, those are very easy to fool. Like you said, you can take a really nice glossy picture of your fingerprint and reverse it and put that on a two-dimensional sensor on a phone. You have a pretty good chance of actually being able to unlock that phone. It’s the same thing with masks or whatever. You can do that with cameras.

However, as three-dimensional sensors become the norm and start taking over for two-dimensional sensors … I don’t know if you know this, but a lot of little sensors, like the motion sensors in your home, are becoming sophisticated enough now that they’re actually able to map the shape of your face, follow its contours, and identify somebody based not on images but on this kind of radar-like three-dimensional image that they’re able to capture and analyze on device. It doesn’t even have to go to the cloud to do it. As this equipment starts taking over for existing two-dimensional cameras, I think the age of the mask spoof is going to come to an end. But we probably have a ten-, maybe 15-year window in which we may be able to do this. Depending on the application too, like gas stations will probably always be the last-

Shelly Kramer: Of everything.

Olivier Blanchard: Of everything. So you’ll still be able to spoof whatever gas station facial and fingerprint ID there is. But when it comes to banks, when it comes to airports, when it comes to even just street type of image capture, I think that’s going to be the first to be unspoofable.

Fred McClimans: There’s a fundamental issue here in the way we’re applying technology that I think is setting us up for failure here with this. We’re not relying on multi-factor authentication enough in the biometric space.

Shelly Kramer: I agree.

Fred McClimans: It’s not enough to, “Here’s my fingerprint or here’s my face.” And think about deep fakes and the ability to mimic somebody’s voice and so forth. We need to get to the point where it’s not just the face, but it’s the face, a passcode, a fingerprint-

Olivier Blanchard: Or gesture. A sequence of gestures.

Fred McClimans: A gesture. Something that is unique enough to you, that spans multiple characteristics, because one concern that I do have with this technology is that, just like some of the machine vision systems can be fooled in the most bizarre ways, I think a system like this could also be fooled. Think of it sort of … who was it? Tesla’s vision system had an issue with tracking the dashes on the road. Somebody at one point took one of the vehicular vision systems and was actually able, by placing tape in certain locations on a speed sign, to completely baffle the computer, because it learns a particular thing, and there are ways that, if you understand how that software is working, you can force it to break. Maybe we get to the point down the road where we have these multiple factors of identification in the biometric realm, but maybe too we have people that develop things like the radar detector or the radar jammer, something on your person that puts out a signal that intentionally jams the facial recognition system from actually picking up the correct signal back.
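(One more editorial aside: here is a minimal sketch of the multi-factor idea Fred is arguing for, where each independent factor must clear its own bar, so spoofing a single sensor with a mask or a lifted fingerprint is no longer enough. The scores and thresholds are invented; a real system would calibrate them against false-accept and false-reject rates.)

```python
def authenticate(face_score, fingerprint_score, passcode_ok,
                 face_threshold=0.95, finger_threshold=0.95):
    """Accept only if every independent factor clears its own bar."""
    checks = [
        face_score >= face_threshold,           # something you are
        fingerprint_score >= finger_threshold,  # a second, independent biometric
        passcode_ok,                            # something you know
    ]
    return all(checks)

# A near-perfect mask (face score 0.99) still fails without the passcode:
print(authenticate(face_score=0.99, fingerprint_score=0.97, passcode_ok=False))  # False
print(authenticate(face_score=0.96, fingerprint_score=0.98, passcode_ok=True))   # True
```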

Olivier Blanchard: Or wear really good HD makeup and do a lot of contouring. I’ve watched the YouTube videos.

Fred McClimans: Massive contouring.

Olivier Blanchard: I think it’s possible.

Shelly Kramer: You should live with 14-year-olds.

Olivier Blanchard: No. I probably shouldn’t.

Shelly Kramer: I live with two. Fourteen-year-old girls, which is a whole different thing. The amount of makeup and contouring and crap that is going on is crazy.

Olivier Blanchard: I bet they could fool facial recognition cameras if they really wanted to.

Fred McClimans: On a related note, if you really want to find something interesting to read, there was an article. I forget the woman’s name. She was the mask maker for-

Shelly Kramer: The CIA.

Fred McClimans: The CIA, yes.

Shelly Kramer: I’ll find that.

Fred McClimans: Doing Mission Impossible stuff way in advance of Mission Impossible doing that. She recently retired. In the article, they had different pictures of her and meetings that she was in with various people. You couldn’t tell it was her. The spy craft is so cool.

Shelly Kramer: I’ll hyperlink that article to the show notes.

Fred McClimans: Yes.

Shelly Kramer: Because it really is a fantastic read. Yeah, absolutely. So with that, we are going to finish up with the crystal ball portion of our show. We go back to the main dive topic, and we kind of make some predictions about the future. I’ll say this. What do we think is going to happen? We can’t tackle the world here. But from the US as it relates to surveillance and data privacy and consumer data privacy protection, what do you think that we can expect in the next … can we expect really anything to change as it relates to that? Meaningful change? What do you think we have ahead of us say in the next five years as it relates to this issue?

Olivier Blanchard: Not a lot of change unfortunately. No, I wish. I don’t know what needs to happen for change to start coming. For starters, we need a new generation of lawmakers and regulators. We’re not really seeing that.

Everybody is kind of ingrained.

Shelly Kramer: And people who really understand technology in a far deeper way than I think a lot of-

Olivier Blanchard: The three of us could run for office I guess.

Shelly Kramer: Oh, I have too many skeletons.

Olivier Blanchard: I don’t think it matters anymore evidently, but no comment on that. Or we start focusing our attention more on helping the government pursue this, assuming that it wants to. Or educate the public in demanding it. But I don’t know. I’d like to see some kind of American version of GDPR. That would be a good start.

But we kind of need to accelerate the process. We can’t be five, six, seven years behind the rest of the world on this. Not us. We can’t afford to. Yeah, I don’t know. I’m not hopeful that something radical will happen in the next five years because there’s so many other things that are in the forefront of everybody’s mind that it almost seems like item number 12 at best. I think 15 years from now, we’ll get it right. But I think, yeah, it’s going to come hard and fast, but it’s going to come a little late I think unfortunately.

Fred McClimans: I think the biggest thing that we’re likely to see, I don’t think it’s regulation. I don’t think it’s improvement. I think it’s the McPeoplization of people. If you think about the McDonald’s franchise model, what did that give us? It gave us sameness. It gave us less choice. If we look at China, where a lot of these facial recognition tools and the lack of data privacy are leading, it’s being heralded by some as, “It’s a way to control people. It’s a way to stop crime. It’s a way to promote certain behavior,” like with the social scoring system that’s used in China. All of these things increasingly reduce agency, the ability of people to make their own choice, their own decision.

We’ve always been limited by technology. Think about how limited our communication was when pagers came out. People started communicating on pagers, and that limited their choices in the way they could communicate. Same with mobile phones and text messaging: we dumbed ourselves down to fit messages into that format. With Twitter, we dumbed ourselves down even more to fit into that initial character limit. The more technology that’s out there, the more artificial intelligence tools that are out there, the more data that’s out there, and the more advertisers are using all this data, the more it restricts agency.

I think that that’s kind of where we’re heading down the road. I would just encourage people, “Hey, turn off the device. Understand this tool that you have in front of you. You don’t need geolocation data flying out all over the place. You can disable that, not completely in all cases, but there are things that you can do to just kind of protect your privacy.” I’m hoping that we at least can advocate for that and educate the consumer. Here are the basic things you can do and maybe get to the point where it really truly is an opt-in model not always on by default.

Shelly Kramer: I wish that that would happen. I think that I tend to be a skeptic in terms of … I love the McPeoplization phrase. I think that we are largely incredibly lazy. I think that we are ignorant. And I think that we’re ambivalent. And I say that with love. It’s just too easy. It’s just leave it. Why would I change it. Whatever. It’s like when I was in line in the Dallas airport with my daughter, and I was having a conversation with fellow passengers about how this is not really the best thing.

A woman very casually piped into our conversation. She said, “You know what? Here’s the deal. This ain’t no thing. As long as you’re not doing anything wrong, what does it matter? Scan my face.” That’s how some people think. As long as you’re not doing anything wrong, it doesn’t matter. But they’re not thinking about their agency. They’re not thinking about their privacy. Until we get to a point where we collectively are mad as hell and we’re not going to take this anymore and something needs to change, nothing is going to change. And I’m not really sure I see that happening, because I think we are inherently sort of lazy when it comes to sussing out information and very ambivalent when it comes to these things. I don’t see a bright future ahead on that front.

Fred McClimans: Maybe what we need to do is just have a simple law in place, or regulation, or some type of requirement that says: if you’re a technology company and you’re going to develop technology, you have to hire a social anthropologist first to understand how that technology is going to be used by consumers, by the people you sell the technology to, and by your own employees, who we know are going to bend the rules now and then, unless somebody like Amazon simply says, “Hey guys, listen to what is going on in this home over here on Alexa.” Hire social anthropologists.

Shelly Kramer: I hold out great hope that that is actually going to happen. Great hope … not really. With that, thank you for joining us today for what ended up, really kind of accidentally, being a deep dive into technology and surveillance and all the things associated with that, which perhaps you don’t think enough about or haven’t thought about lately. We hope that you’ve walked away from this really thinking about it on a deeper level, because it impacts all of us and it really is a big deal. We really do have time, and we can make time, I think, to learn more about this and to maybe do what we can to effect change. With that hopeful thought, we are going to say goodbye and thank you for joining us, and we’re out.

There will be plenty more tech topics and tech conversations right here on the Futurum Tech Podcast, FTP. Hit that subscribe button. Join us. Become part of our community. We would love to hear from you. Check us out at futurumgroup.com or search for the Futurum Tech Podcast, Shelly Kramer, Fred McClimans, Olivier Blanchard. We’ll see you later.

Disclaimer: The Futurum Tech Podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such. 

Image Credit: Crazylearner

 

Author Information

Shelly Kramer is a Principal Analyst and Founding Partner at Futurum Research. A serial entrepreneur with a technology centric focus, she has worked alongside some of the world’s largest brands to embrace disruption and spur innovation, understand and address the realities of the connected customer, and help navigate the process of digital transformation. She brings 20 years' experience as a brand strategist to her work at Futurum, and has deep experience helping global companies with marketing challenges, GTM strategies, messaging development, and driving strategy and digital transformation for B2B brands across multiple verticals. Shelly's coverage areas include Collaboration/CX/SaaS, platforms, ESG, and Cybersecurity, as well as topics and trends related to the Future of Work, the transformation of the workplace and how people and technology are driving that transformation. A transplanted New Yorker, she has learned to love life in the Midwest, and has firsthand experience that some of the most innovative minds and most successful companies in the world also happen to live in “flyover country.”

