The News: Peloton’s leaky API, which exposed private user data, was in the news alongside some other not-so-great news for the fitness brand this past week. The leaky API was first reported by TechCrunch’s Zack Whittaker, and you can read his story here.
Analyst Take: It has most definitely not been a great few weeks for Peloton. The company recalled all Peloton Tread and Tread+ treadmills after the death of a child and some 70+ injuries, having first tried to shake off the concerns of the CPSC and only later admitting it was wrong, so the brand was already in the spotlight. Adding to the Tread disaster, the news that the Peloton API was leaking private customer data made for a bad stretch for the brand’s reputation overall.
Regarding concerns about the Peloton API, this is an important user data privacy issue. Peloton has a community of some 3 million plus members. When setting themselves up in the Peloton system, members can choose to keep their profiles private or make them public, so that their friends can see their stats, workouts, etc. User profiles also include things like height, weight, age, gender, you know … personal details. Many users, myself included, prefer to have a private profile. That means you still enter that information, but you keep your settings private, not public. Easy, right?
Except when it doesn’t work. The Peloton API vulnerability was disclosed by Jim Masters, a researcher at Pen Test Partners, a security consulting company. The bug allowed anyone to pull users’ private information directly from Peloton’s servers, even if a profile was set to private.
Pen Test reported that the Peloton APIs required no authentication and that the information was simply available for anyone who went looking. This information included the things I mentioned earlier: user IDs, instructor IDs, group membership, workout stats, gender and age, height, weight, and the city where the user is located.
Pen Test Partners published an article last week stating that they reported the issue to Peloton in January and provided a 90-day deadline to fix the bug. Pretty common operating procedure. Masters got a confirmation from the company that the notice was received. Two weeks later, Pen Test noticed that Peloton had executed what appeared to be a partial fix, and said nothing about it. This partial fix meant changing the API so that the data was no longer available to just anyone on the internet, but still available to anyone with a Peloton account. What?
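To make the distinction concrete, here is a minimal sketch of the three access-control states described in the reporting: the original bug (no authentication at all), the partial fix (any logged-in account can still read private profiles), and a proper fix (private data visible only to its owner). All names and data here are hypothetical illustrations, not Peloton’s actual code or API.

```python
# Hypothetical profile store and session tokens for illustration only.
PROFILES = {
    "alice": {"private": True, "weight": 135},
    "bob": {"private": False, "weight": 180},
}
VALID_TOKENS = {"token-bob": "bob"}  # maps session token -> account

def fetch_original(username, token=None):
    # Original bug: no authentication check at all.
    return PROFILES.get(username)

def fetch_partial_fix(username, token=None):
    # Partial fix: requires *a* valid account, but any account
    # can still read anyone's private profile.
    if token not in VALID_TOKENS:
        return None
    return PROFILES.get(username)

def fetch_proper_fix(username, token=None):
    # Proper fix: public profiles are open; private profiles are
    # returned only to their owner.
    profile = PROFILES.get(username)
    if profile is None:
        return None
    if not profile["private"]:
        return profile
    if VALID_TOKENS.get(token) == username:
        return profile
    return None
```

Under the partial fix, any of the 3 million-plus members could still pull a stranger’s private stats simply by being logged in, which is why Pen Test Partners did not consider the issue resolved.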
Pen Test Partners tried hard to connect with Peloton about this and were soundly ignored. It was only when Zack Whittaker, writing about the leak for TechCrunch, asked about it that the company decided it was probably a good idea to do something.
Jim Masters published a blog post on this issue, which he updated on May 5th following a conversation with Peloton’s new CISO, who advised that the vulnerabilities had mostly been fixed within seven days.
My colleague Fred McClimans and I covered the leaky Peloton API as part of our Cybersecurity Shorts series of the Futurum Tech Webcast. There’s more to the conversation, so check it out.
I’ll close by saying Peloton’s stock was down 15% on Wednesday, which is what happens when brands let hubris prevent them from doing the right thing by their customers. Hopefully the brand will realize it’s better to be a cool brand that cares about the massive community it has amassed rather than the alternative.
Disclaimer: The Futurum Tech Webcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.
Shelly Kramer: Absolutely. All right. With that, we’re going to move on and we’re going to talk about Peloton. Peloton, I believe, has a bit of a hubris problem, at least when it comes to users’ personal safety and data privacy. If you’ve been paying attention at all this week, or for the last couple of weeks, it has not been a great period for Peloton. A few weeks ago, the Consumer Product Safety Commission issued a warning about Peloton’s Tread and Tread+ treadmills. One child has died and, at this point, there have been some 70 injuries as a result of this Peloton treadmill.
I actually considered buying one of these and instead bought a different treadmill. But from what I can see, one of the dangers is that this particular treadmill sits up a little bit higher than others, which allows a little one to get underneath it. By the way, all treadmills are dangerous. I remember when my twins were little. No matter how many thousands of times you tell them not to play on the treadmill and you keep the key away from it, sometimes toddlers or young people can find a way to get to it.
I remember one of my twins learned a valuable lesson that treadmills are dangerous. But anyway, when the CPSC came out with their warning about Peloton Tread and Tread+ treadmills, the company really just poo-poo’d it, and they really kind of arrogantly said, “You don’t have anything to worry about. This is not accurate. This is just a random warning. We’re all good.” What happened this week was that actually the company recalled all of its Peloton Tread and Tread+ treadmills and combined that with a statement, “We were wrong.”
I mean, there’s so much at stake from brand reputation. When you’ve had a death, it’s kind of an arrogant stance to take that the CPSC is wrong. But then in addition to that this week, there was news that the Peloton API was leaking private customer data. I’m sure many of our listeners are Peloton customers. You can see mine in the background. Peloton has done a great job of creating community. And when you set up your Peloton account, you can choose to have your account be public. You can have it be private.
But when you’re setting it up, the system asks you for details about yourself: your height, your weight, your age, your gender, your city. Some people put a lot of information in. Some of these things people just don’t think about when they put their information in. They just put accurate information in. The thing about having a public profile is that part of the beauty of Peloton for many people is that you can connect with your friends. I can see that Fred’s working out or that Daniel’s working out or whatever. I can be motivated by the fact that you’re working out every day.
I need to get off my butt, right? But anyway, this API vulnerability was discovered by a guy named Jim Masters. He’s a researcher at a company called Pen Test Partners. They’re a security company that actually researches breaches and vulnerabilities. What Jim found is that the bug allowed anyone to pull users’ private information directly from Peloton’s servers even if a profile was set to private. Hi. I mean, that’s kind of a big deal. I will say my profile is set to private. I’m not really interested in my information being there.
What happened though, which I thought was equally interesting, is that they reported this issue directly to Peloton in January. Jim published a blog post about this, and he said he gave them what is standard in the industry: a 90-day notice, a 90-day deadline. Fix this bug. I’m telling you about this bug. Here’s 90 days to fix it, and then I need to go public with it. He submitted that notification to Peloton. And he got a confirmation that they’d received the notice, and then there was radio silence.
And then a couple weeks later, Pen Test noticed that Peloton had done what they thought was a partial fix of this problem, but they didn’t say anything about it. This partial fix meant fixing the API so that the data wasn’t any longer available to anyone on the planet, but it was available to anyone with a Peloton account. Okay. Hi, that’s not really very much of a fix. So then they tried again to connect… Pen Test Partners tried again to connect with Peloton and they were ignored.
It was only when Zack Whittaker, a TechCrunch reporter and one of the first to report on this leak, was writing about it and asked about it that the company decided it was probably a good thing to do something. Masters published a blog post on this issue, and he updated it just this week following a conversation with Peloton’s new CSO, who advised that the vulnerabilities had mostly been fixed within about seven days. Okay. It’s really hard to believe anything that Peloton says at this point. Again, I’m a customer, but it is interesting, just kind of the attitude that Peloton has.
This really doesn’t concern me. It shouldn’t concern you. We’ll deal with this in our way, whatever. It just seems like a really arrogant posture on the part of the brand. Oh, and the company stock was down about 15% on Wednesday. Maybe that’s what happens when you act like a jerk. I don’t know. I like to err on the side of optimism. My hope is that Peloton will learn from this and realize that this is not the best way to treat customers’ data, certainly, or to treat injuries that result from its products.
I know that you have some thoughts on a famous Peloton user and really what that might be. Why don’t you share that?
Fred McClimans: Shelly, you’re the most famous Peloton user I know.
Shelly Kramer: Oh, by far not.
Fred McClimans: But there are others out there. And in fact, the issue of the API here… By the way, this sounds a bit like what Zoom went through, minus some of the arrogance, when everybody just jumped online and started having Zoom calls for business, for personal use, for schools, the works. Zoom really hadn’t been set up at that point to handle that type of massive wave of user adoption, and the security and everything that came with it.
I think Peloton probably got caught up a little bit in that type of a situation where they just hadn’t really thought about security and how important and critical it is in this type of environment. Now they realize, look, after a year of pumping out bikes as fast as we can and building our base and doing lots of marketing and saying, “Hey, let’s go Peloton,” now they actually have to pay the price for ignoring security. This whole issue came up back in January, though, when Joe Biden was sworn in as the president of the United States.
President Biden is, I guess, the famous Peloton user out there. The Secret Service and everybody looks at this device and they go, “Hey, this is great. He’s got an exercise bike, but it’s got a camera. It’s got data. It’s got all sorts of things in there. It can listen to conversations. It can see what’s going on in a room.” Essentially you’re in a situation here where the tech that we’re using for fitness, in this case with the president using it, which is a great thing, has so many potential vulnerability points in it, so many risk points.
That in order to really make it truly safe to put into the White House, you would literally have to disconnect everything that’s electronic on it and just simply have an exercise bike without all the two way communications and the sharing of data. I know that seems a little bit extreme, but there is something in the cybersecurity space here beyond even just the API issues here, the fact that these devices are all connecting wirelessly into some wifi device within your house, your apartment, wherever you happen to be.
There’s an attack strategy in cybersecurity called the man in the middle, where an organization puts a device or a person somewhere in between one person and another. They capture the data that goes back and forth between them. This is most famously used with devices like the Stingray cellular devices that were found floating around Washington DC a few years back, where a user comes into DC, they turn on their mobile device, and there’s a femtocell or a picocell, basically a small portable cell tower, that mimics the real cell tower.
It acts as that gateway. Your phone connects into it. You make a phone call and everything you do, everything you transmit over it, is now captured by this picocell or femtocell, the Stingray type of attack. But the same thing theoretically is possible with a wifi device. When you go into a Starbucks or a Cozy or wherever you happen to be and you connect up to the local wifi, very often those devices themselves are fake devices.
It’s very easy for somebody to go into a shopping mall, for example, and just set up a fake wifi device, say, “Hey, look, it’s free. Dallas Town Center Mall. Free access.” People log into it.
Shelly Kramer: Which is why I never do that.
Fred McClimans: I know. In the wifi case though, theoretically it’s possible to do the same thing. Basically set up a very strong wifi signal that somebody within their own home could accidentally log into. The security risks here are very high. I would say just based on the fact that so many of the people that we engage with are now working in their homes, all the devices that they have there, all the electronics, everything, that’s a window into their home.
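The relay pattern Fred describes can be sketched in a few lines: the victim believes it is talking directly to the legitimate endpoint, but an interceptor sits in the path, recording everything it forwards. This is a simplified, hypothetical illustration of the concept, not real network code.

```python
# Simplified man-in-the-middle illustration: the "relay" forwards
# traffic so the victim sees normal behavior, while silently keeping
# a copy of every message that passes through it.

class Server:
    """The legitimate endpoint (a real cell tower or wifi gateway)."""
    def handle(self, message):
        return f"ack: {message}"

class MitmRelay:
    """The attacker's rogue device sitting between victim and server."""
    def __init__(self, real_server):
        self.real_server = real_server
        self.captured = []

    def handle(self, message):
        self.captured.append(message)            # attacker records the plaintext
        return self.real_server.handle(message)  # ...then forwards it unchanged

server = Server()
attacker = MitmRelay(server)       # victim is tricked into connecting here
reply = attacker.handle("pin=1234")
# The victim gets the expected reply, so nothing looks wrong,
# but the attacker now holds a copy of the secret.
```

This is also why end-to-end encryption with certificate verification matters: a relay can still forward encrypted traffic, but it can no longer read it.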
In particular, with something like the Peloton Bike, which you see a lot in the background of offices, or other devices like it, you’ve got to be really careful. When they’re not being used, turn them off. When they are being used, make sure it’s properly configured and you’re at least making yourself as secure as you possibly can be. Just to kind of wrap this up, we don’t know if the Peloton Bike… I don’t know if the bike actually made it into the White House, how many layers of duct tape it has and how many wire connections were cut.
Definitely an issue here. I think you’re spot on, just the sheer arrogance. I mean, I know Peloton’s CEO said, “Hey, we admitted that we were wrong.” But the fact that they were so vehemently opposed to action on both the API issue and on the physical risk issue of their treadmills, there’s just no place for that. If you want to ruin your brand in 24 hours, tell your users you don’t care about safety in their devices.
Shelly Kramer: Right. I think sometimes people don’t think enough about the information that’s out there. But when my name and city and my gender and my… I think I read that not necessarily your date of birth, but if it happened to be your birth date, that information was available. I mean, the reality of it is there are all these little breadcrumbs out there on all of us, right? It’s not hard for somebody. By the way, again, these databases, this information is routinely leaked on cybersecurity forums, on hacker forums. Huge data dumps.
We talked about that last week. It’s not hard to go through these. By the way, now they’re also using AI machine learning to automate some of these processes and to pull information out of these databases. It really is important to protect your data and protect your privacy and to patronize brands that care as much about that as you do. I think to me that was what was important here. I am a Peloton fan. I am a customer, and there are many things I love about the experience that Peloton has created.
This experience, their attitude as it related to the consumer safety warning that led to an ultimate recall and their attitude here, although the one thing I will give them a pass on is that in Jim Masters’ blog post about this instance, he indicated that the CSO at Peloton was new to the role. Did they have a CSO? You know what I’m saying? A lot of times in an organization, when somebody’s coming in, somebody’s going out, sometimes things drop through the cracks. I’ll give them one teeny tiny pass for that.
Fred McClimans: They had a CSO in place. I don’t know what happened or what that transition’s like. We’ll have to dig into that a little bit.
Shelly Kramer: Yeah. Yeah. Anyway, I thought it was an interesting… It wasn’t a great week or two for Peloton.