What does privacy mean? Short answer: not much—at least not in our current state of digital transformation. Indeed, while I love technology and everything we’re now able to do to learn more about our customers through personalization, the cold hard truth is that there is no way to guarantee privacy in today’s digitized world. It was reported this month that Alexa is listening and recording your conversations. Data breaches continue to happen. And even though Mark Zuckerberg is preaching the future is about privacy, can we really trust him? It’s time to be honest. I don’t think so. At least, not as long as someone is using a computer, or living in proximity to one. Will that change? I sure hope so. Here’s why.
What Does Privacy Mean: A User-Focused Definition
Although we bandy the terms around loosely, there is a big difference between security/protection and privacy. Privacy is the empowered act of boundary-setting that allows me and other users to determine who can access what data about me, where, when, and for what purpose. Privacy is—or at least should be—an inside job. It’s the act of determining when you want to close the blinds on your internet surfing, shopping habits, music preferences, or book club selections. It’s deciding when you are okay with someone gaining financially from your personal information and when you aren’t.
On the other hand, security/protection is the nuts and bolts of how a company keeps our personal user data safe. It’s the firewalls, the two-step authentication, the password guidelines, and the laws that require them. Security/protection is the company’s responsibility—and not all companies take that responsibility seriously. In fact, we know most companies will experience a data breach of some kind. That means, ultimately, the protection of one’s personal data is forever out of users’ hands.
Transparency is Required
Unfortunately, the only way a user can be empowered to establish privacy boundaries is through transparency from companies themselves. For instance, users need to know what types of information are being gathered from them in order to make informed decisions about what information they want to share. Unfortunately, most companies bury this information in their terms of service. Has anyone ever read one of those fully before clicking “I agree”?
But it gets complicated. We get ads for things we just had a private conversation about with a friend on the phone, for instance, or we get a text message requesting that we give to a certain political campaign or charity after listening to a certain podcast. Or what about the story of Target sending a coupon to a woman who didn’t even know she was pregnant yet? Clearly, some companies are overstepping. But many of us are okay with that.
The Privacy Paradox
According to a recent study, 54 percent of Americans don’t believe companies have their best interest at heart but will still give up their data if they get some benefit in return. Consumers want personalized experiences. They want to be treated like a human, not a number. But in order to deliver that, data has to be collected. It’s a trade-off, and companies need to do a better job of building trust when it comes to privacy and security.
In addition to being transparent up front about data collection, businesses need to be transparent if a breach does happen. Be more like Tylenol and less like Toyota. Companies are learning the hard way that consumers will see through any phony ploy to hide a data breach and apologize for it months later. We are giving up our privacy and becoming loyal customers, and we want to be treated with respect and assured that our data is secure.
Legacy-Era Definitions Will Die—Eventually
Mark Zuckerberg recently put together a fairly lame attempt at a privacy manifesto that discussed Facebook’s commitment to reducing the permanence of information gathered on Facebook and making systems more interoperable. His manifesto was lame because, despite its name, it didn’t address the underlying issue of privacy. Can you say what data Facebook collects about you? Is it just your general information? Information about your friends? Your shopping habits? Music tastes? Movie preferences? Political leanings? While I’d venture to guess those are all yeses, we don’t know for sure. People are starting to see through his phony dedication to privacy.
In my view, Facebook is a good example of a new type of “legacy era” company—a digital business that got it wrong the first time around. Facebook is the perfect example of a company that made the wrong assumptions—built an empire on sand—and is trying to double back, but is finding it difficult to rework its old system to meet new user demands. (Sounds similar to the issues experienced by legacy-era businesses today, doesn’t it?)
At the end of the day, companies that incorporate transparent privacy policies into their building blocks are the ones that will see increased brand loyalty moving forward. They’re the ones actively pursuing ways to incorporate blockchain into their processes—actively working to not just meet but exceed the guidelines of the General Data Protection Regulation. They’re the ones who actively empower their customers to offer them information, knowing it will be used to enhance their user experience—no more, no less.
What does privacy mean? Today, not much. It’s a buzzword without much traction in the wild west of AI and digital transformation. But in the next five to 10 years, I anticipate privacy will become a game-changer for the companies that do it right. It will bolster trust—and ultimately sales. And customers will, thankfully, be all the wiser for it.
The original version of this article was first published on Forbes.
Futurum Research provides industry research and analysis. These columns are for educational purposes only and should not be considered in any way investment advice.