The News: Apple, Inc. (NASDAQ: AAPL) and Google (NASDAQ: GOOGL) have announced a new partnership to add contact-tracing functionality to their mobile platforms (iOS and Android, respectively). The contact-tracing feature will reportedly alert users if they have come into contact with a person who has tested positive for COVID-19. According to the companies, the feature will be opt-in, but it has the potential to help almost one-third of the human population monitor its exposure to infectious diseases like the novel coronavirus. More at Bloomberg.
Google, Apple COVID-19 contact-tracing partnership pits infectious disease mitigation against data security and privacy challenges
Analyst Take: The Google-Apple COVID-19 partnership is currently broken into two phases, the first of which is scheduled to start in mid-May. iOS and Android devices will be able to wirelessly exchange information by way of apps run by public health authorities. This information will be anonymized, a nod to customer data privacy.
In the second phase of the project, which may take several more months to launch, contact-tracing functionality will be built directly into the companies’ operating systems, thereby eliminating the need to download a separate app.
While both companies are focused on ensuring that users’ personal and medical data will remain anonymized, lawmakers and technology observers alike wonder if they can make good on that claim.
One of the pieces of this contact-tracing puzzle that makes me feel better about the type of data being collected on users is that it will reportedly use Bluetooth rather than GPS or location services. In other words, it will rely on device-to-device proximity rather than on tracking and cross-referencing the location of individual users. That’s good.
The Way it Works
How does it work? Say two people meet to chat, cross paths in an elevator, or otherwise come into close proximity to one another. Assuming that they have opted into the system, their smartphones will ping each other via Bluetooth and exchange an anonymous “handshake” to register that they have been in contact. These handshakes are not only encrypted, with keys that change every few minutes, but they also remain on the users’ devices to ensure privacy. None of that information, thus far, is being collected or transmitted to a server or a third party. So far, so good.
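To make the handshake mechanics concrete, here is a minimal Python sketch of a device-to-device exchange with rotating anonymous identifiers. The key-derivation scheme (an HMAC over a random daily key) is my own illustrative assumption, not Apple’s or Google’s published protocol:

```python
# Simplified sketch of the anonymized Bluetooth "handshake" described above.
# The key derivation here is an illustrative assumption, not the real protocol.
import hashlib
import hmac
import os

def daily_key() -> bytes:
    """Each phone generates a fresh random key once per day; it never
    leaves the device unless the user later reports a positive test."""
    return os.urandom(16)

def rolling_identifier(key: bytes, interval: int) -> bytes:
    """Derive a short-lived anonymous identifier from the daily key.
    The derivation is one-way, so an observer cannot link successive
    identifiers back to the same phone."""
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Phone A broadcasts its current identifier; Phone B records what it hears.
key_a = daily_key()
seen_by_b = set()
for interval in range(3):  # three rotation periods spent in proximity
    seen_by_b.add(rolling_identifier(key_a, interval))

# The recorded identifiers stay in Phone B's local storage -- nothing
# is uploaded to any server at this stage.
assert len(seen_by_b) == 3
```

The point of the rotation is that even a passive Bluetooth eavesdropper only ever sees short-lived, unlinkable identifiers.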
Now say that one of these people is diagnosed with COVID-19 a few days later. That person enters his or her new medical status into a health-agency app on their phone. The system, once it has obtained the infected user’s consent, will allow the smartphone to send the record of all of the Bluetooth handshakes it collected recently to a third-party server. That’s less good. More on that in a moment.
Meanwhile, all of the phones that have recently exchanged a handshake with the infected individual’s phone will check that server at regular intervals, download the keys flagged positive for COVID-19, match them anonymously to the recorded handshakes, and notify the user that they have been in contact with a COVID-19 positive (without telling them who the infected contact is, or where and when the contact took place).
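Continuing that hypothetical scheme, the download-and-match step described above could be sketched as follows; again, the daily-key upload and the derivation details are assumptions for illustration, not the partnership’s actual design:

```python
# Sketch of the match step: a diagnosed user consents to upload only their
# daily keys; other phones re-derive the rolling identifiers from those keys
# and compare them against locally stored handshakes. Illustrative only.
import hashlib
import hmac
import os

def rolling_identifier(key: bytes, interval: int) -> bytes:
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Phone A's daily key; Phone B recorded A's identifiers for intervals 5-7.
key_a = os.urandom(16)
local_handshakes = {rolling_identifier(key_a, i) for i in (5, 6, 7)}

# After A's diagnosis, the server publishes A's daily key -- nothing else.
flagged_keys = [key_a]

def check_exposure(flagged, recorded, intervals=range(96)):
    """Return True if any identifier derivable from a flagged key matches
    a locally recorded handshake. 96 intervals ~= one day at 15 minutes."""
    for key in flagged:
        for i in intervals:
            if rolling_identifier(key, i) in recorded:
                return True  # notify the user; the contact's identity stays hidden
    return False

assert check_exposure(flagged_keys, local_handshakes) is True
```

Note that in this design the matching happens entirely on the user’s own phone: the server only ever sees the flagged keys, never who matched against them.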
While the project has a lot going for it, its two most obvious potential flaws are data security vulnerabilities, and people choosing to simply not opt in.
There’s a lot to unpack there, but here’s a look at the good and the bad around this Google, Apple COVID-19 contact-tracing partnership.
Bluetooth instead of location. The use of Bluetooth instead of GPS/location data is a big plus when it comes to minimizing the footprint of data collection on individual users. It also emphasizes proximity and contact over location data, which makes more sense given the role that proximity and contact play in effective contact-tracing. I believe that this approach is the most logical, effective, and least intrusive use of mobile technology to achieve this end.
Opt-In instead of Opt-Out. The fact that the program is opt-in by default gives individual users the power to choose whether their data will be used for this purpose or not.
Key encryption. The handshake/key encryption piece of the model also makes a lot of sense because it creates an anonymized alias for each phone sharing data with other phones. This is an effective way to assign and catalog contact while hiding the identity of the user. The randomized rotating encryption — similar to rotating security keys on some credit cards — adds an additional layer of security to that privacy feature.
On-device key vault. I also like that the Bluetooth handshakes/keys are initially stored on each device rather than sent immediately or contemporaneously to a server somewhere. The choice to go with this model suggests that Apple and Google are serious about protecting their users’ privacy and data for this particular use case.
Third-party apps are vulnerable. I cringe every time I read that the system will, at least initially, depend on data accessed via “apps run by public health authorities.” Those apps and the servers that store users’ personal and medical data are the obvious vulnerability in this system. Even if user-to-user COVID-19 positive notifications remain anonymized through the use of encrypted keys, anyone entering their COVID-19 positive status into an app “run by public health authorities” should be concerned that their medical information and positive status could be accessed by hackers or other hostile actors.
Opt-In versus Opt-Out. Yes, this is both a pro and a con. My concern here is that if a significant percentage of the population doesn’t opt into this system in the first place, its effectiveness will be greatly diminished. And if the system is deemed ineffective by users because too few people use it, it will become mostly irrelevant, and adoption will fall even further. (Rationale: “Why bother?”) As a staunch supporter of Opt-In, it pains me to say this, but this may be one of the very rare times when the use of a feature or app like this one may need to be mandatory, or at least Opt-Out by default.
Voluntary status reporting. In the same vein as my argument above, another potential flaw in this system is that it depends on someone who has tested positive to not only have opted into this system in the first place, but to also voluntarily enter into an app their COVID-19 positive status once they have tested positive. For a long list of potential reasons, from privacy concerns to a fear of being stigmatized should their status become de-anonymized, one can reasonably expect that a good number of COVID-19 positives will simply not enter their positive status into an app unless compelled to by law or an aggressive public health PR campaign encouraging them to do so.
The data security vulnerability in this model could potentially be mitigated IF Google and Apple, or a competent third-party technology partner, could be counted on to ensure that the ecosystem of public health databases and apps that it depends on were held to a mandatory and certifiable standardized framework — merely suggesting a framework will not be enough.
Also, as this will likely require an atypical degree of IT and policy cooperation between US states and various countries, I am curious to see how Apple and Google will resolve some of the practical hurdles associated with this particular challenge that may yet derail the execution of this project.
As to the challenge of promoting program opt-in and voluntary status reporting at scale, so that enough people will benefit from anonymized contact-tracing notifications, merely making the program available and known may not be enough. Apple, Google, and their public health partners may have to invest in a combination of PR, advertising, PSAs, influencer outreach, and public health guidelines — perhaps even including mandatory participation in areas where that may be possible — in order to achieve their desired outcome. But regardless of how well coordinated and well-crafted such a campaign may be, if the data security and privacy vulnerabilities of this program are not addressed to the public’s satisfaction, Apple, Google, and their public health partners may find it very difficult to convince enough members of the general public to trust that their COVID-19 positive status will remain anonymous — and without that trust, the program cannot be truly effective.
This is obviously only the beginning of the conversation around the pros and cons of contact-tracing as it relates to infectious disease mitigation and the data security and privacy challenges that come with it. I’m sure I’m not the only one who will be watching this Google, Apple COVID-19 contact-tracing partnership with great interest as they launch and begin to promote this initiative.
Futurum Research provides industry research and analysis. These columns are for educational purposes only and should not be considered in any way investment advice.