How well a product works has always been a key to its success—or failure. There is no shortage of tech products that have, despite their buildup, eventually lost consumers because they failed to work as well as expected. There was Apple’s Newton, the Samsung Galaxy Fold more recently, and let’s not forget Juicero, a juicer that didn’t actually do anything more than give a bag of pre-processed juice a squeeze.
Elissa M. Redmiles is a researcher at Microsoft Research and the Max Planck Institute for Software Systems. Gabriel Kaptchuk is a PhD student at Johns Hopkins University and a visiting fellow at Boston University’s Hariri Institute for Computing and Computational Science & Engineering. Eszter Hargittai is a professor and holds the chair in Internet Use and Society in the Communication and Media Studies Department at the University of Zurich. She is the editor of Research Exposed from Columbia University Press.
Then there are the products that function but have frequent errors; though they remain on the market, they gain a reputation for failing to meet consumer expectations. You’d be hard-pressed to find a 2012 smartphone user who would feel comfortable turning to Apple Maps for directions, even all these years later. Similarly, if you Google “Roomba,” the first suggested question about the self-navigating vacuum is: “Does the Roomba vacuum work?” Inaccuracies in the Roomba’s navigation algorithm have led to self-help articles on how to get the vacuums working, lower-than-projected sales and market penetration, and, most recently, competing products whose main selling point is improved accuracy. Consumers don’t like things that don’t work as advertised. And once trust in a product is lost, it’s not easily won back.
Yet, despite decades of evidence pointing to the importance of accuracy in the success of new technologies, the debate about coronavirus contact-tracing apps and whether people will adopt them has thus far centered on privacy risks. While privacy is undoubtedly an important element to consider, the singular focus on it has left a significant blind spot: how well will these apps work, and how might accuracy problems impact adoption? Early evidence shows that even if contact-tracing apps use state-of-the-art privacy protections, accuracy concerns may send users fleeing.
Covid-19 contact-tracing apps, one of the most debated technologies currently under development, are designed to detect when people have been exposed to the coronavirus, either by using an individual’s location data or by communicating directly with nearby phones. These apps can help users make informed decisions about when to self-isolate and get tested for the virus. But these apps can only help public health if enough people adopt them.
Recent scandals explain the collective fixation on privacy when it comes to these apps. The Cambridge Analytica scandal, in which the private information of millions of Facebook users was harvested and misused without their consent, brought privacy worries to the forefront of public conversation and policy development around targeted advertising. Concerns over user privacy have since expanded well beyond Facebook, and have yet to let up.
As a result, a significant amount of the development around contact-tracing technology has focused on avoiding the privacy failures that would deter people from downloading Covid-19 apps. Researchers and technology companies have developed complex protocols to avoid data leaks in these inherently invasive apps.
Though privacy has dominated much of our attention, public health officials are beginning to voice concerns about the viability of these technological solutions, given the shortage of Covid-19 tests available to inform app notifications and doubts about whether an app can actually ensure that all of an infected person’s contacts are notified. But beyond these potential issues of privacy and public health viability, accuracy, or the lack thereof, also deters users. The Care19 contact-tracing app being used by the state of North Dakota already has numerous reviews in the Google Play store complaining about accuracy errors and has seen users leave the platform over these problems.
Looking more broadly across the United States, in a recent survey of 798 Americans that we conducted as researchers at Microsoft Research, Johns Hopkins University, and the University of Zurich, we examined Americans’ interest in installing Covid-19 contact-tracing apps. Whether Americans are willing to install such an app is strongly related to how well it protects private information and how accurate it is. Specifically, Americans care about false negatives: how reliably the apps detect actual exposure to the coronavirus. And they care about false positives: how many notifications the apps send when there was no actual exposure. Nearly half of participants said they would not install a Covid-19 contact-tracing app that has false negatives or could leak their data.
Multiple research studies examining different technologies have shown that both accuracy and privacy are key factors when Americans consider the overall fairness of using technology to, for example, set bail or determine loan repayment rates. These are technologies that can alter users’ lives, just as a contact-tracing app might.
Moreover, past research, as well as the coronavirus app survey we conducted, has found that different demographic groups weigh accuracy and privacy concerns differently. For example, the younger people we surveyed expressed less willingness to install apps with either accuracy errors or privacy leaks. Americans who know someone who died of coronavirus were three times more likely to express willingness to install an app with accuracy errors than those who do not. Women and the more highly educated expressed less willingness to install an app that may leak their data. These differences make policy conversations about new technologies that omit concepts like accuracy unfairly one-sided.
Accuracy is a key piece of the technology puzzle. It can be a marketing strategy, a product death sentence, or a trigger for privacy concerns. If technology developers and policy makers fail to include key considerations of accuracy in addition to privacy in product development and policy making, innovations risk being derailed by low product adoption while unfairly omitting key constituent concerns.