Re-Thinking Privacy in the Connected World


Late last week, SIIA hosted a lunch event, “Re-Thinking Privacy in the Connected World,” focused on reevaluating popular understandings of the privacy cost/benefit analysis of data-driven innovation and the Internet of Things (IoT). Too often, discussions of these technologies and analytic methods demonize data collection as a risk to personal privacy and security. But our speakers gave us a more balanced understanding of the complex relationship between technology, data and privacy. They underscored that while there may be a fundamental right to privacy, privacy is not monolithic; it rests on a diverse and evolving set of expectations. And they rebutted the notions that enhancing privacy can be accomplished merely by limiting data collection, or that more data equals less privacy and therefore calls for more regulation.

Commissioner Maureen Ohlhausen of the Federal Trade Commission (FTC) opened the event with remarks on how to maximize consumer benefit in this space by practicing “regulatory humility.” The FTC’s mission to protect consumers is best served by looking for actual harms. Once a harm has been identified, the best next course of action is to look to existing tools for solutions. And finally, if new regulation is necessary, the agency should take focused, incremental steps – a light-touch approach.

Commissioner Ohlhausen explained: “…we need to exercise regulatory humility and this means focusing on existing or likely consumer harms rather than hypothetical harms. And then we must consider what tools we already have to address those concerns and then evaluate the cost and benefits of taking such actions.”

Commissioner Ohlhausen cautioned against preemptive action: “We should remain vigilant for any deceptive and unfair use of big data that violates Section 5 of the FTC Act, but to avoid any preemptive action that could preclude entire future industries. Ultimately, our work as an agency should help strengthen competition in the market to better provide beneficial outcomes to consumers in response to their demand rather than to try to dictate desired outcomes in the market.”

She also challenged the popular belief that big data and IoT open a Pandora’s box of new data security issues, emphasizing that much of the consumer harm that could potentially result from these technologies applies to datasets more generally – something the FTC has a long, successful history of dealing with:

“Many of the concerns raised by big data can be suitably addressed by current law and policy. When there are new issues, we have to exercise humility in addressing them and this means working with organizations like SIIA and its members to understand problems deeply and to focus our enforcement actions on situations where improper use of consumer information causes substantial harm. This approach will free entrepreneurs to innovate with big data tools while simultaneously helping to ensure that consumers remain protected.”

David LeDuc, Senior Director of Public Policy at SIIA, led a panel discussion following Commissioner Ohlhausen’s remarks. The panel featured three speakers, all respected research and policy experts in this space: Benjamin Wittes of the Brookings Institution, Joshua New of ITIF’s Center for Data Innovation, and Adam Thierer of GMU’s Mercatus Center.

Each panelist focused on a particular part of the privacy equation. Wittes’s comments explored what privacy truly means and called into question our ability to accurately measure privacy “gains” and “losses.” He challenged the audience with examples of ways in which big data and IoT have given rise to a novel kind of privacy. (Wittes and co-author Jodie Liu have a paper on this topic, The Privacy Paradox: The Privacy Benefits of Privacy Threats.)

Wittes pointed out that privacy is not what self-appointed advocates or scholars tell us it is, but rather what the majority of people believe it to be:  “Privacy is a really, really complicated value. The lived experience of privacy to real people is a very different thing from the hypothesized experience of privacy by privacy groups and scholars.”

Many people use these technologies in ways that grant them greater privacy from the people around them. Wittes gave the examples of purchasing condoms through an online retailer or turning to a search engine like Google for answers to highly sensitive personal questions. And perhaps, as Wittes suggests, this tells us something about the kind of privacy people truly desire:

“People actually care a great deal more about privacy from the people immediately around them – their parents, their loved ones, their partners. Privacy from relationships is, for many people, a much more powerful thing than privacy from some remote entity that may be collecting data on you. People will routinely choose to give data to big, remote entities by way of buying greater intimacy privacy from the people around them.”

Not only do we need a new vocabulary to talk about privacy, we also need to understand the dangers of data poverty. Joshua New listed the numerous ways in which collecting data points on individuals is supremely important. (The Center for Data Innovation has a report on this topic, The Rise of Data Poverty in America.) Take, for example, FDA trials for a new drug.

New pointed out that participants in drug trials are often not representative of the population; women and Hispanics, in particular, are severely underrepresented. When a drug is approved and put on the market based on data generated from an unrepresentative study population, the inequity of data collection has very real consequences: adverse side effects that affect certain ethnicities or genders differently go unaccounted for and can lead to serious health problems.

New concluded: “The Internet of Things, in general, provides so many opportunities as society becomes much more data-driven. The Internet of Things is one of the largest factors influencing these changes. The benefits are incredible. But it’s up to the folks here to make sure these benefits are applied equitably or these discrepancies that already exist won’t get worse or the inequalities won’t be exacerbated.”

Closing out the event, Adam Thierer zeroed in on the policy debate, echoing Commissioner Ohlhausen’s call for measured regulation that doesn’t inadvertently block innovation and the continued generation of consumer benefit.

Comparing the regulatory environment in the U.S. to Europe’s, Thierer argued that we have a real, live case study of permissionless innovation versus the precautionary principle. (Thierer has a book on this topic, Permissionless Innovation: The Continuing Case for Comprehensive Technological Freedom.) While the U.S. has for the most part taken the former approach, the Europeans have proven to be “extraordinarily risk-averse.” The result, he says, is that the biggest innovators in the tech space are all American companies.

“If we let hypothetical worst case scenarios drive policy and base regulation upon hypothetical worst cases, the best case scenarios will never come about. Innovation essentially dies when you live in fear of the future and of innovation. We should allow experimentation and trial-and-error to continue and see where it takes us and address serious, hard problems as they develop, but let’s not reverse that equation and go with the cautionary approach…”

While the debate about the intersection of big data and privacy will no doubt continue for years to come, there is a clear takeaway from this event: there is not necessarily an inherent conflict between data innovation and privacy, but we need to recognize the tremendous value of data collection, continue to thoughtfully evaluate privacy gains and losses, and be sure not to over-regulate based on hypothetical harms.


Diane Pinto is the Public Policy Coordinator at SIIA. Follow the Policy team on Twitter @SIIAPolicy.