Online privacy notices don't work. Here are 9 alternatives

Do you read the full terms and conditions? Image: Unsplash

Jen King
Privacy and Data Policy Fellow, Stanford University
Anne Flanagan
Project Lead, Data Policy, World Economic Forum

  • Online privacy notices present a human-technology interaction problem.
  • People often don't read the notices, discount the risks to their data or don't understand how their data could be used.
  • Here are nine alternatives that governments and businesses can use to protect consumers.

For the past two decades, we have become accustomed to the tiresome ritual of clicking through privacy policies online, but like lengthy terms of service documents, these are generally not written with the average person in mind. They are foremost legal documents, and, after all, who has time?

As people, we assume that we can and want to make informed decisions about who collects data about us and how. Yet numerous studies have shown that people have neither the capacity nor, often, the will to manage the multitude of relationships they have with data processors.

What we have here is not just a legal problem, but a human-technology interaction problem, one that must be addressed with due consideration of human limitations with respect to technology, as well as the broader consequences for society.

What’s at stake

Despite the legal requirement to present a clear choice to people, the practice of presenting notices to individuals who won’t or can’t read them is not only disingenuous, it also undermines many of the advances that new data protection laws are attempting to achieve. For example, the newly enacted California Consumer Privacy Act in the U.S. and its cousin, the California Privacy Rights Act, attempt to give consumers increased control over their personal data. California consumers now have the right to access and delete the personal data held about them by companies subject to the law, and to tell those companies “do not sell” that data.

Greater transparency is also required of companies, but neither of these laws places limits on the collection of personal data, and both leave existing notice- and consent-based frameworks unchanged. Thus, consumers directly benefit from these new rights only if they are aware of them and choose to exercise them, and nothing prevents companies with California-based customers from continuing to collect whatever data they wish, as long as consumers click “I agree.”

One of the cognitive biases humans share is the tendency to engage in hyperbolic discounting: favouring short-term rewards over long-term benefits. With respect to data, it is nearly impossible to predict the long-term consequences of the data sharing we elect to do today. For example, did we realize a decade-plus ago, when we began sharing our photos online, that companies would someday scrape that data to train machine learning algorithms, including facial recognition tools? And that’s not our fault; such technologies were either in their infancy or not yet developed. Sometimes ticking that box is akin to trying to predict the future.
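
To see the arithmetic of that bias, consider the standard hyperbolic discounting model, which values a reward A received after a delay D at V = A / (1 + kD). The sketch below uses invented numbers purely to show why the immediate click tends to win; nothing here is drawn from any study:

```python
# Hyperbolic discounting (Mazur's formulation): V = A / (1 + k*D),
# where A is the reward's value, D the delay and k a discount rate.
# All numbers below are invented for illustration only.
def discounted_value(amount: float, delay_days: float, k: float = 0.05) -> float:
    return amount / (1 + k * delay_days)

# A tiny, immediate convenience (skipping the policy and clicking "agree")...
convenience_now = discounted_value(amount=1.0, delay_days=0)
# ...versus a much larger privacy benefit that pays off years from now.
privacy_later = discounted_value(amount=20.0, delay_days=5 * 365)

print(convenience_now)                   # 1.0
print(round(privacy_later, 2))           # 0.22
print(convenience_now > privacy_later)   # True: the immediate click wins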

Then there is the issue of future-proofing. Continued reliance on “tick the box” does not account for the reality of always-on connected devices, or for times when your data affects others, such as when a person’s decision to provide their DNA to a consumer genetic database also implicates their direct genetic relatives, past, present and future. More urgently, aggregated data pools already exist that allow companies to generate inferences both about individuals across multiple contexts and about their relationships to others. Data collected in one context, for one purpose, may generate insights about you in another, without you ever knowing about it.
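
A toy sketch of such cross-context inference, using entirely made-up data and identifiers, might look like this:

```python
# Toy illustration with invented data: two datasets collected in
# different contexts, each fairly innocuous on its own.
fitness_app = {"user123": {"late_night_activity": True}}
pharmacy_loyalty = {"user123": {"buys_sleep_aids": True}}

# An aggregator holding both pools can join them on a shared identifier
# and infer something the user never disclosed in either context.
for uid in fitness_app.keys() & pharmacy_loyalty.keys():
    if fitness_app[uid]["late_night_activity"] and pharmacy_loyalty[uid]["buys_sleep_aids"]:
        print(f"{uid}: likely insomnia (inferred, never stated by the user)")
```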

The way forward

From regulators to CEOs, there is widespread acknowledgement that the current system of notice and consent is failing consumers. Yet, to date, policymakers around the globe still struggle to tackle the issue, likely because the problems are complex enough to require change on many fronts, and because reform demands acknowledging this as a human-technology interaction problem, not merely a legal one. In the meantime, we are caught in an endless loop of “clicking here to agree” while simultaneously decrying these mechanisms as a farce.

With the goal of breaking this loop, we worked with the World Economic Forum's Data Policy team at the Centre for the Fourth Industrial Revolution in San Francisco to convene a multistakeholder dialogue among experts in human-computer interaction, design, social justice and law. Using human-centred design thinking, the group developed a set of nine ideas for policymakers and business leaders to consider.

These can be classified along two dimensions: design-focused vs. regulation-focused, and individually oriented vs. collectively oriented. For example, Personal User Agents, a proposal to use software to help people manage their data relationships, is a design-focused approach aimed at individuals. Data trusts, in contrast, are a legal and regulatory framework intended to create collective solutions for data protection, drawing on trusted third parties to manage data on behalf of a group of individuals.
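
For concreteness, the two examples above can be laid out on that grid. The labels below are our own shorthand for the two axes, not terms taken from the white paper:

```python
# The two examples from the paragraph above, placed on the two axes.
# Keys and labels are our own shorthand, chosen for illustration.
proposals = {
    "Personal user agents": {"focus": "design",     "orientation": "individual"},
    "Data trusts":          {"focus": "regulation", "orientation": "collective"},
}

for name, axes in proposals.items():
    print(f"{name}: {axes['focus']}-focused, {axes['orientation']}ly oriented")
```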

We discuss each of the nine ideas in depth in the resultant white paper, "Redesigning Data Privacy: Reimagining Notice & Consent for human-technology interaction."

Nine alternatives to “Tick the Box” online notices

  • Data visualization tools for policymakers: tools for illustrating the real impact of data collection on individual experiences;
  • Harm assessment process: a data impact assessment for companies that focuses on outcomes for consumers;
  • Purpose limitation by default: limiting harmful types of secondary personal data collection and processing by default;
  • Positive regulation and responsible innovation: incentives for companies to adopt practices for responsible and ethical data collection and use;
  • Privacy by design in smart cities: design-justice focused processes for designing data collection in smart city environments;
  • Autonomy for tracking in public spaces: using personal technology (such as mobile devices) to express one’s preferences for individual tracking in public spaces;
  • Data Trusts: new legal vehicles for managing personal data by trusted third parties;
  • Algorithmic explainability: subjecting “black box” algorithms to auditability to limit societal and individual harm;
  • Personal user agents: software-based trusted virtual agents that manage one’s individual data collection and sharing preferences (a minimal sketch follows this list).
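
To make that last idea concrete, here is a minimal sketch of how a personal user agent might answer consent prompts on a user's behalf. The preference schema, field names and default-deny rule are illustrative assumptions, not a design taken from the white paper:

```python
from dataclasses import dataclass

# Hypothetical schema: the user states standing preferences once,
# and the agent applies them to every incoming consent request.
@dataclass(frozen=True)
class ConsentRequest:
    controller: str   # who wants the data
    category: str     # e.g. "location", "health", "photos"
    purpose: str      # e.g. "service", "advertising", "ml_training"

class PersonalUserAgent:
    """Illustrative agent that answers consent prompts for the user."""

    def __init__(self, allowed_purposes_by_category: dict[str, set[str]]):
        self.prefs = allowed_purposes_by_category

    def decide(self, request: ConsentRequest) -> bool:
        # Default-deny: anything the user has not explicitly allowed
        # is refused, inverting today's "tick the box to proceed" default.
        allowed = self.prefs.get(request.category, set())
        return request.purpose in allowed

# Example: share location for service delivery, never for advertising;
# never allow photos to be used at all.
agent = PersonalUserAgent({
    "location": {"service"},
    "photos": set(),
})

print(agent.decide(ConsentRequest("maps-app", "location", "service")))      # True
print(agent.decide(ConsentRequest("maps-app", "location", "advertising")))  # False
print(agent.decide(ConsentRequest("photo-site", "photos", "ml_training")))  # False
```

The detail that matters most in this sketch is the default: anything the user has not explicitly allowed is refused, which inverts today's baseline of clicking through to proceed.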

But it is the individually vs. collectively oriented dimension that we want to highlight, because as long as we treat notice and consent solely as a problem rooted in individuals and their decision-making capacities, we miss the societal-level harms that arise from aggregated data disclosure and collection. We also miss the opportunity to shift aspects of consent, especially in particular contexts (e.g. health data), into collective models that may be more appropriate for the type of data involved or the potential risks.

This could include moving some data relationships into new data governance models such as data trusts, where a trusted third party oversees the administration and use of a collective set of data. We also suggest that companies embrace responsible innovation in this area, by adopting privacy-protective data practices that could earn a level of exemption from notice and consent requirements through pre-certification, or by adopting design practices that focus on identifying societal harms.

So, what next?

While there is general acknowledgement that existing notice and consent mechanisms are a problem, building consensus among stakeholders on the appropriate solutions is the next challenge. It might appear easiest to focus on building better mechanisms to help individuals grapple with consent decisions, but the problem also raises societal-level questions about data governance that must be addressed.

In sum, if we want to move forward to the next chapter of data protection and shutter the farce that is notice and consent, we need to acknowledge that this is a human-technology interaction problem, not merely a legal one, and that it requires a fresh approach. Otherwise, our best efforts at reform will fail to break the cycle, and we’ll continue to be hampered by mechanisms that are not fit for purpose.
