Emerging Technologies

Were the Luddites right? Why building digital trust is key to technological innovation

Why building digital trust and prioritizing individual agency is key to creating better technologies.


Daniel Dobrygowski
Head, Governance and Trust, World Economic Forum

  • The 19th-century Luddite movement highlights how even skilled technology users will protest when such tools are used to eradicate their rights.
  • Two centuries later, innovation still too often fails to prioritize the concerns and fears of those who use, and are most affected by, new technologies.
  • The World Economic Forum's 'Digital Trust: Supporting Individual Agency' report outlines why transparency, privacy and redressability are the key dimensions to building digital trust.

The Luddites had the right idea. While today “luddite” is a pejorative term aimed at people who refuse to use or understand new technology, this 19th-century protest movement likely stemmed from people who understood the ramifications of new technologies all too well.

The self-described followers of Ned Ludd (a mythical Robin Hood-like figure) didn’t hate technology. Many in the Luddite movement were skilled weavers and quite effective at using the best technology available.

Rather, they turned to machine-breaking as a last-ditch protest against these tools being used “in a fraudulent and deceitful manner” to erode the quality of the trade to which they’d devoted their lives and to eradicate the rights they had come to rely on as workers.


The Luddites weren’t protesting technology – they were protesting the decisions powerful people were making about technology that threatened to rob them of their careers, violate their rights, and cast them into poverty.

They were protesting the fact that technology was being deployed (as we might say today) in a way that lacked transparency as to its goals, ran counter to their values, and gave them no avenue for redress for the very real harms they would suffer.

Overcoming the ‘crisis in trust’ in technology

Two hundred years later, these same issues have made governments and businesses – especially the technology industry – anxious about a crisis in trust.

In many cases, innovators believe that if they just make the technology more effective or grow their user base, the trust problem will solve itself. But while reliability is certainly part of trust, it's the way decisions are made – and what gets considered in the design process – that determines whether a given technology will earn trust.

A vital lesson from the experiences of the early Industrial Revolution is that, while the spotlight often shines on the capabilities of new technologies, what’s too often overlooked is the fundamental importance of prioritizing the needs and concerns of the individuals who use and are affected by them.

This is especially true in fields such as artificial intelligence (AI), where rapid advancements hold immense promise but adoption and success hinge on establishing trust and ensuring user safety.

At the World Economic Forum Annual Meeting in Davos, panelists in the session 'Technology in a Turbulent World' described how the crisis in trust is not that individuals don’t understand how to use technology – it’s that they don’t believe it is working in their favour or supporting their expectations and values.


As with the Luddites, the problem is not that people don’t trust the tools available, the problem is that people don’t trust the decisions made in developing and deploying technology.

This is fundamentally a question of agency: will new technologies support me and my needs as a user, or do these technologies, which I increasingly must use to function at work or in society, actually serve someone else?

It is unclear to most people what decisions go into the tech development process. As a result, they do not believe that their privacy or other values will be protected, and they don’t think anyone will protect them if they are harmed by new technologies.

Digital trust and supporting individual agency

The World Economic Forum’s Digital Trust Initiative set out to answer this fundamental question of trust and developed the Digital Trust Framework to ensure that leaders and innovators prioritize security, safety, reliability, effective governance practices and inclusive, responsible and ethical uses of new and emerging technologies.

Ultimately, the Forum’s research found, trust is a function of respect for individuals’ and society’s values – and that includes values around individual agency and choice.

This understanding of the interaction between values and innovation necessitates a shift towards prioritizing transparency, safeguarding privacy, and implementing mechanisms for redress when harms occur.

The Digital Trust Framework
Image: The World Economic Forum's 'Digital Trust: Supporting Individual Agency' report

The Digital Trust Initiative’s latest report, 'Digital Trust: Supporting Individual Agency', closely examines the digital trust dimensions of transparency, privacy and redressability according to the perspectives of organizations, governing bodies and individuals – all with the central lens of supporting individual agency.

Business and government leaders are encouraged to prioritize the individual’s perspective throughout the technology life cycle and take a by-design approach, especially to transparency and privacy.

Creating better tech means prioritizing people

Ultimately, creating better tech for people is a deliberate design choice that demands integrating these essential elements into the very fabric of the development and deployment processes, rather than treating them as mere afterthoughts.



Rather than address the Luddites’ concerns, their activities were criminalized and the movement violently suppressed. While the term itself has become a joke among sophisticates, the demands for agency and respect the Luddites voiced continue to echo through the history of technological innovation.

At a time when new technologies, like immersive tech, AI, synthetic biology and others, bring us to the precipice of dramatic social and industrial change, we have a chance, and an obligation, to do better this time and take these demands seriously by designing emerging tech with human agency and respect for our rights at its heart.


License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
