Why tech governance needs trust

The first tragedy involving a self-driving car has shaken trust in high tech. Image: REUTERS/Hannibal Hanschke

Hilary Sutcliffe
Director, SocietyInside
Conrad von Kameke
Director, TIGTech

On Sunday 18 March 2018, in Tempe, Arizona, a pedestrian was killed by a self-driving car for the first time.

The circumstances of the accident have prompted a rapid reassessment of the safety of these cars in most countries where they are being trialled, and many tests have been suspended. The event has shaken public trust in self-driving cars and their safety, and triggered increased scrutiny of the technology and its claims.

Unusually, in this instance, it is the process and design of the technology's governance that is receiving the most attention. In particular, the differing approaches of US states to balancing the promotion of technological innovation with ensuring public safety have brought the role of governance, and the priorities and trade-offs it embodies, into the spotlight.

The worldwide reverberations from governance decisions in a single US city illustrate the potential for significant negative impact, not just on individuals, but on whole sectors and even entire technologies, when governance is suddenly no longer seen to be trustworthy.

In response to the accident, Arizona suspended testing of the cars by Uber, the company involved, while other manufacturers, such as Toyota, have voluntarily suspended their road trials of self-driving cars.

Carmaker BMW’s chief engineer suggested that no automaker or technology company currently has sensor and computer systems advanced enough to ensure adequate safety in such urban settings, calling into question the pace of the technology's development in the US compared with Europe, whose approach he argues is safer and less reckless.

The event has also galvanised public opinion, with positions quickly polarising. A group of institutions focused on highway and auto safety is calling for a moratorium to stop companies from conducting ‘beta-testing on public roads with families as unwitting crash-test dummies’, while others cite the many lives lost to driver error, which autonomous vehicles could in theory save, as a rationale for a lighter regulatory touch, particularly in the US federal regulation currently under consideration.

As information about governance trade-offs and design flaws becomes clearer and the positions of the different groups harden, trust in the governance regimes seems likely to become the decisive factor in this debate. The outcomes, in Arizona and elsewhere, are likely to have important repercussions for the effective deployment of self-driving cars across the world.

Trust is not normally a concept placed at the heart of governance design. We propose that it should be. If notions of earning trust and a focus on trustworthiness were built into governance design efforts and procedures, would that make any difference to how governance is designed and how effective it is?

Would governance be more deserving of our trust, and would it actually earn greater trust from policymakers and society at large?

We will explore these important questions in a new project, which aims to develop a series of Principles for Trust in Technology Governance. These principles would support those crafting governance mechanisms, as well as stakeholders participating in their design and implementation. The project aims to build a better understanding of how taking into account the underlying mechanisms of earning or losing trust could inform governance design in the future.

This is fundamentally not about finding ways of facilitating or inhibiting the introduction of technologies into the market; it is about designing governance that increases the chance of earning political and societal trust. This is regardless of whether, for example, risk assessment and risk-management decisions under this governance result in unrestricted market access, market restrictions, or even prohibitions of a technology or its respective products.

While self-driving cars are the focus of media interest at the moment, the history of technology introductions, from GMOs and food irradiation to nanotechnologies and artificial intelligence, shows us that trust in the governance of technology is of paramount importance. A greater understanding of how that trust is earned and lost is of critical value to all stakeholders involved in governance design, from regulators and politicians to NGOs, industry, and the scientific community, and both directly and indirectly to society at large.

For further information please contact: Hilary Sutcliffe - hilary@societyinside.com - +44 (0) 7799 625 064; Conrad von Kameke - conrad@bioinnovatorseurope.org - +49 151 254 96 239
