
Don’t trust technology and AI? This expert explains why


Why is there still a lack of trust in AI? Image: Unsplash/Hitesh Choudhary

David Elliott
Senior Writer, Forum Agenda

This article is based on an interview with Azeem Azhar.

  • Trust in new technologies such as AI is not growing as fast as the tools themselves are evolving.
  • Citizen participation could help redress the balance, says tech expert Azeem Azhar.
  • The World Economic Forum’s new whitepaper, Principles for the Future of Responsible Media in the Era of AI, puts forward five principles a responsible media industry should adopt to navigate the new age of artificial intelligence.

Are you unsure about AI and what it means for our future? If so, this is likely down to the speed of change and a perceived lack of control, according to one expert.

Azeem Azhar is the founder of Exponential View, a research group that studies the impact of technology on business, politics and society. He is also co-chair of the World Economic Forum’s Global Future Council on the Future of Complex Risks. With more than two decades’ experience in technology as an entrepreneur, investor and advisor, he has observed changes in the industry that he says are foundational to our relationship with the revolutionary tools emerging today.


“If you've been following the tech industry for a long time like I have, it used to be about gadgets," he says.

"But at some point in the past decade, [tech companies] became more than just creators of digital candies – whether it's a new calendaring app or a new phone or cloud service. They actually became providers of goods that in some sense look like public goods but are privately produced."

This shift, along with questions about the power relationship we have with tech companies, is a key driver of distrust in technology, he adds.

Trust and technology

Chart: most institutions are not trusted enough to introduce innovations to society; business is the most trusted to integrate innovation. Image: 2024 Edelman Trust Barometer

A recent report on trust, compiled by public relations consultancy Edelman, found that most institutions are not trusted to introduce innovations to society. Of business, NGOs, government and media, business is the most trusted to ensure innovations are safe, understood and accessible – but at 59%, it still falls below the 60% threshold above which the report considers an institution trusted.

Media is the least trusted of the four, an issue tackled in the Forum’s new whitepaper, Principles for the Future of Responsible Media in the Era of AI, which puts forward five principles that a responsible media industry should adopt “to help relevant stakeholders navigate the new age of artificial intelligence”.

However, the issue of trust and technology is perhaps not as straightforward as sometimes described. "We do trust technology,” Azhar says. “We trust our hammer; we trust our lightbulb; we trust our towel. And these things may sound trivial but they are, in fact, technology."

The question is, then, why do many people feel like certain classes of technology are a threat? Azhar argues that this is rooted in our level of understanding of how that technology was developed, the speed of change, and the amount of control we feel we have over new developments.

"We don't feel we have agency over its shape, nature and direction," he says. "When technology starts to change very rapidly, it forces us to change our own beliefs quite quickly because systems that we had used before don't work as well in the new world of this new technology."

Tech has become a critical industry

In the time that Azhar has been observing technology, it has gone from a fringe sector to a critical industry. But, he argues, the way that tech firms engage with the state isn't transforming at the same pace and in the same manner.

"These big tech companies are the holders of power in a couple of really important ways. The technologies on the one hand are infrastructural, so they're like the freeways and the highways and the sewage systems and the gas utility and the electrical system. And that's the lower end of what these firms offer in terms of cloud services.

"But they also, in different ways, control the interface, the top layer through which we look at the world. We have no way of participating in a legitimate process that gives us some sense that we delegated these controls to someone we trust to look after them for us."


Citizen participation can restore trust

Many companies working on technology such as generative AI are already inviting public opinion to ensure their technologies are beneficial for people and society. Microsoft, for example, uses a variety of methods to learn from its diverse user base how it can optimize its products.

For Azhar, an additional route could involve forming ‘citizen assemblies’ to help shape the direction new technology takes. He says there are hundreds of examples of this approach being used across a variety of areas, citing the citizen juries convened in Ireland and France to examine issues such as abortion and euthanasia law.

"Rather than throwing this to a referendum of the entire population, which will always be polarizing, or pushing it into the legislature in a traditional way, which will involve party jostling, they established a process in both countries – about a decade apart – of a citizen's jury.

"Both heard evidence from experts and heard evidence from people who had personal experience. And through that, the juries came up with a statement, which in a sense was a statement of policy, a statement of values that represented the society."

When it comes to technology, such citizen juries could look at questions such as the extent to which driving regulations should be adapted for autonomous vehicles, or questions around copyright attribution and intellectual property with AI.

"One of the reasons I'm excited about this sort of approach is that I think it really builds trust,” he says.


