Global Risks

Let's ask a different question. How risky is it not to develop emerging technologies?

Image: The DEKA Arm System, a robotic prosthetic arm for amputees approved by the U.S. Food and Drug Administration, shown in a DARPA handout photo released May 9, 2014. (REUTERS/DARPA/Handout via Reuters)

Andrew D. Maynard
Professor, School for the Future of Innovation in Society, Arizona State University, Author of Future Rising: A Journey from the Past to the Edge of Tomorrow

Take an advanced technology. Add a twist of fantasy. Stir well, and watch the action unfold.

It’s the perfect recipe for a Hollywood tech-disaster blockbuster. And clichéd as it is, it’s the scenario that we too often imagine for emerging technologies. Think superintelligent machines, lab-bred humans, the ability to redesign whole species – you get the picture.

The reality, of course, is that the real world is usually far more mundane: less “zombie apocalypse” and more “teens troll supercomputer; teach it bad habits.”

Looking through this year’s crop of Top Ten Emerging Technologies from the World Economic Forum (WEF), this is probably a good thing.

Image: Top 10 emerging technologies 2016 (World Economic Forum)

Since 2012, I’ve been part of a group of WEF advisers who help compile an annual list of emerging technologies that are poised to transform our lives. This year’s list includes autonomous vehicles, blockchain (the technology behind Bitcoin), next-generation batteries and a number of other technologies that are beginning to make their mark.

The list is aimed at raising awareness around potentially transformative technologies so that investors, businesses, regulators and others know what’s coming down the pike. It’s also an opportunity for us to think through what might go wrong as the technologies mature.

Admittedly, some of these technologies would stretch the imagination of the most creative of apocalyptic screenwriters – it’ll be a while, I suspect, before “Graphene Apocalypse” or “Day of the Perovskite Cell” hit the silver screen. But others show considerable potential for a summer scare-flick, including “brain-controlling” optogenetics and the mysterious sounding “Internet of Nano Things.”

Putting Hollywood fantasies aside, though, it’s hard to predict the plausible downsides of emerging technologies. Yet this is exactly what is needed if we’re to ensure they’re developed responsibly in the long run.

Tech problems, tech solutions

It’s tempting to ask what concrete harm technologies like those in this year’s top ten could cause, then simply figure out how to “fix” the problems. For instance, how do we ensure that “logical” self-driving cars safely share the road with less “logical” humans? Or how do we prevent bacteria that are genetically programmed to produce commercial chemicals from polluting the environment? These are risks that lend themselves to technological solutions.

But focusing on such questions can mask much more subtle dangers inherent in emerging technologies, threats that aren’t as amenable to technological fixes, and that we all too easily overlook.

For example, being infused with internet-connected nano-sensors that reveal your most intimate biological details to the world could present social and psychological risks that can’t be solved by technology alone.

Similar concerns arise around “open artificial intelligence (AI) ecosystems”: the next step up from systems like Amazon’s Echo, Apple’s Siri and Microsoft’s Cortana. Combining “listening” devices, cloud computing and the Internet of Things, machines are increasingly pairing the capacity to understand normal conversation with the ability to act on what they hear. This is a truly transformative technology platform. But what happens when these AI ecosystems begin to listen in on private conversations and share them with others? Or independently decide what’s best for you? These possibilities raise ethical and moral concerns that aren’t easily addressed by tech solutions alone.

Expanding our conception of what we value

One way to tease out the subtler possible impacts of emerging technologies is to think of risk as a threat to something of value – an idea that’s embedded in the somewhat new concept of Risk Innovation. This “value” depends on what’s important to different individuals, communities and organizations.

Health, wealth and a sustainable environment are clearly important “things of value” in this context, as are livelihood, and food, water and shelter. Threats to any of these align with more conventional approaches to risk – a health risk, for instance, can be understood as something that threatens to make you sick, and an environmental risk as something that threatens the integrity of the environment.

But we can also extend the idea of a threat to something we value to less conventional types of risk: threats to self-worth, for instance, or culture, sense of security, equity, even deeply held beliefs.

These touch on things that define us as individuals and communities, and get to the heart of what gives us a sense of purpose and belonging. In this way, relevant threats might include inequity or an eroded sense of self-worth from new tech taking away your job. Or anxiety over who knows what about you, and how they might use it. Or fear of becoming socially marginalized by the use of new technologies. Or even dread over sacrosanct beliefs – such as the sanctity of life, or the right to free choice – being challenged by emerging technological capabilities.

Threats like these aren’t easy to capture. Yet they have a profound impact on people – and as a consequence, on how new technologies are developed and used. Thinking more broadly about risk as a threat to value is especially helpful to understanding the possible undesired consequences of tech innovation, and how they might be avoided.

Risks of missing out on new technologies

This approach to risk also opens the door to considering the potential risks of not developing a technology. Beyond existing value, future value is also important to most people and organizations.

For instance, autonomous vehicles could eventually prevent tens of thousands of road deaths; optogenetics – using genetic engineering and light to manipulate brain cell activity – could help cure or manage debilitating neurological diseases; and materials like graphene could ensure more people than ever have access to cheap clean water. Not developing these technologies potentially threatens things that many people hold to be extremely valuable.

Of course, on the flip side, these technologies may also threaten what is important to some. Self-driving cars might undermine human responsibility, not to mention the enjoyment of driving. Optogenetics raises the possibility of involuntary neurological control. And graphene might harm some ecosystems if released into the environment in sufficient quantities.

By considering how emerging technologies potentially interact with what we consider to be important, it becomes easier to weigh the possible downsides of developing them – or at least developing them without due consideration – against those of either impeding their development, or not developing them at all.

The greatest risk of all

What emerges when risk is approached as a threat to value is a much richer way of thinking about how emerging technologies might affect people, communities and organizations, and how they can be developed responsibly. It’s an approach that forces us to realize that the consequences of developing new technologies are complex, and touch people in different ways – not all of them for the better. It’s not necessarily a comfortable reconceptualization – but looking at risk from this new angle does pave the way for technologies that benefit many people and disadvantage few, rather than the other way round.

In reality, unlike the simplicity of Hollywood blockbusters, the risks associated with emerging technologies are rarely clear-cut, and almost never straightforward. Yet they nevertheless exist. Every one of this year’s World Economic Forum top 10 emerging technologies has the potential to threaten something of value to some person or organization – whether undermining an established technology or business model, jeopardizing jobs, or influencing health and well-being.

These dangers are context-specific, often intertwined with each other, sometimes conflicting, and often balanced by the risks of not developing the technology. Yet understanding and addressing them is essential to realizing the long-term benefits that these technologies offer.

And here, perhaps, is the greatest risk – that either in our enthusiasm for developing these technologies, or our Hollywood-inspired fears of potential consequences, we lose sight of the value of developing new technologies that make our world a better place, not just a different one.

Published in collaboration with The Conversation.
