The ultimate tech age - a book extract

Lars Thinggaard

This is an extract from the World Economic Forum's Book Club pick for July 2020 - Tech for Life: Putting Trust Back in Technology by Lars Thinggaard and Jim Hagemann Snabe. Join our Book Club to discuss.

Two hundred years ago, in the throes of the first Industrial Revolution, the north of England was home to a vociferous and often violent group of discontented textile workers. They were called the Luddites.

The Luddites attacked the machinery which was endangering their livelihoods. Without the power of universal suffrage, it was the sole form of protest they thought would attract the attention of the powerful in business and government. They were right. At one point, the British army had more soldiers dedicated to dealing with the Luddites than it had actively engaged in the Iberian Peninsula during the Napoleonic Wars. Protestors were punished harshly. Some were shot, some executed, others imprisoned and deported. The movement flickered and quickly died.

The Luddites did not blindly oppose technological advancement – though that is what the term is now largely understood to mean – but simply felt powerless in the face of progress. Destroying machinery was, they believed, their only viable course of action. They wanted some power at the negotiating table. The historian Eric Hobsbawm described this as “collective bargaining by riot”.

The Luddites of northern England were not unique. Soon after, French tailors launched violent protests against Barthélemy Thimonnier, the inventor of the sewing machine.

Similar protests appear as footnotes throughout the history of technological advancement. More recently, there is Neo-Luddism, which describes itself as “a leaderless movement of passive resistance to consumerism and the increasingly bizarre and frightening technologies of the Computer Age.”

Such movements are founded on fear. For those involved, the fear is genuine and grounded in their experience of reality. But fearful footnotes should not divert us from the universal narrative: technology is the animating spirit of progress in all forms of human life and endeavor. Genies cannot be put back into bottles. What has been discovered cannot be undiscovered.

We live in an era distinguished by wide-ranging and rapid technological progress. Software is the technology at the core of this. Software code now lies behind virtually every business. It is the lubricant of the tech growth engine. And what an engine! Every single day, 2.5 quintillion bytes of data are generated globally. Ninety percent of the data in the world has been created in the last two years. There is more of it, and it is arriving faster than ever before. The time it takes a medium or technology to reach 50 million people keeps shrinking: radio, 38 years; television, 13 years; the internet, four years; Facebook, 3.5 years; Instagram, six months; Angry Birds, 35 days.

Look around.

Artificial intelligence (AI) is changing the world. The creation of intelligent machines – allowing for speech recognition, facial recognition, problem-solving, machine learning, autonomous robots and much more – is rapidly expanding what we can achieve. Machine learning in particular, where massive amounts of data can be processed by ever-increasing computational power to find patterns not discernible to humans, is pushing the boundaries.
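
To see that idea in miniature, here is a small illustrative sketch in Python (ours, not the book's): a model is shown labelled examples and learns a decision rule that nobody programmed in by hand. The dataset and model choices are assumptions made purely for illustration.

# A minimal sketch of machine learning as pattern-finding: the program
# contains no rules about digit shapes; the "knowledge" is fitted from data.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)  # 8x8 pixel images of handwritten digits
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=5000)  # weights are fitted to the examples
model.fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2%}")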

Originally coined in 1955 in a research proposal by John McCarthy at Dartmouth College, the term “artificial intelligence” was meant to encapsulate the “study and design of intelligent machines”: machines which could learn from their environments, build up their own knowledge, and solve problems in their own way. The official start of the field is generally agreed to be 1956, when the project outlined in the research proposal was set in motion.

Since artificial intelligence emerged in the 1950s, innovators and researchers have filed applications for nearly 340,000 AI-related inventions and published over 1.6 million scientific publications. AI-related patenting is growing rapidly: over half of the identified inventions have been published since 2013.

Stanford University’s AI Index charts some of the developments in AI. They are astonishingly many and varied. In recent years we have seen these milestones:

• By 2016, the error rate of automatic labelling of ImageNet had declined from 28 percent in 2010 to less than three percent. Human performance is about five percent.

• In March 2016, the AlphaGo system developed by the Google DeepMind team beat Lee Sedol, one of the world’s greatest Go players, 4–1. DeepMind then released AlphaGo Master, which defeated the top-ranked player, Ke Jie. AlphaGo Zero then learned entirely from self-generated data and, after three days of training, beat the original engine (AlphaGo Lee) 100–0.

• A 2017 Nature article described an AI system trained on a data set of 129,450 clinical images of 2,032 different diseases and compared its diagnostic performance against board-certified dermatologists. Researchers found the AI system capable of classifying skin cancer at a level of competence comparable to the dermatologists.

• In 2017, Microsoft and IBM both achieved performance within close range of “human-parity” speech recognition in the limited Switchboard domain.

• In 2018, a Microsoft machine translation system achieved human-level quality and accuracy when translating news stories from Chinese to English. The test was performed on newstest2017, a data set commonly used in machine translation competitions.

• In 2018, Google developed a deep learning system that can achieve an overall accuracy of 70 percent when grading prostate cancer in prostatectomy specimens. The average accuracy achieved by US board-certified general pathologists in the study was 61 percent. Additionally, of ten high-performing individual general pathologists who graded every sample in the validation set, the deep learning system was more accurate than eight.

The AI story is constantly unfolding. As we were writing, Microsoft invested $1 billion in OpenAI, the initiative co-founded by Elon Musk looking into the transparency of algorithms. The share of jobs requiring AI skills has grown 4.5 times since 2013, and machine learning, deep learning and natural language processing are now the three most in-demand skills on the job site Monster.com.
