What does ancient China tell us about today's technology?

The link between algorithms and politics: terracotta warriors in the No. 1 pit of the Museum of Qin Terracotta Warriors and Horses in Xi’an, Shaanxi province, China. Image: REUTERS/Charles Platiau

Mark MacCarthy
Faculty, Georgetown University

As machine-learning algorithms come to dominate decision-making in business, politics, and society, the pressure to make more personal data available will steadily increase, and privacy protections will be eroded. If we do not take the reins of these new technologies, they could lead us toward a political system we did not choose.

Around 1200 BC, the Shang Dynasty in China developed a factory system to build thousands of huge bronze vessels for use in everyday life and ritual ceremonies. In this early example of mass production, the process of bronze casting required intricate planning and the coordination of large groups of workers, each performing a separate task in precisely the right order.

A similarly complex process went into fashioning the famous army of terracotta warriors that Qin Shi Huang, China’s first emperor, unveiled one thousand years later. According to the Asian Art Museum in San Francisco, the statues “were created using an assembly production system that paved the way for advances in mass production and commerce.”

Some scholars have speculated that these early forms of prescriptive-work technologies played a large role in shaping Chinese society. Among other things, they seem to have predisposed people to accept bureaucratic structures, a social philosophy emphasizing hierarchy, and a belief that there is a single right way of doing things.

When industrial factories were introduced in Europe in the nineteenth century, even staunch critics of capitalism such as Friedrich Engels acknowledged that mass production necessitated centralized authority, regardless of whether the economic system was capitalist or socialist. In the twentieth century, theorists such as Langdon Winner extended this line of thinking to other technologies. He thought that the atom bomb, for example, should be considered an “inherently political artifact,” because its “lethal properties demand that it be controlled by a centralized, rigidly hierarchical chain of command.”

Today, we can take that thinking even further. Consider machine-learning algorithms, the most important general-purpose technology in use today. Using real-world examples to mimic human cognitive capacities, these algorithms are already becoming ubiquitous in the workplace. But, to capitalize fully on these technologies, organizations must redefine human tasks as prediction tasks, which are more suited to these algorithms’ strengths.

A key feature of machine-learning algorithms is that their performance improves with more data. As a result, the use of these algorithms creates a technological momentum to treat information about people as recordable, accessible data. Like the system of mass production, they are “inherently political,” because their core functionality demands certain social practices and discourages others. In particular, machine-learning algorithms run directly counter to individuals’ desire for personal privacy.
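That scaling claim can be seen in miniature. The sketch below is illustrative only, not anything from this article: assuming the scikit-learn library and purely synthetic data, it trains the same simple classifier on progressively larger samples and reports test accuracy, which typically climbs as the training set grows.

```python
# Minimal, illustrative sketch (assumes scikit-learn; synthetic data only):
# the same model, trained on progressively more data, tends to predict better.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in for "recordable, accessible data about people".
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for n in (100, 1_000, len(X_train)):  # growing training sets
    model = LogisticRegression(max_iter=1_000).fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:>5} records -> test accuracy {acc:.3f}")
```

Real systems are vastly more complex, but the incentive structure is the same: more records, better predictions.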

A system based on the public availability of information about individual community members might seem amenable to communitarians such as the sociologist Amitai Etzioni, for whom limitations on privacy are a means to enforce social norms. But, unlike communitarians, algorithms are indifferent to social norms. Their only concern is to make better predictions, by transforming more and more areas of human life into data sets that can be mined.

Moreover, while the force of a technological imperative turns individualist Westerners into accidental communitarians, it also makes them more beholden to a culture of meritocracy based on algorithmic evaluations. Whether it is at work, in school, or even on dating apps, we have already become accustomed to having our eligibility assessed by impersonal tools, which then assign us positions in a hierarchy.

To be sure, algorithmic assessment is not new. A generation ago, scholars such as Oscar H. Gandy warned that we were turning into a scored-and-ranked society, and demanded more accountability and redress for technology-driven mistakes. But, unlike modern machine-learning algorithms, older assessment tools were reasonably well understood. They made decisions on the basis of relevant normative and empirical factors. For example, it was no secret that accumulating a lot of credit card debt could hurt one’s creditworthiness.

By contrast, new machine-learning technologies plumb the depths of large data sets to find correlations that are predictive but poorly understood. In the workplace, algorithms can track employees’ conversations, where they eat lunch, and how much time they spend on the computer, on the telephone, or in meetings. With that data, they develop sophisticated models of productivity that far surpass our commonsense intuitions. In an algorithmic meritocracy, whatever the models demand becomes the new standard of excellence.
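The contrast with the old credit-score rule can be made concrete. In this hypothetical sketch (again scikit-learn; the feature names, the hidden rule, and the data are all invented for illustration), a model learns to predict a "productivity" label from behavioral signals. Its feature importances reveal which signals it leans on, but not why they matter: the correlation is predictive yet opaque.

```python
# Hypothetical sketch (assumes scikit-learn and NumPy): every feature name,
# the hidden rule, and the data below are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
features = ["lunch_minutes", "meeting_hours", "chat_messages", "screen_time"]
X = rng.normal(size=(2_000, len(features)))
# An arbitrary hidden interaction stands in for whatever pattern real data holds.
y = (0.9 * X[:, 2] - 0.4 * X[:, 0] * X[:, 3] > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, score in zip(features, model.feature_importances_):
    print(f"{name:>14}: importance {score:.2f}")
# The importances show which signals the model uses, not why they predict:
# nothing here is as legible as "more debt lowers your credit score".
```

The point is not the specific model but the epistemic gap: such a system can outperform commonsense intuition while remaining unable to justify itself in human terms.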


Still, technology is not destiny. We shape it before it shapes us. Business leaders and policymakers can develop and deploy the technologies they want, according to their institutional needs. It is within our power to cast privacy nets around sensitive areas of human life, to protect people from the harmful uses of data, and to require that algorithms balance predictive accuracy against other values such as fairness, accountability, and transparency.

But if we simply follow the natural flow of algorithmic logic, a more meritocratic and communitarian culture will become inevitable. And this steady transformation will have far-reaching implications for our democratic institutions and political structures. As the China scholars Daniel A. Bell and Zhang Weiwei have noted, the major political alternative to Western liberal-democratic traditions is the communitarian system that continues to evolve in China.

In China, collective decisions are not legitimated by citizens’ explicit consent, and people generally have fewer enforceable rights against the government, particularly when it comes to surveillance. An ordinary Chinese citizen’s role in political life is largely limited to participation in local elections. The country’s leaders, meanwhile, are selected through a meritocratic process, and consider themselves custodians of the people’s welfare.

Liberal democracies are not likely to shift entirely to such a political system. But if current trends in business and consumer culture continue, we might soon have more in common with Chinese meritocratic and communitarian traditions than with our own history of individualism and liberal democracy. If we want to change course, we will have to put our own political imperatives before those of our technologies.
