Four principles for leadership in an uncertain world

The new normal? A black swan flies at sunset in Sydney's Centennial Park. Image: REUTERS/Daniel Munoz

Lee Howell
Managing Director, World Economic Forum Geneva

“It is better to know how to learn than to know.” - Dr Seuss

The year 2016 provided ample evidence that leaders rely far too much on expert predictions and probabilistic assumptions, judging by the dismay expressed after the Brexit vote in the United Kingdom and the presidential election in the United States. If you have a populist bent, perhaps this is just more evidence of elites living in a bubble outside the “real world”. For the more technocratic-minded, however, these were “black swan” events that signal a new normal in world affairs.

Karl Popper first wrote about the “black swan” in The Logic of Scientific Discovery to demonstrate the problem of inductive reasoning and to introduce the notion of falsifiability. Decades later, Nassim Taleb made it the risk metaphor with which we are all familiar when he wrote The Black Swan: The Impact of the Highly Improbable, just before the 2008 global financial crisis. Taleb described a “black swan” event as having three defining characteristics:

  • Rarity: “It is an outlier, as it lies outside the realm of regular expectation because nothing in the past can convincingly point to its possibility.”
  • Extreme impact: “It carries an extreme impact (unlike the bird).”
  • Retrospective predictability: “In spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.”

It is debatable whether the Brexit vote or the US election results measure up to Taleb’s criteria for a “black swan” in the current context. The notion that a future systemic shock, perhaps a massive cyberattack, can be averted does not appear plausible in an interdependent and interlinked global economy. Scientists regularly warn us that natural disasters are on track to increase with unabated climate change. And the recent Christmas market tragedy in Berlin and the nightclub attack in Istanbul remind us that dreadful acts of terrorism and violent extremism are less rare and less remote, at least in an era of ubiquitous social media. Simply honing your populist or contrarian instincts will not serve you any better in terms of foresight. Whether you are convinced that the world is spiralling in the wrong direction or believe instead that a positive transformation is just beginning, be prepared for more uncertainty in the year ahead.

Why anticipate greater uncertainty? First, we live in a hyper-connected world and somehow expect even higher levels of interoperability across the multiple systems that affect our daily lives. This expectation is what drives the popular narrative about the future impact of the internet of things, big data and artificial intelligence. However, economists Ian Goldin and Mike Mariathasan assert that “more than simple connectivity, our increasing interdependence represents complexity” and that therefore “the world today should be defined as a complex system.” They argue further that, because of this connectivity, it is “increasingly difficult to identify the root cause of a hazard or even the channels of its transmissions”, as globally accessible systems grow ever more complex.

Now consider that in the late 1990s the sociologist Charles Perrow observed that advanced technology deployed to ensure safety, such as warning systems and safeguards, fails because system complexity makes failure inevitable or “normal”. His “Normal Accident Theory” was premised on the notion that a system’s susceptibility to accidents can be determined by examining two of its dimensions: interactive complexity and tight versus loose coupling. Perrow highlighted a paradox: if prescribed safety measures increase the complexity of a tightly coupled system that is also exposed to interactive complexity, then the likelihood of an accident increases. Taken together, these theories are why I expect more global uncertainty ahead.

Figure 1: Interactions, Complexity and Coupling (Systems). Image: Perrow, Figure 9.1, Interactions/Couplings Chart (1999: 327)
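Perrow’s paradox can be made concrete with a toy reliability calculation. The sketch below is my own illustration, not Perrow’s model, and the 99% component reliability is an assumption: in a tightly coupled system, each added safeguard is one more component whose failure can cascade, so the system works only if every component works; in a loosely coupled system, safeguards act as independent redundancy.

```python
# Toy illustration of Perrow's coupling paradox (an assumption-laden
# sketch, not Perrow's own model). Each safeguard is assumed to work
# with probability 0.99.
P_COMPONENT_OK = 0.99  # assumed reliability of a single safeguard

def tightly_coupled_reliability(n: int) -> float:
    """Series system: one failure cascades, so all n safeguards must work."""
    return P_COMPONENT_OK ** n

def loosely_coupled_reliability(n: int) -> float:
    """Redundant system: it fails only if all n independent safeguards fail."""
    return 1 - (1 - P_COMPONENT_OK) ** n

for n in (1, 10, 50, 100):
    print(f"{n:3d} safeguards | tight: {tightly_coupled_reliability(n):.3f}"
          f" | loose: {loosely_coupled_reliability(n):.6f}")

# With 100 tightly coupled safeguards the system works only about 37% of
# the time (0.99 ** 100), even though each part is 99% reliable: adding
# safety devices to a tightly coupled system can make accidents MORE likely.
```

This series-versus-parallel arithmetic is a crude stand-in for Perrow’s richer notion of interactive complexity, but it shows how prescribed safety measures can raise, rather than lower, the odds of a “normal” accident.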

Second, I anticipate greater uncertainty because we overlook the fact that socio-economic systems are themselves highly complex. Evgueni Ivantsov cautions that social systems “inherit all the complexity of both non-organic and organic systems but also produce the new complexity of a human mind, social behaviour and economic life.”

Third, despite this complexity, we nonetheless overlook how easily our perceptions (and therefore our decisions) are distorted by our cognitive biases. As the late Harper Lee observed in To Kill a Mockingbird: “[P]eople generally see what they look for, and hear what they listen to.” We are still developing the heuristics, or thinking strategies, needed to navigate this confluence of connectivity, systems and complexity. Since we cannot suppress our aversion to ambiguity, here are four things that can help us cope with uncertainty in a global context – you might consider one or all of them as candidates for your list of New Year resolutions.

Learn the difference between a risk and an uncertainty

The notion of risk is intuitive, yet it can still be difficult to define. The word’s etymological origin in English dates to 1661, when it meant “hazard, danger: exposure to mischance or peril.” Advances in the mathematics of probability slowly changed our understanding of risk in the 17th and 18th centuries. Long before the enthusiasm over big data, the initial excitement around statistical science was its demonstration that “what appears to be mere chance is the measure of our ignorance.” However, this mathematical measurement of risk was tested early in the 20th century by the First World War and the Great Depression, leading to a conceptual distinction between risk and uncertainty.

Risk, Uncertainty and Profit, written by Frank Knight in 1921, is often cited as the origin of the notion that risk and uncertainty are separate concepts: a risk is a measurable uncertainty, whereas a true uncertainty cannot be measured and therefore cannot be characterized as a risk. The two concepts have since been refined further: Larry Epstein and Tan Wang make the additional distinction that risk entails decision-making where “probabilities are available to guide choice”, while uncertainty is when “information is too imprecise to be summarized by probabilities.”

There are also two types of uncertainty: aleatory and epistemic. Aleatory uncertainty arises from a situation of pure chance, such as the roll of a die; epistemic uncertainty arises from a problem situation where judgement is required for its resolution. From a practical perspective, it is helpful to think of risk and uncertainty as lying on a continuum when contemplating how to deal with both.
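To make the distinction concrete, here is a minimal sketch (my own illustration with assumed numbers, not an example from the literature cited above): the randomness of a fair die is aleatory and irreducible, while uncertainty about a coin’s unknown bias is epistemic and shrinks as evidence accumulates.

```python
import random

random.seed(42)

# Aleatory uncertainty: pure chance. More data describes the distribution
# better, but the next roll of a fair die remains unpredictable.
rolls = [random.randint(1, 6) for _ in range(10_000)]
print("mean of fair-die rolls:", sum(rolls) / len(rolls))  # close to 3.5

# Epistemic uncertainty: ignorance rather than chance. The coin below has
# a fixed but unknown probability of heads; the observer's uncertainty
# about it shrinks with evidence (a simple Beta-Bernoulli update).
TRUE_P_HEADS = 0.7   # unknown to the observer; assumed for the simulation
alpha, beta = 1, 1   # uniform prior over the unknown bias

for _ in range(1000):
    heads = random.random() < TRUE_P_HEADS
    alpha += heads
    beta += not heads

print("estimated bias after 1000 flips:", round(alpha / (alpha + beta), 3))
# The estimate converges toward 0.7: epistemic uncertainty is reducible
# with evidence; aleatory uncertainty is not.
```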

Learn how to understand human error

“Blame games” are a popular pastime in politics and journalism, and 2016 was an exceptional season for them in Europe and the US. We seem hardwired to single out human error as the cause of so many calamities. An easy thought experiment to demonstrate this is to ask “remain” voters in the UK what happened in the Brexit referendum. Paradoxically, knowing how to study human error becomes a critical leadership skill when responding to future uncertainty.

As we witness a worsening humanitarian crisis in Syria, investigating the international security failures at its origin will be among the responsibilities of leaders in 2017. How incoming leaders like US President-elect Donald Trump review the past performance of their predecessors gives the best hint of what they believe is responsive and responsible leadership. But blaming outgoing President X or Prime Minister Y for recent failures will not guarantee a better outcome in the near future. Fixating on human error represents an old view of how things work in complex systems.

Safety expert Sidney Dekker argues that you either see “human error as the cause of a mishap” or see “human error as the symptom of deeper trouble”. He also asserts that “to understand failure, you must first understand your reactions to failure” and, more importantly, “the more you react, the less you understand.” Understanding human error is also critical in avoiding the effect of hindsight bias on future decisions.

Learn how hindsight is harmful

Mark Twain is reputed to have said: “History does not repeat itself, but it does rhyme.” It remains sage advice about the nature of international relations: we have a poor track record of preventing global mishaps, yet their causes often appear obvious upon reflection. Hindsight, however, is arguably the most troublesome of the cognitive biases that affect our decision-making, particularly when facing rising uncertainty about the future.

A leader’s interpretation of a recent failure will inevitably shape his or her future strategy – this is why, for example, there is so much anxiety about what the US will do in Syria and Iraq during the Trump administration. In studying aviation disasters, Dekker observed that most reactions to past failures share the following four common characteristics:

1) Retrospective: “Reactions arise from our ability to look back on a sequence of events, of which we know the outcome.”

2) Counterfactual: “They lay out in detail what people could or should have done to prevent the mishap.”

3) Judgemental: “They judge people (e.g. not taking enough time, not paying enough attention, not being sufficiently motivated) for supposed personal shortcomings.”

4) Proximal: “They focus on those people who were closest in time and space to the mishap, or to potentially preventing it.”

But hindsight bias is pernicious: it can transform a difficult and complex reality into a simplistic and linear one, where difficult decisions are framed as merely binary choices, yes or no. This illusion is dangerous because it blurs the distinction between what you now know the situation was actually like and what people understood it to be at the time. This, in turn, weakens our resilience to similar shocks in the future. There are three errors of analysis, anchored in hindsight bias, that should be avoided when looking back at the events of 2016:

1) Predetermined outcome: “We think that a sequence of events inevitably led to an outcome. We underestimate the uncertainty people faced at the time, or do not understand how very unlikely the actual outcome would have seemed.”

2) Linear sequencing: “We see a sequence of events as linear, leading nicely and uninterruptedly to the outcome we now know about. Had we seen the same situation from the inside, we would have recognized the possible confusion of multiple possible pathways…”

3) Oversimplification: “We oversimplify causality. When we are able to trace a sequence of events backwards (which is the opposite of how people experienced it at the time) we easily couple “effects” to preceding “causes” (and only those causes)…”

Along with these mistakes, there are other important cognitive biases related to decision-making when facing uncertainty.

  • Confirmation bias: we place extra value on evidence consistent with a favoured belief and not enough on evidence that contradicts it. We fail to search impartially for evidence.
  • Anchoring and insufficient adjustment: we root our decisions in an initial value and fail to sufficiently adjust our thinking away from that value.
  • Groupthink: we strive for consensus at the cost of a realistic appraisal of alternative courses of action.
  • Egocentrism: we focus too narrowly on our own perspective to the point that we can’t imagine how others will be affected by a policy or strategy. We assume that everyone has access to the same information we do.

Source: adapted from Beshears & Gino

We need to remind ourselves to make our best efforts to identify and control these cognitive biases. Even so, leaders still run the risk of treating a problem as having a known solution when the challenge is in fact a novel one.

Learn the difference between a technical and an adaptive challenge

Ronald Heifetz and Marty Linsky have identified the single biggest failure of leadership as treating adaptive challenges like technical problems. Their research suggests two types of leadership challenge: technical and adaptive. When the problem definition, solution and implementation are clear, they categorize the change as technical. In contrast, an adaptive change requires a novel solution and new learning. Their conclusion is that adaptive change must come from the collective intelligence of an organization, which must learn its way towards solutions rather than simply search for known ones.

Most people today were born during or after the Cold War, when US leadership in world affairs was taken for granted to some degree. I am certain that the emergence of a multi-polar world will present many adaptive challenges for global governance. This prospect is perhaps the source of much of the anxiety about the future state of the world. In the absence of a novel hypothesis or an original paradigm, leaders will need to learn new means and methods, and possibly engage in experimentation. This is another source of anxiety, as the prospect of muddling through such adaptive challenges as climate change, cybersecurity or the Fourth Industrial Revolution is disconcerting, to say the least.

Navigating global uncertainty

Learning the four things mentioned above will help orient your decision-making in the face of global uncertainty, but your aim should not be to predict the future. Returning to Brexit and the US elections, it is worth noting that a decade ago Philip Tetlock demonstrated that it is nearly impossible to achieve accurate, long-term political forecasting.

From a behavioral perspective, his key insight was that political analysts were not only overconfident about what they knew about the future, but also reluctant to change their minds in response to new evidence. It was therefore not surprising that the accuracy of their long-term forecasts was no better than chance. Put another way, political forecasters who are self-critical and avoid simple heuristics are relatively better at assigning probabilities to future outcomes than their opposites. Tetlock’s research has also found that people who are younger and of lower status in an organization (versus older and higher status) are more enthusiastic about assessing the accuracy of probability judgements.
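Tetlock scored forecasters by comparing their stated probabilities with what actually happened; the Brier score is the standard measure for this. The sketch below uses made-up forecasts (illustrative numbers, not Tetlock’s data) to show how an overconfident forecaster who asserts near-certainty, and is wrong half the time, is punished far more than a self-critical one who hedges but leans the right way.

```python
def brier_score(forecasts, outcomes):
    """Mean squared gap between stated probabilities and outcomes (1 or 0).
    Lower is better; always answering 0.5 scores exactly 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical events: 1 = it happened, 0 = it did not.
outcomes = [1, 0, 1, 1, 0, 0, 1, 0]

# The overconfident forecaster asserts near-certainty on every call
# (and gets half of them wrong)...
overconfident = [0.95, 0.95, 0.95, 0.05, 0.95, 0.05, 0.05, 0.05]
# ...while the self-critical forecaster assigns moderate probabilities,
# leaning the right way on each call.
self_critical = [0.70, 0.40, 0.65, 0.55, 0.30, 0.35, 0.60, 0.45]

print("overconfident:", round(brier_score(overconfident, outcomes), 3))  # 0.453
print("self-critical:", round(brier_score(self_critical, outcomes), 3))  # 0.144
```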

Umberto Eco, the late Italian academic and novelist, came to see the world as a harmless enigma. But he also observed that it was an enigma “made terrible by our own attempt to interpret it as though it had an underlying truth.” This snippet of philosophical wisdom is perhaps the most important of all to heed when dealing with uncertainty in the year ahead.
