
Why banks will need to prepare for robots going rogue


Risks could spiral if a machine started making inappropriate investment recommendations. Image: REUTERS/Tyrone Siu

Elizabeth St-Onge
Partner, Financial Services, Oliver Wyman
Ege Gürdeniz
Principal, Digital, Technology, and Analytics, Oliver Wyman

Banks are rolling out machine-learning applications to handle all manner of tasks once reserved for humans, from customer service to automated investment picking. But are they ready to clean up the mess created if the robots go rogue?

While financial services firms have sturdy structures in place to police human misconduct – and have expanded them in recent years to cover social media and other new technologies – machine misconduct is another matter. Standing at the crossroads of compliance, risk management, human resources, and technology, the management of machine conduct has no natural home in most banks’ organizational structures.

This needs to change if banks hope to tap the incredible potential of machine-learning applications. Used correctly, these technologies can deliver significant benefits to banks and their customers alike, providing better customer insights and solutions and greater efficiency across the entire firm, from the customer interface to back-office functions.

But banks also need to scrutinize the ethical ramifications of machine learning applications just as aggressively as they vet the backgrounds, ethics, and compatibility of job applicants. One bad bot can harm a bank’s reputation and potentially dent revenue. Since machine misconduct is purely a digital phenomenon, problems often spread instantly – causing chain reactions that can affect organizations and customers on a massive scale.

Even technology giants have stumbled in the machine-learning arena. Apple’s Siri voice assistant recently defined the word “mother” in an inappropriate way, while Google’s photo app made a racist blunder.

So how can banks beef up their machine risk management while still fostering innovation and tapping into machine learning’s immense promise?

First, they need to create robust machine-development and data-governance standards for their machine-learning efforts. That starts with an inventory of all such applications running throughout the company. At many banks, individual teams roll out new applications in isolation; what is needed instead is a firm-wide view.
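
For illustration only, the sketch below shows what a firm-wide inventory might look like in code: every machine-learning application is recorded in a single central registry rather than tracked team by team. The field names and the example entry are assumptions made for this sketch, not a prescribed standard.

from dataclasses import dataclass

@dataclass
class MLApplication:
    # One entry in a firm-wide inventory of machine-learning applications.
    name: str
    business_unit: str
    purpose: str
    data_sources: list      # e.g. ["customer profiles", "market data"]
    model_owner: str        # the individual accountable for the model's behavior
    approved: bool = False  # set only once the review-and-approval process has completed

# A single central registry gives the firm-wide view, instead of each team
# tracking its own applications in isolation.
registry = {}

def register(app):
    if app.name in registry:
        raise ValueError(f"{app.name} is already registered")
    registry[app.name] = app

register(MLApplication(
    name="robo-advisor-v1",
    business_unit="Wealth Management",
    purpose="Automated investment recommendations",
    data_sources=["customer profiles", "market data"],
    model_owner="j.doe",
))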

Next, banks must dive headlong into the data. They already have a deep understanding of market and other data that flow in and out every day, but machine-learning applications are introducing vast quantities of new types of social media and customer-interface data that need to be catalogued and monitored. These new data forms require the same level of governance as trading and other financial data. Individuals or teams must be relentless in screening out anything that could bias a machine-learning application’s results.
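
As a minimal sketch of one such screen, the code below flags a new data feed in which a single group dominates before that feed is used for training. The attribute name, the example records, and the threshold are illustrative assumptions; real bias screening would go far beyond simple representation counts.

from collections import Counter

def representation_report(records, attribute):
    # Share of records per value of a sensitive attribute.
    counts = Counter(r.get(attribute, "missing") for r in records)
    total = sum(counts.values())
    return {value: n / total for value, n in counts.items()}

def flag_imbalance(records, attribute, threshold=0.8):
    # True if one group dominates the feed beyond the threshold: a simple
    # warning sign that a model trained on it could produce skewed results.
    shares = representation_report(records, attribute)
    return max(shares.values()) > threshold

feed = [{"gender": "F"}] * 9 + [{"gender": "M"}]
if flag_imbalance(feed, "gender"):
    print("Review this feed before use:", representation_report(feed, "gender"))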

Before a new application is introduced, it should go through a review and approval process that balances the need for proper risk management across the firm with the need to promote innovation. Each application has the potential to introduce new data and decisions into the ecosystem that could corrupt other functions.
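
A minimal version of such a gate might look like the sketch below, in which an application can only be cleared for deployment once every governance function has signed off. The set of required sign-offs is an assumption made for illustration, not a recommended standard.

REQUIRED_SIGNOFFS = {"model_risk", "compliance", "conduct", "data_governance"}

def approve_for_deployment(app_name, signoffs):
    # Block deployment until every governance function has reviewed the application.
    missing = REQUIRED_SIGNOFFS - set(signoffs)
    if missing:
        print(f"{app_name} blocked; awaiting sign-off from: {', '.join(sorted(missing))}")
        return False
    print(f"{app_name} approved for deployment")
    return True

# Example: two reviews are still outstanding, so the application stays blocked.
approve_for_deployment("robo-advisor-v1", {"model_risk", "compliance"})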

Banks must also establish accountability for machine mishaps. They already have long-standing procedures for employees: the human resources function governs behavior and other ethical considerations; compliance makes sure company and regulatory rules are followed; conduct teams govern interactions with customers; and risk teams make sure the products being sold don’t put the firm in peril.

Similarly, banks need a taxonomy for machine-learning applications that spells out the roles, responsibilities, and procedures for governing and managing the risks associated with each type of machine. Without one, risks can spiral if, say, a new machine were to start making inappropriate investment recommendations to customers. Fingers would be pointed at the technology team, who might in turn deflect blame to the sales team or the model risk management team. The compliance group and others might not get involved at all.

To ensure banks can respond appropriately, they must boost the level of technological expertise inside each governance function, from risk management to compliance and human resources. Banks need to add data scientists and other technologists in these areas so that the right questions are being asked and the oversight is informed.

Finally, the use of machine learning for decision-making raises ethical considerations, and senior management needs to be actively involved in developing the framework that addresses them.

Machine-learning applications can enable banks to create value for their customers, employees, shareholders, and society in new ways. But banks must be aware of the risks of machine learning and address them quickly and systematically. Without proper governance, it won’t be long until a machine-learning disaster with major ethical, legal, and financial consequences unfolds.
