Why “do no harm” should be every tech investor's mantra

Protecting and defending human rights is everyone's responsibility, including tech investors.

Laura Okkonen
Investor Advocate, Access Now
Brett Solomon
Executive Director and Co-founder, Access Now
This article is part of: World Economic Forum Annual Meeting

  • Companies need to view the development, use and governance of their technologies through a human rights-focused lens.
  • Protecting and defending human rights is everyone’s responsibility, including tech investors.
  • Building on the UN Guiding Principles on Business and Human Rights (UNGPs), tech investors should conduct proper due diligence and impact assessments before deciding where to invest their money and push company leaders to act the same way.

In recent years, there has been plenty of progress around sustainable investment, with investors increasingly considering environmental, social and governance (ESG) factors when deciding what to fund. But there has been less focus on responsible investment's role in ensuring that the technologies we use every day respect human rights.

It’s time for tech investors and funders to recognize that they too are responsible for the potential human rights harms of digital platforms and services and to help hold the companies they bankroll to account.

Put your money behind human rights

Many technology companies have adopted the UN Guiding Principles on Business and Human Rights (UNGPs) as they seek to “do good” with the products and services they create. But before aiming to fix all of society’s problems, companies need to view the development, use and governance of their technologies through a human rights-focused lens, so as to “do no harm” first and foremost.

This is where investors can play a part. Building on the UNGPs, they should conduct proper due diligence and impact assessments before deciding where to invest their money, and push company leaders to act the same way if and when an investment does go ahead. Whether unintentionally or through the deliberate weaponization of certain technologies, poorly considered investments in tech portfolios can seriously harm groups of people who are already in vulnerable or risky situations.

Disclaimer: high rewards carry high risks

The human rights risks of technology are particularly acute and apparent when it comes to social media. Large online platforms have repeatedly failed to embed human rights in their development, to mitigate risks emerging from their systems over time or to remedy the harms caused as a result of their platforms. In situations of armed conflict and crisis or in countries with poor human rights records, this laissez-faire attitude can be life-threatening for at-risk people, as in Ethiopia, Syria, Israel and Palestine, and Myanmar. This is indifference at the very least and wilful discrimination at worst – and certainly does not satisfy the “do no harm” mantra.

Investors must take responsibility for the role their funding plays in enabling such harms and take action through strengthened due diligence, impact assessments and remedy mechanisms. After all, when even the UN has called out social media companies for being complicit in facilitating human rights violations, their conclusion extends to investors in such companies.

Similarly, while artificial intelligence (AI) and other forms of automated decision-making are often touted as silver bullets for a whole host of societal challenges – generating great excitement among investors as a result – these same technologies are prone to dangerous biases, causing serious harm both on and offline. With the potential applications – and implications – of emerging technologies evolving daily, both potential and existing investors should insist on human rights impact assessments (HRIAs) to ensure AI systems are developed and deployed in a rights-respecting manner.

Failing to consider human rights implications when investing in tech can have severe repercussions, as we've seen in cases involving government use of spyware. The misuse and abuse of surveillance technologies violate privacy, enable discrimination and fuel violence against at-risk populations, such as human rights defenders, journalists and activists. These rights-violating tools have been deployed by both authoritarian regimes and democratic governments. Investors cannot abdicate responsibility for such damage. While most companies in this sector are privately held, all tech investors should periodically review their portfolios for exposure to surveillance technologies, to ensure they are not directly or indirectly funding such tools.

Reform is needed to enable tech investor impact

Many investors have already shown a willingness to act in a rights-respecting way. During the 2022 Annual General Meetings (AGMs) season, shareholders at the largest tech companies filed proposals on various human rights issues, asking for stronger governance and oversight, meaningful transparency reporting on government censorship demands and rights-focused impact assessments.

However, such shareholders’ hands are tied by dual-class share structures, which limit shareholders’ power to push for meaningful change (compared with senior executives, usually company founders, who have outsized voting power). As a result, of the human rights proposals mentioned, none actually passed.

Investors must work to increase their oversight and accountability around human rights if they are to fulfil their responsibilities meaningfully. But for this to happen, efforts must go beyond the existing “responsible investor” community – all investors must step up.


A concentrated group of investors holds stakes across all the major tech companies and sets the example for the sector. If they were to systematically conduct ongoing human rights due diligence on the companies and projects in their portfolios, and engage with civil society to identify where they may be causing or contributing to human rights violations, they could spark a shift towards rights-respecting behaviour across the entire sector.

Protecting and defending human rights is everyone's responsibility, and that includes the organizations holding the purse strings. It is time for investors and funders to ask themselves, "Will this hurt people?" before opening their wallets, and to take responsibility for the human aftermath of their investment decisions.

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
