4 ethical tech lessons we've learned during the COVID-19 crisis

  • Chaos caused by the pandemic has highlighted the need for more resilient systems.
  • Technology used responsibly can be an effective tool for social change.
  • This is what we've learned from applying our ethical use principles to pandemic-response situations.

2020 shook our world in unimaginable ways, forcing us to face multiple intersecting emergencies: public health, economic, and racial justice crises.

And, while technology isn’t the primary tool for solving systemic health and social inequity, it can play an important role alongside government and civil society efforts. However, for technology to be effective, it must be built responsibly – and no matter how good a tool is, people won’t use it unless they trust it.

As Chief Ethical and Humane Use Officer, I’ve worked closely with my team over the past year to apply our ethical and humane use principles when developing technology that addresses these crises. With this work, we’ve learned valuable lessons which will remain important and applicable long after the current pandemic is over. We share these lessons in the spirit of transparency as we navigate this uncharted territory together, knowing we always have more to learn.

Lesson 1: Learn from the past – avoid reinventing the wheel

When we face crises, there is often a bias for immediate action. It is critical, however, to take the time to incorporate lessons from the past into modern solutions. In 2020, we were reminded to step back and recognize that many of these issues are not new. While COVID-19 itself is novel, this isn’t the first global health crisis the world has faced, nor the first time health inequities have surfaced in our communities – though the pandemic has deepened those disparities. Many well-established, trusted organizations deal with health crises every day.


What is the Global Technology Governance Summit?

Emerging and frontier technologies can help tackle social, economic and health challenges. But if designed improperly, they could exacerbate the problems they are intended to address.

For this reason, the World Economic Forum will launch its inaugural Global Technology Governance Summit on 6-7 April. The first-ever event will be hosted with Japan to create a collaborative, neutral space where senior leaders, CEOs, board members, startups, innovators, entrepreneurs, academics, policymakers and civil society can come together to discuss and share issues related to the governance and protocols critical to new technologies.

As we developed our pandemic-response product, Work.com (a data-driven platform designed to help businesses reopen safely and safeguard staff), Salesforce partnered with privacy and medical experts to ensure we were following best practices and keeping privacy and human rights at the forefront of this work. We also looked to organizations like the United Nations and the International Committee of the Red Cross in developing our data retention and deletion guidelines.
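A retention-and-deletion guideline of this kind can be enforced mechanically. The sketch below is a hypothetical illustration of the general pattern: the 30-day window, the record shape, and the function names are assumptions made for the example, not Work.com's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import List, Optional

# Illustrative assumption: a fixed 30-day retention window for
# collected health data (not Salesforce's actual policy).
RETENTION_WINDOW = timedelta(days=30)

@dataclass
class HealthRecord:
    record_id: str
    collected_at: datetime  # timezone-aware UTC timestamp

def purge_expired(records: List[HealthRecord],
                  now: Optional[datetime] = None) -> List[HealthRecord]:
    """Keep only records still inside the retention window; anything
    older is dropped, modeling scheduled deletion of expired data."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r.collected_at <= RETENTION_WINDOW]
```

Running a job like this on a schedule turns a written retention policy into a verifiable, automatic behavior rather than a manual checklist item.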

It is often more efficient and responsible to leverage existing tools, processes, and technologies, or to support others who have already been doing the work. When time is of the essence and lives are at stake, we reinforced our commitment to trusting experts and listening to public health guidance. With that guidance as our north star, we looked to modify the wheel, not reinvent it.

Lesson 2: Know your values and lead with principles-based action

At Salesforce, our core values influence everything we do. When we created the Office of Ethical and Humane Use, we began by identifying a set of five guiding principles – human rights, privacy, safety, honesty, and inclusion – to inform our work on responsible technology. These principles, based upon international human rights covenants and informed by employee feedback, guide us every day in designing, developing, and delivering our products.

When the pandemic hit and the world was thrown into a state of ambiguity and fear, we leveraged our guiding principles to provide direction for our work. We partnered with our privacy team to create a set of principles specifically to guide our COVID-19 technology response: 1) protect human rights and equality; 2) honor transparency; 3) minimize data collection; 4) take a long-term approach; 5) ensure the security of personal data. These considerations were what we came back to time and again to guide our teams in developing Work.com.

For example, these principles led to the intentional decision not to offer automated contact tracing, in order to preserve privacy and maintain users’ trust. Instead, we helped our customers transition from analog (i.e., pen-and-paper) to digital systems for contact tracing, improving its accuracy and making human contact tracers more efficient.

What is the World Economic Forum doing about the Fourth Industrial Revolution?

The World Economic Forum was the first to draw the world’s attention to the Fourth Industrial Revolution, the current period of unprecedented change driven by rapid technological advances. Policies, norms and regulations have not been able to keep up with the pace of innovation, creating a growing need to fill this gap.

The Forum established the Centre for the Fourth Industrial Revolution Network in 2017 to ensure that new and emerging technologies will help—not harm—humanity in the future. Headquartered in San Francisco, the network launched centres in China, India and Japan in 2018 and is rapidly establishing locally-run Affiliate Centres in many countries around the world.

The global network is working closely with partners from government, business, academia and civil society to co-design and pilot agile frameworks for governing new and emerging technologies, including artificial intelligence (AI), autonomous vehicles, blockchain, data policy, digital trade, drones, internet of things (IoT), precision medicine and environmental innovations.


Lesson 3: Invest upfront to create scalable and repeatable processes

In a crisis, the landscape is constantly changing. With new information and challenges arising daily, a scalable, repeatable process for addressing issues is key.

We’ve seen governments’ pandemic responses evolve from primarily focusing on testing and contact tracing to a global vaccine rollout. For us, this meant providing ethical guidance and consultation to teams building COVID-19 vaccine management technologies. Given the widespread need and urgency for these types of tools, such as appointment scheduling technology and digital health credentials, we were able to scale our team’s work by partnering closely with systems integrators and providing practical guidance on how to center privacy, equity and ethical use throughout.

Another way we scale our work is by leveraging technology. Alongside the public health crisis, we also face a renewed awareness of the racial injustice crisis in America and around the world. One action we’re taking to mitigate racial bias and harm in our products is addressing non-inclusive language in our technical content and code. Inclusion is an ongoing practice, and we’re committed to continually reviewing our technical language and leveraging scalable tools, such as Acrolinx and other plug-ins, to automate language replacements and drive inclusion in our technical language.
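The automated language replacement described above can be sketched as a simple term scanner. This is a minimal hypothetical example, not how Acrolinx or Salesforce's tooling actually works; the term list and suggested replacements are illustrative assumptions, and production tools apply far richer, context-aware rules.

```python
import re
from typing import List, Tuple

# Illustrative term list (an assumption for this example); real
# inclusive-language tooling maintains curated, context-aware rules.
REPLACEMENTS = {
    "whitelist": "allowlist",
    "blacklist": "blocklist",
    "master": "main",
    "slave": "replica",
}

# Match any flagged term as a whole word, case-insensitively.
PATTERN = re.compile(r"\b(" + "|".join(REPLACEMENTS) + r")\b", re.IGNORECASE)

def suggest_replacements(text: str) -> List[Tuple[str, str]]:
    """Return (found_term, suggested_replacement) pairs, in order."""
    return [(m.group(0), REPLACEMENTS[m.group(0).lower()])
            for m in PATTERN.finditer(text)]
```

Wired into a CI check or documentation linter, a scanner like this surfaces non-inclusive terms at review time instead of relying on after-the-fact audits.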

Lesson 4: Multi-stakeholder collaboration is key

In this year of crisis, we’ve learned that investing in listening, having conversations across differences, and working together are essential to an effective response.

When it comes to vaccine management and digital health credentials, we’re committed to working with other industry groups to create interoperable standards and suggested best practices. Using technology to help us get out of a crisis requires governance, policy, and standards. We want to be the trusted digital partner — and the only way to do that is by collaborating with other groups who are working toward the same ends. This means taking leadership roles in partnerships like the Vaccination Credential Initiative and Good Health Pass Collaborative, both cross-sector working groups tasked with restoring confidence and promoting equity and collaboration in the development of vaccination and digital health pass systems.

We also leverage our Ethical Use Advisory Council, a diverse group of external experts and Salesforce frontline employees and executives, for thoughtful discussion and guidance around ethical use issues. We realize that in order to move forward together, it’s imperative we learn how to have difficult conversations and communicate across differences. We’ve learned that active listening, providing multiple safe channels for feedback, and sharing how we’ll be held accountable are key to building trust and making meaningful change through difficult times.

"For technology to be effective, it must be built responsibly."

—Paula Goldman, Chief Ethical and Humane Use Officer, Salesforce

These lessons hold true outside of a crisis. As Dr. Kirsten Bibbins-Domingo, Professor and Chair of the Department of Epidemiology and Biostatistics at the University of California, San Francisco, said: "Crises are crucibles for innovation."

Ongoing learning and iteration are key to business success before, during, and after a crisis. By investing time to reflect on previous work and institutionalize ongoing learnings, we’re better able to face obstacles in the future.


© 2022 World Economic Forum