Why we must democratize access to high-performance computing

Major discoveries in science and engineering depend on organizations having access to new high-performance computing tools. Image: REUTERS/Philippe Wojazer

Peter Ungaro
Senior Vice-President and General Manager, High-Performance Computing and Artificial Intelligence, Hewlett Packard Enterprise

This article is part of: The Davos Agenda
  • Society's data needs are set to grow, as data increasingly informs our actions.
  • High-performance computing (HPC) systems can help us understand and apply this data.
  • We must democratize access so that more organizations - and the world - can realize the benefits.

Every two years, we create more data than was previously created through all of history. Our hyper-connected world fuels this exponential increase in data - from wearable devices and smart appliances to electronic health records and autonomous vehicles.

New technologies drive this swell of data, launching us into the Information Era. But while we have been able to generate and collect massive amounts of data, we haven't had the tools and resources to deliver timely insights and inform the actions that shape our future.

That was, at least, until now.

We are entering the Age of Insight – a new era defined by insights and discoveries that benefit all and elevate the well-being of every human on the planet.

As someone who has spent his career building some of the world’s fastest computing systems to harness the full potential of data, I am optimistic about our progress - and the future. The ability to model and analyze data is the key to unlocking answers to the world’s toughest problems and answering questions we never knew to ask.

At its core, the Age of Insight is driven by data, and there is no end in sight. Continuous growth in data is forcing us to build larger and larger models – both mathematical models for simulation and data models for analytics. More data and larger models, in turn, drive the need for larger computers. On top of these demands come new algorithms for computing on this data, such as artificial intelligence (AI) – including machine learning and deep learning – further fueled by technologies such as 5G and the internet of things (IoT).

High-performance computing (HPC) systems are the tools needed to help understand data and, ultimately, the world around us. They are the backbone of modeling, simulation, high-end big data analytics, and AI workloads. They can convert complex data into digital models that help researchers and engineers understand how something will look and perform in the real world.

These machines are made up of thousands of processors operating in parallel, performing quadrillions of operations each second on unimaginable volumes of data. Scientists are using them to tackle massive problems such as mapping the human brain, predicting the path of a hurricane, finding new sources of energy or vaccines for global pandemics, and simulating supernovas.
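
The parallel pattern behind these machines can be sketched in a few lines. The toy Python example below is my own illustration, not HPE's software stack: it splits a dataset across a pool of worker processes and combines their partial results. Real HPC systems apply the same pattern across thousands of nodes using frameworks such as MPI.

```python
# A toy illustration of data-parallel computing: many workers each process
# a slice of the data, and the partial results are combined at the end.
# Real HPC systems do this across thousands of nodes, not one machine's
# process pool, but the pattern is the same.
from multiprocessing import Pool

def partial_sum_of_squares(chunk):
    """Each worker computes its share of the total independently."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))        # stand-in for a large dataset
    n_workers = 8                        # an HPC system would use thousands
    chunk_size = len(data) // n_workers
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

    with Pool(n_workers) as pool:
        partials = pool.map(partial_sum_of_squares, chunks)

    print("total:", sum(partials))       # combine the partial results
```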

"Our computing needs as a society will only continue to accelerate."

Pete Ungaro

Unfortunately, access to HPC systems has been extremely limited. Today, research institutions are the primary users of HPC systems. But I believe that everyone – from the Fortune 500 to small businesses, local governments and individual researchers – should be able to benefit from them.

Democratizing access to HPC

Researchers have been successfully using HPC systems to fast-track scientific discovery and make critical advancements for society; however, we are relying on very few systems to address some of the biggest problems the world has ever faced. I regularly speak with organizations around the globe that are attempting to solve HPC-sized problems without the right tools in place. To solve problems we have not been able to solve before, we must innovate in new ways.

That’s where HPC-as-a-Service comes in. At Hewlett Packard Enterprise, our vision to build systems that perform like a supercomputer and run like the cloud is becoming a reality. By offering HPC-as-a-Service, we are removing barriers to access such as system complexity, the specialized facilities needed to house these machines, acquisition costs, operating costs for power and cooling, and the need for highly skilled HPC technical staff.

Any organization could soon have access to the most powerful HPC and AI capabilities, enabling it to make strides in scientific research and engineering, achieve bold new discoveries, and thrive in a digital-first world.

What's next?

Looking ahead, our computing needs as a society will only continue to accelerate. I believe that every data center of the future will include systems and technologies that originated in supercomputers. This will help organizations handle their growing data needs and digitally transform their enterprises.

Companies like HPE are innovating with exascale systems in mind. When the first of these systems are delivered in 2021, they will be the largest supercomputers on the planet, capable of performing 1,000,000,000,000,000,000 calculations per second. That’s a quintillion – a billion billion. To visualize the magnitude of a quintillion: it would take roughly 40,000 years for a quintillion gallons of water to flow over Niagara Falls.
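
As a rough, back-of-the-envelope check on that comparison (the Niagara Falls flow rate of about 750,000 US gallons per second is my own assumption, not a figure from the article):

```python
# Rough arithmetic behind the Niagara Falls comparison.
quintillion = 10**18                 # exascale: calculations per second
gallons_per_second = 750_000         # assumed Niagara Falls flow rate (US gal/s)
seconds_per_year = 365.25 * 24 * 3600

years = quintillion / gallons_per_second / seconds_per_year
print(f"~{years:,.0f} years")        # on the order of 40,000 years
```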

Innovations such as photonics – using light to transmit data – and accelerators are driving the scale, performance and cost advantages of exascale systems, but exascale users will not be the only beneficiaries.

These technologies are already being incorporated into standardized HPC solutions and will also become accessible as a service, which means that more governments, researchers and enterprises will be armed with the most advanced problem-solving tools on the planet, just at a smaller scale.

I am excited about what the future holds and the impact that we can make when we uncover the answers hiding in plain sight today. And I am thrilled that the future is right around the corner.
