- Data centres processing and storing the world's data already use around 1% of the electricity we generate, according to the IEA.
- Computing is expected to account for up to 8% of global power demand by 2030.
- The emissions associated with everyday computing could be surprisingly high.
A stone’s throw from a power station on the barren outskirts of Tbilisi, Georgia’s capital, a grey warehouse surrounded by metal containers hums to the sound of money.
Inside, hundreds of computer servers work continuously to solve the complex computational puzzles that mine the digital currency Bitcoin, burning enough electricity to power tens of thousands of homes in the process.
“Any high-performance computing ... is energy intensive,” explained Joe Capes of global blockchain company The Bitfury Group, which operates the facility in Tbilisi.
Cryptocurrencies are one of several new technologies, like artificial intelligence and 5G networks, that climate experts worry could derail efforts to tackle global warming by consuming ever-growing amounts of power.
Data centres processing and storing data from online activities, such as sending emails and streaming videos, already account for about 1% of global electricity use, according to the International Energy Agency (IEA).
That’s about the same amount of electricity that Australia consumes in a year.
But as societies become more digitalised, computing is expected to account for up to 8% of the world’s total power demand by 2030, according to some estimates, raising fears this could lead to the burning of more fossil fuels.
“If we don’t take into account the carbon footprint, we are going to have a climate change nightmare coming from information technology,” said Babak Falsafi, a professor of computer and communication science at the Federal Polytechnic School of Lausanne.
One solution is to improve the efficiency of data centres, something operators already have a strong incentive to do since electricity accounts for a large share of their running costs, according to data experts.
“As a rule of thumb, a megawatt costs a million dollars per year ... This obviously catches management’s attention,” said Dale Sartor, who oversees the U.S. Department of Energy’s Center of Expertise for Data Centers in Berkeley, California.
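Sartor's rule of thumb checks out with back-of-the-envelope arithmetic. The electricity rate below is an illustrative assumption (roughly a U.S. industrial rate), not a figure from the article:

```python
# Check the "one megawatt costs about a million dollars a year" rule of thumb.
# The $0.11/kWh rate is an assumed illustrative industrial rate, not from the article.
power_mw = 1.0
hours_per_year = 24 * 365          # 8,760 hours in a non-leap year
rate_usd_per_kwh = 0.11            # assumed average industrial electricity rate

annual_kwh = power_mw * 1000 * hours_per_year    # 8,760,000 kWh
annual_cost_usd = annual_kwh * rate_usd_per_kwh  # 963,600 -- roughly a million dollars

print(f"${annual_cost_usd:,.0f} per year")  # prints "$963,600 per year"
```

At any plausible industrial rate between $0.08 and $0.13 per kWh, a continuously running megawatt lands in the $0.7M-$1.1M range, which is why the rule of thumb "catches management's attention".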
Energy demand from data centres in the United States has remained largely flat over the past decade as improvements in computing have allowed processors to do more with the same amount of power, he told the Thomson Reuters Foundation by phone.
But that is set to change, predict tech analysts.
The 50-year-old trend known as Moore’s Law, which has seen the number of transistors on a computer chip double roughly every two years, is expected to slow as it becomes ever harder to pack more transistors onto a chip.
Some companies have been looking at other ways to make savings.
In Georgia, where most electricity is generated by hydropower, Bitfury deployed a system to reduce the energy needed to cool its hot-running servers.
Cooling can account for up to half of a data centre’s total energy use, the company says.
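The cooling figure maps onto the data-centre industry's standard efficiency metric, power usage effectiveness (PUE): the ratio of total facility energy to the energy that actually reaches the IT equipment. The formula is standard; the sample numbers below are illustrative, not from the article:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT energy.
    1.0 would mean every watt goes to computing; higher values mean more
    overhead for cooling, power distribution and other infrastructure."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative case: if cooling and other overhead consume as much energy as
# the IT load itself (the "up to half of total energy" scenario), PUE is 2.0.
print(pue(total_facility_kwh=2_000_000, it_equipment_kwh=1_000_000))  # prints 2.0
```

Cutting cooling energy therefore shows up directly in PUE, which is the number efficiency efforts like Bitfury's immersion system are ultimately trying to push towards 1.0.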
While some of its processors are still cooled with outside air, others are immersed inside metal tanks filled with a special liquid with a low boiling point.
As the liquid boils, the vapour carries heat away from the processors, keeping them cool and allowing the company to do away with fans and save water.
“Air is free ... but it is not efficient,” explained Capes, who heads Bitfury’s liquid cooling technology subsidiary, adding that the system consumes 40% less electricity than traditional air cooling solutions.
Others have taken similar steps.
A Google data centre in Finland uses recycled seawater to reduce energy use while some companies have opened facilities near the Arctic Circle to benefit from naturally cold air.
But improving efficiency “can only get you so far”, said Elizabeth Jardim, a senior corporate campaigner at environmental group Greenpeace. “At some point you will have to address the type of energy that is powering the facility.”
Tech giants including Facebook, Google, Apple, Amazon and Microsoft have committed to using only renewable energy but some still use fossil fuels, and more needs to be done to bring others on board, she said.
Jardim suggested governments enact policies to incentivise tech companies to procure green energy and increase transparency around the data sector’s carbon footprint.
Fewer Cat Videos
Meanwhile, internet users can also play a role by switching to greener companies or simply reducing their data use, said Jardim.
“Right now data pretty much is equivalent to energy, so the more data something takes the more energy you can assume it’s using,” she said.
Simply sending a photo by email can emit about the same amount of planet-warming gases as driving a car for a kilometre, said Luigi Carafa, executive director of the Climate Infrastructure Partnership, a Barcelona-based non-profit.
“The problem is we don’t really see this, so we don’t perceive it as a problem at all,” he said by phone.
A 2019 study by energy supplier OVO Energy found that if Britons sent one fewer email a day, the country could cut its carbon output by the equivalent of more than 81,000 flights from London to Madrid.
Global online video viewings alone generated as many carbon emissions as the whole of Spain in 2018, according to French think tank The Shift Project.
“People can already reduce their carbon emissions today if they stop watching cat videos,” said Falsafi, the Lausanne professor, who heads the university’s research centre for sustainable computing, EcoCloud.
“Unfortunately, they are neither aware of the issue nor incentivised to reduce carbon emissions.”