Your tweets aren't the energy-sucking monster we thought, but things might get worse
But for how long?
As data has become one of the world's most-traded currencies, the data centers designed to house it have grown too -- along with their carbon footprint. According to recent estimates, data centers account for one percent of total electricity usage worldwide, and analysts have predicted for years that this figure would only climb as demand for data centers rose. But a new analysis published Friday in the journal Science challenges that conventional wisdom, and suggests the prediction might not hold after all.
The paper, a joint analysis by Northwestern University, Lawrence Berkeley National Laboratory, and Koomey Analytics, drew on data center equipment stocks, efficiency trends, and market structure to take a closer look at energy usage across equipment and data center types (e.g. cloud versus small-scale centers). This kind of bottom-up approach has traditionally been overlooked, the authors write. Where less detailed analyses have extrapolated from 2010 data to project massive energy growth by 2020, the authors' approach suggests that gains in data center efficiency have kept pace with rising demand without energy usage scaling along with it.
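To see how a bottom-up estimate differs from simple extrapolation, consider multiplying equipment stocks by typical power draw and facility overhead rather than projecting past totals forward. Below is a minimal sketch of that accounting; every number in it is an illustrative placeholder, not a figure from the study.

```python
# Minimal sketch of a bottom-up data center energy estimate.
# Every figure below is an illustrative placeholder, not a value from the study.

HOURS_PER_YEAR = 8760

# Hypothetical equipment stocks and average power draw (watts) by category.
equipment = {
    "servers": {"units": 1_000_000, "avg_watts": 250},
    "storage": {"units": 400_000, "avg_watts": 100},
    "network": {"units": 200_000, "avg_watts": 150},
}

# Power Usage Effectiveness: total facility energy divided by IT energy.
# Hyperscale facilities approach ~1.1; small server rooms can exceed 2.0.
PUE = 1.5

# IT equipment energy in terawatt-hours (watt-hours / 1e12).
it_energy_twh = sum(
    cat["units"] * cat["avg_watts"] * HOURS_PER_YEAR
    for cat in equipment.values()
) / 1e12

total_energy_twh = it_energy_twh * PUE
print(f"Estimated annual energy use: {total_energy_twh:.1f} TWh")
```

The point of the exercise is that changes in the stock (fewer, better-utilized servers) and in efficiency (lower power draw, lower PUE) show up directly in the total, which extrapolation from old totals misses.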
"Our findings do not mean that the IT industry and policymakers can rest on their laurels."
"Considering that data centers are energy-intensive enterprises in a rapidly evolving industry, we do need to analyze them rigorously," study co-author Arman Shehabi, a research scientist at Lawrence Berkeley National Laboratory, said in a statement. "Less detailed analyses have predicted rapid growth in data center energy use, but without fully considering the historical efficiency progress made by the industry. When we include that missing piece, a different picture of our digital lifestyles emerges."
The authors write that increased virtualization (running more computing instances per physical server), along with greater use of cloud and hyperscale data centers (massive facilities owned and operated by the companies they support, like Amazon or Google), has helped the industry avoid a steep climb in energy use in recent years.
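The energy payoff of virtualization comes from consolidating many lightly used machines onto fewer, busier ones. A toy calculation makes the point, assuming a simple linear power model and hypothetical numbers that are not drawn from the study:

```python
# Toy illustration of why virtualization curbs energy growth.
# All numbers are hypothetical, not drawn from the study.

HOURS_PER_YEAR = 8760
IDLE_WATTS, PEAK_WATTS = 120, 250  # per-server draw at idle and full load

def annual_mwh(servers: int, utilization: float) -> float:
    # Simple linear power model: idle draw plus a utilization-scaled increment.
    watts = servers * (IDLE_WATTS + (PEAK_WATTS - IDLE_WATTS) * utilization)
    return watts * HOURS_PER_YEAR / 1e6  # watt-hours -> megawatt-hours

# One workload per server: many machines sitting nearly idle.
bare_metal = annual_mwh(servers=600, utilization=0.10)

# Six virtualized instances per server: fewer machines, better utilized.
virtualized = annual_mwh(servers=100, utilization=0.60)

print(f"bare metal:  {bare_metal:.0f} MWh/yr")  # ~699 MWh/yr
print(f"virtualized: {virtualized:.0f} MWh/yr")  # ~173 MWh/yr
```

Because a server draws substantial power even when idle, packing six instances onto one well-utilized machine costs far less energy than running six mostly idle ones.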
Getting users and policymakers to understand this trend can be a challenge, though, coauthor Jonathan Koomey of Koomey Analytics, an analysis firm that focuses on the environmental impact of information technology, tells Inverse.
"We hope that this work will encourage people to adjust their intuition about computing and electricity," says Koomey. "The tendency of most people is to assume that because computers are economically important, that they therefore need to use a lot of electricity. I think what we've been able to show with this work is that we can have vastly increased use of computing while at the same time keeping the electricity use roughly constant."
However, while the paper's analysis shows that energy usage hasn't risen proportionally with data center expansion, that doesn't mean energy usage has stopped growing since 2010. The authors write that computing instances in data centers have increased six-fold while server energy usage rose "only" 25 percent over 2010 levels, but that 25 percent increase (along with energy increases from other aspects of data center operations) has still pushed total energy usage from 92 TWh in 2010 to 130 TWh in 2018. Because of this, the paper's lead researcher, Eric Masanet, an adjunct professor in Northwestern's McCormick School of Engineering, said in a statement that it's important the IT sector doesn't pat itself on the back just yet.
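Taken together, those figures imply a steep drop in energy per unit of computing. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
# Back-of-the-envelope check on the figures quoted above.
instances_growth = 6.0       # compute instances: six-fold increase, 2010-2018
server_energy_growth = 1.25  # server energy use: up 25 percent

# Energy per compute instance, relative to the 2010 level.
per_instance = server_energy_growth / instances_growth
print(f"energy per instance: {per_instance:.2f}x the 2010 level "
      f"(roughly a {(1 - per_instance) * 100:.0f}% reduction)")

# Total data center energy use still grew overall.
total_2010, total_2018 = 92, 130  # TWh
print(f"total energy growth: {(total_2018 / total_2010 - 1) * 100:.0f}%")
```

In other words, each unit of computing now takes roughly a fifth of the energy it did in 2010, even as total consumption rose about 41 percent.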
"While the historical efficiency progress made by data centers is remarkable, our findings do not mean that the IT industry and policymakers can rest on their laurels," said Eric Masanet. "We think there is enough remaining efficiency potential to last several more years. But ever-growing demand for data means that everyone -- including policy makers, data center operators, equipment manufacturers and data consumers -- must intensify efforts to avoid a possible sharp rise in energy use later this decade."
The researchers propose a three-point plan to sustain and improve the efficiency of the world's growing data centers. First, they write, policymakers should strengthen efficiency standards and incentivize operators to keep improving the efficiency of the servers already in their centers. Second, more investment needs to go toward developing next-generation server technology, whether through quantum computing or advanced heat dissipation.
Michael McNerney, Vice President of Marketing and Network Security at the IT company Supermicro, tells Inverse that evolving technologies like 5G and the physical limits of Moore's Law might be among the first hurdles to clear.
"Continued innovation in data center technologies will drive significant performance and efficiency increases that should reduce power consumption, but the demand for new services typically consumes those improved performance gains," says McNerney. "We don’t see quantum computing likely having near term-impacts. Instead, Moore’s Law, AI, 5G, NVMe, and move to the cloud will have more of a near-term impact."
And finally, the authors write that continued analysis and modeling of data centers will be important, so that the kinds of blind spots their study found in previous analyses don't go unchecked as new efficiency models are developed.
When it comes to user experience, Koomey tells Inverse that if all goes according to plan, users shouldn't notice a difference at all.
"In general, things get better over time," says Koomey. "The efficiency improves rapidly as the performance improves. So from a user's perspective you would continue to see improvements in the services being delivered."