Crunching Numbers, Sustainably
Powerful computing applications can consume as much energy as small countries. Engineers are working to develop less wasteful hardware that can be considered green.
It sounded like a perfectly reasonable idea. In 2001, while working at Los Alamos National Laboratory, computer scientist Wu Feng designed a low-power supercomputer system called Green Destiny, intended to generate less of the waste heat that contributes to hardware failures. No one, it seemed, could be against reducing waste. But at a demo of Green Destiny at a supercomputing conference, “I literally got booed and hissed,” Feng recalled, laughing. “Not humongously, but you could hear at the end of the talk there were people that weren’t happy, and that’s mainly because there wasn’t this acceptance for being energy efficient.”
Feng, who today is a professor at Virginia Tech in Blacksburg, has spent a good chunk of his career making high-end computers more energy efficient.
It’s an issue that remains somewhat under the radar. “Few people think of the IT industry when they think about climate change,” IBM wrote in an official blog, but according to a 2021 report from the Technology Policy Council of the Association for Computing Machinery (ACM), the information and communications technology sector accounts for between 1.8 and 3.9 percent of global carbon emissions, while the entire aviation industry accounts for only about 2.5 percent. Some estimates credit computing with an even higher greenhouse gas contribution, approaching 10 percent.
The ever-growing use of artificial intelligence (AI) and other high-performance computing (HPC) applications only adds to the problem, demanding more energy and generating more carbon emissions. Computer scientists and microelectronics engineers are striving to reduce the digital world’s carbon footprint in an approach called green computing.
Green computing, also known as sustainable computing, is a multifaceted concept that addresses the problem along several fronts. “The green computing environment is an entire ecosystem that consists of the hardware and the software,” Feng said. “The idea is you want to get the same work done or at the same rate or at the same speed but with less energy. The way I would view it is like fuel efficiency in computing.”
But that comes back to his experience more than 20 years ago with Green Destiny. “What I had proposed then was effectively a Toyota Prius in that it was more fuel efficient, but it also had to sacrifice a little bit of performance,” Feng said.
Within the supercomputing community, however, the bias has been for performance over all else: Porsches, not Priuses. And that processing power requires electricity. For example, Argonne National Laboratory’s soon-to-be-completed exascale system Aurora is projected to draw a maximum of 60 MW, more than twice the power of the current supercomputer champion, Frontier at Oak Ridge National Laboratory.
As reducing greenhouse gas emissions becomes a societal focal point, however, the energy use of supercomputers and large server farms is attracting scrutiny. To keep the high-power applications running, green computing may be the way of the future.
Data-center solutions
The Center for Sustainable Systems at the University of Michigan reports that servers and data centers in the United States alone emit 28.4 million tons of carbon dioxide annually. With something of a digital arms race now under way in AI research and applications, companies and institutions keep rapidly expanding their data centers and their operations, all of which adds to the environmental problems created by the already-existing dataverse.

“The consumer side is important and not to be ignored, but we are focused on the data centers because it’s easier to measure, and it’s also easier to make a single change with large effect,” said Vijay Gadepally, leader of energy-aware research at MIT’s Lincoln Laboratory Supercomputing Center.
Gadepally points out that much of the wonderful functionality of your phone actually happens elsewhere. “Your phone certainly is more complex today than two years ago,” he said. “I’d argue that more of the computing is being offloaded to these large data centers. When you talk to Siri or Alexa or any of these tools, more and more of that processing is occurring in some large-scale data center somewhere. And the number and impact of these data centers is growing quickly.
“It’s not uncommon to see data centers coming up that are 10 megawatts, 15 megawatts, 20 megawatts,” Gadepally said.
An overlooked aspect, he added, is that “a lot of these data centers are water cooled. That’s the only way you can kind of get the density of power into these data centers. And they’re often in places where you have stressed watersheds or other issues associated with water.”
Green computing begins with considering the environmental impact of the design and manufacture of the hardware itself and the raw materials that make up the processor chips, known as embodied carbon or embodied footprint. That’s the focus of Argonne National Laboratory materials scientist Angel Yanguas-Gil’s work.
“Our research right now focuses on understanding how we can accelerate the development and incorporation of new materials into architectures so that they can demonstrate savings in terms of energy efficiency,” Yanguas-Gil said. “From the point of view of the overall sustainability aspect for IT devices, and for sensors or even smartphones, the energy consumption that goes into fabrication is much higher than the one that goes into use.”
Yanguas-Gil and his Argonne colleagues are working on ways to accelerate and optimize the fabrication of microprocessor chips that are energy-efficient both in their creation and their ultimate operation. “One of the approaches that we have been taking is trying to integrate materials science and machine learning to accelerate the development and integration of these materials into processes that are homogeneous across a large scale, which is what the semiconductor industry cares about,” he said. “Having approaches that can make the computation more efficient can really impact the energy footprint of microelectronics overall, because you’re making devices that are going to be used for a long time, that are going to require active cooling as part of systems that are going to be heavily used, much more efficient.”
Especially in the supercomputing realm, there are also practical advantages to going green.
“Power dissipation is proportional to the temperature in the system,” Feng said. “When we did the Green Destiny supercomputer, we found that every 10 °C increase in the temperature in the system would double its failure rate. So that’s a big reason for getting supercomputers to be more green, not only from an operational cost perspective, but also with respect to efficiency and availability and reliability of the computing resource.”
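Written as a formula (our formulation of Feng’s rule of thumb, not a published model), that doubling rule implies an exponential dependence of failure rate on operating temperature:

```latex
\lambda(T) \approx \lambda_0 \cdot 2^{(T - T_0)/10\,^{\circ}\mathrm{C}}
```

where \(\lambda_0\) is the failure rate at a reference temperature \(T_0\). A 20 °C rise, for example, would quadruple the expected failure rate.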
Feng noted that the best solution for excessive heat, whether in a single computer device or a huge data center, is not to generate it to begin with. “From that perspective, it’s a hardware problem. But given the context of companies having to make money, you can’t do just a hardware solution. You can’t do one without the other. The hardware basically sets the expectation of where the power consumption range will be, what the computing capability will be. And then the software, if it’s smart enough, will automatically adjust power consumption for the end user so that they can still get the same performance, while saving energy.”
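One widely used mechanism for that kind of software-driven adjustment is dynamic voltage and frequency scaling (DVFS). As a minimal sketch (our illustration of the general idea, not a description of Feng’s systems), a few lines of Python can switch every CPU’s frequency governor on Linux through the kernel’s standard cpufreq interface; writing to these files requires root privileges:

```python
# Minimal DVFS sketch: switch every CPU's cpufreq governor on Linux.
# Illustrative only; requires root and the standard sysfs interface.
from pathlib import Path

def set_governor(policy: str) -> None:
    """Apply a frequency policy such as 'powersave' or 'performance'."""
    for gov in Path("/sys/devices/system/cpu").glob(
            "cpu[0-9]*/cpufreq/scaling_governor"):
        gov.write_text(policy)

# Favor low clocks and low power draw over peak speed.
set_governor("powersave")
```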
Sacrificing power
Sometimes, however, making systems more efficient can actually defeat the purpose of saving energy. “Often when we give people more efficient computers, they just do more,” Gadepally said. “It’s like, ‘Oh, cool, this is double the speed. I can do a problem that’s double the size now.’”

The Lincoln Laboratory Supercomputing Center deals with that issue by increasing awareness of just how much energy its users are consuming. “We have spent a lot of effort in making the information public,” Gadepally explained. “People don’t know what the carbon footprint or energy requirements of their jobs are and are flabbergasted when I tell them how much some of these models use in terms of energy. You’re talking about megawatt-hours or gigawatt-hours. If you’re thinking about things like GPT-3.5 or Bard or [other AI programs], you’re talking about dozens of gigawatt-hours’ worth of energy usage.”
Another strategy is changing the way the hardware operates. “We have made some changes to our data center, and we’ve now seen this making its way through to other places, just making changes on the way that the hardware operates to make it inherently more efficient,” Gadepally said. “One specific example is we cap the power to our GPUs at about 60 percent of their max power rating. In general, the higher the power a GPU draws, the higher your performance. But we’ve done a lot of experiments in which we’ve seen that you can basically limit that power, and your performance, in terms of the amount of time it takes to solve the problem, is largely unchanged.”
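A minimal sketch of such a cap (our illustration of the general technique, not Lincoln Laboratory’s actual tooling) using NVIDIA’s NVML Python bindings, installable as the nvidia-ml-py package; setting a power limit requires administrative privileges:

```python
# Cap each GPU at roughly 60 percent of its maximum rated power.
# Illustrative sketch; requires root and an NVIDIA GPU with NVML support.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        # NVML reports power limits in milliwatts.
        min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
        target_mw = max(min_mw, int(0.60 * max_mw))  # stay within the allowed range
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"GPU {i}: capped at {target_mw / 1000:.0f} W of {max_mw / 1000:.0f} W")
finally:
    pynvml.nvmlShutdown()
```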
Gadepally cited an experiment that implemented this cap while training the Google language model BERT.
“What we found in our experiment is the training time goes from about 79 to 82 hours,” he said. “So it’s about a three or four hour increase in the amount of time it took to run this job, but we were able to save about a week’s worth of an average American’s household energy. When we talked to our users, they said, ‘I would take that trade any day.’ There’s very few people for whom going from 79 to 82 hours is going to make any difference in their lives. But if you can save a week’s worth of your house’s energy, that’s a nontrivial amount of energy savings. Those are the type of changes we’re hoping to see across the board.”
The power capping strategy has another benefit, Gadepally explained. “Not only did we see an energy reduction from the GPUs themselves, we now run our GPUs about 30 degrees Fahrenheit cooler because of that lower power. What that means is you get this compounding effect, where your air conditioning is reduced, especially in hot summer months. The amount of cooling we need goes down and the hardware lasts longer, which means that it can help us with the embodied carbon. They become more reliable because they’re running at a lower temperature.”
Energy can also be saved by sacrificing processing power when it’s not absolutely needed.
“Another direction that people have been looking at is changing the precision of models, which means that you can get more out of your given computer infrastructure,” Gadepally said. “So I could run things at lower precision, which means less data, less memory, and less processing. Do you always need to have the highest end model running at all times? It turns out that for many use cases, you might be OK with an approximate answer.”
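A minimal sketch of that precision trade-off in PyTorch (our illustration; the small network is a hypothetical stand-in for a real model): the same forward pass runs in a 16-bit format instead of 32-bit, roughly halving memory traffic and arithmetic work in exchange for a small numerical deviation.

```python
# Run the same model at reduced precision and compare against float32.
import torch
import torch.nn as nn

# Hypothetical stand-in for a real model.
model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10))
x = torch.randn(32, 1024)

with torch.no_grad():
    exact = model(x)  # full float32 baseline
    # Autocast runs the heavy matrix math in bfloat16; numerically
    # sensitive operations stay in float32.
    with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
        approx = model(x)

# For many use cases, the approximate answer is close enough.
print("max deviation:", (exact - approx.float()).abs().max().item())
```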
One of Feng’s non-research projects is the Green500 list, a twice-yearly ranking of the most energy-efficient supercomputers, which he started in 2007. “We recently reached the goal of 50 gigaflops [50 billion floating point operations per second] per watt,” Feng said. “That’s a tremendous amount of computing capability for just one watt of power.” The list serves as a benchmark and an incentive for the supercomputing community to up its game when it comes to green computing and to find ways to increase performance without incurring a penalty in carbon emissions. “We ended up reaching this goal because the governments funded work to ensure better energy efficiency of these systems.”
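For perspective, a back-of-the-envelope calculation (ours, not Feng’s): at 50 gigaflops per watt, an exaflop of computing would in principle draw only

```latex
\frac{10^{18}\ \text{flops}}{50 \times 10^{9}\ \text{flops/W}} = 2 \times 10^{7}\ \text{W} = 20\ \text{MW},
```

a third of Aurora’s projected 60 MW maximum, though the machines topping the efficiency list are typically far smaller than exascale systems.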
Knowing the cost
The issue of embodied carbon has been harder to address because information transparency is limited. “I think most companies are open to talking about the manufacturing process, but it’s often something that companies are unwilling to share because it really does give a lot of details about the process,” Gadepally said. “It’s the same thing even with the [language] models. We’re often guessing some of these energy requirements, they’re not really published. People think it’s intellectual property because you’re telling me the exact number of parameters, and I can guess what your capabilities are based on that. Right now, everyone’s holding onto their information.”

That secrecy can make it difficult to estimate the energy footprint of materials or manufacturing processes so that mitigation strategies can be devised and implemented.
As in other areas of wide environmental impact, convincing people that going green is not only necessary but practical is a major concern. “In my opinion, the biggest challenge is that I don’t think we’ve yet created the right incentives for it,” Gadepally said. “If you look at this from the academic perspective, or maybe even partly the commercial perspective, the push from consumers is for better, faster models. I don’t think we’ve yet reached a point where we’re incentivizing more efficient models.”
Gadepally noted that there could be different levels or types of incentives. One might make the environmental impact of a particular activity more immediately obvious. “When you book a flight on Google Flights, they now show you the carbon footprint of your travel. I would love for that to be something that I see when I’m talking to ChatGPT. Or even just the energy. Tell me something, like this query cost one tree in the rain forest.” Other incentive approaches might be encouraging the establishment of data centers largely employing renewable energy. “I think pushing for efficiency rather than just carbon neutrality would be one of the directions,” he said.
The concept of green computing still faces resistance in some quarters but is slowly and steadily taking hold. “I would say that greenness was a third-class citizen 20 years ago, and over the past two decades, it’s become a robust second-class citizen” in the supercomputing arena, Feng observed. “The Green500 community wants to rate the efficiency of supercomputers based on fuel efficiency, like 55 miles per gallon or whatever, whereas the traditional supercomputing community wants to rate the computers based on the top speed of the car. I’d like energy efficiency or greenness to be a first-class citizen.”
For the immediate future, however, as the ACM report noted, “Computing can help mitigate climate change but must first cease contributing to it.”
Mark Wolverton is a technology writer in Narberth, Pa.