Suggested Citation: "Overview." National Academies of Sciences, Engineering, and Medicine. 2025. Implications of Artificial Intelligence–Related Data Center Electricity Use and Emissions: Proceedings of a Workshop. Washington, DC: The National Academies Press. doi: 10.17226/29101.

Overview

In recent years, the global adoption of artificial intelligence (AI) has spurred significant construction and investment in new data centers and cloud computing. These data centers require large-scale continuous power, posing challenges for local electric grids and broader climate goals. To explore how to map, measure, and mitigate the impacts of AI data center electricity usage, the National Academies of Sciences, Engineering, and Medicine convened the workshop Implications of Artificial Intelligence-Related Data Center Electricity Use and Emissions in Washington, DC, on November 12–13, 2024. Organized through the National Academies’ Roundtable on Artificial Intelligence and Climate Change, the event gathered more than 95 in-person and more than 350 virtual participants from academia, the technology industry, electric utilities, community advocacy groups, and government agencies to discuss how recent AI developments could impact energy demands, identify options to mitigate increased electricity use and emissions, and consider regional implications related to data center siting and renewable resource availability.

The workshop aimed to foster shared learning and enhanced coordination as stakeholders work to understand and mitigate the technical, social, behavioral, and environmental impacts of AI data centers and their unique energy needs, with a focus on how these issues are unfolding in the United States. Attendees considered the available evidence on current and future trends in AI adoption, discussed technical and policy solutions that could help to mitigate AI data centers’ large energy demands and their associated carbon emissions and impacts on local communities, and identified data and modeling gaps and needs relevant to improving understanding of these issues. They also discussed how AI technologies can potentially be leveraged to improve the grid and accelerate decarbonization goals by facilitating efficiency improvements both in data centers and across the economy more broadly. In presentations and moderated discussions, speakers examined the projected power demands from proposed new data centers and implications for the grid, sustainability goals, and local economies and communities and discussed challenges and opportunities in data center infrastructure and hardware–software interactions that could influence the efficiency of AI data centers in the future.

Over the course of the workshop, participants gained a new understanding of the scope and scale of the issues, as well as insights into potential solutions, knowledge and research gaps, and opportunities to create interdisciplinary, collaborative teams to advance innovation in this space. In the workshop’s final session, members of the planning committee reflected on key issues that emerged from the discussions and contributed their thoughts on knowledge gaps and future directions for research, policy, and innovation.

TRENDS IN ARTIFICIAL INTELLIGENCE USE AND RELATED DATA CENTER ENERGY DEMANDS

Throughout the workshop, invited experts shared information on past and current energy demands for AI, particularly in data centers, along with projections for the future. Several speakers highlighted how technological advancements have led to tremendous gains in AI system performance and efficiency over the past decade, but described how gains in energy efficiency have been outpaced by the rapid growth in computing demands and data center expansions. This has resulted in a net increase in energy demands from AI data centers that is projected to accelerate further in the coming years. Much of the growth in data center infrastructure is concentrated in specific regions (e.g., Northern Virginia) due to factors such as fiber optic network availability, business-friendly policies and incentives, affordable power and water, and a skilled workforce. However, this regional clustering can create uneven impacts on local electric power grids and communities across the United States and globally.

Yet, the uncertainty in data center–driven load growth is high—some analyses suggest that energy use by AI workloads is overestimated, while some high-end projections suggest usage could double or even triple by 2030. To put this in perspective, in 2023, data centers accounted for 4.4 percent of total U.S. electricity consumption1—a relatively small share compared to other sources—and AI represents only a portion of activity within data centers. The expansion of AI data centers is also unfolding alongside broader trends in electrification and decarbonization, and overall electricity demand is rising as electrification accelerates across other industries such as transportation, manufacturing, and buildings, adding to the uncertainty in demand projections.
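As a rough sense of scale, the figures above can be combined in a back-of-envelope calculation. The numbers here are illustrative, not workshop projections: total U.S. electricity consumption is assumed to be roughly 4,000 TWh per year and held flat, which is consistent with the 4.4 percent share quoted above.

```python
# Back-of-envelope scale check for the figures quoted above.
# Assumptions (illustrative): total U.S. electricity consumption of
# ~4,000 TWh/year, held flat through 2030.
US_TOTAL_TWH = 4000.0
DC_SHARE_2023 = 0.044  # data centers' 4.4% share of U.S. consumption in 2023

dc_twh_2023 = US_TOTAL_TWH * DC_SHARE_2023  # ~176 TWh

for label, multiplier in [("2023 baseline", 1), ("doubled", 2), ("tripled", 3)]:
    twh = dc_twh_2023 * multiplier
    share = twh / US_TOTAL_TWH
    print(f"{label}: {twh:.0f} TWh ({share:.1%} of assumed U.S. total)")
```

Under these assumptions, a doubling implies roughly 352 TWh (about 8.8 percent of the total) and a tripling roughly 528 TWh (about 13.2 percent), which conveys why even the high-end projections leave data centers as one contributor among several to rising demand.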

The speed at which data centers require additional power is both the central challenge and a real opportunity: it could serve as a catalyst for accelerating grid modernization and advancing beneficial electrification across other sectors. This speed, however, amplifies the challenges that electric power utilities and grid operators face in keeping pace with rising demand—and avoiding the pitfalls of either overbuilding or underbuilding infrastructure—while striving to meet decarbonization goals by increasing clean energy generation and reducing fossil fuel use. Several speakers described how utilities have used projected increases in energy demands from AI data centers as justification for delaying the retirement of fossil fuel generation plants or bringing new gas generators online, posing a setback for meeting decarbonization targets. This is compounded by the fact that utilities are typically incentivized to prioritize the cheapest and most reliable sources of electricity—often natural gas—rather than those with the lowest carbon emissions.

Despite the challenges related to grid generation capacity, several speakers pointed out that the availability of AI tools and the expansion of AI data centers also present new opportunities to modernize the electric grid and support greater efficiency and reliability. For example, if the electricity demands of data centers can be more flexible, more of their energy needs can be met through renewable generation; in addition, various demand response and market participation arrangements can position data centers as a partner to utilities in helping to increase grid capacity and ensure reliability.
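The idea of flexible demand can be made concrete with a toy scheduler that places a deferrable workload, such as an AI training job, in the hours with the most forecast renewable generation. All numbers here are hypothetical, and real demand-response arrangements involve market signals and contractual commitments rather than a simple greedy choice.

```python
# Toy illustration of demand flexibility: schedule a deferrable batch
# workload into the hours with the highest forecast renewable supply.
# All numbers are hypothetical.
renewable_forecast_mw = [120, 90, 60, 200, 260, 240, 150, 80]  # per hour
job_hours_needed = 3
job_load_mw = 50

# Greedy choice: pick the hours with the most renewable generation.
ranked_hours = sorted(range(len(renewable_forecast_mw)),
                      key=lambda h: renewable_forecast_mw[h],
                      reverse=True)
chosen = sorted(ranked_hours[:job_hours_needed])
print(f"Run the job in hours {chosen}")

# Fraction of the job's energy met by renewables in those hours,
# capped at 100% per hour.
covered = sum(min(renewable_forecast_mw[h], job_load_mw) for h in chosen)
print(f"Renewable coverage: {covered / (job_load_mw * job_hours_needed):.0%}")
```

In this forecast the job lands in hours 3–5, when renewable supply peaks; the same shifting logic underlies the demand response and market participation arrangements mentioned above.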

While use of AI is growing and driving new investments in infrastructure, it currently represents only a portion of the total activity within data centers. Most data centers still primarily support a wide range of non-AI workloads, such as cloud computing, web hosting, and storage. As a result, the societal impacts of AI—such as changes in labor, education, and decision-making—are distinct from the broader environmental and energy impacts associated with data centers. It is challenging but important to separate these conversations: AI’s influence on society will depend on how extensively it is used and how efficient models become, while the operational footprint of data centers stems from their overall workload mix, infrastructure design, and energy practices.

___________________

1 A. Shehabi, S.J. Smith, A. Hubbard, et al., 2024, 2024 United States Data Center Energy Usage Report, LBNL-2001637, Lawrence Berkeley National Laboratory, https://eta-publications.lbl.gov/sites/default/files/2024-12/lbnl-2024-united-states-data-center-energy-usage-report.pdf.

Workshop planning committee chair Benjamin C. Lee, University of Pennsylvania, highlighted several points raised around trends and future use cases for large language models (LLMs). For example, AI algorithms can be redesigned to require fewer operations and less data movement, with the potential to create 10–100-fold improvements in efficiency. Designing more domain-specific and specialized machine learning models to use in place of larger, general models can also potentially drive discoveries in medicine, science, and business at much lower computational costs, although the energy efficiency implications of this approach are as yet unknown.

Despite all the opportunities around flexibility and efficiency, several speakers suggested that improving efficiency is not likely to be a major focus for AI developers unless and until LLMs become monetized. Until then, financial and regulatory incentives may be required for companies to design technologies in ways that are less energy intensive, since the cost of electricity is trivial in comparison to capital expenditures on hardware.

Another consideration relevant to the future use of AI and associated energy demand is the degree to which users will tolerate latency, or the length of time it takes for data to travel between the user and the server and back. Factors such as proximity to Internet Exchange Points, high-density fiber optic networks, and direct cloud connections all affect latency, making data center location a crucial consideration. As a result, edge computing is becoming more popular to reduce delays. Beyond user experience, longer latencies can also pose security risks, as data rerouted through multiple locations become vulnerable to interception. Both latency tolerance and throughput (the number of responses an AI can generate per second) influence AI model design and energy efficiency. However, several speakers noted that it is still unclear how much latency users will accept, given the early stage of LLM deployment.
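One physical floor on latency is signal propagation: light travels through optical fiber at roughly two-thirds its vacuum speed (about 200,000 km/s), so distance alone sets a minimum round-trip time no matter how fast the server is. A rough estimate, ignoring routing, queuing, and model inference time:

```python
# Minimum round-trip propagation delay over fiber. Ignores routing,
# queuing, and server processing time, so real latencies are higher.
# Light in fiber travels at roughly 2/3 of c, i.e. ~200 km per ms.
FIBER_SPEED_KM_PER_MS = 200.0

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time for a given one-way distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# Nearby edge site, regional data center, cross-country data center.
for km in (50, 1000, 4000):
    print(f"{km:>5} km one-way: >= {min_round_trip_ms(km):.1f} ms round trip")
```

This is why data center location and edge computing matter for latency: a cross-country round trip costs tens of milliseconds before any computation happens, while a nearby edge site keeps the propagation floor well under a millisecond.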

Multimodal models, which process text alongside images, audio, and video, present another potential complexity for energy demand. These models may require more computational time and thus more energy, although some speakers suggested that new AI tools for compression and optimization could help offset costs and energy needs. Questions such as those around latency, throughput, and multimodal models are just a few examples of the factors that contribute to significant uncertainties around future energy demands related to AI use.


SUSTAINABILITY AND SOCIETAL CONSIDERATIONS AROUND DATA CENTER EXPANSION

In addition to energy demands and related impacts on the grid, the rapid expansion of AI data centers can increase air pollution, affect water use and quality, and contribute to material use and environmental impacts from information technology equipment manufacturing and e-waste at the local, national, and global scales. Noting that data centers are already impacting the electric power grid, communities, and the environment at a speed far faster than experts can understand and quantify, Eric Masanet, University of California, Santa Barbara, highlighted the importance of building stakeholder consortia to address these issues and conduct more holistic sustainability assessments in the long term, but stressed that actions could be taken in the near term to begin to address problems. In particular, he noted that analyses of data center impacts need to incorporate factors such as water, air pollution, and e-waste along with energy and emissions impacts, and he emphasized that analyses should focus on the local scale where burdens are already being felt, since these impacts can be masked in analyses at national and global scales. At the national and global scale, many of the major tech companies, such as Google, have announced their commitment to decreasing carbon emissions from data centers.

To accomplish real reductions in environmental burden, data center operators and manufacturers of AI accelerator hardware need to be more transparent and share more information with the research community. Furthermore, Masanet highlighted the value of bridging data center energy models with life-cycle analysis approaches to provide a full life-cycle perspective. He suggested that the field could eventually move toward consequential life-cycle assessments, an approach wherein sophisticated and dynamic models of data centers’ energy use and emissions consider not only direct life-cycle inputs and outputs, but also various indirect scenarios and consequences on the economy and environment. For example, this approach could be used to understand the sustainability impacts of keeping fossil fuel plants in operation in order to meet growing energy demands instead of being retired and replaced with clean technologies. Using consequential life-cycle analyses to project different scenarios can enable stakeholders to more fully understand near- and long-term impacts of various approaches and reduce adverse impacts, Masanet noted.

In a session moderated by Prashant Shenoy, University of Massachusetts Amherst, experts highlighted societal considerations around data center expansion, including opportunities for data center growth to propel grid improvements, as well as the potential for significant negative impacts on local communities. While AI applications have many potential benefits for society, including for supporting grid modernization and power management, Shenoy said that it is as yet unclear whether these benefits will outweigh the negative impacts of unchecked data center expansion. Several speakers highlighted the need for both technology advancements and policy solutions to guide responsible growth.

The scale and concentration of AI data centers influence their impacts on communities. Several speakers pointed to examples of negative impacts on air quality, water availability, land use, and electricity costs being experienced by communities in Northern Virginia, which is home to the largest concentration of data centers in the world. These negative impacts must also be weighed against the potential benefits to communities, such as tax revenue and the potential for increased grid stability if data centers are capable of flexible operation or of hosting their own behind-the-meter power resources.

As AI development moves forward, speakers underscored the need to look beyond performance improvements and consider factors such as equity and sustainability implications. Shenoy suggested that developing smaller, more distributed AI data centers and inference services, similar to how content delivery and edge networks function, could help to counter trends toward hyperscaling and clustering and thereby ameliorate some of the negative impacts in communities. He added that people may also be willing to accept some sacrifices in terms of performance, such as increased latency, if it helps to avoid negative impacts to the local environment, residents, and grid infrastructure.

DATA CENTER INFRASTRUCTURE AND TECHNOLOGY ADVANCEMENT

Participants also examined how the hardware and software infrastructure that underlies AI data centers can influence future AI use, energy demands, and data centers’ interactions with the grid. Ayse K. Coskun, Boston University, pointed to a need to reconsider all aspects of data center design, from hardware and software to cooling and architecture, for increased efficiency and sustainability now and in the future. Many speakers pointed to the role of flexibility in addressing these issues, including the flexibility to incorporate future improvements in hardware and software; the flexibility to smooth out the wide swings in energy consumption common to some AI workloads, which can threaten grid reliability; and demand flexibility in terms of how data centers respond to external constraints or demand response programs, onsite energy availability, and carbon and cost metrics. While flexibility and demand response have great potential to address efficiency and sustainability, AI-focused data centers today often run at maximum capacity, in part due to the significant cost of AI-specific hardware.

At the same time, several technological advances are under way to improve the efficiency of AI workloads. Researchers are developing new model architectures that require fewer computational resources, such as sparse models, and are exploring smaller, task-specific models as alternatives to large, general-purpose ones. Innovations in hardware–software codesign are also yielding gains in efficiency. Additionally, advances in optical interconnects are helping reduce the energy cost of data movement within data centers, a growing contributor to overall power use. Perhaps most notably, there have been significant increases in chip efficiency in recent years, with each new generation of AI accelerators—such as graphics processing units, tensor processing units, and custom application-specific integrated circuits—delivering substantial improvements in performance per watt, helping to curb the otherwise steep rise in AI-related energy consumption.

Carole-Jean Wu, Meta, underscored the role of AI as this century’s most important technology, affecting every economic sector and bringing new capabilities that may help to address many important societal problems. She said that the rapidly growing energy demands associated with AI make it critical to pursue efficiency innovations across the entire AI stack, from hardware to software to infrastructure and algorithm redesigns. Scalable model architecture innovations can also create efficiencies, and a better overall understanding of AI workloads can unlock more benefits. These innovations will be especially important when easier-to-achieve AI efficiency techniques reach a point of diminishing returns, Wu noted.

Metrics can help redefine technical and policy goals as well as performance and service-level agreements. Several speakers suggested that more holistic sustainability metrics, such as metrics capturing health costs and other community impacts, are needed, along with new metrics to augment or replace current power usage effectiveness (PUE) measures and capture a broader range of sustainability goals, such as carbon emissions, water use, and flexibility.
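Power usage effectiveness, the most widely used of the current measures, is the ratio of total facility energy to the energy delivered to IT equipment; published extensions such as carbon usage effectiveness (CUE) and water usage effectiveness (WUE) follow the same pattern, normalizing emissions or water use by IT energy. A minimal sketch with illustrative numbers:

```python
# Sketch of PUE and two broader data center sustainability metrics.
# All numbers are illustrative, for a single facility over one year.
total_facility_kwh = 120_000_000   # everything: IT, cooling, lighting, losses
it_equipment_kwh = 100_000_000     # energy delivered to IT equipment
co2_kg = 42_000_000                # emissions attributed to the facility
water_liters = 180_000_000         # water consumed (e.g., for cooling)

pue = total_facility_kwh / it_equipment_kwh   # dimensionless, >= 1.0
cue = co2_kg / it_equipment_kwh               # kg CO2e per IT kWh
wue = water_liters / it_equipment_kwh         # liters per IT kWh

print(f"PUE: {pue:.2f}")              # 1.20 means 20% overhead beyond IT load
print(f"CUE: {cue:.2f} kg CO2e/kWh")
print(f"WUE: {wue:.2f} L/kWh")
```

A limitation the speakers' comments point to: a facility can report an excellent PUE while still drawing carbon-intensive power or straining local water supplies, which is why metrics like CUE and WUE, plus community-impact measures, are proposed as complements rather than replacements.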

A CALL FOR COLLABORATION AND DATA SHARING

A common thread throughout the workshop was a call for enhanced collaboration and data sharing around AI data center activities, energy use, and interactions with the grid. In terms of understanding current and future trends, Lee noted that speakers identified a need for additional data to quantify and project the benefits and harms of AI use cases and different data center designs and to enable the models used for these projections to be more granular and reflective of data centers’ hardware, software, and operational characteristics. While capabilities are improving, several speakers noted that the rapid growth and high level of uncertainty in this sector undermine the ability to model energy demands and make accurate projections.

Although it may be difficult to accomplish, Coskun said that more effective data sharing and collaboration among industry, academia, and government would be instrumental in improving the overall understanding of AI workloads, forecasting capabilities, and informing sustainability standards. She suggested that more collaboration and access, perhaps modeled on the Electric Power Research Institute’s DCFlex initiative, can inform prototypes and early-concept demonstrations.2

Recognizing that the potential impacts of data centers encompass multiple domains and stakeholders—from public health practitioners to electrical engineers and grid systems operators—participants also underscored the importance of collaboration and data sharing to fully elucidate and mitigate effects of data center expansion on the grid and communities. Several participants suggested a need for companies building data centers to be more transparent about these operations to ensure accurate projections. Masanet added that experts across many areas will be needed to make use of this information. For example, engineers and technologists are needed to input the computational complexities into the models; energy and life-cycle assessment analysts are needed to extract insights from modeling; epidemiologists and community groups are needed to identify possible impacts; and social science experts are needed to incorporate behavioral and economic implications into sustainability analyses.

Facilitating flexible demand response coordination with optimized, AI-driven metrics that contribute to larger-scale grid coordination and decarbonization efforts would also require effective exchange of information and coordination between data center owners and grid operators, Wu noted. In addition, she said that there is a need for data centers and their suppliers to be more transparent in their emissions and accounting methodologies to enable accurate quantification of their environmental impacts and drive emissions reductions and energy optimizations. Finally, from a policy perspective, Shenoy noted that collaboration among community, industry, and government stakeholders is needed to design fair regulations that bring the widest possible benefits.

___________________

2 Electric Power Research Institute, n.d., “DCFlex,” https://msites.epri.com/dcflex, accessed April 21, 2025.
