Conversations about both the detriments and benefits of artificial intelligence use in academic settings have been prevalent in recent years. However, an often overlooked consideration in these discussions is the impact that AI has on the environment.
Consequences of computing
With the sheer amount of information AI makes accessible, all of that data and computing power must be housed somewhere: AI data centers are large facilities that provide the power and computing resources for machine-learning models.
These centers consist of hardware and software systems that continuously store, compute and transmit data. Because thousands of servers run uninterrupted, data centers require robust cooling, often using large quantities of water, to prevent overheating.
One query on ChatGPT requires 519 milliliters of water, a little more than one standard bottle. The chatbot handles around 10 million queries daily, using up to 1.4 million gallons of water just to answer users' questions, not including the water needed to maintain the data centers themselves.
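A quick back-of-the-envelope check shows the two figures above are consistent with each other (the per-query and per-day numbers come from the reporting; the gallon conversion is standard):

```python
# Sanity check: 519 mL per ChatGPT query times ~10 million queries per day,
# converted to US gallons.
ML_PER_QUERY = 519            # milliliters of water per query (from reporting)
QUERIES_PER_DAY = 10_000_000  # approximate daily query volume (from reporting)
ML_PER_GALLON = 3785.41       # milliliters in one US liquid gallon

daily_gallons = ML_PER_QUERY * QUERIES_PER_DAY / ML_PER_GALLON
print(f"{daily_gallons:,.0f} gallons per day")  # roughly 1.37 million
```

The result, about 1.37 million gallons per day, matches the "up to 1.4 million gallons" figure.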
According to the Washington Post, “a large data center, researchers say, can gobble up anywhere between 1 million and 5 million gallons of water a day — as much as a town of 10,000 to 50,000 people.”
According to University of Arizona associate professor of chemical and environmental engineering Kerri Hickenbottom, these centers can use air or water-based cooling systems, both of which necessitate large quantities of water, albeit from different sources.
“If you think about your home, you can have a swamp cooler that cools your home, or you can have AC, which is like a mechanical compression system. These systems, both of them, take quite a bit of water to operate. The evaporative cooling systems evaporate the water, and then they use that latent heat of evaporation as a cooling effect, and then they distribute that cold air to cool off the servers,” Hickenbottom said. “Air cooled systems also use water, except not necessarily on site. What they’ll do is use a mechanical compressor […]. They use a lot of energy on site to compress a coolant to then be used to cool the servers. On site, they may not use a lot of water, but they use a lot of water at the power generation facility that generates the energy to run your cooling system.”
The water can only be recycled a few times before the system must be replenished with clean freshwater to prevent mineral and salt buildup.
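The latent-heat mechanism Hickenbottom describes can be sketched with a rough estimate. The figures here are assumptions for illustration, not from the article: water's latent heat of vaporization is about 2.26 MJ/kg, and a hypothetical 1 MW server load is assumed.

```python
# Back-of-the-envelope: how much water an evaporative cooling system must
# evaporate to carry away server heat via the latent heat of vaporization.
LATENT_HEAT_J_PER_KG = 2.26e6  # J/kg for water, approximate
SERVER_LOAD_W = 1e6            # assumed 1 MW of server heat to remove

kg_per_second = SERVER_LOAD_W / LATENT_HEAT_J_PER_KG  # ~0.44 kg/s
liters_per_day = kg_per_second * 86_400               # 1 kg of water is ~1 L
print(f"~{liters_per_day:,.0f} liters evaporated per day per MW of load")
```

Even under these simplified assumptions, a single megawatt of server load evaporates tens of thousands of liters per day, which is why large facilities consume water at the scale the Washington Post describes.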
Beyond this water usage, the centers also require extensive amounts of energy, much of which comes from fossil fuels.
According to the United Nations Environmental Programme, “a request made through ChatGPT, an AI-based virtual assistant, consumes 10 times the electricity of a Google Search, reported the International Energy Agency.”
The IEA further reported that “by 2026, electricity consumption by data centers, cryptocurrency and artificial intelligence could reach 4% of annual global energy usage — roughly equal to the amount of electricity used by the entire country of Japan.”
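The IEA's Japan comparison can be sanity-checked against round public estimates. Both figures below are assumptions for illustration (approximate global electricity demand and Japan's annual consumption), not numbers from the article:

```python
# Rough check of the "4% of global usage = Japan" comparison,
# using assumed round figures in terawatt-hours per year.
GLOBAL_ELECTRICITY_TWH = 26_000  # assumed global annual electricity demand
JAPAN_TWH = 940                  # assumed Japanese annual consumption

four_percent = 0.04 * GLOBAL_ELECTRICITY_TWH  # 1,040 TWh
print(f"4% of global demand: {four_percent:,.0f} TWh vs Japan: {JAPAN_TWH} TWh")
```

Four percent of global demand lands within roughly 10% of Japan's total, so the comparison holds at this level of approximation.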
Eyes on Arizona: Why the southwestern state is prime real estate for data centers
Arizona currently houses 111 data centers with 102 in Phoenix, eight in Tucson and one in Nogales.
According to Hickenbottom, Arizona is an ideal location for these centers for multiple reasons: energy is relatively cheap in the state, and Arizona sits on a high-quality fiber-optic line with good connections to other major metropolitan areas, like Los Angeles and San Antonio. Arizona's stable climate and relative freedom from natural disasters also make it well suited to the construction of these centers.
However, Arizona is also water-stressed, making the increased use of this already precious resource troubling for many communities across the state. Drought conditions throughout Arizona continue to worsen; over the past few months, Phoenix fell just one day short of its record of 160 straight days without measurable precipitation, finally receiving rain after 159 days. In addition, according to the U.S. Drought Monitor, nearly 64% of Arizona is now experiencing severe or extreme drought conditions.
This data raises the question: is Arizona equipped to meet the needs of the data centers it attracts?
According to Hickenbottom, it might not be.
“I do not think we’re prepared for this load. I think that there’s a lot of excitement for attracting these data centers, because, you know, it does bring a lot of economic growth. However, in the long run, it doesn’t provide a lot of jobs. Initially it does, because, you know, you’re building these data centers, which is a huge infrastructure undertaking,” Hickenbottom said.
University of Arizona assistant professor of watershed management and ecohydrology Shang Gao expressed a similar sentiment about the impact of data centers on a state that is already short of water.
“It always boggles my mind why we have to have a huge data center in the middle of the desert. In terms of energy it sort of makes sense, as we have large solar energy input, but the real question is the water usage,” Gao said. “Having data centers is really not an option if you ask me, as we are already short of water. Even if the adjacent areas of Phoenix where the data centers are located have more water than Tucson from their local source of water from the two reservoirs in the Phoenix area, it is still an issue. The region depends on the Central Arizona Project, which migrates water from Colorado hundreds of miles away which brings the water price here up.”
Water prices have become a concern for all parties involved, companies and individuals alike.
Negotiating conflicting interests: Reactions from public and private sectors
According to a report published by the University of Tulsa, tech companies are often able to negotiate better rates for water than local residents, as water rates are set by public authorities based on the cost of water treatment and distribution rather than on the available supply.
Some Arizona residents were outraged over Google’s negotiated water rate for its planned data center in Mesa, Arizona, which is “$6.08 per 1,000 gallons of water, while residents paid $10.80 per 1,000 gallons,” placing the community at a disadvantage.
However, in a similar vein, these centers also have leverage when it comes to the use of renewable energy.
“[Arizona] is really trying to shift a lot of its energy sources, shifting from natural gas and coal. We’ve seen some power generating stations using conventional natural gas and coal, those types of resources, we’ve seen those generating stations close and more solar popping up in the area. These data centers also have a big sustainability component. They know that they need to be more sustainable, and so they can help drive the market for renewable energy resources,” Hickenbottom said. “When they’re looking for contracts and new places to set up their establishments, [data centers] actually have a lot of bargaining power with the local municipalities and utilities to say, hey, I want my energy portfolio to be 90% renewable energy resources, which helps these utilities for planning and building new infrastructure that’s more renewable and resource based. So they do have a lot of sway in that space, which is very positive.”
Not all negative: How AI is being used to solve problems it creates
The irony is that despite its water consumption, artificial intelligence is becoming a major force in developing unique methods for improving water conservation.
AI is a double-edged sword. While it requires many resources for upkeep, it remains a valuable tool for researchers, assisting in decoding large data sets and advancing various components of research initiatives.
“AI is data resilient, the system depends on what data you want and the model will be trained accordingly so it makes things super efficient. So we can skip the hidden mechanisms that we may or may not understand and look at the end results without the intermediate steps. If you ask anybody in academia involved in prediction, optimization, categorization work they will use AI techniques and I am for sure using it,” Gao said.
According to an article published in the scientific journal Discover Water, “by leveraging data from diverse sources, including micro sensing, imaging, in situ and remote sensing devices, AI techniques are now enabling the creation of reliable and robust hydrological models at finer spatiotemporal resolutions, which is crucial for addressing highly nonlinear hydro-meteorological processes [31,32,33,34,35]. Therefore, exploring innovative AI models is essential for better allocation, regulation and conservation of water resources, significantly contributing to their sustainable management.”
There is always room for improvement in the functionality of technology. “If we can develop technology to improve the water usage efficiency for the data centers, like how we are already doing research on how to make agriculture water usage efficient, if we make the usage in data centers less wasteful, we can create more room to meet that demand,” Gao said.
AI and accessibility: How energy and water usage sparks broader conversations
As conversations about the role AI plays in academic, professional, social and environmental spheres continue, researchers are keeping an eye on how this influence will affect data usage and accessibility. With these data centers contributing to significant energy consumption, prices may start to rise for average consumers.
Hickenbottom believes that users will see a change in the prices of resources, which may affect how we engage in different activities.
“Maybe we’ll go back to watching more DVDs, streaming less. I don’t know. Maybe we’ll read more actual books instead of downloading. Maybe we’ll go to the library more. Maybe we’ll regress a little bit, do a little bit of a data cleanse […] and start to connect more with real objects, which wouldn’t be a bad thing. I don’t know,” Hickenbottom said.
This change in prices could also impact accessibility and create economic disparities in data usage.
“Accessibility to data I think will change a lot, like who can access it, because it’s going to get more expensive to access […]. I don’t think it’s going to necessarily be as accessible as it has always been, so that will be something interesting to keep an eye on moving forward,” Hickenbottom said.