
ENVIRONMENTAL IMPACTS


How much water does AI consume? The public deserves to know it. - OECD.AI. Everyone is talking about artificial intelligence. It's not simply a buzzword; it has already become the backbone of scientific breakthroughs, accelerated business growth, and approaches to global challenges such as climate change. AI models are essentially complicated mathematical functions with many parameters, and AI's ever-growing capabilities rely on huge volumes of data and computationally intensive calculations to extract useful patterns. Servers are hungry. They are also thirsty. Large AI models like GPT-3, with many billions of parameters, are often trained and deployed on large clusters of servers with multiple graphics processing units (GPUs). Air pollution and carbon emissions are well-known environmental costs of AI, but water is another. Scope-1 (onsite) water consumption arises because AI servers' massive energy consumption generates heat, which is commonly dissipated by cooling towers that evaporate water. Scope-2 (offsite) water consumption is the water used to generate the electricity the servers draw. Scope-1 and scope-2 water consumption are sometimes collectively called operational water consumption.
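
The scope-1/scope-2 split above can be turned into a back-of-envelope estimate. The sketch below is illustrative only: the formula structure (onsite water-use effectiveness plus power usage effectiveness times an offsite water-intensity factor) follows the operational-water framing described above, but every numeric value is an assumed round number, not a measured figure.

```python
# Hypothetical estimate of a data center's operational water footprint,
# split into scope-1 (onsite cooling) and scope-2 (offsite electricity
# generation). All default values are illustrative assumptions.

def operational_water_liters(server_energy_kwh: float,
                             wue_onsite: float = 0.5,     # L evaporated per kWh of server energy (assumed)
                             pue: float = 1.2,            # power usage effectiveness (assumed)
                             ewif_offsite: float = 3.1):  # L per kWh of grid electricity (assumed)
    """Return scope-1 + scope-2 water consumption in liters."""
    scope1 = server_energy_kwh * wue_onsite           # onsite cooling-tower evaporation
    scope2 = server_energy_kwh * pue * ewif_offsite   # water embedded in electricity generation
    return scope1 + scope2

# e.g. a 10 kWh workload under the assumed factors:
print(f"{operational_water_liters(10.0):.1f} L")
```

Under these assumptions, most of the footprint comes from the scope-2 (electricity-generation) term, which is why grid choice matters as much as cooling design.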

Water consumption is distinct from water withdrawal (withdrawal counts water taken from a source, much of which is later returned; consumption counts water that is evaporated or otherwise not returned), and there needs to be more transparency about both.

Artificial intelligence technology behind ChatGPT was built in Iowa — with a lot of water | AP News. DES MOINES, Iowa (AP) — The cost of building an artificial intelligence product like ChatGPT can be hard to measure. But one thing Microsoft-backed OpenAI needed for its technology was plenty of water, pulled from the watershed of the Raccoon and Des Moines rivers in central Iowa to cool a powerful supercomputer as it helped teach its AI systems how to mimic human writing.

As they race to capitalize on a craze for generative AI, leading tech developers including Microsoft, OpenAI and Google have acknowledged that growing demand for their AI tools carries hefty costs, from expensive semiconductors to an increase in water consumption. But they're often secretive about the specifics. Few people in Iowa knew about its status as a birthplace of OpenAI's most advanced large language model, GPT-4, before a top Microsoft executive said in a speech it "was literally made next to cornfields west of Des Moines." "Most people are not aware of the resource usage underlying ChatGPT," said Shaolei Ren, a University of California, Riverside researcher who studies AI's water footprint.

New tools are available to help reduce the energy that AI models devour. When searching for flights on Google, you may have noticed that each flight's carbon-emission estimate is now presented next to its cost.

It's a way to inform customers about their environmental impact, and to let them factor this information into their decision-making. A similar kind of transparency doesn't yet exist for the computing industry, despite its carbon emissions exceeding those of the entire airline industry. Artificial intelligence models are escalating this energy demand: huge, popular models like ChatGPT signal a trend of large-scale AI, boosting forecasts that predict data centers will draw up to 21 percent of the world's electricity supply by 2030.

The MIT Lincoln Laboratory Supercomputing Center (LLSC) is developing techniques to help data centers rein in energy use. Their techniques range from simple but effective changes, like power-capping hardware, to adopting novel tools that can stop underperforming AI training runs early.

Kate Crawford: exposing AI's costs | Baillie Gifford. Kate Crawford knows of a secret that makes it impossible to see the latest chatbots and other artificial intelligence advances the same way. The Australia-born professor recently surprised an Edinburgh Book Festival audience with news of a study "indicating that every time you have an exchange with ChatGPT, it's the equivalent of pouring out half a litre of fresh water onto the ground" because that's what it takes to "cool the giant AI supercomputers" involved. When we meet shortly afterwards, she highlights a further issue with the 'large language models' (LLMs) that let ChatGPT and other 'generative' AI tools create text, images and other human-like output.
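
The LLSC's "stop training early" idea mentioned above can be illustrated with a simple patience-based early-stopping check. This is a generic heuristic sketch for illustration only, not the LLSC's actual tool; the function name, thresholds, and sample loss history are all invented.

```python
# Generic early-stopping heuristic: halt a training run whose validation
# loss has stopped improving, saving the energy the remaining epochs
# would consume. All values here are illustrative.

def should_stop(val_losses, patience=3, min_delta=0.01):
    """Stop when the best loss of the last `patience` epochs fails to
    improve on the earlier best by at least `min_delta`."""
    if len(val_losses) <= patience:
        return False  # not enough history to judge
    best_before = min(val_losses[:-patience])
    recent_best = min(val_losses[-patience:])
    return recent_best > best_before - min_delta

# Plateaued run: the last three epochs barely move, so training halts.
plateau = [2.5, 1.8, 1.4, 1.40, 1.395, 1.40]
print(should_stop(plateau))

# Still-improving run: training continues.
improving = [2.5, 1.8, 1.4, 1.2, 1.0, 0.9]
print(should_stop(improving))
```

In practice such a check would be evaluated after each validation pass; the energy saving comes from not running the remaining scheduled epochs once the curve flattens.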

“The energy difference from just doing a traditional search query to using an LLM is enormous,” she says. “Some research indicates it can be up to 1,000 times more energy intensive.” And this brings us to the heart of the matter.

But that’s just one of many plates the multi-talented scholar has spinning.
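
The scale of the "up to 1,000 times" figure quoted above becomes concrete with a little arithmetic. In the sketch below, the baseline per-search energy (0.3 Wh) and the daily query volume are assumed illustrative figures, not values from this text; only the 1,000× factor comes from the quote.

```python
# Rough arithmetic on the quoted "up to 1,000x" energy gap between a
# traditional search and an LLM query. Baseline energy and query volume
# are assumptions for illustration.
search_wh = 0.3                      # assumed energy per traditional search, in Wh
llm_factor = 1000                    # upper-bound multiplier from the quote
llm_wh = search_wh * llm_factor      # 300 Wh per LLM query at the upper bound

queries_per_day = 1_000_000          # assumed daily volume
extra_kwh = (llm_wh - search_wh) * queries_per_day / 1000

print(f"{extra_kwh:,.0f} extra kWh per day")
```

Even at a modest assumed volume, the upper-bound gap works out to hundreds of megawatt-hours of additional demand per day, which is why the search-vs-LLM comparison keeps coming up.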

NEW DATA CENTRES

Data Center Fires: A Detailed Breakdown with 21 Examples - Dgtl Infra. A data center employs advanced fire protection systems and implements comprehensive prevention, detection, and suppression strategies to mitigate potential fire hazards. Despite these measures, data center fires – while seemingly infrequent, with only a few major incidents occurring every year across thousands of facilities worldwide – present a significant threat that should not be dismissed. Data center fires occur in specialized buildings equipped with power and cooling infrastructure that are used to house computer servers and network equipment. These fires are caused by factors including electrical failures, overheating lithium-ion batteries, inadequate maintenance, and human error. The impact of data center fires extends beyond immediate physical damage to the facility and equipment, often resulting in substantial downtime required to restore operations.

Shaolei Ren on X: "Thanks to more information recently released by Microsoft, we've updated our estimate of the operational water footprint for GPT-3 (175B). Will update our preprint ASAP. Note that GPT-4 reportedly has a much larger model size than GPT-3."

Bing Chat and ChatGPT use '1 bottle of water' in cooling for every query, leading to concerns for local water supplies. Climate change is impacting everything, like it or not.

The gradual increase in average summer temperatures is creating an increase in extreme weather events and impacting everything from food availability to energy prices. Microsoft is no stranger to the effects of climate change either, as one of the few companies pledging to offset its entire current and legacy carbon footprint in the decades ahead. However, the advent of Bing Chat and ChatGPT may have thrown a wrench in the works. Generative AI like Bing Chat and ChatGPT can mimic human text and speech, creating natural language search queries, reports, and summaries.

ChatGPT reportedly costs around $700,000 per day to run, but did you know that every query also uses the equivalent of a bottle of water in cooling? For Microsoft, this creates something of a headache: water scarcity is projected to become a major challenge as we head into the mid-21st century.
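
The two figures above (the reported daily running cost and the "bottle of water" per query) can be combined into per-query numbers, given a query volume. The daily query count below is an assumed figure not given in the source, so treat the results as order-of-magnitude only.

```python
# Per-query arithmetic from the figures quoted above. The daily query
# volume is an assumption; cost and water-per-query come from the text.
daily_cost_usd = 700_000        # reported daily running cost
daily_queries = 10_000_000      # assumed query volume (NOT from the source)
water_per_query_l = 0.5         # "one bottle" of cooling water per query

cost_per_query = daily_cost_usd / daily_queries
daily_water_l = daily_queries * water_per_query_l

print(f"${cost_per_query:.2f} per query, "
      f"{daily_water_l / 1000:,.0f} cubic meters of water per day")
```

Under these assumptions the cost per query is only a few cents, but the aggregate daily water draw reaches thousands of cubic meters, which is the scale that worries local water utilities.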


Q&A: UW researcher discusses just how much energy ChatGPT uses. Expert quotes | Technology | UW News blog, July 27, 2023. Training a large language model such as ChatGPT uses, on average, roughly the equivalent of the yearly electricity consumption of over 1,000 U.S. households, according to Sajjad Moazeni, a University of Washington assistant professor of electrical and computer engineering who studies networking for AI and machine learning supercomputing. ChatGPT and other large language models learn to mimic humans by analyzing huge amounts of data, and behind any chatbot's text box is a large network of computer processing units that support training and running these models. How much energy do the networks running large language models consume? A lot, according to Moazeni.

Just training a chatbot can use as much electricity as a neighborhood consumes in a year. UW News sat down with Moazeni to learn more.

A.I. usage fuels spike in Microsoft's water consumption.

Building a large language model requires analyzing patterns across a huge trove of human-written text.

AI's Big Dirty Problem Is Tarnishing Big Tech's Environmental Image. It's not easy being green: AI's 'hidden' carbon footprint. Recent media coverage has drawn attention to the important fact that, despite its seeming intangibility, AI is not cost-free from a carbon-accounting perspective. Environmental concerns should not eclipse AI's clear potential to generate positive societal impacts and, indeed, environmental benefits (see, for example, our blog discussing ways in which AI can help tackle climate change).

However, businesses should be alert to the hidden carbon footprint of such technologies, particularly as their use becomes more widespread and more deeply embedded into business models. Recent research has made the striking prediction that, unless sustainable AI practices are implemented rapidly, by 2025 AI "will consume more energy than the human workforce, significantly offsetting carbon zero gains". Where, however, does this energy consumption happen? The answer, as recently reported, is that training and running advanced AI models consumes vast quantities of water and electricity.

For example, research has found that the carbon emissions of writing and illustrating are lower for AI than for humans.