    AI is booming, but so is its carbon footprint

    The creation of every new chatbot and image generator requires a significant amount of electricity.
By Agency Staff | 10 March 2023

    Artificial intelligence has become the tech industry’s shiny new toy, with expectations it’ll revolutionise trillion-dollar industries from retail to medicine. But the creation of every new chatbot and image generator requires a lot of electricity, which means the technology may be responsible for a massive and growing amount of planet-warming carbon emissions.

Microsoft, Google and ChatGPT maker OpenAI use cloud computing that relies on thousands of chips inside servers in massive data centres across the globe to train AI algorithms called models, analysing data to help them “learn” to perform tasks. The success of ChatGPT has other companies racing to release their own rival AI systems and chatbots or building products that use large AI models to deliver features to everyone from Instacart shoppers to Snap users to chief financial officers.

    AI uses more energy than other forms of computing, and training a single model can gobble up more electricity than 100 developed-world homes use in an entire year. Yet the sector is growing so fast — and has such limited transparency — that no one knows exactly how much total electricity use and carbon emissions can be attributed to AI. The emissions could also vary widely depending on what type of power plants provide that electricity; a data centre that draws its electricity from a coal or natural gas-fired plant will be responsible for much higher emissions than one that draws power from solar or wind farms.

    While researchers have tallied the emissions from the creation of a single model, and some companies have provided data about their energy use, they don’t have an overall estimate for the total amount of power the technology uses. Sasha Luccioni, a researcher at AI company Hugging Face, wrote a paper quantifying the carbon impact of her company’s Bloom, a rival of OpenAI’s GPT-3. She has also tried to estimate the same for OpenAI’s viral hit ChatGPT, based on a limited set of publicly available data.

    “We’re talking about ChatGPT and we know nothing about it,” she said. “It could be three raccoons in a trench coat.”

    Researchers like Luccioni say we need transparency on the power usage and emissions for AI models. Armed with that information, governments and companies may decide that using GPT-3 or other large models for researching cancer cures or preserving indigenous languages is worth the electricity and emissions, but writing rejected Seinfeld scripts or finding Waldo is not.

    Cautionary tale

    Greater transparency might also bring more scrutiny; the crypto industry could provide a cautionary tale. Bitcoin has been criticised for its outsized power consumption, using as much annually as Argentina, according to the Cambridge Bitcoin Electricity Consumption Index. That voracious appetite for electricity prompted China to outlaw mining and New York to pass a two-year moratorium on new permits for crypto mining powered by fossil fuels.

Training GPT-3, which is a single general-purpose AI program that can generate language and has many different uses, took nearly 1.3GWh, according to a research paper published in 2021, or about as much electricity as 120 US homes would consume in a year. That training generated 502 tons of carbon emissions, according to the same paper, or about as much as 110 US cars emit in a year. That’s for just one program, or “model”.

While training a model has a huge upfront power cost, researchers found in some cases it’s only about 40% of the power burned by the actual use of the model, with billions of requests pouring in for popular programs. Plus, the models are getting bigger. OpenAI’s GPT-3 uses 175 billion parameters, or variables, that the AI system has learned through its training and retraining. Its predecessor used just 1.5 billion.
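
As a rough back-of-the-envelope check of those equivalences (a sketch only; the per-home and per-car averages below are common US figures assumed for illustration, not numbers given in the article):

```python
# Sanity check of the GPT-3 training figures quoted above.
TRAINING_ENERGY_MWH = 1_300    # "nearly 1.3GWh" from the 2021 paper
TRAINING_EMISSIONS_T = 502     # tons of CO2 from the same paper

# Assumed averages (not from the article): a US home uses roughly
# 10.6MWh of electricity a year; a US car emits roughly 4.6t of CO2 a year.
HOME_MWH_PER_YEAR = 10.6
CAR_TONNES_PER_YEAR = 4.6

print(f"Home-years of electricity: {TRAINING_ENERGY_MWH / HOME_MWH_PER_YEAR:.0f}")    # ~123
print(f"Car-years of emissions:    {TRAINING_EMISSIONS_T / CAR_TONNES_PER_YEAR:.0f}")  # ~109
```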

    OpenAI is already working on GPT-4, plus models must be retrained regularly in order to remain aware of current events. “If you don’t retrain your model, you’d have a model that didn’t know about Covid-19,” said Emma Strubell, a professor at Carnegie Mellon University who was among the first researchers to look into AI’s energy issue.

    Another relative measure comes from Google, where researchers found that artificial intelligence made up 10-15% of the company’s total electricity consumption, which was 18.3TWh in 2021. That would mean that Google’s AI burns around 2.3TWh annually, about as much electricity each year as all the homes in a city the size of Atlanta.
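
Working through that estimate, using only the figures quoted above:

```python
# Google's reported total electricity consumption in 2021, in TWh.
GOOGLE_TOTAL_TWH_2021 = 18.3

# Researchers' estimate of AI's share of that total.
AI_SHARE_LOW, AI_SHARE_HIGH = 0.10, 0.15

low = GOOGLE_TOTAL_TWH_2021 * AI_SHARE_LOW
high = GOOGLE_TOTAL_TWH_2021 * AI_SHARE_HIGH
print(f"Estimated AI electricity use: {low:.1f}-{high:.1f} TWh/year")  # ~1.8-2.7TWh, midpoint ~2.3TWh
```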

    While the models are getting larger in many cases, the AI companies are also constantly working on improvements that make them run more efficiently. Microsoft, Google and Amazon — the biggest US cloud companies — all have carbon negative or neutral pledges. Google said in a statement that it’s pursuing net-zero emissions across its operations by 2030, with a goal to run its offices and data centres entirely on carbon-free energy. The company has also used AI to improve energy efficiency in its data centres, with the technology directly controlling cooling in the facilities.

    OpenAI cited work it has done to make the application programming interface for ChatGPT more efficient, cutting electricity usage and prices for customers. “We take our responsibility to stop and reverse climate change very seriously, and we think a lot about how to make the best use of our computing power,” an OpenAI spokesman said in a statement. “OpenAI runs on Azure, and we work closely with Microsoft’s team to improve efficiency and our footprint to run large language models.”

    Microsoft noted it is buying renewable energy and taking other steps to meet its previously announced goal of being carbon negative by 2030. “As part of our commitment to create a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application,” the company said in a statement.

    “Obviously these companies don’t like to disclose what model they are using and how much carbon it emits,” said Roy Schwartz, professor at Hebrew University of Jerusalem, who partnered with a group at Microsoft to measure the carbon footprint of a large AI model.

There are ways to make AI run more efficiently. Since AI training can happen at any time, developers or data centres could schedule the training for times when power is cheaper or at a surplus, thereby making their operations greener, said Ben Hertz-Shargel of energy consultant Wood Mackenzie. AI companies that train their models when power is at a surplus could then tout that in their marketing, as illustrated in the sketch below. “It can be a carrot for them to show that they’re acting responsibly and acting green,” Hertz-Shargel said.
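
A minimal sketch of what such carbon-aware scheduling might look like, assuming a hypothetical hourly grid carbon-intensity forecast (the data source and the numbers here are illustrative, not from the article):

```python
from datetime import datetime

# Hypothetical hourly grid carbon-intensity forecast in gCO2/kWh
# (illustrative values; a real scheduler would pull these from a
# grid-data feed or the utility's own forecasts).
forecast = {
    datetime(2023, 3, 10, 0):  420,
    datetime(2023, 3, 10, 6):  390,
    datetime(2023, 3, 10, 13): 190,   # midday solar surplus
    datetime(2023, 3, 10, 18): 460,   # evening peak
}

# Kick off the training job in the cleanest available window.
greenest_hour = min(forecast, key=forecast.get)
print(f"Start training at {greenest_hour:%H:%M}: "
      f"{forecast[greenest_hour]} gCO2/kWh forecast")
```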

Most data centres use graphics processing units, or GPUs, to train AI models, and those components are among the most power-hungry the chip industry makes. Large models require tens of thousands of GPUs, with training runs lasting from weeks to months, according to a report published by Morgan Stanley analysts earlier this month.

One of the bigger mysteries in AI is the total accounting for carbon emissions associated with the chips being used. Nvidia, the biggest manufacturer of GPUs, said that when it comes to AI tasks, its chips complete the work more quickly, making them more efficient overall.

    “Using GPUs to accelerate AI is dramatically faster and more efficient than CPUs — typically 20x more energy efficient for certain AI workloads, and up to 300x more efficient for the large language models that are essential for generative AI,” the company said in a statement.

While Nvidia has disclosed its direct emissions and the indirect ones related to energy, it hasn’t revealed all of the emissions it is indirectly responsible for, said Luccioni, who asked for that data for her research.

    When Nvidia does share that information, Luccioni thinks it’ll turn out that GPUs burn up as much power as a small country. “It’s going to be bananas.”  — Josh Saul and Dina Bass, (c) 2023 Bloomberg LP
