• by qnleigh on 10/6/2025, 3:02:50 AM

    For reference, global energy consumption is about 180,000 TWh per year[1]. So while the numbers in this article are large, they're not a significant fraction of the total. Traveling and buying things are probably a much bigger part of your carbon footprint. For example (a worked sketch follows the references):

    - 25 LLM queries: ~8.5 Wh

    - driving one mile: ~250-1000 Wh

    - one glass bottle: ~1000 Wh [2]

    - a new laptop: ~600,000 Wh [3]

    - round-trip flight from LA to Tokyo: ~1,000,000 Wh

    [1] https://ourworldindata.org/energy-production-consumption

    [2] https://www.beveragedaily.com/Article/2008/03/17/study-finds...

    [3] https://www.foxway.com/wp-content/uploads/2024/05/handprint-...
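
    A minimal sketch of the arithmetic behind these comparisons (the per-query figure assumes Altman's 0.34 Wh estimate; the 625 Wh midpoint for driving is my own simplification of the ~250-1000 Wh range):

        # Rough energy comparisons, in watt-hours (Wh).
        WH_PER_QUERY = 0.34  # Altman's unsupported per-query estimate

        comparisons = {
            "25 LLM queries": 25 * WH_PER_QUERY,   # ~8.5 Wh
            "driving one mile": 625,               # midpoint of ~250-1000 Wh
            "one glass bottle": 1_000,
            "a new laptop": 600_000,
            "LA-Tokyo round-trip flight": 1_000_000,
        }

        baseline = comparisons["25 LLM queries"]
        for item, wh in comparisons.items():
            print(f"{item}: {wh:>11,.1f} Wh  (~{wh / baseline:,.0f}x 25 queries)")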

  • by gaoshan on 10/6/2025, 1:49:42 AM

    One thing it's doing is jacking up electricity rates for US states that are part of the [PJM Interconnection grid](https://en.wikipedia.org/wiki/PJM_Interconnection). The capacity auction price, used to guarantee standby availability, is [up significantly](https://www.toledochamber.com/blog/watts-up-why-ohios-electr...) at $270.43/MW-day, far above prior years (~$29–58/MW-day), and this is translating to significantly higher consumer prices.
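
    A rough annualization of those auction prices (a sketch using the figures above; how much of this reaches any given retail bill depends on the utility's load share and is not modeled here):

        # Annualized capacity cost per MW at the new vs. prior auction prices.
        # $/MW-day figures are those quoted in the comment above.
        new_price = 270.43           # $/MW-day, latest PJM capacity auction
        old_low, old_high = 29, 58   # $/MW-day, prior years (approx.)

        for label, price in [("new", new_price), ("old low", old_low), ("old high", old_high)]:
            print(f"{label}: ${price * 365:,.0f} per MW-year")
        # new: ~$98,707 vs. old: ~$10,585-$21,170 per MW-year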

  • by geuis on 10/6/2025, 1:35:32 AM

    My thoughts.

    Current gen AI is going to result in the excess-datacenter equivalent of dark fiber from the 2000s: lots of early buildout and heavy investment, followed by a lack of customer demand and, later, cheaper access to physical compute.

    The current neural network software architecture is pretty limited. Hundreds of billions of dollars of investor money has gone into scaling backprop networks and we've quickly hit the limits. There will be some advancements, but it's clear we're already at the flat part of the current s-curve.

    There's probably some interesting new architectures already in the works either from postdocs or in tiny startups that will become the base of the next curve in the next 18 months. If so, one or more may be able to take advantage of the current overbuild in data centers.

    However, compute has an expiration date like old milk. It won't physically expire, but its economic potential decreases as the technology advances. If the timing is right, though, there is going to be a huge opportunity for the next early adopters.

    So what's next?

  • by simonw on 10/6/2025, 12:59:45 AM

    If I'm interpreting this right, it's estimating that ChatGPT's daily energy usage is enough to charge just 14,000 electric vehicles - and that's to serve on the order of ~100 million daily users.
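
    The implied arithmetic, assuming the ~850 MWh/day figure cited elsewhere in this thread and a typical ~60 kWh EV battery (the battery size is my assumption):

        # How many EV charges fit in ChatGPT's estimated daily energy use?
        daily_energy_kwh = 850_000   # ~850 MWh/day estimate
        ev_battery_kwh = 60          # assumed typical EV pack size
        daily_users = 100_000_000    # ~100M daily users

        print(f"~{daily_energy_kwh / ev_battery_kwh:,.0f} full charges/day")        # ~14,167
        print(f"~{daily_energy_kwh * 1000 / daily_users:.1f} Wh per user per day")  # ~8.5 Wh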

  • by blondie9x on 10/6/2025, 12:52:22 AM

    This doesn't include the energy for mining and chip production either. Can you imagine if it did?

    Then when you take into account the amount of water used to cool the data centers, as well as the water used in the extraction and production process, things get insane: https://www.forbes.com/sites/cindygordon/2024/02/25/ai-is-ac...

  • by SweetSoftPillow on 10/6/2025, 7:23:12 AM

    Very useful context:

    "How much energy does Google’s AI use? We did the math": https://cloud.google.com/blog/products/infrastructure/measur...

  • by driverdan on 10/6/2025, 2:01:54 AM

    This doesn't seem to factor in the energy cost of training, which is currently a very significant overhead.

  • by logicallee on 10/6/2025, 6:57:02 AM

    The energy used for manufacturing the datacenters, including the GPUs, must also be rather high. Manufacturing is an energy-intensive sector.

    Edit: I asked ChatGPT-5:

    https://chatgpt.com/share/68e36c19-a9a8-800b-884e-48fafbe0ec...

    it says:

    >the manufacturing of GPUs and datacenters themselves consumes a large amount of energy, not just their operation. The operational energy use (for AI training, inference, cooling, etc.) gets most of the attention, but the embodied energy — the energy used to extract raw materials, manufacture chips and components, and construct facilities — is substantial.

    and summarizes it with:

        4. Bottom Line
    
         • Manufacturing GPUs and datacenters is highly energy-intensive, but operational energy dominates over time.
         • For a single GPU, embodied energy ≈ 0.5–1 MWh.
         • For a datacenter, embodied energy ≈ 6–12 months of its operational energy.
         • Given AI-scale deployments (millions of GPUs), the embodied manufacturing energy already reaches terawatt-hours globally — roughly comparable to the annual electricity use of a small country.
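
    A back-of-envelope check on that bottom line, using the quoted 0.5-1 MWh per GPU and an illustrative fleet size (the 5M GPU count is an assumption, not from the source):

        # Embodied manufacturing energy across an assumed AI GPU fleet.
        gpus = 5_000_000                  # illustrative fleet size, not from the source
        for per_gpu_mwh in (0.5, 1.0):    # MWh per GPU, per the quoted estimate
            twh = gpus * per_gpu_mwh / 1_000_000   # MWh -> TWh
            print(f"{per_gpu_mwh} MWh/GPU x {gpus:,} GPUs = {twh:.1f} TWh")
        # 2.5-5.0 TWh, i.e. on the order of a small country's annual electricity use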

  • by ChrisArchitect on 10/6/2025, 4:48:35 AM

    Related:

    OpenAI’s Hunger for Computing Power Has Sam Altman Dashing Around the Globe

    https://news.ycombinator.com/item?id=45477192

  • by blueblisters on 10/6/2025, 3:43:11 AM

    Math comparing new datacenter capacity to electric cars -

    Projections estimate anywhere between 10 GW and 30 GW of US datacenter buildout over the next few years.

    1 GW of continuous power can support uniform draw from ~2.6M Tesla Model 3s, assuming 12,000 miles per year at 250 Wh/mile.

    So ~26M Model 3s on the lower end, ~80M on the upper end.

    That's 10x-30x the cumulative number of Model 3s sold so far.
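
    Working the arithmetic through: the straight math gives ~2.9M cars per GW, and the ~2.6M figure above is consistent with assuming roughly 90% charging efficiency (my assumption):

        # EVs supportable per GW of continuous datacenter-scale draw.
        miles_per_year = 12_000
        wh_per_mile = 250
        charging_efficiency = 0.90   # assumption that roughly recovers the ~2.6M figure

        car_kwh_per_year = miles_per_year * wh_per_mile / 1_000 / charging_efficiency
        gw_kwh_per_year = 1e6 * 8_760        # 1 GW continuous for a year, in kWh
        cars_per_gw = gw_kwh_per_year / car_kwh_per_year

        print(f"~{cars_per_gw / 1e6:.1f}M Model 3s per GW")  # ~2.6M
        for gw in (10, 30):
            print(f"{gw} GW: ~{gw * cars_per_gw / 1e6:.0f}M Model 3s")  # ~26M / ~79M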

    And remember, all datacenter draw is concentrated. It will disproportionately impact regions where they're being built.

    We need new, clean power sources yesterday.

  • by maxglute on 10/6/2025, 4:19:27 AM

    >The Schneider Electric report estimates that all generative AI queries consume 15 TWh in 2025 and will use 347 TWh by 2030; that leaves 332 TWh of energy—and compute power—that will need to come online to support AI growth.

    +332 TWh is like... +1% of US power consumption, or +8% of US electricity. If the AI bubble bursts ~2030... that's functionally what the US will be left with (assuming the new power infra actually gets built) mid/long term, since compute depreciates in 1-5 years. For reference, the dotcom bust left the US with a fuckload of fiber layouts that last 30/40/50+ years. We're still using capex from the railroad bubble 100 years ago. I feel like people are failing to grasp how big of an F the US will eat if AI bursts relative to past bubbles. I mean, it's better than tulip mania, but obsolete AI chips are also closer to tulips than fiber or rail in terms of stranded depreciated assets.
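
    A quick sanity check of those percentages against rough US totals (~29,000 TWh primary energy and ~4,200 TWh electricity per year; both are approximate outside figures, not from the comment):

        # Is +332 TWh/year about 1% of US energy and 8% of US electricity?
        ai_twh = 332
        us_primary_twh = 29_000      # approx. annual US primary energy (assumption)
        us_electricity_twh = 4_200   # approx. annual US electricity (assumption)

        print(f"{ai_twh / us_primary_twh:.1%} of primary energy")   # ~1.1%
        print(f"{ai_twh / us_electricity_twh:.1%} of electricity")  # ~7.9%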

  • by b00ty4breakfast on 10/6/2025, 7:51:01 AM

    >We used the figure of 0.34 watt-hours that OpenAI’s Sam Altman stated in a blog post without supporting evidence. It’s worth noting that some researchers say the smartest models can consume over 20 Wh for a complex query. We derived the number of queries per day from OpenAI's usage statistics below.

    I honestly have no clue how much trust to place in data from a blog post written by a guy trying to make people give him lots of money. My gut is to question every word that comes out of his mouth, but maybe I'm overly pessimistic in that regard.

    But besides that, the cost of this stuff isn't just the energy consumption of the computation itself; the equipment needs to be manufactured, raw materials need to be extracted and processed, and supplies and manpower need to be shuffled around. Construction of associated infrastructure has its own costs as well. What are we, as a society (as opposed to shareholders and executives), going to get in return, and is it going to be enough to justify the costs, not just in terms of cash but also resources? To say nothing of the potential environmental impact of all this.

  • by perilunar on 10/6/2025, 11:24:01 AM

    MWh per day? TWh per year?

    In what world are these sensible units of power? Why can't we just use watts, FFS?

    Btw:

    1 MWh per day ≈ 42 kW

    1 TWh per year ≈ 114 MW

    Or:

    All ChatGPT users: 850 MWh/day = 310 GWh/year ≈ 35.4 MW

    All AI users: 15 TWh/year ≈ 1.7 GW
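
    The same conversions as a small sketch (8,760 hours per year):

        # Convert energy-per-time figures into average power.
        def avg_watts(watt_hours: float, hours: float) -> float:
            return watt_hours / hours

        print(avg_watts(1e6, 24) / 1e3)       # 1 MWh/day   ~= 41.7 kW
        print(avg_watts(1e12, 8760) / 1e6)    # 1 TWh/year  ~= 114 MW
        print(avg_watts(850e6, 24) / 1e6)     # ChatGPT     ~= 35.4 MW
        print(avg_watts(15e12, 8760) / 1e9)   # all gen AI  ~= 1.7 GW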

  • by EcommerceFlow on 10/6/2025, 4:44:08 AM

    The question to ask is why the supply of energy hasn't kept up with demand. Regulations (primarily in Democratic states) are most likely the answer. When you use government incentives to pick winners and losers among energy sources, it throws the entire energy market out of sync.