I've never been so conflicted about a technology

46 points by speckx | 109 comments | 5/15/2025, 3:45:32 PM | marcjenkins.co.uk

Comments (109)

roywiggins · 4h ago
> Models require massive amounts of electricity and water (to cool servers).

Does it?

https://prospect.org/environment/2024-09-27-water-not-the-pr...

> training GPT-3 used as much water as just under twice what is required for the average American’s beef consumption. In other words, just two people swearing off beef would more than compensate.

https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...

> It would be a sad meaningless distraction for people who care about the climate to freak out about how often they use Google search. Imagine what your reaction would be to someone telling you they did ten Google searches. You should have the same reaction to someone telling you they prompted ChatGPT.

Aurornis · 4h ago
I agree with your general point, but water usage numbers for farming animals are notorious for being exaggerated. There are studies that multiply total rainfall by the entire area of grazing land the cows are allowed on and count that as “water used”.

Water usage is also a nuanced concept because water isn’t destroyed when we use it. You have to consider the source and incidentals of processing it, as well as what alternatives it’s taking away from. None of this fits into convenient quotes though.

I think energy usage is a much better metric because we can quantify the impacts of energy usage by considering the blended sum of the sources going into the grid.

Borg3 · 3h ago
Of course water is not destroyed. It gets contaminated. And to clean it up, you need energy. Basically, we can simplify the whole equation down to a single variable: energy. Whatever you do, it needs energy. And people treat that problem, and waste management, far too lightly. If we really wanted to recycle everything properly, I suspect global energy usage would increase at least tenfold. For now, it's just better (I mean more economical) to "store" that waste somewhere.
bryanlarsen · 2h ago
Not in a data center. In a data center the water is either re-used (closed loop) or it's evaporated (open loop). In either case the water does not need decontamination.
UncleOxidant · 52m ago
Yeah, I'm not sure how water in a data center gets contaminated unless they're saying it's contaminated with heat. That can be an issue in some areas where raising the temperature of a river could have an adverse effect on life in the river, but I don't think many of these data centers put water directly back into a river.
rdiddly · 3h ago
I must be missing something because water seems like the least of our worries when compared to the energy being used and the carbon dioxide being produced. I know I know, still talking about CO2 at parties instead of the hot new water thing? BOOOO-RINGGGG!!!

The whole issue of "using" water is meaningless to me in the context of the water cycle. Does a data center "use" water? Whatever water evaporates from their cooling systems falls again as rain and becomes someone else's water. Same with farming - it all either evaporates (sometimes frustratingly right from the field it was applied to, or otherwise from the surface of the river it eventually runs into, or from the food you bite into, or from your sweat, or from your excretions/the sewage system/rivers again), or ends up supplementing a (typically badly depleted) aquifer, or gets temporarily used by animals including humans to e.g. hydrolyze fats (but full completion of metabolism of said fats & fatty acids actually returns MORE water on a net basis than it took to metabolize them) and so on.

In short, water is never passing into anything or anyone. It's passing through it. You don't own it, you're just borrowing it.

Even water recirculated as a coolant in a data center (the closest thing to actually "using" water) is a finite quantity, needed only one time, with maybe small top-ups due to losses, all of which end up, you guessed it, evaporating into the commons.

roywiggins · 3h ago
Water use can cause big, localized problems, but that's mostly solvable by just putting the datacenters somewhere else.

Some places rely on deep aquifers that don't refill within human lifetimes, essentially fossil water. But that's mostly a local problem, and we should just stop building water-hungry industry and agriculture in those places because it's stupid.

perrygeo · 4h ago
I don't understand the "whataboutism" angle. Running LLMs requires massive amounts of electricity and water. Period.

That beef consumption and other activities also require massive amounts of resources is a separate issue. Both can be true. And both need to be addressed. We no longer have the luxury of picking and choosing which climate interventions we find most convenient; we need to succeed on all fronts. If the patient is dehydrated and bleeding, do doctors sit around debating whether to give them water or gauze? No, they do both immediately.

btilly · 3h ago
According to https://www.theverge.com/24066646/ai-electricity-energy-watt... a reasonable estimate is half a percent of global energy usage by 2027. On the one hand, this is a lot of energy. On the other, that's in the same general neighborhood as what we could save by maintaining proper tire pressure on our cars. (See https://www.factcheck.org/2008/08/the-truth-about-tire-press... for more about that.)
perrygeo · 2h ago
I get that. I just don't get why you frame it as an either/or question, as though they were mutually exclusive. We can reduce data center emissions and improve tire maintenance. Why not both?
TheNewsIsHere · 3h ago
I used to live in a place where there are a lot of data centers now. There were only a few just a few years ago. Now there are dozens, with more under construction.

I returned to that place to visit family and friends last year. It was an eye-opening experience. The people who live there have taken a keen interest in curtailing further development of any new data centers. One of the chief complaints is the constant power issues that didn't exist before 2021-2022.

The locals argue that this is the result of the dramatic infusion of AI into every technology product. They're likely not wrong. The communities in the area became quite politically active over the issue and have retained all manner of analysts and scientists, journalists and investigators, and so on, to aid them in making a political case against future data center development.

The thing that got me was the complaints about the noise. Over the past few years the locals, and those in their employ, have been monitoring noise levels near the data centers, and they've tracked sustained increases in noise pollution that align with the timeline over which the use of AI has exploded. It has become something of a local trope to measure the working day by how much noise pollution the nearest data center is producing. Mostly belonging to a hyperscaler known by an acronym.

The data centers have reportedly not been the boon that was promised, which is seen as adding insult to injury. The area already has a vibrant tech scene independent of data center operations. So the locals don't really see value in allowing more data centers to be built, and they're starting to organize politically around preventing future data center construction and imposing heavy usage-based taxation on the utilities consumed by the existing ones.

teruakohatu · 3h ago
> massive amounts of electricity and water ... we need to succeed on all fronts

Datacenters used for training or batch jobs can be placed where water and power are plentiful, ambient temperatures are relatively low, and government/society are stable.

A datacenter is going to be built at the bottom of New Zealand where all of these things are true. There must be plenty of other places in the world where this holds.

At least for training, I think it is possible to have our cake and eat it. For real-time inference, probably not.

int_19h · 17m ago
Given typical inference speeds for SOTA models, I rather suspect that you could also do it for inference without latency being particularly notable.

We might need to lay a few more cables to New Zealand though.

distances · 3h ago
> A datacenter is going to be built at the bottom of New Zealand where all these things are true. There must be plenty of places in the world where this holds true.

There are many datacenters being built in Finland. Water is not running out, and electricity is among the cheapest in Europe thanks to plentiful wind power.

metalcrow · 4h ago
They would if the patient were alive and fighting them! In our case, the patient is society, and it is definitely not unified behind the idea that we cannot pick and choose. If the doctors want to save their patient, they either need to tie them down and do what is needed, or they need to work within the confines of what the patient allows while trying to convince them to allow more.
dgb23 · 3h ago
The patient is kicking and screaming, because they are an addict.

Regardless, I’m not convinced by the parent comment. It basically suggests we solve this without analysis of costs and effects. Sounds insane to me.

Especially since the most promising solutions come from rapidly developing technologies and research. Turning off the lights just sends us backwards.

-__---____-ZXyw · 1h ago
> Especially since the most promising solutions come from rapidly developing technologies and research

What solutions are you referring to?

This sentiment baffles me when I see it. It's not like we learned of this problem yesterday and now have to guess at how we might solve it.

For 35 years, "rapidly developing technologies and research" haven't improved the numbers. I understand that there's a regular stream of news touting one thing or another as a possible solution, but which one of them has actually done anything? How long do we wait for this techno-optimism to bear fruit?

roywiggins · 3h ago
Okay, but there is some reason why you personally are still spending electrons posting on the internet and not, presumably, entertaining yourself by rolling coal, and it's probably because the amount of energy you're using to browse and post on HN seems trivial to you. Does the energy use differential between Python and C++ seem important enough to always pick C++ for everything?

What I'm saying is that casual LLM use is also essentially trivial, and scolding people for querying ChatGPT when you wouldn't scold people for, say, making a dozen Google searches or watching a Netflix show or taking a long shower or driving to the mall, is quite silly and a waste of time.

Similarly, there's a reason doctors will tell you to worry more about your cigarette habit than almost any other bad habit you have: it's so much more likely to kill you that it practically demands to be prioritized.

lolinder · 3h ago
Whataboutism is bad, but there's a distinction between whataboutism and making comparisons to get a sense of the true scale of the problem. If you're not allowed to ever compare relative scales of things without being accused of whataboutism then we're paving the way to making every molehill a mountain.

Two average Americans' beef consumption puts the water cost of training GPT-3 at roughly 1/150,000,000 of the US beef industry's water consumption (two people out of roughly 330 million). That makes it a rounding error relative to all the other uses we put water to, and pointing that out is not whataboutism, it's putting the problem at its proper scale.

roywiggins · 3h ago
if a few thousand westerners who want to atone for LLM water use skipped a few burgers, the planet would come out ahead on net I think
-__---____-ZXyw · 1h ago
No burgers and no LLMs is better, though, in absolute terms. Or at least, that's one of the points being made, I believe
clbrmbr · 3h ago
Naive question: why doesn’t the market regulate electricity consumption?

I’ve heard the answers pre-AI, but I wonder how this new general purpose use of electricity changes the calculus?

joshjob42 · 3h ago
We have a constraint, say keeping warming below 1.5 or 2C, and we want to achieve it with as little pain as possible.

You use a laptop, you probably play some video games, you see movies, you probably eat meat, you probably drive a car around town instead of a velomobile.

So the question is, how much energy is used by AI and what does it get us? Google's latest TPUs consume ~250W per fp8-petaflop at the data-center level (ie including cooling etc, according to their recent presentations). Even assuming 50% utilization which is a bit poor, that's say 0.5kW per fp8-petaflop, or ~1kW per 10^15 dense-parameter-equivalent tokens per second. So using a huge model, say 10T dense-equivalent parameters, and doing inference at the rate of o3-mini or Gemini 2.5 Pro (around 150 tokens a second) consumes 1.5kW during generation time. But maybe a better way to think about it is just that you'd be using 10J of energy per token, or ~15J of energy per word.

In that context then, generating a book the length of Fellowship of the Ring (~180k words) would consume ~0.75kWh of energy. That's about like playing a video game on a PS5 for ~3 hours. In the US, that's on average ~270g of CO2, like driving ~1.5 miles or so in a Prius. It'd be about like riding ~30 miles on an e-bike.
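The same arithmetic as a rough Python sketch (every input below is one of the assumptions above, not a measurement; the factor of 2 is the usual ~2 FLOPs per parameter per token):

    # back-of-the-envelope replication of the figures above; all inputs are
    # the assumptions stated in this comment, not measured values
    watts_per_petaflop = 500      # ~0.5 kW per fp8-petaflop at ~50% utilization
    params = 10e12                # 10T dense-equivalent parameters
    tokens_per_sec = 150          # roughly o3-mini / Gemini 2.5 Pro decode speed

    flops_per_sec = 2 * params * tokens_per_sec           # ~2 FLOPs per param per token
    power_w = watts_per_petaflop * flops_per_sec / 1e15   # ~1,500 W while generating
    joules_per_token = power_w / tokens_per_sec           # ~10 J
    joules_per_word = joules_per_token * 1.5              # ~15 J (~1.5 tokens per word)

    book_words = 180_000                                  # ~Fellowship of the Ring
    book_kwh = book_words * joules_per_word / 3.6e6       # ~0.75 kWh
    book_gco2 = book_kwh * 360                            # ~US-average grid -> ~270 g CO2
    print(round(power_w), round(book_kwh, 2), round(book_gco2))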

Another way to think of it: if you were texting with a friend, where let's say you both aggressively type at ~120wpm, you'd be using ~15W of power to chat with an AI at a similar pace, about the power draw of the MacBook Air your friend would be typing on. That's around 0.5-0.7 miles on an e-bike per hour of chatting.

So even compared to your typical leisure activities, chatting with an AI is comparable in resource use to whatever else you'd likely be doing in that time. And of course, a human in the US generates on average ~1.5kg of CO2 per hour they're alive. In Bolivia it's ~150g/hr, and in Sudan it's ~50g/hr or so. So if you're a company that wants to be climate conscious: merely hiring a subsistence farmer for a job that raises their standard of living enough to emit roughly the CO2 of the average Bolivian means they emit more carbon in a typical day than an AI would producing as many words as are in the entire A Song of Ice and Fire series to date. And if you were to help someone immigrate from Bolivia to the US, you could have the AI write ~10 copies of that series a day for the same net CO2 output.

Not to say that we shouldn't do those things because of climate or whatever, but I'm just saying that the objective energy and climate impact of using these models for things is small compared to basically any alternative for completing a task and most entertainment activities people do (movie theaters for instance use ~7kW or so while running, so in the time you spend watching the Fellowship of the Ring in a theater, an AI could write a text as long as FOTR 28 times).

burningion · 3h ago
I know everyone likes to abstract away the costs associated with AI data centers.

But let's look at what has happened with Grok, for example:

From May 6, 2025

https://www.yahoo.com/news/elon-musk-xai-memphis-35-14321739...

>The company has no Clean Air Act permits.

> In just 11 months since the company arrived in Memphis, xAI has become one of Shelby County's largest emitters of smog-producing nitrogen oxides, according to calculations by environmental groups whose data has been reviewed by POLITICO's E&E News. The plant is in an area whose air is already considered unhealthy due to smog.

> The turbines spew nitrogen oxides, also known as NOx, at an estimated rate of 1,200 to 2,000 tons a year — far more than the gas-fired power plant across the street or the oil refinery down the road.

The devil is in the details here. People are _already_ feeling the effects of the AI race; the consequences just aren't evenly distributed.

And if we look at the "clean" nuclear deals to power these data centers:

https://www.reuters.com/business/energy/us-regulators-reject...

> The Talen agreement, however, would divert large amounts of power currently supplying the regional grid, which FERC said raised concerns about how that loss of supply would affect power bills and reliability. It was also unclear how transmission and distribution upgrades would be paid for.

The scale of environmental / social impacts comes down to how aggressive the AI race gets.

chasd00 · 3h ago
yeah, running GPUs in a datacenter is such a strange thing to fixate on with respect to the overall health of the entire planet. I don't recall it being discussed when the whole "move to cloud" thing started and AWS (+ others) came online, surely those have a larger impact.
roywiggins · 2h ago
you could probably save energy by scrapping Python in favor of C++ everywhere but people don't seem too worried about that either
BeetleB · 3h ago
Which is the bigger producer of CO2: training, or inference for the whole world using it? I suspect that in the long run it is the latter.
jebarker · 4h ago
> “If you’re going to use generative tools powered by large language models, don’t pretend you don’t know how your sausage is made.”

This isn't a very hopeful quote given how many people continue to eat sausages even though we all know how sausages are made.

staunton · 4h ago
> we all know how sausages are made.

I'm doubtful. There are a few documentaries showing the process, and I've had people tell me multiple times that they were genuinely shocked by it. I'm assuming those people "knew", and it's just that knowing "it's all the waste at the butcher getting stuffed into guts" is not quite the same as seeing it first hand.

jfengel · 2h ago
I'm not sure what documentaries you're watching.

Natural sausage casings are specialty items. If you're buying it at the grocery store, it's probably collagen (closely related to gelatin).

And it's not "all the waste". It includes fatty cuts that people wouldn't want to eat whole, but it doesn't include organ meats outside of specialty items.

Perhaps people find meat-grinding distressing, though it's really not all that different from ground beef. The emulsified filling of hot dogs and bologna looks odd, but the ingredients are inoffensive.

I'm less disturbed by sausage-making than by the slaughter and prime butchering of animals. They're no less dead and dismembered if you're eating a steak or pot roast. I'd rather we at least make use of all of the other parts.

staunton · 1h ago
Indeed, the images I remember most vividly (though I wouldn't say they shocked me at any point) are of the guts being emptied. So it was a more traditional process for specialty sausages, not a huge factory.

Then again, some people find seeing a huge industrial room full of raw meat distressing, perhaps somewhat analogously to how they might not be afraid of one spider but suddenly panic upon seeing hundreds in one spot.

jebarker · 4h ago
I wasn't even really thinking of the actual sausage stuffing process so much as how the meat gets to the butcher (which is really a meat grinding factory in most cases) in the first place.
mvdtnz · 3h ago
People get upset because we use every part of an animal for food? I think it's absolutely great. I am a meat-eater who doesn't feel ethically great about it, but I'd feel much worse if I knew a large percentage of a carcass went to waste.
raincole · 3h ago
And what happened after they told you they were shocked? Did they all become vegan once and for all?
staunton · 1h ago
Most just expressed their shock (which didn't seem staged) but changed nothing. One person became vegetarian for about a year, then started to eat meat again. I'm not sure if they still eat sausages.
bdangubic · 4h ago
and why focus on LLMs... there is a shitton of other things that use power/resources/... if we are going to start worrying here we should consistently apply this across everything...
sundaeofshock · 3h ago
Because LLMs are not deeply embedded in all aspects of our society. It is very difficult to reduce existing uses of carbon (e.g. cars) in society, as opposed to stopping the widespread use of LLMs.
bdangubic · 1h ago
facts! but did we have these conversations about other things in these terms before they got deeply embedded in all aspects of society?
teruakohatu · 3h ago
I feel the points made in this article have been debated many times on HN.

Power usage by developers, as opposed to consumers, is probably insignificant compared to the inefficient use of compute resources by deployed software today.

Arguably LLMs are enabling some web developers to create native applications rather than Electron monstrosities, saving many P-core CPU cycles multiplied by the number of their users.

Optimising server applications with an LLM could eliminate unnecessary cloud servers.

Of course all the above could be done without LLMs, but LLMs can empower people to do this kind of work when they were not able before.

breuleux · 3h ago
> Arguably LLMs are probably enabling some web developers to create native applications rather than Electron monstrosities saving many P-core CPU cycles multiplied by the number of their users.

Does any such project exist, or are developers using LLMs to help them develop new Electron monstrosities? I think that if a developer has the sensibility to want to develop a native app, they will do so regardless, it's not that much harder.

dgb23 · 3h ago
The main point being: LLMs cost, but do something useful. Building slow applications just adds cost, but the utility is much more vague.
BeetleB · 3h ago
> The argument about power usage for developers, as opposed to consumers,

Except with MCP, essentially most consumers will become "programmers". See my other comment for the rationale (https://news.ycombinator.com/item?id=43997227).

(TLDR: MCP lets non-programmers convert their prompts into programs, for all practical purposes. Currently there is a barrier to entry to automate simple tasks: The need to learn programming. That barrier will go away)

teruakohatu · 3h ago
> Except with MCP, essentially most consumers will become "programmers"

You are more optimistic than I am. Most people I have seen are using LLMs at best as an alternative to Grammarly or as a document/web summarizer, or at worst are making decisions based on outdated LLM advice or using them as an inaccurate fact-engine.

The average person could code using Excel, but most don't even if they know how to use IF() and VLOOKUP().

BeetleB · 2h ago
That's because most people don't have access to MCPs. It will take 1-2 years to hit critical mass - once it's easy to plug in to ChatGPT and once major companies (e.g. Google for Gmail) provide easy to configure MCP servers.

> The average person could code using Excel, but most don't even if they know how to use IF() and VLOOKUP().

Using Excel (even without IF) is way more complicated than what I am saying. MCPs will enable people to program with natural language. It's not like vibe coding where the natural language will produce code we'll run. The prompt will be the program. You need to put in a lot more effort to learn the basics of Excel.

elpocko · 3h ago
I would like to see a comparison of the amount of electricity used to run AI and the total electricity used to run computer games. I'm quite sure we're wasting far more resources on computer games, just for entertainment.
acomjean · 4h ago
I think AI is interesting, but I wonder what it will do to progress when used very extensively.

Trained on all the things/ideas of the past, it creates a larger barrier to entry for new ideas.

Thinking about new computer languages, who would use one that AI couldn’t help you code in?

blizdiddy · 4h ago
When ChatGPT came out, I thought we would stagnate with new languages, but not anymore. Now we have small models that can be fine-tuned for less than $20, and we have huge models that can learn a new programming language in-context if you provide the docs.
furbolapp · 3h ago
I think it's the other way around: it won't create barriers; instead it will focus on detecting/recognizing/acknowledging originality and new ideas.
computerex · 4h ago
> AI and Crypto are net negatives in this regard.

Not sure how anyone can with certainty say that AI will be a net negative in the long run for climate change. Logic I think says the opposite.

candiddevmike · 3h ago
How can AI be a net positive for climate change? We know what needs to happen to fix it, how would AI telling us the same thing "hit differently"?
philipkglass · 2h ago
The cost of energy generated from burning fossil fuels is dominated by the cost of the fuel itself; the power plant costs much less. Non-combustion energy is the opposite: fuel costs are tiny or zero, so the construction cost for the initial plant is much more important.

If you generate a terawatt hour of electricity with natural gas, most of the cost will be from the fuel. A nuclear plant will have a tiny fraction of the cost come from fuel for the same amount of energy. A solar farm will have none of the cost come from fuel.

If AI lowers construction costs, it will improve the relative economics of non-fossil energy compared to fossil energy. A natural gas plant constructed at half the cost will have its final energy cost decrease just a little whereas a half-as-expensive solar farm will have its final energy cost decrease nearly by half. Making clean energy cheaper than fossils means that it will out-compete dirty energy even in locations where there are no explicit policies to reduce CO2 emissions.
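A toy numeric illustration of that asymmetry (the $/MWh splits below are made up for the example, not taken from any source):

    # illustrative only: invented $/MWh cost splits, not real LCOE data
    gas_capex, gas_fuel = 20, 40          # gas: cost dominated by fuel
    solar_capex, solar_fuel = 45, 0       # solar: cost dominated by construction

    gas_before = gas_capex + gas_fuel             # 60 $/MWh
    solar_before = solar_capex + solar_fuel       # 45 $/MWh

    # halve construction costs for both technologies
    gas_after = gas_capex / 2 + gas_fuel          # 50 $/MWh, ~17% cheaper
    solar_after = solar_capex / 2 + solar_fuel    # 22.5 $/MWh, 50% cheaper
    print(gas_after, solar_after)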

You can see the effects on pricing advantage with this interactive simulation of electricity supply in the United States. If you cut the overnight construction cost in half for all generating technologies, solar and wind dominate the country:

https://calculators.energy.utexas.edu/lcoe_map/#/county/tech

Some example modeling of gas/solar electricity economics in the United Kingdom here:

https://electrotechrevolution.substack.com/cp/160279905

Companies have already started using robotics and AI to construct solar farms faster and at lower cost:

https://www.aes.com/press-release/AES-Launches-First-AI-Enab...

https://cleantechnica.com/2025/02/27/leaptings-ai-powered-ro...

https://www.renewableenergyworld.com/solar/cool-solar-tech-w...

senordevnyc · 2h ago
Maybe AI will help us actually create or implement something that fixes it, instead of idealistic approaches that are never going to happen?
raincole · 3h ago
I think it's just most programmers finally got the reality check: we're menial laborers, not superstars.
MaxGripe · 2h ago
Plz don’t be conflicted. The Earth has an unlimited source of energy, the Sun, and water cannot be “saved” because our planet is a closed cycle.
ChrisMarshallNY · 3h ago
> Sources used to train models are kept secret.

I've been using Perplexity, and it annotates its answers with sources. Maybe it isn't very complete, though.

BeetleB · 3h ago
The environmental cost can't be overstated.

When I code with AI, it's more convenient for me to type a prompt to change the name of a variable. This involves sending a request to the LLM provider, doing very expensive computations, and then doing a buggy job of it, even though my IDE could do it for, what, 0.1% of the energy? Or even less. Try running an LLM locally on a CPU and you'll get a glimpse of how much energy this simple task uses.

But coding with AI isn't that huge. What will become huge this year or next is MCP. It will bring "programming" to the masses, all of whom will do stupid queries like the above.

Consider this: I wrote an MCP server to fetch the weather forecast, and have separate tools to get a broad forecast, an hourly forecast, etc. I often want to check things like "I'm thinking of going to X tomorrow. It will be a bummer if it's cloudy. Which hours tomorrow have less than 50% cloud cover?" I could go to a weather web site, but that's more effort (lots of clicks to get to this detail). Way easier if I have a prompt ready.

OK - that doesn't sound too bad. Now let's say I want to do this check daily. What you have to realize is that with MCP the prompt above is as good as a program! It's trivial for an average non-programmer to write that prompt, put it in a cron job, and have the LLM email/text them when the weather meets a predefined criterion.
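A minimal sketch of what that "prompt as program" might look like (the `ask_llm` helper, the file path, and the crontab entry are hypothetical stand-ins, not a real API):

    # weather_check.py -- the "program" is just a prompt run on a schedule
    # hypothetical crontab entry: 0 7 * * * python3 /home/me/weather_check.py
    PROMPT = (
        "I'm thinking of going to X tomorrow. Check the hourly forecast and "
        "email me the hours with less than 50% cloud cover; say nothing otherwise."
    )

    def ask_llm(prompt: str) -> None:
        # stand-in for whatever client sends the prompt to an LLM that can
        # call the weather MCP server and an email tool; not a real library
        ...

    if __name__ == "__main__":
        ask_llm(PROMPT)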

Consider emails. I sign up for deals from a retailer.[1] Now deals from them are a dime a dozen so I've been programmed to ignore those emails. But now with MCP, I can set a simple rule: Any email from that retailer goes to the LLM, and I've written a "program" that loosely describes what I think is a great deal, and let the LLM decide if it should notify me.

Everyone will do this - no programming required! That prompt + cron is the program.

Compared to traditional programming, this produces 100-1000x more CO2 emissions. And because there is no barrier to entry, easily 1000x more people will be doing programming than are doing now. So it's almost a millionfold in CO2 emissions for tasks like these.

[1] OK, I don't do it, but most people do.

lordnacho · 4h ago
I think it's harsh to start the accounting so soon after the breakthrough.

Yes, of course AI uses a lot of energy. But we have to give it a bit of time to see if there are benefits that come with this cost. I think there will be. Whether the tradeoff was worthwhile, I think we are not even close to being able to conclude.

Something like social media, which has a good long run behind it, is different; there I could accept starting to close the books on the pluses and minuses.

jebarker · 4h ago
You can't have it both ways, i.e. deploy the technology as rapidly and widely as possible but then say you have to wait to do any accounting of the pros/cons.
breuleux · 3h ago
You have to start the accounting immediately, because once a technology becomes entrenched, it's very difficult to backtrack. If it spins out of control and half of all energy is sunk into AI because half of services depend on it and workforces have been laid off, you won't be able to do anything about it without disrupting half of services. The window of opportunity to keep a technology in check, unfortunately, often occurs before the problems it causes become obvious.
candiddevmike · 3h ago
Same arguments folks had with cryptocurrencies.
olalonde · 3h ago
I guess it makes sense if you're genuinely worried about an imminent climate crisis. I am just not...
_rpxpx · 4h ago
"I've never been so conflicted"? Err, where's the sense of conflict? All I see is: "it sucks, it's helping kill the planet, but I need a job."
JKCalhoun · 4h ago
> Firstly, there’s the environmental impact.

Surely, like everything else in tech, this too shall pass. I expect power requirements to fall, since there are no doubt strong incentives to reduce them.

> My naive optimism led me to believe that technology would help us fight climate change.

Yeah, proceeding full steam ahead with planet destruction and praying tech will save us is kind of naive. You're not alone though.

> There are also ethical concerns regarding the methods used to obtain data for training AI models.

This one hasn't registered as problematic for me. Maybe I'm unethical?

> Content creators can't even determine which parts of their work were used to train the model…

"Content creators" don't develop their style from a vacuum either. I'm not conflicted about this one either.

> LLMs are also contaminating the web with generic content produced without care or thought.

The web has been contaminated ever since SEO. Maybe AI will kill the web. So it goes.

breuleux · 3h ago
> Surely, like everything else in tech, this too shall pass. I expect power requirements to fall away since there is no doubt strong incentives to do so.

Halving power requirements is only a net gain if demand isn't doubled. Unfortunately, using less power means cheaper service, so demand will increase, and not necessarily smoothly. If there is a massive use case for which AI is currently too expensive, simply moving under that price threshold could multiply demand tenfold: a small efficiency gain leading to a giant spike in total energy use.

JKCalhoun · 3h ago
You're probably right. I assume they're moving these servers to places like along the Columbia River where electricity is cheap (bitcoin miners have been there for a decade).
Uehreka · 3h ago
> The web has been contaminated ever since SEO. Maybe AI will kill the web. So it goes.

This statement kinda blows my mind. You contemplate the destruction of the web, the single most world-changing invention of the last century, possibly the last millennium, a force that has democratized access to information, accelerated scientific progress and connected people around the world, the technology that one way or another played a huge role in the lives of every person on this site, and you shrug?

I see your Kurt Vonnegut and raise you Dylan Thomas: Rage, rage against the dying of the light.

JKCalhoun · 3h ago
I know it's cynical. But more and more I see the destructive aspects of the web eclipsing the positives.

It may be age related too. I had lived some 30-plus years before the Web became a thing ... and I kind of liked the '70s and '80s.

greenie_beans · 3h ago
rage against a tsunami
BugsJustFindMe · 4h ago
> I expect power requirements to fall away

This sounds extremely naive to my ears. Performance demands currently outpace resource utilization reductions and it's not clear that we're at the point where that will change soon. Also: https://en.wikipedia.org/wiki/Jevons_paradox

> > There are also ethical concerns regarding the methods used to obtain data for training AI models.

> This one hasn't registered as problematic for me. Maybe I'm unethical?

Yeah, the real ethical concern comes from the explosion of pernicious slop output that is utterly destroying everything good about the internet and media, not the training.

pixl97 · 3h ago
You mean speeding up the destruction of the internet and media?

The problem is it was falling apart before generative AI. This AI has just hastened its demise.

BugsJustFindMe · 3h ago
Sure, yes, but it has definitely gotten a lot worse.
observationist · 3h ago
After Google started spamming us with Pinterest and cheap copied Stack Overflow forum slop, the internet was pretty much saturated with garbage. Google and Facebook and SEO are to blame: they wanted to squeeze every last drop they could, and they did, and here we are.

AI might make things 1-2% worse, but we weren't suffering from a lack of legitimate content or media. We were suffering from centralized control and walled gardens force feeding us slop in the name of profit, the total enshittification of the internet without regard for the damage done - tragedy of the commons, maybe.

AI could also make things better, because it's no longer worth slogging through the bullshit with search - I'll have AI do my searching for me, or follow its recommendation directly, and never interact with any of the monetized bullshit the search engine tries to pawn off.

SEO content will have to get through ever more competent and discerning AI in order to get eyeballs. That eliminates a broad class of low effort trash, directly demonetizing bad faith actors.

So now you have content creators producing more plausible but mediocre content, but OAI and other AI providers will have direct access to their logs. If they register someone as having produced bulk trash content, they can shut down the account and report it to authorities or other companies if there's some sort of fraud or illicit behavior going on.

There are competing pressures, and with Google also losing its monopoly on adtech, maybe we'll see companies forced to compete on quality products instead of exploitation of user data, all the while seeing the general quality of the internet improve.

Or maybe AI will just accelerate the race to the bottom and the internet's dead already.

spyckie2 · 4h ago
This author could live in the Bronze Age and not a single thing would change about this article.

“I’ve never been so conflicted about a technology. Of course we are talking about iron smelting and its effects on the environment. Look I used an iron hoe and was impressed but have you seen how it was made? Look at all the waste and smoke and wood burned.

If you use an iron tool yourself, at least know how it was made.”

frereubu · 4h ago
I see where you're coming from, but the planet was not on the verge of a climate breakdown at the time. I think that's the implicit context in the blog post that your reply misses.
roywiggins · 4h ago
The marginal impact of AI on global climate probably rounds down to zero, or at least is not the thing driving us over the cliff:

> Global energy consumption in 2023 was around 500,000 GWh per day. That means that ChatGPT’s global 3GWh per day is using 0.0006% of Earth’s energy demand.

https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...

lucianbr · 4h ago
How much of that 500 TWh would disappear if you rounded to zero everything that consumes 3GWh per day or less? And keep in mind you are calculating for a single app (ChatGPT), not for an entire industry or technology. Don't look at "civil aviation" but at each airline in particular.

I suspect nearly all of it. You framed it such that it appears insignificant, but with this framing nothing is significant.

kragen · 4h ago
ChatGPT is probably the majority of LLM use today, so this is not the equivalent of claiming that neither your car nor mine nor anybody else's uses more than 3GWh per day and therefore cars don't use a significant amount of energy collectively.

Also, though this is not really relevant to your major incorrect point, airlines aren't part of civil aviation.

roywiggins · 3h ago
I guess I'd frame it another way: how many LLM conversations a month would you have to have to increase your energy consumption by 1%, versus how many more hot showers?

if you are actually worried about your ChatGPT energy usage, skip a hot shower or spend a few hours less playing Cyberpunk 2077 and a few more hours reading an old book.

zellyn · 3h ago
I definitely think breaking out ChatGPT on its own is too fine-grained. At the very least, one should lump all LLM training and use, or perhaps all deep learning training and use together.

I suspect at a "sensible" breakdown (trying to avoid the "How long is my coastline?" problem), which is presumably something akin to Zipfian, the main uses will actually account for most energy usage.

If anyone can find a breakdown at the level of granularity that "all deep learning" would make sense, let me know: so far my Googling has led primarily to either detailed breakdowns of energy _sources_, or high-level breakdowns of use at the level of "industry", "agriculture", or "residential use".

robinsonb5 · 3h ago
"No raindrop considers itself responsible for the flood."
roywiggins · 1h ago
hm, sure, but the guy peeing into the flood probably isn't responsible, at least by the standards we usually use to assign responsibility
frereubu · 4h ago
That may be true, but I guess my approach to it would be that as a general principle we shouldn't be using things that increase energy usage for little real gain, and this is something that is in my professional field that I can do something about personally. It does come somewhat from a place of feeling like the world as a whole is absolutely failing to get to grips with the significant large-scale changes that are needed.
roywiggins · 4h ago
a single hot shower uses more Watt-hours than most people spend on ChatGPT in a week, or a month. if people really want to cut down on how many watt-hours they're using on frivolous things there are dozens of things they should do first. if people are spending time worried about their ChatGPT energy use more than they are spending time going vegan or getting used to cold showers then they have been grossly misinformed about what the relative impact is

(to be clear: I'm not vegan. but most people, incl. myself, could be quite easily, and it would save several orders of magnitude more energy and water)

robinsonb5 · 3h ago
Ultimately we're trying to persuade people to reduce their quality of life in order to reduce their energy footprint. That's already a difficult sell. It becomes an impossible sell when any reductions that might be achieved are immediately offset by a new energy-hungry technology coming online - especially when the long-term effect (goal, even) is to reduce yet further the quality of life of everyday people.
roywiggins · 3h ago
convincing people to change anything is very hard, so why are people spending time wringing their hands over LLM use? if you convinced one person to go vegan or get into cold showers for a year you'd do more for the planet than convincing a hundred people not to pick up ChatGPT
casey2 · 4h ago
The reason we have any environmental regulations at all is the mass death and disease caused by technology. Far more people died due to the agricultural revolution than will ever die due to climate change in the next few decades, even in the worst predictions.

The Black Death alone was 25-50 million people in 7 years

alfiedotwtf · 4h ago
In the end, it would be hilarious if this article was actually AI generated
bitpush · 4h ago
Disappointed with the article, especially because it uses "think of the planet" as a weak argument against this technology.

> Firstly, there’s the environmental impact

Their own blog contributes to the climate crisis they are now crying about. Someone in a developing country could write a similar article saying "all these self-publishing technologists are making the climate crisis worse", and it would have a stronger point.

I say this without discounting the real environmental costs associated with technology, but LLMs / AI aren't uniquely problematic.

Your latest macbook, iphone, datacenter, ssds.. all have impact.

frereubu · 4h ago
Your argument seems to flatten everything so it's all just the same ("everything has an impact"), but different things have different impacts, and each needs to be weighed against its utility. Me ordering a coffee at my local cafe has an impact, but it's a good deal less than me driving from London to Edinburgh. The author's argument, as I read it (and I generally agree), is that we don't really need LLMs to get things done, but their use comes with a large environmental cost. I don't think that will stop their use, unfortunately. There's just too much capital behind them at the moment.
zdragnar · 4h ago
We don't really need LLMs like we don't really need the internet. We don't really need that blog, nor do we really need Netflix, porn or Facebook.

Individuals value things differently, so attempting to do society-wide prioritization is always going to be a reductive exercise.

For example: Your local cafe doesn't need to exist at all. You could still drink coffee, you'd just have to make it yourself. That cafe is taking up space, running expensive commercial equipment, keeping things warm even when there aren't customers ordering, keeping food items cool that aren't going to be eaten, using harsh commercial chemicals for regular sanitization, possibly inefficient cooling or heating due to heavy traffic going in and out the door, so on and so forth.

Imagine the environmental impact of turning all cafes into housing and nobody driving to go get a coffee.

frereubu · 4h ago
Again, this just feels like throwing your hands up in the air and saying "it's too hard to decide!" But we have to take decisions somehow if we're going to do anything.
zdragnar · 3h ago
Yes, that's the challenge of centralizing economies. You aren't going to be able to do so efficiently because you don't have every person's preferences.

If by "we're going to do anything" you mean presumably fiat power to ban LLMs, then you're better off using that fiat power to just put a sin tax on carbon emissions and letting people decide where they want to cut back.

sherburt3 · 3h ago
OK, let's try to trick China and India into believing industrialization is lame and being poor is cool. That should buy us a couple of years.
roywiggins · 4h ago
Take one fewer hot shower a week and you've saved enough energy to power a lot of ChatGPT queries. Play one fewer hour of Minecraft. Turn off raytracing. Eat one fewer burger per month. All of those things would save more energy than forgoing a few ChatGPT conversations.
Uehreka · 4h ago
Their use does not come with a large environmental cost. The average American lifestyle has a “water footprint” of 1200 “bottles of water” per day. 10-50 ChatGPT queries == 1 bottle of water. If you decide to use ChatGPT but shorten your daily shower by a second or two you will more than offset your total water usage increase.

Thus LLMs don’t have to be that useful to be worth it. And if used in certain ways they can be very useful.

Source (with links to further sources): https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...

sherburt3 · 2h ago
Crossing your fingers and hoping for a sci-fi technology solution to be invented for climate change seems far more realistic at this point than expecting Taylor Swift to not dump 10 tons of CO2 into the atmosphere because she wanted to have lunch in France, so I'm putting all my eggs into that basket.
roywiggins · 4h ago
It's not a large environmental cost, though.
kelseyfrog · 4h ago
Correct. We have to order by impact and refrain from spending effort on all but the issue at the top of the list.

If we choose to address any lower-priority issue, it is an example of hypocrisy that de-legitimizes the whole project.

nicce · 4h ago
If we had stayed farmers, we wouldn't have these issues. Most people would still be happy. Intelligence is both destruction and savior.
criddell · 4h ago
> farmers are 3.5 times more likely to die by suicide than the general population

https://www.fb.org/in-the-news/modern-farmer-farmers-face-a-...

nicce · 4h ago
Do you think that study would apply if we went 2,000 years back? The world was kind of different.
kirubakaran · 4h ago
"Many were increasingly of the opinion that they’d all made a big mistake in coming down from the trees in the first place. And some said that even the trees had been a bad move, and that no one should ever have left the oceans." - Douglas Adams
codydkdc · 4h ago
> Firstly, there’s the environmental impact. Models require massive amounts of electricity and water (to cool servers). An AI chatbot contributes more to the climate crisis than a Google search.

I really dislike the "people should be better and use less energy" argument for solving macro-problems like this

> My naive optimism led me to believe that technology would help us fight climate change. I was wrong: AI and Crypto are net negatives in this regard.

...why? why would technology that specifically requires a lot of energy help "fight climate change"?

this entire article misses the point for me; it's very vague and speaks in generalities.

> “If you’re going to use generative tools powered by large language models, don’t pretend you don’t know how your sausage is made.”

why? if you generate code with a LLM, then read and deeply understand the code, what's wrong?

> I can’t help but feel the web would be a better place if the technology had never existed in the first place.

if we didn't invent the wheel, we wouldn't have so many cars/trucks polluting the planet

lucianbr · 4h ago
> ...why? why would technology that specifically requires a lot of energy help "fight climate change"?

There's no good reason it would. Nevertheless, proponents of both AI and crypto have claimed multiple times it would. So I think it is quite fair to bring it up.

https://ia.samaltman.com/

alfiedotwtf · 3h ago
You do realise most cryptocurrencies aren’t PoW, so your point is living in the past, man!
proxynoproxy · 3h ago
Only the one big PoW coin matters though. Turns out, PoS coins are worthless!

I also dislike energy frugality arguments. They come from a Luddite place. A civilization is defined by its energy usage. Let's be an advanced civilization.