I would really like it if an LLM tool would show me the power consumption and environmental impact of each request I’ve submitted.
kingstnap · 4h ago
You can assume the API price is roughly proportional to electricity usage.
If you buy $10 in tokens, roughly $3 to $5 of that probably goes to electricity.
Which would be around 30 to 90 kWh of electricity.
Depending on the source, the carbon intensity could be anywhere from ~500 g/kWh (for natural gas) down to ~24 g/kWh for hydroelectric.
It's a really wide spread, but I'd say for $10 in tokens you'd probably be in the neighbourhood of 1 kg to 40 kg of CO2 emissions.
The good thing is that a lot of the spread comes from the electricity source. So if we can get all of these datacenters onto clean energy sources, it could change emissions by over an order of magnitude compared to gas turbines (like xAI uses).
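Back-of-the-envelope, with the same rough inputs (the kWh range and the grid intensities are assumptions, not measured values):

    # Rough sketch of the estimate above; every input is an assumption.
    kwh_range = (30, 90)                                     # assumed electricity behind $10 of tokens
    grid_g_co2_per_kwh = {"hydro": 24, "natural gas": 500}   # assumed grid carbon intensity, g CO2/kWh

    for source, intensity in grid_g_co2_per_kwh.items():
        low, high = (kwh * intensity / 1000 for kwh in kwh_range)
        print(f"{source}: {low:.1f} to {high:.1f} kg CO2 per $10 of tokens")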
chessgecko · 58m ago
If you bought an Nvidia H100 at wholesale prices (around $25k) and ran it 24/7 at commercial electricity rates (let's say $0.10 per kWh), it would take you over 40 years to spend the purchase price of the GPU on electricity. Maybe bump that down to 20 years to account for datacenter cooling.
I don't think the cost of AI is close to converging on the price of power yet. Right now it's mostly the price of hardware and datacenter space, minus subsidies.
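Roughly how the 40-year figure falls out, using the assumptions above plus an assumed ~700 W board power:

    # Payback arithmetic from the comment above; all inputs are assumptions.
    gpu_price_usd = 25_000            # assumed wholesale H100 price
    board_power_kw = 0.7              # assumed ~700 W draw at full load
    rate_usd_per_kwh = 0.10           # assumed commercial electricity rate

    annual_electricity_usd = board_power_kw * 24 * 365 * rate_usd_per_kwh
    print(f"~${annual_electricity_usd:.0f}/year in electricity; "
          f"{gpu_price_usd / annual_electricity_usd:.0f} years to match the purchase price")
    # Doubling the draw to cover cooling/overhead roughly halves that to ~20 years.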
dijit · 4h ago
I don’t think that you can make this assumption.
People are selling AI at a loss right now.
preciz · 6h ago
And each toilet flush you make should also come with a CO2 calculation that counts against your daily carbon allowance.
evrimoztamur · 6h ago
Spending drinking water on toilet flushes is indeed a problem. Perhaps not via CO2 measurements directly, but informing people of how much high-quality water is wasted on flushes alone will hopefully build momentum for more efficient flushing mechanisms and for introducing greywater systems to new and old buildings alike. Good idea!
aziaziazi · 5h ago
I don’t flush my toilet, I Kildwick [0], but j-pb [1] has a more interesting comparison.
[0] https://www.kildwick.com/
[1] https://news.ycombinator.com/threads?id=j-pb
People downvote your sarcasm, but if you do the calculations you're kinda right.
1 kg of beef costs:
- The energy equivalent of 60,000 ChatGPT queries.
- The water equivalent of 50,000,000 ChatGPT queries.
Applied to that metric, Mistral Large 2 used:
- The water equivalent of 18.8 tons of beef.
- The CO2 equivalent of 204 tons of beef.
France produces 3,836 tons of beef per day, and one large LLM every 6 months.
So yeah, maybe use ChatGPT to ask for vegan recipes.
People will try to blame everything else they can get hold of before changing the stuff that really has an impact, if that means touching their lifestyle.
The LLMs are not the problem here.
> Using ChatGPT is not bad for the environment
— https://andymasley.substack.com/p/individual-ai-use-is-not-b...
He’s done some good followup articles as well:
https://andymasley.substack.com/s/ai-and-the-environment
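To put those ratios on one scale, here is the arithmetic spelled out, taking the quoted figures at face value (they are approximations, and the 100-queries-per-day usage rate is an added assumption):

    # Worked version of the comparison above; figures are the approximate
    # numbers quoted in the comment, not independently verified.
    queries_per_kg_beef_energy = 60_000       # queries with the energy footprint of 1 kg of beef
    queries_per_kg_beef_water = 50_000_000    # queries with the water footprint of 1 kg of beef
    llm_co2_in_beef_tons = 204                # Mistral Large 2 lifecycle CO2, in beef-equivalents
    llm_water_in_beef_tons = 18.8             # Mistral Large 2 lifecycle water, in beef-equivalents
    france_beef_tons_per_day = 3836

    print(f"CO2 of one large LLM ~ {llm_co2_in_beef_tons / france_beef_tons_per_day * 24:.1f} "
          f"hours of French beef production")
    print(f"Water of one large LLM ~ {llm_water_in_beef_tons / france_beef_tons_per_day * 24 * 60:.0f} "
          f"minutes of French beef production")

    queries_per_year = 100 * 365              # assumed heavy personal use: 100 queries/day
    print(f"A year of that usage ~ the energy of {queries_per_year / queries_per_kg_beef_energy:.2f} kg "
          f"and the water of {queries_per_year / queries_per_kg_beef_water * 1000:.1f} g of beef")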
plants · 5h ago
Those are incredible stats. As a vegan who uses LLMs at work frequently, I would love to have the source as well :)
The difference is that food is important and life-giving, and LLMs are a very fancy magic 8-ball.
tptacek · 1h ago
Where does a YouTube Let's Play video fall into that calculation? My understanding is that a single watch of a video is orders of magnitude more energy than a day's active use of ChatGPT.
j-pb · 6h ago
You don't need beef, beef is a lifestyle choice.
I use LLMs to do all of my coding these days; it's certainly more essential to feeding me than beef is.
leksak · 6h ago
Also a lifestyle choice
j-pb · 5h ago
Yes, but one with a much, much smaller impact, as we just demonstrated.
This is exactly the kind of cognitive dissonance in people that I meant.
You literally see the math and go "but I like my meat, why should I give that up if you got your AI".
Because, as I just demonstrated, my AI takes an infinitesimal fraction of what your meat does.
It would literally take only going vegan for a day to offset your entire AI usage for a year.
jrflowers · 5h ago
This is spot on because there can’t be two issues that exist simultaneously. There can only be one thing that wastes enormous amounts of energy and that thing is beef
j-pb · 5h ago
You can try to misconstrue and ridicule the argument,
but that won't change the math: if you have one thing that causes 1 unit of damage and another thing that causes 100,000 units of damage, then for all intents and purposes the thing that causes 1 unit of damage is irrelevant.
And any discussion that tries to frame them as somewhat equally important issues is dishonest and either malicious or delusional.
My guess, as I've expressed earlier in the comment chain, is that it's emotionally easier for people to bike-shed about the 0.01% of their environmental impact, than to actually tackle things that make up 20%.
And no, it's not only beef (which is a stand-in for meat and dairy); another low-hanging fruit is transport, like switching your car for a bike.
But switching from meat and dairy to a vegan diet would reduce your personal environmental impact by up to 20%, in terms of CO2.
And about 80-90% of rainforest deforestation is driven directly or indirectly by livestock production.
So it's simply the easiest, most impactful thing everyone can do. (Switching your car for a bike isn't possible for people in rural areas, for example.)
jrflowers · 4h ago
> 1 unit of damage, and another thing that causes 100,000 units of damage, then for all intents and purposes the thing that causes 1 unit of damage is irrelevant
You make a good point. A problem is only a real problem if you can’t find a bigger thing that makes it look small by comparison. For example, the worldwide concrete industry creates more CO2 than beef does, so there is no reason to stop eating beef if you enjoy it.
Now I know that some might say “all of this is cumulative” or “the material problems that stem from entrenched industries are actually a reason not to invent completely novel wasteful things rather than a justification for them”, but in reality only two things are true: only the biggest problem is real, and the only problem is definitely some other guy’s doing. If I waste x energy and my neighbor wastes y, a goal of reducing (x+y) is oppressive, whereas a goal where I just need to keep x lower than y feels a lot nicer.
https://www.theguardian.com/cities/2019/feb/25/concrete-the-...
https://www.chathamhouse.org/sites/default/files/publication...
You're writing a whole lot of words that completely ignore that they already gave a solution to things being cumulative.
Skip meat for one day, use AI for a year, come out ahead.
j-pb · 4h ago
Whatever you need to tell yourself to keep eating meat, buddy.
AnimalMuppet · 4h ago
Human energy, attention, and effort are finite. Put the effort where it will have the biggest effect.
jrflowers · 4h ago
I agree. Humans have been eating meat and doing construction for the entire history of civilization; they are not the sort of things that could be affected by posting online. LLMs, on the other hand, are new, largely in the hands of a small handful of companies, and a couple of those companies are bleeding cash in such a way that they might actually respond to consumer pressure. It is cynical to compare them to things that we know will not change as a blanket excuse for them.
Seeing as these models being wasteful is integral to the revenue of companies like OpenAI and Anthropic, the more people that tell them that the right business strategy is to start perpetually building data centers and power plants, the less incentive they have to build models that run efficiently on consumer hardware.
mlnj · 4h ago
Nothing is stopping us from living the hermit life in the mountains. But here we are, trying to get bits and bytes to write that JIRA ticket instead of us.
motoxpro · 4h ago
You could eat a few bites less of beef in one meal and that would be the equivalent of your AI use for a lifetime.
stonogo · 5h ago
Toilets are already labeled with their usage rate.
jrflowers · 5h ago
This is a good point because being curious about energy usage is the same thing as advocating for an imaginary rule about energy usage
jiehong · 6h ago
Let’s call it GreenOps
jeffbee · 5h ago
That would be ... thousands of times less useful than giving you the same information at the motor fuel pump. Unfortunately this isn't one of those situations where every little bit counts. There are 2 or 3 things you can do to reduce your environmental impact, and not using chatbots isn't one of them.
jiehong · 6h ago
So, using the smallest model for the task would help, as expected.
A very small model could run on device to automatically switch and choose the right model based on the request. It would also help navigate each vendor's confusing model naming.
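Something like this minimal sketch, where the model names and the classify() heuristic are made-up placeholders for a small local classifier (as the reply below points out, making that classifier accurate is the hard part):

    # Hypothetical on-device router; model names and classify() are placeholders,
    # not any vendor's real API.
    def classify(request: str) -> str:
        """Crude stand-in for a small local classifier model."""
        hard_markers = ("prove", "refactor", "analyze this codebase", "multi-step")
        return "hard" if any(m in request.lower() for m in hard_markers) else "easy"

    MODEL_BY_DIFFICULTY = {
        "easy": "small-local-model",   # cheap, stays on device
        "hard": "large-hosted-model",  # expensive, goes to the API
    }

    def route(request: str) -> str:
        return MODEL_BY_DIFFICULTY[classify(request)]

    print(route("What's the capital of France?"))                        # small-local-model
    print(route("Refactor this module and prove the invariant holds"))   # large-hosted-model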
potatolicious · 5h ago
> A very small model could run on device to automatically switch and choose the right model based on the request.
This is harder than it looks. A “router” model often has to be quite large to maintain routing accuracy, especially if you’re trying to understand regular user requests.
Small on-device models gating more powerful models most likely just leads to mis-routes.
evrimoztamur · 5h ago
What is the levelised cost per token, in the same sense that we calculate the levelised cost of energy?
If we take the total training footprint and divide that by the number of tokens the model is expected to produce over its lifetime, how does that compare to the marginal operational footprint?
My napkin math says per-token water and material footprints come out 6-600% and 4-400% higher, respectively, for lifetime token counts on the order of 40B to 400M.
I don't have a good baseline on how many tokens Mistral Large 2 will infer over the course of its lifetime, however. Any ideas?
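To show the shape of the calculation I mean (the numbers here are placeholders, not Mistral's actual figures):

    # Levelised footprint per token: amortise the one-off training footprint over
    # lifetime output tokens, then add the marginal per-token inference footprint.
    def levelised_footprint(training_total, lifetime_tokens, marginal_per_token):
        return training_total / lifetime_tokens + marginal_per_token

    # Entirely made-up inputs, just to show the shape of the result:
    example = levelised_footprint(
        training_total=1.0e7,        # placeholder lifecycle training footprint (kg CO2e)
        lifetime_tokens=1.0e12,      # placeholder lifetime output tokens
        marginal_per_token=2.0e-6,   # placeholder inference footprint per token (kg CO2e)
    )
    print(f"{example:.2e} kg CO2e per token, levelised")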
kurthr · 3h ago
Within marginal error, dollars=destruction.
Even if the company is "green", they make money, they pay employees and stockholders, and those people use the money to buy more things and go on vacations in airplanes. Worse, they invest the money to make more money and consume more goods.
Even your grains and vegetables are shipped in to feed you, even if you walk to the grocery store. You pay rent or a mortgage on a house built with concrete and steel. The highest-priced items you pay for are also likely the most energy- and environmentally costly. They create GDP.
It's a little weird with LLMs right now, because everything is subsidized by VC, ads, and BigCo investment, so you can't see real costs. They're probably higher than the $30-200/mo you pay, but they're not 10x the price like your rent, car payment, food, vacations, and investments/pension are.
dr_kretyn · 3h ago
This is a fantastic report. As someone tasked with getting the most out of AI at our company, I'm frequently getting questions in conversations about its environmental impact. Great to have a reference.
djoldman · 3h ago
They report that the emissions from 400 output tokens, "one page of text," equate to 10 seconds of online video streaming in the USA.
So I guess one saves a lot of emissions if one stops tiktok-ing, hulu-ing, instagram reel-ing, etc.
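For scale, the same equivalence run the other way, using only the report's 400-tokens-per-10-seconds figure:

    # The report's equivalence, inverted: how many tokens is an hour of streaming "worth"?
    tokens_per_10s_of_streaming = 400
    tokens_per_hour_of_streaming = 3600 / 10 * tokens_per_10s_of_streaming
    print(f"1 hour of streaming ~ {tokens_per_hour_of_streaming:,.0f} output tokens "
          f"~ {tokens_per_hour_of_streaming / 400:.0f} pages of generated text")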
wmf · 3h ago
It's sad to see the French of all people fall for guilt-trip austerity thinking. Just decarbonize the grid and move on. Energy is good.
austinjp · 4h ago
This is interesting but I'd love it if they'd split training and inference. Training might be highly expensive and conducted once, while inference might be less expensive but conducted many, many times.
jeffbee · 5h ago
These conclusions are broadly compatible with "The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink" or, as I prefer, the PDF metadata title that they left in there, "Revamped Happy CO2e Paper".
Despite the incredible focus by the press on this topic, Mistral's lifecycle emissions in 18 months were less than the typical annual emissions of a single A320neo in commercial service.
https://arxiv.org/pdf/2204.05149
The press focus is a mix of the usual "new thing BAD", and the much more insidious PR work by fossil fuel megacorps.
Fossil fuel companies are damn good at PR, and they know well that they simply can't make themselves look good. The next best thing? Make someone else look worse.
If an Average Joe hears "a company that hurts the environment" and thinks OpenAI and not British Petroleum, that's a PR win.
jeffbee · 5h ago
I suspect the press is also aligned against machine learning because they are still Big Mad® that the internet destroyed their revenue model (charging individuals $50 to advertise used cars, for example).