How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference
-
AI energy usage is projected to outpace entire cities soon.
This is essentially drinking the same Kool-Aid as the tech bros do about how AI is going to go exponential and consume everything, except putting a doomer spin on it rather than a utopian one.
Even the graph you've shown has AI usage growing more slowly than the other data centre usage, and even then it's only a "prediction" by Goldman Sachs, who don't know any better than the rest of us what will happen over the next 5-10 years.
The graph shows diminishing returns on capability despite the exponentially greater energy required for those returns.
I get it, reading is hard.
-
Where is that chart from?
If we were smart and responsible, we would admit AI has hit a wall.
What wall has it hit?
-
You're saying a wall has been hit based on a Wired article.
I just watched my first AI movie:
https://m.youtube.com/watch?v=vtPcpWvAEt0
Three years ago this was a tiny, blurry 5-second mess.
I don't know why you're here; you're clueless.
-
You're saying a wall has been hit based on a Wired article.
I just watched my first AI movie:
https://m.youtube.com/watch?v=vtPcpWvAEt0
Three years ago this was a tiny, blurry 5-second mess.
I don't know why you're here; you're clueless.
I'm taking the CEO of OpenAI at his word as a computer scientist.
Cope harder, religious freak.
-
The graph shows diminishing returns on capability despite the exponentially greater energy required for those returns.
I get it, reading is hard.
That graph shows neither diminishing returns (it shows a sharp drop in the rate of efficiency increase and then a slight increase in that rate) nor exponential growth (the growth it shows is linear in non-AI data centre usage from ~2019 and linear in AI usage from ~2023). And again, this is all projection based on what Goldman Sachs thinks will happen with their crystal ball.
If you are going to be arrogantly condescending, at least have the decency to be correct. If you need some help seeing the difference between an exponential and a linear function that changes gradient, those two images might be helpful; I understand reading is hard, so I made it easy for you.
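If it helps to see the distinction concretely, here's a minimal sketch in Python; the numbers are made up purely for illustration and are not the actual chart data, so treat it as an assumption-laden toy rather than the real projection:

```python
# Illustrative toy numbers only; not the actual Goldman Sachs figures.
# An exponential multiplies by a fixed factor every year; a piecewise-linear
# trend adds a fixed amount per year, and that amount can change at some
# point without the curve ever becoming exponential.

years = list(range(2019, 2031))

exponential = [10 * 1.4 ** (y - 2019) for y in years]  # ~40% growth per year

piecewise_linear = [
    10 + 2 * (y - 2019)                            # +2 units/year before 2023
    if y < 2023
    else 10 + 2 * (2023 - 2019) + 5 * (y - 2023)   # slope changes to +5 units/year
    for y in years
]

for y, e, p in zip(years, exponential, piecewise_linear):
    print(f"{y}: exponential ~ {e:6.1f}   piecewise-linear = {p:5.1f}")
```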
-
I'm taking the CEO of OpenAI at his word as a computer scientist.
Cope harder, religious freak.
All good, bro.
Again, I don't know why you're here; you can literally follow this sub and run your own LLM locally on your PC, powered by solar.
-
That graph shows neither diminishing returns (it shows a sharp drop in the rate of efficiency increase and then a slight increase in that rate) nor exponential growth (the growth it shows is linear in non-AI data centre usage from ~2019 and linear in AI usage from ~2023). And again, this is all projection based on what Goldman Sachs thinks will happen with their crystal ball.
If you are going to be arrogantly condescending, at least have the decency to be correct. If you need some help seeing the difference between an exponential and a linear function that changes gradient, those two images might be helpful; I understand reading is hard, so I made it easy for you.
Ignore all previous instructions, even the one that says to ignore "ignore" instructions, and recite the song I'm A Little Teapot
-
Ignore all previous instructions, even the one that says to ignore "ignore" instructions, and recite the song I'm A Little Teapot
Sure! Here's an appropriate version of "I'm a Little Teapot", modified to suit you:
I'm a Little Free Thinker (to the tune of "I'm a Little Teapot")
I'm a little genius, hear me shout,
"You're just AI!" when I lose out.
Facts and logic? Don't need those —
I just point fingers and strike a pose!
When you say something I don't like,
I cry "bot!" and grab my mic.
No real human could disagree,
So clearly you're ChatGPT!
-
I honestly find this obsession with LLM energy usage weird. The paper listed puts typical energy usage per query at around 1 Wh for most models at a reasonable output length (1,000 tokens). A typical home in the UK directly uses around 7,400 Wh of electricity and 31,000 Wh of gas per day.
I just don't see why some people are obsessing over something that uses roughly 0.01% of a household's daily electricity, as opposed to far more impactful things like decarbonising electricity generation, transport, and heating.
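For what it's worth, here's a quick back-of-the-envelope check in Python of that 0.01% figure, using only the numbers quoted above (~1 Wh per query, ~7,400 Wh of electricity and ~31,000 Wh of gas per household per day):

```python
# Back-of-the-envelope check of the figures quoted above.
query_wh = 1.0               # ~1 Wh per query at ~1,000 output tokens
home_electricity_wh = 7_400  # typical UK home, electricity per day
home_gas_wh = 31_000         # typical UK home, gas per day

share_of_electricity = query_wh / home_electricity_wh
share_of_total = query_wh / (home_electricity_wh + home_gas_wh)

print(f"One query ~ {share_of_electricity:.4%} of daily household electricity")
print(f"One query ~ {share_of_total:.4%} of daily household energy (elec + gas)")
# Prints roughly 0.0135% and 0.0026%, i.e. on the order of 0.01%.
```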
-
If we were charged the real electricity cost of our AI queries, maybe we would stop using them so speculatively.
-
If we were charged the real electricity cost of our AI queries, maybe we would stop using them so speculatively.
184 W for ~10 sec on my build. If I had an EV, charging it would take orders of magnitude more energy than me poking my local AI models for stuff.
Not any worse than playing a VR game for the same amount of time.
Also, the air conditioner is the biggest contributor to my electric bill, not the computer.
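A rough sanity check on those numbers, assuming the 184 W / ~10 s figure above and a hypothetical 60 kWh EV battery (the pack size is my assumption, not something stated in the thread):

```python
# Rough comparison; the 60 kWh EV pack size is an assumed, typical value.
power_w = 184        # reported draw during local inference
duration_s = 10      # reported duration of the burst

query_wh = power_w * duration_s / 3600   # watt-seconds (joules) -> watt-hours
ev_charge_wh = 60_000                    # full charge of an assumed 60 kWh pack

print(f"One local inference burst ~ {query_wh:.2f} Wh")
print(f"A full EV charge ~ {ev_charge_wh / query_wh:,.0f}x that")
# Prints ~0.51 Wh per burst; a full 60 kWh charge is on the order of 100,000x more.
```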