- cross-posted to:
- fuck_ai@lemmy.world
Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes 1 × 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
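A quick sanity check of the headline claim. The ~20 Wh battery capacity here is an assumption (recent iPhone Pro Max batteries are roughly in the 18-20 Wh range; charging losses are ignored), not a figure from the thread:

```python
# Sanity-check: does 140 Wh really equal ~7 full iPhone Pro Max charges?
# battery_wh is an assumed usable capacity per full charge (~18-20 Wh is
# typical for recent Pro Max models; charging losses ignored).
email_energy_wh = 140
battery_wh = 20
charges = email_energy_wh / battery_wh
print(f"{charges:.1f} full charges")  # 7.0 full charges
```

So the "7 charges" figure only follows if you grant the 140 Wh number in the first place, which the rest of the thread disputes.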
Datacenter LLM tranches are 7-8 H100s per user at full load, which is around 4 kW.
Multiply that by generation time and you get your energy used. Say it takes 62 seconds to write an essay (a highly conservative figure).
That’s about 68.9 Wh, so you’re right.
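The back-of-envelope above, written out (the 4 kW and 62 s figures come straight from the comment; they are rough estimates, not measurements):

```python
# Energy = power x time. ~7-8 H100s at full load is taken as ~4 kW of
# board power, times 62 s of generation, converted from joules to Wh.
power_w = 4000                   # ~4 kW, per the comment above
gen_time_s = 62                  # assumed generation time
energy_j = power_w * gen_time_s  # watts x seconds = joules
energy_wh = energy_j / 3600      # 1 Wh = 3600 J
print(f"{energy_wh:.1f} Wh")     # 68.9 Wh
```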
Source: I’m an AI enthusiast
Does that account for cooling? Storage? Networking? Non-H100 compute and memory?
Nope. Just GPU board power draw. 60 seconds is also pretty long with how fast these enterprise cards are but I’m assuming they’re using a giant 450B or 1270B model.
kW is a unit of instantaneous power; kW/s makes no sense. Note how multiplying that by seconds would cancel time out and return you power again instead of energy. You got there in the end, though.
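The dimensional point above, tracked by hand (units in comments only; a proper units library would catch this automatically):

```python
# Dimensional check: power x time = energy, whereas a "power per second"
# figure multiplied by seconds just cancels back to power.
power_kw = 4.0   # kW  (instantaneous power)
time_s = 62.0    # s

energy_kj = power_kw * time_s  # kW * s = kJ  -> energy, what we want
# By contrast: (kW / s) * s = kW -> power again, not energy.
print(f"{energy_kj} kJ")  # 248.0 kJ
```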
Woop, noted, thanks
Well that’s of the same order of magnitude as the quoted figure. I was suggesting that it sounded vastly larger than it should be.
They’re probably factoring in cooling costs and a bunch of other overhead, I dunno