2000 times the usage of a household, taking your approximations as correct, for something that’s used by millions, or potentially billions, of people isn’t bad at all.
Probably comparable to 3D movies or many other industrial computing uses, like search indexers.
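For a rough sanity check, here’s the kind of back-of-envelope math behind that ratio. The household figure and user count below are my own assumptions for illustration, not numbers from this thread:

```python
# Back-of-envelope: "2000 households" of energy spread over a large user base.
# Assumed figures (not from this thread): an average household uses roughly
# 10,500 kWh per year, and the service has on the order of 100 million users.

HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average annual household consumption
EQUIVALENT_HOUSEHOLDS = 2_000     # the ratio cited above
USERS = 100_000_000               # assumed active user count

total_kwh = HOUSEHOLD_KWH_PER_YEAR * EQUIVALENT_HOUSEHOLDS
per_user_kwh = total_kwh / USERS

print(f"Total: {total_kwh / 1e6:.1f} GWh/year")        # ~21.0 GWh/year
print(f"Per user: {per_user_kwh * 1000:.0f} Wh/year")   # ~210 Wh/year
```

Under those assumptions it works out to a couple hundred watt-hours per user per year, which is the point being made.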
Yeah, but then they start “gaming”…
I just edited my comment, so no wonder you missed it.
In 2024, ChatGPT was projected to use 226.8 GWh. You see, if people are “gaming” 24/7, it is quite wasteful.
Edit: just in case it isn’t obvious: the hardware needs to be produced, the data collected, and they are scaling up. So my point was that even if you occasionally run a little bit of LLM locally, more energy is consumed than just the energy used for that one prompt.
Yeah, it’s ridiculous. GPT-4 serves billions of tokens every day, so if you take that into account, the cost per token is very, very low.
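Rough math on that, using the 226.8 GWh projection from upthread. The daily token volume is an assumed placeholder for “billions of tokens every day”, not a published figure:

```python
# Amortize the projected 226.8 GWh/year over an assumed serving volume.
# The token count is a placeholder assumption, not a published figure.

ANNUAL_GWH = 226.8
TOKENS_PER_DAY = 100e9        # assumed: 100 billion tokens per day
DAYS = 365

# Convert GWh to Wh, then divide by tokens served per year.
wh_per_token = (ANNUAL_GWH * 1e9) / (TOKENS_PER_DAY * DAYS)

print(f"{wh_per_token * 1000:.1f} mWh per token")   # ~6.2 mWh per token
```

So under those assumptions, each token costs on the order of single-digit milliwatt-hours of datacenter energy, which is why the per-token figure looks tiny even when the annual total sounds huge.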