In the week since a Chinese AI model called DeepSeek became a household name, a dizzying number of narratives have gained steam, with varying degrees of accuracy […] perhaps most notably, that DeepSeek’s new, more efficient approach means AI might not need to guzzle the massive amounts of energy that it currently does.
The latter notion is misleading, and new numbers shared with MIT Technology Review help show why. These early figures—based on the performance of one of DeepSeek’s smaller models on a small number of prompts—suggest it could be more energy-intensive when generating responses than the equivalent-size model from Meta. The issue might be that the energy it saves in training is offset by its more intensive techniques for answering questions, and by the long answers those techniques produce.
Add the fact that other tech firms, inspired by DeepSeek’s approach, may now start building their own similar low-cost reasoning models, and the outlook for energy consumption is already looking a lot less rosy.
Assuming DeepSeek can actually be run locally, you would just need a laptop, a dynamo, and the Poetic Edda to use as the installation prompt.
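(It can, more or less, if you settle for one of the small distilled variants rather than the full R1. A minimal sketch, assuming the Hugging Face transformers and torch packages and the publicly posted DeepSeek-R1-Distill-Qwen-1.5B checkpoint; the Edda prompt is, of course, illustrative:)

    # Minimal local-inference sketch. Assumes the transformers and
    # torch packages are installed and that the machine has enough
    # RAM for the 1.5B distilled model; the full R1 will not fit on
    # a laptop, dynamo or no dynamo.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Illustrative prompt, in keeping with the installation ritual above.
    prompt = "Summarize the main themes of the Poetic Edda."
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

(Reasoning models like this one tend to produce long chains of tokens per answer, which is exactly the inference-side energy cost the article is pointing at.)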
I remember someone posting about some Spanish guys living in an apartment wired so that all their power was generated locally: solar panels, an exercise bike fitted with a flywheel and generator, batteries charged from those two, and something else (?). They had a very lightweight static website hosted on something tiny that was up only when they had enough power, which was most of the time.
Foot-loom-powered ML.