From the Department of "Whomst Could Have Expected This"

This just came out recently:

  The reasons why the AI business is struggling are diverse but one is quite well known: AI platforms are notoriously expensive to operate. Platforms like ChatGPT and DALL-E burn through an enormous amount of computing power and companies are struggling to figure out how to reduce that footprint. At the same time, the infrastructure to run AI systems—like powerful, high-priced AI computer chips—can be quite expensive. The cloud capacity necessary to train algorithms and run AI systems, meanwhile, is also expanding at a frightening rate.

The article itself is about 25% more snarky and condescending[1] than I would personally have put it, but it's not wrong. Contrary to all manner of hare-brained optimism, we still live in a physical world where computation requires quite a lot of resources, and we've primarily been "scaling up" AI systems by adding more computational requirements. In fact, if this turns out to be true (I'd still wait for a second, non-Gizmodo source), it would seem that most "advances" in AI add operating costs for producers at a much faster rate than they add value for consumers. How's that for a "fast takeoff"?

Anyway, not to brag[2], but some of us called this one a while ago.


  [1] It is, I suppose, Gizmodo.

  [2] That's exactly what I'm doing.

