"To start with, I had previously assumed a 10-year depreciation curve, which I now recognize as quite unrealistic based upon the speed with which AI datacenter technology is advancing. Based on my conversations over the past month, the physical datacenters last for three to ten years, at most. Changes to cooling systems, chip and racking designs, power systems, and even overall layouts, mean that the buildings themselves are likely depreciating quite rapidly as well. Then when you consider that new GPU iterations, which seem to come out every year or two, effectively obsolete prior models, you realize that I should have been using a much faster depreciation curve across the whole capitalized structure.
However, shortening the depreciation curve to something in the three- to five-year range means that my prior breakeven revenue figure of $160 billion, the amount needed to justify 2025's capex spend, is woefully inadequate. Halving the assumed life roughly doubles the annual revenue required, and a three-year life roughly triples it. In reality, the industry probably needs something closer to $320 billion to $480 billion in revenue just to break even on the capex being spent this year. Because I wasn't educated on the intricacies of a datacenter, I wasn't bearish enough on the economics of AI datacenters. No wonder my new contacts in the industry shoulder a heavy burden, heavier than I could ever imagine. They know the truth.
Remember, the industry is spending over $30 billion a month (approximately $400 billion for 2025) and receiving only a bit more than a billion a month back in revenue. The mismatch is astonishing, and it ignores the fact that in 2026 hundreds of billions of dollars' worth of additional datacenters will be built, all needing additional revenue to justify their existence. Adding the two years together and using the math from my prior post, you'd need approximately $1 trillion in revenue just to break even, and many trillions more to earn an acceptable return on this spend."
https://pracap.com/an-ai-addendum/
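For anyone who wants to check the scaling themselves, here is a minimal sketch of the back-of-envelope math, assuming breakeven revenue scales inversely with the assumed asset life and anchoring on the post's $160 billion figure at a 10-year life. The inverse-proportional model and the variable names are my own simplification for illustration, not the author's exact methodology; the post's $480 billion upper bound looks like a rounded 3x multiplier rather than the strict 10/3 ratio.

```python
# Back-of-envelope sketch of the scaling described in the excerpt.
# Assumption (mine, not the author's): annual breakeven revenue scales
# inversely with the assumed depreciation life, anchored at the post's
# $160B/year figure for a 10-year life.

BASELINE_BREAKEVEN = 160e9   # $160B/year breakeven at a 10-year life (per the post)
BASELINE_LIFE_YEARS = 10

def breakeven_at_life(life_years: float) -> float:
    """Scale the baseline breakeven revenue inversely with the assumed asset life."""
    return BASELINE_BREAKEVEN * (BASELINE_LIFE_YEARS / life_years)

for life in (10, 5, 4, 3):
    print(f"{life}-year life: ~${breakeven_at_life(life) / 1e9:.0f}B/year to break even")

# 10-year life: ~$160B/year to break even
# 5-year life: ~$320B/year to break even
# 4-year life: ~$400B/year to break even
# 3-year life: ~$533B/year to break even
```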
#AI #GenerativeAI #AIBubble #PonziScheme #Economy #AIHype #DataCenters