The AI Depreciation Debate: How Utilisation Drives Value for Investors
Artificial Intelligence continues to reshape how the world computes, how digital services operate and where investors decide to place their capital. Nvidia’s most recent quarter illustrates this: $51.2 billion in data centre revenue, up 66% from 2024, with $65 billion in revenue guidance for the next quarter. These numbers do not point to an exhausted market; they show an industry still growing beyond expectations.
The depreciation debate around the Mag-7 has returned, fuelled by investor Michael Burry’s warning that Google, Microsoft, Meta and Amazon have extended the useful life of their servers and deferred an estimated $176 billion in expense. He’s correct that investors should question earnings when accounting assumptions shift.
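To see why that matters, consider a simplified straight-line depreciation sketch in Python. The figures and the four-to-six-year extension below are illustrative assumptions, not the hyperscalers’ actual numbers; the point is only that a longer assumed life lowers the annual charge and, all else equal, lifts reported earnings.

```python
# Illustrative only: straight-line depreciation on a hypothetical server fleet.
# None of these figures are the hyperscalers' actual numbers.

def annual_depreciation(capex: float, useful_life_years: int) -> float:
    """Straight-line depreciation: spread the asset cost evenly over its assumed life."""
    return capex / useful_life_years

capex = 100.0  # hypothetical $bn spent on servers

for life in (4, 6):
    charge = annual_depreciation(capex, life)
    print(f"Useful life of {life} years -> annual depreciation of ${charge:.1f}bn")

# Useful life of 4 years -> annual depreciation of $25.0bn
# Useful life of 6 years -> annual depreciation of $16.7bn
# Stretching the assumption from four to six years cuts the annual charge by roughly
# $8.3bn, which flows straight through to higher reported operating income.
```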
But in AI infrastructure, the assumption that stretched depreciation telegraphs hidden strain doesn’t hold. As the frontier moves forward, older chips don’t become idle; they cascade into new roles that keep generating revenue. Demand is broad enough, and the workload mix diverse enough, that useful life is determined by utilisation, not by accounting convention.
“This optimism is free of engineering spin and is proof positive that the economic life of AI hardware now outpaces historical depreciation curves.”
AI infrastructure does not behave like consumer electronics. When a new generation of hardware arrives, the previous one moves into a different role where it still generates revenues.
The most advanced frontier chips support the most demanding workloads: training foundation models, optimising existing systems and supporting major enterprise deployments. Demand is far greater than available capacity, so access is prioritised. That is a commercial reality, not a flaw in the system.
Previous-generation chips support the performance-sensitive and highly monetisable workloads of enterprise co-pilots, embeddings, recommendation engines, analytics and batch inference at more attractive economics: a deliberate allocation strategy that maximises returns.
Amin Vahdat, Google’s VP and GM of AI, Cloud and Infrastructure, noted recently that the company still operates some eight-year-old Tensor Processing Units (TPUs) at 100 per cent utilisation. The cascade model has nothing to do with optimism or accounting tricks: previous generations of chips continue to generate revenue while the scarce frontier chips are assigned to the tasks that create the most economic value.
“Scarce frontier chips are assigned to the tasks that create the most economic value.”
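To make the cascade concrete, here is a minimal sketch of that allocation logic. The chip tiers, capacities, workload demands and revenue figures are entirely hypothetical; the sketch only illustrates how high-value work absorbs the scarcest chips first while older generations stay busy on the rest.

```python
# A toy sketch of the cascade allocation described above.
# Chip tiers, capacities and workload values are hypothetical, for illustration only.

chip_tiers = [  # (tier, available capacity in arbitrary compute units), newest first
    ("frontier", 100),
    ("previous-gen", 300),
    ("older-gen", 500),
]

workloads = [  # (name, compute demand, revenue per unit of compute)
    ("foundation-model training", 90, 10.0),
    ("enterprise co-pilots", 150, 6.0),
    ("recommendation engines", 200, 4.0),
    ("batch inference", 400, 2.0),
]

# Greedy allocation: the highest-value workloads get the scarcest, most capable chips;
# whatever remains cascades down to older generations, which stay close to fully utilised.
remaining = dict(chip_tiers)
for name, demand, value in sorted(workloads, key=lambda w: w[2], reverse=True):
    for tier, _ in chip_tiers:
        if demand == 0:
            break
        placed = min(demand, remaining[tier])
        if placed:
            print(f"{name}: {placed} units on {tier} (${placed * value:.0f} revenue)")
            remaining[tier] -= placed
            demand -= placed

print("Idle capacity by tier:", remaining)
```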
Five years ago, most AI activity was concentrated in research and training. When the frontier moved forward, the workload of older chips diminished, so a three- or four-year cycle made sense. Today, the demand landscape is broader.
The bigger tech companies retire assets only when utilisation drops below commercial viability, and the marginal value of upgrades has shifted in turn. Swapping out a functioning chip for a newer model that offers only modest gains, on workloads that don’t need them, erodes return on capital. As a result, depreciation stretches and hardware lifecycles lengthen because the equipment remains economically useful for longer.
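A rough illustration of that upgrade decision, using purely hypothetical numbers, shows why replacing a working chip for a modest gain can be a poor use of capital:

```python
# Hypothetical numbers, for illustration only: the marginal economics of replacing
# a functioning chip with a newer model that offers only a modest gain.

existing_annual_revenue = 1.2   # $m per year from the chip on mid-tier workloads
upgraded_annual_revenue = 1.4   # $m per year if replaced; the gain is modest because
                                # the workloads it serves are not frontier-bound
new_chip_capex = 3.0            # $m to buy and install the replacement

incremental_revenue = upgraded_annual_revenue - existing_annual_revenue
payback_years = new_chip_capex / incremental_revenue

print(f"Incremental revenue from upgrading: ${incremental_revenue:.1f}m per year")
print(f"Payback on the upgrade: {payback_years:.0f} years")
# Roughly 15 years to pay back the upgrade, versus keeping a chip that already earns
# $1.2m a year on capital that has largely been written down.
```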
“The variety of workloads and the continued supply constraint at the frontier suggest that depreciation is tracking economic life rather than masking deterioration.”
If, however, frontier demand plateaus and the availability of middle-tier workloads contracts, older assets become redundant. If energy regulation or carbon policy then materially increases operating costs, utilisation would fall faster than revenue could offset those costs. At that point, extending asset life would be nothing more than delaying bad news.
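A simple break-even sketch, again with assumed figures, shows how rising operating costs raise the utilisation threshold below which an older chip stops paying its way:

```python
# Hypothetical break-even sketch: at what utilisation does an older chip stop
# covering its operating costs? All figures are assumptions for illustration.

hourly_rate = 1.50          # $ revenue per hour when the chip is busy
power_and_cooling = 0.60    # $ fixed operating cost per hour, incurred busy or idle
carbon_surcharge = 0.45     # $ additional cost per hour under a stricter carbon regime

def break_even_utilisation(rate: float, hourly_cost: float) -> float:
    """Fraction of hours the chip must be busy for revenue to cover its hourly cost."""
    return hourly_cost / rate

today = break_even_utilisation(hourly_rate, power_and_cooling)
with_surcharge = break_even_utilisation(hourly_rate, power_and_cooling + carbon_surcharge)

print(f"Break-even utilisation today:          {today:.0%}")
print(f"Break-even under higher energy costs:  {with_surcharge:.0%}")
# 40% today versus 70% under the costlier regime: if actual utilisation drifts below
# that threshold, extending the asset's life no longer adds value.
```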
But that is not the environment we are in today. The utilisation evidence, the variety of workloads and the continued supply constraint at the frontier suggest that depreciation is tracking active economic life rather than masking deterioration.
The Mag-7 depreciation debate is best read as a live case study in how frontier technologies defy historical patterns. For AI, every leap in capability creates new workloads faster than the hardware itself ages.
Michael Burry’s warning came in a social media post in which he claimed that "understating depreciation by extending useful life of assets artificially boosts earnings," and went further, calling it "one of the more common frauds of the modern era."
His words resonate because they force investors to consider the quality of earnings, which is not yet strained. The evidence, from 100 per cent utilisation of older chips to Nvidia’s order book, shows that economic obsolescence is lagging physical and performance obsolescence by several years. The cascade is tangible, the demand curve steep, and the supply wall remains intact.
For investors, the takeaway is clear: when technical architecture alters demand profiles, old valuation templates break. Discipline means understanding whether a new platform is finite or self-reinforcing. In the case of the AI models and systems of today, the platform is emphatically the latter. That does not make valuations automatically cheap, but it does make the depreciation critiques much harder to heed.
“For investors, the takeaway is clear: when technical architecture alters demand profiles, old valuation templates break.”
At Arbra Partners, we combine pragmatic analysis with a global macro lens to help clients understand shifting technologies and the market opportunities around them. We identify the structural trends that matter, ensuring capital is positioned with resilience, purpose and long-term advantage.