The Self-Digesting AI Machine. Why the Economics of AI Don't Stack Up, and Where Real Value Lies.

The Self-Digesting Machine

The current technological zeitgeist feels like a recurring dream for many investors. You know the one. It involves loading up on 'Magnificent Seven' stocks whenever they appear slightly discounted, on the assumption that the giants of the tech era will smoothly transition into the victors of the AI era. But a fundamental concern is being overlooked, not just about who wins and who loses, but about how software itself will look, and whether the underlying economics of the industry can survive the very technology it has chosen to celebrate.

The Raw Mechanics of Modern AI

To understand what is at stake, we must first look past the marketing of 'superintelligence' and examine the actual machinery. At their core, these systems are not sentient entities. They are mathematical functions whose parameters, a long sequence of numbers known as weights, are learned during training and then reused, query after query, to predict the next token in a sequence. Surrounding this foundation is a 'harness': a structured layer of prompts and decision-making logic that guides how the model interacts with the world. This harness handles routing. If a user asks for a concise answer, for instance, a higher-level layer might detect this and redirect the request to a smaller, more efficient model rather than the primary one.

This reveals something important: much of what we perceive as machine intelligence is actually clever infrastructure sitting on top of a foundation of mathematics. The magic is, in large part, in the scaffolding.
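
To make the scaffolding point concrete, here is a deliberately simplified, hypothetical sketch of what such a routing harness might look like. The model names, the system prompt, the brevity heuristic, and the `call_model` placeholder are all illustrative assumptions, not any vendor's actual implementation.

```python
# A minimal, hypothetical sketch of a "harness": the routing and prompt
# scaffolding that sits on top of one or more underlying models.
# All names here (models, prompt, call_model) are illustrative, not a real API.

SYSTEM_PROMPT = "You are a helpful assistant. Answer accurately and concisely."


def wants_cheap_route(user_prompt: str) -> bool:
    """Crude heuristic: short requests that ask for brevity can go to a smaller model."""
    text = user_prompt.lower()
    asks_for_brevity = any(word in text for word in ("concise", "brief", "summarise", "tl;dr"))
    return asks_for_brevity and len(user_prompt) < 500


def call_model(model_name: str, system: str, prompt: str) -> str:
    """Placeholder for a call to an inference endpoint; stubbed so the sketch runs."""
    return f"[{model_name}] response to: {prompt[:40]}..."


def harness(user_prompt: str) -> str:
    """Much of the perceived 'intelligence' is this plumbing: prompt construction,
    routing between models, and post-processing around the underlying weights."""
    model = "small-efficient-model" if wants_cheap_route(user_prompt) else "large-frontier-model"
    return call_model(model, SYSTEM_PROMPT, user_prompt)


if __name__ == "__main__":
    print(harness("Give me a concise summary of why harnesses matter."))
    print(harness("Walk me through a detailed, step-by-step migration plan for our data warehouse."))
```

Nothing in a sketch like this requires proprietary knowledge, which is precisely the point developed below.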

The Vulnerability at the Centre

This architecture has a critical weakness, and it flows in two directions simultaneously.

First, the weights themselves are extraordinarily difficult to protect. While initial training is expensive and painstaking, once a model is released, it can be distilled or replicated by competitors. This is no longer a theoretical concern. Anthropic has formally alleged that Chinese AI firms including DeepSeek, Moonshot AI, and MiniMax ran industrial-scale distillation campaigns, flooding Claude with specially crafted prompts across tens of thousands of accounts to train rival models. DeepSeek reportedly achieved frontier-level reasoning capabilities at a claimed training cost of around $5.6 million, a fraction of what US labs spend on comparable runs. The weights, despite representing billions of dollars of investment, are trending towards commoditisation.
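
The mechanism behind such distillation claims is not exotic. Below is a toy sketch of knowledge distillation in the general sense, a small 'student' network trained to match a larger 'teacher' network's output distribution; the architectures, data, and hyperparameters are placeholder assumptions for illustration, not a description of any lab's actual pipeline.

```python
# Toy illustration of knowledge distillation: a small "student" model is trained
# to reproduce the output distribution of a larger "teacher" model.
# Sizes, data, and hyperparameters are illustrative placeholders only.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
VOCAB = 100  # toy vocabulary size

# A "large" teacher and a much smaller student, both mapping a token to next-token logits.
teacher = nn.Sequential(nn.Embedding(VOCAB, 64), nn.Flatten(), nn.Linear(64, VOCAB))
student = nn.Sequential(nn.Embedding(VOCAB, 16), nn.Flatten(), nn.Linear(16, VOCAB))

optimiser = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 2.0

for step in range(200):
    # "Prompts": random token ids standing in for queries sent to the teacher.
    tokens = torch.randint(0, VOCAB, (32, 1))
    with torch.no_grad():
        teacher_logits = teacher(tokens)  # the teacher's next-token scores
    student_logits = student(tokens)
    # KL divergence between softened distributions: the standard distillation loss.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()

print(f"final distillation loss: {loss.item():.4f}")
```

The point is that the training signal is nothing more than the teacher's outputs: if a model's responses are reachable through an API, a competitor can, in principle, harvest enough of them to train a cheaper imitation.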

If the weights become essentially free to replicate, the only remaining proprietary layer is the harness. But this is where the second vulnerability emerges. In April 2026, Anthropic accidentally included a source map file in its public Claude Code npm package, exposing roughly 512,000 lines of TypeScript code, including the full system prompt, tool definitions, and the logic underlying Claude Code's agentic loop. It was downloaded and mirrored widely within hours. And even without such leaks, harnesses remain relatively simple constructs, reproducible by competent engineers without access to any proprietary material.

This is the crux of the self-digestion problem. AI systems are now capable enough to help build the very software layers that sit on top of them. The more capable models become, the more they erode the scarcity and defensibility of the products surrounding them. Value drains out of the upper layers of the stack and concentrates at the physical layer: the hardware, the energy, the infrastructure, while the software above it becomes increasingly generic. The technology is eating its own moat.

Enter the Post-GUI Era

This dynamic is reshaping not just AI economics, but the entire software industry. We are witnessing the early stages of a 'Post-GUI' era. The traditional Graphical User Interface, that forty-year-old paradigm of windows, buttons, and clicks, is not merely being updated. It is evaporating in favour of automated AI pipelines.

Consider what might be called the 'Palantir bet': instead of fifty people sitting at desks using various interfaces to perform tasks, clicking to send emails and navigating menus to generate reports, the future may involve a handful of engineers monitoring an automated AI pipeline across multiple screens. In this new paradigm, the GUI no longer serves as the tool through which humans do the work. It becomes a management layer for the AI agents that handle actual execution.

This has a direct consequence for enterprise spending. If businesses can automate entire workflows end-to-end through custom pipelines, they will almost certainly reduce their software subscriptions and per-seat licences. This is a structural threat to legacy software providers, not a cyclical one. The question is not whether this happens, but when.

What Happens to Companies Like Microsoft?

No company illustrates this risk more acutely than Microsoft. Historically, Microsoft's dominance was built on deliberate incompatibility. The company's internal strategy documents, cited in a 2004 European Commission ruling, explicitly described the Windows API as being kept broad to maintain lock-in. If you wanted to open a Word document, you had to own Word. If you wanted Excel macros to function correctly, you had to stay in the Microsoft ecosystem. That era is effectively over.

Today, a user can ask an AI to convert a document into almost any format, or indeed into something that is not a document at all: a video, song lyrics, or an interactive application. This bypasses the need for proprietary software entirely. The question becomes: what is Microsoft's proprietary product in a world where Word may be redundant, where PowerPoint can be replaced by a dynamic interactive app built by an AI in seconds, and where Windows itself faces pressure from AI-native operating systems and open alternatives such as Linux?

Other than the inertia of institutional habit, the honest answer is not obvious. Enterprises do not abandon familiar tools quickly, procurement cycles are long, and retraining costs are real. But inertia is a diminishing asset, not a durable one.

Even Azure, Microsoft's supposed hedge against this scenario, warrants scrutiny. Azure grew revenue by 34 per cent in fiscal year 2025, reaching $75 billion, numbers that look impressive in isolation. But a portion of that growth comes from Microsoft hosting its own legacy business products and its substantial investment in OpenAI, which uses Azure infrastructure. A meaningful share of Azure's growth is, in effect, Microsoft paying itself. That is not the same as building bare-metal infrastructure that independent developers and enterprises choose purely on its merits, the way they choose AWS.

The Valuation Problem

This brings us to a more immediate concern: the precariousness of current valuations. Legacy software companies frequently trade at twenty to forty times earnings, valuations predicated on the assumption of perpetual growth. If earnings structurally decline due to the collapse of the per-seat model, investors are effectively paying today for roughly forty years of current earnings from a core business that is actively contracting.
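
To put rough numbers on that claim (purely illustrative assumptions, not forecasts for any particular company), the sketch below counts how many years of cumulative earnings are needed to recover a purchase made at forty times earnings under growing, flat, and mildly declining earnings.

```python
# Back-of-the-envelope: years of cumulative earnings needed to "pay back"
# a purchase made at a given P/E, under different earnings trajectories.
# The growth rates are illustrative assumptions, not forecasts.

def payback_years(pe, annual_growth, max_years=200):
    """Return the year in which cumulative earnings first cover the price, else None."""
    price, earnings, cumulative = pe, 1.0, 0.0  # normalise current annual earnings to 1
    for year in range(1, max_years + 1):
        cumulative += earnings
        if cumulative >= price:
            return year
        earnings *= 1.0 + annual_growth
    return None  # never recovered within the horizon

for growth in (0.05, 0.0, -0.03):
    years = payback_years(40, growth)
    label = f"~{years} years" if years else "not recovered within 200 years"
    print(f"P/E 40, earnings growth {growth:+.0%}: {label}")
```

At 5 per cent growth the purchase price is recovered in roughly 23 years of earnings; with flat earnings it takes the full 40; and with a modest 3 per cent annual decline, cumulative earnings never reach the purchase price at all, which is the scenario a collapse of the per-seat model implies.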

There is also the deeper question of whether AI can turn a profit at all in its current form. We can try to draw a comparison to Uber. For years, Uber subsidised cheap rides to eliminate competition before eventually raising prices once the market was captured. But unlike Uber, the AI sector faces a unique problem: competitors can replicate capabilities so rapidly that the venture capital strategy of flooding the market to achieve dominance may not work if the business model is simultaneously consuming itself. You cannot establish a moat with a product that teaches others how to build the same moat.

There is a further wrinkle that the Uber analogy misses. Uber's competitors were constrained by physical assets: vehicles, drivers, regulatory licences. AI competitors face no such constraint. A well-resourced team with access to distilled weights and open infrastructure can close the capability gap in months. The winner-takes-all assumptions baked into current AI valuations may simply not apply.

Where the Real Value Lies

As software becomes a commodity, value migrates to what cannot easily be replicated: the physical infrastructure required to run the mathematics. Whilst the weights may eventually be free to obtain, running inference at scale is not. It demands massive energy consumption, dense data centres, and specialised hardware. Whoever controls this layer controls the margin.

This is why companies such as Apple, Google, and Amazon are better positioned than the conventional narrative suggests. Apple benefits from the eventual need for AI to run locally on hardware, a transition that plays directly to its chip design expertise, as demonstrated by the move to M-series silicon. When Apple decides it is falling behind, it has a proven history of catching up, as it did when it brought chip design in-house rather than remaining dependent on external suppliers. Amazon's AWS and Google Cloud, meanwhile, provide the infrastructure that developers building applications at genuine scale actually require. These are not software bets; they are infrastructure bets, and infrastructure has historically been where durable value concentrates during major technological transitions.

NVIDIA's position deserves more scepticism than it typically receives. The company has ridden the initial surge with remarkable success, and the Cisco comparison is instructive here: Cisco was similarly indispensable to the early build-out of the internet, briefly became the world's most valuable company in March 2000, and subsequently saw its stock fall by nearly 90 per cent and never fully recover to its peak. The parallel is not perfect. NVIDIA's fundamentals are considerably stronger than Cisco's were at its peak, with more grounded earnings multiples and genuine revenue growth. But the structural concern holds: as AI infrastructure matures, the demand for expensive, general-purpose GPUs may give way to purpose-built, application-specific chips designed by the hyperscalers themselves. Amazon and Google are actively building their own AI accelerators. The CUDA ecosystem provides lock-in, but it is the kind of lock-in that well-resourced competitors have every incentive to dissolve.

The Cigar Butt

For Microsoft specifically, the situation may resemble what Warren Buffett calls a 'cigar butt' investment: a stock that may have been oversold, with one last puff of value remaining. The structural threats to its software business are real, but they are unlikely to manifest fully in earnings for another ten to twenty years. This creates a window in which the stock could recover as markets realise the disruption is not imminent, and the potential upside might heavily outweigh the downside. Either Microsoft catches up with the other big players, or, if it does not, it is still likely to perform above average for another decade or so.

The Deeper Shift

What we are witnessing is a reconfiguration of where scarcity lives in the technology stack. For decades, scarcity lived in proprietary formats, incompatible file systems, and the accumulated network effects of dominant software platforms. AI is dissolving those barriers faster than any previous technology, and in doing so it is dissolving the moats of the companies that built them. And the smarter AI becomes, the cheaper it becomes, because a more capable model makes the entire stack above it cheaper to build and to run.

So this is my bet: the physical world (energy, silicon, cooling, transmission) will remain stubbornly expensive for now. In this new landscape, the winner is not the one who provides the most polished interface. The winner is the one who controls the hardware and the pipelines that allow the mathematics to run as cheaply and efficiently as possible.

We are not simply watching software evolve. We are watching it be digested by the very tools it helped create. The question for investors is not whether this process is underway. It clearly is. The question is how much time remains before the earnings statements catch up with the logic.