The AI bubble is nearing its end. Here’s how to get ready.

OpenAI’s Revenue and Financial Challenges

OpenAI has reportedly confirmed monthly revenue of around $2 billion, putting it on track for an annual run rate of $24 billion by April 2026, a figure that would have been hard to imagine just two years ago. Internal forecasts, however, suggest the company could burn through as much as $17 billion in cash this year, and projections point to a loss of roughly $14 billion in 2026 even with revenue expected to exceed $28 billion.

Widely considered the world's most valuable AI company, OpenAI is backed by Microsoft and a long list of venture capitalists, yet it burns cash so quickly that spending consumes a large share of its revenue. As one might summarize: "This is what happens when the subsidy ends."

Anthropic faces similar pressure. By early 2026 it expects its annual revenue run rate to reach $30 billion, yet analysts estimate the company may be losing between 200% and 3,000% of each subscriber's fee on its Claude Code tools. The funding keeps coming anyway.

Big Tech plans to pour around $700 billion into AI infrastructure in 2026, up sharply from the roughly $400 billion spent previously. Nvidia, the world's most valuable company since June 2024, and a wave of AI startups continue to raise money at valuations priced on potential rather than realized revenue.

However, the gap between spending and actual earnings keeps widening. Back in 2023, David Cahn of Sequoia Capital questioned whether AI revenue could ever catch up to infrastructure spending, and the gap he described has only grown as investment levels have soared.

Something has to be done.

The Subscription Dilemma

Anthropic's Claude Max subscription is priced at $200 monthly. At first glance that seems steep, but look at what power users actually consume and it starts to look like a bargain.

Reports suggest a single $200 subscription can consume up to $5,000 worth of compute per month, though some analysts put the real figure closer to $500. Either way, Anthropic is generously subsidizing its heaviest users.
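The arithmetic behind that subsidy is simple enough to sketch. Taking the figures reported above as given, a hypothetical `subsidy_ratio` helper (the name and shape of the function are mine, not Anthropic's accounting) expresses the loss as a percentage of the subscriber's fee:

```python
# Hypothetical unit-economics sketch using the article's reported figures:
# a $200/month fee against two estimates of monthly compute cost.
def subsidy_ratio(monthly_fee: float, compute_cost: float) -> float:
    """Loss as a percentage of the subscriber's fee."""
    return (compute_cost - monthly_fee) / monthly_fee * 100

fee = 200.0
for cost in (5_000.0, 500.0):  # high and low compute-cost estimates
    print(f"${cost:,.0f} compute cost -> loss of "
          f"{subsidy_ratio(fee, cost):.0f}% of the fee")
```

The two estimates imply losses of 2,400% and 150% of the fee respectively, which brackets the 200% to 3,000% range analysts cite for Claude Code subscribers.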

OpenAI has adopted a similar model with its pricing of $20 for ChatGPT Plus and $200 for Pro plans, strategically aimed at acquiring market share rather than enhancing user profitability.

This is what could be termed a “subscription lie.” Consumers aren’t actually paying for the full service—they’re benefiting from subsidized trials.

Even Amazon, founded in 1994, famously operated at a loss for years; Jeff Bezos himself acknowledged its unprofitability in a 2000 BBC interview. Uber ran the same playbook more aggressively, losing nearly $30 billion through heavy subsidization before posting its first annual profit in 2023, achieved through fare hikes once it had established market dominance.

This approach is only sustainable until funds run dry, at which point customers will bear the costs.

Job Cuts in Tech

In 2026, tech firms are expected to cut tens of thousands of jobs, with Oracle and Amazon laying off thousands to redirect funds toward AI infrastructure. Salesforce announced that AI agents had replaced 4,000 customer support positions, and Coinbase cut 14% of its workforce as part of a shift toward an AI-focused operating model.

The question is whether these companies will actually come out ahead. An Nvidia executive has pointed out that the computational costs involved can exceed the salaries of the human workers being replaced.

Amazon has also discovered that the transition isn't smooth. After laying off engineers, a proprietary AI tool malfunctioned and cost the company a substantial number of orders, forcing an urgent engineering meeting to untangle problems introduced by AI-assisted coding changes.

The Hidden Costs of AI

AI isn't just software. It consumes substantial physical resources: steel, copper, and above all energy. The tokens rented for $20 a month are generated in vast, expensive data centers, and the electricity bill isn't underwritten by venture funding alone.

The scale is astonishing: the International Energy Agency estimates that data centers consumed around 415 terawatt-hours in 2024, a figure expected to triple by 2035. That cost will inevitably fall on the consumer.
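A quick back-of-the-envelope check makes that trajectory concrete. Treating the cited figures as given (415 TWh in 2024, tripling by 2035), a short sketch computes the implied compound annual growth rate:

```python
# Growth implied by the IEA figures cited above:
# 415 TWh in 2024, roughly tripling by 2035.
base_twh, multiple, years = 415.0, 3.0, 2035 - 2024

cagr = multiple ** (1 / years) - 1  # compound annual growth rate
print(f"Implied growth: {cagr:.1%} per year")
print(f"Implied 2035 consumption: {base_twh * multiple:,.0f} TWh")
```

A growth rate of roughly 10% a year, sustained for a decade, is exactly the kind of demand curve utilities plan rate hikes around.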

Household electricity bills have surged about 30% since 2020, outpacing inflation, and prices have jumped as much as 267% in regions with burgeoning data centers. In states like Virginia and Ohio, regulators are beginning to approve rate hikes that will ripple into consumers' bills over time.

Thus, in most regions, it’s the consumers who will be footing the bill for the energy-intensive operations underpinning AI innovation, rather than the tech companies profiting from it.

Facing the Difficult Truth

This scenario paints a picture of dependency. When companies cut human staff in favor of AI systems, they shift budget from payroll to API bills. With fewer seasoned engineers left to catch mistakes, that reliance becomes precarious.

OpenAI and Anthropic may eventually have to adjust their pricing significantly to cover their costs. One could easily foresee a situation where a $200 Pro plan could rise to $800, while users might see the free tier vanish overnight.

Once key employees are laid off, they often don’t return. Their knowledge leaves with them, and businesses that have integrated AI into their operations may find themselves stuck in vendor arrangements with substantial annual costs.

Jim Covello of Goldman Sachs raised the same concern in 2024: companies are spending enormous sums on generative AI without seeing commensurate benefits, and the exorbitant cost of these tools is not justified by their value.

This presents a trap; while AI has immense potential, the disparity between expenses and profits continues to widen, hampering companies’ ability to disentangle themselves from these commitments.

The Future of AI

Even if the bubble bursts, some aspects of the AI landscape will endure. Local models running on consumer-grade hardware offer an avenue to avoid the costs of API usage. A high-end RTX 4090 can now manage complex language models that previously needed specialized setups.
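A rough capacity estimate shows why consumer hardware has become viable. Weight memory scales with parameter count times bits per weight; the formula and the 20% overhead factor below are illustrative assumptions for a sketch, not vendor specifications:

```python
# Rough sketch: VRAM needed to load a quantized model. One billion
# parameters at 8 bits is about 1 GB of weights; the 1.2 factor is an
# assumed allowance for KV cache and activations.
def weight_gb(params_billion: float, bits: float, overhead: float = 1.2) -> float:
    """Approximate VRAM (GB) needed to host a quantized model."""
    return params_billion * (bits / 8) * overhead

RTX_4090_VRAM_GB = 24
for params, bits in [(70, 4), (34, 4), (13, 8)]:
    need = weight_gb(params, bits)
    verdict = "fits" if need <= RTX_4090_VRAM_GB else "does not fit"
    print(f"{params}B model @ {bits}-bit: ~{need:.0f} GB -> {verdict} in 24 GB")
```

Under these assumptions, a 4-bit 34B model fits comfortably in a 4090's 24 GB while a 70B model does not, which is why aggressive quantization has been central to the local-model ecosystem.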

Companies that purchase their hardware rather than leasing it are likely to emerge in a strong position. Owning tools and data means greater control over operations. Those whose business models hinge solely on API access lack real ownership.

Ultimately, as AI firms seek to recoup their substantial investments, prices are likely to climb, imposing stricter conditions on dependent businesses.

In past market crashes, such as the dot-com bubble or telecom industry downturn, significant wealth was lost. However, core technologies survived, and it’s reasonable to expect that AI will do the same. The real mystery remains: will current billion-dollar valuations persist, or will substantial fallout occur, particularly affecting businesses opting for profit-losing APIs? This shake-up may reshape the industry significantly.
