After the surge in memory, where are the sectors in the AI industry chain that are truly worth betting on for the long term?

By Cointime · 2026/05/12 05:46

Over the past two years, the AI sector in the US stock market has soared, and the industry chain has generated trillions of dollars in new market value. But that wealth is distributed extremely unevenly: upstream, a single company, Nvidia, commands a $4.5 trillion market cap and a towering 73% gross margin; midstream, the combined annualized revenue of OpenAI and Anthropic is only about $45 billion; downstream, application and infrastructure players such as CoreWeave, Cursor, and Perplexity are still raising money and burning it at the same time.

The entire AI industry chain forms a clearly top-heavy pyramid: the closer to the upstream chips, the fatter the profits and the higher the barriers; the further downstream, the fiercer the competition and the harder it is to turn a profit.

Will capital keep concentrating upstream, or gradually migrate toward midstream infrastructure and downstream applications? Stanford University has launched a new course, MS&E 435, bringing together nine core industry leaders to analyze where value will flow in the AI industry chain over the next decade, across dimensions such as profit distribution, competitive dynamics, cost structure, and organizational change.

1. The AI profit structure has hardened: the chip layer takes nearly 80% of gross profit, dwarfing its cloud-era share

As early as the start of 2024, investor Apoorv Agrawal published a report titled "Generative AI Economics" with a striking conclusion: the AI chip layer captures 83% of the entire industry chain's gross profit.

Two years later, the AI ecosystem has grown from $90 billion to $435 billion, but the profit structure has barely budged:

The annual revenue of the chip layer is about 300 billion US dollars, with Nvidia alone holding 80% of the market share;

Infrastructure layer revenue of approximately 75 billion US dollars;

The revenue of the application layer is about 60 billion US dollars.

The gross margin gap among the three layers is stark: 73% for the chip layer, 55% for the infrastructure layer, and only 33% for the application layer. In absolute terms, that is roughly $225 billion of gross profit for the chip layer, $40 billion for infrastructure, and $20 billion for applications.
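As a sanity check, the gross-profit totals are just revenue times margin. A minimal Python sketch using the article's round figures (the products land close to the quoted $225B / $40B / $20B; the small gaps are rounding in the source):

```python
# Gross profit = revenue x gross margin, using the article's figures.
# Revenue in billions of USD; margins as fractions.
layers = {
    "chip":           (300, 0.73),
    "infrastructure": ( 75, 0.55),
    "application":    ( 60, 0.33),
}

gross_profit = {name: revenue * margin for name, (revenue, margin) in layers.items()}

for name, gp in gross_profit.items():
    print(f"{name:>14}: ~${gp:.0f}B gross profit")
```

The chip layer's product alone exceeds the other two layers' combined revenue, which is the crux of the pyramid the article describes.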

Compared with the cloud computing era of the previous technology cycle, the landscape is completely inverted: in the traditional cloud stack, the chip layer captured only 6% of gross profit while the application layer took 70%. In today's AI industry, hardware enjoys the fat margins while software and applications are squeezed thin.

A more precise summary of the landscape: the chip layer is a one-player game, the application layer is a two-player game, and the infrastructure layer in between is a crowded battlefield where many players fight fiercely.

2. The chip layer stays dominant: Nvidia's moat is hard to breach, and a $10 trillion market cap is not just talk

Altimeter partner Brad Gerstner and Nvidia executive Sunny Madra broke down why the chip layer's position is so hard to overturn.

The market has consistently underestimated Nvidia's growth and valuation: it carries a $4.5 trillion market cap at a P/E of only about 13, less than half the market average, while revenue keeps growing at roughly 70% a year. Brad boldly predicts that Nvidia will become the world's first company worth $10 trillion.

That confidence rests on two hard supports:

Orders are locked in far ahead: trillions of dollars of orders in hand over the next eight quarters, with demand far outstripping supply;

The industry has fully shifted from pre-training into the inference era, where demand could grow as much as a billion-fold, with no ceiling in sight for computing power.

The underlying cost logic reinforces this: the compute required to generate each token grows with both model parameter count and context length, so aggregate demand for computing power only compounds.
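As a rough illustration of how per-token compute scales with model size and context length, here is a common back-of-envelope estimate. It assumes ~2 FLOPs per parameter for the weight matmuls plus an attention term linear in context length; the model dimensions below are hypothetical, and the exact formula quoted in industry talks may differ:

```python
def flops_per_token(n_params: float, n_layers: int, d_model: int, context_len: int) -> float:
    """Rough forward-pass FLOPs to generate one token.

    dense:     ~2 FLOPs per parameter for the weight multiplications
    attention: reading the KV cache grows linearly with context length
    A sketch under stated assumptions, not an exact cost model.
    """
    dense = 2.0 * n_params
    attention = 2.0 * n_layers * context_len * d_model
    return dense + attention

# Hypothetical 70B-parameter model: 80 layers, d_model = 8192.
short_ctx = flops_per_token(70e9, 80, 8192, context_len=4_000)
long_ctx = flops_per_token(70e9, 80, 8192, context_len=128_000)
print(f"128k context costs {long_ctx / short_ctx:.1f}x the FLOPs per token of 4k")
```

Even this simple sketch shows why longer contexts and bigger models multiply, rather than add to, total compute demand.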

Custom silicon is indeed emerging across the industry: Google's seventh-generation TPU, Ironwood, is in mass production and has drawn major orders from Anthropic; Amazon has deployed 1.4 million Trainium2 instances, generating over $10 billion in annual revenue; Microsoft's Maia 200 is landing on Azure; and OpenAI is partnering with Broadcom on its own ASICs.

But Jensen Huang is dismissive: most ASIC projects will ultimately be cancelled. Even if every in-house chip from Google, Amazon, and Microsoft succeeds, it will be hard to shake Nvidia's foundation: not because the competitors are weak, but because the AI compute market is large enough to accommodate every player, and Nvidia sits at the top of the pyramid.

3. The overlooked hard dividend: electricity and infrastructure, the undervalued core of AI costs

Crusoe founder Chase Lochmiller laid bare the most easily overlooked side of the AI industry from a practitioner's perspective: behind surging compute lies runaway growth in electricity, infrastructure, and labor costs.

Crusoe is building a 2.1-gigawatt data center campus in Texas with the largest private substation in the United States; its power draw is comparable to two cities the size of Denver. The construction workforce alone numbers 9,000, a striking figure for a local town of about 120,000 residents.

The cost breakdown is even more counterintuitive: a data center costs about $19 million per megawatt to build, and the biggest line item is not chips or cooling equipment but labor, at up to $4.7 million per megawatt. A gigawatt-scale campus spends $4.7 billion on labor alone.
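The per-megawatt figures above scale up by simple multiplication. A minimal sketch, assuming costs grow linearly with capacity (the campus sizes are from the article; the linear scaling is an assumption):

```python
# Back-of-envelope campus economics from the article's per-megawatt figures.
# All amounts in USD; assumes costs scale linearly with capacity.
COST_PER_MW_TOTAL = 19_000_000  # total build cost per MW
COST_PER_MW_LABOR = 4_700_000   # labor portion per MW

def campus_cost(megawatts: float) -> dict:
    """Scale the per-MW figures up to a whole campus."""
    return {
        "total_usd": megawatts * COST_PER_MW_TOTAL,
        "labor_usd": megawatts * COST_PER_MW_LABOR,
        "labor_share": COST_PER_MW_LABOR / COST_PER_MW_TOTAL,
    }

one_gw = campus_cost(1_000)   # a 1 GW park
crusoe = campus_cost(2_100)   # Crusoe's 2.1 GW Texas campus

print(f"1 GW park labor bill: ${one_gw['labor_usd'] / 1e9:.1f}B "
      f"({one_gw['labor_share']:.0%} of build cost)")
print(f"2.1 GW campus total: ${crusoe['total_usd'] / 1e9:.1f}B")
```

At these numbers, labor is roughly a quarter of the total build cost, which is the point Lochmiller is making: the bottleneck is not just silicon.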

On top of that, core equipment prices have tripled in three years: gas turbines have gone from $1 million to $3 million per megawatt. The root cause is stagnant capacity at the four major manufacturers (GE, Siemens, Mitsubishi Heavy Industries, and Pratt & Whitney) while AI-driven demand for power has multiplied.

Traditional electrical giants like Eaton and Schneider may look safe, but AI is restructuring the power architecture itself: from 765 kV high-voltage transmission down to 900 V DC inside the rack, the entire power-conversion chain needs to be redesigned. Veteran equipment makers will keep benefiting for now, but over the long term the industry's rules will be completely rewritten.

4. The application layer's turning point lies not in the models, but in reorganizing the enterprise

Databricks CEO Ali Ghodsi offered a provocative view: by the definition UC Berkeley's AMP Lab used back in 2009, AGI has already been achieved; people simply keep raising the bar and moving the goalposts, creating the illusion that AI falls short of expectations.

MIT data underscores the dilemma: 95% of enterprise AI pilots ultimately fail. The core reason is not weak models but the vast amount of unwritten, tacit organizational knowledge inside companies: the experience of veterans with twenty years in the industry and the informal process rules are invisible barriers that current AI models cannot read or replicate.

This closely parallels the electric motor of the Industrial Revolution: invented around 1880, it did not meaningfully boost productivity until about 1920. For the forty years in between, companies simply swapped steam engines for electric motors without tearing down and rebuilding their factories and production processes.

The same applies to AI adoption: most companies simply bolt AI onto old processes, which inevitably yields little. Databricks' own case study confirms this: using AI to optimize data connectors the conventional way saved only a month and a half; after thoroughly dismantling and rebuilding the organization and workflow, efficiency improved qualitatively.

Ali's judgment: the real opportunity at the application layer belongs not to stronger models, but to players willing to rewrite enterprise organizational logic and rebuild business processes. The pace of AI adoption is set by human and organizational change, not by waiting for GPT-6 or Opus-5.

5. The final verdict: bet upstream for cash flow, or bet on the application layer's ten-year trend?

There is an unchanging law in the technology industry: the value of the technology stack will eventually climb from the bottom hardware to the upper software and applications.

It took cloud computing a full fifteen years to shift from hardware dominance to software dominance. The AI industry chain will eventually complete the same value reversal, but the process will be long: either the application layer keeps growing until it captures the profit, or the chip layer's 73% gross margin gradually falls back toward the 6% level of cloud-era hardware.

Both trends are already underway, but slowly. Judging by past industry cycles, it will take at least ten years for the application layer's profit share to catch up to where software stood in the cloud era.

This leads to a clear investment logic:

Short term (1-2 years): prioritize upstream sectors close to compute and chips, where certainty is strongest and cash flow lands fastest;

Medium to long term (5-10 years): position in the application layer and enterprise services, betting on the long-run migration of value up the stack;

Until a disruptive technology restructures compute costs and breaks the chip layer's high gross margins, the closer to the chip, the closer to the profit.

The surge in AI memory and computing power is not the end point but the starting point of a profit redistribution across the industry chain. Only by understanding the three-tier structure, the cost structure, and the organizational shift can one pick the tracks truly worth holding for the long term.


