The artificial intelligence (AI) infrastructure build-out remains one of the biggest growth drivers the stock market has ever seen. However, the AI market is shifting in a way that could lead to new leaders emerging. The first phase of AI was all about training foundational large language models (LLMs), while inference and AI agents are dominating the second phase.
Nvidia (NVDA 4.39%) was the big winner in phase 1 of the AI build-out, as its graphics processing units (GPUs) became the engines for training AI models. The company created a wide moat through its CUDA software, which it had earlier seeded in universities and research labs conducting early AI work. As a result, most foundational AI code was written on Nvidia software and optimized for its chips.
However, inference is less technically demanding than LLM training, and developers today tend to work higher up the software stack on open-source AI frameworks such as OpenAI's Triton, which helps level the playing field. While Nvidia is likely to remain the king of AI training, and has positioned itself better for inference through its acquisition of Groq's assets and its language processing units (LPUs), other companies now look to be big winners with huge growth runways, including Broadcom (AVGO 3.26%) and Advanced Micro Devices (AMD 5.69%).
Broadcom: A winner in custom chips
One of the big current AI infrastructure trends is hyperscalers (owners of huge data centers) looking to diversify the AI accelerators they use, including developing their own. One of the companies they are increasingly turning to for help is Broadcom, which is a leader in ASICs (application-specific integrated circuits).
ASICs are custom chips that are hardwired for a specific purpose. They lack the flexibility of general-purpose GPUs, but they tend to perform well at their intended tasks while being more energy-efficient. This is particularly important for inference, since power is a major ongoing expense.

Broadcom helped Alphabet develop its highly successful tensor processing units (TPUs). These chips, which Alphabet is now letting a few select customers buy directly from Broadcom, are a huge revenue driver for the company. Meanwhile, other major AI players, including OpenAI and Meta Platforms, are also turning to Broadcom to help them develop custom AI ASICs of their own.
Broadcom has said it has a clear line of sight to more than $100 billion in AI ASIC revenue in its fiscal 2027 alone. At the same time, the company is a leader in the fast-growing data center networking space, which becomes even more important as chip cluster sizes grow. Between these two opportunities, it is set up for huge growth.
AMD: An inference and agentic AI winner
Like Broadcom, AMD has an opportunity in inference. The company's ROCm software platform has improved immensely over the past two years, and its modular chiplet design, which can pack in more memory, is well suited to the task. Inference tends to be more memory-bound than compute-bound, and AMD's next-generation chips are expected to offer 1.5 times the memory capacity of Nvidia's upcoming Rubin chips.
AMD has struck two large deals, with OpenAI and Meta Platforms, for six gigawatts' worth of GPUs each, deals that could each be worth around $100 billion. The company is also reportedly working on a large GPU deal with Anthropic.

In addition to its GPU opportunity, the company is set to ride another powerful trend: agentic AI. AMD is the leader in data center central processing units (CPUs), and as AI agents proliferate, the GPU-to-CPU ratio in AI servers is expected to shift from 8-to-1 toward 1-to-1. That's because CPUs will need to handle the sequential reasoning and tool orchestration that AI agents require. AMD pegs this market at $120 billion over the next few years.
Between its GPU and CPU opportunities, AMD looks poised to be a big winner in the next phase of AI.