Semiconductor giant Nvidia (NVDA) returned with stellar results for the fiscal first quarter of 2025 (ended April 28, 2024). Revenue and earnings surged by an explosive 262% and 690% year over year, respectively. Data center revenue soared 427% year over year to $22.6 billion, accounting for 87% of the company's first-quarter revenue.

The growing demand for artificial intelligence (AI) chips in the data center segment has undoubtedly played a major role in Nvidia's blockbuster quarterly results and the dramatic increase in the company's valuation.

Shares of Nvidia have gained nearly 129% so far in 2024 and reached record highs. Can the stock go even higher, or is it close to its peak?

AI catalyst

Nvidia continues to see robust demand for its market-leading Hopper GPU architecture chips (H100 and H200) and its next-generation Blackwell architecture chips, thanks to rapid advancements in generative AI and its adoption by cloud service providers, enterprise customers, consumer internet companies, sovereigns, and start-ups. Since generative AI applications and the underlying large language models consume large amounts of GPU computing resources for training and inference (building models, then running them in real time), demand for Nvidia's AI-optimized Hopper and Blackwell GPUs is set to keep growing in the coming months.

Companies and sovereigns are actively transitioning the trillion-dollar installed base of data center infrastructure from general-purpose computing built on "dumb" network interface controllers (NICs) and CPUs to accelerated computing. Nvidia also expects many clients to upgrade their accelerated computing infrastructure from H100 GPUs to the more advanced H200 GPUs (which offer double the inference performance of H100 chips at a lower cost) and then to next-generation Blackwell architecture systems.

While the company plans to begin shipping H200 and Blackwell chips in the second quarter, it expects demand for these chips to outpace supply well into next year. That supply-demand imbalance should give Nvidia significant pricing power for its GPUs.

Nvidia's Compute Unified Device Architecture (CUDA) software stack -- a parallel programming platform optimized for accelerating AI workloads across the company's hardware portfolio -- has long been a major competitive advantage. CUDA's versatility, scalability, and performance played a crucial role in enabling the company to rapidly adapt to new AI workloads and use cases over the past decade. That advantage persists: recent CUDA algorithm innovations improved inference performance for certain popular models on H100 chips by 3x, translating into an almost 3x cost reduction.

Inferencing requires more computation power

While AI training has been the primary driver of Nvidia's data center business, inferencing workloads are fast proving to be an even bigger opportunity because of their greater computing needs. The company estimates that inferencing accounted for about 40% of its data center revenue over the trailing four quarters.

Given the increasing complexity of inferencing models and rising demand for AI computing across industries (driven by an expanding user base and more queries per user), inferencing workloads will require even more computing power in the next few years. Nvidia is positioned to capitalize on these opportunities with its Hopper and Blackwell architecture GPUs, its CUDA software stack, and a recently introduced inference-optimized software offering called Nvidia Inference Microservices (NIM).

Geographic market expansion

Nvidia expects to diversify its data center revenue by targeting the sovereign AI market. Countries worldwide are building domestic AI infrastructure and capabilities using their data, business networks, and workforces. These "sovereign AI" buildouts in countries such as Japan, Italy, France, and Singapore are driving demand for Nvidia's AI computing products.

This trend of countries reducing their overreliance on foreign technology players is helping Nvidia diversify its revenue base across geographies. CEO Jensen Huang expects the sovereign AI market to be worth high-single-digit billions of dollars in fiscal 2025, representing a significant revenue catalyst for the company.

Stock split

Nvidia recently announced a 10-for-1 stock split effective June 10, 2024. Although the split does not change the fundamentals of the company, it makes the shares more accessible to retail investors -- especially those who cannot purchase fractional shares. That expanded investor base could lift the company's share price, especially since Nvidia boasts a remarkable financial growth trajectory and robust growth drivers.
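The mechanics of the split are simple arithmetic, and a quick sketch shows why it leaves fundamentals untouched. The numbers below are purely hypothetical, chosen for illustration rather than taken from Nvidia's actual pre-split price:

```python
# Illustrative 10-for-1 stock split mechanics (hypothetical numbers).
# A split changes the share count and per-share price, but not the
# total value of an investor's position or the company's market cap.
pre_split_price = 1200.0   # hypothetical pre-split price per share
shares_owned = 5           # hypothetical holding

post_split_price = pre_split_price / 10   # price divides by 10...
post_split_shares = shares_owned * 10     # ...share count multiplies by 10

# Position value is unchanged by the split.
assert shares_owned * pre_split_price == post_split_shares * post_split_price
print(post_split_price, post_split_shares)  # 120.0 50
```

The only real-world effect is accessibility: a smaller per-share price lowers the minimum buy-in for investors whose brokers do not support fractional shares.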

Valuation

Nvidia trades at a forward price-to-earnings (P/E) ratio of 36.7. While not exactly cheap, that multiple is still well below the company's five-year average forward P/E of 82.5.
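The forward P/E is simply the current share price divided by the consensus earnings-per-share estimate for the next 12 months. The figures below are hypothetical, picked only to reproduce a multiple near the 36.7 quoted above; they are not actual price or analyst estimates:

```python
# Forward P/E = current share price / forward (next-12-month) EPS estimate.
# Inputs are hypothetical, used only to illustrate the calculation.
def forward_pe(price: float, forward_eps: float) -> float:
    return price / forward_eps

price = 1100.0        # hypothetical share price
forward_eps = 30.0    # hypothetical forward EPS estimate
print(round(forward_pe(price, forward_eps), 1))  # 36.7
```

Because the denominator is an estimate of future earnings, a high-growth company can look far cheaper on a forward multiple than on a trailing one, which is why the comparison to the five-year average uses forward figures on both sides.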

Nvidia is a major disruptive force in the still nascent AI industry. The company is positioned to grow in tandem with the AI industry and can prove to be a compelling investment even at the current elevated share price.