Shares of semiconductor giant Nvidia (NASDAQ: NVDA) have gained nearly 217% over the last year. Undoubtedly, the rapid advancement and adoption of generative artificial intelligence (AI) applications and large language models have been the key demand drivers for its AI-capable chips and systems. The graphics processing unit (GPU) leader has emerged as both an enabler and a major beneficiary of the ongoing generative AI revolution.
Nvidia posted a strong performance in its fiscal 2025 first quarter, which ended April 28: Revenue and earnings soared year over year by 262% and 690%, respectively. For the fiscal year, which will end Jan. 31, analysts expect its revenue to grow by 97% to $120 billion and earnings per share (EPS) to rise by 109% to $2.71.
Beyond that exceptional short-term outlook, there are also at least three major reasons to expect Nvidia will grow significantly in the long run.
A dominant accelerated computing player
Nvidia’s data center business revenue soared by a jaw-dropping 427% year over year to $22.6 billion in fiscal Q1. That segment accounted for 87% of the company’s total revenue and will play a critical role in its future growth story.
Hyperscalers (large cloud infrastructure providers), enterprises across verticals, and sovereign entities worldwide are upgrading trillions of dollars’ worth of installed data center infrastructure, built around CPUs and so-called dumb NICs (network interface cards), with accelerated computing hardware. This infrastructure has become critical for training and running inference on large language models and other generative AI applications. Nvidia also expects customers to upgrade their existing accelerated computing infrastructure from current Hopper architecture H100 chips to the newer Hopper-based H200 chips and next-generation Blackwell architecture chips.
The economics are highly appealing for clients, especially cloud service providers. During the most recent earnings call, an Nvidia executive asserted that “for every $1 spent on NVIDIA AI infrastructure, cloud providers have an opportunity to earn $5 in GPU instance hosting revenue over four years.”
Demand for Nvidia’s AI GPUs is far outpacing supply, even though the company has been expanding production capacity for chips like the H100 and the Grace Hopper Superchip. It expects supply of the newer H200 and Blackwell chips to fall short of demand into next year. That should allow Nvidia to retain pricing power despite increasing competition in this niche of the chip industry.
Besides its AI GPUs, Nvidia has also introduced the Grace Hopper Superchip (CPU + GPU), Blackwell architecture chips, AI-optimized Spectrum-X Ethernet networking, and Nvidia AI Enterprise software. These products help drive performance gains and lower users’ costs when training and running AI applications.
According to Nvidia CEO Jensen Huang, AI is enabling the $3 trillion information technology industry to build tools that can serve nearly $100 trillion worth of industries. Against this backdrop of solid growth, commitment to innovation, and rapidly expanding market opportunities, the company’s forward price-to-earnings (P/E) multiple of 33.93 looks justified, even if it is not cheap.
Full-stack AI platform
Nvidia has evolved from a chip supplier into a “full stack” AI platform provider. The company provides hardware such as GPUs, DPUs (data processing units), and CPUs; a complete software stack (CUDA, AI Enterprise software, inference microservices, Omniverse); high-speed networking components (InfiniBand, Ethernet); and servers to build “AI factories” that generate multimodal outputs (AI tokens), including text, images, audio, and video. AI factories refer to the essential infrastructure clients build for AI production. In its fiscal first quarter, Nvidia worked with more than 100 customers to build AI factories ranging in size from hundreds of GPUs to 100,000 GPUs.
Nvidia’s GPUs and the supporting Compute Unified Device Architecture (CUDA) software stack, an AI-optimized parallel programming platform for the company’s hardware portfolio, have been pivotal in multiple AI breakthroughs, including transformer models, unsupervised learning, and foundation models like GPT-4 and Meta Platforms’ Llama. In its efforts to stay ahead of the competition, the company has accelerated the release cadence of its products and major features from once every two years to once a year. Nvidia has also built a large ecosystem of partners that includes technology titans, AI start-ups, and every major cloud service provider.
All these factors have enabled Nvidia to build a solid competitive moat in the burgeoning AI space.
Expanding addressable market
Nvidia is also leveraging its AI platform to expand its addressable market in areas such as “sovereign AI,” the automotive industry, and physical AI.
Nvidia sees sovereign AI as a major growth opportunity since countries worldwide are building out their domestic AI capabilities. The company partners with governments and local players to provide end-to-end AI infrastructure. Management expects sovereign AI’s contribution to Nvidia’s revenue to grow from nothing in fiscal 2024 to a figure in the high-single-digit billions in fiscal 2025.
Nvidia’s Drive platform, which integrates hardware and software solutions to provide computing power, AI technologies, and software frameworks for autonomous vehicles and advanced driver-assistance systems, is also seeing robust demand.
Nvidia also expects physical AI — i.e., AI-enabled robots — to be a major long-term growth driver. The company is creating end-to-end robotics platforms for factories and warehouses as well as humanoid robots.
Although Nvidia’s share price is near its all-time high, the growth drivers discussed above make a strong case for investors to pick up shares of this blockbuster stock now.
Should you invest $1,000 in Nvidia right now?
Before you buy stock in Nvidia, consider this:
The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Nvidia wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.
Consider when Nvidia made this list on April 15, 2005… if you invested $1,000 at the time of our recommendation, you’d have $808,105!*
Stock Advisor provides investors with an easy-to-follow blueprint for success, including guidance on building a portfolio, regular updates from analysts, and two new stock picks each month. The Stock Advisor service has more than quadrupled the return of the S&P 500 since 2002*.
*Stock Advisor returns as of June 10, 2024
Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Manali Pradhan has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Meta Platforms and Nvidia. The Motley Fool has a disclosure policy.