Nvidia (NASDAQ: NVDA) is the dominant force in the market for artificial intelligence (AI) graphics cards, with an estimated 92% market share. That dominance is why the company has been growing at a terrific pace, with the stock up 230% over the past year.
However, the company faces a potential threat from some tech giants. Let's examine that threat, and then look at how Nvidia reportedly plans to counter it.
Custom AI chips pose a threat to Nvidia
While companies including Microsoft, Amazon, Meta Platforms, and Alphabet are Nvidia’s customers and have spent billions of dollars on its graphics processing units (GPUs), it’s no secret they have been developing custom AI chips to reduce their reliance on the graphics-card specialist.
Alphabet, for instance, has deployed custom AI accelerators known as tensor processing units (TPUs) in Google Cloud to “scale cost-efficiently for a wide range of AI workloads, spanning training, fine-tuning, and inference.” Similarly, Meta Platforms is expected to deploy a new custom AI chip this year in a bid to reduce its dependence on Nvidia.
Microsoft has built custom AI chips for deployment in its Azure data centers, and they’re expected to hit the market this year. Meanwhile, Amazon revealed its own chips for training AI models in November, making them available for use by Amazon Web Services (AWS) cloud customers.
There are two reasons these tech giants have been developing chips in-house. First, Nvidia has been unable to keep up with the massive demand for its AI GPUs. The waiting period for the company’s flagship H100 AI graphics card can reportedly stretch up to a year.
Nvidia is trying its best to increase the supply of its graphics cards with the help of its foundry partners, but customers may not be comfortable waiting that long to get their hands on these chips.
Second, Nvidia's AI GPUs are very expensive. The H100 processor reportedly carries a price tag between $30,000 and $40,000. However, investment banking firm Raymond James estimates that it costs Nvidia just over $3,300 to manufacture one H100 GPU, roughly a tenth of its selling price, which points to the immense pricing power the company enjoys in this market.
So it’s not surprising to see tech giants looking to cut down on such massive spending by developing custom chips internally to tackle specific AI workloads for which an H100 may not be required.
Formally known as application-specific integrated circuits (ASICs), these custom chips are dedicated entirely to performing specific operations rapidly while being energy efficient. Semiconductor research group SemiAnalysis reportedly estimates that a successfully developed custom AI chip could help Nvidia’s customers save hundreds of millions of dollars.
All this tells us why Nvidia may be looking to enter the custom AI chip market.
Nvidia wouldn’t want to let go of this potentially $55 billion revenue opportunity
Broadcom (NASDAQ: AVGO) and Marvell Technology (NASDAQ: MRVL) are two leading designers of ASICs, and both have seen a sharp jump in AI-related orders. Marvell, for instance, could generate $1 billion in revenue from selling custom AI chips this fiscal year, while Broadcom is expected to sell $8 billion to $9 billion worth of custom AI chips in 2024, according to one estimate. Together, the two companies control a 47% share of the ASIC market.
Investment banking firm Needham estimates that the overall custom chip market was worth roughly $30 billion last year. AI already commands a significant chunk of that space, with sales of high-end custom ASICs reportedly landing between $13 billion and $18 billion last year. Morgan Stanley predicts that ASICs could account for 30% of the $182 billion AI chip market by 2027, which works out to a potential revenue opportunity of roughly $55 billion.
A Feb. 9 Reuters exclusive reported that a former Marvell executive is heading Nvidia's custom chip division and that the GPU specialist has already held discussions with Amazon, Microsoft, Meta, OpenAI, and Google about making custom chips for them.
If the Reuters report about Nvidia entering the custom AI chip space turns out to be true, investors will have another solid reason to buy this fast-growing AI stock. It’s currently trading at an attractive 35 times forward earnings, a discount to its five-year average forward price-to-earnings ratio of 42.