Nvidia stock climbs as CFO says new chip to ship in 2024

By Max A. Cherney, Stephen Nellis and Medha Singh

SAN JOSE, California (Reuters) - Nvidia's stock climbed on Tuesday after the heavyweight chipmaker said its new flagship AI processor is expected to ship later this year and CEO Jensen Huang said he is chasing a data center market potentially worth more than $250 billion.

Nvidia (NASDAQ:NVDA)'s stock rose nearly 2% to $901 after Huang and Chief Financial Officer Colette Kress answered questions from investors at the company's annual developer conference in San Jose, California. The shares had dipped nearly 4% earlier in the day.

\"We think we\'re going to come to market later this year,\" Kress said, referring to the company\'s new AI chip, which the company debuted on Monday.

Huang estimated that companies operating data centers will spend more than $250 billion a year to upgrade them with accelerated computing components that Nvidia specializes in developing. He said that market was growing by as much as 25% a year.

Nvidia is shifting from selling single chips to selling total systems, potentially winning a larger chunk of spending within data centers.

\"Nvidia doesn\'t build chips, it builds data centers,\" Huang said.

Called Blackwell, Nvidia's new processor combines two squares of silicon, each the size of the company's previous offering. Nvidia also detailed a new set of software tools to help developers sell artificial-intelligence models more easily to firms that use its technology.

Nvidia is working with contract chip manufacturer TSMC to avoid the chip-packaging bottlenecks that slowed shipments of its previous flagship AI processor, Huang said.

\"The volume ramp in demand happened fairly sharply last time, but this time, we\'ve had plenty of visibility\" into demand for Blackwell chips, Huang said.

Some analysts said Wall Street has already factored in the debut of the B200 Blackwell chip, which the company claims is 30 times faster at some tasks than its predecessor.

The Blackwell chip will be priced between $30,000 and $40,000, Huang told CNBC.

Huang later clarified that comment, saying Nvidia will include its new chip in larger computing systems and that prices will vary based on how much value they provide.

\"The Blackwell technology shows a significant performance uplift compared to Hopper (the current flagship chip) but it\'s always hard to live up to the hype,\" said David Wagner, portfolio manager at Aptus Capital Advisors.

In a discussion about Nvidia's cooperation with South Korean chipmakers, Huang said Nvidia is qualifying Samsung Electronics (KS:005930)'s high bandwidth memory (HBM) chips.

Reuters reported last week that Samsung's HBM3 chips have not yet passed Nvidia's qualification for supply deals.

Samsung's cross-town rival SK Hynix on Tuesday said it has begun mass production of next-generation HBM3E chips, with sources saying initial shipments will go to Nvidia this month.

At the center of Wall Street's AI euphoria, Nvidia's stock has more than tripled over the past 12 months, making it the U.S. stock market's third-most valuable company, behind only Microsoft (NASDAQ:MSFT) and Apple (NASDAQ:AAPL).

Even after that meteoric rally, Nvidia is trading at about 35 times its expected earnings, cheap compared with its price-to-earnings (PE) ratio of 58 a year ago, according to LSEG data.

That decline in Nvidia's PE valuation is the result of analysts sharply raising their estimates of the chipmaker's future earnings; if those forecasts turn out to be too optimistic, the stock risks plummeting back to earth.
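
To see why the multiple can fall even as the price climbs, here is a minimal back-of-the-envelope sketch in Python. It uses only the figures cited above (a PE of 58 a year ago, about 35 now, and a price that roughly tripled); the exact 3x price multiple is an illustrative assumption, not a reported figure.

```python
# Sketch of how a PE ratio can fall while the share price rises,
# using the figures cited in the article; the 3x price multiple is illustrative.
pe_then, pe_now = 58, 35      # forward price-to-earnings ratios, then vs. now
price_multiple = 3            # "more than tripled" over 12 months (assumed 3x)

# PE = price / expected earnings, so expected earnings = price / PE.
# Implied ratio of current to year-ago earnings estimates:
earnings_growth = price_multiple * pe_then / pe_now

print(f"Implied rise in forward earnings estimates: ~{earnings_growth:.1f}x")
# ~5.0x: the stock looks "cheaper" only because forecasts rose far faster than the price.
```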

Nvidia expects major customers including Amazon.com (NASDAQ:AMZN), Alphabet (NASDAQ:GOOGL)'s Google, Meta Platforms (NASDAQ:META), Microsoft, OpenAI and Tesla (NASDAQ:TSLA) to use its new chip.

Its hardware products will likely remain "best-of-breed" in the AI industry, Morningstar analysts said, lifting their estimates for Nvidia data-center revenue for 2026 and 2028.

\"We remain impressed with Nvidia\'s ability to elbow into additional hardware, software, and networking products and platforms,\" they said.

The software push shows how Nvidia, whose chips are mostly used to train large-language models like Google's Gemini, is trying to make its hardware easier to adapt for companies rushing to integrate generative AI into their businesses.

Many analysts expect Nvidia\'s market share to drop several percentage points this year, as competitors launch new products and the company\'s largest customers make their own chips, although its dominance is expected to remain unchallenged.
