Meta founder and CEO Mark Zuckerberg speaks during the Meta Connect event at Meta headquarters in Menlo Park, California, on Sept. 27, 2023.

Josh Edelson | AFP | Getty Images


Meta is spending billions of dollars on Nvidia’s popular computer chips, which are at the heart of artificial intelligence research and projects.

In an Instagram Reels post on Thursday, Zuckerberg said the company’s “future roadmap” for AI requires it to build a “massive compute infrastructure.” By the end of 2024, he said, that infrastructure will include 350,000 of Nvidia’s H100 graphics cards.

Zuckerberg didn’t say how many of the graphics processing units (GPUs) the company has already purchased, but the H100 didn’t hit the market until late 2022, and even then it was in limited supply. Analysts at Raymond James estimate Nvidia is selling the H100 for $25,000 to $30,000, and on eBay the chips can cost more than $40,000. If Meta were paying at the low end of that price range, the purchases would amount to close to $9 billion.
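As a rough back-of-the-envelope sketch, assuming only the figures cited above (350,000 cards at the analysts’ estimated $25,000 to $30,000 apiece, which are estimates rather than confirmed purchase prices), the arithmetic works out as follows:

# Back-of-the-envelope estimate using only the figures cited above.
# Prices are Raymond James analyst estimates, not confirmed Meta purchase prices.
h100_count = 350_000                     # H100s Zuckerberg says Meta will have by end of 2024
price_low, price_high = 25_000, 30_000   # estimated per-unit price range, in dollars

low_total = h100_count * price_low       # 350,000 * $25,000 = $8.75 billion
high_total = h100_count * price_high     # 350,000 * $30,000 = $10.5 billion

print(f"Low estimate:  ${low_total / 1e9:.2f} billion")   # ~$8.75 billion, i.e. "close to $9 billion"
print(f"High estimate: ${high_total / 1e9:.2f} billion")  # ~$10.50 billion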

Additionally, Zuckerberg said Meta’s compute infrastructure will contain “almost 600k H100 equivalents of compute if you include other GPUs.” In December, tech companies like Meta, OpenAI and Microsoft said they would use the new Instinct MI300X AI computer chips from AMD.

Meta needs these heavy-duty computer chips as it pursues research in artificial general intelligence (AGI), which Zuckerberg said is a “long term vision” for the company. OpenAI and Google’s DeepMind unit are also researching AGI, a futuristic form of AI that’s comparable to human-level intelligence.

Meta’s chief AI scientist, Yann LeCun, stressed the importance of GPUs during a media event in San Francisco last month.

“[If] you think AGI is in, the more GPUs you have to buy,” LeCun said at the time. Regarding Nvidia CEO Jensen Huang, LeCun said, “There is an AI war, and he’s supplying the weapons.”

In its third-quarter earnings report, Meta said total expenses for 2024 would be in the range of $94 billion to $99 billion, driven in part by its computing expansion.

“In terms of investment priorities, AI will be our biggest investment area in 2024, both in engineering and computer resources,” Zuckerberg said on the call with analysts.

Zuckerberg said on Thursday that Meta plans to “open source responsibly” its yet-to-be-developed “general intelligence,” an approach the company is also taking with its Llama family of large language models.

Meta is currently training Llama 3 and is making its Fundamental AI Research (FAIR) team and its GenAI research team work more closely together, Zuckerberg said.

Shortly after Zuckerberg’s post, LeCun said in a post on X that “To accelerate progress, FAIR is now a sister organization of GenAI, the AI product division.”

— CNBC’s Kif Leswing contributed to this report

WATCH: The AI dark horse: Why Apple could win the next evolution of the AI arms race
