In an Instagram Reels post on January 18, Meta CEO Mark Zuckerberg said the company’s “future roadmap” for AI requires building “a massive computing infrastructure” that will include 350,000 Nvidia H100 GPUs by the end of 2024.
Zuckerberg did not say how many GPUs the company has bought so far, but the H100 hit the market in late 2022 and supply remains extremely limited. Analysts at Raymond James estimate Nvidia sells the H100 for $25,000 to $30,000, while on eBay the chip can fetch more than $40,000. Even if Meta pays at the lower end of that range, the cost would be close to $9 billion.
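A quick back-of-envelope check of those figures, using only the numbers reported in the article (the per-unit prices are the Raymond James estimates, not confirmed Meta pricing):

```python
# Rough cost estimate for Meta's stated H100 target, based on figures in the article.
h100_count = 350_000                     # GPUs Zuckerberg cited for end of 2024
price_low, price_high = 25_000, 30_000   # Raymond James per-unit estimate (USD)

cost_low = h100_count * price_low        # lower-bound total spend
cost_high = h100_count * price_high      # upper-bound total spend

print(f"Estimated H100 spend: ${cost_low / 1e9:.2f}B to ${cost_high / 1e9:.2f}B")
# The lower bound works out to $8.75 billion, i.e. "close to $9 billion".
```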
Zuckerberg also revealed that, counting other GPUs, the infrastructure’s total computing power will be equivalent to roughly 600,000 H100s. In December 2023, Meta, OpenAI, and Microsoft said they would use AMD’s new Instinct MI300X AI chip.
Meta needs such heavy-duty chips because it is pursuing artificial general intelligence (AGI), which Zuckerberg calls a “long-term vision” for the company. OpenAI and Google DeepMind are also working on AGI, a hypothetical form of AI on par with human intelligence.
Meta’s chief AI scientist, Yann LeCun, stressed the importance of GPUs at an event in San Francisco last month, saying that anyone who wants AGI has to buy more GPUs. “There is an AI war going on and he [Nvidia CEO Jensen Huang] is providing the weapons.”
In its third-quarter 2023 earnings report, Meta said total expenses in 2024 would range from $94 billion to $99 billion, partly to expand its computing capacity. On a call with analysts, the company’s chief executive also confirmed that “in terms of investment priorities, AI will be the largest investment area in 2024, both in terms of technology and computing resources.”
Also on January 18, Zuckerberg said he plans to “responsibly” open source the AGI the company is developing, similar to its approach with the Llama family of large language models. Meta is currently training Llama 3 and is bringing its Fundamental AI Research (FAIR) and GenAI research groups closer together.
Shortly after Zuckerberg’s post, LeCun wrote on X: “To expedite the process, FAIR is now a sister organization to GenAI, the AI product division.”
(According to CNBC)