Meta Begins Testing Its First In-House AI Training Chip

Báo Giao thông, 12/03/2025

Meta, the company that owns Facebook, is testing its first in-house chip for training artificial intelligence (AI) systems, a milestone in its quest to design more custom chips and reduce its reliance on suppliers like Nvidia.


An estimated investment budget of up to $119 billion

Meta, the world's largest social media company, has begun small-scale testing of the chip and plans to ramp up production for widespread use if the trial is successful.

Photo 1: Meta, the company that owns Facebook, is testing its first in-house chip for training artificial intelligence (AI) systems.

The push to develop in-house chips is part of Meta's long-term plan to reduce its massive infrastructure costs, as the company bets big on AI tools to fuel growth.

Meta, which also owns Instagram and WhatsApp, forecasts total spending in 2025 to range from $114 billion to $119 billion, including up to $65 billion in capital spending, largely driven by investments in AI infrastructure.

Meta's new training chip is a dedicated accelerator, meaning it's designed to handle specific AI tasks only, a source said. This makes it potentially more energy efficient than the integrated graphics processing units (GPUs) typically used for AI workloads.

According to the source, Meta is working with TSMC, the world's largest contract chipmaker, to produce the chip.

The test deployment began after Meta completed the chip’s first “tape-out,” a major milestone in silicon chip development that involves sending the initial design through a chip manufacturing plant. A typical tape-out process costs tens of millions of dollars and takes about three to six months to complete, with no guarantee that the test will be successful. If it fails, Meta will need to diagnose the problem and repeat the tape-out step.

The chip is the latest in the company’s Meta Training and Inference Accelerator (MTIA) line, a program that has had a rocky start over the years and has seen a chip canceled at a similar stage of development.

Last year, however, Meta began using an MTIA chip to perform inference—the process of running an AI system as users interact with it—for recommendation systems that decide what content appears in Facebook and Instagram news feeds.

Meta plans to use in-house training chips by 2026

Meta executives say they want to start using in-house chips by 2026 for training, which is the computationally intensive process of feeding an AI system massive amounts of data to “teach” it how to operate.

Photo 2: Meta executives say they want to start using in-house chips for training by 2026.

As with the inference chip, the goal of the training chip is to start with recommender systems and then use it for generative AI products like the Meta AI chatbot, executives said. “We’re looking at how we do training for recommender systems, and then how we think about training and inference for generative AI,” Meta Chief Product Officer Chris Cox said at Morgan Stanley’s technology, media, and telecommunications conference last week.

Mr. Cox described Meta's chip development efforts as "a walk, crawl, then run situation" so far, but said executives consider the first-generation inference chip for recommender systems "a huge success."

Meta previously canceled an in-house custom inference chip after it failed in a small-scale test deployment similar to the one now under way for the training chip, and instead placed orders for billions of dollars' worth of Nvidia GPUs in 2022.

The social media company has remained one of Nvidia’s biggest customers ever since, amassing a fleet of GPUs to train its models, including its recommendation and advertising systems and its Llama family of foundation models. The GPUs also perform inference for the more than 3 billion people who use its apps every day.

The value of those GPUs has come into question this year as AI researchers have grown increasingly skeptical about how much further progress can be made by continuing to “scale up” large language models by adding more data and computing power.

These doubts were reinforced by the late January launch of new low-cost models from Chinese startup DeepSeek, which optimize computational efficiency by relying more heavily on inference than most current models.

Nvidia shares lost as much as a fifth of their value at one point during a global sell-off in AI stocks sparked by DeepSeek. They have since recovered much of their losses as investors bet the company’s chips would remain the industry standard for training and inference, though they have since fallen back on broader trade concerns.



Source: https://www.baogiaothong.vn/meta-bat-dau-thu-nghiem-chip-dao-tao-ai-noi-bo-dau-tien-192250312120123752.htm
