You are viewing a single comment's thread from:

RE: LeoThread 2025-03-11 12:28

Meta previously pulled the plug on an in-house custom inference chip after it flopped in a small-scale test deployment similar to the one it is now running for the training chip, reversing course and placing orders for billions of dollars' worth of Nvidia GPUs in 2022.

The social media company has remained one of Nvidia's biggest customers since then, amassing an arsenal of GPUs to train its models, including its recommendation and ads systems and its Llama foundation model series. The units also perform inference for the more than 3 billion people who use its apps each day.

The value of those GPUs has been thrown into question this year as AI researchers increasingly express doubts about how much more progress can be made by continuing to "scale up" large language models by adding ever more data and computing power.