RE: LeoThread 2025-01-03 09:38

This could be evidence of the limitations of current AI scaling laws — the methods companies are using to increase the capabilities of their models. In the not-too-distant past, it was possible to achieve substantial performance boosts by training models using massive amounts of computing power and larger and larger data sets. But the gains with each generation of model have begun to shrink, leading companies to pursue alternative techniques.
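The "shrinking gains" claim can be sketched with a toy power law. The constants below are made up for illustration (they are not xAI's or anyone's published numbers); the point is only that under a Chinchilla-style scaling law, each 10x jump in compute cuts loss by a constant factor, so the absolute improvement per jump keeps getting smaller:

```python
# Hypothetical power-law scaling sketch: loss L(C) = a * C**(-b).
# a and b are made-up constants chosen purely for illustration.
a, b = 10.0, 0.05

def loss(compute: float) -> float:
    """Model loss as a function of training compute C (in FLOPs)."""
    return a * compute ** (-b)

# Walk compute budgets from 1e20 to 1e25 FLOPs, each step a 10x scale-up.
for exp in range(20, 26):
    c = 10.0 ** exp
    gain = loss(c) - loss(10 * c)  # absolute loss reduction from 10x compute
    print(f"C=1e{exp}: loss={loss(c):.3f}, gain from next 10x={gain:.4f}")
```

Each 10x scale-up buys a smaller absolute loss reduction than the last, which is one way to read the diminishing-returns argument above.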

Grok 3 is training with 10X, soon 20X the compute of Grok 2

— Elon Musk (@elonmusk) September 21, 2024