RE: LeoThread 2025-03-18 17:33

in LeoFinance · 3 days ago

Part 8/10:

Considering how such data scales relate to Artificial General Intelligence (AGI), Suver raises an intriguing point: reaching zettabyte-scale models, however ambitious, may not be a strict prerequisite for meaningful forms of intelligence. Existing LLMs already show remarkable competence on specific tasks, suggesting possible paths toward AGI that bypass the sheer volume of data traditionally assumed necessary.

Scaling reasoning capability independently of data collection is another promising strategy. Combining the two could yield smarter models at the ten-trillion or even hundred-trillion token scale, pairing broader data coverage with stronger reasoning for superior output.