RE: LeoThread 2025-02-21 10:06

Distillation is the process of extracting knowledge from a large AI model to train a smaller one. It can let a small team with comparatively modest resources build an advanced model.

A leading tech company spends years and millions of dollars developing a top-tier model from scratch. Then a smaller team such as DeepSeek trains its own, more specialized model by posing questions to the larger "teacher" model and learning from its answers rather than from raw training data. The result is a new "student" model that is nearly as capable as the big company's model but far quicker and cheaper to train, as the sketch below illustrates.
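For the technically curious, here is a minimal sketch of the core idea in PyTorch (my choice of framework, not necessarily what DeepSeek uses). The student is trained to match the teacher's output distribution using the classic soft-target loss from Hinton et al.; the tiny models, temperature, and weighting values are purely illustrative.

```python
# Minimal knowledge-distillation sketch (illustrative, not DeepSeek's actual setup).
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft-target loss (match the teacher's output distribution)
    with the ordinary hard-label cross-entropy loss."""
    # Soft targets: compare the two distributions softened at temperature T.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # scale to keep gradient magnitudes comparable
    # Hard targets: standard cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy teacher/student: the student is much smaller but learns to
# imitate the teacher's outputs.
teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))

x = torch.randn(8, 32)               # a batch of "questions" (inputs)
labels = torch.randint(0, 10, (8,))  # ground-truth answers

with torch.no_grad():                # the teacher only answers; it isn't trained
    t_logits = teacher(x)
s_logits = student(x)

loss = distillation_loss(s_logits, t_logits, labels)
loss.backward()                      # gradients flow only into the student
```

The temperature T softens both distributions so the student learns from the teacher's relative confidences across all answers, not just its single top answer, which is a big part of why distillation transfers knowledge so efficiently.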