RE: LeoThread 2025-03-06 10:50

Alibaba's QwQ-32B operates with 32 billion parameters, whereas DeepSeek's model has 671 billion parameters, of which 37 billion are actively engaged during inference (the process of running live data through a trained AI model to generate a prediction or complete a task).

Parameters are the variables that large language models (LLMs), AI systems that can understand and generate human language, learn during training and then use for prediction and decision-making. A lower parameter count typically signals higher efficiency, which matters amid growing demand for optimized AI that consumes fewer resources.
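
To make the "parameters" figure concrete, here is a minimal sketch (assuming PyTorch, which the article does not mention) that counts the learned weights in a toy two-layer model; a 32-billion-parameter model like QwQ-32B is the same idea scaled up by several orders of magnitude.

```python
# Minimal sketch (assumes PyTorch is installed): a model's parameter count
# is just the total number of learned weights and biases across its layers.
# The toy layer sizes below are illustrative, not the real QwQ-32B architecture.
import torch.nn as nn

toy_model = nn.Sequential(
    nn.Linear(1024, 4096),  # weights: 1024 * 4096, biases: 4096
    nn.ReLU(),
    nn.Linear(4096, 1024),  # weights: 4096 * 1024, biases: 1024
)

total_params = sum(p.numel() for p in toy_model.parameters())
print(f"Toy model parameters: {total_params:,}")  # roughly 8.4 million
```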

Alibaba said its new model achieved "impressive results" and that the company can "continuously improve the performance especially in math and coding."