For its initial release, Abacus.ai has applied its so-called "Dracarys recipe" to the 70B-parameter class of models. The recipe combines an optimized fine-tuning process with other techniques.
"It's a combination of training dataset and fine-tuning techniques that improve the coding abilities of any open-source LLM," Bindu Reddy, CEO and co-founder of Abacus.ai, told VentureBeat. "We have demonstrated that it improves both Qwen-2 72B and Llama-3.1 70B."