Yeah, definitely, but what I'm probably more excited about, though, is not the model itself, but that the team is flexible and able to adapt LeoAI almost on the spot. There will always be a better model out tomorrow.
That is true, although some are thinking that Llama is going to be the open-source standard.
Do you agree with that?
Everything is pointing in that direction. Who else can afford tens of thousands of $3k graphics cards running simultaneously to train the models? (Don't cite me on that exact number, but it was approximately what I heard in one of Matthew's videos.)
It is shaping up to be very interesting. We are going to see a massive number of rollouts in the next year. Hopefully #leoai incorporates as much as it can.
That is where huge value can be quickly derived.