I spent a lot of time yesterday and today installing and playing with Llama 3.2's 1B model on my local machine. So far, it's a lot worse than ChatGPT, but that's expected. My PC isn't strong, either.
I'm testing the limits of what my PC can handle. I just increased the context length to 16,000 tokens, and my PC hasn't crashed yet.
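For anyone curious, here's a minimal sketch of how a 16,000-token context can be set when loading the model. I'm assuming a GGUF quantization of the 1B model run through llama-cpp-python (one common way to run Llama locally; I'm not saying it's the setup used here), and the model filename is made up:

```python
# Minimal sketch: loading Llama 3.2 1B locally with a 16,000-token context.
# Assumes llama-cpp-python is installed and a GGUF file is on disk;
# the filename below is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="Llama-3.2-1B-Instruct-Q4_K_M.gguf",  # hypothetical local file
    n_ctx=16000,  # context window: 16,000 tokens
)

out = llm("Summarize what a context window is in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```

Bigger `n_ctx` values mostly cost RAM for the KV cache, which is why pushing the context length is a good stress test for a weak PC.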
https://inleo.io/threads/view/ahmadmanga/re-leothreads-lv3eqtg3?referral=ahmadmanga
https://inleo.io/threads/view/ahmadmanga/re-leothreads-2wag6bjpn?referral=ahmadmanga