That is certainly productive.
What happens there is twofold:
- more data is generated by LeoAI, feeding more into the model
- the synthetic data (generated by AI) is engaged with by humans, increasing the value of the synthetic data as comments/replies take place.
Maybe @khaleelkazi has already thought about this, or should I send him a feedback notice?
Was just thinking about this because I did exactly that as a test on an article, and it worked well.
I am not following.
Back to the point, the key is, as always, data. The more that is generated, the better LeoAI will function. Obviously, human content is required. However, synthetic data is important too.
When Leo's chatbot rolls out, we have to be generating at least 10 million tokens per day.
Well, I'm looking into whether I could build something that uses ChatGPT or a similar model to generate an answer until we have LeoAI.
@mightpossibly could be the one to ask that to. He seems to know a great deal about it.
But there are ways to do things like that.
So basically, @anderssinho, a bot that automatically generates answers with ChatGPT until LeoAI rolls out?
That's a cool idea! And it shouldn't be too complicated to create nor costly to run. The LLM cost would probably be so low that even if the bot posted a thousand questions a day, it wouldn't spend more than a few cents' worth of tokens (a small model would be plenty for this, like gpt-4o-mini or claude-3.5-haiku).
I'd be happy to help you get started if you want to try and build it yourself.
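For what it's worth, the core of such a bot is only a few lines. Here's a minimal sketch assuming the OpenAI Python SDK (`pip install openai`) with gpt-4o-mini; the system prompt and the per-million-token prices in the cost helper are illustrative assumptions, not Leo's actual setup:

```python
def build_messages(question: str) -> list[dict]:
    """Wrap a community question in a simple chat prompt."""
    return [
        {"role": "system",
         "content": "You answer community questions concisely."},  # assumed prompt
        {"role": "user", "content": question},
    ]

def estimate_cost_usd(prompt_tokens: int, completion_tokens: int,
                      in_per_m: float = 0.15, out_per_m: float = 0.60) -> float:
    """Rough cost estimate using illustrative small-model prices
    (USD per million input/output tokens)."""
    return prompt_tokens / 1e6 * in_per_m + completion_tokens / 1e6 * out_per_m

def answer(question: str) -> str:
    """Generate one reply. Requires OPENAI_API_KEY in the environment;
    imported lazily so the rest of the sketch works without the SDK."""
    from openai import OpenAI
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # small model, as suggested above
        messages=build_messages(question),
    )
    return resp.choices[0].message.content

# A thousand short Q&As a day (~200 input / ~300 output tokens each)
# stays in the cents range under these assumed prices:
daily_cost = estimate_cost_usd(1000 * 200, 1000 * 300)
```

Posting the reply back on-chain would be a separate step (e.g. via a Hive library), and you'd want rate limiting and a filter for which posts to answer, but the generation side really is this small.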