It's comparable to what you've been talking a lot about lately. As you know, current LLMs are trained on "everything", i.e. they're not specialized. That's what LeoAI will bring to the table: still trained on everything, but additionally specialized on Hive data.
This example is basically the same concept, but for coding. It lets you locally fine-tune your programming environment so it has access to up-to-date documentation for specific software libraries. That's very useful when you're building Hive applications, for instance. Sure, most LLMs can write Hive applications without it, but specific functions (and especially new ones) are often not part of the training data, which makes it very difficult to get working code. That's the major hurdle this prompting tip solves.
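For anyone curious what that looks like in practice, here's a rough sketch of the idea: keep the latest docs for the library you're using in a local file and feed them into the model's context before asking it to write code. The names here (`hive_docs.md`, `ask_llm`) are just placeholders, not part of any specific tool.

```python
from pathlib import Path

def build_prompt(task: str, docs_path: str) -> str:
    """Prepend up-to-date library documentation to the coding request,
    so the model can use functions that weren't in its training data."""
    docs = Path(docs_path).read_text(encoding="utf-8")
    return (
        "You are helping write a Hive application.\n"
        "Use only the API documented below; it may be newer than your training data.\n\n"
        f"--- LIBRARY DOCUMENTATION ---\n{docs}\n--- END DOCUMENTATION ---\n\n"
        f"Task: {task}"
    )

# 'hive_docs.md' is a local file you keep updated yourself, and ask_llm()
# stands in for whichever model or endpoint you actually use.
prompt = build_prompt(
    task="Write a function that broadcasts a custom_json operation.",
    docs_path="hive_docs.md",
)
# response = ask_llm(prompt)
```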
Ah okay. Got it now. That's something very cool then. Honing the direction and spreading the development out towards the edges.
Very cool indeed