AI is mining the sum of human knowledge from Wikipedia. What does that mean for its future?
Ever notice how more people are relying on AI for answers instead of going straight to the source? It’s becoming a thing, and while AI is great for quick info, it’s causing a bit of a ripple effect. For instance, Wikipedia, our go-to for crowd-sourced knowledge, might feel the impact in the long run: if fewer people visit the site directly, fewer people contribute. So far, Wikipedia hasn’t taken a hit in traffic, but there’s a bigger issue at play: AI tools don’t always give proper credit or link back to original sources. That could open the floodgates to a wave of misinformation, which is pretty concerning.
Actually, that isn’t how it works. This is why Friday’s Lions Den will be about LeoAI.
Generative AI is not a mirror of what it is trained on, which is why it is not exactly a replacement for search. The model is trained on the data and then weighs probabilities for the next token; that is how it comes up with its answers.
Think of it as glorified autofill.
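To make the “glorified autofill” point concrete, here is a toy sketch of what “weighing probabilities on the next token” means. This is just bigram counting over a made-up twelve-word corpus, nothing remotely like a real transformer, but the core move is the same: pick the next token in proportion to learned statistics, not by looking anything up.

```python
import random
from collections import defaultdict

# A tiny made-up corpus (illustration only).
corpus = "the cat sat on the mat the cat ran on the rug".split()

# "Training": count how often each token follows each other token.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_token(prev):
    """Sample a next token in proportion to how often it followed `prev`."""
    options = counts[prev]
    return random.choices(list(options), weights=list(options.values()), k=1)[0]

# In this corpus "the" is followed by cat (2x), mat (1x), rug (1x),
# so next_token("the") answers "cat" about half the time --
# a plausible continuation, not a retrieved fact.
```

Note there is no source document anywhere in that loop at generation time, which is the whole point of the thread: the output is a probability-weighted continuation, so credit and links back to the original material are not built in.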
However it comes up with the answers, people do seem to use it as a replacement for search and rely on these “answers”.
Well, people are confused about what it is. That is like using an image generator and expecting a film to come out of it.
The chatbots are not a direct replacement for search, at least not at this point. It is like talking to a very knowledgeable friend. Of course, the more interaction, the more the bot learns what you are looking for.
This is the problem. People get upset about AI when they have the ability to help train it but will not.