The Way Web 3.0 Can Compete With Centralized AI


At first glance, the battle seems lost.

Big Tech is completely dominating AI. The money required to train an LLM is absurd; we are looking at tens of billions of dollars. Hardware is getting more expensive since ever more powerful systems are needed, and data is getting harder to come by.

The combination means that only a few major corporations can play in this realm. We know the names. In the West we are dealing with Meta, Google, Amazon, xAI (Elon Musk), and OpenAI (backed by Microsoft).

How are startups, which is effectively what Web 3.0 projects are, going to compete? Do we simply hand the crown to Big Tech and try to survive under its domain?

We can find the answer to these questions in the nature of digital technology. This is where Web 3.0 can focus its attention in an effort to put up some competition.


Image generated by Ideogram

The Way Web 3.0 Can Compete With Centralized AI

Big Tech's advantage is size.

Obviously, when discussing large language models, we are looking at training runs measured in tens of trillions of tokens. The latest version of Llama was trained on roughly 15 trillion tokens, and it is a foregone conclusion that Llama 4.0 will use a much higher number.

Again, the question of how to compete with that comes up. To start, Web 3.0 doesn't have the data. Second, the compute required for training at that scale is astounding.

Fortunately, we have Moore's Law on steroids. In the AI world, compute is quickly outdated, and companies such as Meta and Google are constantly upgrading their hardware. Simply follow what NVIDIA is doing: it is rolling out the Blackwell series, which makes the earlier Hopper generation look obsolete.

At least this is the case for the larger companies.

For Web 3.0, that is the starting point. Since resources are limited, these older systems can offer an entryway. By using smaller datasets, more specific to the desired outcome, Web 3.0 companies can start in a manner that differentiates them from the traditional players.
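As a rough illustration of how modest a starting point this can be, here is a minimal sketch of fine-tuning a small open model on a niche dataset with parameter-efficient methods (LoRA), which fits on a single older GPU. The model name, dataset file, and hyperparameters are placeholders, not recommendations.

```python
# Minimal sketch: LoRA fine-tuning of a small open model on a niche corpus.
# Model/dataset names and hyperparameters are placeholders; the point is that
# this runs on a single, older GPU rather than a frontier-scale cluster.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model
from datasets import load_dataset

base = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"   # any small open model
tokenizer = AutoTokenizer.from_pretrained(base)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA trains a few million adapter weights instead of the full network.
lora = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                  target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# A small, niche-specific dataset: one JSON record per line with a "text" field.
data = load_dataset("json", data_files="niche_corpus.jsonl")["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

data = data.map(tokenize, batched=True, remove_columns=data.column_names)
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=2,
                           gradient_accumulation_steps=8, num_train_epochs=1,
                           learning_rate=2e-4, fp16=True),
    train_dataset=data,
    data_collator=collator,
)
trainer.train()
```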

In addition, when we look at inference, it can be pushed to the edge, since we are dealing with smaller models altogether.

Here the heavy lifting, i.e. the more complex queries, is left to the larger models.
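A hedged sketch of what that split could look like in practice: simple prompts are answered by a small model running at the edge, while heavier requests are forwarded to a larger hosted model. The model names, the endpoint URL, and the length-based routing heuristic are all assumptions for illustration.

```python
# Sketch of edge-first inference: answer locally with a small model when the
# prompt looks simple, hand the heavy lifting off to a larger remote model.
# Model names, the endpoint URL, and the heuristic are placeholders.
import requests
from transformers import pipeline

local_llm = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")
LARGE_MODEL_ENDPOINT = "https://example-inference-host/v1/generate"  # hypothetical

def looks_simple(prompt: str) -> bool:
    # Crude stand-in for a real router: short prompts stay on the edge device.
    return len(prompt.split()) < 40

def answer(prompt: str) -> str:
    if looks_simple(prompt):
        out = local_llm(prompt, max_new_tokens=128, do_sample=False)
        return out[0]["generated_text"]
    # Heavy lifting: forward to the larger hosted model.
    resp = requests.post(LARGE_MODEL_ENDPOINT, json={"prompt": prompt}, timeout=60)
    resp.raise_for_status()
    return resp.json()["text"]

print(answer("What is Web 3.0 in one sentence?"))
```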

In other words, we start to see niches forming.

Censorship

One only needs to use Google Gemini to see how these models are trained to censor ideas. It is also programmed with a political bias.

While Google stands out the most, we see this with all the major models. Part of this is due to the data they are trained upon. Nevertheless, we see an opportunity for Web 3.0.

One of the characteristics of blockchain is censorship resistance. Anything posted to chain cannot be altered, and people only need their keys to write to the database.

This can provide a powerful foundation for AI models. Using that data, along with algorithms that reduce the bias or censorship, we can see the opportunity for models with broader appeal. This is compounded by the fact that AI outputs could then be posted back to a blockchain.
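As a rough sketch of how on-chain content could feed such a model, here is how recent posts might be pulled from a Hive-style JSON-RPC node and written out as a training corpus. The endpoint and method reflect Hive's public condenser API as I understand it; treat the exact parameters and fields as assumptions.

```python
# Sketch: pull recent posts from a Hive-style JSON-RPC node and store them
# as a plain-text corpus for later model training. The method name reflects
# Hive's public condenser API; the parameters shown are illustrative.
import json
import requests

HIVE_NODE = "https://api.hive.blog"

def fetch_recent_posts(tag: str = "leofinance", limit: int = 20) -> list[dict]:
    payload = {
        "jsonrpc": "2.0",
        "method": "condenser_api.get_discussions_by_created",
        "params": [{"tag": tag, "limit": limit}],
        "id": 1,
    }
    resp = requests.post(HIVE_NODE, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()["result"]

# Write one JSON record per post; this file could feed a fine-tuning run.
with open("chain_corpus.jsonl", "w") as f:
    for post in fetch_recent_posts():
        record = {"author": post["author"], "title": post["title"], "text": post["body"]}
        f.write(json.dumps(record) + "\n")
```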

Decentralized inference is also going to be key. Not only does it cost a fortune to run inference, there will also be a bottleneck. Jensen Huang alluded to this in one of his talks; he expects a billionfold increase in inference demand over the next decade.

Each time someone prompts an AI system, inference is being used.

Web 3.0 systems can help address this by running smaller models that utilize edge compute. Cryptocurrency is ideal here because micropayments are required, and that is exactly what it handles well. It is the basis for a new Internet structure, a radical alteration of the old client-server architecture.
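In the simplest possible terms, a micropayment-metered edge node might look like the toy sketch below: each prompt debits a tiny prepaid balance before the local model responds. The price, the balance ledger, and the token unit are entirely hypothetical; real settlement would happen on-chain.

```python
# Toy sketch of micropayment-metered edge inference: every prompt debits a
# small prepaid balance before the local model answers. Prices, balances, and
# the token unit are hypothetical; a real system would settle on-chain.
from dataclasses import dataclass
from transformers import pipeline

PRICE_PER_PROMPT = 0.001  # hypothetical token units per request

@dataclass
class Account:
    address: str
    balance: float  # prepaid balance in the same hypothetical units

small_model = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")

def metered_answer(account: Account, prompt: str) -> str:
    if account.balance < PRICE_PER_PROMPT:
        raise RuntimeError("insufficient balance: top up before prompting")
    account.balance -= PRICE_PER_PROMPT   # debit; real settlement happens on-chain
    out = small_model(prompt, max_new_tokens=64, do_sample=False)
    return out[0]["generated_text"]

user = Account(address="0xEDGE-NODE-DEMO", balance=0.01)
print(metered_answer(user, "Summarize why edge inference matters."))
```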

Fill In The Gaps

There are plenty of niches that get overlooked. When we combine that with the fact that Big Tech codes biases into its models, we can see the opportunity.

Web 3.0 can start the process by filling in the gaps. It can also foster a lower cost, more competitive environment. This is where targeted design, across the entire spectrum, will take hold.

For too long, the idea of Web 3.0 was simply to replicate Web 2.0.

When it comes to AI, it is impossible to replicate what the major corporations are doing. That said, a more focused, streamlined approach can yield enormous benefits.

In other words, provide something that people cannot find elsewhere.

This is the path that Web 3.0 can follow. Integrating AI tools into the platforms is crucial. However, it has to move far beyond that.

Ultimately, the entire AI stack should be targeted. Over time, this can happen. Of course, going from zero to fully decentralized overnight is impossible, and companies have to understand this when they approach the development of AI models.

The idea is to start small and expand. Niche products will not gain billions of users, but they can start the move to thousands.

It is the starting point for competing with the likes of Google and Meta.





I wonder if open source distributed computing projects could help with at least the compute portion. I'm thinking specifically of BOINC but there are other options. If the work for training can be broken up appropriately, something like that could provide a lot of computing power for "free".

I asked AI what niches Web3 can compete in against Big Tech. Of the 10 it suggested, I like two:

  • Tokenized AI-Driven Content Platforms
  • Open-Source AI Educational Platform

I do wonder what AI will be like in 2025.