RE: Token Price Doesn't Affect Fee Price

I agree with the conclusion that fees depend on what users are willing to pay, since if the service gets too expensive they simply stop using it, but I don't agree with the pessimism.

> Can we increase it without incurring a cost or stress-testing bottlenecks? No.

Yes we can. We've already tested that completely filled 2 MB blocks (currently the maximum witnesses can set) with over 7k transactions per block (a roughly mainnet-like mix of comments, votes, transfers and custom_jsons) are totally fine for normal desktop PCs, with a lot of margin. That covers consensus (and witness) nodes. I fully expect that after we implement the optimizations already on the wish list (and we know how to make them), a consensus node should be able to run on old-smartphone-class hardware (by old I mean something like my yet-to-be-replaced SG9+ 😉).

Of course it gets more hardware-hungry when you want to run HAF, Hivemind and whatever other apps are going to sprout in the future to push the price of Hive to 2k$. While normal servers can handle the data intake from such large blocks (tested), the amount of data is going to be challenging for PostgreSQL: the database grows quite fast and we have yet to perform prolonged tests in such an environment. But that only means we'll need to work on solutions. For example, we could have slower but more capacious servers to handle relatively rare queries for old operations, and fast servers that keep only the last month of data and handle most of the load (this requires implementing HAF pruning).

The bulk of the cost is likely to be not in the hardware but in the network connections. With expensive Hive that should not be a problem 😊, and even backup witnesses should be able to afford "full nodes". Third-party apps are likely to have their own servers where they keep only the operations needed for their specific app.
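To put numbers on that, here is a quick back-of-the-envelope calculation. The 3-second block interval is Hive's protocol constant; the block size and transaction count come from the stress test described above:

```python
# Back-of-the-envelope throughput for sustained max-size blocks.
BLOCK_INTERVAL_S = 3      # Hive's block interval (protocol constant)
MAX_BLOCK_MB = 2          # current max block size witnesses can set
TXS_PER_BLOCK = 7_000     # from the stress test mentioned above

blocks_per_day = 86_400 // BLOCK_INTERVAL_S           # 28,800 blocks
gb_per_day = MAX_BLOCK_MB * blocks_per_day / 1_000    # ~57.6 GB/day
tb_per_year = gb_per_day * 365 / 1_000                # ~21 TB/year
tps = TXS_PER_BLOCK / BLOCK_INTERVAL_S                # ~2,333 tx/s

print(f"{gb_per_day:.1f} GB/day, {tb_per_year:.1f} TB/year, {tps:.0f} tx/s")
```

That ~21 TB/year figure is where the number discussed below comes from.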


I would say this post is extremely optimistic in claiming that the problem everyone says Bitcoin has... doesn't exist here. And as always I make sure to point out that crypto is a collaboration and not a competition. And the problem I voiced has now been counterpointed.

To say that we've tested our ability to put 21 terabytes of data on chain per year is exactly the type of wild and completely unsubstantiated claim I was pointing out to begin with. Imagine someone running as fast as they can for 30 seconds and then saying, "I tested running as fast as I could, so yeah, obviously I could do that for like 5 hours straight. Here's how far I'll get in 5 hours." No dude: in 5 hours you'll be dead like the guy they named the marathon after. Not only that, but the test itself was run in a completely controlled and sterilized environment that couldn't possibly hope to mimic the real thing in all its nuances.

It is not mathematically possible to prevent Hive bandwidth from having value. Doesn't matter how big the blocks are. It will have value if the free-market says it has value. If someone can swoop in and make $100 worth of profit by filling up our blockspace they're going to do it. And they will start paying for bandwidth if they run out and it's still profitable. Now there's a secondary market for buying bandwidth. Bandwidth now has value. The blocksize is completely irrelevant to this equation.

21 TB of data per year is more than manageable. Not by some yet-to-be-developed future tech, but by today's equipment with a price tag. With 2k$ HIVE that hardware becomes affordable even for backup witnesses.

As for the claim that consensus nodes can handle sustained max-block traffic, it can be made because state does not really grow with bigger blocks. The things that do grow (assuming the same mix of operation types as on mainnet) are the number of comments and account-related data, but these are subject to optimizations (points 7 and 8), at which point the growth happens on disk rather than in RAM. In reality we should rather expect increased use of custom_jsons, because blogging won't suddenly attract many new users, but third-party apps might. Custom_jsons leave no mark in the state of consensus nodes. Block log growing too big? Block log splitting and pruning is getting its final touches and is already merged into develop. The only remaining problem with such a large block log will be replay, but many people already work around it by downloading prepared snapshots rather than performing a full replay.
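For context, this is roughly what a custom_json looks like on chain (legacy API serialization; the field names match the real Hive operation, while the id and payload here are made up for illustration). Consensus nodes only validate the authorities and size limits; the json string itself is opaque to them, which is why this kind of traffic doesn't grow consensus state:

```python
# Illustrative shape of a custom_json operation as it appears in a block.
# Field names match the actual Hive operation; "example_app" and the
# payload are hypothetical.
custom_json_op = [
    "custom_json",
    {
        "required_auths": [],                   # accounts signing with active authority
        "required_posting_auths": ["alice"],    # posting authority suffices for most apps
        "id": "example_app",                    # app-chosen identifier (hypothetical)
        "json": '{"action": "move", "target": 42}',  # opaque string; only the app interprets it
    },
]
```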

There is a lot of room between the current level and the maximum possible traffic, and a lot of room between the current price and 2k$. It won't suddenly jump between the two, leaving us surprised. Also, unlike Bitcoin, where changes are hard, Hive has no trouble adapting. A price increase would be beneficial, because we could afford to finance development from the DHF, which is not the case today.

This is all great to hear and everything, but I'd like to refer back to the title of the post:

> Token Price Doesn't Affect Fee Price

The demand to use the chain and the token price of the chain are two different things.
If we are offering free, free, free no matter what the market throws at us, we are going to drown.
Free things get exploited. This is a known and indisputable fact.

You're assuming token price has to go up if demand to use the chain goes up.
It doesn't.
The demand to use the chain will go up because we are offering free service.
And people can leverage that free service into money in their own pocket.
That's not sustainable for us.

You are coming at this from a technical aspect, which is great.
But there is also an economic and business aspect that you're completely ignoring.
Honestly, I do think it will work out fine in the end; it just won't be pretty, and many curveballs will be thrown.

But Hive does not offer a free service. You can buy into the chain (vest) and gain the right to use it roughly in proportion to your share. Compared to other chains, in particular those that charge direct fees, we have the following (see the sketch after the list for the stake-proportional mechanics):

  • whales are effectively subsidizing the plankton, because they can't really use their whole share - only in the unlikely scenario of RC costs shooting up drastically and pushing smaller accounts out would whales remain the only ones able to operate (so an app that absolutely has to have unhindered access to the chain needs a big stake)
  • users don't compete for a place on the chain by pushing fees up - while consensus-wise it is up to the witnesses, the default behavior is to include transactions in order of arrival, so when RC costs are normal and the limiting factor is block size, smaller users are not cut off from the chain
  • users can gain stake not only by investing money, but also by investing time and skills - if on-chain traffic increases due to a proliferation of third-party apps and the token price makes direct investment unaffordable for normal people, that's when gaining stake through blogging (or from the DHF) becomes the easier route
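A minimal sketch of the stake-proportional idea follows. The 5-day full-regeneration window matches Hive's RC system; the budget numbers are placeholders, not real RC values:

```python
# Sketch of stake-proportional RC budgets, assuming the 5-day
# full-regeneration window used by Hive's RC system. The maximum
# RC pool grows with vested stake; the numbers below are placeholders.
RC_REGEN_SECONDS = 5 * 24 * 60 * 60  # full pool refills over 5 days

def available_rc(max_rc: int, spent_rc: int, elapsed_s: int) -> int:
    """RC available after spending spent_rc and waiting elapsed_s."""
    regenerated = max_rc * elapsed_s // RC_REGEN_SECONDS
    return min(max_rc, max_rc - spent_rc + regenerated)

# Both accounts refill at the same relative rate, but the whale's
# absolute budget dwarfs the plankton's: hence the subsidy effect.
whale = available_rc(max_rc=10_000_000_000, spent_rc=0, elapsed_s=3_600)
plankton = available_rc(max_rc=5_000_000, spent_rc=4_000_000, elapsed_s=3_600)
```

The point is that RC is a rate limit against your own stake, not a fee paid into a market, which is what the second bullet above is about.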

But you are right. It is possible that Hive will get 100-200 new whales representing some applications, that those whales will flood the chain with their custom operations, and that RC costs will increase to the point where normal people can't afford to operate, driving them out. The token price would increase, but not to the point where node operators could suddenly afford much higher-class equipment.
I don't believe in the above scenario for one reason: if an application only needed the chain and not the community, it would be more reasonable to just set up a new chain with Hive's code. Such apps could still be connected to Hive, operating on their own chain (which can still be decentralized) for the most part, but broadcasting to Hive the transactions needed for future replays. In fact, recent developments (under the guise of "lite accounts") seem to point in that direction.