2nd update of 2023: Moving to Ubuntu 22, binary storage for hive operations in HAF

in HiveDevs (edited)


Below are a few highlights of the Hive-related programming issues worked on by the BlockTrades team since my last report.

Hived (blockchain node software)

One of the most notable points for developers and users of hived is that we're moving to Ubuntu 22 as the officially supported platform for the next release of hived. There are two main reasons for this: 1) we want to use a new version of rocksdb that requires a faster file IO library that was added in Ubuntu 22, and 2) it will allow us to deprecate support for Postgres 12 in favor of Postgres 14 (these two often behave differently in terms of query performance, so supporting only one will simplify releasing HAF apps that perform consistently well).

In order to move to Ubuntu 22, we also needed to solve some issues with the OpenSSL library used by hived, as support for RIPEMD160 was removed from the library and hived uses this algorithm in several places.
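The OpenSSL issue can be seen directly from a scripting language: on OpenSSL 3.x (the version Ubuntu 22 ships), RIPEMD-160 was moved into the "legacy" provider and is unavailable by default. A minimal sketch in Python, where the helper name is ours and the fallback behavior (returning `None`) is just for illustration:

```python
import hashlib

def ripemd160_or_none(data: bytes):
    """Try to compute a RIPEMD-160 digest via OpenSSL-backed hashlib.

    On OpenSSL 3.x builds where the 'legacy' provider is not enabled,
    constructing the hash raises ValueError -- the same class of
    problem hived had to work around when moving to Ubuntu 22.
    """
    try:
        h = hashlib.new("ripemd160")
    except ValueError:
        return None  # algorithm not available in this OpenSSL build
    h.update(data)
    return h.digest()  # 20-byte digest when available

digest = ripemd160_or_none(b"hive")
print("available" if digest else "unavailable on this OpenSSL build")
```

Whether the digest is available depends entirely on how the local OpenSSL was built, which is why hived needed its own fix rather than relying on the system library's defaults.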

As part of the upgrade to the new version of rocksdb, we also moved it out of the core hived repo, and rocksdb is now a submodule referenced by the main repo. This will make it easier and cleaner to upgrade to later versions of rocksdb in the future.

Other changes to hived include:

  • Switched from the fc class fc::uint128 to the native C++ uint128 type: https://gitlab.syncad.com/hive/hive/-/merge_requests/804
  • Added tests for massive recurrent transfers to verify them against the relaxed rules to be added in HF28.
  • Fixed a problem related to silent truncation of the fixed_string type (it silently cut off Hive account names that were too long).
  • Improved the debug_node_plugin API (added a function to simplify setting up the VESTS/HIVE price for tests).
  • Fixed old cli_wallet stability issues (previously it could randomly crash while shutting down during automated test runs, at times requiring multiple test runs).
  • Extended the unit test suite for account history API calls.
  • Prerequisites for changes required by the new binary HAF storage implementation.
  • Simplified (and sped up) preparation of the hived 5-million-block data instance for automated tests.
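The fc::uint128-to-native switch mentioned above boils down to using the compiler's built-in 128-bit unsigned arithmetic instead of a hand-rolled class. The wraparound semantics a native unsigned 128-bit type provides can be sketched in Python (where integers are unbounded, so we mask explicitly; this models the semantics, it is not hived code):

```python
# Emulate unsigned 128-bit integer semantics: all results are reduced
# modulo 2^128, as native unsigned __int128 arithmetic does in C++.
MASK128 = (1 << 128) - 1

def u128_add(a: int, b: int) -> int:
    return (a + b) & MASK128

def u128_mul(a: int, b: int) -> int:
    return (a * b) & MASK128

# Overflow wraps around to zero rather than growing without bound:
assert u128_add(MASK128, 1) == 0
assert u128_mul(1 << 64, 1 << 64) == 0  # 2^128 mod 2^128
```

Using the native type delegates exactly this modular arithmetic to the hardware, removing a layer of custom code.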

Hive Application Framework (HAF) and related apps

A lot of recent work has focused on using binary storage for hive operations inside HAF databases, replacing the JSON format used previously. This change lowered the size of a fully populated HAF database by 700GB. Once we proved the basic approach was feasible, we've been measuring and optimizing the performance of the new storage format.
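The space savings come from the usual JSON overhead: repeated field names, quoting, and decimal-encoded numbers. A toy comparison for a hypothetical transfer operation (field names, layout, and the binary encoding below are illustrative assumptions, not HAF's actual schema):

```python
import json
import struct

# Hypothetical transfer operation -- field names are illustrative.
op = {"type": "transfer", "from": "alice", "to": "bob",
      "amount": 1000, "nai": 13}  # amount in smallest units, asset id

json_bytes = json.dumps(op, separators=(",", ":")).encode()

def pack_account(name: str) -> bytes:
    """Length-prefixed account name (Hive names are short, < 256 bytes)."""
    raw = name.encode()
    return struct.pack("B", len(raw)) + raw

# Compact binary form: 1-byte op id, two length-prefixed account
# names, 8-byte little-endian amount, 1-byte asset id.
binary_bytes = (struct.pack("B", 2)            # op type id (assumed)
                + pack_account(op["from"])
                + pack_account(op["to"])
                + struct.pack("<qB", op["amount"], op["nai"]))

print(len(json_bytes), len(binary_bytes))  # binary is several times smaller
```

Multiplied across billions of stored operations, even a few dozen bytes saved per operation adds up to the hundreds of gigabytes reported above.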

We also wrote some new regression tests to verify the HAF database update procedure (the way an existing HAF database gets upgraded when the schema or core code of a HAF database needs to be changed).
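The property such regression tests need to guarantee can be modeled simply: applying migrations to an existing database must produce the same schema as building a fresh database at the target version. A toy sketch (migration contents, version numbers, and the `upgrade` helper are all hypothetical, not HAF's actual update mechanism):

```python
# Toy model of a versioned schema upgrade: migrations are applied in
# order from the database's current version up to the target version.
MIGRATIONS = {
    1: lambda db: db["tables"].append("blocks"),
    2: lambda db: db["tables"].append("operations"),
    3: lambda db: db["tables"].append("accounts"),
}

def upgrade(db: dict, target: int) -> dict:
    for version in range(db["version"] + 1, target + 1):
        MIGRATIONS[version](db)
        db["version"] = version
    return db

# Regression check: upgrading an old database yields the same schema
# as creating a fresh one at the target version.
old = upgrade({"version": 1, "tables": ["blocks"]}, 3)
fresh = upgrade({"version": 0, "tables": []}, 3)
assert old["tables"] == fresh["tables"]
```

Real HAF databases also carry data, so the actual tests have to verify that existing rows survive the upgrade, not just that the schemas converge.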

Generic devops tasks

We updated the base docker images used by our docker containers (these are used both for CI testing and for production deployment) in various ways, including moving to Ubuntu 22.

We also created common CI jobs useful for publishing official docker images for various code repos (e.g. hived, HAF, hivemind, hafah, and other haf apps).

Some upcoming tasks

  • Re-use docker images in CI tests when possible to speed up testing (in progress).
  • Continue work on the HAF-based block explorer.
  • Collect benchmarks for a hafah app operating in "irreversible block mode" and compare them to one operating in "normal" mode (low priority).
  • Finalize plans for the changes to hived and HAF that the BlockTrades team will work on in the coming year.
  • Work out a schedule for the creation of various HAF-based applications (the most important one being a HAF-based smart contract processing engine).

HAF-based smart contracts. Loving the sound of that.

Yes, I think it will be one of the most important things we do and we've been working towards this goal for a long time now.

Awesome stuff!

Me too!

First of all, we definitely needed this. The RipeMD deprecation has been an annoying problem for some time now.

Second, not sure if this is referenced anywhere, but out of curiosity: why was RocksDB selected as the NoSQL database wrapper as opposed to the popular Redis?

As far as I can tell, Redis is focused on being an in-memory database, while RocksDB is more tuned for file storage. RocksDB was introduced to lower the memory footprint of hived by moving the account history data out of memory to disk storage, so it was a natural choice.


Should that be 2023 in the title? Time is rushing by.

Good to hear you are still finding optimisations. Hive could take off at any time and we need to be ready to handle it.

Hive time machine being added in next fork ;)

How will that affect immutability? Time travel is likely to encourage vote manipulation. I'm against it.

!BEER

Curation snipers love it

Yes, fixed now.

Hurrah for @blocktrades, the best platform in the world.


thanks a lot for the information

Is there anything else that uses rocksdb other than the account_history_rocksdb plugin that was supposedly deprecated in favour of sql_serializer?

No, rocksdb is currently only used by account_history_rocksdb (which is deprecated in favor of sql_serializer). But despite the deprecation, account_history_rocksdb will probably be used for a long time, simply because it takes fewer resources than a full HAF database. There's also the possibility of eventually using it for a better version of something like MIRA (i.e. to store more of the data that is currently stored in chainbase).

Yes, yes, yes. Comment index and all account related data should be kept in RocksDB, as that constitutes vast majority of state consumption while only small fraction of it is needed at any given time.


Excellent, we keep improving the performance and scalability of the Hive blockchain.

This change lowered the size of a fully populated HAF database by 700GB.

That is a lot and should make running a HAF node more affordable to people.

Work out schedule for creation of various HAF-based applications (most important one here will be a HAF-based smart contract processing engine).

Is this the announcement of the planning of something that might show up in a future announcement?

Tests for massive recurrent transfers to verify it according to relaxed rules to be added at HF28

Speaking of the next hard fork: given that the SEC seems to be looking to ban staking on exchanges, this provides an opportunity for Hive to expand its fixed income market. Perhaps adding time vaults in the next HF would be a good idea: get people to lock up their HBD for extended periods of time to enable the creation of bond-type systems on layer 2, while also providing more fixed income options on the base layer.

  1. It's basically a task to fill out a roadmap for the work to be done by our team this year (which will be a public roadmap, of course).
  2. The time vault idea is definitely interesting. I'll give it some thought (or maybe @howo will).

Also, there was a post some weeks ago about the idea of being able to delegate HBD like HIVE; the author made some good points:

https://hive.blog/hive-167922/@rubencress/case-study-hbd-savings-delegation

I like the idea of using that for subscriptions. I gave it a little bit of thought and the implementation seems quite easy.

LOL that sounds like the time vault task was just delegated in one sentence.

IMHO time vaults fit another, much wider topic - making "free floating HBD balances". Basically what other coins, especially PoW, have - sending coins to an address rather than account. It is needed for privacy, but at the same time it would enable us to benefit from all the improvements made in the field of different types of signatures, some of which, I believe, enable time locking. Of course there are some downsides - since such balances won't be tied to stake (because it would defeat their purpose), transfers out of them would need to be paying transaction fees (some witness-established RC-to-HBD equivalent) unless signer mixed such transfer with some regular RC paid operation (which would reveal connection to the stake - either fee-less or private, not both). There is a lot of research to be made before we can start doing any coding though.

Thank you very much for reporting all the new features Hive will have; this way we know what to use and what we can build with Hive. For me it's the best platform, and may it keep growing.


Amazing report from the team.

Hive application framework Smart Contract is an outstanding development 👌

How do I do a block trade? I need guidance.