
RE: How To Set Up A Hive Witness Using HIAB At Version 1.27.2

in HiveDevs · 3 years ago

Hi,

My blockchain version is 1.25.0.

./run.sh dlblocks --> reports it is up to date
./run.sh replay --> the last step logged is database.cpp:5884 apply_hardfork HARDFORK 11 at block 3276913
./run.sh start --> returns the following error:

1174662ms database.cpp:170 open ] Opened a blockchain database holding a state specific to head block: 3348714 and last irreversible block: 3348691
1174670ms database.cpp:183 operator() ] Blockchain state database is AT IRREVERSIBLE state specific to head block: 3348714 and LIB: 3348691
1174672ms database.cpp:248 open ] 10 assert_exception: Assert Exception
revision() == head_block_num(): Chainbase revision does not match head block num.
{"rev":0,"head_block":3348714}
database.cpp:192 operator()
1174672ms database.cpp:248 open ] args.data_dir: /steem/witness_node_data_dir/blockchain args.shared_mem_dir: /shm/ args.shared_file_size: 26843545600
1174675ms chain_plugin.cpp:538 open ] Error opening database. If the binary or configuration has changed, replay the blockchain explicitly using --replay-blockchain.
1174675ms chain_plugin.cpp:539 open ] If you know what you are doing you can skip this check and force open the database using --force-open.
1174675ms chain_plugin.cpp:540 open ] WARNING: THIS MAY CORRUPT YOUR DATABASE. FORCE OPEN AT YOUR OWN RISK.
1174675ms chain_plugin.cpp:541 open ] Error: {"code":10,"name":"assert_exception","message":"Assert Exception","stack":[{"context":{"level":"error","file":"database.cpp","line":192,"method":"operator()","hostname":"","timestamp":"2021-12-16T19:19:34"},"format":"revision() == head_block_num(): Chainbase revision does not match head block num.","data":{"rev":0,"head_block":3348714}},{"context":{"level":"warn","file":"database.cpp","line":248,"method":"open","hostname":"","timestamp":"2021-12-16T19:19:34"},"format":"rethrow","data":{"args.data_dir":"/steem/witness_node_data_dir/blockchain","args.shared_mem_dir":"/shm/","args.shared_file_size":"26843545600"}}]}
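
(Note: the args.shared_file_size in the log above is 26843545600 bytes, which is exactly 25 × 1024³, i.e. the shared memory file is already sized at 25G.)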

Any idea?

Thanks


I just ran into this. You need to invoke ./run.sh shm_size 25G, or maybe even 30G, to keep the Hive node from crashing and corrupting its database this way.
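
For anyone hitting the same assert, here is a minimal recovery sketch, assuming the HIAB run.sh subcommands already used in this thread (shm_size, replay, start); the stop step is my assumption about the usual workflow:

./run.sh stop            # assumed subcommand: stop the node first if the container is still running
./run.sh shm_size 30G    # grow /shm beyond the logged 25G so the state file fits
./run.sh replay          # rebuild the chainbase state, clearing the rev 0 vs head_block mismatch
./run.sh start           # start the witness node again

Once the replay completes, opening the database should no longer trip the "Chainbase revision does not match head block num" assert.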