In this post I wrote about my first experiments, and in this post I wrote about some first working code of the asynchronous Python library for HIVE that I'm working on.
<p dir="auto">I've done a little bit more work, and things are starting to come together now. I've updated <a href="https://github.com/pibara/aiohivebot/tree/main" target="_blank" rel="noreferrer noopener" title="This link will take you away from hive.blog" class="external_link">the code</a> with docstrings and fixed the code so that basic linters don't complain. The code isn't in a production-ready state yet (that's why I haven't pushed anything to pypi yet), but it's getting closer. There are now a few extra demo scripts in the github repo to demonstrate some different types of callbacks and their usage. <p dir="auto">First let's revisit the basic working of the library. <h1>Auto-scanning of node sub-APIs <p dir="auto">When I wrote my post about my first experiment <a href="https://peakd.com/hive-139531/@pibara/paving-the-way-for-a-new-async-python-hive-library-aiohivebot--querying-the-nodes-to-create-a-config-json" target="_blank" rel="noreferrer noopener" title="This link will take you away from hive.blog" class="external_link">here</a>, <a href="/@mahdiyari">@mahdiyari</a>, <a href="/@techcoderx">@techcoderx</a> and <a href="/@blocktrades">@blocktrades</a> gave me valuable insights into the current disconnect between the sub-APIs published by the nodes through the <a href="https://developers.hive.io/apidefinitions/#jsonrpc.get_methods" target="_blank" rel="noreferrer noopener" title="This link will take you away from hive.blog" class="external_link">jsonrpc.get_methods</a> API call, and the sub-APIs actually supported.
Not all the supported APIs get published through the <a href="https://developers.hive.io/apidefinitions/#jsonrpc.get_methods" target="_blank" rel="noreferrer noopener" title="This link will take you away from hive.blog" class="external_link">jsonrpc.get_methods</a> API call. <p dir="auto">To the same post, <a href="/@emrebeyler">@emrebeyler</a> commented with a reference to <a href="https://github.com/emre/hived-rpc-scanner" target="_blank" rel="noreferrer noopener" title="This link will take you away from hive.blog" class="external_link">his efforts at a simple scanner</a>, and a quick look at his code set me on the right track. <p dir="auto">Right now, in the <strong>aiohivebot</strong> library, every 15 minutes a scan is done of each of the 13 public API nodes, starting off with a call to <a href="https://developers.hive.io/apidefinitions/#jsonrpc.get_methods" target="_blank" rel="noreferrer noopener" title="This link will take you away from hive.blog" class="external_link">jsonrpc.get_methods</a>, followed by a probe of most of the sub-APIs that weren't advertised by the node. <p dir="auto">In the first version of the code this was all done under the hood, but in the latest version I've added a hook that a bot may use. <pre><code>import asyncio
from aiohivebot import BaseBot

class MyBot(BaseBot):
    """Example of an aiohivebot python bot without real utility"""

    def node_api_support(self, node_uri, api_support):
        print("NODE:", node_uri)
        for key, val in api_support.items():
            if not val["published"]:
                print(" -", key, ":", val)

pncset = MyBot()
loop = asyncio.get_event_loop()
loop.run_until_complete(pncset.run(loop))
</code></pre> <p dir="auto">After each scan of a public API node, the <strong>node_api_support</strong> method will be called if our subclass defines it. The above example code prints info about the nodes to standard out.
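<p dir="auto">To give a rough idea of what such a scan boils down to, here is a minimal, hypothetical sketch (not aiohivebot's actual internals; the KNOWN_SUB_APIS list and the function name are made up for illustration) of how the method list returned by jsonrpc.get_methods could be combined with probe results into the api_support structure the hook receives:

```python
# Hypothetical sketch: combine the advertised method list from
# jsonrpc.get_methods with probe outcomes for unadvertised sub-APIs.
# KNOWN_SUB_APIS is an illustrative subset, not the library's real list.
KNOWN_SUB_APIS = ["condenser_api", "block_api", "account_history_api", "bridge"]

def build_api_support(published_methods, probe_results):
    """Build {sub_api: {"published": bool, "available": bool}}.

    published_methods: entries like "block_api.get_block" as returned by
    jsonrpc.get_methods; probe_results: sub-API name -> bool for the
    sub-APIs that were probed because they weren't advertised.
    """
    published = {method.split(".")[0] for method in published_methods}
    support = {}
    for api in KNOWN_SUB_APIS:
        support[api] = {
            "published": api in published,
            "available": api in published or probe_results.get(api, False),
        }
    return support
```

An unadvertised sub-API thus ends up as published: False but available: True when its probe succeeds, which is exactly the pattern we will see for many nodes in the scan results.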
<p dir="auto">Here is the result of the node_api_support example running for a few minutes: <pre><code>NODE: api.deathwing.me
 - follow_api : {'published': False, 'available': True}
 - account_history_api : {'published': False, 'available': True}
 - bridge : {'published': False, 'available': True}
NODE: hived.emre.sh
 - follow_api : {'published': False, 'available': True}
 - bridge : {'published': False, 'available': True}
NODE: rpc.ausbit.dev
 - follow_api : {'published': False, 'available': True}
 - account_history_api : {'published': False, 'available': True}
 - bridge : {'published': False, 'available': True}
NODE: rpc.mahdiyari.info
 - follow_api : {'published': False, 'available': True}
 - account_history_api : {'published': False, 'available': True}
 - reputation_api : {'published': False, 'available': False}
 - bridge : {'published': False, 'available': True}
NODE: hive-api.arcange.eu
 - follow_api : {'published': False, 'available': True}
 - account_history_api : {'published': False, 'available': True}
 - bridge : {'published': False, 'available': True}
 - wallet_bridge_api : {'published': False, 'available': False}
NODE: hive-api.3speak.tv
 - follow_api : {'published': False, 'available': True}
 - account_history_api : {'published': False, 'available': True}
 - bridge : {'published': False, 'available': True}
NODE: api.openhive.network
 - jsonrpc : {'published': False, 'available': True}
 - follow_api : {'published': False, 'available': True}
 - account_by_key_api : {'published': False, 'available': True}
 - market_history_api : {'published': False, 'available': True}
 - account_history_api : {'published': False, 'available': False}
 - database_api : {'published': False, 'available': True}
 - rc_api : {'published': False, 'available': True}
 - reputation_api : {'published': False, 'available': True}
 - network_broadcast_api : {'published': False, 'available': False}
 - bridge : {'published': False, 'available': True}
 - block_api : {'published': False, 'available': True}
 - transaction_status_api : {'published': False, 'available': True}
 - condenser_api : {'published': False, 'available': False}
 - wallet_bridge_api : {'published': False, 'available': False}
NODE: api.hive.blog
 - follow_api : {'published': False, 'available': True}
 - bridge : {'published': False, 'available': True}
NODE: techcoderx.com
 - follow_api : {'published': False, 'available': True}
 - account_history_api : {'published': False, 'available': True}
 - bridge : {'published': False, 'available': True}
NODE: anyx.io
 - follow_api : {'published': False, 'available': True}
 - transaction_status_api : {'published': False, 'available': False}
 - bridge : {'published': False, 'available': True}
 - wallet_bridge_api : {'published': False, 'available': False}
NODE: hive.roelandp.nl
 - follow_api : {'published': False, 'available': True}
 - transaction_status_api : {'published': False, 'available': True}
 - bridge : {'published': False, 'available': True}
 - wallet_bridge_api : {'published': False, 'available': False}
NODE: api.hive.blue
 - jsonrpc : {'published': False, 'available': False}
 - follow_api : {'published': False, 'available': True}
 - account_by_key_api : {'published': False, 'available': True}
 - market_history_api : {'published': False, 'available': True}
 - account_history_api : {'published': False, 'available': True}
 - database_api : {'published': False, 'available': True}
 - rc_api : {'published': False, 'available': True}
 - reputation_api : {'published': False, 'available': True}
 - network_broadcast_api : {'published': False, 'available': False}
 - bridge : {'published': False, 'available': True}
 - block_api : {'published': False, 'available': True}
 - transaction_status_api : {'published': False, 'available': True}
 - condenser_api : {'published': False, 'available': False}
 - wallet_bridge_api : {'published': False, 'available': False}
NODE: hived.privex.io
 - jsonrpc : {'published': False, 'available': False}
 - follow_api : {'published': False, 'available': False}
 - account_by_key_api : {'published': False, 'available': False}
 - market_history_api : {'published': False, 'available': False}
 - account_history_api : {'published': False, 'available': False}
 - database_api : {'published': False, 'available': False}
 - rc_api : {'published': False, 'available': False}
 - reputation_api : {'published': False, 'available': False}
 - network_broadcast_api : {'published': False, 'available': False}
 - bridge : {'published': False, 'available': False}
 - block_api : {'published': False, 'available': False}
 - transaction_status_api : {'published': False, 'available': False}
 - condenser_api : {'published': False, 'available': False}
 - wallet_bridge_api : {'published': False, 'available': False}
</code></pre> <p dir="auto">We see that indeed, as pointed out by <a href="/@mahdiyari">@mahdiyari</a>, the <strong>account_history</strong> sub-API is often not published but is available. It is clear that what is advertised is a subset of what is available; in some cases apparently on purpose because the sub-API is deprecated, but in other cases for technical reasons. <a href="/@blocktrades">@blocktrades</a> pointed out that in the near future, new sub-APIs will run as REST APIs, complete with an available OpenApi definition on each node. I hope to keep track of this and integrate support when it becomes important. <h1>Basic operation hooks &amp; JSON-RPC invocation <p dir="auto">Let's revisit the core hooks once more. Under the hood, all the nodes get queried about the last block they have available. For each node there is a separate async task running. As soon as a node is found that has a block that hasn't been seen yet, that node is queried for that block by its in-library task, and the block is processed. The operations are extracted from the block, and depending on the available methods in the class derived from BaseBot, these methods are called. <p dir="auto">In the below code, a <strong>vote_operation</strong> method is defined,
and by doing so, all <strong>vote_operation</strong> type operations get forwarded to this method. <pre><code>class MyBot(BaseBot):
    """Example of an aiohivebot python bot without real utility"""

    def __init__(self):
        super().__init__()
        self.count = 0

    async def vote_operation(self, body):
        """Handler for vote_operation type operations in the HIVE block stream"""
        if "voter" in body and "author" in body and "permlink" in body:
            result = await self.bridge.get_post(author=body["author"],
                                                permlink=body["permlink"])
            content = result.result()
            if content and "is_paidout" in content and content["is_paidout"]:
                print("Vote by", body["voter"], "on expired post detected: @" +
                      body["author"] + "/" + body["permlink"])
        if self.count == 1000000:
            self.abort()
        self.count += 1
</code></pre> <p dir="auto">Other valid operations include: <ul>
<li>custom_json_operation
<li>transfer_operation
<li>comment_operation
<li>claim_reward_balance_operation
<li>feed_publish_operation
<li>transfer_to_savings_operation
<li>comment_options_operation
<li>limit_order_cancel_operation
<li>limit_order_create_operation
<li>account_update_operation
<li>transfer_from_savings_operation
<li>cancel_transfer_from_savings_operation
<li>claim_account_operation
<li>withdraw_vesting_operation
<li>account_witness_vote_operation
<li>witness_set_properties_operation
<li>update_proposal_votes_operation
<li>transfer_to_vesting_operation
<li>create_claimed_account_operation
<li>account_update2_operation
<li>delegate_vesting_shares_operation
<li>convert_operation
<li>witness_update_operation
<li>delete_comment_operation
<li>collateralized_convert_operation
<li>set_withdraw_vesting_route_operation
<li>account_witness_proxy_operation
<li>account_create_operation
<li>change_recovery_account_operation
<li>request_account_recovery_operation
<li>recover_account_operation
<li>recurrent_transfer_operation
</ul> <p dir="auto">You can define a method for each of these, and they will work in a similar way.
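<p dir="auto">Under the hood, this kind of dispatch can be as simple as looking up a method named after the operation's type on the bot subclass and awaiting it if it exists. The sketch below is a hypothetical illustration of that mechanism (MiniDispatcher and DemoBot are made-up names, not part of aiohivebot):

```python
import asyncio

# Hypothetical sketch of name-based operation dispatch: if the subclass
# defines a coroutine named after the operation type, it gets awaited;
# otherwise the operation is silently skipped.
class MiniDispatcher:
    async def dispatch(self, operation):
        handler = getattr(self, operation["type"], None)
        if handler is not None:
            await handler(operation["value"])

class DemoBot(MiniDispatcher):
    def __init__(self):
        self.seen = []

    async def vote_operation(self, body):
        self.seen.append(("vote", body["voter"]))

bot = DemoBot()
asyncio.run(bot.dispatch({"type": "vote_operation", "value": {"voter": "alice"}}))
asyncio.run(bot.dispatch({"type": "transfer_operation", "value": {}}))  # no handler, ignored
```

The nice property of this pattern is that adding support for another operation type in your bot is just a matter of defining one more method.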
<p dir="auto">Let's run the vote_operation example for a bit. <pre><code>Vote by kesityu.fashion on expired post detected: @whitneyalexx/re-kesityufashion-20231020t12513710z
Vote by pibara on expired post detected: @steevc/searching-for-hive-growth
</code></pre> <p dir="auto">So what is happening? A stream of vote_operation operations comes into our bot's vote_operation method, and for each vote operation, <strong>bridge.get_post</strong> gets called. <p dir="auto">Under the hood, all nodes that support the <strong>bridge</strong> sub-API are sorted according to their most recent reliability stats; at the moment that is their request failure percentage (measured through a decaying average) and the HTTP request/response latency (also a decaying average). Right now sorting takes place on reliability first and latency second. If the first node fails the request, the second in the list is tried. This way we attempt to get the best possible reliability even if some of the nodes are flaky. We will see this in the next section. <h1>Maintaining stats on all nodes <p dir="auto">There is another <em>special</em> method that a bot or DApp backend can define, and that is the <strong>node_status</strong> method. Like the <em>node_api_support</em> method, this method gets called roughly every 15 minutes for each node, and it allows for the reporting of both usage and reliability stats for all of the nodes.
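<p dir="auto">The decaying averages used for these stats can be sketched as follows (a hypothetical illustration; aiohivebot's real bookkeeping may differ): each new sample is blended into the running value so older observations gradually fade out, and nodes are then ordered on error rate first and latency second:

```python
# Hypothetical sketch of a decaying (exponentially weighted) average and
# of the node ordering described above; not aiohivebot's actual code.
class DecayingAverage:
    def __init__(self, decay=0.9):
        self.decay = decay   # weight kept from the old average on each update
        self.value = None

    def update(self, sample):
        if self.value is None:
            self.value = float(sample)
        else:
            self.value = self.decay * self.value + (1 - self.decay) * sample
        return self.value

def sort_nodes(stats):
    """Order node stat dicts by error percentage first, latency second."""
    return sorted(stats, key=lambda node: (node["error_pct"], node["latency"]))
```

The node_status hook that exposes this kind of per-node bookkeeping to your bot is shown in the example below.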
<pre><code>class MyBot(BaseBot):
    """Example of an aiohivebot python bot without real utility"""

    def __init__(self):
        super().__init__()
        self.count = 0

    async def vote_operation(self, body):
        """Handler for vote_operation type operations in the HIVE block stream"""
        if "voter" in body and "author" in body and "permlink" in body:
            result = await self.bridge.get_post(author=body["author"],
                                                permlink=body["permlink"])
            content = result.result()
            if content and "is_paidout" in content and content["is_paidout"]:
                pass
        if self.count == 1000000:
            self.abort()
        self.count += 1

    def node_status(self, node_uri, error_percentage, latency, ok_rate,
                    error_rate, block_rate):
        print("STATUS:", node_uri,
              "error percentage =", int(100*error_percentage)/100,
              "latency=", int(100*latency)/100,
              "ok=", int(100*ok_rate)/100, "req/min,",
              "errors=", int(100*error_rate)/100, "req/min,",
              "blocks=", int(100*block_rate)/100, "blocks/min")
</code></pre> <p dir="auto">If we start this code and let it run for 20 minutes or so, the result is as follows: <pre><code>STATUS: api.openhive.network error percentage = 0.03 latency= 20.51 ok= 77.64 req/min, errors= 0.16 req/min, blocks= 2.99 blocks/min
STATUS: api.deathwing.me error percentage = 0.0 latency= 42.8 ok= 102.45 req/min, errors= 0.0 req/min, blocks= 4.66 blocks/min
STATUS: hive-api.arcange.eu error percentage = 0.0 latency= 58.7 ok= 20.13 req/min, errors= 0.0 req/min, blocks= 0.66 blocks/min
STATUS: hive-api.3speak.tv error percentage = 0.0 latency= 65.13 ok= 22.57 req/min, errors= 0.0 req/min, blocks= 3.15 blocks/min
STATUS: api.hive.blog error percentage = 0.0 latency= 678.47 ok= 20.6 req/min, errors= 0.0 req/min, blocks= 2.49 blocks/min
STATUS: hived.emre.sh error percentage = 0.0 latency= 56.78 ok= 47.67 req/min, errors= 0.0 req/min, blocks= 4.3 blocks/min
STATUS: anyx.io error percentage = 0.0 latency= 90.86 ok= 19.4 req/min, errors= 0.0 req/min, blocks= 0.0 blocks/min
STATUS: rpc.ausbit.dev error percentage = 0.0 latency= 30.52 ok= 34.71 req/min, errors= 0.33 req/min, blocks= 6.94 blocks/min
STATUS: rpc.mahdiyari.info error percentage = 0.0 latency= 49.2 ok= 21.98 req/min, errors= 0.0 req/min, blocks= 2.47 blocks/min
STATUS: hive.roelandp.nl error percentage = 33.92 latency= 323.85 ok= 17.37 req/min, errors= 5.79 req/min, blocks= 0.33 blocks/min
STATUS: techcoderx.com error percentage = 0.0 latency= 306.67 ok= 20.64 req/min, errors= 0.0 req/min, blocks= 3.13 blocks/min
</code></pre> <p dir="auto">Let's display this in a more web friendly way. <div class="table-responsive"><table>
<thead>
<tr><th>node<th>error %<th>latency (ms)<th>ok/min<th>errors/min<th>blocks/min
<tbody>
<tr><td>api.openhive.network<td>0.03<td>20.51<td>77.64<td>0.16<td>2.99
<tr><td>api.deathwing.me<td>0.0<td>42.8<td>102.45<td>0.0<td>4.66
<tr><td>hive-api.arcange.eu<td>0.0<td>58.7<td>20.13<td>0.0<td>0.66
<tr><td>hive-api.3speak.tv<td>0.0<td>65.13<td>22.57<td>0.0<td>3.15
<tr><td>api.hive.blog<td>0.0<td>678.47<td>20.6<td>0.0<td>2.49
<tr><td>hived.emre.sh<td>0.0<td>56.78<td>47.67<td>0.0<td>4.3
<tr><td>anyx.io<td>0.0<td>90.86<td>19.4<td>0.0<td>0.0
<tr><td>rpc.ausbit.dev<td>0.0<td>30.52<td>34.71<td>0.33<td>6.94
<tr><td>rpc.mahdiyari.info<td>0.0<td>49.2<td>21.98<td>0.0<td>2.47
<tr><td>hive.roelandp.nl<td>33.92<td>323.85<td>17.37<td>5.79<td>0.33
<tr><td>techcoderx.com<td>0.0<td>306.67<td>20.64<td>0.0<td>3.13
<tr><td>api.hive.blue<td>77.15<td>38.37<td>0.0<td>0.0<td>0.0
<tr><td>hived.privex.io<td>100.0<td>24.52<td>0.0<td>0.0<td>0.0
</table></div> <p dir="auto">We see a few nodes with bad reliability at the moment I ran this code. We see latency ranging from a mere 20 msec up to almost 700 msec. Further, we see a quite decent division of the query load between the nodes: not perfect, but more than good enough. The blocks/min rate is interesting too. <h1>Persistence <p dir="auto">A bot or DApp backend shouldn't need to restart often, but crashes, system maintenance, and other foreseen or unforeseen restarts happen, and for many applications you don't want to lose blocks due to downtime.
To accommodate this, the BaseBot constructor has an optional argument that defines the last block successfully processed by the bot. <p dir="auto">There is also a special method <strong>block_processed</strong> that (if it exists) gets called after a block has been completely processed. <p dir="auto">The combination of these two allows a bot or backend to maintain its streaming state between runs. <pre><code>class MyBot(BaseBot):
    """Example of an aiohivebot python bot without real utility"""

    def __init__(self):
        data = {"block": 1}
        try:
            with open("persistent.json", encoding="utf-8") as persistent:
                data = json.load(persistent)
        except FileNotFoundError:
            data = {"block": None}
        super().__init__(start_block=data["block"])

    async def block_processed(self, blockno):
        print(blockno)
        data = {"block": blockno}
        with open("persistent.json", "w", encoding="utf-8") as persistent:
            json.dump(data, persistent, indent=2)
</code></pre> <p dir="auto">The first time this script runs, it will start at the latest block <em>now</em>. <pre><code>79641551
79641552
79641553
79641554
79641555
79641556
79641557
79641558
79641559
79641560
</code></pre> <p dir="auto">We stop the script and start it again a few minutes later. <pre><code>79641561
79641562
79641563
79641564
79641565
79641566
79641567
79641568
79641569
79641570
79641571
79641572
</code></pre> <p dir="auto">You can't see it from the static output, but when I ran the code the second time, it sped through the first two-thirds of the blocks because it started with the last unprocessed block and continued at full speed until it reached the most recent block again. <h1>Custom json hooks <p dir="auto">We already saw the standard operation level methods we can define for our bot or backend. There is one special and often seen operation that needs some extra consideration, and that is <strong>custom_json_operation</strong>.
<p dir="auto">A custom_json_operation is a layer-2 operation with an L2-defined <strong>id</strong> field.<br /> As we see with the PodPing example (pp_podcast_update) and the SplinterLands example (sm_sell_card), this id can be used directly with the prefix <strong>l2_</strong>. <p dir="auto">The aiohivebot lib currently has minimal support for two special ids: <ul>
<li>notify
<li>ssc_mainnet_hive (Hive-Engine)
</ul> <p dir="auto">The <em>notify</em> id is used for setLastRead operations, while Hive-Engine ssc_mainnet_hive operations are used for actions on contracts. You can still use l2_notify and l2_ssc_mainnet_hive if you like, but the below code shows you can use the prefixes <em>notify_</em> and <em>engine_</em> to get some basic pre-processing done by BaseBot. <pre><code>class MyBot(BaseBot):
    """Example of an aiohivebot python bot without real utility"""

    def __init__(self):
        super().__init__()
        self.count = 0

    async def engine_market_buy(self, required_auths, required_posting_auths, body):
        """Hive Engine custom json action for market buy"""
        print("Hive-engine market buy", body,
              required_posting_auths + required_auths)

    async def engine_market_sell(self, required_auths, required_posting_auths, body):
        """Hive Engine custom json action for market sell"""
        print("Hive-engine market sell", body,
              required_posting_auths + required_auths)

    async def engine_market_cancel(self, required_auths, required_posting_auths, body):
        """Hive Engine custom json action for market cancel"""
        print("Hive-engine market cancel", body,
              required_posting_auths + required_auths)

    async def l2_sm_sell_card(self, required_auths, required_posting_auths, body):
        print("sm_sell_card", body, required_posting_auths + required_auths)

    async def l2_pp_podcast_update(self, required_auths, required_posting_auths, body):
        if "iris" in body:
            print("pp_podcast_update", body["iris"],
                  required_posting_auths + required_auths)

    async def notify_setLastRead(self, required_auths, required_posting_auths, body):
        print("notify setLastRead", body,
              required_posting_auths + required_auths)
</code></pre> <p dir="auto">If we run this code we get the following output. <pre><code>Hive-engine market cancel {'type': 'sell', 'id': '491e2a518c5e34ca2fe75f934ec20de2bc771368'} ['dtake']
pp_podcast_update ['https://media.rss.com/ilestecrit-balados/feed.xml'] ['podping.bbb']
Hive-engine market cancel {'type': 'buy', 'id': '3ab75469a6779f961acea283673c6c82b12a5315-2'} ['hivemaker']
Hive-engine market cancel {'type': 'buy', 'id': '3ab75469a6779f961acea283673c6c82b12a5315-3'} ['hivemaker']
Hive-engine market cancel {'type': 'sell', 'id': '3ab75469a6779f961acea283673c6c82b12a5315-5'} ['hivemaker']
pp_podcast_update ['https://feeds.buzzsprout.com/2200169.rss', 'https://feeds.buzzsprout.com/2212120.rss', 'https://feeds.transistor.fm/why-not-me', 'https://jogabilida.de/category/podcasts/podcast-naogames/jack/feed/podcast/'] ['podping.ccc']
notify setLastRead {'date': '2023-10-27T17:02:36'} ['cool08']
Hive-engine market cancel {'type': 'buy', 'id': 'b3197ab1adaf89967726d756600aaa3021a5c77d-2'} ['ricksens85']
Hive-engine market cancel {'type': 'buy', 'id': 'b3197ab1adaf89967726d756600aaa3021a5c77d-4'} ['ricksens85']
Hive-engine market cancel {'type': 'sell', 'id': 'b3197ab1adaf89967726d756600aaa3021a5c77d-6'} ['ricksens85']
Hive-engine market cancel {'type': 'sell', 'id': 'b3197ab1adaf89967726d756600aaa3021a5c77d-7'} ['ricksens85']
Hive-engine market buy {'symbol': 'SWAP.BTC', 'quantity': '0.00279993', 'price': '102769.27037365'} ['solovey6o2']
Hive-engine market buy {'symbol': 'VOUCHER', 'quantity': '748.945', 'price': '0.0942189'} ['solovey6o2']
Hive-engine market sell {'symbol': 'SWAP.ETH', 'quantity': '0.00729680', 'price': '5435.29216758'} ['solovey6o2']
Hive-engine market sell {'symbol': 'VOUCHER', 'quantity': '160.722', 'price': '0.09995838'} ['solovey6o2']
Hive-engine market sell {'symbol': 'CHAOS', 'quantity': '22', 'price': '2.50999776'} ['scr00ge']
pp_podcast_update ['https://jogabilida.de/feed/podcast/'] ['podping.bbb']
pp_podcast_update ['https://feeds.buzzsprout.com/2168018.rss'] ['podping.ccc']
Hive-engine market buy {'symbol': 'SPS', 'quantity': '8095.73689888', 'price': '0.04170037'} ['barmus81']
Hive-engine market buy {'symbol': 'SWAP.BTC', 'quantity': '0.00176493', 'price': '102769.27037365'} ['barmus81']
Hive-engine market sell {'symbol': 'SPS', 'quantity': '7527.95490463', 'price': '0.04349998'} ['barmus81']
Hive-engine market sell {'symbol': 'SWAP.ETH', 'quantity': '0.03290901', 'price': '5858.09807359'} ['barmus81']
Hive-engine market sell {'symbol': 'VOUCHER', 'quantity': '638.065', 'price': '0.09995836'} ['barmus81']
pp_podcast_update ['https://feeds.buzzsprout.com/1885314.rss', 'https://feeds.buzzsprout.com/885151.rss'] ['podping.aaa']
Hive-engine market buy {'symbol': 'SWAP.USDT', 'quantity': '15.911194', 'price': '3.11094974'} ['all.coin.hive']
Hive-engine market buy {'symbol': 'SWAP.BNB', 'quantity': '0.07178346', 'price': '695.88833874'} ['all.coin.hive']
Hive-engine market buy {'symbol': 'SWAP.HBD', 'quantity': '95.12527011', 'price': '3.1151696'} ['all.coin.hive']
</code></pre> <h1>Low level hooks <p dir="auto">In most normal operation, you will only use the persistence and operation level methods in your code, but in rare cases you might desire lower level hooks for blocks, transactions, and wildcard operations rather than specific ones. <p dir="auto">The below example shows how to define a method at these three levels. <pre><code>class MyBot(BaseBot):
    """Example of an aiohivebot python bot without real utility"""

    async def block(self, block, blockno):
        """Handler for block level data"""
        print("block", blockno, "witness =", block["witness"])

    async def transaction(self, tid, transaction, block):
        """Handler for transaction level data"""
        print("- transaction", tid)

    async def operation(self, operation, tid, transaction, block):
        """Wildcard handler for operation level data"""
        print("  +", operation["type"])
</code></pre> <p dir="auto">The result from shortly running this code.
<pre><code>block 79641866 witness = themarkymark
- transaction 993a4b561652b87e93701403b0d4eacc66f01e78
  + custom_json_operation
- transaction 457a0555fad3bec4d7945abe02ca062240ac5dfe
  + custom_json_operation
- transaction 1897ca711b32ab8b3deda347db60898ea3914838
  + custom_json_operation
- transaction 8a00475a54b1b5ab0918a4cb903cda39ead2fc91
  + custom_json_operation
- transaction d77922160990a75e611292acf03c63d01e7c07cb
  + custom_json_operation
- transaction a8d6789b5821852d0307726ebdc326b4ee91f629
  + custom_json_operation
- transaction ce1ad0ef49d7763c1a9bca87c004d4579cf5ecc5
  + custom_json_operation
- transaction 25c2b0e3b6558b2310685eccbb86ec175068cf9e
  + custom_json_operation
- transaction c6cb3259478da5fd2e4d599e0d56c1505737e7fc
  + custom_json_operation
- transaction a57b660faad5cea60d35074cecf181735b66b4e2
  + claim_reward_balance_operation
- transaction 022173d1d47358be4b0e72423cb8765a8fa3e243
  + custom_json_operation
- transaction 0d82a754e7bc2847d399ad93f69b139ad074dffc
  + custom_json_operation
- transaction ece087593f36970aa100282e3a231bf6626730a8
  + custom_json_operation
- transaction 5754bb041f7c6e87e2b0f4b0f025ef34f2d73e9a
  + custom_json_operation
block 79641867 witness = gtg
- transaction 349ad34bd5bd01593441ec9efab6f5b659712413
  + custom_json_operation
- transaction f6deed8c6a98b2c0910a3520ebf6fc1e44ea48c7
  + custom_json_operation
- transaction 74514ed3406e157f0a59ad098fc31170b76e8a70
  + custom_json_operation
  + custom_json_operation
- transaction 51a58664202d92e8ad201e77316e6416aa387f7f
  + custom_json_operation
  + custom_json_operation
- transaction 9ec361db9005236989033198027f56d3d855bdc9
  + comment_operation
  + comment_options_operation
- transaction d3deb7ad7ea2b4a166b8037637e623f6a977ae64
  + custom_json_operation
- transaction 66ef443164df9a4c8693db4cb5e7ff21ffd15532
  + vote_operation
- transaction e5e611d6300ec2be6fa1c378b387425864bf6697
  + vote_operation
- transaction 6ea47c6da95306d6ef5eaad95704819c3e80371d
  + vote_operation
- transaction 834522e5608d45712e34ceac1b4ff2003b215977
  + vote_operation
- transaction 8d9dcab5438e26c8e7c6196f58dee0e191611125
  + vote_operation
- transaction fbedaa09b21efdad0d4f1e36db844acd5abc4f8e
  + vote_operation
- transaction f57bfb4be9eff7f4beadfc363aecd816c438b56e
  + vote_operation
- transaction 36a935fce466f4633339e5a61d6d1ee44975ab2c
  + custom_json_operation
- transaction 2f966db4c04fffc02c30464f86a71b8c03cbc81c
  + custom_json_operation
- transaction 34f962d4fb4127de1faf180001f7dd12c85d7506
  + custom_json_operation
- transaction 73361bd30fdb2124d734b379db2c0a5f6e061b22
  + claim_reward_balance_operation
- transaction c67ad1f2ee92643cd90d1e42fce397a0f2214069
  + custom_json_operation
- transaction 5631729bfad57a82ed5f32ca42e937292a973350
  + custom_json_operation
- transaction c35ecfb0515ddfbca5d9390cc1ef97e88a4c8688
  + custom_json_operation
- transaction 1f223adc4f883f963d91e8bbebf5ae0623ee5ff5
  + custom_json_operation
- transaction 30dcd4c14b4949dd7ab304d9516f01cc7d529d97
  + vote_operation
- transaction e1e789d2cc2ea0545afacc4f30d0189391231cde
  + comment_operation
  + comment_options_operation
- transaction 4eb417cb263d12ecfe42e48af0d4973b92215ea2
  + custom_json_operation
</code></pre> <h1>Coming up <p dir="auto">We've already got quite a lot running, but it's important to note that we aren't quite ready to push our lib to pypi yet. A few things need to be taken care of to make the whole code trustworthy enough to run production projects with. We are close, but not quite there yet. <p dir="auto">After that, there are two important extra features we will need to implement in order to cover a wider range of use cases, including the Hive Archeology bot (the prime reason I have for making this library), and to be ready for future sub-APIs. <h3>Robust JSON-RPC client code <p dir="auto">Right now the chain event streaming seems to be quite robust already, but the same cannot yet be said for the JSON-RPC code. For one, programming errors and server failures aren't quite separated yet.
Something that makes development less than ideal right now. I will be looking into this next. <h3>Client side method/param checks <p dir="auto">Part of the above issue stems from the fact that the published API method fingerprints aren't used yet, for one because the scan possibly provides only partial info. I need to look into how to properly integrate the available method fingerprints in such a way that invalid calls can already fail on the client side. <h3>Push code to pypi <p dir="auto">Once we have the above two issues fixed, I'll push a first version to pypi. I won't spend a long post on that, but I'll post a short note stating it's available. <h3>Broadcast operations <p dir="auto">This is a big one. Without it, over half of all possible use cases won't be possible, but as a large part will be, I'm pushing a first version before this feature. The library needs to be extended to allow for signed <a href="https://developers.hive.io/apidefinitions/#jsonrpc.get_methods" target="_blank" rel="noreferrer noopener" title="This link will take you away from hive.blog" class="external_link">broadcast operations</a>. <h3>coinZdense extended broadcast and account update operations <p dir="auto">Apart from the Hive Archeology bot, this library is also meant as a testing ground for my coinZdense project. I'm planning to add some extra hooks to the library to allow for coinZdense concepts to be added to user metadata and custom_json or custom_binary, in order to shadow-run hash-based signatures on regular broadcast ops. I'll write more about this when regular broadcast ops are up and running. <h3>REST support <p dir="auto">This one I need to dive into a bit deeper. I had no idea that anything with REST was happening with HIVE, but it's essential that the library supports new REST sub-APIs as soon as possible.
<h1>Available for projects <p dir="auto">If you think my skills and knowledge could be useful for your project, I am currently available for contract work for up to 20 hours a week. My hourly rate depends on the type of activity (Python dev, C++ dev or data analysis), whether the project at hand will be open source or not, and whether you want to sponsor my pet project coinZdense, which aims to create a multi-language programming library for post-quantum signing and least-authority subkey management. <div class="table-responsive"><table>
<thead>
<tr><th>Activity<th>Hourly rate<th>Open source discount<th>Minimal hours<th>Maximum hours
<tbody>
<tr><td>C++ development<td>150 $HBD<td>30 $HBD<td>4<td>-
<tr><td>Python development<td>140 $HBD<td>30 $HBD<td>4<td>-
<tr><td>Data analysis (python/pandas)<td>120 $HBD<td>-<td>2<td>-
<tr><td>Sponsored coinZdense work<td>50 $HBD<td>-<td>0<td>-
<tr><td>Paired up coinZdense work<td>25 $HBD<td>-<td>1<td>2x contract h
</table></div> <p dir="auto">Development work on open-source projects gets a 30 $HBD discount on my hourly rates. <p dir="auto">Next to contract work, you can also become a sponsor of my coinZdense project.<br /> Note that if you pair up to two coinZdense sponsor hours with a contract hour, you can sponsor twice the amount of hours to the coinZdense project. <p dir="auto">If you wish to pay for my services or sponsor my project with coins other than $HBD, all rates are slightly higher (same rates, but in Euro or euro-equivalent value at transaction time). I welcome payments in Euro (through PayPal), $HIVE, $QRL, $ZEC, $LTC, $DOGE, $BCH, $ETH or $BTC/lightning. <p dir="auto">Contact: coin<at>z-den.se