I'm glad it is working for you as intended!
I would recommend processing no more than about 1000 records in a batch; if you expect to need more than that in a single batch, let me know.
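To illustrate, here is a rough sketch (in TypeScript) of how a caller could split a larger list into batches of at most 1000 entries before submitting them. The record type and `processBatch` are placeholders, not this tool's actual API:

```typescript
// Hypothetical sketch: split a larger list into chunks of at most 1000
// records so no single batch exceeds the suggested limit.
const MAX_BATCH_SIZE = 1000;

function chunkRecords<T>(records: T[], size: number = MAX_BATCH_SIZE): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < records.length; i += size) {
    batches.push(records.slice(i, i + size));
  }
  return batches;
}

// Usage (processBatch is a placeholder for the actual submission call):
// for (const batch of chunkRecords(allRecords)) {
//   await processBatch(batch);
// }
```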
The memo can contain more than a few words. There is no limit in this system, but there is likely a limit within Hive-Engine/TribalDex. Unless your memo is longer than about 150 characters, I wouldn't expect any limit to come into play on either end.
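If you want to guard against that on your side, a minimal pre-check could look like the sketch below. The 150-character figure simply mirrors the rough threshold mentioned above; it is not a documented Hive-Engine limit:

```typescript
// Hedged sketch: warn before sending if a memo exceeds an assumed
// 150-character ceiling. The limit value is an assumption, not a
// documented Hive-Engine/TribalDex constant.
const ASSUMED_MEMO_LIMIT = 150;

function memoLooksSafe(memo: string): boolean {
  return memo.length <= ASSUMED_MEMO_LIMIT;
}
```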
I have updated the get_hive_voter_details webpage to display the values in HIVE instead of HBD. I know you asked for HIVE alongside HBD, but doing it this way prevents any unwanted errors in your spreadsheet from having an additional column. If you do want me to add another column and bring back HBD, let me know.
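For context, one common way to derive a HIVE figure from an HBD one is to divide by the median price from the blockchain's feed history; the sketch below shows that approach. It is an assumption about how a page like this might do the conversion, not its confirmed logic, and the node URL is just an example:

```typescript
// Hedged sketch: convert an HBD amount to HIVE using the median feed price
// (condenser_api.get_feed_history). Assumed approach, not the page's code.
interface FeedPrice {
  base: string;  // e.g. "0.250 HBD"
  quote: string; // e.g. "1.000 HIVE"
}

async function hbdToHive(hbdAmount: number, apiNode = 'https://api.hive.blog'): Promise<number> {
  const response = await fetch(apiNode, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      jsonrpc: '2.0',
      method: 'condenser_api.get_feed_history',
      params: [],
      id: 1,
    }),
  });
  const { result } = await response.json();
  const price: FeedPrice = result.current_median_history;
  const hbdPerHive = parseFloat(price.base) / parseFloat(price.quote);
  return hbdAmount / hbdPerHive; // HIVE = HBD divided by the HBD-per-HIVE price
}
```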
One other note: to account for potentially larger batches, I have added a section to the webpage which will appear above the results section at the bottom if any specific batch entries fail to process. This is intended to make it easier to find the entries that need to be re-run.
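Conceptually it works like the sketch below: failures are collected during the run and surfaced separately so they can be re-submitted. The `Entry` shape and `sendEntry` callback are illustrative placeholders rather than the webpage's actual code:

```typescript
// Minimal sketch of collecting failed entries during a batch run so they
// can be listed separately for re-running. Names are placeholders.
interface Entry {
  account: string;
  amount: string;
}

async function runBatch(
  entries: Entry[],
  sendEntry: (e: Entry) => Promise<void>,
): Promise<{ succeeded: Entry[]; failed: Entry[] }> {
  const succeeded: Entry[] = [];
  const failed: Entry[] = [];
  for (const entry of entries) {
    try {
      await sendEntry(entry);
      succeeded.push(entry);
    } catch {
      failed.push(entry); // shown in the "failed entries" section for re-running
    }
  }
  return { succeeded, failed };
}
```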
Thanks for the prompt action and response. 1000 records in a batch should be more than enough.
As for the update to get_hive_voter_details, yes, please bring back the HBD column and list both HIVE and HBD. I will make sure I pick up the HIVE value.
Okay, I have updated the webpage and brought back HBD, so the CSV now has 12 columns in total: the 3 for HBD plus an additional 3 for HIVE (each set of 3 consisting of the value with the currency name, the value as a number only, and the formula used to determine the value).
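As a rough illustration of that 3 + 3 value-column layout, the sketch below builds one row's worth of those six columns. The field names and `toCsvRow` helper are assumptions for illustration, and the remaining columns of the CSV are omitted:

```typescript
// Illustrative sketch of the paired HBD/HIVE value columns: for each
// currency, a labelled value, a bare number, and the formula text.
// Field names are assumptions, not the real CSV schema.
interface VoterValueColumns {
  hbdLabelled: string;   // e.g. "1.234 HBD"
  hbdNumber: number;     // e.g. 1.234
  hbdFormula: string;    // formula used to determine the HBD value
  hiveLabelled: string;  // e.g. "3.456 HIVE"
  hiveNumber: number;    // e.g. 3.456
  hiveFormula: string;   // formula used to determine the HIVE value
}

function toCsvRow(v: VoterValueColumns): string {
  return [
    v.hbdLabelled,
    v.hbdNumber.toString(),
    `"${v.hbdFormula}"`,   // quote formulas in case they contain commas
    v.hiveLabelled,
    v.hiveNumber.toString(),
    `"${v.hiveFormula}"`,
  ].join(',');
}
```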