
RE: Free open source Splinterlands statistics tool v0.24.0

in Splinterlands · 4 months ago

Wow, excellent work @beaker007. I was looking at your repository, and one of the parts I liked the most was your spl.py file. It's all premium content, a great starting point for an API for any program that analyzes Splinterlands data. I'm tempted to create a program that analyzes battles in the bronze and silver leagues, which would give an idea of what new players entering Splinterlands must face. Since everything related to the new-player experience is a hot topic in Splinterlands right now, I think it should be shown what these players are going to face in their beginnings. It seems like an abuse to me that a bronze player has to face a level 5 to 8 summoner in that league.

A question about the different features in the API you've developed for Splinterlands: to manage the data, do you only use CSV files and pandas, or are you also using SQLite?
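For context, both storage options are easy to reach from pandas. A minimal sketch of the two round-trips, with made-up column names and file paths (nothing here comes from spl.py itself):

```python
import sqlite3

import pandas as pd

# Hypothetical battle records; the columns are illustrative only.
battles = pd.DataFrame({
    "battle_id": ["b1", "b2", "b3"],
    "league": ["bronze", "silver", "bronze"],
    "summoner_level": [5, 3, 8],
})

# CSV round-trip: simplest option, no database required.
battles.to_csv("battles.csv", index=False)
from_csv = pd.read_csv("battles.csv")

# SQLite round-trip: same data via pandas' built-in SQL support,
# with the bonus that filtering can happen in the query itself.
with sqlite3.connect("battles.db") as con:
    battles.to_sql("battles", con, if_exists="replace", index=False)
    from_db = pd.read_sql("SELECT * FROM battles WHERE league = 'bronze'", con)

print(len(from_csv))  # 3
print(len(from_db))   # 2
```

The CSV path keeps the tool dependency-free; the SQLite path pays off once you want to filter or join without loading whole files.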


Hi @quigua,

Thanks for your comment; indeed, it has grown into an extended API. I'm often tempted to turn it into a Python package.


@kalkulus is also busy with a spltools (Python) repo. Check out his latest post: https://peakd.com/hive-13323/@kalkulus/a-strategy-for-spending-glint-based-on-current-collection-progress-and-updates-to-spltools

Indeed, I currently only use CSV files; it still performs well enough, with 3 accounts monitored since 2023-04-30. But I should be moving towards SQL, or at least Parquet files.

When you start with your tool and have questions, let me know 👍

I understand that you have CSV files per user. How big are those one-year files? I downloaded the battle data from the last few days for all active users, and the CSV files are around 5 GB. If I plan to keep permanent statistics on all users, I have to optimize the storage a lot. That's why I asked about SQLite, but Parquet files may be a better solution.

I also wanted to try building it with MongoDB. I had a small experiment with it, and it's also nice for the way I handle the transactions.
The CSV method is just easy for others to use as well, without needing the knowledge to set up a database, etc.

For now my biggest file is 40 MB, not that much. I do not store all the data, just the data I need.
The most important field is the transaction id (or battle_id); that can always be used to migrate and extend the data if I turn out to be missing something in the future.
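That stable key makes later schema extensions a simple join: re-fetch only the new field and attach it to the stored rows on `battle_id`. A sketch with invented column names (not the tool's actual schema):

```python
import pandas as pd

# Previously stored data, keyed on battle_id.
stored = pd.DataFrame({
    "battle_id": ["b1", "b2", "b3"],
    "winner": ["alice", "bob", "alice"],
})

# Later you decide you also need the mana cap: fetch just that field
# per battle and merge it back via the stable battle_id key.
new_field = pd.DataFrame({
    "battle_id": ["b1", "b2", "b3"],
    "mana_cap": [24, 99, 13],
})

extended = stored.merge(new_field, on="battle_id", how="left")
print(list(extended.columns))  # ['battle_id', 'winner', 'mana_cap']
```

A left merge keeps every stored battle even if the re-fetch misses some ids; those rows just get a NaN in the new column.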
