
RE: Hive-Engine Node Benchmark Report - 2025-04-18


I'll try to answer the questions in order.

  • Frankfurt, Germany - DigitalOcean Droplet
  • It should state that. Since it's variable, the wording ended up vague and I'll clarify it, but the standard tests use a 30 second timeout / time limit.
  • I was originally running it on my dev machine. That was the first run from the server, so it should populate over the week. If it's still not right by this time next week, I'll adjust as needed, but you are probably right; I wasn't sure exactly how I wanted the trends section to be, so if any part needs it badly, that's the one that will get refactored.
  • Latency is the average time of 5 samples of:
import time

# `api` is assumed to be an already-constructed Hive-Engine API client
start_time = time.time()
# Make a simple query to measure latency - use a lightweight call
api.find("tokens", "tokens", {"symbol": "SWAP.HIVE"}, limit=1)
latency = time.time() - start_time
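
For what it's worth, a minimal sketch of how the 5-sample averaging could look (`NUM_SAMPLES` and `measure_latency` are illustrative names, not the actual benchmark code):

import time

NUM_SAMPLES = 5  # the benchmark averages 5 samples

def measure_latency(api):
    # Time a single lightweight Hive-Engine query
    start_time = time.time()
    api.find("tokens", "tokens", {"symbol": "SWAP.HIVE"}, limit=1)
    return time.time() - start_time

# `api` is assumed to be an already-constructed Hive-Engine API client
latency = sum(measure_latency(api) for _ in range(NUM_SAMPLES)) / NUM_SAMPLES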

I would love some more input; I kind of went into this blindly. The original intent was to do with @flowerengine what we did with @nectarflower, which is to run a benchmark every hour and store the results in the account metadata.

curl -s --data '{"jsonrpc":"2.0", "method":"database_api.find_accounts", "params": {"accounts":["flowerengine"]}, "id":1}' https://api.hive.blog | jq '.result.accounts[0].json_metadata | fromjson' | jq '.nodes[]'
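
For anyone who prefers Python over curl/jq, here is a rough equivalent of that pipeline (assuming the `requests` library; variable names are just illustrative):

import json
import requests

# Fetch @flowerengine's account metadata from a public Hive API node
payload = {
    "jsonrpc": "2.0",
    "method": "database_api.find_accounts",
    "params": {"accounts": ["flowerengine"]},
    "id": 1,
}
response = requests.post("https://api.hive.blog", json=payload, timeout=30)
account = response.json()["result"]["accounts"][0]

# json_metadata is stored as a JSON string, so parse it and list the nodes
metadata = json.loads(account["json_metadata"])
for node in metadata["nodes"]:
    print(node)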

@nectarflower runs on the :00 minute mark, and @flowerengine runs on the :30 minute mark, to avoid a clash. Both benchmarks take ~10 minutes to run through all the servers.
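
If it helps to picture the scheduling, it is roughly equivalent to a pair of crontab entries like these (the script paths are hypothetical; the post doesn't say how the jobs are actually launched):

# hypothetical crontab entries: @nectarflower at the top of each hour, @flowerengine at half past
0  * * * *  python3 /path/to/nectarflower_benchmark.py
30 * * * *  python3 /path/to/flowerengine_benchmark.py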


I would absolutely be thrilled to get a PR if you see something that could be done better. Also, I am aware the head section of this post was printed twice; that should be fixed on the next run. I don't know how I didn't catch that the first 20 times I ran it.


Thanks for the answers. I would love to do more PRs, but lately even time to reply is a luxury. I will keep watching this and provide feedback, and if I find anyone keen on helping out, I will point them in your direction.

I also noticed you used Python 3.13 (which is very recent). It would be nice to support older versions, but it's not a problem if not; in a few months, Python 3.13 will be available everywhere anyhow.

Just a thought.
