
Re: performance tests

From: Stefan Eissing via curl-library <curl-library_at_lists.haxx.se>
Date: Fri, 10 Feb 2023 10:02:59 +0100

> Am 09.02.2023 um 23:37 schrieb Daniel Stenberg via curl-library <curl-library_at_lists.haxx.se>:
>
> On Thu, 9 Feb 2023, Fabian Keil via curl-library wrote:
>
>> In the meantime I'll keep an eye on the curl commits to see if anyone beats me to it ...
>
> Here's some brainstorming.
>
> We could start out thinking about how to make it work. We need to regularly run a specific set of tasks, get timing numbers and store those numbers somewhere, probably together with an identifier saying which host ran the test (so that we can use several hosts and still compare the right sets). We also need to store the exact curl git version and probably the curl -V output or something.
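Roughly, the per-run metadata capture could look like this (a sketch; all paths and names here are made up):

  #!/bin/sh
  # Hypothetical per-run metadata capture; the layout is a placeholder
  rev=$(git -C curl rev-parse --short HEAD)
  dir="data/$(date +%Y-%m-%d)-$rev"
  mkdir -p "$dir"
  hostname > "$dir/host.txt"               # which machine ran the test
  ./curl/src/curl -V > "$dir/curl-V.txt"   # exact build and feature set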
>
> As for what to run, we could make it simple and start with the command line from https://github.com/curl/curl/issues/10389, which times how fast curl gets the first byte back from a remote site. Using an external remote site seems fragile, and using localhost makes for a sadder test.
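For reference, a time-to-first-byte measurement could look roughly like this (not necessarily the exact command from the issue):

  # -o /dev/null discards the body, -w prints the selected timings
  curl -s -o /dev/null \
       -w 'ttfb=%{time_starttransfer} connect=%{time_connect} total=%{time_total}\n' \
       https://curl.se/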
>
> Another test could be to download a huge file from localhost.
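A sketch of such a localhost test; python3 -m http.server is just a stand-in here, any local server would do:

  dd if=/dev/zero of=big.bin bs=1M count=1024   # create a 1 GiB test file
  python3 -m http.server 8080 &                 # serve the current directory
  server=$!
  sleep 1                                       # give the server a moment
  curl -s -o /dev/null \
       -w 'speed=%{speed_download} total=%{time_total}\n' \
       http://localhost:8080/big.bin
  kill $server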
>
> We could start with something basic to get the thing going, then add more later on.
>
> Running the tests, I presume we can make them run N times and, if all results are within a certain margin (M%), consider them stable and pick... the median result?
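In shell, that N-runs idea could be sketched like this, reusing the localhost download from above and assuming each run prints one number:

  N=9
  for i in $(seq $N); do
    curl -s -o /dev/null -w '%{time_total}\n' http://localhost:8080/big.bin
  done | sort -n | awk '
    {a[NR]=$1}
    END {
      median = a[int((NR+1)/2)]
      spread = (a[NR]-a[1])/a[1]*100   # percent spread, fastest to slowest
      printf "median=%s spread=%.1f%%\n", median, spread
    }'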
>
> How and where do we store the numbers? We need to be able to get them back easily to analyze trends, make graphs and understand the deltas between specific changes. We probably need to run a database somewhere for this.
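If a full database server feels like too much to start with, a single sqlite3 file might do (hypothetical schema):

  sqlite3 perf.db 'CREATE TABLE IF NOT EXISTS runs (
    date TEXT, host TEXT, rev TEXT, test TEXT, metric TEXT, value REAL);'
  sqlite3 perf.db "INSERT INTO runs VALUES
    ('2023-02-10', 'myhost', 'abcdef0', 'curl.se', 'time_total', 0.123);"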

Let me brainstorm a little myself.

Have a cron job run nightly on your machine.[1]
Build a set of configurations from curl master.
Run a set of curl command lines.
Collect the output from "-w '%{json}\n'".
Store the results in sub-directories named YYYY-MM-DD-<curl master rev> (see the run script sketch below).
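The run script could be something like this (a sketch; URLs, paths and the cron line are placeholders):

  #!/bin/sh
  # run.sh - invoked from cron, e.g.: 0 3 * * * /path/to/curl-perf/run.sh
  rev=$(git -C curl rev-parse --short HEAD)
  dir="data/$(date +%Y-%m-%d)-$rev"
  mkdir -p "$dir"
  for url in https://google.com https://curl.se; do
    name=${url#https://}   # e.g. "https://curl.se" -> curl.se.json
    ./curl/src/curl -s -o /dev/null -w '%{json}\n' "$url" > "$dir/$name.json"
  done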

Evaluate all stored results to produce the "result views", e.g. charts (see the aggregation sketch below).
Always evaluate the full history, as the eval script may change over time (bugs in eval!).
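The aggregation could be sketched like this, assuming jq is available; it pulls one metric out of every stored run into a single series (time_appconnect as a stand-in for "handshake"):

  mkdir -p results
  for d in data/*/; do
    run=$(basename "$d")   # e.g. 2023-02-10-YYYY
    jq --arg run "$run" '{run: $run, handshake: .time_appconnect}' \
       "$d/curl.se.json"
  done | jq -s '.' > results/handshake.json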

In case of external abnormalities (a CDN outage, etc.),
delete the run data or just revert the curl-perf revision.

Something like:

curl-perf/nghttp2-openssl
  * build.sh
  * run.sh
  * data/
    * 2023-02-08-XXXX/
      * google.com.json, curl.se.json...
    * 2023-02-10-YYYY/
      * google.com.json, curl.se.json...
  * aggregate.sh
  * results/
    * handshake.json # always updated
    * downspeed.json # always updated


[1] It's not that your machine is the "best". But I assume it will be quite stable and available. Comparable runs seem key.

