Conversation

@Mikolaj-A-Kowalski Mikolaj-A-Kowalski commented Jun 16, 2025

Builds on top of #34

Related to #22

Adapts the performance regression inspection script from pyrealm. Some modifications were necessary to make it run with DEMENTpy, but with those in place the script runs and is able to generate the report.

No documentation is provided yet. To run it, one just needs to follow the instructions from the pyrealm docs.

At the moment its usefulness seems to be limited, I'm afraid. In tests on my machine the profile entries show a really high variance, in the range of 15%, so two profiling runs of identical code produce disagreements under the default threshold value of 5%. Below is an example result of comparing two consecutive runs without any code modification:

[Plot: comparison of profile entries from two consecutive runs of identical code]

At the moment I am not sure where the variance comes from. Need to double-check that the runs are seeded correctly and are reproducible. (EDIT: They are.)
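For reference, a minimal sketch of the kind of check the adapted script performs: compare per-function cumulative times from two cProfile dumps and flag anything that slowed down by more than a relative tolerance. This is not the pyrealm script itself; the file names `baseline.prof` / `current.prof` and the `compare_profiles` helper are illustrative assumptions. With ~15% run-to-run noise, a 5% tolerance like the one below will flag false regressions.

```python
import pstats

def compare_profiles(baseline_path, current_path, tolerance=0.05):
    """Return functions whose cumulative time grew by more than `tolerance`."""
    # pstats.Stats.stats maps (filename, lineno, funcname) ->
    # (call count, primitive calls, total time, cumulative time, callers)
    base = pstats.Stats(baseline_path).stats
    curr = pstats.Stats(current_path).stats

    regressions = []
    for key, (_, _, _, base_ct, _) in base.items():
        if key not in curr or base_ct == 0:
            continue
        curr_ct = curr[key][3]  # cumulative time in the new run
        rel_change = (curr_ct - base_ct) / base_ct
        if rel_change > tolerance:
            regressions.append((key, rel_change))
    return sorted(regressions, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    for (filename, lineno, funcname), change in compare_profiles(
        "baseline.prof", "current.prof"
    ):
        print(f"{funcname} ({filename}:{lineno}): +{change:.1%}")
```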

sjavis and others added 4 commits June 6, 2025 18:07
Adapted from pyrealm. Allows comparing two profiled runs of the code
and highlights which functions' performance dropped below a threshold.

Note that at the moment DEMENTpy exhibits high variance in the runtime
of individual functions (observed up to 15%), so the check with default
settings (5% tolerance) will likely fail even for identical code versions.