
Conversation

@iluise (Collaborator) commented Oct 22, 2025

Description

  • Change the groupby logic in the score computation (fully lazy, applied before compute) so it is more general; see the sketch after this list.
  • Restore the existing probabilistic scores even in the case of groupby operations.
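
Below is a minimal sketch of the intended flow, assuming the per-sample scores live in a dask DataFrame; the function and column names (`grouped_scores`, `forecast_step`, `crps`, `spread`) are illustrative and not the actual FastEvaluation API.

```python
# Minimal sketch, assuming a dask.dataframe-backed score table.
# All names here are illustrative, not the actual FastEvaluation API.
import dask.dataframe as dd

def grouped_scores(scores: dd.DataFrame,
                   group_keys: list[str],
                   score_cols: list[str]) -> dd.DataFrame:
    """Aggregate per-sample scores over arbitrary group keys.

    Both the groupby and the aggregation stay lazy: they only extend the
    task graph, and the caller triggers a single .compute() at the end.
    """
    return scores.groupby(group_keys)[score_cols].mean()

# Example usage: one compute at the very end, after all groupby operations.
# result = grouped_scores(scores, ["forecast_step", "variable"],
#                         ["crps", "spread"]).compute()
```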

Issue Number

Closes #1123

Checklist before asking for review

  • I have performed a self-review of my code
  • My changes comply with basic sanity checks:
    • I have fixed formatting issues with ./scripts/actions.sh lint
    • I have run unit tests with ./scripts/actions.sh unit-test
    • I have documented my code and I have updated the docstrings.
    • I have added unit tests, if relevant
  • I have tried my changes with data and code:
    • I have run the integration tests with ./scripts/actions.sh integration-test
    • (bigger changes) I have run a full training (launch-slurm.py --time 60) and have written the run_id(s) in a comment
    • (bigger changes and experiments) I have shared a HedgeDoc in the GitHub issue with all the configurations and runs for these experiments
  • I have informed and aligned with people impacted by my change:
    • for config changes: the Mattermost channels and/or a design doc
    • for dependency changes: the Mattermost software development channel

@iluise iluise self-assigned this Oct 22, 2025
@iluise iluise added the eval label (anything related to the model evaluation pipeline) Oct 22, 2025
@iluise iluise changed the title [] [issue 1123] restore probabilistic scores Oct 22, 2025


Development

Successfully merging this pull request may close these issues.

restore probabilistic scores in FastEvaluation
