
Derive logprob for Split operation #7875


Open: ricardoV94 wants to merge 1 commit into main
Conversation

ricardoV94 (Member) commented on Jul 27, 2025:


📚 Documentation preview 📚: https://pymc--7875.org.readthedocs.build/en/7875/

ricardoV94 (Member, Author) commented:
Classic mypy failure; feel free to review ignoring that, I'll fix it.

codecov bot commented on Jul 27, 2025:

Codecov Report

❌ Patch coverage is 93.54839% with 2 lines in your changes missing coverage. Please review.
✅ Project coverage is 92.94%. Comparing base (dc7cfee) to head (562c487).
⚠️ Report is 1 commit behind head on main.

Files with missing lines   Patch %   Lines
pymc/logprob/tensor.py     93.54%    2 Missing ⚠️
Additional details and impacted files


@@            Coverage Diff             @@
##             main    #7875      +/-   ##
==========================================
+ Coverage   88.25%   92.94%   +4.69%     
==========================================
  Files         116      116              
  Lines       18845    18875      +30     
==========================================
+ Hits        16631    17544     +913     
+ Misses       2214     1331     -883     
Files with missing lines   Coverage Δ
pymc/logprob/tensor.py     94.26% <93.54%> (-0.23%) ⬇️

# If the split axis is over a dimension that was reduced in the logp (multivariate logp),
# we cannot split the logp into distinct entries: the mapping between values and densities breaks.
# Instead we return the logp weighted by the split sizes. Is this as good a solution as any?
split_weights = splits / pt.sum(splits)
A reviewer (Member) asked:

Is this legit?

ricardoV94 (Member, Author) replied:

I think so? In MarginalMixture we decided to set the whole logp on the first entry and zero for the others; I like this approach more.
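
For context, a minimal sketch of the two conventions under discussion (illustrative only; the names below are not the PR's actual implementation):

import pytensor.tensor as pt

# Illustrative: given the joint logp of the pre-split variable and the split
# sizes, two ways to attribute that scalar to the split parts.
joint_logp = pt.scalar("joint_logp")
splits = pt.as_tensor([2, 1])

# This PR's approach: weight the joint logp by the relative split sizes.
split_weights = splits / pt.sum(splits)
weighted_logps = [joint_logp * split_weights[i] for i in range(2)]

# MarginalMixture's convention: whole logp on the first entry, zero on the rest.
first_entry_logps = [joint_logp, pt.zeros_like(joint_logp)]

Either way the parts sum back to the joint logp; the conventions differ only in how the density is attributed across parts.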

# axis=-2 (i.e., axis 0, the batch dimension)
x_parts = pt.split(x, splits_size=[2, 1], n_splits=2, axis=-2)
x_parts_vv = [x_part.clone() for x_part in x_parts]
logp_parts = list(conditional_logp(dict(zip(x_parts, x_parts_vv))).values())
A reviewer (Member) asked:

Do I understand correctly that each part is conditioned on the values of all other parts?

Thinking about e.g. the MVN case: if you split the vector and condition each split on the other, you get two new MVN distributions.

ricardoV94 (Member, Author) replied:

There's no marginalization going on; you can't evaluate the logp of only one part without providing the remaining ones. The only thing we do is join the values, get the logp, and split it again. We could argue that we shouldn't do this for multivariate variables split along the core dimension, since there's no way to split the logp (I did the weighting, but we can revert and raise NotImplementedError).
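
A minimal sketch of that behavior, assuming this PR's Split logprob derivation is available; conditional_logp and pt.split are the actual APIs, while the toy normal variable and split sizes are illustrative:

import numpy as np
import pytensor
import pytensor.tensor as pt
from pymc.logprob.basic import conditional_logp

# A batch of 3 iid standard normals, split along the batch axis into sizes 2 and 1.
x = pt.random.normal(0, 1, size=(3,))
x_parts = pt.split(x, splits_size=[2, 1], n_splits=2, axis=0)
x_parts_vv = [x_part.clone() for x_part in x_parts]
logp_parts = list(conditional_logp(dict(zip(x_parts, x_parts_vv))).values())

# The derivation joins the values, computes the joint logp, and splits it again,
# so evaluating any part's logp requires values for every part.
fn = pytensor.function(x_parts_vv, logp_parts)
logp_a, logp_b = fn(np.zeros(2), np.zeros(1))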
