
Conversation

PTWaade
Contributor

@PTWaade PTWaade commented Jul 19, 2025

This model uses @ccall to call C from within the model. It should work with only Enzyme.jl.
Note that I had to specify the platform-specific name of the C standard library outside the model (otherwise I got an error saying that @ccall doesn't accept local variables). I'm unsure whether that causes problems with the current setup of ADTests.jl; if so, let me know what I can do to help :)

@PTWaade
Contributor Author

PTWaade commented Jul 19, 2025

I added the MH meta-Bayesian model to this pull request; let me know if you'd like them split up.
I was also uncertain which category to give the meta-Bayesian model, so I made a new category for it (feel free to change that).

@PTWaade
Contributor Author

PTWaade commented Jul 21, 2025

I am uncertain why these errors happen; let me know if there is anything I can do.

And a different thing: Might I suggest making a category for interfacing with external packages? Then we can have models that call PyCall, Optimization.jl etc.
And I would suggest the meta-Bayesian model(s) go in that category then, where Turing itself is the "external" package interfaced with?

@penelopeysm
Member

Uhhh, I'm going to guess that the GHA failure is because it's from a fork.

@penelopeysm
Member

That's pretty annoying. There's no way to let GitHub Actions push to this repo if the code is from a fork, which in turn means there's no way to display the resulting webpage on GitHub Pages.

Implementing a proper workaround might take me longer than I'd like to spend on this right now, so for now I'm going to re-open this PR from a branch on this repo, so that the website preview can run.

@penelopeysm penelopeysm mentioned this pull request Jul 22, 2025
@penelopeysm
Member

a category for interfacing with external packages

And yes, that sounds good! I think that makes more sense to me than putting it under core Turing features or the equivalent.

Other things that could go in there (I think these are basically examples from the docs that can be copy-pasted) would be DifferentialEquations.jl, Flux.jl, and HiddenMarkovModels.jl.

@penelopeysm
Member

And finally, the code is running: unfortunately it seems that the model doesn't work on the GitHub Actions runner (https://github.com/TuringLang/ADTests/actions/runs/16431886947/job/46434857806?pr=36):

/lib/x86_64-linux-gnu/libc.so.6: undefined symbol: fabs

So everything shows as an error right now https://turinglang.org/ADTests/pr/.

Do you have any idea why?

@PTWaade
Contributor Author

PTWaade commented Jul 22, 2025

Thanks for opening the PR locally (annoying that forks don't work!). I don't know if there is some other way to facilitate me helping with this - let me know. (I'd be happy to add a few more models).

Hm, as for why it errors: apparently the C standard library has different names depending on the platform. I've tested mine on a Mac, so maybe I got the name wrong for Linux (it says the symbol is undefined).
But as you can see in the code, I had to define the libc variable outside the Turing model and use it inside it, because when I passed it as an argument to the model function, or created it inside the model function, @ccall complained that it doesn't accept local variables.

I think a solution would be to just hardcode the Linux-specific name of the standard C library - let me try that out and I'll get back to you!

@PTWaade
Contributor Author

PTWaade commented Jul 22, 2025

Finally: what's the error on the meta-Bayesian model with FiniteDifferences? That worked locally for me, but maybe I've done something wrong.
Hm, I'll have a look at whether this is one of the places where the parameter is differentiated in constrained space.

@PTWaade
Contributor Author

PTWaade commented Jul 22, 2025

Alright, there's no need to specify the C library's name, so the call_C model can just be:

@model function call_C(y = 0.0)
    x ~ Normal(0, 1)

    # Call C abs function
    x_abs = @ccall fabs(x::Cdouble)::Cdouble

    y ~ Normal(0, x_abs)
end

model = call_C()

I'll update it now.

Comment on lines 26 to 27
# #Inverse temperature for actions
β ~ 1
Member


Suggested change
- # #Inverse temperature for actions
- β ~ 1
+ # #Inverse temperature for actions
+ β = 1

This is probably why this model is failing. I'll also fix that on #37.

Contributor Author


It actually did run locally for me with beta being estimated (it just took ages). But I removed it for simplicity.
Let me know if you need anything :)

Member


Would you prefer to keep it as beta ~ Exponential(1)? I could try changing it back to that.

Contributor Author


No, the simplicity is better, I suppose - what matters is that it can differentiate through the Turing call :)
Noise parameters like that are common (ubiquitous, even) in these types of models, but they aren't necessary for testing the differentiation :)

Member


I changed it back to Exponential(1) anyway and it still works fine on FiniteDifferences, so happy to keep it that way.

Comment on lines 34 to 35
#Run the inner Bayesian inference
chns = sample(inner_m, inner_sampler, inner_n_samples, progress = false)
Member


In #37 I also changed this to fix the random seed (otherwise, each time you calculate AD, the gradients will be different, and thus it's impossible to determine whether the gradients are correct). I think with this change we should at least see FiniteDifferences run correctly.

Contributor Author


That makes perfect sense - I guess that's necessary for any model function which includes a stochastic process that isn't a tilde sampling statement.
Maybe we should include an example of that in the base Julia functions category; there might be some reason that some of the AD backends break (compiled ReverseDiff, perhaps?).

Member


There is already one, I think it's called control_flow :)

@penelopeysm
Member

@PTWaade, could you look over https://turinglang.org/ADTests/pr/ (built from #37) and let me know if you're happy for me to merge those changes?

@PTWaade
Contributor Author

PTWaade commented Jul 25, 2025

Yes - I looked through it, and it looks great :)

But let's change the category name from "miscellaneous features" to "external libraries" or something :) That'd be the only thing.

Happy!

@penelopeysm
Member

Great, in that case I will change the heading, push the changes to this PR, and then merge this one :)

@penelopeysm penelopeysm merged commit c0896c2 into TuringLang:main Jul 26, 2025
45 checks passed