
Draft: use SciMLVerbosity verbosity system #647


Open · jClugstor wants to merge 33 commits into master from verbosity_system

Conversation

jClugstor (Member)

Checklist

  • Appropriate tests were added
  • Any code changes were done in a way that does not break public API
  • All documentation related to code changes was updated
  • The new code follows the contributor guidelines, in particular the SciML Style Guide and COLPRAC.
  • Any new documentation only uses public API

Additional context

I'm experimenting with using a ScopedValue to hold the NonlinearVerbosity object. There are a couple of reasons for this. Unlike LinearSolve, none of the caches here currently have a verbose field that can hold the verbosity specifier. We could add these fields, but then only functions that explicitly depend on the cache could use the verbosity settings. Alternatively, we could thread the verbosity through every function that should be able to use it. Currently, many of the utility functions in NonlinearSolve don't explicitly depend on the cache but do emit warning messages, and those functions need to use the verbosity system if we want to be able to turn the warnings off while still having them available when needed. It seems unreasonable for every function that might emit a message to have to take the verbosity or the cache as one of its arguments.

From what I can tell, ScopedValues might be a good fit for this, since they apparently avoid some of the performance issues of normal globals. For now I have a global ScopedValue: const nonlinear_verbose = ScopedValue{Union{NonlinearVerbosity{true}, NonlinearVerbosity{false}}}(). Then __init and solve! are called in the context of the @with macro, where nonlinear_verbose is set to the value of the verbose keyword. This allows any function called by __init or solve! to use the verbosity settings from the verbose keyword, without having to explicitly depend on either the verbosity object or the nonlinear cache.
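To make the pattern concrete, here's a minimal sketch of what I have in mind. The NonlinearVerbosity stand-in and the some_utility / solve_with_verbosity names below are simplified placeholders for illustration, not the actual NonlinearSolve internals:

```julia
using Base.ScopedValues   # Julia ≥ 1.11; the ScopedValues.jl package provides the same API on older versions

struct NonlinearVerbosity{Enabled}    # simplified stand-in for the real verbosity specifier
    warn_on_failure::Bool
end

# Global ScopedValue holding the active verbosity settings.
const nonlinear_verbose = ScopedValue{Union{NonlinearVerbosity{true}, NonlinearVerbosity{false}}}()

# A utility function deep in the call stack can read the setting without
# taking the cache or the verbosity object as an argument.
function some_utility()
    v = nonlinear_verbose[]
    v.warn_on_failure && @warn "example warning gated by the verbosity setting"
    return nothing
end

# __init / solve! would run inside the scoped context built from the verbose keyword:
function solve_with_verbosity(verbose)
    @with nonlinear_verbose => verbose begin
        some_utility()   # sees `verbose` via nonlinear_verbose[]
    end
end

solve_with_verbosity(NonlinearVerbosity{true}(true))
```

The appeal is that some_utility needs neither a verbose nor a cache argument, but every call site that reads the setting pays for the dynamic-scope lookup in nonlinear_verbose[].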

I think my biggest concern here would be the performance of accessing the ScopedValue with nonlinear_verbose[], since that will need to happen anywhere the @SciMLMessage macro is used, which could be in parts of the code that run very often.

I've been doing some profiling and benchmarking, and I do see a slowdown in some cases, but it's hard to tell whether it's caused by the ScopedValue or by other changes related to the verbosity system.
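To try to isolate the lookup cost from the rest of the verbosity changes, a standalone micro-benchmark along these lines is roughly what I have in mind (assuming BenchmarkTools is available; the names are just for illustration, and the baseline loop may get optimized heavily, so treat it as a rough comparison rather than a precise measurement):

```julia
using BenchmarkTools
using Base.ScopedValues

const flag = ScopedValue(false)   # stand-in for nonlinear_verbose

# Read the ScopedValue in a hot loop so the dynamic-scope lookup dominates.
function count_scoped(n)
    c = 0
    for _ in 1:n
        c += flag[] ? 1 : 0
    end
    return c
end

# Baseline: the same loop, but with the value passed explicitly as an argument.
function count_plain(n, v)
    c = 0
    for _ in 1:n
        c += v ? 1 : 0
    end
    return c
end

@with flag => true begin
    @btime count_scoped(10_000)   # cost including the ScopedValue lookups
end
@btime count_plain(10_000, true)  # cost with the value threaded through explicitly
```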

@oscardssmith do you have a sense of whether this is a good idea or not? If there are going to be really obvious performance issues, then it's probably not worth doing.

jClugstor (Member, Author)

After profiling some more, it looks like accessing the ScopedValue is pretty expensive:

[profiling screenshot]

The getindex call on the left takes more time than the actual solve itself, more than doubling the total solve time. That's disappointing; it could have been nice :(

oscardssmith (Member)

That's a lot more expensive than I would expect. Might be worth filing a bug.

jClugstor force-pushed the verbosity_system branch 3 times, most recently from 454836b to 0ac7245 on July 31, 2025 at 17:58
jClugstor marked this pull request as ready for review on August 1, 2025 at 18:59