Optim refactoring #662
base: dev
Conversation
Code Coverage Summary
Results for commit: 6d7ce0e
Minimum allowed coverage is …
♻️ This comment has been updated with latest results
FilippoOlivo left a comment
Hi @dario-coscia, thank you for the PR. Could you please provide an example of how this new feature would be useful?
Moreover, I am no longer convinced we need both OptimizerInterface and TorchOptimizer. Creating a new optimizer from scratch, without inheriting from torch.optim, is not straightforward and, in my opinion, requires a lot of effort. I therefore think we can merge the two classes into a single one called, for example, PinaOptimizer.
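To make the proposal concrete, here is a minimal, purely illustrative sketch of what a single merged wrapper could look like. Only the name PinaOptimizer comes from the comment above; everything else (the hook method, attribute names, lazy instantiation) is an assumption for the example, not the existing PINA API.

```python
import torch


class PinaOptimizer:
    """Hypothetical merged wrapper (sketch only, not the actual PINA class).

    It stores a torch.optim optimizer class together with its keyword
    arguments and instantiates it lazily, once the model parameters exist.
    """

    def __init__(self, optimizer_class, **kwargs):
        if not (isinstance(optimizer_class, type)
                and issubclass(optimizer_class, torch.optim.Optimizer)):
            raise ValueError("optimizer_class must be a torch.optim.Optimizer subclass")
        self.optimizer_class = optimizer_class
        self.kwargs = kwargs
        self.instance = None

    def hook(self, parameters):
        # Called once the model parameters are available; builds the real optimizer.
        self.instance = self.optimizer_class(parameters, **self.kwargs)
        return self.instance


# Illustrative usage: a solver would normally call hook() internally.
model = torch.nn.Linear(2, 1)
optimizer = PinaOptimizer(torch.optim.Adam, lr=1e-3)
optimizer.hook(model.parameters())
```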
Force-pushed from 4fdf1ae to d04f75a
I agree with @FilippoOlivo.
Hi @GiovanniCanali @FilippoOlivo, this PR is not ready for review yet, but here is the planned roadmap:
Roadmap
1. Restructure the Optimizer / Scheduler interface
2. Introduce second-order optimizers (see the sketch after this comment)
This roadmap should provide a clear view of the upcoming changes and priorities.
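For context on the second roadmap item: second-order / quasi-Newton optimizers such as torch.optim.LBFGS take a closure in step() that re-evaluates the loss, which any refactored interface will need to accommodate. Below is a minimal, self-contained sketch of the plain PyTorch usage; the toy model and data are illustrative only and unrelated to the PINA API.

```python
import torch

# Toy linear-regression problem, used only to show the closure-based step.
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.LBFGS(model.parameters(), lr=0.1, max_iter=20)

x = torch.linspace(-1, 1, 32).unsqueeze(-1)
y = 3.0 * x + 1.0


def closure():
    # LBFGS may re-evaluate the loss several times per step, hence the closure.
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    return loss


for _ in range(10):
    optimizer.step(closure)
```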
* Adding connectors for optimizers/schedulers
* Simplify configure_optimizers logic (see the sketch below)
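As a rough illustration of the "simplify configure_optimizers logic" bullet, here is a sketch assuming a Lightning-style solver (requires the lightning package); the class and names below are invented for the example and are not the actual PINA implementation.

```python
import torch
from lightning.pytorch import LightningModule


class ToySolver(LightningModule):
    """Illustrative solver; only configure_optimizers matters here."""

    def __init__(self):
        super().__init__()
        self.model = torch.nn.Linear(2, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.model(x), y)

    def configure_optimizers(self):
        # With connectors pairing each optimizer with its scheduler,
        # this method reduces to returning two aligned lists.
        optimizer = torch.optim.Adam(self.model.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.ConstantLR(optimizer)
        return [optimizer], [scheduler]
```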
Some updates on this…
I think we can start adding second-order optimizers and new optimizers.
Description
This PR fixes #618.
Checklist