Replies: 1 comment
Yes, I see your point. It is definitely the case right now that different LR schedules use different parameters, so we do end up with unused parameters for all the schedules we are not using. For optimizers this problem is less relevant, because they mostly use the same parameters. For LR schedules, your idea is probably the way to go.
-
Right now we are defining optimizers with quite a number of wasteful parameters. However, the line below shows that for each optimizer we MUST request:
But this is exactly the use case within the scope of cfg_serializable. The idea would then be to create config classes for schedules and solvers, so that we register only the parameters for each class.
We should document it in the library so that it is more user friendly, but that could be a nice way to get rid of the unused parameters and the confusing names we have had so far, I feel :)
tensorflow-image-models/tfimm/train/optimizer.py, line 13 at b47f2a0
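To make the idea concrete, here is a minimal sketch of per-schedule config classes. All names here (register_schedule, build_schedule, the config dataclasses) are hypothetical illustrations, not tfimm's actual API: the point is just that each schedule declares only the parameters it needs, so nothing unused lingers in a shared config.

```python
# Hypothetical sketch: one config dataclass per LR schedule, plus a
# registry so only that schedule's parameters are ever accepted.
from dataclasses import dataclass, fields

SCHEDULES = {}  # name -> (config class, factory function)

def register_schedule(name, cfg_cls):
    """Decorator registering a schedule factory with its config class."""
    def decorator(factory):
        SCHEDULES[name] = (cfg_cls, factory)
        return factory
    return decorator

@dataclass
class CosineDecayConfig:
    initial_lr: float = 0.1
    decay_steps: int = 1000

@register_schedule("cosine", CosineDecayConfig)
def make_cosine(cfg):
    # Stand-in for e.g. constructing a cosine-decay schedule object.
    return ("cosine", cfg.initial_lr, cfg.decay_steps)

@dataclass
class StepDecayConfig:
    initial_lr: float = 0.1
    step_size: int = 30
    gamma: float = 0.1

@register_schedule("step", StepDecayConfig)
def make_step(cfg):
    return ("step", cfg.initial_lr, cfg.step_size, cfg.gamma)

def build_schedule(name, **params):
    """Build a schedule; reject parameters its config does not declare."""
    cfg_cls, factory = SCHEDULES[name]
    valid = {f.name for f in fields(cfg_cls)}
    unknown = set(params) - valid
    if unknown:
        # Instead of silently carrying unused parameters, fail loudly.
        raise ValueError(f"Unknown parameters for {name!r}: {sorted(unknown)}")
    return factory(cfg_cls(**params))
```

With this layout, passing a step-decay parameter such as gamma to the cosine schedule raises immediately, rather than sitting unused in a catch-all config.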