Description
In trainer.py, why are only the parameters of dis_a and gen_a updated, while the parameters of dis_b and gen_b are ignored?
trainer.py, lines 242 to 248 at commit a067be1:
```python
dis_params = list(self.dis_a.parameters()) #+ list(self.dis_b.parameters())
gen_params = list(self.gen_a.parameters()) #+ list(self.gen_b.parameters())
self.dis_opt = torch.optim.Adam([p for p in dis_params if p.requires_grad],
                                lr=lr_d, betas=(beta1, beta2), weight_decay=hyperparameters['weight_decay'])
self.gen_opt = torch.optim.Adam([p for p in gen_params if p.requires_grad],
                                lr=lr_g, betas=(beta1, beta2), weight_decay=hyperparameters['weight_decay'])
```
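For context, this is a minimal sketch of what the commented-out `#+ list(...)` parts would do if restored: the two parameter lists are concatenated so a single Adam optimizer updates both modules. The `nn.Linear` stand-ins are hypothetical placeholders for the real generator networks, and the hyperparameter values are made up for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for gen_a / gen_b (the real networks live in trainer.py).
gen_a = nn.Linear(4, 4)
gen_b = nn.Linear(4, 4)

# Concatenating both lists, as the commented-out `#+ list(self.gen_b.parameters())` would:
gen_params = list(gen_a.parameters()) + list(gen_b.parameters())
gen_opt = torch.optim.Adam([p for p in gen_params if p.requires_grad],
                           lr=1e-4, betas=(0.5, 0.999), weight_decay=1e-4)

# One param group now tracks the weight and bias of both modules.
print(len(gen_opt.param_groups[0]['params']))  # -> 4
```

Note that if the trainer assigns `self.gen_b = self.gen_a` (i.e. the two generators share weights), the first list already covers every trainable tensor and concatenating the second would register the same tensors twice; that would be one plausible reason the second list is commented out, though only the repository code can confirm it.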