> This is currently a hack to unblock things. Need a better solution.

*Originally posted by @youliangtan in #257 (comment)*
Need a more elegant solution, when `tune-visual` is enabled, for making sure all `forward` function outputs participate in calculating loss. DDP currently fails with:

```
If you already have done the above, then the distributed data parallel module wasn't able to locate the output tensors in the return value of your module's `forward` function. Please include the loss function and the structure of the return value of `forward` of your module when reporting this issue (e.g. list, dict, iterable).
Parameter indices which did not receive grad for rank 0: 437 438 439 440 441 442 443 444 445 446 447
In addition, you can set the environment variable TORCH_DISTRIBUTED_DEBUG to either INFO or DETAIL to print out information about which particular parameters did not receive gradient on this rank as part of this error.
```
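For context, a minimal sketch of the two usual ways to resolve this class of DDP error. The function names and the dict-of-outputs structure below are hypothetical, and which variant the current hack corresponds to is an assumption:

```python
# Sketch only, assuming a standard PyTorch DDP training setup; names and the
# outputs structure are hypothetical, not this repo's actual code.
import os

import torch
from torch.nn.parallel import DistributedDataParallel as DDP


def wrap_model(model: torch.nn.Module) -> DDP:
    # "Hack" variant: let DDP traverse the autograd graph every iteration and
    # mark parameters that produced no gradient as ready. Unblocks training
    # but adds per-iteration overhead.
    return DDP(
        model,
        device_ids=[int(os.environ["LOCAL_RANK"])],
        find_unused_parameters=True,
    )


def attach_unused_outputs(loss: torch.Tensor, outputs: dict) -> torch.Tensor:
    # More elegant variant: add every forward output to the loss with a zero
    # multiplier, so all parameters receive a (zero) gradient and DDP's
    # bucket reduction sees every parameter as used.
    dummy = sum(o.sum() * 0.0 for o in outputs.values() if o.requires_grad)
    return loss + dummy
```

As the error text suggests, running the job with `TORCH_DISTRIBUTED_DEBUG=DETAIL` set in the environment prints the names of the parameters behind indices 437-447 above, which should narrow the fix to the parameters introduced by `tune-visual`.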