Summary:
Pull Request resolved: #3420
# context
* the torchrec GitHub workflow [pre-commit](https://github.com/meta-pytorch/torchrec/actions/runs/18187119531/job/51773690358) failed with the following message (the calculation being reformatted is sketched after the diff):
```
diff --git a/torchrec/distributed/embedding_kernel.py b/torchrec/distributed/embedding_kernel.py
index e444f59..6c1dea2 100644
--- a/torchrec/distributed/embedding_kernel.py
+++ b/torchrec/distributed/embedding_kernel.py
@@ -105,7 +105,9 @@ def create_virtual_table_global_metadata(
# The param size only has the information for my_rank. In order to
# correctly calculate the size for other ranks, we need to use the current
# rank's shard size compared to the shard size of my_rank.
- curr_rank_rows = (param.size()[0] * metadata.shards_metadata[rank].shard_sizes[0]) // my_rank_shard_size # pyre-ignore[16]
+ curr_rank_rows = (
+ param.size()[0] * metadata.shards_metadata[rank].shard_sizes[0]
+ ) // my_rank_shard_size # pyre-ignore[16]
else:
curr_rank_rows = (
weight_count_per_rank[rank] if weight_count_per_rank is not None else 1
```
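For readers skimming the diff: the wrapped expression computes each rank's row count by scaling the locally known parameter rows by the ratio of that rank's shard size to my_rank's shard size. Below is a minimal standalone sketch of that arithmetic; the names (`rows_for_rank`, `param_rows`, `shard_sizes`) are hypothetical stand-ins, not the torchrec API:
```
# Hypothetical sketch of the proportional row calculation in the diff above.
# `param_rows` stands in for param.size()[0]; `shard_sizes` stands in for the
# per-rank shard_sizes[0] values in metadata.shards_metadata.
def rows_for_rank(
    param_rows: int, shard_sizes: list[int], rank: int, my_rank: int
) -> int:
    my_rank_shard_size = shard_sizes[my_rank]
    # Scale the locally known row count by the ratio of the target rank's
    # shard size to my_rank's shard size (integer division, matching the
    # `//` in the diff).
    return (param_rows * shard_sizes[rank]) // my_rank_shard_size


# Example: my_rank=0 locally sees 4 rows with shard size 4; rank 1's shard
# is half as large, so it is assigned half as many rows.
assert rows_for_rank(4, [4, 2], rank=1, my_rank=0) == 2
```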
Reviewed By: spmex
Differential Revision: D83755478
fbshipit-source-id: e4ad086b66d79e203361e7853f547a8108b7181a