Labels: bug (Something isn't working), needs triage (Waiting to be triaged by maintainers), ver: 2.5.x
Description
Bug description
The precision_to_bits dictionary in https://github.com/Lightning-AI/pytorch-lightning/blob/master/src/lightning/pytorch/utilities/model_summary/model_summary.py#L219 does not account for every precision type, e.g. bf16-true. For such values the lookup misses the proper key and silently falls back to the default of 32.
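A standalone sketch of the lookup (the dictionary is copied from the linked line; "bf16-true" is one of the precision strings the trainer can report), just to show the fallback:

# Minimal sketch, not the library code: the dict is copied from model_summary.py
# and "bf16-true" is the precision string set by Trainer(precision="bf16-true").
precision_to_bits = {"64": 64, "32": 32, "16": 16, "bf16": 16}

precision = "bf16-true"
bits = precision_to_bits.get(precision, 32)
print(bits)  # prints 32 even though bf16 parameters are 16-bit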
What version are you seeing the problem on?
v2.5, master
How to reproduce the bug
In lightning/pytorch/utilities/model_summary/model_summary.py:L219, add the following and run with self._model.trainer.precision = "bf16-true":
...
precision_to_bits = {"64": 64, "32": 32, "16": 16, "bf16": 16}
# Prints 32: "bf16-true" is not a key in the dictionary, so .get() falls back to the default.
print(precision_to_bits.get(self._model.trainer.precision, 32))
raise
...
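For illustration only, one possible workaround is to normalize the precision string to its leading token before the lookup; this is a hypothetical sketch, not the fix adopted upstream:

# Hypothetical sketch: strip a "-true"/"-mixed" style suffix before looking up the bit width.
precision_to_bits = {"64": 64, "32": 32, "16": 16, "bf16": 16}

def precision_bits(precision: str) -> int:
    base = str(precision).split("-")[0]  # "bf16-true" -> "bf16", "16-mixed" -> "16"
    return precision_to_bits.get(base, 32)

print(precision_bits("bf16-true"))  # 16
print(precision_bits("32-true"))    # 32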
Error messages and logs
# Error messages and logs here please
Environment
Current environment
- PyTorch Lightning Version: master
- PyTorch Version: 2.5.1
- Python version: 3.10
- OS: Ubuntu 22.04
- CUDA/cuDNN version: 12.4
- GPU models and configuration: 4x NVIDIA H100 NVL
- How you installed Lightning (conda, pip, source): pip
More info
No response