Add gradient clipping (`torch.nn.utils.clip_grad_norm_`) inside the native PyTorch training loop to prevent exploding gradients, as sketched below.
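
A minimal sketch of where the clipping call goes: between `loss.backward()` and `optimizer.step()`, so the accumulated gradients are rescaled before the update. The model, optimizer, data, and `max_norm=1.0` here are illustrative placeholders, not values from this project.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the project's actual model, optimizer, and data.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()
inputs = torch.randn(32, 10)
targets = torch.randn(32, 1)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    # Rescale gradients in place so their global L2 norm is at most max_norm;
    # 1.0 is an assumed example threshold. The call returns the pre-clip norm,
    # which can be logged to monitor gradient magnitudes.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
```

Clipping by the global norm (rather than per-element with `clip_grad_value_`) preserves the direction of the combined gradient while bounding its magnitude, which is the usual choice for taming occasional exploding-gradient spikes.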