
fix: Using batchsize of batch_sampler when using BatchSampler#21566

Open
bigd4 wants to merge 6 commits into Lightning-AI:master from bigd4:fix_batch_sampler_batch_size

Conversation


@bigd4 bigd4 commented Mar 5, 2026

What does this PR do?

Fixes #21122

When a DataLoader is defined as DataLoader(dataset, batch_sampler=BatchSampler(...)), its batch_size and drop_last attributes keep their default values (1 and False), so _dataloader_init_kwargs_resolve_sampler may return wrong parameters.
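A minimal sketch of the situation described above, using illustrative stand-in classes (these are not the real torch classes, just hypothetical mock-ups that mirror the relevant attributes):

```python
# Stand-ins mirroring the relevant attributes of torch's BatchSampler
# and DataLoader, to show where the user's settings actually end up.

class BatchSampler:
    def __init__(self, sampler, batch_size, drop_last):
        self.sampler = sampler
        self.batch_size = batch_size
        self.drop_last = drop_last

class DataLoader:
    def __init__(self, dataset, batch_size=1, drop_last=False, batch_sampler=None):
        self.dataset = dataset
        self.batch_sampler = batch_sampler
        # When a batch_sampler is passed, batch_size/drop_last are left at
        # their defaults -- the real values live on the batch_sampler.
        self.batch_size = batch_size
        self.drop_last = drop_last

loader = DataLoader(
    range(10),
    batch_sampler=BatchSampler(range(10), batch_size=4, drop_last=True),
)

# The loader's own attributes report the defaults ...
print(loader.batch_size, loader.drop_last)  # 1 False
# ... while the batch_sampler carries the values the user chose.
print(loader.batch_sampler.batch_size, loader.batch_sampler.drop_last)  # 4 True
```

Any code that inspects only the loader's own attributes (as the kwargs resolution does) therefore sees 1/False instead of 4/True.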

Changes

Return batch_sampler.batch_size and batch_sampler.drop_last when batch_sampler is a BatchSampler.

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

Reviewer checklist
  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

📚 Documentation preview 📚: https://pytorch-lightning--21566.org.readthedocs.build/en/21566/

@github-actions github-actions bot added the pl Generic label for PyTorch Lightning package label Mar 5, 2026
@bigd4 bigd4 force-pushed the fix_batch_sampler_batch_size branch from 7db4e5e to bbfeb07 on March 5, 2026 07:05
@deependujha deependujha marked this pull request as ready for review March 5, 2026 12:08
@bigd4 bigd4 (Author) commented Mar 5, 2026

@deependujha
The tests fail becase pytorch-lightning/tests/tests_pytorch/trainer/connectors/test_data_connector.py line 198-200. The missing value are ['batch_sampler', 'batch_size', 'drop_last', 'sampler', 'shuffle'] instead of ['batch_sampler', 'sampler', 'shuffle'].

    loader = TestDataLoader(ds)
    sampler = SequentialSampler(ds)
    match = escape("missing arguments are ['batch_sampler', 'sampler', 'shuffle']")
    with pytest.raises(MisconfigurationException, match=match):
        _update_dataloader(loader, sampler, mode="fit")
    match = escape("missing arguments are ['batch_sampler', 'batch_size', 'drop_last', 'sampler', 'shuffle']")
    with pytest.raises(MisconfigurationException, match=match):
        _update_dataloader(loader, sampler, mode="predict")

I don't know whether I should change match = escape("missing arguments are ['batch_sampler', 'sampler', 'shuffle']") in the test, or if there is a better approach?


Labels

pl Generic label for PyTorch Lightning package

Projects

None yet

Development

Successfully merging this pull request may close these issues.

batch sampler, ddp, and sampler wrapper

2 participants