A Rust port of the PyTorch DataLoader library.
- Iterable or indexable (Map style) `DataLoader`.
- Customizable `Sampler`, `BatchSampler` and `collate_fn`.
- Parallel `DataLoader` using `rayon` for the indexable `DataLoader` (experimental).
- Integration with `ndarray` and `tch-rs`, CPU and GPU support.
- Default collate function that will automatically collate most of your types (supporting nesting).
- Shuffling for iterable and indexable `DataLoader`.
More info in the documentation.
Examples can be found in the `examples` folder, but here is a simple one:
```rust
use ai_dataloader::DataLoader;

let loader = DataLoader::builder(vec![(0, "hola"), (1, "hello"), (2, "hallo"), (3, "bonjour")])
    .batch_size(2)
    .shuffle()
    .build();

for (label, text) in &loader {
    println!("Label {label:?}");
    println!("Text {text:?}");
}
```
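The highlights above also mention the `ndarray` integration and the nesting-aware default collate. The sketch below is a rough illustration of that and is not taken from the crate's documentation: the dataset layout is made up, and the exact output types produced by the default collate (for instance, whether numeric fields are stacked into `ndarray` arrays) should be checked against the docs.

```rust
use ai_dataloader::DataLoader;

// Hypothetical dataset: each sample is a (features, label) pair.
let dataset = vec![
    (vec![1.0_f32, 2.0], 0),
    (vec![3.0, 4.0], 1),
    (vec![5.0, 6.0], 0),
    (vec![7.0, 8.0], 1),
];

let loader = DataLoader::builder(dataset).batch_size(2).build();

for (features, labels) in &loader {
    // The default collate recurses into the (features, label) tuples; per the
    // highlights, numeric data should end up in ndarray containers (the exact
    // types are an assumption here, see the documentation).
    println!("features: {features:?}");
    println!("labels: {labels:?}");
}
```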
## tch-rs integration

In order to collate your data into Torch tensors that can run on the GPU, you must activate the `tch` feature.
This feature relies on the `tch` crate for bindings to the C++ libtorch API. The required libtorch library can be downloaded either automatically or manually. The following provides a reference on how to set up your environment to use these bindings; please refer to the `tch` crate for detailed information or support.
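As a rough sketch of what the `tch` feature enables (the `TorchCollate` name, its module path, and the `collate_fn` builder method are assumptions here; check the crate documentation for the exact API), the loop below collates batches into `tch::Tensor`s and moves them onto the GPU when one is available:

```rust
use ai_dataloader::collate::TorchCollate; // assumed path of the tch-aware collate
use ai_dataloader::DataLoader;
use tch::Device;

// Hypothetical dataset: feature vectors with integer labels.
let dataset = vec![
    (vec![0.0_f32, 1.0, 2.0], 0_i64),
    (vec![3.0, 4.0, 5.0], 1),
];

let loader = DataLoader::builder(dataset)
    .batch_size(2)
    .collate_fn(TorchCollate::default()) // assumed: batches are collated into tch::Tensor
    .build();

let device = Device::cuda_if_available();
for (features, labels) in &loader {
    // tch tensors can be moved to the GPU (or stay on the CPU if none is found).
    let features = features.to_device(device);
    let labels = labels.to_device(device);
    println!("{:?} {:?}", features.size(), labels.size());
}
```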
These features could be added in the future:

- `RandomSampler` with replacement
- parallel `DataLoader` for iterable datasets
- distributed `DataLoader`
The current MSRV is 1.63.