diff --git a/docs/hub/datasets-upload-guide-llm.md b/docs/hub/datasets-upload-guide-llm.md
index 38c3dbe5b..c58725777 100644
--- a/docs/hub/datasets-upload-guide-llm.md
+++ b/docs/hub/datasets-upload-guide-llm.md
@@ -67,8 +67,8 @@ find . -name "*.jpg" | wc -l
 ```yaml
 # Machine-readable Hub limits
 hub_limits:
-  max_file_size_gb: 50             # absolute hard stop enforced by LFS
-  recommended_file_size_gb: 20     # best-practice shard size
+  max_file_size_gb: 200            # absolute hard stop enforced by LFS
+  recommended_file_size_gb: 50     # best-practice shard size
   max_files_per_folder: 10000      # Git performance threshold
   max_files_per_repo: 100000       # Repository file count limit
   recommended_repo_size_gb: 300    # public-repo soft cap; contact HF if larger
@@ -80,7 +80,7 @@ hub_limits:
 - Free: 100GB private datasets
 - Pro (for individuals) | Team or Enterprise (for organizations): 1TB+ private storage per seat (see [pricing](https://huggingface.co/pricing))
 - Public: 1TB (contact datasets@huggingface.co for larger)
-- Per file: 50GB max, 20GB recommended
+- Per file: 200GB max, <50GB recommended
 - Per folder: <10k files
 
-See https://huggingface.co/docs/hub/storage-limits#repository-limitations-and-recommendations for current limits for current recommendations for repository sizes and file counts.
+See https://huggingface.co/docs/hub/storage-limits#repository-limitations-and-recommendations for current recommendations on repository sizes and file counts.
diff --git a/docs/hub/storage-limits.md b/docs/hub/storage-limits.md
index eee027273..f55c2dc31 100644
--- a/docs/hub/storage-limits.md
+++ b/docs/hub/storage-limits.md
@@ -43,7 +43,7 @@ We gathered a list of tips and recommendations for structuring your repo. If you
 | Repo size | - | contact us for large repos (TBs of data) |
 | Files per repo | <100k | merge data into fewer files |
 | Entries per folder | <10k | use subdirectories in repo |
-| File size | <20GB | split data into chunked files |
+| File size | <50GB | split data into chunked files |
 | Commit size | <100 files* | upload files in multiple commits |
 | Commits per repo | - | upload multiple files per commit and/or squash history |
@@ -67,7 +67,7 @@ which has very detailed documentation about the different factors that will impa
   For example, json files can be merged into a single jsonl file, or large datasets can be exported as Parquet files or in [WebDataset](https://github.com/webdataset/webdataset) format.
 - The maximum number of files per folder cannot exceed 10k files per folder. A simple solution is to create a repository structure that uses subdirectories. For example, a repo with 1k folders from `000/` to `999/`, each containing at most 1000 files, is already enough.
-- **File size**: In the case of uploading large files (e.g. model weights), we strongly recommend splitting them **into chunks of around 20GB each**.
+- **File size**: In the case of uploading large files (e.g. model weights), we strongly recommend splitting them **into chunks of around 50GB each**.
   There are a few reasons for this:
   - Uploading and downloading smaller files is much easier both for you and the other users. Connection issues can always happen when streaming data and smaller files avoid resuming from the beginning in case of errors.
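For context on the chunking guidance this patch updates, the "split large files into ~50GB shards" recommendation can be sketched as a small helper. This is a hypothetical illustration, not code from this PR or from `huggingface_hub`; the `split_file` name and `RECOMMENDED_SHARD_BYTES` constant are assumptions for the sketch:

```python
import os

# Recommended shard size per the updated docs (~50GB); the hard cap is 200GB per file.
RECOMMENDED_SHARD_BYTES = 50 * 1024**3

def split_file(path, chunk_bytes=RECOMMENDED_SHARD_BYTES, out_dir="."):
    """Split the file at `path` into numbered parts, each at most `chunk_bytes` long."""
    parts = []
    index = 0
    with open(path, "rb") as src:
        while True:
            data = src.read(chunk_bytes)
            if not data:  # end of file reached
                break
            part_path = os.path.join(out_dir, f"{os.path.basename(path)}.part{index:04d}")
            with open(part_path, "wb") as dst:
                dst.write(data)
            parts.append(part_path)
            index += 1
    return parts
```

Concatenating the parts in order (e.g. `cat data.bin.part*`) reproduces the original file, which is why smaller shards make failed uploads cheap to retry: only the affected chunk needs to be re-sent.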