
Conversation

@Andrei-Aksionov commented on Jun 9, 2025

Hey there 👋

While reviewing recipes, I’ve noticed that sometimes modifications made to “shared” methods in one recipe aren’t reflected in another.

For instance, consider the full fine-tune and LoRA fine-tune recipes.
These recipes should differ only in how the model is instantiated and how a checkpoint is saved. All other methods should be identical, line for line. However, this isn't the case.

As discussed in #2779, one potential solution is to implement a test that parses these methods and checks for equality.

This approach offers several benefits:

  • a) It catches cases where a change to a shared method in one recipe isn't propagated to the others.
  • b) Once common methods are aligned, a side-by-side comparison of recipes makes the genuine differences easier to spot, since there's less “noise” from incidental divergence.

The proposed test can:

  1. Check methods that should be fully identical (see the sketch below)
  2. Check methods that should be mostly identical, allowing some minor, recipe-specific variations
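
To make the idea concrete, here is a minimal sketch of case 1. The file paths, class names, and method names below (`recipes/full_finetune_distributed.py`, `FullFinetuneRecipeDistributed`, `_setup_data`, etc.) are placeholders for illustration; the real test would enumerate the actual recipes and their shared methods:

```python
import ast
from pathlib import Path


def extract_method_source(file_path: str, class_name: str, method_name: str) -> str:
    """Return the source of `class_name.method_name`, normalized via ast.unparse."""
    tree = ast.parse(Path(file_path).read_text())
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef) and node.name == class_name:
            for item in node.body:
                if isinstance(item, (ast.FunctionDef, ast.AsyncFunctionDef)) and item.name == method_name:
                    # ast.unparse normalizes formatting and drops comments,
                    # so only semantic differences remain.
                    return ast.unparse(item)
    raise ValueError(f"{class_name}.{method_name} not found in {file_path}")


def test_shared_methods_are_identical():
    # Placeholder recipe files / classes / methods, for illustration only.
    full = "recipes/full_finetune_distributed.py"
    lora = "recipes/lora_finetune_distributed.py"
    for method in ("_setup_data", "_setup_optimizer", "train"):
        src_full = extract_method_source(full, "FullFinetuneRecipeDistributed", method)
        src_lora = extract_method_source(lora, "LoRAFinetuneRecipeDistributed", method)
        assert src_full == src_lora, f"`{method}` diverged between the two recipes"
```

Comparing the `ast.unparse` output rather than the raw source keeps formatting and comments from triggering false positives. For case 2, the same extraction could feed a diff that whitelists the known, intentional variation points.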

Important

This draft covers only full fine-tune and LoRA variants.
If the core team decides this is the way to go, I'll extend it to all recipes.

@pytorch-bot (bot) commented on Jun 9, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchtune/2807

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label on Jun 9, 2025
@Andrei-Aksionov closed this by deleting the head repository on Aug 7, 2025