Deduplicate Transformers backend code using inheritance #21461

Merged: 1 commit, Jul 24, 2025

Conversation

@hmellor (Member) commented on Jul 23, 2025

Now that VLM support is merged and the model loading has been cleaned up a bit, the Transformers backend classes can be refactored to use inheritance, greatly reducing code duplication.

In summary, this PR:

  • Converts TransformersModel into a base class, TransformersBase, which:
    • inherits SupportsQuant, SupportsLoRA and SupportsPP
  • TransformersForCausalLM:
    • inherits TransformersBase and adds a language modelling head
  • TransformersForMultimodalLM:
    • inherits TransformersForCausalLM and SupportsMultiModal
    • adds multimodal preprocessing methods
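The hierarchy above can be sketched as plain Python. This is a minimal illustration of the inheritance structure, not vLLM's actual code: the Supports* interfaces are stubbed out here (the real ones live in vLLM's model interfaces module), and the class bodies stand in for the shared init, forward, and weight-loading logic the PR deduplicates.

```python
# Stubs for the vLLM capability interfaces; the real classes carry
# actual behaviour, these only mark capability in this sketch.
class SupportsQuant: ...
class SupportsLoRA: ...
class SupportsPP: ...
class SupportsMultiModal: ...

class TransformersBase(SupportsQuant, SupportsLoRA, SupportsPP):
    """Shared model init, forward pass, and weight loading."""

class TransformersForCausalLM(TransformersBase):
    """Adds a language modelling head on top of the base model."""

class TransformersForMultimodalLM(TransformersForCausalLM, SupportsMultiModal):
    """Adds multimodal preprocessing on top of the causal-LM class."""

# The MRO shows each subclass inherits every capability of its parents,
# so nothing needs to be re-implemented per class:
print([c.__name__ for c in TransformersForMultimodalLM.__mro__])
```

Because each class only extends its parent, a capability fixed once in TransformersBase propagates to both concrete model classes.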


👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, which covers a small, essential subset of tests to catch errors quickly. You can run other CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either: Add ready label to the PR or enable auto-merge.

🚀

@hmellor hmellor requested a review from Isotr0py July 23, 2025 14:32
@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request introduces a well-designed refactoring of the Transformers backend classes. By leveraging inheritance, it significantly reduces code duplication and improves the overall structure and maintainability of the code. The new base class TransformersBase cleanly encapsulates common functionality, and the specialized classes TransformersForCausalLM and TransformersForMultimodalLM correctly extend it.

The logic for model initialization, forward pass, and weight loading appears to be preserved while being much cleaner. I've reviewed the changes and found no issues. This is a great improvement to the codebase.

@Isotr0py (Collaborator) left a comment

LGTM!

@Isotr0py Isotr0py enabled auto-merge (squash) July 23, 2025 14:43
@github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) Jul 23, 2025
@vllm-bot vllm-bot merged commit dde295a into vllm-project:main Jul 24, 2025
78 of 85 checks passed
@hmellor hmellor deleted the simplify-transformers-backend branch July 24, 2025 09:08