@CSWYF3634076 CSWYF3634076 commented Aug 8, 2025

Purpose

Support the Baidu Ernie4.5 VL model in vLLM.

Note: torch.compile is not supported. Because the model is split between multimodal experts and text experts, enabling torch.compile may cause startup failures.
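Given that limitation, serving this model presumably requires eager mode. A minimal sketch of a serve command, assuming vLLM's standard CLI; the model ID below is a placeholder for illustration, not taken from this PR:

```shell
# Placeholder checkpoint name -- substitute the actual Ernie4.5 VL model ID.
# --enforce-eager disables torch.compile / graph capture, which this model
# does not support due to its separate multimodal and text experts.
vllm serve baidu/ERNIE-4.5-VL-placeholder \
    --enforce-eager \
    --trust-remote-code
```

The same effect can be had from the Python API by passing `enforce_eager=True` to `LLM(...)`.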


github-actions bot commented Aug 8, 2025

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, which executes a small, essential subset of CI tests to catch errors quickly. You can run other CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@mergify mergify bot added documentation Improvements or additions to documentation new-model Requests to new models labels Aug 8, 2025

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request adds support for the Baidu Ernie4.5 VL model. The changes include new model implementation files, rotary embedding logic, and a custom processor. While the overall structure seems correct, there are several critical issues that need to be addressed. These include hardcoded values that make the code brittle, unsafe tensor operations that could lead to runtime errors, incorrect function calls in the processor, and some incomplete logic indicated by TODOs. Addressing these points will significantly improve the robustness and maintainability of the new model support.


@ywang96 ywang96 left a comment


Thanks for the contribution! Is this PR ready for review? If not, could you please convert this to a draft PR?

@CSWYF3634076 CSWYF3634076 marked this pull request as draft August 9, 2025 07:21
@CSWYF3634076 CSWYF3634076 marked this pull request as ready for review August 12, 2025 08:41
@CSWYF3634076

Thanks for the contribution! Is this PR ready for review? If not, could you please convert this to a draft PR?

@ywang96 Hi, the PR is ready for review. Could you help review it?

@CSWYF3634076 CSWYF3634076 requested a review from ywang96 August 12, 2025 08:46
@CSWYF3634076 CSWYF3634076 changed the title [WIP][Model] Add Ernie4.5 VL Model Support [Model] Add Ernie4.5 VL Model Support Aug 12, 2025
@DarkLight1337

Sorry for the delay, I'll take a look later today

@mergify mergify bot removed the needs-rebase label Aug 25, 2025
@Isotr0py Isotr0py enabled auto-merge (squash) August 25, 2025 06:28
auto-merge was automatically disabled August 25, 2025 08:41

Head branch was pushed to by a user without write access

@mergify mergify bot added the ci/build label Aug 26, 2025
@Isotr0py Isotr0py enabled auto-merge (squash) August 26, 2025 13:51
@vllm-bot vllm-bot merged commit 644d57d into vllm-project:main Aug 27, 2025
64 of 71 checks passed
tc-mb pushed a commit to tc-mb/vllm that referenced this pull request Aug 27, 2025

hmellor commented Aug 27, 2025

Why has decord been re-added as a dependency? In the past we went to the effort of removing it because it has been unmaintained for 3 years.

As far as I can tell nothing that was added in this PR needs it.

@DarkLight1337

It is only in the test dependencies, and it is needed to load the model from the HF Hub.


hmellor commented Aug 27, 2025

Ah I see. That'll be why nothing in vLLM appears to be using it. Thanks for explaining.

epwalsh pushed a commit to epwalsh/vllm that referenced this pull request Aug 28, 2025
xiao-llm pushed a commit to xiao-llm/vllm that referenced this pull request Aug 28, 2025
xiao-llm pushed a commit to xiao-llm/vllm that referenced this pull request Aug 28, 2025
zhewenl pushed a commit to zhewenl/vllm that referenced this pull request Aug 28, 2025
dumb0002 pushed a commit to dumb0002/vllm that referenced this pull request Aug 28, 2025
2015aroras pushed a commit to 2015aroras/vllm that referenced this pull request Aug 29, 2025
nopperl pushed a commit to pfnet/vllm that referenced this pull request Sep 3, 2025
MatthewBonanni pushed a commit to MatthewBonanni/vllm that referenced this pull request Sep 3, 2025
MatthewBonanni pushed a commit to MatthewBonanni/vllm that referenced this pull request Sep 3, 2025
842974287 pushed a commit to 842974287/vllm that referenced this pull request Sep 3, 2025
zhewenl pushed a commit to zhewenl/vllm that referenced this pull request Sep 3, 2025
jinyouzhi pushed a commit to jinyouzhi/vllm that referenced this pull request Sep 11, 2025
jinyouzhi pushed a commit to jinyouzhi/vllm that referenced this pull request Sep 12, 2025
Labels
- ci/build
- documentation: Improvements or additions to documentation
- multi-modality: Related to multi-modality (#4194)
- new-model: Requests to new models
- ready: ONLY add when PR is ready to merge/full CI is needed
6 participants