Conversation


@afierka-intel commented on Sep 26, 2025

Related change in extension:
HabanaAI/vllm-hpu-extension#370

This PR fixes the case where vLLM is run multiple times in the same process, e.g. when running several pytest scenarios.
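The failure mode can be sketched as follows. This is a hypothetical, self-contained illustration, not the actual vLLM or vllm-hpu-extension code: a module-level cached config survives for the lifetime of the process, so a second initialization (e.g. a later pytest case) silently reuses stale settings unless a cleanup hook clears it. All names here (`init_engine`, `reset_config`, `_cached_config`) are invented for the example.

```python
# Hypothetical sketch of the stale-global-state problem this PR addresses.
# A module-level cache persists across "runs" within one process.

_cached_config = None  # survives between initializations in the same process


def init_engine(config: dict) -> dict:
    """Initialize the 'engine', caching its config on first call."""
    global _cached_config
    if _cached_config is None:
        _cached_config = dict(config)
    return _cached_config


def reset_config() -> None:
    """Cleanup hook: clear the cached config so a re-run starts fresh."""
    global _cached_config
    _cached_config = None


# First run in the process.
first = init_engine({"device": "hpu", "dtype": "bf16"})

# Second run with different settings: without cleanup, the stale
# config from the first run is silently returned.
stale = init_engine({"device": "hpu", "dtype": "fp32"})
assert stale["dtype"] == "bf16"  # stale value reused

# With cleanup between runs (what the fix provides), the new config
# takes effect.
reset_config()
fresh = init_engine({"device": "hpu", "dtype": "fp32"})
assert fresh["dtype"] == "fp32"
```

In a pytest setting, such a cleanup hook would typically be invoked from a fixture teardown so that each test case starts from a clean state.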

Cherry-pick from v1.23.0: #1983


Signed-off-by: Artur Fierka <[email protected]>
@afierka-intel (Author)
/run-gaudi-tests

@afierka-intel (Author)
/run-gaudi-tests

@afierka-intel merged commit 5390cac into habana_main on Oct 15, 2025
47 checks passed
@afierka-intel deleted the port/afierka/clean-config branch on October 15, 2025 at 06:32
