Qw23 t722 #1938


Open · wants to merge 2 commits into v0.9.1-dev

Conversation

NicholasTao

What this PR does / why we need it?

Does this PR introduce any user-facing change?

How was this patch tested?

taoyuxiang and others added 2 commits July 22, 2025 16:19

This pull request has conflicts; please resolve them before we can evaluate it.

            cos: Optional[torch.Tensor] = None,
            sin: Optional[torch.Tensor] = None,
            kv_cache: Optional[torch.Tensor] = None,
            attn_metadata: Optional[AttentionMetadata] = None) -> torch.Tensor:
        qkv, _ = self.qkv_proj(hidden_states)
        q, k, v = qkv.split([self.q_size, self.kv_size, self.kv_size], dim=-1)
        if type(self.rotary_emb) is RotaryEmbedding:


Change the logic here: decode should take the cache path, and non-decode should go through the else branch below.
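A hedged sketch of the branching this comment asks for; the attn_metadata.is_decode flag and the cos/sin keyword arguments are assumed names for illustration, not the project's actual API:

    if type(self.rotary_emb) is RotaryEmbedding:
        # Assumed decode predicate; the real check may look different.
        if attn_metadata is not None and attn_metadata.is_decode:
            # Decode: take the precomputed cos/sin cache path.
            q, k = self.rotary_emb(positions, q, k, cos=cos, sin=sin)
        else:
            # Non-decode (prefill): fall through to the original else branch.
            q, k = self.rotary_emb(positions, q, k)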

        )
        for i in range(self.start_layer, self.end_layer):
            layer = self.layers[i]
            kv_c = kv_caches[i - self.start_layer] \


Please rename kv_c to kv_cache.
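A minimal sketch of the rename, using the loop from the diff above; the "if kv_caches is not None" continuation is a hypothetical completion of the truncated line, not the actual patch:

    for i in range(self.start_layer, self.end_layer):
        layer = self.layers[i]
        # Renamed from kv_c to kv_cache per this comment.
        kv_cache = kv_caches[i - self.start_layer] \
            if kv_caches is not None else None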

from vllm.model_executor.layers.rotary_embedding import (
    DeepseekScalingRotaryEmbedding, RotaryEmbedding)

from vllm_ascend.ascend_config import get_ascend_config


Remove deepseek_rope_init_func and the other related interfaces.
