Conversation

shiyuan680 (Contributor)
@shiyuan680 shiyuan680 commented Jul 22, 2025

What this PR does / why we need it?

Adds unit tests for fused_moe.py.

Does this PR introduce any user-facing change?

How was this patch tested?

@shiyuan680 shiyuan680 force-pushed the ut branch 10 times, most recently from 7c72764 to 48bb07b Compare July 23, 2025 01:45

codecov bot commented Jul 23, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 65.08%. Comparing base (5f0b42e) to head (0c2d4f2).
Report is 6 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1930      +/-   ##
==========================================
+ Coverage   60.24%   65.08%   +4.83%     
==========================================
  Files          73       76       +3     
  Lines        8004     8274     +270     
==========================================
+ Hits         4822     5385     +563     
+ Misses       3182     2889     -293     
Flag        Coverage            Δ
unittests   65.08% <100.00%>    (+4.83%) ⬆️

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.

@shiyuan680 shiyuan680 force-pushed the ut branch 2 times, most recently from da539f2 to 46fba39 Compare July 23, 2025 03:05
@momo609 momo609 requested a review from Copilot July 23, 2025 03:36

@Copilot Copilot AI left a comment

Pull Request Overview

This PR adds comprehensive unit tests for the fused_moe.py module in the vLLM Ascend project. The tests cover the AscendFusedMoE class and AscendUnquantizedFusedMoEMethod with various configurations and execution paths.

  • Adds test coverage for fused MoE operations including initialization, forward passes, and weight processing
  • Implements extensive mocking infrastructure for distributed training and NPU-specific operations
  • Provides parameterized tests for different MoE configurations including expert parallelism and routing methods
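The mocking approach described above can be sketched roughly as follows. This is a minimal illustration, not code from the PR: `local_expert_count` and `fake_npu_group` are invented names, and the real tests patch vllm_ascend internals rather than a free function.

```python
from unittest.mock import MagicMock

def local_expert_count(global_num_experts, dist_group):
    # Split the global expert pool evenly across expert-parallel ranks.
    return global_num_experts // dist_group.world_size()

def test_local_expert_count_with_mocked_group():
    # Stand in for an NPU/distributed process group without real hardware.
    fake_npu_group = MagicMock()
    fake_npu_group.world_size.return_value = 4
    assert local_expert_count(16, fake_npu_group) == 4

test_local_expert_count_with_mocked_group()
```

The same pattern scales to the parameterized tests in the PR: each configuration swaps in a differently configured mock for the distributed environment.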

def test_init_with_quant(self, mock_dist_env, default_moe_config):
    mock_quant_config = MagicMock()
    mock_quant_method = MagicMock()
    mock_quant_config.get_quant_mothod.return_value = mock_quant_method
Copilot AI Jul 23, 2025


There is a typo in the method name 'get_quant_mothod'. It should be 'get_quant_method'.

Suggested change:
-    mock_quant_config.get_quant_mothod.return_value = mock_quant_method
+    mock_quant_config.get_quant_method.return_value = mock_quant_method


Contributor Author

Updated.
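Incidentally, a typo like `get_quant_mothod` can pass unnoticed because a bare `MagicMock` auto-creates any attribute it is asked for; constraining the mock with `spec=` surfaces such misspellings. This is a general `unittest.mock` illustration, not code from the PR, and `QuantConfig` is a made-up class:

```python
from unittest.mock import MagicMock

class QuantConfig:
    # Made-up minimal interface for illustration.
    def get_quant_method(self):
        return "real"

# A bare MagicMock happily creates the misspelled attribute.
loose = MagicMock()
loose.get_quant_mothod.return_value = "oops"
assert loose.get_quant_mothod() == "oops"  # typo goes unnoticed

# spec= restricts the mock to QuantConfig's real attributes.
strict = MagicMock(spec=QuantConfig)
caught = False
try:
    strict.get_quant_mothod  # typo now raises AttributeError
except AttributeError:
    caught = True
assert caught
```

Using `spec=` (or `create_autospec`) in test fixtures is a cheap way to keep mocks honest when the mocked interface evolves.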

"""
global_num_experts, ep_size, select_softmax = others_param
with patch(
"vllm_ascend.ops.fused_moe.SELECT_GATING_TOPK_SOTFMAX_EXPERTS",
Copilot AI Jul 23, 2025


There is a typo in 'SOTFMAX_EXPERTS'. It should be 'SOFTMAX_EXPERTS'.

Suggested change:
-    "vllm_ascend.ops.fused_moe.SELECT_GATING_TOPK_SOTFMAX_EXPERTS",
+    "vllm_ascend.ops.fused_moe.SELECT_GATING_TOPK_SOFTMAX_EXPERTS",


Contributor Author

The original parameter name in the source code is misspelled, so the test patches that same spelling.
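For context, `unittest.mock.patch` swaps a module attribute for the duration of the `with` block and restores it afterwards, which is why the test must patch the name exactly as it appears in the source, misspelling included. A standalone sketch using a throwaway module (`fake_fused_moe` is invented; only the constant name mirrors the PR):

```python
import sys
import types
from unittest.mock import patch

# Throwaway module standing in for vllm_ascend.ops.fused_moe.
mod = types.ModuleType("fake_fused_moe")
mod.SELECT_GATING_TOPK_SOTFMAX_EXPERTS = False  # misspelling kept on purpose
sys.modules["fake_fused_moe"] = mod

with patch("fake_fused_moe.SELECT_GATING_TOPK_SOTFMAX_EXPERTS", True):
    # Inside the block, code reading the constant sees the patched value.
    assert sys.modules["fake_fused_moe"].SELECT_GATING_TOPK_SOTFMAX_EXPERTS is True

# On exit, patch restores the original value.
assert mod.SELECT_GATING_TOPK_SOTFMAX_EXPERTS is False
```

If the constant were later renamed to the correct spelling, every `patch` target string in the tests would need to change with it, which is the refactoring caveat raised below.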

@momo609 momo609 self-requested a review July 23, 2025 03:37
Signed-off-by: yangcheng <[email protected]>
momo609 (Collaborator) commented Jul 23, 2025

lgtm

wangxiyuan (Collaborator)
@yiz-liu please consider updating this file as well once you refactor the fused MoE code.

@wangxiyuan wangxiyuan merged commit ac0bf13 into vllm-project:main Jul 23, 2025
14 checks passed
@shiyuan680 shiyuan680 deleted the ut branch July 31, 2025 12:13
3 participants