FleetModel DPO, AutoModel => FleetModel. #3024
Conversation
Thanks for your contribution!
| "LayerSpec", Callable[["GPTModelProvider"], "LayerSpec"] | ||
| ] = get_gpt_decoder_block_spec | ||
|
|
||
| transform_rules = {"n_routed_experts": "moe_num_experts"} |
Fleet has already been updated to use n_routed_experts, so this mapping shouldn't be needed anymore, right?
done
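For context, a minimal sketch of how a key-mapping table like transform_rules might be consumed when translating Fleet config keys into PaddleFormers names; convert_config_keys and the sample dict are hypothetical illustrations, not part of this PR.

from typing import Any, Dict

transform_rules: Dict[str, str] = {"n_routed_experts": "moe_num_experts"}

def convert_config_keys(config: Dict[str, Any], rules: Dict[str, str]) -> Dict[str, Any]:
    """Rename config keys according to `rules`; unmapped keys pass through unchanged."""
    return {rules.get(key, key): value for key, value in config.items()}

fleet_config = {"n_routed_experts": 64, "hidden_size": 4096}
print(convert_config_keys(fleet_config, transform_rules))
# -> {'moe_num_experts': 64, 'hidden_size': 4096}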
# MoE / FP8
moe_num_experts: Optional[int] = None
num_moe_experts: Optional[int] = None
Is this meant to align with Fleet? It has been renamed to n_routed_experts there.
done
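A minimal sketch of the alignment being discussed: keeping a single Fleet-style field name instead of two near-duplicate ones. The dataclass is hypothetical shorthand for the provider config, not the actual class.

from dataclasses import dataclass
from typing import Optional

@dataclass
class MoEProviderConfig:
    # MoE / FP8: one Fleet-aligned name replaces moe_num_experts / num_moe_experts
    n_routed_experts: Optional[int] = None

print(MoEProviderConfig(n_routed_experts=64))  # MoEProviderConfig(n_routed_experts=64)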
self.config = config
self.dpo_config = copy.deepcopy(config.get("dpo_config", None))
self.kto_config = copy.deepcopy(config.get("kto_config", None))
self.dpo_config = copy.deepcopy(config.dpo_config) if hasattr(config, "dpo_config") else None
Why change this? The original version was fine.
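To make the point concrete, a minimal sketch contrasting the two access styles: config.get works when config is dict-like, while the hasattr rewrite targets attribute-style configs. read_optional is a hypothetical helper that handles both, not code from this PR.

import copy
from typing import Any, Optional

def read_optional(config: Any, name: str) -> Optional[Any]:
    if isinstance(config, dict):
        value = config.get(name, None)       # dict-style access, as in the original code
    else:
        value = getattr(config, name, None)  # attribute-style access, as in the rewrite
    return copy.deepcopy(value) if value is not None else None

print(read_optional({"dpo_config": {"beta": 0.1}}, "dpo_config"))  # {'beta': 0.1}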
elif hasattr(model, "init_config") and model.init_config is not None:
    model_config_json = json.dumps(model.get_model_config(), ensure_ascii=False, indent=2)
    self.vdl_writer.add_text("model_config", model_config_json)
# elif hasattr(model, "init_config") and model.init_config is not None:
Why was this commented out?
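For reference, a minimal sketch of the logging branch above, assuming vdl_writer is a VisualDL LogWriter and model.get_model_config() returns a JSON-serializable dict; the standalone function is an illustration, not the trainer's actual method.

import json

def log_model_config(vdl_writer, model):
    # Serialize the model config and record it as text in VisualDL.
    if hasattr(model, "get_model_config"):
        model_config_json = json.dumps(model.get_model_config(), ensure_ascii=False, indent=2)
        vdl_writer.add_text("model_config", model_config_json)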
try:
    model_class = getattr(import_class, init_class)
    return model_class
except AttributeError:
If this is a temporary workaround, please add a comment here.
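A minimal sketch of the resolution logic with the comment the reviewer asks for; resolve_model_class and the Fleet-suffix fallback are hypothetical, showing one way the temporary workaround could be documented.

import importlib

def resolve_model_class(module_name: str, init_class: str):
    import_class = importlib.import_module(module_name)
    try:
        return getattr(import_class, init_class)
    except AttributeError:
        # TEMPORARY: the Fleet-suffixed class may not exist in every build;
        # fall back to the base name until the naming settles upstream.
        return getattr(import_class, init_class.removesuffix("Fleet"))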
@register_base_model
class Glm4MoeModelFleet(Glm4MoePreTrainedModel):
Is the Fleet suffix necessary?
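For readers unfamiliar with the decorator, a minimal sketch of what a registry decorator like register_base_model typically does; this illustrates the pattern, not the actual PaddleFormers implementation.

from typing import Dict, Type

BASE_MODEL_REGISTRY: Dict[str, Type] = {}

def register_base_model(cls: Type) -> Type:
    # Record the class under its own name so AutoModel-style lookup can resolve it later.
    BASE_MODEL_REGISTRY[cls.__name__] = cls
    return cls

@register_base_model
class Glm4MoeModelFleet:  # stands in for the real Glm4MoePreTrainedModel subclass
    pass

print("Glm4MoeModelFleet" in BASE_MODEL_REGISTRY)  # True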
lugimzzz left a comment
LGTM
Adds support for DPO training with the GLM-4.5 Fleet network; launch training directly with paddleformers-cli train /path/to/dpo.yaml.
Before launching, set two PYTHONPATH entries:
export PYTHONPATH=/path/to/PaddleFleet/:/path/to/PaddleFleet/src/:$PYTHONPATH  # change to your own PaddleFleet path
export PYTHONPATH=/path/to/PaddleFormers/:$PYTHONPATH