Supporting Velvet model #11716
Conversation
src/llama-chat.h
Outdated

```diff
@@ -39,6 +39,7 @@ enum llm_chat_template {
     LLM_CHAT_TEMPLATE_GIGACHAT,
     LLM_CHAT_TEMPLATE_MEGREZ,
     LLM_CHAT_TEMPLATE_UNKNOWN,
+    LLM_CHAT_TEMPLATE_VELVET
```
It's also recommended to add a test for this chat template; see test-chat-template.cpp.
By the way, this enum value is in the wrong place: everything must come before LLM_CHAT_TEMPLATE_UNKNOWN.
Please also add a test case in test-chat-template.cpp. Otherwise you should expect it to break in the future.
I added the test case for the chat template as you suggested, and I made the requested changes. Let me know if I'm still missing something. Thanks.
Co-authored-by: Xuan-Son Nguyen <[email protected]>
I'm pausing the pull request for more tests.
Sorry, I've been busy recently; feel free to ping me when you're ready.
It would be nice to have this integrated. I tried the fork at https://github.com/fbuciuni90/llama.cpp, but it did not work for the 2B model; not sure why.
I've aligned the pull request content with the latest release. Could you let me know if I'm still missing anything before it can be merged?
I'm adding support for Velvet models (see https://huggingface.co/Almawave/Velvet-14B).