
@allenbenz allenbenz commented Apr 30, 2025

Ran into this while running convert against a sketchy model.

@turboderp (Member) commented

It's really weird that this can happen if the model otherwise works. It implies the model was perfectly quantized, which rather makes me think something went wrong. Though, in principle, if you're quantizing an already quantized model, perhaps? Idk. Did the converted model work after this?

@allenbenz (Author) commented

This was a while back, but I do remember it was a merge/finetune and the converted model worked.
I think you are right that it was probably caused by some layer of the model already being effectively quantized by some part of the process by which it was made.
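
The hypothesis above (a layer that was effectively pre-quantized before conversion) can be sanity-checked by counting distinct values in each weight tensor: a tensor drawn from only a handful of levels can be re-quantized with essentially zero error. A minimal sketch of such a check, using NumPy; the function name is hypothetical and not part of any converter:

```python
import numpy as np

def distinct_value_ratio(w: np.ndarray, sample: int = 1_000_000) -> float:
    """Fraction of distinct values in a (sampled) weight tensor.

    A very low ratio suggests the weights were already quantized
    (e.g. by a prior quantization pass, or a merge of quantized
    checkpoints), which can make a later quantization step lossless.
    """
    flat = w.ravel()
    if flat.size > sample:
        rng = np.random.default_rng(0)
        flat = rng.choice(flat, size=sample, replace=False)
    return np.unique(flat).size / flat.size

# Example: a float tensor that only takes 16 distinct values,
# as it might after a crude 4-bit round-trip.
levels = np.linspace(-1.0, 1.0, 16)
rng = np.random.default_rng(0)
w = levels[rng.integers(0, 16, size=4096)]
ratio = distinct_value_ratio(w)
print(ratio)  # tiny: at most 16 distinct values over 4096 entries
```

Running this over each tensor of a suspect checkpoint would flag any layer whose ratio is suspiciously small, without needing to re-run the full conversion.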
