Hello,
I hope this is the right place to ask. It would be nice to have ONNX weights added for the models listed below, just like was done for:
https://huggingface.co/jinaai/jina-embeddings-v2-base-en/tree/main
models:
jina-embedding-t-en-v1
jina-embedding-s-en-v1
jina-embedding-b-en-v1
jina-embedding-l-en-v1
Ideally with both the quantized and unquantized weights placed in a folder named onnx, just like:
https://huggingface.co/Xenova/jina-embeddings-v2-base-en
onnx
- model.onnx
- model_quantized.onnx
I will try converting one myself (jina-embedding-s-en-v1) soon to see if it works, but it would be nice to have official weights to compare results against.
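For anyone else wanting to try this in the meantime, a possible sketch of the conversion: the transformers.js repo ships a conversion script that produces exactly the onnx/model.onnx and onnx/model_quantized.onnx layout used in the Xenova repos linked above. I haven't verified it against these v1 models specifically, so the exact flags and script path are assumptions based on the repo's documented usage:

```shell
# Clone transformers.js, whose conversion script emits the
# onnx/model.onnx + onnx/model_quantized.onnx layout.
git clone https://github.com/xenova/transformers.js.git
cd transformers.js

# Install the script's Python dependencies (assumed location).
pip install -r scripts/requirements.txt

# Export and quantize one of the v1 models (small variant here);
# output lands under a local models/ directory.
python -m scripts.convert --quantize --model_id jinaai/jina-embedding-s-en-v1
```

Results from this unofficial route may differ slightly from an official export (opset choice, quantization settings), which is exactly why official weights would be useful as a baseline.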