Upgrade optimum-intel and transformers #2611
Conversation
Pull Request Overview
This PR upgrades two key dependencies for machine learning and model optimization: optimum-intel from version 1.25.1 to 1.25.2, and transformers from version 4.52.4 to 4.53.3. These updates are applied consistently across both testing and export requirement files.
- Updated optimum-intel to version 1.25.2 for improved Intel hardware optimization support
- Updated transformers to version 4.53.3 for the latest model implementations and bug fixes
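For anyone reproducing the upgrade locally, here is a minimal sketch (not part of this PR; it assumes both packages are already installed in the active environment) for confirming the versions pulled in by the updated requirement files:

```python
# Sketch only: print the installed versions of the two upgraded packages
# after installing tests/python_tests/requirements.txt or
# samples/export-requirements.txt. Expected roughly: optimum-intel 1.25.2,
# transformers 4.53.3.
from importlib.metadata import version

for dist in ("optimum-intel", "transformers"):
    print(f"{dist}=={version(dist)}")
```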
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| tests/python_tests/requirements.txt | Updates dependency versions for testing environment |
| samples/export-requirements.txt | Updates dependency versions for model export functionality |
samples/export-requirements.txt (outdated)
  --extra-index-url https://storage.openvinotoolkit.org/simple/wheels/nightly
  openvino-tokenizers[transformers]~=2025.3.0.0.dev
- optimum-intel[nncf]==1.25.1
+ optimum-intel[nncf] @ git+https://github.com/huggingface/optimum-intel.git@v1.25.2
Suggested change:
- optimum-intel[nncf] @ git+https://github.com/huggingface/optimum-intel.git@v1.25.2
+ optimum-intel[nncf]==v1.25.2
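A side note on the suggested pin: PEP 440 treats a leading "v" as part of a version's normalized form, so `==v1.25.2` resolves to the same release as `==1.25.2`. A minimal sketch using the `packaging` library (an assumption here, though it ships alongside pip) illustrates the normalization:

```python
# Sketch only: PEP 440 normalization strips the leading "v", so the suggested
# pin ==v1.25.2 matches release 1.25.2.
from packaging.version import Version

assert Version("v1.25.2") == Version("1.25.2")
print(Version("v1.25.2"))  # prints "1.25.2"
```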
@pytest.mark.parametrize("input_prompt", ["Hello everyone"])
@pytest.mark.xfail(
    reason="Missing config.json",
    raises=subprocess.CalledProcessError,
Fails on Windows only; checking on Windows locally.
The sample works locally on Windows, so it's a CI issue.
Let's xfail or skip for Windows only. Or wait for tomorrow :)
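For reference, a minimal sketch of what a Windows-only marker could look like (hypothetical, not the change that was merged; the test name and body below are placeholders):

```python
import subprocess
import sys

import pytest


@pytest.mark.parametrize("input_prompt", ["Hello everyone"])
@pytest.mark.xfail(
    condition=sys.platform == "win32",  # only expect the failure on Windows
    reason="Missing config.json",
    raises=subprocess.CalledProcessError,
)
def test_sample(input_prompt):
    ...  # placeholder body; the real sample invocation lives in tests/python_tests
```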
Today isn't the first day it has failed.
It xpassed, so the problem was in the cached model.
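Since the failure turned out to be a stale cached model, here is a minimal sketch of clearing one model from the Hugging Face hub cache (an assumption that this is where the sample caches it; the repo id below is a placeholder):

```python
# Sketch only: delete every cached revision of a (placeholder) model so the
# sample re-downloads and re-exports it on the next run.
from huggingface_hub import scan_cache_dir

cache = scan_cache_dir()
stale = [
    rev.commit_hash
    for repo in cache.repos
    if repo.repo_id == "org/model-id"  # placeholder repo id
    for rev in repo.revisions
]
if stale:
    cache.delete_revisions(*stale).execute()
```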