Conversation

@quic-meetkuma (Contributor)

  • With this change, the FT dependencies must be explicitly installed via "pip install -e .[ft]".
  • Added this so that the eager stack can have different pytorch and transformers dependencies than the AOT stack (see the sketch after this list).
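
A minimal sketch of the intended pyproject.toml layout, assuming the new extra sits under [project.optional-dependencies] next to the existing ones; the pins and environment markers are taken from the hunks quoted in this review and are illustrative rather than a final list:

```toml
[project.optional-dependencies]
test    = ["pytest", "pytest-mock"]
quality = ["black", "ruff", "hf_doc_builder@git+https://github.com/huggingface/doc-builder.git"]

# New FT (fine-tuning) extra, installed explicitly via `pip install -e ".[ft]"`,
# so the eager stack can pin torch/transformers independently of the AOT stack.
ft = [
    "tensorboard ; python_version>='3.10' and python_version<'3.12' and platform_machine=='x86_64'",
    "transformers==4.55.0 ; python_version>='3.10' and python_version<'3.12' and platform_machine=='x86_64'",
]
```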

quality = ["black", "ruff", "hf_doc_builder@git+https://github.com/huggingface/doc-builder.git"]

# New FT dependencies
ft = [
Contributor:

What about the accelerate package?

Contributor:

The whl file for that would come from the Eager team itself, just like the torch_qaic package.

Contributor:

That would require explicit installation again by the user, right? It would be good if it could be included here, so that pip install .[ft] installs all the dependencies required for FT in one go.

Contributor Author:

Yes, the plan is to include the files here and have them install from the /opt/ location.
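
If the wheels do ship under /opt/, one way to have pip install .[ft] pick them up in one go is a PEP 508 direct file reference inside the ft extra. The package name, version, and path below are hypothetical placeholders; the actual artifact locations are not stated in this thread:

```toml
ft = [
    # Hypothetical local wheel (placeholder name/version/path); pip installs it
    # straight from the filesystem when the ft extra is requested.
    "accelerate @ file:///opt/qaic/wheels/accelerate-1.0.0-py3-none-any.whl",
    # ...remaining ft dependencies...
]
```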

@quic-meetkuma quic-meetkuma marked this pull request as draft November 25, 2025 08:56
…k to resolve the conflicting deps.

Signed-off-by: meetkuma <[email protected]>
test = ["pytest","pytest-mock"]
docs = ["Sphinx==7.1.2","sphinx-rtd-theme==2.0.0","myst-parser==3.0.1","sphinx-multiversion"]
quality = ["black", "ruff", "hf_doc_builder@git+https://github.com/huggingface/doc-builder.git"]
infer = [
Contributor:

This will be the default; we just need to override it when we do pip install -e .[ft].

Contributor Author:

No, having torch under the dependencies section and a different torch version under the "ft" optional dependencies creates conflicts. Hence, I am proposing to move the different torch versions into different sections. The same will be true for transformers: FT will soon have its own transformers version (either open source or based on a whl file that contains the qaic backend changes).
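
A sketch of the proposed split, assuming torch and transformers are pinned only inside the stack-specific extras rather than in the shared dependencies list. The infer entries and the base dependency shown are placeholders; the ft entries mirror the hunk quoted further down in this review:

```toml
[project]
# Shared dependencies carry only packages both stacks agree on; the
# torch/transformers pins move out so the resolver never sees two conflicting pins.
dependencies = [
    "numpy",  # placeholder for the common, non-conflicting requirements
]

[project.optional-dependencies]
# AOT / inference stack keeps its own pins (left unpinned here for illustration).
infer = [
    "torch",
    "transformers",
]
# Eager / fine-tuning stack pins a different, CPU-only torch wheel.
ft = [
    "torch@https://download.pytorch.org/whl/cpu/torch-2.9.0%2Bcpu-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and platform_machine=='x86_64'",
    "transformers==4.55.0 ; python_version>='3.10' and python_version<'3.12' and platform_machine=='x86_64'",
]
```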

@quic-swatia (Contributor) commented Nov 26, 2025

In this PR, we should use pip install .[ft] in the Jenkinsfile once #629 is merged. This will test the correct intended environment in the CI.

]
ft = [
"tensorboard ; python_version>='3.10' and python_version<'3.12' and platform_machine=='x86_64'",
"transformers==4.55.0 ; python_version>='3.10' and python_version<'3.12' and platform_machine=='x86_64'",
Contributor:

The dependencies section already has the same version of transformers at line 22 of this file. Why is this needed here?

Contributor:

Please check the Python version comment for this line as well.

Contributor Author:

Eager supports Python 3.10 and 3.12.
Explicitly adding transformers here simplifies future modifications to FT's transformers dependency.

Contributor:

Transformers and torch should be handled uniformly. While both of them are added in the FT section, for inference one is kept in dependencies and the other in the infer section.

Contributor Author:

Updated in the latest revision.

"tensorboard ; python_version>='3.10' and python_version<'3.12' and platform_machine=='x86_64'",
"transformers==4.55.0 ; python_version>='3.10' and python_version<'3.12' and platform_machine=='x86_64'",
"torch@https://download.pytorch.org/whl/cpu/torch-2.9.0%2Bcpu-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and platform_machine=='x86_64'",
"torch@https://download.pytorch.org/whl/cpu/torch-2.9.0%2Bcpu-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and platform_machine=='x86_64'",
Contributor:

Are we supporting Python 3.11? At line 20, requires-python = ">=3.8,<3.11" is specified, which will not let QEff install on Python 3.11.

Contributor Author:

Yes, that is the global-level Python constraint. It will soon be extended to support Python 3.12 as well, and in that case this line still holds true. There is no harm in adding it here.
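
To make that concrete: the environment marker on the cp311 wheel line stays dormant until the global constraint is widened, so the entry is harmless today. A sketch, assuming requires-python is later relaxed (the exact new upper bound is not stated in this thread):

```toml
[project]
# Today this is ">=3.8,<3.11", so the cp311 marker below never matches.
# Once the range is widened (illustrative bound), the same line starts to apply.
requires-python = ">=3.8,<3.13"

[project.optional-dependencies]
ft = [
    "torch@https://download.pytorch.org/whl/cpu/torch-2.9.0%2Bcpu-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and platform_machine=='x86_64'",
]
```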
