
Conversation

@shanjiaz (Collaborator) commented Sep 2, 2025

SUMMARY:
Use logic similar to what is already used elsewhere in the code base to determine which KV cache parameter name to pass.

TEST PLAN:
Tested locally; this should fix the KV cache errors.


github-actions bot commented Sep 2, 2025

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

Note: This is required to complete the testing suite, please only add the label once the PR is code complete and local testing has been performed.

@gemini-code-assist bot (Contributor) left a comment

Summary of Changes

Hello @shanjiaz, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a robust mechanism to handle variations in key-value cache parameter naming within the transformers library. By dynamically adapting to the expected parameter name, it ensures that the system remains compatible across different library versions, thereby preventing potential runtime errors related to KV cache access.

Highlights

  • KV Cache Compatibility: Implemented logic to dynamically determine the correct KV cache parameter name (past_key_values or past_key_value) based on the module's forward signature (see the sketch after this list).
  • Backward Compatibility: Ensures that the transformer fix remains backward compatible with different versions of the transformers library, preventing KV cache errors.
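
For context, here is a minimal, self-contained sketch of the signature check described in the first highlight. The forward functions and the pick_past_kv_name helper are illustrative stand-ins, not code from this PR or from transformers:

import inspect

# Stand-ins for module forwards that accept one or the other spelling of the KV cache kwarg
def forward_old(self, hidden_states, past_key_value=None):
    ...

def forward_new(self, hidden_states, past_key_values=None):
    ...

def pick_past_kv_name(forward) -> str:
    # Prefer the plural spelling when the signature accepts it, otherwise fall back
    params = inspect.signature(forward).parameters
    return "past_key_values" if "past_key_values" in params else "past_key_value"

assert pick_past_kv_name(forward_new) == "past_key_values"
assert pick_past_kv_name(forward_old) == "past_key_value"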

@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request introduces a backward-compatible way to handle the past_key_values parameter by inspecting the model's forward signature. This is a good improvement for compatibility with different transformers versions. My review includes a suggestion to optimize the implementation by caching the result of inspect.signature to avoid performance overhead during calibration, and to clarify a misleading comment.

Comment on lines +251 to +259
# Determine which past KV parameter name to use based on module's forward signature
_past_kv_name = (
    "past_key_values"  # transformers#39956
    if "past_key_values" in inspect.signature(module.forward).parameters
    else "past_key_value"
)

# Update both parameter names to maintain compatibility
kwargs[_past_kv_name] = kv_cache

Severity: high

Calling inspect.signature() on every forward pass can be inefficient, especially during calibration, which may involve many iterations. Since the forward signature of a module doesn't change at runtime, this value can be computed once and cached on the module.

Additionally, the comment on line 258, # Update both parameter names to maintain compatibility, is misleading as only one parameter is being updated.

The suggested code caches the parameter name on the module for performance and removes the confusing comment.

Suggested change

Before:

# Determine which past KV parameter name to use based on module's forward signature
_past_kv_name = (
    "past_key_values"  # transformers#39956
    if "past_key_values" in inspect.signature(module.forward).parameters
    else "past_key_value"
)
# Update both parameter names to maintain compatibility
kwargs[_past_kv_name] = kv_cache

After:

if not hasattr(module, "_past_kv_name"):
    # Determine which past KV parameter name to use once and cache it
    module._past_kv_name = (
        "past_key_values"  # transformers#39956
        if "past_key_values" in inspect.signature(module.forward).parameters
        else "past_key_value"
    )
kwargs[module._past_kv_name] = kv_cache
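
As a side note, if setting an attribute on the module is undesirable, another option would be to memoize the lookup per module class rather than per instance. This is not part of the suggestion above; a minimal sketch, assuming the relevant forward is defined on the class (it would miss instance-level monkey-patched forwards), with the hypothetical helper name _past_kv_name_for:

import functools
import inspect

@functools.lru_cache(maxsize=None)
def _past_kv_name_for(module_cls) -> str:
    # One signature inspection per module class; later calls hit the cache.
    params = inspect.signature(module_cls.forward).parameters
    return "past_key_values" if "past_key_values" in params else "past_key_value"

# Hypothetical call site, mirroring the snippet above:
# kwargs[_past_kv_name_for(type(module))] = kv_cache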

Collaborator:

If this is run every forward pass, these changes are probably worth adding.

Collaborator:

I don't think there's a great place to cache this information.
@shanjiaz We can leave this to the kv cache refactor, if that's easier.

Collaborator (Author):

sure!

@dsikka (Collaborator) left a comment

Thanks, definitely need to support older transformers versions that we’ve pinned
