@epistoteles epistoteles commented Feb 24, 2025

This PR adds logprobs to the response_metadata of the chat completion response. If no logprobs are available, logprobs will be None. top_logprobs are included in the metadata if they are requested.

Merging this PR will enable you to turn this logprobs ❌ into a ✅: https://python.langchain.com/docs/integrations/chat/databricks/

Implements #69

@epistoteles (Author) commented:

Note: It would be nice to document somewhere that logprobs have to be requested via extra_params:

chat_model = ChatDatabricks(
    endpoint="gpt-4o-mini",
    temperature=0,
    max_tokens=1,
    extra_params={
        "logprobs": True,
        "top_logprobs": 20
    }
)

Just setting them as top-level keyword arguments,

chat_model = ChatDatabricks(
    endpoint="gpt-4o-mini",
    temperature=0,
    max_tokens=1,
    logprobs=True,
    top_logprobs=20
)

will not throw an error, but it will not have any effect either.
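Once logprobs are requested, they can be read back out of response_metadata. A minimal sketch of doing that, assuming the logprobs follow the OpenAI-style structure (a "content" list of per-token entries); the sample metadata dict here is hypothetical, standing in for what response.response_metadata would contain after this PR:

```python
import math

# Hypothetical response_metadata, illustrating the assumed
# OpenAI-style logprobs structure; real values come from
# chat_model.invoke(...).response_metadata.
response_metadata = {
    "logprobs": {
        "content": [
            {
                "token": "Hello",
                "logprob": -0.1,
                "top_logprobs": [
                    {"token": "Hello", "logprob": -0.1},
                    {"token": "Hi", "logprob": -2.5},
                ],
            },
        ]
    }
}

def token_probabilities(metadata):
    """Convert each generated token's logprob into a probability."""
    logprobs = metadata.get("logprobs")
    if logprobs is None:
        # The endpoint returned no logprobs (e.g. they were not requested).
        return []
    return [
        (entry["token"], math.exp(entry["logprob"]))
        for entry in logprobs["content"]
    ]

print(token_probabilities(response_metadata))
```

The None guard matters because, per the PR description, logprobs is None in response_metadata whenever the endpoint did not return any.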
