MoonshotChat fails to authenticate with the moonshot.ai api #539

@joaompinto

Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain Community rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain Community.
  • I read what a minimal reproducible example is (https://stackoverflow.com/help/minimal-reproducible-example).
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Example Code

from langchain_community.chat_models import MoonshotChat

chat = MoonshotChat(
    api_base="https://api.moonshot.ai/v1",
    temperature=0.5,
    model="kimi-k2.5",
)

messages = [
    ("system", "hello"),
    ("human", "who are you"),
]
chat.invoke(messages)

Error Message and Stack Trace (if applicable)

File "C:\Users\lameg\Projects\moontest\run.py", line 13, in <module>
chat.invoke(messages)
~~~~~~~~~~~^^^^^^^^^^
File "C:\Users\lameg\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 402, in invoke
self.generate_prompt(
~~~~~~~~~~~~~~~~~~~~^
[self._convert_input(input)],
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<6 lines>...
**kwargs,
^^^^^^^^^
).generations[0][0],
^
File "C:\Users\lameg\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 1121, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\lameg\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 931, in generate
self._generate_with_cache(
~~~~~~~~~~~~~~~~~~~~~~~~~^
m,
^^
...<2 lines>...
**kwargs,
^^^^^^^^^
)
^
File "C:\Users\lameg\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 1233, in _generate_with_cache
result = self._generate(
messages, stop=stop, run_manager=run_manager, **kwargs
)
File "C:\Users\lameg\.venv\Lib\site-packages\langchain_community\chat_models\openai.py", line 476, in _generate
response = self.completion_with_retry(
messages=message_dicts, run_manager=run_manager, **params
)
File "C:\Users\lameg\.venv\Lib\site-packages\langchain_community\chat_models\openai.py", line 387, in completion_with_retry
return self.client.create(**kwargs)
~~~~~~~~~~~~~~~~~~^^^^^^^^^^
File "C:\Users\lameg\.venv\Lib\site-packages\openai\_utils\_utils.py", line 286, in wrapper
return func(*args, **kwargs)
File "C:\Users\lameg\.venv\Lib\site-packages\openai\resources\chat\completions\completions.py", line 1192, in create
return self._post(
~~~~~~~~~~^
"/chat/completions",
^^^^^^^^^^^^^^^^^^^^
...<47 lines>...
stream_cls=Stream[ChatCompletionChunk],
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "C:\Users\lameg\.venv\Lib\site-packages\openai\_base_client.py", line 1297, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\lameg\.venv\Lib\site-packages\openai\_base_client.py", line 1070, in request
raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Invalid Authentication', 'type': 'invalid_authentication_error'}}

Description

  • I'm trying to use the Kimi K2.5 model (kimi-k2.5) through MoonshotChat, but every request to https://api.moonshot.ai/v1 is rejected with the 401 Invalid Authentication error shown above.
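One way to rule the LangChain wrapper in or out is to assemble the equivalent request by hand and inspect the Authorization header before anything goes over the wire. This sketch uses only the standard library; `build_chat_request` is a hypothetical helper, and the payload simply mirrors the OpenAI-compatible /chat/completions shape:

```python
import json
import urllib.request

def build_chat_request(
    api_key: str,
    base: str = "https://api.moonshot.ai/v1",
) -> urllib.request.Request:
    """Build (without sending) the chat completion request for inspection."""
    payload = {
        "model": "kimi-k2.5",
        "messages": [
            {"role": "system", "content": "hello"},
            {"role": "user", "content": "who are you"},
        ],
        "temperature": 0.5,
    }
    return urllib.request.Request(
        f"{base}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request("sk-test")
print(req.full_url)                      # https://api.moonshot.ai/v1/chat/completions
print(req.get_header("Authorization"))   # Bearer sk-test
```

If sending this request by hand (e.g. via `urllib.request.urlopen(req)`) with a real key succeeds while MoonshotChat fails, the key is valid and the problem is in how the wrapper forwards it.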


    Labels

    bug (Something isn't working)
