feat: add MiniMax as LLM provider for LangGraph and Semantic Kernel agents #479

octo-patch wants to merge 1 commit into a2aproject:main
Conversation
…gents

- Add MiniMax as an explicit `model_source` option in the LangGraph currency agent, with a `ChatMiniMax` subclass for think-tag stripping and `function_calling` structured output
- Add MiniMax as a `ChatServices` enum option in the Semantic Kernel travel agent, using the OpenAI-compatible API via an `AsyncOpenAI` client
- Add `MINIMAX_API_KEY` environment variable support and configuration docs
- Add 29 tests (13 unit + 3 integration per agent)
- Update READMEs with MiniMax setup instructions and API docs links
- Update `.envexample` with MiniMax configuration variables
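The think-tag stripping mentioned above can be sketched as a standalone helper. This is an illustrative sketch only: in the PR the logic lives inside the `ChatMiniMax` subclass of `ChatOpenAI`, and the function and regex names below are hypothetical.

```python
import re

# MiniMax M2.7 is a "thinking" model: it wraps its chain-of-thought in
# <think>...</think> tags, which should be removed before the reply is
# surfaced to the caller. Hypothetical standalone version of that step.
THINK_TAG_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)


def strip_think_tags(text: str) -> str:
    """Remove <think>...</think> reasoning blocks and trim whitespace."""
    return THINK_TAG_RE.sub("", text).strip()


print(strip_think_tags("<think>look up the USD/EUR rate</think>The rate is 0.92."))
# → The rate is 0.92.
```

In the agent, the subclass would apply this to each model response before returning it, so downstream parsing never sees the reasoning trace.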
Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request significantly enhances the flexibility of the sample agents by introducing MiniMax as an additional Large Language Model (LLM) provider. This allows users to leverage MiniMax's models within both the LangGraph Currency Agent and the Semantic Kernel Travel Agent, broadening the choice of backend LLMs beyond Google, OpenAI, and Azure. The integration includes specific adaptations to ensure MiniMax's unique characteristics are handled correctly, providing a robust and seamless experience for developers.
Code Review
This pull request adds support for MiniMax as an LLM provider to both the LangGraph and Semantic Kernel sample agents. The implementation is clean, well-tested, and follows good practices for integrating new providers. For the LangGraph agent, a ChatMiniMax class is introduced to handle MiniMax-specific behaviors like stripping <think> tags. For the Semantic Kernel agent, MiniMax is added as a new ChatService option leveraging its OpenAI-compatible API. The changes include comprehensive unit and integration tests, as well as documentation updates. I have one minor suggestion to improve the setup instructions in the README for consistency.
| echo "TOOL_LLM_URL=your_llm_url" > .env | ||
| echo "TOOL_LLM_NAME=your_llm_name" > .env | ||
| # If you're using MiniMax (https://platform.minimax.io): | ||
| echo "model_source=minimax" >> .env |
For consistency with the other setup instructions and to avoid potential issues if the .env file doesn't exist, the first echo command for a new provider configuration should use > to create/overwrite the file, while subsequent commands for the same provider use >> to append.
| echo "model_source=minimax" >> .env | |
| echo "model_source=minimax" > .env |
Summary
Adds MiniMax as an LLM provider option in two sample agents:
- **LangGraph Currency Agent**: Adds `minimax` as a `model_source` option alongside `google` and the generic OpenAI-compatible path. Includes a `ChatMiniMax` subclass of `ChatOpenAI` that:
  - strips `<think>` reasoning tags from model responses (MiniMax M2.7 is a thinking model)
  - uses `function_calling` for structured output (instead of `json_schema`, which MiniMax does not support)
  - sets `temperature=1.0` (MiniMax requires `(0.0, 1.0]`)
- **Semantic Kernel Travel Agent**: Adds `MINIMAX` to the `ChatServices` enum and implements `_get_minimax_chat_completion_service()` using `OpenAIChatCompletion` with a pre-configured `AsyncOpenAI` client pointing to `https://api.minimax.io/v1`.

Models

- `MiniMax-M2.7`
- `MiniMax-M2.7-highspeed`

Changes

- `samples/python/agents/langgraph/app/agent.py` — `ChatMiniMax` class + `minimax` model_source branch
- `samples/python/agents/langgraph/app/test_minimax.py` — new MiniMax tests
- `samples/python/agents/langgraph/README.md` — MiniMax setup instructions
- `samples/python/agents/semantickernel/agent.py` — `MINIMAX` enum + `_get_minimax_chat_completion_service()`
- `samples/python/agents/semantickernel/test_minimax.py` — new MiniMax tests
- `samples/python/agents/semantickernel/README.md` — MiniMax setup instructions
- `samples/python/agents/semantickernel/.envexample` — MiniMax configuration variables

7 files changed, 621 additions, 13 deletions, 29 tests
Test plan
API Reference