feat: add MiniMax as LLM provider for LangGraph and Semantic Kernel agents #479

Open
octo-patch wants to merge 1 commit into a2aproject:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Adds MiniMax as an LLM provider option in two sample agents:

  • LangGraph Currency Agent: Adds minimax as a model_source option alongside google and the generic OpenAI-compatible path. Includes a ChatMiniMax subclass of ChatOpenAI that:

    • Strips <think> reasoning tags from model responses (MiniMax M2.7 is a thinking model)
    • Uses function_calling for structured output (instead of json_schema which MiniMax does not support)
    • Defaults to temperature=1.0 (MiniMax requires a temperature in (0.0, 1.0])
  • Semantic Kernel Travel Agent: Adds MINIMAX to the ChatServices enum and implements _get_minimax_chat_completion_service() using OpenAIChatCompletion with a pre-configured AsyncOpenAI client pointing to https://api.minimax.io/v1.
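The think-tag stripping mentioned above can be illustrated with a minimal standalone sketch. This is not the PR's actual `ChatMiniMax` implementation (which subclasses `ChatOpenAI`); it only shows the kind of post-processing the class would apply to responses from a thinking model:

```python
import re

# MiniMax M2-family models emit <think>...</think> reasoning blocks before
# the final answer; an OpenAI-compatible client receives them as plain text.
# DOTALL lets the pattern match reasoning that spans multiple lines.
THINK_TAG_RE = re.compile(r"<think>.*?</think>\s*", flags=re.DOTALL)


def strip_think_tags(text: str) -> str:
    """Remove <think>...</think> reasoning spans from a model response."""
    return THINK_TAG_RE.sub("", text).strip()


raw = "<think>The user wants USD to EUR.\nCheck the rates tool.</think>1 USD = 0.92 EUR"
print(strip_think_tags(raw))  # -> 1 USD = 0.92 EUR
```

In the real class this logic would run inside the chat model's response handling, so downstream LangGraph nodes only ever see the final answer.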

Models

| Model ID | Description |
| --- | --- |
| MiniMax-M2.7 | Default — Peak Performance, Ultimate Value |
| MiniMax-M2.7-highspeed | Same performance, faster and more agile |
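A provider branch like the ones described above typically resolves its settings from environment variables. The sketch below is illustrative only: `MINIMAX_API_KEY` and the base URL come from the PR description, while `MINIMAX_MODEL`, `MINIMAX_TEMPERATURE`, and the helper name are hypothetical stand-ins for whatever the actual code uses:

```python
# Illustrative config resolution for a MiniMax provider branch.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"  # endpoint named in the PR
DEFAULT_MODEL = "MiniMax-M2.7"                  # default model from the table above


def resolve_minimax_config(env: dict) -> dict:
    """Build MiniMax client settings from environment-style variables."""
    api_key = env.get("MINIMAX_API_KEY")
    if not api_key:
        raise ValueError("MINIMAX_API_KEY is required when model_source=minimax")
    # MiniMax rejects temperature=0.0; the valid range is (0.0, 1.0].
    temperature = float(env.get("MINIMAX_TEMPERATURE", "1.0"))
    if not 0.0 < temperature <= 1.0:
        raise ValueError("MiniMax temperature must be in (0.0, 1.0]")
    return {
        "base_url": MINIMAX_BASE_URL,
        "api_key": api_key,
        "model": env.get("MINIMAX_MODEL", DEFAULT_MODEL),
        "temperature": temperature,
    }


print(resolve_minimax_config({"MINIMAX_API_KEY": "sk-demo"})["model"])  # -> MiniMax-M2.7
```

Validating the temperature at configuration time surfaces the (0.0, 1.0] constraint as a clear error rather than an opaque API rejection.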

Changes

| File | Change |
| --- | --- |
| samples/python/agents/langgraph/app/agent.py | Add ChatMiniMax class + minimax model_source branch |
| samples/python/agents/langgraph/app/test_minimax.py | 13 unit + 3 integration tests |
| samples/python/agents/langgraph/README.md | MiniMax setup instructions + API docs link |
| samples/python/agents/semantickernel/agent.py | Add MINIMAX enum + _get_minimax_chat_completion_service() |
| samples/python/agents/semantickernel/test_minimax.py | 10 unit + 3 integration tests |
| samples/python/agents/semantickernel/README.md | MiniMax setup instructions + API docs link |
| samples/python/agents/semantickernel/.envexample | MiniMax env var examples |

7 files changed, 621 additions, 13 deletions, 29 tests

Test plan

  • LangGraph unit tests pass (13/13)
  • LangGraph integration tests pass with live MiniMax API (3/3)
  • Semantic Kernel unit tests pass (10/10)
  • Semantic Kernel integration tests pass with live MiniMax API (3/3)
  • Existing agent behavior unchanged (Google/OpenAI/Azure paths untouched)

API Reference

…gents

- Add MiniMax as an explicit model_source option in the LangGraph currency agent
  with ChatMiniMax subclass for think-tag stripping and function_calling
  structured output
- Add MiniMax as a ChatServices enum option in the Semantic Kernel travel agent
  using OpenAI-compatible API via AsyncOpenAI client
- Add MINIMAX_API_KEY environment variable support and configuration docs
- Add 29 tests (LangGraph: 13 unit + 3 integration; Semantic Kernel: 10 unit + 3 integration)
- Update READMEs with MiniMax setup instructions and API docs links
- Update .envexample with MiniMax configuration variables
@gemini-code-assist
Contributor

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the flexibility of the sample agents by introducing MiniMax as an additional Large Language Model (LLM) provider. This allows users to leverage MiniMax's models within both the LangGraph Currency Agent and the Semantic Kernel Travel Agent, broadening the choice of backend LLMs beyond Google, OpenAI, and Azure. The integration includes specific adaptations to ensure MiniMax's unique characteristics are handled correctly, providing a robust and seamless experience for developers.

Highlights

  • LangGraph Currency Agent Integration: Integrated MiniMax as a new LLM provider option for the LangGraph Currency Agent, including a custom ChatMiniMax class to handle MiniMax-specific response formatting (stripping tags) and structured output (using function_calling).
  • Semantic Kernel Travel Agent Integration: Added MiniMax as a new LLM provider option for the Semantic Kernel Travel Agent, extending the ChatServices enum and implementing a dedicated service function to configure the MiniMax client via its OpenAI-compatible API.
  • Comprehensive Testing: Introduced new unit and integration tests for both the LangGraph and Semantic Kernel agents to ensure proper functionality and compatibility with the MiniMax API.
  • Documentation Updates: Updated README files and environment examples for both agents to include clear setup instructions and API references for MiniMax.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request adds support for MiniMax as an LLM provider to both the LangGraph and Semantic Kernel sample agents. The implementation is clean, well-tested, and follows good practices for integrating new providers. For the LangGraph agent, a ChatMiniMax class is introduced to handle MiniMax-specific behaviors like stripping <think> tags. For the Semantic Kernel agent, MiniMax is added as a new ChatService option leveraging its OpenAI-compatible API. The changes include comprehensive unit and integration tests, as well as documentation updates. I have one minor suggestion to improve the setup instructions in the README for consistency.

echo "TOOL_LLM_URL=your_llm_url" > .env
echo "TOOL_LLM_NAME=your_llm_name" >> .env
# If you're using MiniMax (https://platform.minimax.io):
echo "model_source=minimax" >> .env

low

For consistency with the other setup instructions and to avoid potential issues if the .env file doesn't exist, the first echo command for a new provider configuration should use > to create/overwrite the file, while subsequent commands for the same provider use >> to append.

Suggested change
echo "model_source=minimax" >> .env
echo "model_source=minimax" > .env
