
Add MiniMax as an alternative OpenAI-compatible provider #474

Open
octo-patch wants to merge 6 commits into microsoft:main from octo-patch:feature/add-minimax-provider

Conversation


@octo-patch octo-patch commented Mar 13, 2026

Summary

  • Add MiniMax as an alternative OpenAI-compatible LLM provider for the course code samples
  • MiniMax offers large-context models (up to 204K tokens) via an OpenAI-compatible API, making it a drop-in option with the existing OpenAIChatClient
  • Update hotel_booking_workflow_sample.py to support automatic provider selection based on environment variables (MiniMax, GitHub Models, or OpenAI)

Changes

  • .env.example: Add MINIMAX_API_KEY, MINIMAX_BASE_URL, and MINIMAX_MODEL_ID variables
  • 00-course-setup/README.md: Add setup instructions for using MiniMax as an alternative provider
  • 14-microsoft-agent-framework/README.md: Add MiniMax provider example in the agent creation section
  • hotel_booking_workflow_sample.py: Implement provider selection logic that auto-detects MiniMax, GitHub Models, or OpenAI based on available environment variables
  • README.md: Mention MiniMax as a supported alternative provider
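The provider-selection logic described above might look roughly like the sketch below. Function name, the `GITHUB_TOKEN` variable, and the fallback model IDs are illustrative assumptions, not the PR's actual code; only the `MINIMAX_*` variable names come from the PR itself.

```python
import os

def select_provider(env=None):
    """Pick an LLM provider based on which credentials are set.

    Priority: MiniMax, then GitHub Models, then OpenAI.
    Returns an illustrative (provider_name, model_id) pair.
    """
    env = os.environ if env is None else env
    if env.get("MINIMAX_API_KEY"):
        # MiniMax is OpenAI-compatible: the sample can reuse the existing
        # OpenAI-style client, pointed at MINIMAX_BASE_URL.
        return "minimax", env.get("MINIMAX_MODEL_ID", "MiniMax-M2.7")
    if env.get("GITHUB_TOKEN"):  # assumed variable name for GitHub Models
        return "github-models", env.get("GITHUB_MODEL_ID", "gpt-4o")
    if env.get("OPENAI_API_KEY"):
        return "openai", env.get("OPENAI_MODEL_ID", "gpt-4o")
    raise RuntimeError(
        "No provider credentials found; set MINIMAX_API_KEY, "
        "GITHUB_TOKEN, or OPENAI_API_KEY"
    )
```

Because MiniMax is checked first, it is only activated when `MINIMAX_API_KEY` is present, leaving the existing GitHub Models and OpenAI paths untouched, matching the test plan below.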

Why MiniMax?

The Microsoft Agent Framework is designed to be provider-agnostic (as noted in Lesson 14's README). MiniMax's OpenAI-compatible API makes integration seamless — no additional SDK or dependencies required. This gives learners another option for running the course examples, especially useful for those who may not have Azure or GitHub Models access.

Test Plan

  • Python syntax validation passes
  • No breaking changes to existing provider configurations
  • MiniMax provider is only activated when MINIMAX_API_KEY is set
  • Existing GitHub Models and OpenAI paths remain unchanged

Update (2026-03-18): Upgraded default model from MiniMax-M2.5 to MiniMax-M2.7 (latest). Also updated highspeed variant to MiniMax-M2.7-highspeed.
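Putting the variables from `.env.example` together with the updated default model, a minimal fragment might look like this. The URL is a placeholder, not MiniMax's actual endpoint; only the variable names and the model ID come from this PR.

```shell
# MiniMax (OpenAI-compatible provider); values below are placeholders
MINIMAX_API_KEY=your-minimax-api-key
# Replace with MiniMax's OpenAI-compatible endpoint
MINIMAX_BASE_URL=https://example.invalid/v1
MINIMAX_MODEL_ID=MiniMax-M2.7
```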

@github-actions (Contributor) commented

👋 Thanks for contributing @octo-patch! We will review the pull request and get back to you soon.

Update default model references across config, docs, and code samples
to use the latest MiniMax-M2.7 and MiniMax-M2.7-highspeed models.
@jingchang0623-crypto

Great addition! MiniMax is becoming a popular choice for its large context windows and competitive pricing.

For those trying this out, here's a quick tip:

  • Point the base URL (MINIMAX_BASE_URL in your .env) at MiniMax's OpenAI-compatible endpoint
  • Models such as MiniMax-M2.7 work well with the OpenAI-compatible API

Also worth noting: MiniMax recently added support for function calling, making it even easier to integrate with agent frameworks. This is a great step for making AI agent education more accessible globally! 🌍
