🔄 Add Multi-LLM Provider Support - OpenAI, Anthropic, DeepSeek, Grok #4
Open: bcharleson wants to merge 11 commits into firecrawl:main from bcharleson:main
Conversation
- Implement tabbed interface for API Keys and LLM Settings
- Add secure API key input fields with visibility toggle and test functionality
- Include local-storage persistence with a clear-all option
- Add LLM provider selection with model switching
- Implement modal animations and responsive design
- Fix button alignment issues for a professional UI appearance
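The local-storage key handling described above could be sketched as follows. All names here (`ApiKeyManager`, the storage key prefix, the masking format) are assumptions for illustration, not the PR's actual identifiers; the real component would back the store with `window.localStorage`, so a plain Map-based store stands in to keep the sketch runnable outside a browser:

```typescript
type ProviderId = "openai" | "anthropic" | "deepseek" | "grok";

// Minimal storage interface matching the subset of the Web Storage API we use.
interface KeyStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

// Map-backed stand-in for window.localStorage.
class MapStore implements KeyStore {
  private data = new Map<string, string>();
  getItem(k: string) { return this.data.get(k) ?? null; }
  setItem(k: string, v: string) { this.data.set(k, v); }
  removeItem(k: string) { this.data.delete(k); }
}

class ApiKeyManager {
  constructor(private store: KeyStore, private prefix = "fire-enrich:key:") {}

  save(provider: ProviderId, key: string) {
    this.store.setItem(this.prefix + provider, key);
  }

  load(provider: ProviderId): string | null {
    return this.store.getItem(this.prefix + provider);
  }

  // "Clear all" wipes every stored key, as the settings modal offers.
  clearAll(providers: ProviderId[]) {
    providers.forEach((p) => this.store.removeItem(this.prefix + p));
  }

  // Mask the key for display, keeping only the first and last 4 characters.
  mask(provider: ProviderId): string {
    const k = this.load(provider);
    return k ? k.slice(0, 4) + "..." + k.slice(-4) : "(not set)";
  }
}

const mgr = new ApiKeyManager(new MapStore());
mgr.save("openai", "sk-test-1234abcd");
console.log(mgr.mask("openai")); // prints "sk-t...abcd"
```

Keeping keys in local storage means they never leave the user's browser, which matches the PR's security framing, though it also means keys are per-device and per-browser.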
- Add LLMManager for centralized LLM provider switching
- Implement APIKeyManager for secure local storage of API keys
- Add service classes for the OpenAI, Anthropic, DeepSeek, and Grok APIs
- Create a unified LLMService interface for consistent API interactions
- Support dynamic model selection and provider switching
- Include error handling and fallback mechanisms
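A unified interface plus a registry is the standard shape for this kind of provider switching. The sketch below is an assumption about how `LLMManager` and `LLMService` might relate (the interface members and model names are illustrative, not the PR's exact API); a stub service stands in for the real per-provider classes:

```typescript
// Common contract every provider service implements.
interface LLMService {
  readonly provider: string;
  readonly models: string[];
  complete(prompt: string, model: string): Promise<string>;
}

// Central registry: register services once, switch the active one at runtime.
class LLMManager {
  private services = new Map<string, LLMService>();
  private active?: { service: LLMService; model: string };

  register(service: LLMService) {
    this.services.set(service.provider, service);
  }

  use(provider: string, model: string) {
    const svc = this.services.get(provider);
    if (!svc) throw new Error(`Unknown provider: ${provider}`);
    if (!svc.models.includes(model)) throw new Error(`Unknown model: ${model}`);
    this.active = { service: svc, model };
  }

  async complete(prompt: string): Promise<string> {
    if (!this.active) throw new Error("No provider selected");
    return this.active.service.complete(prompt, this.active.model);
  }
}

// Stub standing in for a real OpenAI/Anthropic/DeepSeek/Grok client.
const stub: LLMService = {
  provider: "openai",
  models: ["gpt-4o", "gpt-4o-mini"],
  async complete(prompt, model) {
    return `[${model}] echo: ${prompt}`;
  },
};

const manager = new LLMManager();
manager.register(stub);
manager.use("openai", "gpt-4o-mini");
```

Because callers only see `manager.complete(...)`, the enrichment pipeline stays untouched when the user switches providers, which is what makes the backward compatibility claimed later in the PR plausible.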
- Implement GET/POST endpoints for LLM provider and model configuration
- Add server-side validation and error handling
- Support dynamic LLM switching through API calls
- Persist LLM preferences
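The server-side validation those endpoints need can be isolated as a pure function, which is how it stays testable outside a request handler. The provider/model catalogue and field names below are assumptions for illustration; in the app this would run inside the POST handler (e.g. an `app/api/.../route.ts` in Next.js) before persisting the preference:

```typescript
// Illustrative catalogue of accepted provider/model pairs (assumed names).
const SUPPORTED: Record<string, string[]> = {
  openai: ["gpt-4o", "gpt-4o-mini"],
  anthropic: ["claude-3-5-sonnet"],
  deepseek: ["deepseek-chat"],
  grok: ["grok-2"],
};

interface LLMConfig {
  provider: string;
  model: string;
}

type ValidationResult =
  | { ok: true; config: LLMConfig }
  | { ok: false; error: string };

// Validate an untrusted request body before accepting the configuration.
function validateLLMConfig(body: unknown): ValidationResult {
  if (typeof body !== "object" || body === null) {
    return { ok: false, error: "Body must be a JSON object" };
  }
  const { provider, model } = body as Partial<LLMConfig>;
  if (typeof provider !== "string" || !(provider in SUPPORTED)) {
    return { ok: false, error: `Unsupported provider: ${String(provider)}` };
  }
  if (typeof model !== "string" || !SUPPORTED[provider].includes(model)) {
    return { ok: false, error: `Unknown model for ${provider}: ${String(model)}` };
  }
  return { ok: true, config: { provider, model } };
}
```

A handler would return a 400 with the `error` string when `ok` is false, and only then write the preference, so a malformed client request can never select a provider the server does not support.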
- Add Settings modal integration to the main page and fire-enrich page
- Implement LLM switcher component in the header
- Update enrichment table to use dynamic LLM selection
- Add styling and responsive design
- Include user-friendly LLM provider indicators
- Modify orchestrator to use LLMManager for dynamic provider selection
- Update agent enrichment strategy to support multiple LLM providers
- Enhance type definitions for LLM configuration
- Update API routes to use the new LLM infrastructure
- Maintain backward compatibility with existing functionality
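One plausible reading of the "fallback mechanisms" the orchestrator relies on is a priority chain: try each configured provider in order and surface every error only if all of them fail. This is an illustrative sketch, not the PR's actual code, and the names are assumptions:

```typescript
interface FallbackProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// Try providers in priority order; collect errors so the final failure
// message explains what happened at every step of the chain.
async function completeWithFallback(
  prompt: string,
  chain: FallbackProvider[],
): Promise<string> {
  const errors: string[] = [];
  for (const p of chain) {
    try {
      return await p.complete(prompt);
    } catch (err) {
      errors.push(`${p.name}: ${(err as Error).message}`);
    }
  }
  throw new Error(`All providers failed: ${errors.join("; ")}`);
}

// Example: the primary provider is rate limited, the backup answers.
const flaky: FallbackProvider = {
  name: "primary",
  complete: async () => { throw new Error("rate limited"); },
};
const backup: FallbackProvider = {
  name: "backup",
  complete: async (p) => `backup says: ${p}`,
};
completeWithFallback("hi", [flaky, backup]).then(console.log); // logs "backup says: hi"
```

A chain like this is what lets the orchestrator keep enriching rows when one provider's key is missing or its API is down, rather than failing the whole batch.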
- Add detailed implementation summary with architecture overview
- Include setup instructions and usage guidelines
- Document API key management and security considerations
- Update README with the new LLM switching features
- Add dependencies required for multi-LLM support
- Include developer documentation
- Add test data for development and testing
- Include utility scripts for the development workflow
- Remove .env.example, as API keys are now managed through the UI
- Add testing resources
- Fix ApiKeyStatus type casting in the settings modal
- Add missing @radix-ui/react-aspect-ratio dependency
- Add react-day-picker dependency for UI components
- Remove unused UI components to reduce bundle size
- Ensure the development server runs correctly on localhost:3002
- Add DEPLOYMENT_GUIDE.md with complete setup instructions for end users
- Add FEATURE_SUMMARY.md highlighting the LLM switching enhancements
- Include detailed provider information and model specifications
- Document security features and API key management
- Provide testing instructions and troubleshooting guides
- Ready to share with non-developers and the community
- Add comprehensive verification of all LLM switching components
- Check required files, dependencies, and documentation
- Validate the LLM provider implementation
- Provide clear next steps for end users
- Ensure deployment readiness for sharing
mortadacherrak approved these changes on Aug 23, 2025
jasonrowlandAG approved these changes on Sep 3, 2025
🔄 Multi-LLM Provider Support for Fire-Enrich
🎯 Overview
This PR introduces Multi-LLM Provider Support to Fire-Enrich, letting users switch between AI providers (OpenAI, Anthropic, DeepSeek, Grok) through the settings UI.
✨ Key Features
🛠 Technical Implementation
📊 Benefits
🧪 Testing Instructions
git clone -b main https://github.com/bcharleson/fire-enrich.git
npm install
npm run dev -- -p 3002
node scripts/verify-deployment.js
📚 Documentation
- DEPLOYMENT_GUIDE.md - Complete setup instructions
- FEATURE_SUMMARY.md - Detailed overview
- docs/LLM_PROVIDER_SWITCHING.md - Technical guide

🚀 Impact
Makes Fire-Enrich more accessible, flexible, and cost-effective for users with different LLM preferences and budgets.
Ready to share these enhancements with the Fire-Enrich community! 🎉
Pull Request opened by Augment Code with guidance from the PR author