🧰 LangGraph Agent Toolkit

πŸ“‹ Introduction

A comprehensive toolkit for building, deploying, and managing AI agents using LangGraph, FastAPI, and Streamlit. It provides a production-ready framework for creating conversational AI agents with features like multi-provider LLM support, streaming responses, observability, and memory and prompt management.

What is langgraph-agent-toolkit?

The langgraph-agent-toolkit is a full-featured framework for developing and deploying AI agent services. Built on the foundation of:

  • LangGraph for agent creation with advanced flows and human-in-the-loop capabilities
  • FastAPI for robust, high-performance API services with streaming support
  • Streamlit for intuitive user interfaces

Key components include:

  • Data structures and settings built with Pydantic
  • LiteLLM proxy for universal multi-provider LLM support
  • Comprehensive memory management and persistence using PostgreSQL/SQLite
  • Advanced observability tooling via Langfuse and Langsmith
  • Modular architecture allowing customization while maintaining a consistent application structure
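
As a sketch of the Pydantic-based configuration style mentioned above (the class and field names here are illustrative assumptions, not the toolkit's actual settings classes):

```python
from pydantic import BaseModel, Field


class MemoryBackendSettings(BaseModel):
    """Illustrative settings model; field names are assumptions."""

    backend: str = Field(default="sqlite", description="'postgres' or 'sqlite'")
    pool_size: int = Field(default=5, ge=1, description="Connection pool size")


# Values are validated on construction; invalid input raises a ValidationError.
settings = MemoryBackendSettings(pool_size=10)
print(settings.backend, settings.pool_size)
```

Because validation happens at construction time, misconfiguration surfaces at startup rather than deep inside a request handler.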

Whether you're building a simple chatbot or a complex multi-agent system, this toolkit provides the infrastructure to develop, test, and deploy your LangGraph-based agents with confidence.

You can use DeepWiki to learn more about this repository.

πŸ“‘ Contents

πŸš€ Quickstart

  1. Create a .env file based on .env.example

  2. Option 1: Run with Python from source

    # Install dependencies
    pip install uv
    uv sync --frozen
    source .venv/bin/activate
    
    # Start the service
    python langgraph_agent_toolkit/run_api.py
    
    # In another terminal
    source .venv/bin/activate
    streamlit run langgraph_agent_toolkit/run_app.py
  3. Option 2: Run with Python from the PyPI repository

    pip install langgraph-agent-toolkit

    ℹ️ For more details on installation options, see the Installation Documentation.

  4. Option 3: Run with Docker

    docker compose watch

πŸ“¦ Installation Options

The toolkit supports multiple installation options using "extras" to include just the dependencies you need.

For detailed installation instructions and available extras, see the Installation Documentation.

πŸ—οΈ Architecture

✨ Key Features

  1. LangGraph Integration

    • Latest LangGraph v0.3 features
    • Human-in-the-loop with interrupt()
    • Flow control with Command and langgraph-supervisor
  2. API Service

    • FastAPI with streaming and non-streaming endpoints
    • Support for both token-based and message-based streaming
    • Multiple agent support with URL path routing
    • Available agents and models listed at /info endpoint
    • Supports different runners (uvicorn, gunicorn, Mangum, Azure Functions)
  3. Developer Experience

    • Asynchronous design with async/await
    • Docker configuration with live reloading
    • Comprehensive testing suite
  4. Enterprise Components

    • Configurable PostgreSQL/SQLite connection pools
    • Observability via Langfuse and Langsmith
    • User feedback system
    • Prompt management system
    • LiteLLM proxy integration
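
The token-based streaming mentioned above is typically consumed by parsing server-sent events from the response body. A minimal, dependency-free sketch of such a parser (the `data:`-prefixed JSON event format shown here is an assumption for illustration, not the service's documented wire format):

```python
import json


def parse_sse_tokens(raw_stream: str) -> list[str]:
    """Extract token payloads from SSE-style lines of the form 'data: {...}'."""
    tokens = []
    for line in raw_stream.splitlines():
        if not line.startswith("data: "):
            continue  # skip comments, blank keep-alives, etc.
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # assumed end-of-stream sentinel
        event = json.loads(payload)
        if event.get("type") == "token":
            tokens.append(event["content"])
    return tokens


sample = (
    'data: {"type": "token", "content": "Hel"}\n'
    'data: {"type": "token", "content": "lo"}\n'
    "data: [DONE]\n"
)
print("".join(parse_sse_tokens(sample)))
```

Accumulating tokens this way lets a client render partial output as it arrives instead of waiting for the full completion.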

For more details on features, see the Usage Documentation.

βš™οΈ Environment Setup

For detailed environment setup instructions, including creating your .env file and configuring LiteLLM, see the Environment Setup Documentation.
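
As a rough illustration of the kind of values a .env file holds (the variable names below are assumptions for illustration only; consult .env.example in the repository for the authoritative list):

```
# Hypothetical example — see .env.example for the actual variable names
OPENAI_API_KEY=sk-...
DATABASE_URL=postgresql://user:pass@localhost:5432/agents
LANGFUSE_PUBLIC_KEY=pk-...
```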

πŸ“‚ Project Structure

The repository contains:

  • langgraph_agent_toolkit/agents/blueprints/: Agent definitions
  • langgraph_agent_toolkit/agents/agent_executor.py: Agent execution control
  • langgraph_agent_toolkit/schema/: Protocol schema definitions
  • langgraph_agent_toolkit/core/: Core modules (LLM, memory, settings)
  • langgraph_agent_toolkit/service/service.py: FastAPI service
  • langgraph_agent_toolkit/client/client.py: Service client
  • langgraph_agent_toolkit/run_app.py: Chat interface
  • docker/: Docker configurations
  • tests/: Test suite

πŸ› οΈ Setup and Usage

For detailed setup and usage instructions, including building your own agent, Docker setup, using the AgentClient, and local development, see the Usage Documentation.

πŸ“š Documentation

Full documentation is available in the GitHub repository.

πŸ“š Useful Resources

πŸ‘₯ Development and Contributing

Thank you for considering contributing to LangGraph Agent Toolkit! We encourage the community to post Issues and Pull Requests.

Before you get started, please see our Contribution Guide.

πŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.
