A high-performance EVM blockchain relayer service for the PowerLoom Protocol ecosystem. This service handles batch submissions, transaction queuing, and on-chain consensus operations across multiple ecosystem components including decentralized sequencers, centralized sequencers, reward managers, identity managers, and other protocol services on Ethereum-compatible networks.
The relayer service acts as a bridge between off-chain data processing and on-chain settlement. It provides REST APIs for submitting batch data and rewards updates, manages transaction queuing via RabbitMQ, and processes blockchain transactions using multiple concurrent workers.
- FastAPI-based REST API for batch submissions and rewards updates
- Transaction queuing with RabbitMQ for reliable message delivery
- Multi-worker architecture using PM2 process management
- Redis integration for caching and state management
- Web3 integration for EVM blockchain interactions
- Comprehensive retry logic with exponential backoff (see the sketch after this list)
- Health monitoring and logging with structured logs
- Docker containerization for easy deployment
- Multi-component support for sequencers, reward managers, identity managers, and other protocol services
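The retry behaviour can be pictured with a short sketch. This is illustrative only, assuming a generic callable rather than the service's actual transaction helpers:

```python
import random
import time


def with_backoff(fn, attempts=5, base_delay=1.0, max_delay=30.0):
    """Call fn(), retrying failures with exponential backoff plus jitter."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:  # in practice, catch specific RPC/queue errors
            if attempt == attempts - 1:
                raise  # out of retries, surface the last error
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay + random.uniform(0, 0.5))  # jitter avoids synchronized retries


# Hypothetical usage: receipt = with_backoff(lambda: send_transaction(tx))
```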
```
┌─────────────────┐      ┌─────────────┐      ┌─────────────────┐
│   FastAPI App   │──────│  RabbitMQ   │──────│   Tx Workers    │
│                 │      │    Queue    │      │                 │
│ • /submitBatch  │      └─────────────┘      │ • Batch Txns    │
│ • /submitRewards│                           │ • Rewards Txns  │
└─────────────────┘                           └─────────────────┘
         │                                             │
         └──────────────────────┬──────────────────────┘
                                │
                     ┌─────────────────┐
                     │ EVM Blockchain  │
                     │    (Web3.py)    │
                     └─────────────────┘
```
- API Layer (`relayer.py`): FastAPI application handling HTTP requests (see the sketch after this list)
- Transaction Workers (`tx_worker.py`): Background workers processing queued transactions
- Message Queue: RabbitMQ for reliable transaction queuing
- Cache/Storage: Redis for temporary data storage and coordination
- Process Management: PM2 for managing multiple worker processes
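As a rough sketch of how the API layer hands work to the queue, the snippet below shows a FastAPI endpoint validating a payload and publishing it to RabbitMQ with `pika`. It is not the actual `relayer.py` code; the queue name, hostname, and model fields are illustrative, and Pydantic v2 is assumed:

```python
import json

import pika
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class SubmissionBatch(BaseModel):
    epochID: str
    dataMarketAddress: str
    batchCID: str
    authToken: str


@app.post("/submitSubmissionBatch")
def submit_submission_batch(batch: SubmissionBatch):
    # Publish the validated payload to a queue that the tx workers consume.
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
    channel = connection.channel()
    channel.queue_declare(queue="batch_submissions", durable=True)
    channel.basic_publish(
        exchange="",
        routing_key="batch_submissions",
        body=json.dumps(batch.model_dump()),
        properties=pika.BasicProperties(delivery_mode=2),  # persist messages
    )
    connection.close()
    return {"status": "queued", "epochID": batch.epochID}
```

The transaction workers would then consume from the same queue, sign the corresponding transaction, and broadcast it to the chain.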
- Docker and Docker Compose
- Python 3.8+ (for local development)
- Poetry (for dependency management)
- Clone and navigate to the project:

  ```bash
  cd relayer-py
  ```

- Build and run:

  ```bash
  ./build.sh
  ```

  This will:

  - Build the Docker image
  - Start Redis, RabbitMQ, and the relayer service
  - Expose the API on port 8080
- Install dependencies:

  ```bash
  poetry install
  ```

- Configure settings:

  ```bash
  cp settings/settings.example.json settings/settings.json
  # Edit settings.json with your configuration
  ```

  > [!TIP]
  > Configuration: The `settings/settings.json` file is required for the service to run. Copy from `settings.example.json` and update with your network-specific values, RPC endpoints, and signer credentials.

- Run the service:

  ```bash
  # Start the API server
  poetry run python -m gunicorn_launcher

  # In another terminal, start transaction workers
  poetry run python -m tx_worker
  ```
The service is configured via `settings/settings.json`. Key configuration sections:

```json
{
  "relayer_service": {
    "host": "0.0.0.0",
    "port": 8080,
    "keys_ttl": 86400,
    "keepalive_secs": 600
  }
}
```

```json
{
  "anchor_chain": {
    "rpc": {
      "full_nodes": [
        {
          "url": "<powerloom-chain-rpc-url>",
          "rate_limit": "100000000/day;18000/minute;300/second"
        }
      ],
      "retry": 5,
      "request_time_out": 5
    },
    "chain_id": 104,
    "polling_interval": 2
  }
}
```

```json
{
  "signers": [
    {
      "address": "signer-address",
      "private_key": "signer-private-key"
    }
  ],
  "min_signer_balance_eth": 1
}
```

> [!WARNING]
> Security Critical: Never commit real private keys to version control. Use environment variables or secure secret management systems in production. The `private_key` field should only contain placeholder values in configuration files.
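One way to follow that guidance is to inject signer credentials from the environment at startup instead of committing them in `settings.json`. A minimal sketch, assuming hypothetical `SIGNER_ADDRESSES` and `SIGNER_PRIVATE_KEYS` variables that are not part of the service's documented configuration:

```python
import json
import os


def signers_from_env():
    """Build the signers list from environment variables (illustrative names)."""
    addresses = os.environ.get("SIGNER_ADDRESSES", "").split(",")
    keys = os.environ.get("SIGNER_PRIVATE_KEYS", "").split(",")
    return [
        {"address": addr.strip(), "private_key": key.strip()}
        for addr, key in zip(addresses, keys)
        if addr.strip() and key.strip()
    ]


def load_settings(path="settings/settings.json"):
    with open(path) as f:
        settings = json.load(f)
    # Override any placeholder signers from the file with real values from the env.
    env_signers = signers_from_env()
    if env_signers:
        settings["signers"] = env_signers
    return settings
```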
POST /submitBatchSize

```
Content-Type: application/json

{
  "epochID": "epoch_123",
  "batchSize": 100,
  "authToken": "your-auth-token"
}
```

POST /submitSubmissionBatch

```
Content-Type: application/json

{
  "epochID": "epoch_123",
  "dataMarketAddress": "0x...",
  "batchCID": "Qm...",
  "authToken": "your-auth-token"
}
```

POST /submitUpdateRewards

```
Content-Type: application/json

{
  "dataMarketAddress": "0x...",
  "slotIds": [1, 2, 3],
  "submissions": [...],
  "day": 123,
  "eligibleNodes": [...],
  "authToken": "your-auth-token"
}
```

Note: This relayer service is used by multiple Powerloom Protocol components including sequencers, reward managers, and identity managers. The API endpoints handle various transaction types across the ecosystem.
> [!NOTE]
> Authentication: All API endpoints require a valid `authToken` that must match the `auth_token` configured in your settings. This token is shared across ecosystem components for secure communication.
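A client call might look like the following sketch, which mirrors the `/submitSubmissionBatch` payload above and assumes the service is reachable on `localhost:8080` (the `requests` library and the placeholder values are illustrative):

```python
import requests

payload = {
    "epochID": "epoch_123",
    "dataMarketAddress": "0x0000000000000000000000000000000000000000",  # example address
    "batchCID": "Qm...",  # IPFS CID of the submission batch
    "authToken": "your-auth-token",  # must match auth_token in settings.json
}

resp = requests.post(
    "http://localhost:8080/submitSubmissionBatch",
    json=payload,
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```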
- Configure environment variables:

  ```bash
  export VPA_SIGNER_ADDRESSES="0xaddr1,0xaddr2"
  ```

  > [!IMPORTANT]
  > Environment Variables: Set `VPA_SIGNER_ADDRESSES` to a comma-separated list of all signer addresses. This determines the number of transaction worker processes that will be spawned.

- Build and deploy:

  ```bash
  docker build -t powerloom-relayer .
  docker-compose up -d
  ```
The service uses PM2 for process management with the following configuration:
- relayer-api: FastAPI server instance
- tx_worker: Multiple transaction worker instances (one per signer)
Process count is automatically determined by the number of configured signers.
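The relationship between signer addresses and worker processes can be illustrated with a small sketch (the parsing logic is illustrative, not the actual `pm2.config.js` implementation):

```python
import os


def signer_addresses():
    """Parse the comma-separated VPA_SIGNER_ADDRESSES environment variable."""
    raw = os.environ.get("VPA_SIGNER_ADDRESSES", "")
    return [addr.strip() for addr in raw.split(",") if addr.strip()]


addresses = signer_addresses()
print(f"Would spawn {len(addresses)} tx_worker processes, one per signer:")
for index, address in enumerate(addresses):
    # One worker per signer keeps nonce management isolated per address.
    print(f"  tx_worker[{index}] -> {address}")
```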
```
relayer-py/
├── relayer.py            # Main FastAPI application
├── tx_worker.py          # Transaction processing workers
├── data_models.py        # Pydantic data models
├── settings/             # Configuration files
├── utils/                # Utility functions
├── helpers/              # Helper modules
├── auth/                 # Authentication modules
├── Dockerfile            # Container definition
├── docker-compose.yaml   # Multi-service orchestration
├── pm2.config.js         # Process management config
└── pyproject.toml        # Python dependencies
```
```bash
# Run tests
poetry run pytest

# Run with coverage
poetry run pytest --cov=. --cov-report=html
```

```bash
# Run linter
poetry run pylint relayer tx_worker

# Format code
poetry run black .
poetry run isort .
```

Logs are structured using Loguru and can be accessed via:

```bash
# Docker logs
docker logs relayer

# PM2 logs
pm2 logs
```

The service includes built-in health monitoring:
- Redis connection health
- RabbitMQ connectivity
- Blockchain RPC node availability
- Transaction processing status
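A standalone connectivity check along the same lines could look like the sketch below. Hostnames and the RPC URL are placeholders, it assumes the `redis`, `pika`, and `web3` (v6+) packages, and it is not the service's internal health-check implementation:

```python
import pika
import redis
from web3 import Web3


def check_redis(host="localhost", port=6379):
    # Returns True if Redis answers a PING.
    return redis.Redis(host=host, port=port).ping()


def check_rabbitmq(host="localhost"):
    # Opening and closing a blocking connection verifies broker reachability.
    conn = pika.BlockingConnection(pika.ConnectionParameters(host=host))
    ok = conn.is_open
    conn.close()
    return ok


def check_rpc(url="<powerloom-chain-rpc-url>"):
    # is_connected() performs a lightweight request against the RPC node.
    return Web3(Web3.HTTPProvider(url)).is_connected()


if __name__ == "__main__":
    print("redis:", check_redis())
    print("rabbitmq:", check_rabbitmq())
    print("rpc:", check_rpc())
```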
> [!TIP]
> Logs First: When troubleshooting, always check the service logs first using `docker logs relayer` or `pm2 logs`. The structured logs will provide detailed error information and request IDs for correlation.
- Connection refused to Redis/RabbitMQ:
  - Ensure services are running: `docker-compose ps`
  - Check network connectivity between containers
- Transaction failures:
  - Verify signer private keys and addresses
  - Check signer ETH balance (minimum 1 ETH required)
  - Review blockchain RPC configuration
- Authentication errors:
  - Verify `auth_token` in settings matches API requests

Enable debug logging by setting an environment variable:
```bash
export LOG_LEVEL=DEBUG
```

- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
Apache License 2.0