Commit 43e3271: Add Kafka setup doc (parent 6ccd8cb)

1 file changed: KAFKA_SETUP.md (+149 lines)

# Kafka Development Setup

This document describes how to set up and test the Kafka consumer functionality in the TC Review API.

## Quick Start

### 1. Start Kafka Services

```bash
# Start Kafka and related services
docker compose -f docker-compose.kafka.yml up -d

# Verify services are running
docker compose -f docker-compose.kafka.yml ps
```

This will start:

- **Zookeeper** on port 2181
- **Kafka** on port 9092
- **Kafka UI** on port 8080 (web interface)

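For reference, a compose file along these lines provides those three services. This is only a sketch: the actual `docker-compose.kafka.yml` in the repository may differ in image versions and extra settings.

```yaml
# Sketch of docker-compose.kafka.yml; image names/versions are illustrative
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
    ports:
      - "2181:2181"

  kafka:
    image: confluentinc/cp-kafka:7.5.0
    container_name: kafka   # matches the `docker exec -it kafka ...` commands below
    depends_on:
      - zookeeper
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:29092,PLAINTEXT_HOST://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
    ports:
      - "9092:9092"

  kafka-ui:
    image: provectuslabs/kafka-ui:latest
    depends_on:
      - kafka
    environment:
      KAFKA_CLUSTERS_0_NAME: local
      KAFKA_CLUSTERS_0_BOOTSTRAP_SERVERS: kafka:29092
    ports:
      - "8080:8080"
```
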
### 2. Configure Environment

```bash
# Copy the sample environment file
cp .env.sample .env

# Update the .env file with your database and other configurations
# Kafka settings are pre-configured for local development
```

### 3. Start the Application

```bash
# Install dependencies
pnpm install

# Start in development mode
pnpm run start:dev
```

The application will automatically:

- Connect to Kafka on startup
- Subscribe to registered topics
- Start consuming messages

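Conceptually, the consumer side of that startup sequence looks roughly like the sketch below, assuming a kafkajs-based client (this doc does not say which client the project uses, and the class names and registry accessors here are assumptions, not the repository's actual code):

```typescript
import { Injectable, OnModuleInit } from '@nestjs/common';
import { Kafka } from 'kafkajs';

// Stand-in for the project's registry. The real KafkaHandlerRegistry exposes
// registerHandler() (see "Adding New Event Handlers" below); the read accessors
// here are assumptions made only for this sketch.
export abstract class HandlerRegistry {
  abstract getTopics(): string[];
  abstract getHandler(topic: string): { handle(msg: unknown): Promise<void> } | undefined;
}

@Injectable()
export class KafkaConsumerService implements OnModuleInit {
  private readonly kafka = new Kafka({
    clientId: process.env.KAFKA_CLIENT_ID ?? 'tc-review-api',
    brokers: (process.env.KAFKA_BROKERS ?? 'localhost:9092').split(','),
  });
  private readonly consumer = this.kafka.consumer({
    groupId: process.env.KAFKA_GROUP_ID ?? 'tc-review-api-group',
  });

  constructor(private readonly registry: HandlerRegistry) {}

  async onModuleInit() {
    // 1. Connect to Kafka on startup
    await this.consumer.connect();
    // 2. Subscribe to every topic known to the handler registry
    await this.consumer.subscribe({ topics: this.registry.getTopics() });
    // 3. Start consuming and dispatch each message to its registered handler
    await this.consumer.run({
      eachMessage: async ({ topic, message }) => {
        const payload = JSON.parse(message.value?.toString() ?? '{}');
        await this.registry.getHandler(topic)?.handle(payload);
      },
    });
  }
}
```
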
## Testing Kafka Events

### Using Kafka UI (Recommended)

1. Open http://localhost:8080 in your browser
2. Navigate to Topics
3. Create or select the `avscan.action.scan` topic
4. Produce a test message with a JSON payload:

```json
{
  "scanId": "test-123",
  "submissionId": "sub-456",
  "status": "initiated",
  "timestamp": "2025-01-01T12:00:00Z"
}
```

### Using Command Line

```bash
# Create the topic (optional - topics are auto-created by default)
docker exec -it kafka kafka-topics --create --topic avscan.action.scan --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

# Produce a test message
docker exec -it kafka kafka-console-producer --topic avscan.action.scan --bootstrap-server localhost:9092
# Then type your JSON message and press Enter

# Consume messages (for debugging)
docker exec -it kafka kafka-console-consumer --topic avscan.action.scan --from-beginning --bootstrap-server localhost:9092
```

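To send the same test payload non-interactively (handy in scripts or CI), pipe it into the console producer; note `-i` instead of `-it`, since there is no TTY when piping:

```bash
# Produce the sample payload in one shot
echo '{"scanId":"test-123","submissionId":"sub-456","status":"initiated","timestamp":"2025-01-01T12:00:00Z"}' | \
  docker exec -i kafka kafka-console-producer --topic avscan.action.scan --bootstrap-server localhost:9092
```
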
## Development Workflow

### Adding New Event Handlers

1. Create a new handler class extending `BaseEventHandler`:

```typescript
import { Injectable, OnModuleInit } from '@nestjs/common';
// BaseEventHandler, KafkaHandlerRegistry and LoggerService come from the project's Kafka/shared modules

@Injectable()
export class MyCustomHandler extends BaseEventHandler implements OnModuleInit {
  private readonly topic = 'my.custom.topic';

  constructor(private readonly handlerRegistry: KafkaHandlerRegistry) {
    super(LoggerService.forRoot('MyCustomHandler'));
  }

  // Register this handler for its topic once the module is initialized
  onModuleInit() {
    this.handlerRegistry.registerHandler(this.topic, this);
  }

  getTopic(): string {
    return this.topic;
  }

  async handle(message: any): Promise<void> {
    // Your custom logic here
  }
}
```

2. Register the handler in the `KafkaModule` providers array (see the sketch below)
3. Once registered, the handler picks up its topic automatically and starts consuming messages

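A minimal sketch of that registration step; the file name and import paths are illustrative, not the repository's actual layout:

```typescript
// kafka.module.ts (illustrative file name and import paths)
import { Module } from '@nestjs/common';
import { KafkaHandlerRegistry } from './kafka-handler.registry';
import { MyCustomHandler } from './handlers/my-custom.handler';

@Module({
  providers: [
    KafkaHandlerRegistry,
    // ...existing handlers...
    MyCustomHandler, // instantiated by Nest; registers itself with the registry in onModuleInit
  ],
})
export class KafkaModule {}
```
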
### Monitoring and Debugging

- **Application Logs**: Check console output for Kafka connection status and message processing
- **Kafka UI**: Monitor topics, partitions, and consumer groups at http://localhost:8080
- **Health Checks**: Kafka connection status is included in application health checks

### Environment Variables

All Kafka-related environment variables are documented in `.env.sample`:

- `KAFKA_BROKERS`: Comma-separated list of Kafka brokers
- `KAFKA_CLIENT_ID`: Unique client identifier
- `KAFKA_GROUP_ID`: Consumer group ID
- `KAFKA_SSL_ENABLED`: Enable SSL encryption
- Connection timeouts and retry configurations

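For local development against the compose stack above, the Kafka block of `.env` typically ends up looking something like this (the values are illustrative defaults, not the authoritative contents of `.env.sample`):

```bash
# Illustrative local-development values; check .env.sample for the authoritative list
KAFKA_BROKERS=localhost:9092
KAFKA_CLIENT_ID=tc-review-api
KAFKA_GROUP_ID=tc-review-api-group
KAFKA_SSL_ENABLED=false
```
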
## Troubleshooting

### Common Issues

1. **Connection Refused**: Ensure Kafka is running with `docker compose -f docker-compose.kafka.yml ps`
2. **Topic Not Found**: Topics are auto-created by default; if needed, create the topic manually in Kafka UI or with the CLI command shown above
3. **Consumer Group Issues**: Check consumer group status in Kafka UI under "Consumers", or describe the group from the command line (see below)

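To inspect the consumer group without the UI, the stock Kafka tooling inside the broker container can describe it:

```bash
# Show partition assignments, committed offsets, and lag for a consumer group
# (replace <group-id> with the value of KAFKA_GROUP_ID from your .env)
docker exec -it kafka kafka-consumer-groups --bootstrap-server localhost:9092 --describe --group <group-id>
```
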
### Cleanup

```bash
# Stop and remove Kafka services
docker compose -f docker-compose.kafka.yml down

# Also remove volumes (clears all Kafka data)
docker compose -f docker-compose.kafka.yml down -v
```

## Production Considerations

- Configure SSL/TLS and SASL authentication for production environments (a client config sketch follows below)
- Set appropriate retention policies for topics
- Monitor consumer lag and processing metrics
- Configure dead letter queues for failed messages
- Set up proper alerting for consumer failures

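As a rough illustration of the first point, and assuming the underlying client is kafkajs (this doc does not state which client the project uses), a production connection would be configured along these lines. The SASL mechanism and the `KAFKA_SASL_USERNAME` / `KAFKA_SASL_PASSWORD` variable names are assumptions:

```typescript
import { Kafka } from 'kafkajs';

// Sketch of a production client config; SASL mechanism and env var names are assumptions
const kafka = new Kafka({
  clientId: process.env.KAFKA_CLIENT_ID,
  brokers: (process.env.KAFKA_BROKERS ?? '').split(','),
  ssl: process.env.KAFKA_SSL_ENABLED === 'true',
  sasl: {
    mechanism: 'scram-sha-512',
    username: process.env.KAFKA_SASL_USERNAME ?? '',
    password: process.env.KAFKA_SASL_PASSWORD ?? '',
  },
});
```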
