Replies: 3 comments
I propose the following features:
These items can be scheduled across development phases, but I recommend implementing items 4, 5, and 6 in the first phase.
Thank you for the valuable suggestions!
Status: Draft
Related Issue: #7646
Table of Contents
1. Background & Motivation
1.1 Current Situation
Seata currently lacks an official command-line benchmark tool for performance testing. Users who want to evaluate Seata's performance need to:
This creates barriers for:
1.2 Goals
Provide a simple, standardized, command-line benchmark tool that:
2. Requirements
2.1 Functional Requirements
Must Have (v1.0):
Should Have (v1.0):
Could Have (Future):
2.2 Non-Functional Requirements
3. User Interface Design
3.1 Command-Line Interface
Basic Usage:
Advanced Options:
3.2 Parameters
--server, -s
--mode, -m
--tps, -t
--threads
--duration, -d
--warmup-duration
--export-csv
--application-id
--tx-service-group
--help, -h
--version, -V
3.3 Output Format
Console Output (Real-time Progress):
Final Report:
CSV Export Format:
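The exact CSV schema is not specified above, so as an illustration only, here is a minimal sketch of a CSV exporter that writes one aggregated sample per second of the run. The column names (`timestamp`, `success`, `failure`, `p50_ms`, `p99_ms`) are assumptions, not the final format:

```java
import java.io.IOException;
import java.io.Writer;
import java.util.List;
import java.util.Locale;

public class CsvReporter {
    // Hypothetical per-second aggregate; field names are illustrative only.
    public record Sample(long epochSecond, long success, long failure,
                         double p50Ms, double p99Ms) {}

    public static void write(Writer out, List<Sample> samples) throws IOException {
        out.write("timestamp,success,failure,p50_ms,p99_ms\n");
        for (Sample s : samples) {
            // Locale.ROOT keeps the decimal separator a '.' regardless of platform locale.
            out.write(String.format(Locale.ROOT, "%d,%d,%d,%.2f,%.2f\n",
                    s.epochSecond(), s.success(), s.failure(), s.p50Ms(), s.p99Ms()));
        }
    }
}
```

Writing through a `Writer` rather than a `File` keeps the reporter easy to unit-test and lets the same code target stdout or a file chosen by `--export-csv`.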
4. Technical Design
4.1 Architecture
4.2 Core Components
4.2.1 BenchmarkApplication (Main Entry)
4.2.2 TransactionExecutor (Strategy Pattern)
Interface:
Implementations:
ATModeExecutor: Empty AT transaction (begin -> commit, no SQL)
TCCModeExecutor: Mock TCC transaction (empty try/confirm/cancel)
4.2.3 WorkloadGenerator
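The WorkloadGenerator must pace transactions at the `--tps` target. One common approach (an assumption here, not necessarily the design's choice) is fixed-rate scheduling: compute each operation's deadline from the start time and sleep until it. A self-contained sketch:

```java
import java.util.concurrent.TimeUnit;

// Hypothetical fixed-rate pacer: runs `task` `totalOps` times, aiming for `tps`
// operations per second by sleeping until each operation's scheduled start time.
public class FixedRatePacer {
    public static long run(int tps, long totalOps, Runnable task) throws InterruptedException {
        long intervalNanos = TimeUnit.SECONDS.toNanos(1) / tps;
        long start = System.nanoTime();
        long executed = 0;
        for (long i = 0; i < totalOps; i++) {
            // Deadlines are derived from the start time, not the previous iteration,
            // so a slow operation does not permanently shift the schedule.
            long scheduled = start + i * intervalNanos;
            long wait = scheduled - System.nanoTime();
            if (wait > 0) TimeUnit.NANOSECONDS.sleep(wait);
            task.run();
            executed++;
        }
        return executed;
    }
}
```

Anchoring deadlines to the start time lets the pacer catch up after a slow transaction instead of drifting below the target TPS.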
4.2.4 MetricsCollector
Key Metrics:
Sampling Strategy:
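To make the percentile metrics concrete, here is a minimal, self-contained latency recorder using nearest-rank percentiles over a sorted copy of the samples. This is a sketch only; a production tool would more likely use a histogram-based library such as HdrHistogram to avoid storing every sample:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Hypothetical latency recorder; the class and method names are illustrative.
public class LatencyRecorder {
    private final List<Long> samplesMicros = new ArrayList<>();

    public synchronized void record(long micros) {
        samplesMicros.add(micros);
    }

    // Nearest-rank percentile, e.g. percentile(99.0) for P99, percentile(99.9) for P99.9.
    public synchronized long percentile(double p) {
        List<Long> sorted = new ArrayList<>(samplesMicros);
        Collections.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.size());
        return sorted.get(Math.max(0, rank - 1));
    }
}
```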
4.2.5 Reporter
4.3 Empty Transaction Implementation
AT Mode Empty Transaction
TCC Mode Mock Transaction
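The two empty-transaction flavors above map naturally onto the strategy interface from 4.2.2. The following self-contained sketch models only the control flow; the real implementations would call Seata's transaction-manager and TCC APIs, which are deliberately omitted here:

```java
// Strategy interface from 4.2.2 (method name is an assumption).
public interface TransactionExecutor {
    boolean executeOnce() throws Exception;
}

// AT mode: begin -> commit with no business SQL in between,
// so there is nothing to write to the undo log.
class ATModeExecutor implements TransactionExecutor {
    @Override public boolean executeOnce() {
        // begin global transaction (Seata TM call omitted in this sketch)
        // no business SQL executed
        // commit global transaction (Seata TM call omitted in this sketch)
        return true;
    }
}

// TCC mode: empty try phase, then confirm on success or cancel on failure.
class TCCModeExecutor implements TransactionExecutor {
    @Override public boolean executeOnce() {
        boolean tryOk = true;          // empty "try" phase always succeeds here
        if (tryOk) {
            /* empty "confirm" phase */
            return true;
        }
        /* empty "cancel" phase */
        return false;
    }
}
```

Keeping both executors behind one interface lets `--mode` select the strategy at startup without the workload or metrics code knowing which mode is running.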
4.4 Technology Stack
5. Open Questions
Please provide your feedback on the following design decisions:
Q1: TUI (Terminal User Interface) Support?
Options:
Optional --tui flag (flexible, but more complexity)
Recommendation: Start with A (plain text), add B in v2.0 if the community requests it
Q2: TCC Mode Implementation?
Options:
Add a --scenario option in the future
Recommendation: A for v1.0 (aligns with the "no database dependency" goal)
Q3: Monitoring global_table/branch_table?
Options:
Optional --monitor-db parameter
Recommendation: A for v1.0 (keep it simple)
Q4: Result Export Formats?
Options:
--export-csv
--export-json
Recommendation: B (CSV is sufficient for most analysis needs)
Q5: CLI Framework Choice?
Options:
Recommendation: A (aligns with modern Java practices)
Q6: XA Mode Support?
Options:
Recommendation: A (community meeting prioritized AT/TCC)
Q7: Warmup Support?
Options:
--warmup-duration
Recommendation: B (important for accurate benchmarks, but as an optional parameter)
Q8: Latency Percentiles?
Options:
Recommendation: B (P99.9 is valuable with minimal cost)
6. Implementation Roadmap
Phase 1: MVP (v0.1)
Scope:
Deliverables:
Phase 2: Enhancement (v0.2)
Scope:
Warmup support (--warmup-duration)
CSV export (--export-csv)
Deliverables:
Phase 3: Advanced Features (v1.0) - TBD
Scope (based on community feedback):
Scenario support (--scenario)
Deliverables:
7. Community Feedback
Please share your thoughts on:
How to Provide Feedback:
Thank you for your valuable input!