JSON Output Structured Results
wikigen provides structured JSON output for integration with CI/CD pipelines, build systems, and programmatic tooling. The -json flag enables serialization of generation results to stdout, containing comprehensive metadata about each generated page including status, file size, generation timing, and project-level summaries. This page documents the JSON output format, data structures, and practical integration patterns.
When the -json flag is enabled, wikigen outputs a JSON array containing generation results for each project processed. This structured format replaces the human-readable stderr output and provides machine-parseable results suitable for automated workflows. Each result includes project-level metadata, per-page generation status and statistics, and overall completion timing.
JSON output is controlled by the -json command-line flag:
```bash
./wikigen -r owner/repo -json
./wikigen -f repos.txt -json
./wikigen -local /path/to/repo project-name -json
```

When -json is set, results are written to stdout as a JSON array, while any error messages or progress information remain on stderr for real-time monitoring.
The top-level output is an array of WikiResult objects, one per project processed. Each result contains overall generation metadata and an array of per-page results.
```json
{
  "project": "string",
  "repos": ["owner/repo1", "owner/repo2"],
  "output_dir": "/absolute/path/to/wiki-output/project",
  "pages": [/* WikiPageResult array */],
  "total_pages": 42,
  "failed": 2,
  "duration": "3m45s",
  "status": "completed"
}
```

| Field | Type | Description |
|---|---|---|
| project | string | Project name, derived from the last path segment of a single repo or the group name for multi-repo wikis |
| repos | array | List of processed repositories in owner/repo format |
| output_dir | string | Absolute filesystem path to the generated wiki directory |
| pages | array | Array of WikiPageResult objects describing each generated page |
| total_pages | integer | Total number of pages in the wiki structure |
| failed | integer | Count of pages that failed generation after all retry attempts |
| duration | string | Total elapsed time from start to completion, formatted as a Go duration string (e.g., "3m45s", "12.5s") |
| status | string | Overall completion status: "completed" (run finished), "error" (fatal error), or "dry-run" (structure only) |
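For typed access in downstream tooling, these fields can be mirrored in a small Python dataclass. This is an illustrative sketch, not part of wikigen itself; the `WikiResult` class and `from_dict` helper are assumptions that simply follow the JSON keys documented above:

```python
import json
from dataclasses import dataclass


@dataclass
class WikiResult:
    """Typed view of one element of the top-level JSON array."""
    project: str
    repos: list
    output_dir: str
    pages: list
    total_pages: int
    failed: int
    duration: str
    status: str

    @classmethod
    def from_dict(cls, d: dict) -> "WikiResult":
        # Dataclass field names match the JSON keys one-to-one.
        return cls(**{f: d[f] for f in cls.__dataclass_fields__})


raw = """{"project": "Hello-World", "repos": ["octocat/Hello-World"],
 "output_dir": "/home/user/wiki-output/Hello-World", "pages": [],
 "total_pages": 3, "failed": 1, "duration": "2m34s", "status": "completed"}"""
result = WikiResult.from_dict(json.loads(raw))
print(result.project, result.failed)  # Hello-World 1
```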
The status field indicates the result of the generation process:

- "completed": Generation finished; individual pages may still have failed (check the failed count)
- "error": Fatal error occurred before structure determination or during initial setup (no pages generated)
- "dry-run": Dry-run mode completed; wiki structure was determined but pages were not generated
- "running": Initial state before processing completes (only visible if generation is interrupted)
Each page in the pages array contains metadata about a single generated wiki page.
```json
{
  "title": "System Overview",
  "filename": "System-Overview",
  "size": 12847,
  "status": "ok"
}
```

| Field | Type | Description |
|---|---|---|
| title | string | Human-readable page title as determined by Claude analysis |
| filename | string | GitHub Wiki-compatible filename (spaces converted to hyphens), without the .md extension |
| size | integer | File size in bytes of the generated markdown; 0 indicates generation failure |
| status | string | Generation outcome: "ok" (successfully generated, ≥200 bytes), "failed" (generation failed or file too small), "pending" (queued for generation in dry-run mode) |
The per-page status field indicates generation success:
- "ok": Page was successfully generated with ≥200 bytes of content
- "failed": Page generation failed after maximum retry attempts (default 3) or file size is <200 bytes
- "pending": Page is queued for generation; appears only in dry-run output
```bash
$ ./wikigen -r octocat/Hello-World -json
```

```json
[
  {
    "project": "Hello-World",
    "repos": [
      "octocat/Hello-World"
    ],
    "output_dir": "/home/user/wiki-output/Hello-World",
    "pages": [
      {
        "title": "System Overview",
        "filename": "System-Overview",
        "size": 8432,
        "status": "ok"
      },
      {
        "title": "Installation & Setup",
        "filename": "Installation-Setup",
        "size": 6215,
        "status": "ok"
      },
      {
        "title": "API Reference",
        "filename": "API-Reference",
        "size": 0,
        "status": "failed"
      }
    ],
    "total_pages": 3,
    "failed": 1,
    "duration": "2m34s",
    "status": "completed"
  }
]
```

When multiple repositories are grouped into a single wiki (using the project:owner/repo format):
```bash
$ ./wikigen -r "myapp:owner/frontend,myapp:owner/backend" -json
```

```json
[
  {
    "project": "myapp",
    "repos": [
      "owner/frontend",
      "owner/backend"
    ],
    "output_dir": "/home/user/wiki-output/myapp",
    "pages": [
      {
        "title": "Architecture Overview",
        "filename": "Architecture-Overview",
        "size": 15234,
        "status": "ok"
      },
      {
        "title": "Frontend Components",
        "filename": "Frontend-Components",
        "size": 9876,
        "status": "ok"
      },
      {
        "title": "Backend API",
        "filename": "Backend-API",
        "size": 11245,
        "status": "ok"
      }
    ],
    "total_pages": 3,
    "failed": 0,
    "duration": "3m47s",
    "status": "completed"
  }
]
```

When processing multiple projects with -f repos.txt:
```bash
$ ./wikigen -f repos.txt -p 2 -json
```

The output is an array of results, one per project:

```json
[
  {
    "project": "project-a",
    "repos": ["owner/project-a"],
    "output_dir": "/home/user/wiki-output/project-a",
    "pages": [...],
    "total_pages": 5,
    "failed": 0,
    "duration": "1m23s",
    "status": "completed"
  },
  {
    "project": "project-b",
    "repos": ["owner/project-b"],
    "output_dir": "/home/user/wiki-output/project-b",
    "pages": [...],
    "total_pages": 8,
    "failed": 1,
    "duration": "1m23s",
    "status": "completed"
  }
]
```
```mermaid
flowchart TD
    A["CLI Invocation<br/>-json flag"] --> B["generateWiki<br/>per project"]
    B --> C["Create WikiResult"]
    C --> D["Phase 1: Structure<br/>Determine pages"]
    D --> E["Populate TotalPages"]
    E --> F["Phase 2: Generate Pages<br/>Parallel execution"]
    F --> G["Update Page Status<br/>ok/failed"]
    G --> H["Finalize Result<br/>Status, Duration"]
    H --> I["JSON Encoding<br/>to stdout"]
    I --> J["Array of WikiResult"]
```
JSON output enables integration with automated build pipelines. The structured format allows downstream tools to:

- Parse generation results programmatically without regex-based text parsing
- Detect failures by examining the failed count or per-page status values
- Extract page metadata for publishing, archiving, or cross-referencing
- Track generation performance using the duration field
```bash
#!/bin/bash
result=$(./wikigen -r owner/repo -json | jq '.[0]')
total=$(echo "$result" | jq '.total_pages')
failed=$(echo "$result" | jq '.failed')
if [ "$failed" -gt 0 ]; then
  echo "❌ Wiki generation failed: $failed/$total pages"
  exit 1
fi
echo "✅ Wiki generated successfully: $total pages"
# Continue with wiki deployment...
```

```bash
#!/bin/bash
./wikigen -f repos.txt -json | jq '.[] | {project, total_pages, failed, status}' | \
  jq -r '"[\(.project)] \(.total_pages) pages, \(.failed) failed (\(.status))"'
```

Output:

```
[project-a] 5 pages, 0 failed (completed)
[project-b] 8 pages, 1 failed (completed)
[project-c] 0 pages, 0 failed (error)
```
Sources:
- main.go:904 — Flag definition
- main.go:1058-1061 — Output encoding
When both -dry-run and -json flags are combined, the output reflects structure determination without page content generation:
```json
[
  {
    "project": "example",
    "repos": ["owner/example"],
    "output_dir": "/home/user/wiki-output/example",
    "pages": [
      {
        "title": "System Overview",
        "filename": "System-Overview",
        "size": 0,
        "status": "pending"
      },
      {
        "title": "API Reference",
        "filename": "API-Reference",
        "size": 0,
        "status": "pending"
      }
    ],
    "total_pages": 2,
    "failed": 0,
    "duration": "0s",
    "status": "dry-run"
  }
]
```

In dry-run mode:

- All pages have status: "pending" (not yet generated)
- All pages have size: 0 (no content written)
- duration reflects structure determination time only
- status is "dry-run" rather than "completed"
The JSON output uses standard UTF-8 encoding with two-space indentation for readability:

```go
enc := json.NewEncoder(os.Stdout)
enc.SetIndent("", "  ")
enc.Encode(results)
```
Results from parallel project processing are synchronized before JSON output:

```go
var mu sync.Mutex
var results []*WikiResult

// Inside parallel task goroutine:
mu.Lock()
results = append(results, result)
mu.Unlock()

// After all tasks complete:
for _, r := range results {
    r.Duration = elapsed.String()
}
```

All results share the same Duration value (total elapsed time across all projects), since duration is calculated once after all concurrent tasks complete.
When a fatal error occurs during generation, the result is still returned with status: "error":

```json
[
  {
    "project": "invalid-repo",
    "repos": ["owner/invalid-repo"],
    "output_dir": "",
    "pages": [],
    "total_pages": 0,
    "failed": 0,
    "duration": "0.5s",
    "status": "error"
  }
]
```

Error details are logged to stderr and, optionally, to the log file specified by the -log flag. The JSON output contains only the result metadata; detailed error messages are not included in the JSON structure itself.
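A pipeline can scan the array for fatal errors before doing anything else. A sketch in Python (jq works equally well); the `fatal_projects` helper is illustrative:

```python
import json
import sys


def fatal_projects(results: list) -> list:
    """Names of projects that failed before any pages were generated."""
    return [r["project"] for r in results if r.get("status") == "error"]


raw = """[
  {"project": "good-repo", "repos": [], "output_dir": "/tmp/good-repo",
   "pages": [], "total_pages": 5, "failed": 0,
   "duration": "1m2s", "status": "completed"},
  {"project": "invalid-repo", "repos": [], "output_dir": "",
   "pages": [], "total_pages": 0, "failed": 0,
   "duration": "0.5s", "status": "error"}
]"""
bad = fatal_projects(json.loads(raw))
if bad:
    # Report on stderr, mirroring wikigen's own stdout/stderr separation.
    print("fatal errors in:", ", ".join(bad), file=sys.stderr)
```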
Filter results by status:

```bash
./wikigen -f repos.txt -json | jq '.[] | select(.status == "completed")'
```

Extract failed pages across all projects:

```bash
./wikigen -f repos.txt -json | jq '.[] | select(.failed > 0) | {project, failed, pages: (.pages | map(select(.status == "failed")))}'
```

Calculate aggregate statistics:

```bash
./wikigen -f repos.txt -json | jq '[.[] | {total: .total_pages, failed: .failed}] | {total_pages: map(.total) | add, total_failed: map(.failed) | add}'
```

In Go:
```go
var results []*WikiResult
err := json.NewDecoder(os.Stdin).Decode(&results)
if err != nil {
    log.Fatal(err)
}
for _, r := range results {
    log.Printf("[%s] %d/%d pages generated", r.Project, r.TotalPages-r.Failed, r.TotalPages)
}
```

In Python:
```python
import json
import sys

results = json.load(sys.stdin)
for result in results:
    print(f"[{result['project']}] {result['total_pages']} pages, {result['failed']} failed")
```

The JSON output does not impact generation performance; it is produced after all pages are generated and only affects output formatting. When combined with the -p flag for parallel projects or the -pp flag for parallel pages, JSON output is still a single unified array written once at completion.
For batch processing of many repositories, consider:

- Streaming parsing: Large JSON arrays (100+ projects) should be parsed incrementally rather than loaded entirely into memory
- Progress monitoring: While generation is in progress, results are not available; use the stderr progress output for real-time status
- Result persistence: Save JSON output to a file for later analysis or archival:

```bash
./wikigen -f repos.txt -json > results.json 2>progress.log
```
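The streaming-parsing point can be sketched with the standard library alone, consuming the array one element at a time via json.JSONDecoder.raw_decode. This still holds the raw text in memory; for truly huge inputs a dedicated streaming parser such as the third-party ijson library may be a better fit:

```python
import json


def iter_results(text: str):
    """Yield WikiResult dicts from a JSON array one at a time."""
    decoder = json.JSONDecoder()
    idx = text.index("[") + 1  # step inside the top-level array
    while True:
        # Skip whitespace and the commas between array elements.
        while idx < len(text) and text[idx] in " \t\r\n,":
            idx += 1
        if idx >= len(text) or text[idx] == "]":
            return
        obj, end = decoder.raw_decode(text, idx)
        yield obj
        idx = end


data = '[{"project": "a", "failed": 0}, {"project": "b", "failed": 1}]'
names = [r["project"] for r in iter_results(data)]
print(names)  # ['a', 'b']
```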
wikigen produces two distinct forms of output:
| Output | Destination | Format | Purpose |
|---|---|---|---|
| Progress & Errors | stderr | Human-readable | Real-time monitoring during generation |
| JSON Results | stdout | Structured JSON | Programmatic processing |
| Generated Pages | Filesystem | Markdown | Content storage and publishing |
| Error Log | _errors.log | Text log | Persistent record of page-level failures |
This separation allows tools to simultaneously monitor progress in real-time while capturing structured results for further processing.
- CLI Usage & Commands — Complete command-line interface reference including the -json flag
- Output Format & Wiki Structure — Generated wiki file structure and GitHub Wiki compatibility
- Error Handling & Retry Mechanism — Error logging to _errors.log and automatic retry strategy
- Parallel Processing & Concurrency — Concurrent page and project generation with -p and -pp flags
- Architecture & Design — Overall generation workflow and two-phase approach