/*
Package flowengine provides an agent-style LLM orchestration framework.

# Overview

FlowEngine is a zero-external-dependency framework for building composable
LLM workflows with RPC-style action definitions and type-safe response handling.
It is designed for orchestrating relatively stable flows within a single context.

# Core Features

  - Agent-style Step definitions with Context, Guide, Task, and Input
  - RPC-style action registration with automatic JSON Schema generation
  - Session management with context chaining
  - Configurable retry with validation and hint injection
  - Type-safe parameter extraction using generics
  - Composable flows: Chain, Decide, Parallel, Iterate
  - Function Call output mode for structured responses
  - Caching adapter decorator
  - Zero external dependencies
# Three-Layer Prompt Model

The engine enforces a strict "Three-Layer" structure for prompts:

  - Layer 1: Context (System) - Inject "Just-in-Time" data using the <context> tag
  - Layer 2: Guide (System) - Define "How to operate" using the <guide> tag
  - Layer 3: Task (User) - Define "What to do" using the <task> tag
# Quick Start

	// 1. Define a step with actions
	step := flowengine.NewStep("intent").
		Context("You are an intent classifier").
		Task("Classify user query intent").
		Input("Query: {{.query}}").
		Action("search", SearchParams{}).
		Action("chat", ChatParams{}).
		Build()

	// 2. Create an engine with an adapter
	engine := flowengine.New(yourAdapter)

	// 3. Run with metadata
	fc, resp, err := engine.Run(ctx, step, flowengine.Metadata{"query": "red dress"})

	// 4. Type-safe parameter extraction
	if params, ok := flowengine.ParamsAs[SearchParams](resp, "search"); ok {
		// Use params.Keywords, params.MaxCount
	}
# Session Support

Session management enables conversation context chaining:

	// Run within a session (context is preserved across calls)
	fc1, resp1, _ := engine.RunWithSession(ctx, "session-id", step1, metadata)
	fc2, resp2, _ := engine.RunWithSession(ctx, "session-id", step2, metadata)

	// Delete the session when done
	engine.DeleteSession(ctx, "session-id")
# Composable Flows

	// Sequential execution
	flow := flowengine.Chain(step1, step2, step3)

	// Conditional branching
	flow := flowengine.Decide(
		flowengine.When("search", searchHandler),
		flowengine.When("chat", chatHandler),
		flowengine.Otherwise(defaultHandler),
	)

	// Parallel execution
	flow := flowengine.Parallel(step1, step2)

	// Iteration until a condition is met
	flow := flowengine.Iterate(step, func(r *Response) bool {
		return r.Is("done")
	})
# Adapter Interfaces

Implement the Adapter interface to integrate with any LLM provider:

	type Adapter interface {
		Call(ctx context.Context, system, user string) (string, error)
		Stream(ctx context.Context, system, user string) (<-chan string, error)
	}

	// Session-aware adapter
	type SessionAdapter interface {
		Adapter
		CallWithSession(ctx context.Context, system, user, previousResponseID string) (resp, responseID string, err error)
		DeleteSession(ctx context.Context, responseID string) error
	}

	// Function Call adapter
	type ToolAdapter interface {
		Adapter
		CallWithTools(ctx context.Context, system, user string, tools []ToolDef) (*ToolCall, error)
	}
*/
package flowengine