* clarified abn documentation
Signed-off-by: Michael Kalantar <[email protected]>
docs/tutorials/abn/abn.md (+20 −39)
@@ -4,22 +4,7 @@ template: main.html
 # A/B Experiments

-A/B testing an application's backend component is challenging.
-A/B testing typically relies on business metrics computed by a frontend, user-facing, service.
-Metric values often depend on one or more interactions with backend (not user-facing) components.
-To A/B test a backend component, it is necessary to be able to associate a metric value (computed by the frontend) to the version of the backend component that contributed to its computation.
-The challenge is that the frontend service often does not know which version of the backend component processed a given request.
-
-To address this challenge, Iter8 introduces an A/B/n SDK which provides a frontend service with two APIs:
-
-a. **Lookup()** - identifies a version of a backend component to send a request to
-
-b. **WriteMetric()** - associates a metric with a backend component
-
-This SDK, implemented using gRPC, can be used from a number of frontend implementation languages including *Node.js*, *Python*, *Ruby*, and *Go*, among others. Details of the Iter8 SDK are documented in the [gRPC protoc file](https://github.com/iter8-tools/iter8/blob/v0.13.0/abn/grpc/abn.proto).
-
-This tutorial describes an A/B testing experiment for a backend component.
-Example implementations of frontend components are provided in *Node.js* and *Go*.
+This tutorial describes an [A/B testing](../../user-guide/topics/ab_testing.md) experiment for a backend component.
@@ -32,7 +17,7 @@ Example implementations of frontend components are provided in *Node.js* and *Go
 ## Launch Iter8 A/B/n service

-Deploy the Iter8 A/B/n service. When deploying the service, specify which Kubernetes resources to watch for each application. To watch for versions of the *backend* application in the *default* namespace, configure the service to watch for service and deployment resources:
+Deploy the Iter8 A/B/n service. When deploying the service, specify which Kubernetes resource types to watch for each application. To watch for versions of the *backend* application in the *default* namespace, configure the service to watch for Kubernetes service and deployment resources:

 ```shell
 helm install --repo https://iter8-tools.github.io/hub iter8-abn abn \
To simplify specification, Iter8 assumes certain conventions:

-- resources of all versions are deployed to the same namespace
-- there is only one resource of each resource type among the resources of a version
-- all resources that comprise the baseline version are named as: _<application\_name>_
-- all resources that comprise the i<sup>th</sup> candidate version are named as: _<application\_name>-candidate-<i>_
+- The baseline track identifier is the application name
+- Track identifiers associated with candidate versions are of the form `<application_name>-candidate-<index>`
+- All resource objects for all versions are deployed in the same namespace
+- There is only 1 resource object of a given type in each version
+- The name of each object in the version associated with the baseline track is the application name
+- The name of each object in the version associated with a candidate track is of the form `<application_name>-candidate-<index>`, where index is 1, 2, etc.

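These conventions can be illustrated with a sketch of the object metadata for an application named *backend*. This is a hypothetical fragment, not taken from the sample application; only the `name` and `namespace` fields illustrate the conventions, and the container specs are omitted:

```yaml
# Hypothetical sketch: illustrates the naming conventions only.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: backend              # baseline track: object name is the application name
  namespace: default         # all versions are deployed in the same namespace
# (container spec omitted in this sketch)
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: backend-candidate-1  # first candidate track: <application_name>-candidate-1
  namespace: default         # same namespace as the baseline
# (container spec omitted in this sketch)
```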
## Deploy the sample application

 Deploy both the frontend and backend components of the application as described in each tab:

 === "frontend"
-    Install the frontend service using an implementation in the language of your choice:
+    Install the frontend component using an implementation in the language of your choice:

     === "node"
         ```shell
@@ -66,10 +54,10 @@ Deploy both the frontend and backend components of the application as described
-    The frontend service is implemented to call **Lookup()** before each call to the backend service. It sends its request to the recommended backend service.
+    The frontend component is implemented to call *Lookup()* before each call to the backend component. The frontend component uses the returned track identifier to route the request to a version of the backend component.

 === "backend"
-    Deploy version *v1* of the *backend* component as track *backend*.
+    Deploy version *v1* of the *backend* component, associating it with the track identifier *backend*.

 Before calling the backend, the frontend uses *Lookup()* to identify the track to send requests to. Since there is only one version of the backend deployed, all requests will be sent to it.

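The effect of *Lookup()* can be sketched in Go. The following is not the Iter8 SDK (in the real SDK, *Lookup()* is a gRPC call to the A/B/n service); it is a self-contained illustration, with hypothetical names, of how a frontend might deterministically map a user to one of the available track identifiers so that a given user consistently sees the same version:

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// pickTrack deterministically maps a user identifier to one of the
// available track identifiers by hashing the user identifier. This is
// a hypothetical stand-in for Lookup(): a given user is always routed
// to the same track while the set of tracks is unchanged.
func pickTrack(user string, tracks []string) string {
	h := fnv.New32a()
	h.Write([]byte(user))
	return tracks[int(h.Sum32())%len(tracks)]
}

func main() {
	tracks := []string{"backend", "backend-candidate-1"}
	for _, u := range []string{"alice", "bob", "alice"} {
		fmt.Printf("%s -> %s\n", u, pickTrack(u, tracks))
	}
}
```

Deterministic assignment matters for A/B testing: metric values written later via *WriteMetric()* can then be attributed to the version the user actually interacted with.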
## Generate load

-Generate load. In separate shells, port-forward requests to the frontend service and generate load for multiple users. For example:
+Generate load. In separate shells, port-forward requests to the frontend component and generate load for multiple users. A [script](https://raw.githubusercontent.com/iter8-tools/docs/main/samples/abn-sample/generate_load.sh) is provided to do this. To use it:

 ```shell
 kubectl port-forward service/frontend 8090:8090
 ```
 ```shell
 curl -s https://raw.githubusercontent.com/iter8-tools/docs/main/samples/abn-sample/generate_load.sh | sh -s --
 ```

-Note that the the names `foo` and `foobar` are examples. They may be mapped to the same track label -- since we are using
-
## Deploy a candidate version

-Deploy version *v2* of the *backend* component as track *backend-candidate-1*.
+Deploy version *v2* of the *backend* component, associating it with the track identifier *backend-candidate-1*.

-Until the candidate version is ready; that is, until all expected resources are deployed and available, calls to *Lookup()* will continue to return only the *backend* track.
-Once the candidate version is ready, *Lookup()* will return both tracks so that requests will be distributed between them.
+Until the candidate version is ready; that is, until all expected resources are deployed and available, calls to *Lookup()* will return only the *backend* track identifier.
+Once the candidate version is ready, *Lookup()* will return both track identifiers so that requests will be distributed between versions.

## Launch experiment
@@ -117,7 +101,7 @@ iter8 k launch \
 ```

 ??? note "About this experiment"
-    This experiment periodically (in this case, once a minute) reads the `abn` metrics associated with the *backend* application component in the *default* namespace. These metrics are written by the frontend service using the *WriteMetric()* interface as a part of processing user requests.
+    This experiment periodically (in this case, once a minute) reads the `abn` metrics associated with the *backend* application component in the *default* namespace. These metrics are written by the frontend component using the *WriteMetric()* interface as a part of processing user requests.

## Inspect experiment report
@@ -149,9 +133,9 @@ iter8 k report
 abn/sample_metric/min | 0.00 | 1.00
 abn/sample_metric/stddev | 28.52 | 31.91
 ```

-The output allows you to compare the versions against each other and select a winner. Since the experiment runs periodically, you should expect the values in the report to change over time.
+The output allows you to compare the versions against each other and select a winner. Since the experiment runs periodically, the values in the report will change over time.

-Once a winner is identified, the experiment can be terminated and the winner can be promoted and the candidate versions can be deleted.
+Once a winner is identified, the experiment can be terminated, the winner can be promoted, and the candidate version(s) can be deleted.
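The per-track rows in the report (count, mean, stddev, and so on) are ordinary summary statistics over the metric values written via *WriteMetric()*. A minimal sketch, not Iter8 code, of how such a summary could be computed; the metric values below are hypothetical:

```go
package main

import (
	"fmt"
	"math"
)

// summarize computes the count, mean, and (population) standard
// deviation of a slice of metric values, the same quantities shown in
// the abn/sample_metric rows of the experiment report.
func summarize(values []float64) (count int, mean, stddev float64) {
	count = len(values)
	if count == 0 {
		return
	}
	for _, v := range values {
		mean += v
	}
	mean /= float64(count)
	var ss float64
	for _, v := range values {
		ss += (v - mean) * (v - mean)
	}
	stddev = math.Sqrt(ss / float64(count))
	return
}

func main() {
	// hypothetical metric values for one track
	c, m, s := summarize([]float64{10, 50, 90})
	fmt.Printf("count=%d mean=%.2f stddev=%.2f\n", c, m, s)
}
```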