
Commit 5203bb8

A few tweaks to the docs (#346)
1 parent 53bb5b9 commit 5203bb8

File tree

6 files changed: +37 -29 lines


docs/commands.md

Lines changed: 12 additions & 12 deletions
@@ -1,7 +1,7 @@
 # RedisAI Commands
 RedisAI is a Redis module, and as such it implements several data types and the respective commands to use them.

-All of RedisAI's commands are begin with the `AI.` prefix. The following sections describe these commands.
+All of RedisAI's commands begin with the `AI.` prefix. The following sections describe these commands.

 **Syntax Conventions**

@@ -365,7 +365,7 @@ def addtwo(a, b):
 It can be stored as a RedisAI script using the CPU device with [`redis-cli`](https://redis.io/topics/rediscli) as follows:

 ```
-$ cat addtwo.py | redis-cli -x AI.SCRIPTSET myscript addtwo CPU TAG SOURCE myscript:v0.1
+$ cat addtwo.py | redis-cli -x AI.SCRIPTSET myscript addtwo CPU TAG myscript:v0.1 SOURCE
 OK
 ```

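A script stored this way is then invoked with `AI.SCRIPTRUN`, naming the entry-point function and the input/output tensor keys. A minimal sketch (the tensor key names below are illustrative, not taken from this commit):

```
AI.TENSORSET tA FLOAT 2 VALUES 2 3
AI.TENSORSET tB FLOAT 2 VALUES 4 5
AI.SCRIPTRUN myscript addtwo INPUTS tA tB OUTPUTS tC
```
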
@@ -514,9 +514,9 @@ The **`AI.DAGRUN`** command specifies a direct acyclic graph of operations to ru

 It accepts one or more operations, split by the pipe-forward operator (`|>`).

-By default, the DAG execution context is local, meaning that loading and persisting tensors should be done explicitly. The user should specify which key tensors to load from keyspace using the `LOAD` keyword, and which command outputs to persist to the keyspace using the `PERSIST` keyspace.
+By default, the DAG execution context is local, meaning that tensor keys appearing in the DAG only live in the scope of the command. That is, setting a tensor with `TENSORSET` will store it in local memory and not set it to an actual database key. One can refer to that key in subsequent commands within the DAG, but that key won't be visible outside the DAG or to other clients - no keys are open at the database level.

-When `PERSIST` is not present, object savings are done locally and kept only during the context of the DAG meaning that no output keys are open.
+Loading and persisting tensors from/to the keyspace should be done explicitly. The user should specify which tensor keys to load from the keyspace using the `LOAD` keyword, and which command outputs to persist to the keyspace using the `PERSIST` keyword.

 As an example, if `command 1` sets a tensor, it can be referenced by any further command on the chaining.

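A minimal sketch of such a DAG call with explicit `LOAD` and `PERSIST` (the model key `mymodel` and the tensor keys are illustrative): the pre-existing tensor `tA` is loaded from the keyspace, `tB` is set only within the DAG's local context, and only the output `tC` is persisted back:

```
AI.DAGRUN LOAD 1 tA PERSIST 1 tC |> AI.TENSORSET tB FLOAT 2 VALUES 4 5 |> AI.MODELRUN mymodel INPUTS tA tB OUTPUTS tC
```
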
@@ -611,21 +611,21 @@ The following example obtains the previously-run 'myscript' script's runtime sta

 ```
 redis> AI.INFO myscript
-1) KEY
+1) key
 2) "myscript"
-3) TYPE
+3) type
 4) SCRIPT
-5) BACKEND
+5) backend
 6) TORCH
-7) DEVICE
+7) device
 8) CPU
-9) DURATION
+9) duration
 10) (integer) 11391
-11) SAMPLES
+11) samples
 12) (integer) -1
-13) CALLS
+13) calls
 14) (integer) 1
-15) ERRORS
+15) errors
 16) (integer) 0
 ```

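For completeness, the runtime statistics shown above can also be reset; a minimal sketch using the command's `RESETSTAT` form:

```
redis> AI.INFO myscript RESETSTAT
OK
```
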
docs/configuration.md

Lines changed: 2 additions & 0 deletions
@@ -151,6 +151,8 @@ redis-server --loadmodule /usr/lib/redis/modules/redisai.so \
 ### THREADS_PER_QUEUE
 The **THREADS_PER_QUEUE** configuration option controls the number of worker threads allocated to each device's job queue. Multiple threads can be used for executing different independent operations in parallel.

+Note that RedisAI maintains one job queue per device (CPU, GPU:0, GPU:1). Each job queue is consumed by THREADS_PER_QUEUE threads.
+
 This option significantly improves the performance of simple, low-effort computation-wise models since there is spare computation cycle available from modern CPUs and hardware accelerators (GPUs, TPUs, ...).

 _Expected Value_
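
For illustration, combined with the load line shown in the hunk header above, raising the per-queue thread count could look as follows (the value 4 is only an example):

```
redis-server --loadmodule /usr/lib/redis/modules/redisai.so THREADS_PER_QUEUE 4
```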

docs/developer.md

Lines changed: 2 additions & 4 deletions
@@ -6,16 +6,15 @@ The following sections discuss topics relevant to the development of the RedisAI

 RedisAI bundles together best-of-breed technologies for delivering stable and fast model serving. To do so, we need to abstract from what each specific DL/ML framework offers and provide common data structures and APIs to the DL/ML domain.

-
 As a way of representing tensor data we've embraced [dlpack](https://github.com/dmlc/dlpack) - a community effort to define a common tensor data structure that can be shared by different frameworks, supported by cuPy, cuDF, DGL, TGL, PyTorch, and MxNet.

 **Data Structures**

 RedisAI provides the following data structures:

 * **Tensor**: represents an n-dimensional array of values
-* **Model**: represents a frozen graph by one of the supported DL/ML framework backends
-* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html)
+* **Model**: represents a computation graph by one of the supported DL/ML framework backends
+* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html) program

 ## Source code layout

@@ -33,7 +32,6 @@ of complexity incrementally.

 **redisai.c**

-
 This is the entry point of the RedisAI module, responsible for registering the new commands in the Redis server, and containing all command functions to be called. This file is also responsible for exporting of Tensor, Script and Model APIs to other Modules.

 **tensor.h**

docs/images/graph.pb.png

7.89 KB

docs/index.md

Lines changed: 3 additions & 1 deletion
@@ -4,10 +4,12 @@

 RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is being a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. RedisAI both simplifies the deployment and serving of graphs by leveraging on Redis' production-proven infrastructure, as well as maximizes computation throughput by adhering to the principle of data locality.

+RedisAI is a joint effort between [Redis Labs](https://www.redislabs.com) and [Tensorwerk](https://tensorwerk.com).
+
 ## Where Next?
 * The [Introduction](intro.md) is the recommended starting point
 * The [Quickstart](quickstart.md) page provides information about building, installing and running RedisAI
-* The [Commands](commands.md) page is a reference of RedisAI's API
+* The [Commands](commands.md) page is a reference of the RedisAI API
 * The [Clients](clients.md) page lists RedisAI clients by programming language
 * The [Configuration](configuration.md) page explains how to configure RedisAI
 * The [Developer](developer.md) page has more information about the design and implementation of the RedisAI module

docs/intro.md

Lines changed: 18 additions & 12 deletions
@@ -31,10 +31,10 @@ In broad strokes, RedisAI's looks as follows:
 | | | myscript +----->+ Script +---+ +----+-----+ +-->+ ONNXRuntime | | | |
 | | | | | +--------+ | ^ | +-------------+ | | |
 | | +----------+ | | | | | | |
-| | | | | +--------+ | | | +-------------+ | | |
-| | | mydag +----->+ DAG +---+ | +-->+ ... | | | |
-| | | | | +--------+ | +-------------+ | | |
-| | +----------+ ^ +------------------------|-----------------------------+ | |
+| | ^ | +--------+ | | | +-------------+ | | |
+| | | | + DAG +---+ | +-->+ ... | | | |
+| | | | +--------+ | +-------------+ | | |
+| | | +------------------------|-----------------------------+ | |
 | +--------------|--------------------------|-------------------------------+ |
 | v v |
 | +--------------+-----------------+ +------------------------------------+ |
@@ -83,22 +83,22 @@ A **Redis module** is a shared library that can be loaded by the Redis server du
 * [Modules published at redis.io](https://redis.io/modules)

 ### Why RedisAI?
-RedisAI bundles together best-of-breed technologies for delivering stable and performant graph serving. Every DL/ML framework ships with a backend for executing the graphs developed by it, and the common practice for serving these is building a simple server.
+RedisAI bundles together best-of-breed technologies for delivering stable and performant computation graph serving. Every DL/ML framework ships with a runtime for executing the models developed with it, and the common practice for serving these is building a simple server around them.

 RedisAI aims to be that server, saving you from the need of installing the backend you're using and developing a server for it. By itself that does not justify RedisAI's existence so there's more to it. Because RedisAI is implemented as a Redis module it automatically benefits from the server's capabilities: be it Redis' native data types, its robust eco-system of clients, high-availability, persistence, clustering, and Enterprise support.

-Because Redis is an in-memory data structure server RedisAI uses it for storing all of its data. The main data type supported by RedisAI is the Tensor that is the standard representation of data in the DL/ML domain. Because tensors are stored memory space of the Redis server they are readily accessible to any of RedisAI's backend libraries at minimal latency.
+Because Redis is an in-memory data structure server RedisAI uses it for storing all of its data. The main data type supported by RedisAI is the Tensor that is the standard representation of data in the DL/ML domain. Because tensors are stored in the memory space of the Redis server, they are readily accessible to any of RedisAI's backend libraries at minimal latency.

 The locality of data, which is tensor data in adjacency to DL/ML models backends, allows RedisAI to provide optimal performance when serving models. It also makes it a perfect choice for deploying DL/ML models in production and allowing them to be used by any application.

-Furthermore, RedisAI is also an optimal testbed for models as it allows the parallel execution of multiple graphs and, in future versions, assessing their respective performance in real-time.
+Furthermore, RedisAI is also an optimal testbed for models as it allows the parallel execution of multiple computation graphs and, in future versions, assessing their respective performance in real-time.

 #### Data Structures
 RedisAI provides the following data structures:

 * **Tensor**: represents an n-dimensional array of values
-* **Model**: represents a frozen graph by one of the supported DL/ML framework backends
-* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html)
+* **Model**: represents a computation graph by one of the supported DL/ML framework backends
+* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html) program

 #### DL/ML Backends
 RedisAI supports the following DL/ML identifiers and respective backend libraries:
@@ -135,7 +135,7 @@ docker exec -it redisai redis-cli
 ```

 ## Using RedisAI Tensors
-A **Tensor** is an n-dimensional array and is the standard vehicle for DL/ML data. RedisAI adds to Redis a Tensor data structure that implements the tensor type. Like any datum in Redis, RedisAI's Tensors are identified by key names.
+A **Tensor** is an n-dimensional array and is the standard representation for data in DL/ML workloads. RedisAI adds to Redis a Tensor data structure that implements the tensor type. Like any datum in Redis, RedisAI's Tensors are identified by key names.

 Creating new RedisAI tensors is done with the [`AI.TENSORSET` command](commands.md#aitensorset). For example, consider the tensor: $\begin{equation*} tA = \begin{bmatrix} 2 \\ 3 \end{bmatrix} \end{equation*}$.

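For reference, the `AI.TENSORSET` call this passage leads into (the one whose 'OK' reply appears in the next hunk) can be reconstructed from the surrounding text as:

```
AI.TENSORSET tA FLOAT 2 VALUES 2 3
```
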
@@ -154,7 +154,7 @@ Copy the command to your cli and hit the `<ENTER>` on your keyboard to execute i
 OK
 ```

-The reply 'OK' means that the operation was successful. We've called the `AI.TENSORSET` command to set the key named 'tA' with the tensor's data, but the name could have been any string value. The `FLOAT` argument specifies the type of values that the tensor stores, and in this case a double-precision floating-point. After the type argument comes the tensor's shape as a list of its dimensions, or just a single dimension of 2.
+The reply 'OK' means that the operation was successful. We've called the `AI.TENSORSET` command to set the key named 'tA' with the tensor's data, but the name could have been any string value. The `FLOAT` argument specifies the type of values that the tensor stores, and in this case a single-precision floating-point. After the type argument comes the tensor's shape as a list of its dimensions, or just a single dimension of 2.

 The `VALUES` argument tells RedisAI that the tensor's data will be given as a sequence of numeric values and in this case the numbers 2 and 3. This is useful for development purposes and creating small tensors, however for practical purposes the `AI.TENSORSET` command also supports importing data in binary format.

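A sketch of that binary path, assuming a file holding the tensor's raw float32 bytes (the file name is illustrative), using the `BLOB` argument together with redis-cli's `-x` flag, in the same spirit as the `AI.SCRIPTSET` change earlier in this commit:

```
$ cat tensor_blob.raw | redis-cli -x AI.TENSORSET tA FLOAT 2 BLOB
```
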
@@ -188,7 +188,7 @@ A **Model** is a Deep Learning or Machine Learning frozen graph that was generat

 Models, like any other Redis and RedisAI data structures, are identified by keys. A Model's key is created using the [`AI.MODELSET` command](commands.md#aimodelset) and requires the graph payload serialized as protobuf for input.

-In our examples, we'll use one of the graphs that RedisAI uses in its tests, namely 'graph.pb', which can be downloaded from [here](https://github.com/RedisAI/RedisAI/raw/master/test/test_data/graph.pb).
+In our examples, we'll use one of the graphs that RedisAI uses in its tests, namely 'graph.pb', which can be downloaded from [here](https://github.com/RedisAI/RedisAI/raw/master/test/test_data/graph.pb). This graph was created using TensorFlow with [this script](https://github.com/RedisAI/RedisAI/blob/master/test/test_data/tf-minimal.py).

 ??? info "Downloading 'graph.pb'"
 Use a web browser or the command line to download 'graph.pb':
@@ -197,6 +197,12 @@ In our examples, we'll use one of the graphs that RedisAI uses in its tests, nam
 wget https://github.com/RedisAI/RedisAI/raw/master/test/test_data/graph.pb
 ```

+You can view the computation graph using [Netron](https://lutzroeder.github.io/netron/), which supports all frameworks supported by RedisAI.
+
+![Computation graph visualized in Netron](images/graph.pb.png "Computation Graph Visualized in Netron")
+
+This is a great way to inspect a graph and find out node names for inputs and outputs.
+
 redis-cli doesn't provide a way to read files' contents, so to load the model with it we'll use the command line and output pipes:

 ```
