docs/commands.md: 12 additions & 12 deletions
@@ -1,7 +1,7 @@
# RedisAI Commands
RedisAI is a Redis module, and as such it implements several data types and the respective commands to use them.
-All of RedisAI's commands are begin with the `AI.` prefix. The following sections describe these commands.
+All of RedisAI's commands begin with the `AI.` prefix. The following sections describe these commands.
**Syntax Conventions**
@@ -365,7 +365,7 @@ def addtwo(a, b):
It can be stored as a RedisAI script using the CPU device with [`redis-cli`](https://redis.io/topics/rediscli) as follows:
```
-$ cat addtwo.py | redis-cli -x AI.SCRIPTSET myscript addtwo CPU TAG SOURCE myscript:v0.1
+$ cat addtwo.py | redis-cli -x AI.SCRIPTSET myscript addtwo CPU TAG myscript:v0.1 SOURCE
OK
```
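A script stored this way is invoked by naming its entry point and the input and output tensor keys. As a rough illustration only (the tensor key names below are placeholders, not taken from the documentation, and each command replies `OK`):

```
AI.TENSORSET tA FLOAT 1 VALUES 40
AI.TENSORSET tB FLOAT 1 VALUES 2
AI.SCRIPTRUN myscript addtwo INPUTS tA tB OUTPUTS tC
```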
@@ -514,9 +514,9 @@ The **`AI.DAGRUN`** command specifies a direct acyclic graph of operations to ru
It accepts one or more operations, split by the pipe-forward operator (`|>`).
-By default, the DAG execution context is local, meaning that loading and persisting tensors should be done explicitly. The user should specify which key tensors to load from keyspace using the `LOAD` keyword, and which command outputs to persist to the keyspace using the `PERSIST` keyspace.
+By default, the DAG execution context is local, meaning that tensor keys appearing in the DAG only live in the scope of the command. That is, setting a tensor with `TENSORSET` will store it in local memory and not set it to an actual database key. One can refer to that key in subsequent commands within the DAG, but that key won't be visible outside the DAG or to other clients - no keys are open at the database level.
-When `PERSIST` is not present, object savings are done locally and kept only during the context of the DAG meaning that no output keys are open.
+Loading and persisting tensors from/to keyspace should be done explicitly. The user should specify which key tensors to load from keyspace using the `LOAD` keyword, and which command outputs to persist to the keyspace using the `PERSIST` keyword.
As an example, if `command 1` sets a tensor, it can be referenced by any further command on the chaining.
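For instance, a DAG that loads a persisted input tensor, sets a second tensor locally, runs a model over both, and persists only the result could look roughly like the sketch below; the model and key names are illustrative, not taken from the surrounding text:

```
AI.DAGRUN LOAD 1 tA PERSIST 1 tResult |>
    AI.TENSORSET tB FLOAT 2 VALUES 3 5 |>
    AI.MODELRUN mymodel INPUTS tA tB OUTPUTS tResult
```

Here `tA` is read from the keyspace, `tB` exists only inside the DAG, and `tResult` is written back to the keyspace when the DAG completes.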
@@ -611,21 +611,21 @@ The following example obtains the previously-run 'myscript' script's runtime sta
The **THREADS_PER_QUEUE** configuration option controls the number of worker threads allocated to each device's job queue. Multiple threads can be used for executing different independent operations in parallel.
+Note that RedisAI maintains one job queue per device (CPU, GPU:0, GPU:1). Each job queue is consumed by THREADS_PER_QUEUE threads.
+
This option significantly improves the performance of simple, low-effort computation-wise models since there is spare computation cycle available from modern CPUs and hardware accelerators (GPUs, TPUs, ...).
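As a sketch, THREADS_PER_QUEUE is supplied as a module argument when RedisAI is loaded; the value below is chosen only for illustration:

```
$ redis-server --loadmodule ./redisai.so THREADS_PER_QUEUE 4
```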
docs/developer.md: 2 additions & 4 deletions
@@ -6,16 +6,15 @@ The following sections discuss topics relevant to the development of the RedisAI
RedisAI bundles together best-of-breed technologies for delivering stable and fast model serving. To do so, we need to abstract from what each specific DL/ML framework offers and provide common data structures and APIs to the DL/ML domain.
-
As a way of representing tensor data we've embraced [dlpack](https://github.com/dmlc/dlpack) - a community effort to define a common tensor data structure that can be shared by different frameworks, supported by cuPy, cuDF, DGL, TGL, PyTorch, and MxNet.
**Data Structures**
RedisAI provides the following data structures:
* **Tensor**: represents an n-dimensional array of values
-* **Model**: represents a frozen graph by one of the supported DL/ML framework backends
-* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html)
+* **Model**: represents a computation graph by one of the supported DL/ML framework backends
+* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html) program
## Source code layout
@@ -33,7 +32,6 @@ of complexity incrementally.
**redisai.c**
-
This is the entry point of the RedisAI module, responsible for registering the new commands in the Redis server, and containing all command functions to be called. This file is also responsible for exporting of Tensor, Script and Model APIs to other Modules.
docs/index.md: 3 additions & 1 deletion
@@ -4,10 +4,12 @@
RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is being a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. RedisAI both simplifies the deployment and serving of graphs by leveraging on Redis' production-proven infrastructure, as well as maximizes computation throughput by adhering to the principle of data locality.
+RedisAI is a joint effort between [Redis Labs](https://www.redislabs.com) and [Tensorwerk](https://tensorwerk.com).
+
## Where Next?
* The [Introduction](intro.md) is the recommended starting point
* The [Quickstart](quickstart.md) page provides information about building, installing and running RedisAI
-* The [Commands](commands.md) page is a reference of RedisAI's API
+* The [Commands](commands.md) page is a reference of the RedisAI API
* The [Clients](clients.md) page lists RedisAI clients by programming language
* The [Configuration](configuration.md) page explains how to configure RedisAI
* The [Developer](developer.md) page has more information about the design and implementation of the RedisAI module
@@ -83,22 +83,22 @@ A **Redis module** is a shared library that can be loaded by the Redis server du
* [Modules published at redis.io](https://redis.io/modules)
### Why RedisAI?
-RedisAI bundles together best-of-breed technologies for delivering stable and performant graph serving. Every DL/ML framework ships with a backend for executing the graphs developed by it, and the common practice for serving these is building a simple server.
+RedisAI bundles together best-of-breed technologies for delivering stable and performant computation graph serving. Every DL/ML framework ships with a runtime for executing the models developed with it, and the common practice for serving these is building a simple server around them.
RedisAI aims to be that server, saving you from the need of installing the backend you're using and developing a server for it. By itself that does not justify RedisAI's existence so there's more to it. Because RedisAI is implemented as a Redis module it automatically benefits from the server's capabilities: be it Redis' native data types, its robust eco-system of clients, high-availability, persistence, clustering, and Enterprise support.
-Because Redis is an in-memory data structure server RedisAI uses it for storing all of its data. The main data type supported by RedisAI is the Tensor that is the standard representation of data in the DL/ML domain. Because tensors are stored memory space of the Redis server they are readily accessible to any of RedisAI's backend libraries at minimal latency.
+Because Redis is an in-memory data structure server RedisAI uses it for storing all of its data. The main data type supported by RedisAI is the Tensor that is the standard representation of data in the DL/ML domain. Because tensors are stored in the memory space of the Redis server, they are readily accessible to any of RedisAI's backend libraries at minimal latency.
The locality of data, which is tensor data in adjacency to DL/ML models backends, allows RedisAI to provide optimal performance when serving models. It also makes it a perfect choice for deploying DL/ML models in production and allowing them to be used by any application.
-Furthermore, RedisAI is also an optimal testbed for models as it allows the parallel execution of multiple graphs and, in future versions, assessing their respective performance in real-time.
+Furthermore, RedisAI is also an optimal testbed for models as it allows the parallel execution of multiple computation graphs and, in future versions, assessing their respective performance in real-time.
#### Data Structures
RedisAI provides the following data structures:
* **Tensor**: represents an n-dimensional array of values
-* **Model**: represents a frozen graph by one of the supported DL/ML framework backends
-* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html)
+* **Model**: represents a computation graph by one of the supported DL/ML framework backends
+* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html) program
#### DL/ML Backends
RedisAI supports the following DL/ML identifiers and respective backend libraries:

-A **Tensor** is an n-dimensional array and is the standard vehicle for DL/ML data. RedisAI adds to Redis a Tensor data structure that implements the tensor type. Like any datum in Redis, RedisAI's Tensors are identified by key names.
+A **Tensor** is an n-dimensional array and is the standard representation for data in DL/ML workloads. RedisAI adds to Redis a Tensor data structure that implements the tensor type. Like any datum in Redis, RedisAI's Tensors are identified by key names.
Creating new RedisAI tensors is done with the [`AI.TENSORSET` command](commands.md#aitensorset). For example, consider the tensor: $\begin{equation*} tA = \begin{bmatrix} 2 \\ 3 \end{bmatrix} \end{equation*}$.
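Based on the surrounding text (a `FLOAT` tensor of dimension 2 holding the values 2 and 3 under the key 'tA'), setting that tensor would look along these lines:

```
AI.TENSORSET tA FLOAT 2 VALUES 2 3
```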
@@ -154,7 +154,7 @@ Copy the command to your cli and hit the `<ENTER>` on your keyboard to execute i
OK
```
-The reply 'OK' means that the operation was successful. We've called the `AI.TENSORSET` command to set the key named 'tA' with the tensor's data, but the name could have been any string value. The `FLOAT` argument specifies the type of values that the tensor stores, and in this case a double-precision floating-point. After the type argument comes the tensor's shape as a list of its dimensions, or just a single dimension of 2.
+The reply 'OK' means that the operation was successful. We've called the `AI.TENSORSET` command to set the key named 'tA' with the tensor's data, but the name could have been any string value. The `FLOAT` argument specifies the type of values that the tensor stores, and in this case a single-precision floating-point. After the type argument comes the tensor's shape as a list of its dimensions, or just a single dimension of 2.
The `VALUES` argument tells RedisAI that the tensor's data will be given as a sequence of numeric values and in this case the numbers 2 and 3. This is useful for development purposes and creating small tensors, however for practical purposes the `AI.TENSORSET` command also supports importing data in binary format.
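For completeness, the stored values can be read back with `AI.TENSORGET` and the `VALUES` argument; the exact reply layout varies between RedisAI versions, so only the command is sketched here:

```
AI.TENSORGET tA VALUES
```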
@@ -188,7 +188,7 @@ A **Model** is a Deep Learning or Machine Learning frozen graph that was generat
Models, like any other Redis and RedisAI data structures, are identified by keys. A Model's key is created using the [`AI.MODELSET` command](commands.md#aimodelset) and requires the graph payload serialized as protobuf for input.
-In our examples, we'll use one of the graphs that RedisAI uses in its tests, namely 'graph.pb', which can be downloaded from [here](https://github.com/RedisAI/RedisAI/raw/master/test/test_data/graph.pb).
+In our examples, we'll use one of the graphs that RedisAI uses in its tests, namely 'graph.pb', which can be downloaded from [here](https://github.com/RedisAI/RedisAI/raw/master/test/test_data/graph.pb). This graph was created using TensorFlow with [this script](https://github.com/RedisAI/RedisAI/blob/master/test/test_data/tf-minimal.py).
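Once downloaded, a protobuf graph like this is loaded with `AI.MODELSET`, roughly as sketched below. The input and output node names (`a`, `b`, `c`) are assumptions based on the test graph, not stated in the changed lines, and some RedisAI versions expect a trailing `BLOB` keyword before the piped payload:

```
$ cat graph.pb | redis-cli -x AI.MODELSET mymodel TF CPU INPUTS a b OUTPUTS c
OK
```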
??? info "Downloading 'graph.pb'"
Use a web browser or the command line to download 'graph.pb':
@@ -197,6 +197,12 @@ In our examples, we'll use one of the graphs that RedisAI uses in its tests, nam