struct2tensor is a library for parsing structured data inside of TensorFlow. In particular, it makes it easy to manipulate structured data as part of a TensorFlow model graph, e.g., slicing, flattening, and copying substructures.

The notebook in `examples/prensor_playground.ipynb` provides a few examples of struct2tensor in action and an introduction to the main concepts. You can run the notebook in your browser through Google's Colab environment, or download the file to run it in your own Jupyter environment.
There are two main use cases of this repo:
- To create a PIP package. The PIP package contains plug-ins (OpKernels) for an existing TensorFlow installation.
- To statically link with tensorflow-serving.
As these processes are independent, one can follow either set of directions below.
From a virtual environment, run:

```bash
pip install struct2tensor
```

struct2tensor also hosts nightly packages at https://pypi-nightly.tensorflow.org on Google Cloud. To install the latest nightly package, please use the following command:

```bash
pip install --extra-index-url https://pypi-nightly.tensorflow.org/simple struct2tensor
```

This will also install the nightly packages for the major dependencies of struct2tensor, such as TensorFlow Metadata (TFMD).
The struct2tensor PIP package is useful for creating models. It works with TensorFlow 2.x.
In order to unify the process, we recommend compiling struct2tensor inside a docker container.
Go to your home directory.

Download the source code:

```bash
git clone https://github.com/google/struct2tensor.git
cd ~/struct2tensor
```

Install docker-compose.
Use it to build a pip wheel for Python 3.7 with TensorFlow 2:

```bash
docker-compose build --build-arg PYTHON_VERSION=3.7 manylinux2014
docker-compose run -e TF_VERSION=RELEASED_TF_2 manylinux2014
```

This will create a manylinux package in the `~/struct2tensor/dist` directory.
In order to construct a static library for tensorflow-serving, run:

```bash
bazel build -c opt struct2tensor:struct2tensor_kernels_and_ops
```

This can also be linked into another library.
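As an illustration, a downstream Bazel target could depend on that library directly. The sketch below is hypothetical: the binary name and the extra serving dependency are placeholders, not targets defined in this repo.

```
cc_binary(
    name = "my_model_server",  # hypothetical target name
    deps = [
        # The static library built above.
        "//struct2tensor:struct2tensor_kernels_and_ops",
        # ... plus your tensorflow-serving dependencies.
    ],
)
```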
TensorFlow Serving docker image
struct2tensor needs a couple of custom TensorFlow ops to function. If you train
a model with struct2tensor and want to serve it with TensorFlow Serving, the
TensorFlow Serving binary needs to link with those custom ops. We have a
pre-built docker image that contains such a binary. The Dockerfile is
available at tools/tf_serving_docker/Dockerfile. The image is available at
gcr.io/tfx-oss-public/s2t_tf_serving.
Please see the Dockerfile for details. In brief, the image exposes port
8500 as the gRPC endpoint and port 8501 as the REST endpoint. You can set
two environment variables, MODEL_BASE_PATH and MODEL_NAME, to point it to
your model (either mount the model into the container, or put it on GCS).
It will look for a saved model at
${MODEL_BASE_PATH}/${MODEL_NAME}/${VERSION_NUMBER}, where VERSION_NUMBER
is an integer.
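To make the path convention concrete, here is a minimal sketch. The base path, model name, and version below are made-up example values, not defaults of the image:

```shell
# Hypothetical example values; substitute your own.
MODEL_BASE_PATH=/models
MODEL_NAME=my_model
VERSION_NUMBER=1

# The serving image looks for a SavedModel at this location:
echo "${MODEL_BASE_PATH}/${MODEL_NAME}/${VERSION_NUMBER}"
# -> /models/my_model/1

# A corresponding invocation might look like this (not run here;
# the host model directory is a placeholder):
# docker run -p 8500:8500 -p 8501:8501 \
#   -e MODEL_BASE_PATH=/models -e MODEL_NAME=my_model \
#   -v /path/to/models:/models \
#   gcr.io/tfx-oss-public/s2t_tf_serving
```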
Each struct2tensor release is compatible with a specific TensorFlow release:

| struct2tensor | tensorflow |
|---|---|
| 0.40.0 | 2.9.0 |
| 0.39.0 | 2.8.0 |
| 0.38.0 | 2.8.0 |
| 0.37.0 | 2.7.0 |
| 0.36.0 | 2.7.0 |
| 0.35.0 | 2.6.0 |
| 0.34.0 | 2.6.0 |
| 0.33.0 | 2.5.0 |
| 0.32.0 | 2.5.0 |
| 0.31.0 | 2.5.0 |
| 0.30.0 | 2.4.0 |
| 0.29.0 | 2.4.0 |
| 0.28.0 | 2.4.0 |
| 0.27.0 | 2.4.0 |
| 0.26.0 | 2.3.0 |
| 0.25.0 | 2.3.0 |
| 0.24.0 | 2.3.0 |
| 0.23.0 | 2.3.0 |
| 0.22.0 | 2.2.0 |
| 0.21.1 | 2.1.0 |
| 0.21.0 | 2.1.0 |
| 0.0.1.dev* | 1.15 |
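For use in build or release scripts, the pairing above can be captured as a simple lookup. This is just a sketch; the versions are transcribed from the table (the `0.0.1.dev*` row is omitted since it is a wildcard, not a single release):

```python
# Compatible TensorFlow release for each struct2tensor release,
# transcribed from the compatibility table above.
S2T_TO_TF = {
    "0.40.0": "2.9.0",
    "0.39.0": "2.8.0",
    "0.38.0": "2.8.0",
    "0.37.0": "2.7.0",
    "0.36.0": "2.7.0",
    "0.35.0": "2.6.0",
    "0.34.0": "2.6.0",
    "0.33.0": "2.5.0",
    "0.32.0": "2.5.0",
    "0.31.0": "2.5.0",
    "0.30.0": "2.4.0",
    "0.29.0": "2.4.0",
    "0.28.0": "2.4.0",
    "0.27.0": "2.4.0",
    "0.26.0": "2.3.0",
    "0.25.0": "2.3.0",
    "0.24.0": "2.3.0",
    "0.23.0": "2.3.0",
    "0.22.0": "2.2.0",
    "0.21.1": "2.1.0",
    "0.21.0": "2.1.0",
}

def compatible_tf(s2t_version: str) -> str:
    """Return the TensorFlow version matching a struct2tensor release."""
    return S2T_TO_TF[s2t_version]

print(compatible_tf("0.40.0"))  # -> 2.9.0
```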