Compile example clients other than Inception
NOTE: The Bitnami package for TensorFlow Serving is configured to deploy the TensorFlow Inception Serving API. The image also ships additional tools such as Bazel and the TensorFlow Python library for training models. Training operations have higher hardware requirements in terms of CPU, RAM and disk, so it is highly recommended to check the requirements for these operations and scale your server accordingly.
As an example, this section describes how to compile and test the mnist utilities:
- Clone the TensorFlow Serving repository and check out the compatible versions of TensorFlow Serving and TensorFlow:
$ cd ~
$ sudo apt-get install -y git
$ git clone https://github.com/tensorflow/serving.git
$ cd serving/
$ git checkout 1.11.1
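Optionally, confirm that the expected tag is checked out; git describe should print the tag name:
$ git describe --tags
1.11.1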
- Compile the client tools mnist_client and mnist_saved_model:
$ bazel build --action_env=PYTHON_BIN_PATH=/opt/bitnami/python/bin/python \
    --compilation_mode=opt --strip=always --nobuild_runfile_links \
    //tensorflow_serving/example:mnist_client //tensorflow_serving/example:mnist_saved_model
This operation may take a considerable amount of time depending on the hardware of your machine, so please be patient.
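Once the build finishes, you can check that both client binaries were generated under the bazel-bin/ directory before moving on:
$ ls bazel-bin/tensorflow_serving/example/mnist_client \
    bazel-bin/tensorflow_serving/example/mnist_saved_model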
The utilities are now ready to be used. To test both of them, execute the following commands:
- Export the model:
$ bazel-bin/tensorflow_serving/example/mnist_saved_model /tmp/mnist_model
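If the export succeeds, the model is written to a version subdirectory (1 by default) containing the SavedModel files. The layout should look similar to this:
$ find /tmp/mnist_model
/tmp/mnist_model
/tmp/mnist_model/1
/tmp/mnist_model/1/saved_model.pb
/tmp/mnist_model/1/variables
/tmp/mnist_model/1/variables/variables.data-00000-of-00001
/tmp/mnist_model/1/variables/variables.index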
- Stop the already running service:
$ sudo /opt/bitnami/ctlscript.sh stop
- Start the server with valid parameters for mnist:
$ tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/ &
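Before running the client, you can check that the server is listening on port 9000. This example uses the ss tool from the iproute2 package; the exact output may differ:
$ ss -ltn | grep 9000
LISTEN   0   128   0.0.0.0:9000   0.0.0.0:*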
- Use the mnist client:
$ bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000
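If everything is working, the client downloads the MNIST test data, sends 1000 requests to the server and prints the inference error rate at the end. The output should be similar to the following (the exact error rate will vary):
Extracting /tmp/train-images-idx3-ubyte.gz
Extracting /tmp/train-labels-idx1-ubyte.gz
Extracting /tmp/t10k-images-idx3-ubyte.gz
Extracting /tmp/t10k-labels-idx1-ubyte.gz
........................
Inference error rate: 10.4%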
- Once the test is done, stop the mnist server and restart the service with the default values:
$ sudo pkill -f 'tensorflow_model_server.*model_name=mnist'
$ sudo /opt/bitnami/ctlscript.sh start
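You can confirm that the default service is running again by checking its status:
$ sudo /opt/bitnami/ctlscript.sh status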
For more information, please check the TensorFlow Serving tutorial.