
Compile example clients other than Inception

NOTE: The Bitnami TensorFlow Serving Stack is configured to deploy the TensorFlow Inception Serving API. This image also ships other tools, such as Bazel and the TensorFlow Python library, for training models. Training operations have higher hardware requirements in terms of CPU, RAM and disk, so it is highly recommended to check the requirements for these operations and scale your server accordingly.

As an example, this section describes how to compile and test the mnist utilities:

  • Clone the TensorFlow Serving repository and check out compatible versions of TensorFlow Serving and TensorFlow:

    $ cd ~
    $ git clone https://github.com/tensorflow/serving.git
    $ cd serving/
    $ git submodule update --init
    $ git checkout 0.6.0
    $ cd tensorflow
    $ git checkout v1.2.0
    
  • In the TensorFlow submodule, execute the configure step with the values that you want to use:

    $ cd ~/serving/tensorflow
    $ ./configure
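
    By default, the configure step is interactive. TensorFlow configure scripts of roughly this era read their defaults from environment variables, so an unattended run can be sketched as below; the exact variable names depend on the TensorFlow version, so verify them against the ./configure script itself before relying on this:

    ```shell
    # Unattended configure sketch for a CPU-only build without cloud
    # filesystem support. The variable names are those used by TensorFlow
    # configure scripts around the v1.2 era; check ./configure to confirm
    # them for your checkout.
    cd ~/serving/tensorflow
    PYTHON_BIN_PATH=/opt/bitnami/python/bin/python \
    TF_NEED_CUDA=0 \
    TF_NEED_GCP=0 \
    TF_NEED_HDFS=0 \
    TF_NEED_JEMALLOC=1 \
    TF_ENABLE_XLA=0 \
    ./configure
    ```

    Any option not covered by an environment variable will still be prompted for interactively.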
    
  • Compile the client tools mnist_client and mnist_saved_model:

    $ cd ~/serving/
    $ bazel build --action_env=PYTHON_BIN_PATH=/opt/bitnami/python/bin/python \
        --compilation_mode=opt --strip=always --nobuild_runfile_links \
        //tensorflow_serving/example:mnist_client //tensorflow_serving/example:mnist_saved_model
    

    This operation may take a considerable amount of time, depending on your machine's hardware. Please be patient.
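
    Before moving on, it is worth confirming that both utilities were actually produced. The helper below is a small sketch (not part of the stack); the paths follow the bazel-bin layout used in the commands that follow:

    ```shell
    # check_mnist_binaries -- sanity check that the build above produced
    # both utilities under bazel-bin/. Run it from ~/serving/.
    check_mnist_binaries() {
      local bin status=0
      for bin in bazel-bin/tensorflow_serving/example/mnist_client \
                 bazel-bin/tensorflow_serving/example/mnist_saved_model; do
        if [ -x "$bin" ]; then
          echo "OK: $bin"
        else
          echo "MISSING: $bin (re-run the bazel build step)" >&2
          status=1
        fi
      done
      return $status
    }
    ```

    For example: $ cd ~/serving/ && check_mnist_binaries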

Once the compilation is done, the utilities are ready to be used. To test both of them, execute the following commands:

  • Export the model:

    $ bazel-bin/tensorflow_serving/example/mnist_saved_model /tmp/mnist_model
    
  • Stop the service that is already running:

    $ sudo /opt/bitnami/ctlscript.sh stop
    
  • Start the server with valid parameters for mnist:

    $ tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/ &
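
    The server needs a moment to load the exported model before it can answer requests. A small helper like the following can poll the port before the client is run; it is a bash-only sketch (it relies on bash's /dev/tcp pseudo-device) and is not part of the stack, and the 30-second default timeout is an assumption:

    ```shell
    # wait_for_port -- poll until a TCP port accepts connections, so the
    # client is not run before the model has finished loading.
    # Usage: wait_for_port HOST PORT [TIMEOUT_SECONDS]
    wait_for_port() {
      local host=$1 port=$2 timeout=${3:-30} waited=0
      # A subshell redirection to /dev/tcp fails while nothing listens.
      until (echo > "/dev/tcp/${host}/${port}") 2>/dev/null; do
        waited=$((waited + 1))
        if [ "$waited" -ge "$timeout" ]; then
          echo "timed out waiting for ${host}:${port}" >&2
          return 1
        fi
        sleep 1
      done
      echo "${host}:${port} is ready"
    }
    ```

    For example: $ wait_for_port localhost 9000 && echo "server is up"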
    
  • Use the mnist client:

    $ bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000
    
  • Once the test is done, stop the mnist server and start the service again with its default configuration:

    $ sudo pkill -f 'tensorflow_model_server.*model_name=mnist'
    $ sudo /opt/bitnami/ctlscript.sh start
    

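The whole test cycle above can be wrapped in a single function. This is a sketch only: the paths, port and model name are taken from the steps in this guide, it assumes it is run from ~/serving/ on the Bitnami stack, and the fixed 5-second wait for the server is a crude placeholder.

```shell
# run_mnist_test -- end-to-end sketch of the steps above: export the
# model, swap the default service for a mnist server, run the client,
# then restore the default service. Run from ~/serving/.
run_mnist_test() {
  local model_dir=/tmp/mnist_model
  local example_dir=bazel-bin/tensorflow_serving/example

  # Refuse to run if the compiled utilities are not present yet.
  local tool
  for tool in "$example_dir/mnist_saved_model" "$example_dir/mnist_client"; do
    if [ ! -x "$tool" ]; then
      echo "missing required tool: $tool (run the bazel build step first)" >&2
      return 1
    fi
  done

  # Export the model, stop the default service and serve mnist instead.
  "$example_dir/mnist_saved_model" "$model_dir" || return 1
  sudo /opt/bitnami/ctlscript.sh stop
  tensorflow_model_server --port=9000 --model_name=mnist \
      --model_base_path="$model_dir" &

  # Give the server a moment to load the model, then run the client.
  sleep 5
  "$example_dir/mnist_client" --num_tests=1000 --server=localhost:9000

  # Clean up: kill the mnist server and bring the default service back.
  sudo pkill -f 'tensorflow_model_server.*model_name=mnist'
  sudo /opt/bitnami/ctlscript.sh start
}
```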
For more information, please check the TensorFlow Serving tutorial.