
Train a model

NOTE: The Bitnami TensorFlow Serving Stack is configured to deploy the TensorFlow Inception Serving API. The image also ships additional tools, such as Bazel and the TensorFlow Python library, for training models. Training operations are demanding in terms of CPU, RAM, and disk. It is highly recommended to review the requirements for these operations and scale your server accordingly.

The imagenet_train and imagenet_eval utilities are already compiled in our stack. You can find them at /opt/bitnami/tensorflow-serving/bin.
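As a sketch of how these utilities are typically invoked, the commands below launch a training run and an evaluation pass. The directory locations under /tmp and the flag values (batch size, step count) are example assumptions, not part of the stack's defaults; adjust them to your dataset location and hardware.

```shell
# Location of the precompiled utilities in the Bitnami stack.
BIN=/opt/bitnami/tensorflow-serving/bin

# Train the Inception model on pre-built ImageNet TFRecords.
# --data_dir must point at your prepared dataset (example path shown).
"$BIN/imagenet_train" \
  --train_dir=/tmp/imagenet_train \
  --data_dir=/tmp/imagenet-data \
  --batch_size=32 \
  --max_steps=1000

# Evaluate the checkpoints produced by the training run above.
"$BIN/imagenet_eval" \
  --eval_dir=/tmp/imagenet_eval \
  --data_dir=/tmp/imagenet-data \
  --checkpoint_dir=/tmp/imagenet_train
```

Training and evaluation are usually run as separate processes so that evaluation can pick up new checkpoints as they are written, without interrupting training.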

For more details, please refer to the Inception model training guide.