Using a Custom Worker Dockerfile for a Prediction Microservice
For any model that needs specific libraries or features that cannot be installed through pip, or that otherwise requires a custom Dockerfile, follow these steps:
1. In the `combinedtechstack/.env` file, add the variable `PREDICTION_YOUR_MODEL=Your_model_folder_name`, where `Your_model_folder_name` is the folder name in `combinedtechstack/prediction/models` containing the model and `PREDICTION_YOUR_MODEL` is the name of your model that will be referenced later (such as `PREDICTION_SCENE_DETECT`).
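
   As a concrete illustration, the entry might look like the line below. The variable name `PREDICTION_SCENE_DETECT` comes from the example above, but the folder name `scene_detect` is only a hypothetical placeholder for your own model folder:

   ```
   # Hypothetical example: the model lives in combinedtechstack/prediction/models/scene_detect
   PREDICTION_SCENE_DETECT=scene_detect
   ```
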
2. Within `combinedtechstack/prediction/models/your_model_here`, add your custom `Dockerfile`; a rough sketch of what such a Dockerfile might contain is shown below.
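
   The contents of this Dockerfile depend entirely on your model. The sketch below is not taken from this repository: the base image, the system package (`ffmpeg`), and the `requirements.txt` path are assumptions used only to illustrate installing a dependency that pip alone cannot provide. Only the `MODEL_NAME` build argument and the repository-root build context mirror the docker-compose entry added in Step 4. For what a real Dockerfile in this project contains (including any shared worker code it copies in), see the Speech Rec NER example linked at the bottom of this page.

   ```dockerfile
   # Minimal sketch of a custom worker Dockerfile; image, package, and paths are assumptions.
   FROM python:3.8-slim

   # Install a system-level dependency that pip alone cannot provide
   # (ffmpeg is only an illustration of such a library).
   RUN apt-get update \
       && apt-get install -y --no-install-recommends ffmpeg \
       && rm -rf /var/lib/apt/lists/*

   # MODEL_NAME is supplied through the build args of the docker-compose worker entry.
   ARG MODEL_NAME

   WORKDIR /app

   # The build context in docker-compose is the repository root, so COPY paths are
   # relative to it. Assumes the model folder ships its own requirements.txt.
   COPY prediction/models/${MODEL_NAME}/requirements.txt /app/requirements.txt
   RUN pip install --no-cache-dir -r /app/requirements.txt

   # docker-compose supplies "command: python3 worker.py" at run time and bind-mounts
   # the model folder to /app/model, so the image only needs its dependencies baked in.
   ```
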
3. Navigate to the `combinedtechstack/docker-compose.yml` file and find the Prediction Microservice Workers section within it.
4. Create a new worker in the `docker-compose.yml`:

   ```yaml
   worker_your_model: # In this line, change the "your_model" to a better name
     container_name: ${PREDICTION_YOUR_MODEL}_worker
     command: python3 worker.py
     build:
       context: .
       dockerfile: prediction/models/${PREDICTION_YOUR_MODEL}/Dockerfile
       args:
         - MODEL_NAME=${PREDICTION_YOUR_MODEL}
     volumes:
       - ./prediction/models/${PREDICTION_YOUR_MODEL}:/app/model
       - prediction_images:/app/images
     environment:
       - GUNICORN_CMD_ARGS=--reload
       - API_KEY=${API_KEY_PREDICTION}
       - SERVER_SOCKET=${SERVER_SOCKET}
     depends_on:
       - redis
       - server
   ```

Here `${PREDICTION_YOUR_MODEL}` will be automatically filled in by Docker with the value of the variable you added in Step 1, so replace `PREDICTION_YOUR_MODEL` throughout with the variable name you actually chose. Notably, `dockerfile: prediction/models/${PREDICTION_YOUR_MODEL}/Dockerfile` points to your custom Dockerfile, so the worker will now be built from it.
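
As an illustration of that substitution, if the hypothetical entry from Step 1 (`PREDICTION_SCENE_DETECT=scene_detect`) were used and the worker entry rewritten to reference `${PREDICTION_SCENE_DETECT}`, Docker Compose would resolve the build section to roughly:

```yaml
build:
  context: .
  dockerfile: prediction/models/scene_detect/Dockerfile  # your custom Dockerfile
  args:
    - MODEL_NAME=scene_detect
```
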
For an example of a working custom Dockerfile in a model folder, see the Speech Rec NER Microservice Custom Dockerfile.
(Documentation by Christopher Doan)