Customize the Base Container Image
HPE Machine Learning Inferencing Software provides default images to execute `bentoml` and `openllm` models from `openllm://` or `s3://`, but you can provide a different base image to add additional libraries or change things such as dependency versions.
You can provide different default images by:
- Updating the Helm chart `defaultImages` values at `defaultImages.bentoml` and `defaultImages.openllm` (see the example below).
- Specifying a value for the `image` field when creating a Packaged Model.
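For the Helm route, the sketch below is a minimal, untested example of overriding both values at upgrade time; the release name (`mlis`), chart reference (`hpe-mlis/mlis`), and image references are placeholders for illustration, not values taken from this documentation.

```sh
# Hypothetical override of the default base images via Helm values.
# Only defaultImages.bentoml and defaultImages.openllm are documented keys;
# the release name, chart reference, and image tags below are placeholders.
helm upgrade --install mlis hpe-mlis/mlis \
  --set defaultImages.bentoml=registry.example.com/custom-bentoml:latest \
  --set defaultImages.openllm=registry.example.com/custom-openllm:latest
```

You can equally set these keys in a values file passed with `-f` if you prefer to keep overrides under version control.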
Default Images

BentoML
The default BentoML image is based on Python 3.9 and runs:

RUN pip install bentoml[aws]==1.1.11
The container must provide the `bentoml` command on the PATH and allow it to be executed via the container arguments. You may use the default BentoML container as a base image, or provide your own. HPE Machine Learning Inferencing Software injects the necessary scripts to download the model and launch the `bentoml` command pointing to the downloaded model.
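As an illustration of providing your own image, the sketch below starts from a plain Python 3.9 base, installs the same BentoML version as the default image, and adds one extra library; the `python:3.9` base and the `scikit-learn` pin are illustrative assumptions, not required choices.

```dockerfile
# Hypothetical custom BentoML base image.
# python:3.9 and the scikit-learn pin are illustrative, not prescribed.
FROM python:3.9

# Same install as the default image, so the bentoml command is on the PATH.
RUN pip install bentoml[aws]==1.1.11

# Additional dependency your model code needs.
RUN pip install scikit-learn==1.3.2
```

Build and push the image to a registry your cluster can reach, then reference it through the Helm values or the Packaged Model `image` field described above.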
OpenLLM
The default OpenLLM image is based on Python 3.11 and runs:

RUN apt-get install -y --no-install-recommends \
        build-essential \
        ca-certificates \
        ccache \
        curl \
        bc \
        libssl-dev ca-certificates make \
        git python3-pip
RUN pip3 install -v --no-cache-dir "vllm==0.2.7"
RUN pip3 install openllm==0.4.44
The container must provide the `openllm` command on the PATH and allow it to be executed via the container arguments. You may use the default OpenLLM container as a base image, or provide your own. HPE Machine Learning Inferencing Software injects the necessary scripts to download the model and launch the `openllm` command pointing to the downloaded model.
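For a custom OpenLLM image, the sketch below mirrors the default setup and adds one extra library; the `python:3.11` base, the `apt-get update` and cleanup steps, and the `sentencepiece` pin are illustrative assumptions, not values taken from this documentation.

```dockerfile
# Hypothetical custom OpenLLM base image.
# python:3.11, the apt-get update/cleanup, and the sentencepiece pin are illustrative.
FROM python:3.11

RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential \
        ca-certificates \
        ccache \
        curl \
        bc \
        libssl-dev make \
        git python3-pip \
    && rm -rf /var/lib/apt/lists/*

# Same versions as the default image, so the openllm command stays on the PATH.
RUN pip3 install -v --no-cache-dir "vllm==0.2.7"
RUN pip3 install openllm==0.4.44

# Additional dependency your model code needs.
RUN pip3 install sentencepiece==0.1.99
```

As with the BentoML image, build and push it to a reachable registry and reference it via the Helm values or the Packaged Model `image` field.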