UPSTREAM: <carry>: Move Data Science Pipelines Dockerfiles to a different path #119
base: master
@@ -1,4 +1,4 @@
# Copyright 2021-2024 The Kubeflow Authors
# Copyright 2021-2022 The Kubeflow Authors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.

@@ -12,47 +12,72 @@
# See the License for the specific language governing permissions and
# limitations under the License.

# Build arguments
ARG SOURCE_CODE=.
# 1. Build api server application
FROM golang:1.21.7-bookworm as builder
RUN apt-get update && apt-get install -y cmake clang musl-dev openssl
WORKDIR /go/src/github.com/kubeflow/pipelines
COPY . .
Review comment: This path needs to be pointing to where the odh/dsp APIServer code is, and not kubeflow/pipelines, I believe. That might be causing this CI failure: https://github.com/opendatahub-io/data-science-pipelines/actions/runs/11960604493/job/33344984082?pr=119
It's looking for an

Review comment: From https://github.com/opendatahub-io/data-science-pipelines/pull/119/files#r1852961305:
That said, it seems like we should leave the original Dockerfiles identical to upstream. Does my point of view make sense?
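A minimal sketch of the change the first comment above seems to ask for; the odh/dsp import path below is an assumption for illustration, not taken from the PR:

```dockerfile
# Hypothetical fix: build from the odh/dsp source tree instead of the
# upstream kubeflow/pipelines import path (exact path unverified).
WORKDIR /go/src/github.com/opendatahub-io/data-science-pipelines
COPY . .
```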
RUN GO111MODULE=on go build -o /bin/apiserver backend/src/apiserver/*.go
# Check licenses and comply with license terms.
RUN ./hack/install-go-licenses.sh
# First, make sure there's no forbidden license.
RUN go-licenses check ./backend/src/apiserver
RUN go-licenses csv ./backend/src/apiserver > /tmp/licenses.csv && \
    diff /tmp/licenses.csv backend/third_party_licenses/apiserver.csv && \
    go-licenses save ./backend/src/apiserver --save_path /tmp/NOTICES

# 2. Compile preloaded pipeline samples
FROM python:3.8 as compiler
RUN apt-get update -y && apt-get install --no-install-recommends -y -q default-jdk python3-setuptools python3-dev jq
RUN wget https://bootstrap.pypa.io/get-pip.py && python3 get-pip.py
COPY backend/requirements.txt .
RUN python3 -m pip install -r requirements.txt --no-cache-dir

FROM registry.access.redhat.com/ubi8/go-toolset:1.21 as builder
DharmitD marked this conversation as resolved.
USER root
# Downloading Argo CLI so that the samples are validated
ENV ARGO_VERSION v3.4.17
RUN curl -sLO https://github.com/argoproj/argo-workflows/releases/download/${ARGO_VERSION}/argo-linux-amd64.gz && \
    gunzip argo-linux-amd64.gz && \
    chmod +x argo-linux-amd64 && \
    mv ./argo-linux-amd64 /usr/local/bin/argo

RUN dnf install -y cmake clang openssl

COPY ${SOURCE_CODE}/go.mod ./
COPY ${SOURCE_CODE}/go.sum ./
WORKDIR /
COPY ./samples /samples
COPY backend/src/apiserver/config/sample_config.json /samples/

RUN GO111MODULE=on go mod download
# Compiling the preloaded samples.
# The default image is replaced with the GCR-hosted python image.
RUN set -e; \
    < /samples/sample_config.json jq .[].file --raw-output | while read pipeline_yaml; do \
      pipeline_py="${pipeline_yaml%.yaml}"; \
      python3 "$pipeline_py"; \
    done
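For readers puzzling over the loop above: each `.file` entry read from `sample_config.json` ends in `.yaml`, and `${pipeline_yaml%.yaml}` strips that suffix before the script is invoked. A standalone sketch of just the suffix handling, with made-up sample paths:

```shell
# Mimic the suffix handling from the RUN step above (no jq needed here;
# the sample paths are invented for illustration).
for pipeline_yaml in samples/core/condition.yaml samples/core/loop.yaml; do
  pipeline_py="${pipeline_yaml%.yaml}"   # strip the trailing .yaml
  echo "$pipeline_py"
done
# -> samples/core/condition
# -> samples/core/loop
```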
# Copy the source
COPY ${SOURCE_CODE}/ ./
# 3. Start api web server
FROM debian:stable

RUN GO111MODULE=on go build -o /bin/apiserver ./backend/src/apiserver/ && \
    dnf clean all

FROM registry.access.redhat.com/ubi8/ubi-minimal:latest
ARG COMMIT_SHA=unknown
ENV COMMIT_SHA=${COMMIT_SHA}
ARG TAG_NAME=unknown
ENV TAG_NAME=${TAG_NAME}
ENV LOG_LEVEL info

WORKDIR /bin

COPY --from=builder /opt/app-root/src/backend/src/apiserver/config/ /config
COPY backend/src/apiserver/config/ /config
COPY --from=builder /bin/apiserver /bin/apiserver

# Copy licenses and notices.
COPY --from=builder /tmp/licenses.csv /third_party/licenses.csv
COPY --from=builder /tmp/NOTICES /third_party/NOTICES
COPY --from=compiler /samples/ /samples/
RUN chmod +x /bin/apiserver

USER root

# Adding CA certificate so API server can download pipeline through URL and wget is used for liveness/readiness probe command
RUN microdnf install -y ca-certificates wget
RUN apt-get update && apt-get install -y ca-certificates wget

USER 1001
# Pin sample doc links to the commit that built the backend image
RUN sed -E "s#/(blob|tree)/master/#/\1/${COMMIT_SHA}/#g" -i /config/sample_config.json && \
    sed -E "s#%252Fmaster#%252F${COMMIT_SHA}#g" -i /config/sample_config.json

# Expose apiserver port
EXPOSE 8888
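To see what the first link-pinning `sed` above does, here is a standalone sketch run against a single made-up URL (the `abc123` commit SHA and the sample path are invented for illustration):

```shell
# Rewrite /blob/master/ (or /tree/master/) links to the pinned commit,
# the same substitution the image-build step applies to sample_config.json.
COMMIT_SHA=abc123
echo 'https://github.com/kubeflow/pipelines/blob/master/samples/core/exit_handler/exit_handler.py' \
  | sed -E "s#/(blob|tree)/master/#/\1/${COMMIT_SHA}/#g"
# -> https://github.com/kubeflow/pipelines/blob/abc123/samples/core/exit_handler/exit_handler.py
```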
Review comment: Why are you changing the original Dockerfiles? Is that to reflect the upstream ones?
Well, I think https://github.com/opendatahub-io/data-science-pipelines/pull/119/files#r1852961305 answers this question. Please let me know otherwise.