feat: improve Dockerfile and remove vuln #658

Merged · 3 commits · Sep 11, 2023
24 changes: 21 additions & 3 deletions Dockerfile
@@ -1,4 +1,4 @@
-FROM openjdk:11-jre-slim-stretch as server
+FROM eclipse-temurin:11-jdk-jammy as server

ARG SPARK_VERSION=3.4.1
ARG HADOOP_VERSION=3
@@ -9,7 +9,7 @@ COPY server/ ./server/
WORKDIR /home/app/server/
RUN ./gradlew build -x test -PSPARK_VERSION=${SPARK_VERSION}

-FROM node:lts-alpine3.14 as frontend
+FROM node:lts-alpine3.18 as frontend

ARG SPARK_VERSION=3.4.1
ARG HADOOP_VERSION=3
@@ -23,7 +23,7 @@ RUN wget "https://downloads.apache.org/spark/spark-${SPARK_VERSION}/spark-${SPAR
WORKDIR /home/app/frontend/
RUN yarn install && yarn build

-FROM openjdk:11-jre-slim-bullseye
+FROM eclipse-temurin:11-jdk-jammy

ARG SPARK_VERSION=3.4.1
ARG HADOOP_VERSION=3
@@ -45,7 +45,25 @@ COPY --from=frontend /home/app/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION

COPY k8s/ ./k8s/

ARG spark_uid=10000
ARG spark_gid=10001
RUN groupadd -g ${spark_gid} spark && useradd spark -u ${spark_uid} -g ${spark_gid} -m -s /bin/bash
RUN mkdir -p /home/db /tmp/spark-events /tmp/staging /tmp/s3a ${SPARK_HOME}/workdir && \
    chmod -R go+rwX /tmp && \
    chmod -R go+rX /opt /home && \
    chmod g+wX ${SPARK_HOME}/workdir && \
    chown -R spark:spark ${SPARK_HOME} /home/db && \
    chmod -R go+rX ${SPARK_HOME}
RUN apt-get update && apt-get upgrade -y && \
    apt-get autoremove --purge -y curl wget && \
    apt-get install -y --no-install-recommends --allow-downgrades -y atop procps && \
    apt-get clean && rm -rf /var/lib/apt/lists /var/cache/apt/*

EXPOSE 8080
EXPOSE 25333

ENTRYPOINT ["java", "-jar", "/home/app/application.jar"]

# Specify the User that the actual main process will run as
USER ${spark_uid}
SHELL ["/bin/bash", "-c"]
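A quick way to sanity-check the non-root setup introduced in this diff (a sketch only; the image tag app:non-root-test is a placeholder, not something defined in this PR):

    # Build the image from the updated Dockerfile
    docker build -t app:non-root-test .
    # Override the ENTRYPOINT to print the default user; with the USER directive above,
    # this should report something like: uid=10000(spark) gid=10001(spark) groups=10001(spark)
    docker run --rm --entrypoint id app:non-root-test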