Replace extract-image-names.sh with JS script #518

Status: Open — wants to merge 23 commits into base: main
43 changes: 30 additions & 13 deletions .github/workflows/build.yml
@@ -27,7 +27,7 @@ jobs:
timeout-minutes: 120

outputs:
images: ${{ steps.bake_metadata.outputs.images }}
images: ${{ steps.set-output.outputs.result }}

# Make sure we fail if any command in a piped command sequence fails
defaults:
@@ -77,15 +77,32 @@ jobs:
build.json
.github/workflows/env.hcl

- name: Set output image names
id: bake_metadata
# bake-action metadata output has gotten too big, so we first write it
# to a file. See https://github.com/aiidalab/aiidalab-docker-stack/issues/491
run: |
cat << EOF > bake_metadata.json
${{ steps.build-upload.outputs.metadata }}
EOF
images=$(.github/workflows/extract-image-names.sh bake_metadata.json)
echo "images=${images}" >> "${GITHUB_OUTPUT}"
# Pretty-print for GHA logs
echo "$images" | jq
- name: Set output variables
id: set-output
uses: actions/github-script@v7
with:
# We need to produce the following JSON string from the bake action output,
# which is then used in follow-up steps to uniquely identify the built
# images by their digests, and not by their tags. See e.g. test.yml
# {
# "BASE_IMAGE": "ghcr.io/aiidalab/base@sha256:bc53c...",
# "BASE_WITH_SERVICES_IMAGE":"ghcr.io/aiidalab/base-with-services@sha256:0df1...",
# "FULL_STACK_IMAGE":"ghcr.io/aiidalab/full-stack@sha256:dd04...",
#   "LAB_IMAGE":"ghcr.io/aiidalab/lab@sha256:e8c7b3a662660ad20fef7..."
# }
script: |
// JSON.parse takes no encoding argument; the env var is already a string.
const bake_output = JSON.parse(process.env.BAKE_METADATA);
const images = {};
for (const key in bake_output) {
const image = bake_output[key];
// turn e.g. 'full-stack' to 'FULL_STACK_IMAGE' key
const image_envvar = `${key.toUpperCase().replace(/-/g, "_")}_IMAGE`;
// create full canonical path to the image using its digest, i.e.
// ghcr.io/aiidalab/base@sha256:cdad93278a...
const image_digest = `${image["image.name"]}@${image["containerimage.digest"]}`;
images[image_envvar] = image_digest;
}
console.log(images);
return images;
env:
BAKE_METADATA: ${{ steps.build-upload.outputs.metadata }}
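The step above can be exercised outside of GitHub Actions with a minimal standalone sketch. The sample metadata below is hypothetical, modeled on the shape that `docker/bake-action` emits (target names as keys, with `image.name` and `containerimage.digest` fields):

```javascript
// Sketch: turn bake-action metadata into a { <TARGET>_IMAGE: name@digest } map,
// so downstream jobs can pin images by digest instead of by mutable tag.
function extractImages(bakeOutput) {
  const images = {};
  for (const key in bakeOutput) {
    const image = bakeOutput[key];
    // e.g. 'full-stack' -> 'FULL_STACK_IMAGE'
    const imageEnvvar = `${key.toUpperCase().replace(/-/g, "_")}_IMAGE`;
    // canonical digest-pinned reference, e.g. ghcr.io/aiidalab/base@sha256:...
    images[imageEnvvar] = `${image["image.name"]}@${image["containerimage.digest"]}`;
  }
  return images;
}

// Hypothetical sample input (digests shortened for readability):
const sample = {
  "base": {
    "image.name": "ghcr.io/aiidalab/base",
    "containerimage.digest": "sha256:bc53c0",
  },
  "full-stack": {
    "image.name": "ghcr.io/aiidalab/full-stack",
    "containerimage.digest": "sha256:dd04",
  },
};
console.log(extractImages(sample));
```

In the workflow itself, `actions/github-script` serializes the returned object into `steps.set-output.outputs.result`, which is what the job's `images` output forwards.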
52 changes: 0 additions & 52 deletions .github/workflows/extract-image-names.sh

This file was deleted.

16 changes: 8 additions & 8 deletions stack/base/Dockerfile
@@ -11,7 +11,7 @@ USER root
# rsync: needed to support the new AiiDA backup command
# povray: rendering engine used in aiidalab-widgets-base
# bc: needed to compute the resources for computer setup
ENV EXTRA_APT_PACKAGES "curl povray rsync build-essential bc"
[Review comment from the PR author]: This is just unrelated cleanup to get rid of Docker deprecation warnings.

ENV EXTRA_APT_PACKAGES="curl povray rsync build-essential bc"

# For ARM64 we need to install erlang as it is not available on conda-forge
# (this is needed later as rabbitmq dependency in base-with-services image,
@@ -41,7 +41,7 @@ RUN echo "notebook==$(jupyter-notebook --version)" >> /opt/requirements.txt
RUN cat /opt/requirements.txt | xargs -I{} conda config --system --add pinned_packages {}

# Configure pip to use the same requirements file as constraints file.
ENV PIP_CONSTRAINT /opt/requirements.txt
ENV PIP_CONSTRAINT=/opt/requirements.txt
# Ensure that pip installs packages to '~/.local/lib/python3.X/site-packages/' by default
# by implicitly passing the '--user' option to 'pip install'
# Otherwise, pip would install into /opt/conda and such packages would be lost
@@ -79,12 +79,12 @@ COPY config-quick-setup.yaml .
COPY before-notebook.d/* /usr/local/bin/before-notebook.d/

# Configure AiiDA.
ENV SETUP_DEFAULT_AIIDA_PROFILE true
ENV AIIDA_PROFILE_NAME default
ENV AIIDA_USER_EMAIL aiida@localhost
ENV AIIDA_USER_FIRST_NAME Giuseppe
ENV AIIDA_USER_LAST_NAME Verdi
ENV AIIDA_USER_INSTITUTION Khedivial
ENV SETUP_DEFAULT_AIIDA_PROFILE=true
ENV AIIDA_PROFILE_NAME=default
ENV AIIDA_USER_EMAIL=aiida@localhost
ENV AIIDA_USER_FIRST_NAME=Giuseppe
ENV AIIDA_USER_LAST_NAME=Verdi
ENV AIIDA_USER_INSTITUTION=Khedivial

# Install the load-singlesshagent.sh script as described here:
# https://aiida.readthedocs.io/projects/aiida-core/en/v2.0.0/howto/ssh.html#starting-the-ssh-agent
2 changes: 1 addition & 1 deletion stack/full-stack/Dockerfile
@@ -1,5 +1,5 @@
# syntax=docker/dockerfile:1
FROM base-with-services as base
FROM base-with-services AS base

FROM lab

30 changes: 15 additions & 15 deletions stack/lab/Dockerfile
@@ -51,9 +51,9 @@ COPY --chown=${NB_UID}:${NB_GID} gears.svg ${CONDA_DIR}/share/jupyter/nbextensio
COPY before-notebook.d/* /usr/local/bin/before-notebook.d/

# Configure AiiDAlab environment.
ENV AIIDALAB_HOME /home/${NB_USER}
ENV AIIDALAB_APPS ${AIIDALAB_HOME}/apps
ENV AIIDALAB_DEFAULT_GIT_BRANCH master
ENV AIIDALAB_HOME=/home/${NB_USER}
ENV AIIDALAB_APPS=${AIIDALAB_HOME}/apps
ENV AIIDALAB_DEFAULT_GIT_BRANCH=master

# Specify which apps to install in addition to the home app. The
# AIIDALAB_DEFAULT_APPS variable should be a whitespace-delimited variable
@@ -66,10 +66,10 @@ ENV AIIDALAB_DEFAULT_GIT_BRANCH master
# Please note that multiple entries must be whitespace delimited.
# Please see `aiidalab install --help` for more information.
# ENV AIIDALAB_DEFAULT_APPS "aiidalab-widgets-base~=1.0"
ENV AIIDALAB_DEFAULT_APPS ""
ENV AIIDALAB_DEFAULT_APPS=""

# Specify default factory reset (not set):
ENV AIIDALAB_FACTORY_RESET ""
ENV AIIDALAB_FACTORY_RESET=""

USER ${NB_USER}

@@ -87,16 +87,16 @@ RUN mkdir -p /home/${NB_USER}/apps
# but that may come with other problems for people with flaky internet connections.
# Instead, here we configure Jupyter to kill all kernels that have been alive for
# more than 12 hours. We also close all inactive terminals after 10 minutes.
ENV NOTEBOOK_ARGS \
"--NotebookApp.default_url='/apps/apps/home/start.ipynb'" \
"--ContentsManager.allow_hidden=True" \
"--MappingKernelManager.buffer_offline_messages=True" \
"--MappingKernelManager.cull_busy=True" \
"--MappingKernelManager.cull_connected=True" \
"--MappingKernelManager.cull_idle_timeout=64800" \
"--MappingKernelManager.cull_interval=300" \
"--TerminalManager.cull_inactive_timeout=3600" \
"--TerminalManager.cull_interval=300"
ENV NOTEBOOK_ARGS=\
"--NotebookApp.default_url=/apps/apps/home/start.ipynb "\
"--ContentsManager.allow_hidden=True "\
"--MappingKernelManager.buffer_offline_messages=True "\
"--MappingKernelManager.cull_busy=True "\
"--MappingKernelManager.cull_connected=True "\
"--MappingKernelManager.cull_idle_timeout=64800 "\
"--MappingKernelManager.cull_interval=300 "\
"--TerminalManager.cull_inactive_timeout=3600 "\
"--TerminalManager.cull_interval=300"
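The rewritten `ENV NOTEBOOK_ARGS=...` form concatenates the adjacent quoted fragments verbatim, so each fragment must carry its own trailing space to keep the flags separated in the final value. A minimal sketch of that concatenation (plain string manipulation, no Docker involved; the fragment list is abbreviated):

```javascript
// Each fragment ends with a space except the last one, mirroring the
// Dockerfile above; joining with no separator reproduces Docker's behavior
// of concatenating adjacent quoted tokens in ENV key=value form.
const fragments = [
  "--NotebookApp.default_url=/apps/apps/home/start.ipynb ",
  "--ContentsManager.allow_hidden=True ",
  "--TerminalManager.cull_interval=300",
];
const notebookArgs = fragments.join("");
// Splitting on whitespace recovers one well-formed flag per entry.
console.log(notebookArgs.split(/\s+/));
```

Had the trailing spaces been dropped, adjacent flags would fuse into a single unparseable token, which is why the spaces moved inside the quotes in this diff.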

# Set up the logo of notebook interface
ARG PYTHON_MINOR_VERSION