quicktest update & merge CerebNet.utils.metrics into FastSurferCNN.utils.metrics #590

Draft · wants to merge 8 commits into base: dev
1 change: 1 addition & 0 deletions .github/actions/build-docker/action.yml
Original file line number Diff line number Diff line change
@@ -3,6 +3,7 @@
# The build artifact includes the image in docker-image.tar and the name of the image in image_name.env

name: 'Build a FastSurfer docker image'
author: 'David Kuegler'
description: 'Build a FastSurfer image'

inputs:
1 change: 1 addition & 0 deletions .github/actions/load-docker/action.yml
@@ -1,4 +1,5 @@
name: 'Load a docker image'
author: 'David Kuegler'
description: 'Load a docker image from an artifact'

inputs:
1 change: 1 addition & 0 deletions .github/actions/load-processed/action.yml
@@ -1,4 +1,5 @@
name: 'Load processed data'
author: 'David Kuegler'
description: 'Load processed data from an artifact'

inputs:
49 changes: 41 additions & 8 deletions .github/actions/run-fastsurfer/action.yml
@@ -4,6 +4,7 @@
# The build artifact includes the image in docker-image.tar and the name of the image in image_name.env

name: 'Run Fastsurfer'
author: 'David Kuegler'
description: 'Runs FastSurfer'

inputs:
@@ -30,21 +31,24 @@ inputs:
license:
required: true
description: "The FreeSurfer license"
cleanup:
type: string
default: 'false'
description: "Whether all docker images and containers should be cleaned"

runs:
using: "composite"
steps:
- name: Load the docker image
id: load-docker
uses: ./.github/actions/load-docker
with:
docker-image: ${{ inputs.docker-image }}
- name: Set up FreeSurfer License and download T1 image
# This is a fail-fast step and thus should be run first
shell: bash
id: prep
continue-on-error: false
env:
IMAGE_HREF: ${{ inputs.image-href }}
EXTRA_ARGS: ${{ inputs.extra-args }}
run: |
# Preparation code
echo "::group::Check arguments and prepare directories"
DATA_DIR=$(dirname $(pwd))/data
echo "DATA_DIR=$DATA_DIR" >> $GITHUB_ENV # DATA_DIR is used in the next 2 steps as well
@@ -58,15 +62,32 @@ runs:
echo "Could not find subject folder for ${{ inputs.subject-id }}!"
exit 1
fi
# Check if the docker needs to be loaded (fail if not)
if [[ -z "$(docker images -q "${{ inputs.docker-image }}")" ]] ; then di="0" ; else di="1" ; fi
echo "IMAGE_LOADED=$di" >> $GITHUB_OUTPUT
echo "::endgroup::"
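The image-presence probe written to `IMAGE_LOADED` above can be exercised outside the workflow; a minimal bash sketch of the same check (the function name and the image names used below are illustrative, not part of the PR):

```shell
#!/usr/bin/env bash
# Return "1" if a docker image is already available locally, else "0",
# mirroring the IMAGE_LOADED output written by the prep step.
image_loaded() {
  local image="$1"
  if [[ -z "$(docker images -q "$image" 2>/dev/null)" ]]; then
    echo "0"
  else
    echo "1"
  fi
}
```

The later steps can then branch on this value instead of unconditionally re-downloading the image artifact.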
- name: Load the docker image
if: steps.prep.outputs.IMAGE_LOADED == 0
id: load-docker
uses: ./.github/actions/load-docker
with:
docker-image: ${{ inputs.docker-image }}
- name: Reuse loaded image
if: steps.prep.outputs.IMAGE_LOADED != 0
id: load-docker
shell: bash
run: |
# populate image-name
echo "image-name=${{ inputs.docker-image }}" >> $GITHUB_OUTPUT
- name: FastSurfer Segmentation pipeline
shell: bash
if: ${{ ! contains( inputs.extra-args, '--surf_only' ) }}
env:
IMAGE_NAME: ${{ steps.load-docker.outputs.image-name }}
run: |
# Segmentation call
echo "::group::FastSurfer Segmentation"
docker run --rm -t -v "$DATA_DIR:/data" -u $(id -u):$(id -g) --env TQDM_DISABLE=1 "$IMAGE_NAME" \
docker run -t -v "$DATA_DIR:/data" -u $(id -u):$(id -g) --env TQDM_DISABLE=1 --name seg_container "$IMAGE_NAME" \
--fs_license /data/.fs_license --t1 /data/T1${{ inputs.file-extension }} --sid "${{ inputs.subject-id }}" \
--sd /data --threads $(nproc) --seg_only ${{ inputs.extra-args }}
echo "::endgroup::"
@@ -76,15 +97,27 @@ runs:
env:
IMAGE_NAME: ${{ steps.load-docker.outputs.image-name }}
run: |
# Surface pipeline call
echo "::group::FastSurfer Surface reconstruction"
docker run --rm -t -v "$DATA_DIR:/data" -u $(id -u):$(id -g) --env TQDM_DISABLE=1 "$IMAGE_NAME" \
docker run -t -v "$DATA_DIR:/data" -u $(id -u):$(id -g) --env TQDM_DISABLE=1 --name surf_container "$IMAGE_NAME" \
--fs_license /data/.fs_license --t1 /data/T1${{ inputs.file-extension }} --sid "${{ inputs.subject-id }}" \
--sd /data --threads $(nproc) --surf_only ${{ inputs.extra-args }}
echo "::endgroup::"
- name: Docker image and container cleanup
if: inputs.cleanup == 'true'
shell: bash
run: |
# Cleanup code
echo "::group::Docker image and container cleanup"
if [[ "${{ steps.prep.outputs.IMAGE_LOADED }}" == 0 ]] ; then docker image rm "${{ steps.load-docker.outputs.image-name }}" ; fi
if [[ "${{ contains( inputs.extra-args, '--surf_only' ) }}" == "false" ]] ; then docker container rm seg_container ; fi
if [[ "${{ contains( inputs.extra-args, '--seg_only' ) }}" == "false" ]] ; then docker container rm surf_container ; fi
echo "::endgroup::"
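The cleanup gates rely on `contains()` interpolating as the literal strings `true`/`false`; the same decision can be sketched in plain bash (the function and argument names are illustrative):

```shell
# Decide whether a pipeline container should be removed: the container only
# exists if its stage ran, i.e. if the skip flag is absent from extra-args.
should_remove_container() {
  local extra_args="$1" skip_flag="$2"
  if [[ "$extra_args" != *"$skip_flag"* ]]; then
    echo "remove"
  else
    echo "keep"
  fi
}
```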
- name: Create archive of Processed subject-data
shell: bash
run: |
echo "::group::Save ${{ inputs.subject-id }} as artifact"
# Archive data code
echo "::group::Save ${{ inputs.subject-id }} as an artifact"
tar cfz "$DATA_DIR/${{ inputs.subject-id }}.tar.gz" -C "$DATA_DIR" "${{ inputs.subject-id }}"
- name: Save processed data
uses: actions/upload-artifact@v4
18 changes: 16 additions & 2 deletions .github/actions/run-tests/action.yml
@@ -4,6 +4,7 @@
# The build artifact includes the image in docker-image.tar and the name of the image in image_name.env

name: 'Run Fastsurfer'
author: 'David Kuegler'
description: 'Runs FastSurfer'

inputs:
@@ -23,6 +24,10 @@ inputs:
required: true
type: string
description: "The url to the reference file to download for comparison"
junit-file:
default: /tmp/quicktest.junit.xml
type: string
description: "The path where the JUnit-XML file should be saved"

runs:
using: "composite"
@@ -67,6 +72,15 @@ runs:
shell: bash
run: |
echo "::group::Run tests"
python -m pytest test/quick_test
flags=("--junit-xml=${{ inputs.junit-file }}")
if [[ "$ACTION_RUNNER_DEBUG" == "true" ]]
then
flags+=(-vv --log-cli-level=DEBUG)
fi
python -m pytest "${flags[@]}" test/quicktest
echo "::endgroup::"
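Building up optional pytest flags in a bash array, as the test step above does, keeps quoting intact when the array is expanded; a standalone sketch of that pattern (the helper function name is assumed):

```shell
# Assemble pytest flags: always emit the JUnit path, add verbose logging
# only in debug mode. One flag per output line for easy consumption.
build_pytest_flags() {
  local junit_file="$1" debug="${2:-false}"
  local flags=("--junit-xml=${junit_file}")
  if [[ "$debug" == "true" ]]; then
    flags+=(-vv "--log-cli-level=DEBUG")
  fi
  printf '%s\n' "${flags[@]}"
}
```

In the action itself the array is passed directly to `python -m pytest "${flags[@]}" …`, so flags containing spaces would survive unsplit.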

- name: Upload the JUnit XML file as an artifact
uses: actions/upload-artifact@v4
with:
name: fastsurfer-${{ github.sha }}-junit-${{ inputs.subject-id }}
path: ${{ inputs.junit-file }}
89 changes: 54 additions & 35 deletions .github/workflows/quicktest.yaml
@@ -28,11 +28,11 @@ on:
workflow_dispatch:
inputs:
docker-image:
description: 'Which docker image should be used to run this test (build-cached => build from git)'
description: 'Which docker image should be used to run this test (build-cached => build from git, also pr/<#no>)'
default: build-cached
type: string
freesurfer-build-image:
description: 'FreeSurfer build image to build with (default="": deep MI/fastsurfer-build:freesurferXXX; extract version from Docker/install_fs_pruned.sh)'
description: 'FreeSurfer build image to build with ("" (default) => deepmi/fastsurfer-build:freesurferXXX; extract version from Docker/install_fs_pruned.sh)'
type: string

permissions: read-all
Expand All @@ -42,7 +42,7 @@ env:
REFERENCE_DIR: /tmp/reference

jobs:
build-docker-latest:
build-docker:
name: 'Check and build the current docker image'
runs-on: ubuntu-latest
timeout-minutes: 180
@@ -117,16 +117,26 @@ jobs:
# this may also need adaptations in the test script at the bottom
echo "EXTRA_ARGS=" >> $GITHUB_OUTPUT
- name: Checkout repository
if: ${{ steps.parse.outputs.CONTINUE && steps.parse.outputs.DOCKER_IMAGE == 'build-cached' }}
# DOCKER_IMAGE is build-cached => we want to build, check this repository out to build the docker image
if: steps.parse.outputs.CONTINUE == 'true' && steps.parse.outputs.DOCKER_IMAGE == 'build-cached'
uses: actions/checkout@v4
- name: Checkout repository
# DOCKER_IMAGE starts with pr/*** => we want to build, check the PR out to build the docker image
if: steps.parse.outputs.CONTINUE == 'true' && startsWith(steps.parse.outputs.DOCKER_IMAGE, 'pr/')
uses: actions/checkout@v4
with:
ref: ${{ steps.parse.outputs.DOCKER_IMAGE }}
- name: Get the FreeSurfer version
if: ${{ steps.parse.outputs.CONTINUE && steps.parse.outputs.DOCKER_IMAGE == 'build-cached' }}
# we want to build the docker image; read fslink from Docker/install_fs_pruned.sh to determine the FreeSurfer version
if: steps.parse.outputs.CONTINUE == 'true' && (steps.parse.outputs.DOCKER_IMAGE == 'build-cached' || startsWith(steps.parse.outputs.DOCKER_IMAGE, 'pr/'))
shell: bash
id: parse-version
run: |
# get the FreeSurfer version from install_fs_pruned.sh
{
eval "$(grep "^fslink=" ./Docker/install_fs_pruned.sh)"
fslink="$(grep "^fslink=" ./Docker/install_fs_pruned.sh)"
fslink="${fslink:7}"
if [[ "${fslink:0:1}" == '"' ]] || [[ "${fslink:0:1}" == "'" ]] ; then fslink="${fslink:1:-1}" ; fi
fs_version="$(basename "$(dirname "$fslink")")"
fs_version_short="${fs_version//\./}"
echo "FS_VERSION=$fs_version"
@@ -139,19 +149,19 @@ jobs:
echo "FASTSURFER_HOME=$(pwd)"
} > $GITHUB_OUTPUT
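The fslink parsing introduced here (strip the `fslink=` prefix, drop one layer of surrounding quotes, take the parent directory name of the URL) can be checked in isolation; a sketch with made-up URLs:

```shell
# Extract the FreeSurfer version from a line like
#   fslink="https://host/path/7.4.1/archive.tar.gz"
# following the same steps as the parse-version workflow step.
extract_fs_version() {
  local fslink="${1#fslink=}"       # drop the variable prefix
  if [[ "${fslink:0:1}" == '"' || "${fslink:0:1}" == "'" ]]; then
    fslink="${fslink:1:-1}"         # strip surrounding quotes
  fi
  basename "$(dirname "$fslink")"   # version = parent directory name
}
```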
- uses: Deep-MI/FastSurfer/.github/actions/build-docker@dev
if: ${{ steps.parse.outputs.CONTINUE && steps.parse.outputs.DOCKER_IMAGE == 'build-cached' }}
if: steps.parse.outputs.CONTINUE == 'true' && steps.parse.outputs.DOCKER_IMAGE == 'build-cached'
# This action needs the "full" checkout (located at ${{ inputs.fastsurfer-home }})
with:
fastsurfer-home: ${{ steps.parse-version.outputs.FASTSURFER_HOME }}
# currently, this image has to be updated and used to circumvent storage limitations in github actions
# and it is also faster to use this prebuilt, reduced-size freesurfer distribution
freesurfer-build-image: "${{ steps.parse-version.outputs.FS_BUILD_IMAGE }}"
fastsurfer:
name: 'Run FastSurfer on sample images'
needs: build-docker-latest
fastsurfer-test:
name: 'Run FastSurfer on sample images and perform tests'
needs: build-docker
runs-on: ubuntu-latest
timeout-minutes: 180
if: needs.build-docker-latest.outputs.continue
if: needs.build-docker.outputs.continue == 'true'
strategy:
# the following matrix strategy will result in one run per subject as matrix is "one-dimensional".
# Additional parameters under "include" are then added as additional (dependent) information
@@ -161,43 +171,52 @@
- subject-id: 1.0mm
file-extension: ".mgz"
image-key: QUICKTEST_IMAGE_HREF_1mm
case-key: QUICKTEST_TARGET_HREF_1mm
extra-args: ""
- subject-id: 0.8mm
file-extension: ".nii.gz"
image-key: QUICKTEST_IMAGE_HREF_08mm
case-key: QUICKTEST_TARGET_HREF_08mm
extra-args: "--3T"
steps:
- uses: actions/checkout@v4
with:
sparse-checkout: .github/actions
- name: Check out the repository that is to be tested
uses: actions/checkout@v4
- name: Run FastSurfer for ${{ matrix.subject-id }} with the previously created docker container
uses: Deep-MI/FastSurfer/.github/actions/run-fastsurfer@dev
# TODO: revert back to Deep-MI/FastSurfer/...
# uses: Deep-MI/FastSurfer/.github/actions/run-fastsurfer@dev
uses: ./.github/actions/run-fastsurfer
with:
subject-id: ${{ matrix.subject-id }}
file-extension: ${{ matrix.file-extension }}
image-href: ${{ secrets[matrix.image-key] }}
license: ${{ secrets.QUICKTEST_LICENSE }}
docker-image: ${{ needs.build-docker-latest.outputs.docker-image }}
extra-args: ${{ needs.build-docker-latest.outputs.extra-args }}
tests:
name: 'Download data and perform tests'
needs: fastsurfer
runs-on: ubuntu-latest
timeout-minutes: 60
strategy:
# the following matrix strategy will result in one run per subject as matrix is "one-dimensional".
# Additional parameters under "include" are then added as additional (dependent) information
matrix:
subject-id: [0.8mm, 1.0mm]
include:
- subject-id: 0.8mm
case-key: QUICKTEST_TARGET_HREF_08mm
- subject-id: 1.0mm
case-key: QUICKTEST_TARGET_HREF_1mm
steps:
- uses: actions/checkout@v4
docker-image: ${{ needs.build-docker.outputs.docker-image }}
extra-args: ${{ matrix.extra-args }} ${{ needs.build-docker.outputs.extra-args }}
cleanup: 'true'
- name: Run tests
uses: Deep-MI/FastSurfer/.github/actions/run-tests@dev
# TODO: revert back to Deep-MI/FastSurfer/...
# uses: Deep-MI/FastSurfer/.github/actions/run-tests@dev
uses: ./.github/actions/run-tests
with:
subject-id: ${{ matrix.subject-id }}
subjects-dir: ${{ env.SUBJECTS_DIR }}
reference-dir: ${{ env.REFERENCE_DIR }}
case-href: ${{ secrets[matrix.case-key] }}
junit-file: /tmp/fastsurfer-quicktest-${{ matrix.subject-id }}.junit.xml
annotation:
name: Annotate test results as checks
runs-on: ubuntu-latest
needs: [fastsurfer-test, build-docker]
if: always() && needs.build-docker.outputs.continue == 'true'
steps:
- name: Retrieve test JUnit files
uses: actions/download-artifact@v4
with:
pattern: fastsurfer-${{ github.sha }}-junit-*
merge-multiple: 'true'
path: /tmp
- name: Write the results into the check
uses: mikepenz/action-junit-report@v5
with:
report_paths: /tmp/fastsurfer-quicktest-*.junit.xml
check_name: Annotate the test results as checks
34 changes: 10 additions & 24 deletions CerebNet/inference.py
@@ -277,11 +277,7 @@ def _calc_segstats(
"""

def _get_ids_startswith(_label_map: dict[int, str], prefix: str) -> list[int]:
return [
id
for id, name in _label_map.items()
if name.startswith(prefix) and not name.endswith("Medullare")
]
return [id for id, name in _label_map.items() if name.startswith(prefix) and not name.endswith("Medullare")]

freesurfer_id2cereb_name = self.cereb_name2fs_id.__reversed__()
freesurfer_id2name = self.freesurfer_name2id.__reversed__()
@@ -291,10 +287,7 @@ def _get_ids_startswith(_label_map: dict[int, str], prefix: str) -> list[int]:
47: ("Right", "Right-Cerebellum-Cortex"),
632: ("Vermis", "Cbm_Vermis"),
}
merge_map = {
id: _get_ids_startswith(label_map, prefix=prefix)
for id, (prefix, _) in meta_labels.items()
}
merge_map = {id: _get_ids_startswith(label_map, prefix=prefix) for id, (prefix, _) in meta_labels.items()}

# calculate PVE
from FastSurferCNN.segstats import pv_calc
@@ -375,22 +368,16 @@ def _get_subject_dataset(
from FastSurferCNN.utils.parser_defaults import ALL_FLAGS

raise ValueError(
f"Cannot resolve the intended filename "
f"{subject.get_attribute('cereb_statsfile')} for the "
f"cereb_statsfile, maybe specify an absolute path via "
f"{ALL_FLAGS['cereb_statsfile'](dict)['flag']}."
f"Cannot resolve the intended filename {subject.get_attribute('cereb_statsfile')} for the "
f"cereb_statsfile, maybe specify an absolute path via {ALL_FLAGS['cereb_statsfile'](dict)['flag']}."
)
if not subject.has_attribute(
"norm_name"
) or not subject.fileexists_by_attribute("norm_name"):
if not subject.has_attribute("norm_name") or not subject.fileexists_by_attribute("norm_name"):
from FastSurferCNN.utils.parser_defaults import ALL_FLAGS

raise ValueError(
f"Cannot resolve the file name "
f"{subject.get_attribute('norm_name')} for the bias field "
f"corrected image, maybe specify an absolute path via "
f"{ALL_FLAGS['norm_name'](dict)['flag']} or the file does not "
f"exist."
f"Cannot resolve the file name {subject.get_attribute('norm_name')} for the bias field corrected "
f"image, maybe specify an absolute path via {ALL_FLAGS['norm_name'](dict)['flag']} or the file "
f"does not exist."
)

norm_file = subject.filename_by_attribute("norm_name")
@@ -510,9 +497,8 @@ def run(self, subject_dirs: SubjectList):
)

logger.info(
f"Subject {idx + 1}/{len(subject_dirs)} with id "
f"'{subject.id}' processed in {pred_time - start_time :.2f} "
f"sec."
f"Subject {idx + 1}/{len(subject_dirs)} with id '{subject.id}' processed in "
f"{pred_time - start_time :.2f} sec."
)
except Exception as e:
logger.exception(e)
2 changes: 1 addition & 1 deletion CerebNet/utils/meters.py
@@ -20,9 +20,9 @@
import torch

from CerebNet.data_loader.data_utils import GRAY_MATTER, VERMIS_NAMES
from CerebNet.utils.metrics import DiceScore, dice_score, hd, volume_similarity
from CerebNet.utils.misc import plot_confusion_matrix, plot_predictions
from FastSurferCNN.utils import logging
from FastSurferCNN.utils.metrics import DiceScore, dice_score, hd, volume_similarity

logger = logging.get_logger(__name__)
