refactor(integ): restructure integration test framework (#427)

- run `PRE_AWS_INTERACTION_HOOK` between each stack deploy/destroy
- fix output redirection for parallel tests
- unify parallel and sequential code paths
- clean up code organization
- clean up log output and organize file artifacts
- update README.md with revised instructions
jusiskin authored May 18, 2021
1 parent 2da01e5 commit 2e52314
Showing 15 changed files with 612 additions and 151 deletions.
2 changes: 2 additions & 0 deletions integ/.eslintignore
@@ -0,0 +1,2 @@
node_modules
cdk.output
60 changes: 48 additions & 12 deletions integ/README.md
@@ -3,22 +3,58 @@
To run all test suites:

1. Build and install dependencies by running build.sh from the top-level RFDK directory
1. Configure AWS credentials. There are a few options for this:
   * Configure credentials [using environment variables](https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/loading-node-credentials-environment.html)
   * Run the integration tests on an [EC2 Instance with an IAM role](https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/loading-node-credentials-iam.html)
   * Configure credentials using the [shared credentials file](https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/loading-node-credentials-shared.html)
1. *[Optional]* Set the environment variable `CDK_DEFAULT_REGION` to the region the test should be deployed in (defaults to `us-west-2`)
1. Modify the `test-config.sh` configuration file. Alternatively, the same variables can be set using environment
   variables, but the `SKIP_TEST_CONFIG` environment variable must be set to `true`. In bash, this can be done with the
   following command:

   ```sh
   export SKIP_TEST_CONFIG=true
   ```

   Currently the following options can be configured:

   * **REQUIRED:** for the Deadline repository test component:
     * `USER_ACCEPTS_SSPL_FOR_RFDK_TESTS`

       Should be set to `true` to accept the MongoDB SSPL. Setting this variable is
       considered acceptance of the terms of the
       [SSPL license](https://www.mongodb.com/licensing/server-side-public-license).
   * *[Optional]* configuration for **all** Deadline test components:
     * `DEADLINE_VERSION`

       Version of the Deadline repository installer used for the test
     * `DEADLINE_STAGING_PATH`

       Complete path to local staging folder for Deadline assets (see
       [DockerImageRecipes](../packages/aws-rfdk/docs/DockerImageRecipes.md) for more information)
   * *[Optional]* configuration for the Deadline worker fleet test component (use `aws --region <region> ec2 describe-images --owners 357466774442 --filters "Name=name,Values=*Worker*" "Name=name,Values=*<version>*" --query 'Images[*].[ImageId, Name]' --output text` to discover available worker AMIs):
     * `LINUX_DEADLINE_AMI_ID`

       The ID of a Linux AMI that has the Deadline client installed. The Deadline version should match the version
       specified in `DEADLINE_VERSION`.
     * `WINDOWS_DEADLINE_AMI_ID`

       The ID of a Windows AMI that has the Deadline client installed. The Deadline version should match the version
       specified in `DEADLINE_VERSION`.
1. From the `integ` directory, run:

   ```sh
   yarn e2e
   ```

   This will orchestrate the integration tests including:

   1. Deploying the CloudFormation stacks
   1. Executing tests against the stacks
   1. Tearing down the CloudFormation stacks
   1. Outputting the results

   Testing artifacts will be persisted in the `integ/.e2etemp` directory.
   Subsequent executions of the integration tests will delete this directory,
   so take care to persist the artifacts if desired.
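When `test-config.sh` is skipped, the same options can be exported directly in the shell before running the tests. A hypothetical sketch — every value below is an illustrative placeholder, not a real version, path, or AMI ID:

```sh
# Configure the integration tests purely through environment variables.
# All values are illustrative placeholders.
export SKIP_TEST_CONFIG=true
export USER_ACCEPTS_SSPL_FOR_RFDK_TESTS=true   # acceptance of the SSPL terms
export DEADLINE_VERSION="10.1.14.5"            # placeholder installer version
export DEADLINE_STAGING_PATH="$HOME/deadline-staging"       # placeholder path
export LINUX_DEADLINE_AMI_ID="ami-00000000000000000"        # placeholder
export WINDOWS_DEADLINE_AMI_ID="ami-11111111111111111"      # placeholder
# then, from the integ directory: yarn e2e
```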

# Example Output:

14 changes: 11 additions & 3 deletions integ/components/deadline/common/scripts/bash/component_e2e.sh
@@ -13,16 +13,24 @@ if [[ $(basename $(pwd)) != $COMPONENT_NAME ]]; then
exit 1
fi

function log_error () {
exit_code=$?
action=$1
echo "[${COMPONENT_NAME}] ${action} failed"
return $exit_code
}

SKIP_TEST_CHECK=\$SKIP_${COMPONENT_NAME}_TEST
SKIP_TEST_CHECK=$(eval "echo $SKIP_TEST_CHECK" 2> /dev/null) || SKIP_TEST_CHECK=false
if [[ ! "${SKIP_TEST_CHECK}" = "true" ]]; then

# Load utility functions
source "../common/scripts/bash/deploy-utils.sh"

ensure_component_artifact_dir "${COMPONENT_NAME}"

if [[ $OPTION != '--destroy-only' ]]; then
deploy_component_stacks $COMPONENT_NAME || log_error "app deployment"
execute_component_test $COMPONENT_NAME || log_error "running test suite"
fi
if [[ $OPTION != '--deploy-and-test-only' ]]; then
destroy_component_stacks $COMPONENT_NAME
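The `SKIP_TEST_CHECK` lines in `component_e2e.sh` build a per-component variable name at runtime and `eval`-expand it with a `false` default. A minimal, self-contained sketch of the same pattern (the component name here is illustrative):

```shell
#!/bin/bash
# Sketch of the indirect-expansion pattern used by component_e2e.sh:
# build a variable name from the component name, then eval-expand it.
COMPONENT_NAME="deadline_repository"   # illustrative component name
export SKIP_deadline_repository_TEST=true

SKIP_TEST_CHECK=\$SKIP_${COMPONENT_NAME}_TEST
SKIP_TEST_CHECK=$(eval "echo $SKIP_TEST_CHECK" 2> /dev/null) || SKIP_TEST_CHECK=false

if [[ "${SKIP_TEST_CHECK}" = "true" ]]; then
  echo "skipping ${COMPONENT_NAME}"
else
  echo "running ${COMPONENT_NAME}"
fi
```

The default keeps the test enabled whenever the skip variable is unset.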
@@ -0,0 +1,49 @@
#!/bin/bash
#
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0

# Handle errors manually
set +e
# Fail on unset variables
set -u

COMPONENT_ROOT="$1"
COMPONENT_NAME=$(basename "$COMPONENT_ROOT")
START_TIME=$SECONDS

# Before changing directories, we determine the absolute
# path of INTEG_TEMP_DIR, since it is a relative path
export INTEG_TEMP_DIR=$(readlink -fm "${INTEG_TEMP_DIR}")

cd "$INTEG_ROOT/$COMPONENT_ROOT"

# Ensure the component's artifact subdir exists
source "../common/scripts/bash/deploy-utils.sh"
ensure_component_artifact_dir "${COMPONENT_NAME}"

(
set +e
../common/scripts/bash/component_e2e.sh "$COMPONENT_NAME"
exit_code=$?
echo $exit_code > "${INTEG_TEMP_DIR}/${COMPONENT_NAME}/exitcode"
exit $exit_code
)
test_exit_code=$?

FINISH_TIME=$SECONDS
cat > "${INTEG_TEMP_DIR}/${COMPONENT_NAME}/timings.sh" <<EOF
${COMPONENT_NAME}_START_TIME=${START_TIME}
${COMPONENT_NAME}_FINISH_TIME=${FINISH_TIME}
EOF

# Clean-up if test failed
if [[ $test_exit_code -ne 0 ]]
then
# A failed cleanup should propagate to the calling process
set -e
../common/scripts/bash/component_e2e.sh "$COMPONENT_NAME" --destroy-only
fi

exit 0
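The `timings.sh` artifact written above can later be sourced to compute a per-component duration via bash indirect expansion. A minimal sketch — the component name and timing values are illustrative, and a temp file stands in for `${INTEG_TEMP_DIR}/${COMPONENT_NAME}/timings.sh`:

```shell
#!/bin/bash
# Sketch: consume a timings.sh artifact like the one written above.
COMPONENT_NAME="deadline_repository"   # illustrative component name

# Stand-in for "${INTEG_TEMP_DIR}/${COMPONENT_NAME}/timings.sh"
timings_file=$(mktemp)
cat > "$timings_file" <<EOF
${COMPONENT_NAME}_START_TIME=12
${COMPONENT_NAME}_FINISH_TIME=96
EOF

source "$timings_file"

# Indirect expansion to read the per-component variables
start_var="${COMPONENT_NAME}_START_TIME"
finish_var="${COMPONENT_NAME}_FINISH_TIME"
elapsed=$(( ${!finish_var} - ${!start_var} ))
echo "[${COMPONENT_NAME}] took ${elapsed}s"
rm -f "$timings_file"
```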
93 changes: 70 additions & 23 deletions integ/components/deadline/common/scripts/bash/deploy-utils.sh
@@ -6,60 +6,107 @@
# This hook function is meant to be run before any interactions with AWS (such as a cdk deploy or destroy)
function run_aws_interaction_hook() {
# Invoke hook function if it is exported and name is defined in PRE_AWS_INTERACTION_HOOK variable
if [ ! -z "${PRE_AWS_INTERACTION_HOOK+x}" ] && [ "$(type -t $PRE_AWS_INTERACTION_HOOK)" == "function" ]
then
$PRE_AWS_INTERACTION_HOOK
fi
}
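The hook contract above — an exported function whose name is stored in `PRE_AWS_INTERACTION_HOOK` — can be exercised as follows. The credential-refresh body is purely illustrative; the guard is the same check `run_aws_interaction_hook` performs:

```shell
#!/bin/bash
# Sketch: defining and registering a hook that run_aws_interaction_hook
# would invoke before each cdk deploy/destroy. The body is illustrative.
function refresh_credentials () {
  # e.g. re-assume a role or renew temporary credentials here
  echo "refreshing AWS credentials"
}
export -f refresh_credentials
export PRE_AWS_INTERACTION_HOOK=refresh_credentials

# The same guard used by run_aws_interaction_hook:
if [ ! -z "${PRE_AWS_INTERACTION_HOOK+x}" ] && [ "$(type -t $PRE_AWS_INTERACTION_HOOK)" == "function" ]
then
  $PRE_AWS_INTERACTION_HOOK
fi
```

Exporting the function matters because the hook must also be visible to the subshells that run each component.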

function ensure_component_artifact_dir () {
component_name=$1
# Ensure component artifact sub-directory exists
mkdir -p "${INTEG_TEMP_DIR}/${component_name}"
}

function deploy_component_stacks () {
COMPONENT_NAME=$1

run_aws_interaction_hook
echo "[${COMPONENT_NAME}] started"

ensure_component_artifact_dir "${COMPONENT_NAME}"
# Generate the cdk.out directory, which includes a manifest.json file
# that can be used to determine the deployment ordering
echo "[${COMPONENT_NAME}] synthesizing started"
npx cdk synth &> "${INTEG_TEMP_DIR}/${COMPONENT_NAME}/synth.log"
echo "[${COMPONENT_NAME}] synthesizing complete"

echo "[${COMPONENT_NAME}] app deployment started"

# Empty the deploy log file in case it was non-empty
deploy_log_path="${INTEG_TEMP_DIR}/${COMPONENT_NAME}/deploy.txt"
cp /dev/null "${deploy_log_path}"

for stack in $(cdk_stack_deploy_order); do
run_aws_interaction_hook

echo "[${COMPONENT_NAME}] -> [${stack}] stack deployment started"
npx cdk deploy --app cdk.out --require-approval=never -e "${stack}" &>> "${deploy_log_path}"
echo "[${COMPONENT_NAME}] -> [${stack}] stack deployment complete"
done

echo "[${COMPONENT_NAME}] app deployment complete"

return 0
}

function cdk_stack_deploy_order () {
# Outputs the stacks in topological deploy order
"${INTEG_ROOT}/scripts/node/stack-order"
}

function cdk_stack_destroy_order () {
# Outputs the stacks in topological destroy order
"${INTEG_ROOT}/scripts/node/stack-order" -r
}

function execute_component_test () {
COMPONENT_NAME=$1

run_aws_interaction_hook

test_report_path="${INTEG_TEMP_DIR}/${COMPONENT_NAME}/test-report.json"
test_output_path="${INTEG_TEMP_DIR}/${COMPONENT_NAME}/test-output.txt"

echo "[${COMPONENT_NAME}] running test suite started"
ensure_component_artifact_dir "${COMPONENT_NAME}"
yarn run test "$COMPONENT_NAME.test" --json --outputFile="${test_report_path}" &> "${test_output_path}"
echo "[${COMPONENT_NAME}] running test suite complete"

if [[ -f "${test_report_path}" && $(node -pe "require('${test_report_path}').numFailedTests") -eq 0 ]]
then
echo "[${COMPONENT_NAME}] test suite passed"
else
echo "[${COMPONENT_NAME}] test suite failed"
fi

return 0
}
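The pass/fail check above reads jest's `--json` report with `node -pe`. A minimal sketch of the same probe against a hand-written report file (the report contents are illustrative, and `node` is assumed to be on the PATH as it is for these scripts):

```shell
#!/bin/bash
# Sketch: probing a jest --json report the way execute_component_test does.
# The report file here is hand-written for illustration.
report=$(mktemp)
echo '{"numFailedTests": 0, "numPassedTests": 12}' > "$report"

if [[ -f "${report}" && $(node -pe "require('${report}').numFailedTests") -eq 0 ]]
then
  result=passed
else
  result=failed
fi
echo "test suite ${result}"
rm -f "$report"
```

Checking that the report file exists first guards against the case where jest crashed before writing any report.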

function destroy_component_stacks () {
COMPONENT_NAME=$1

run_aws_interaction_hook
ensure_component_artifact_dir "${COMPONENT_NAME}"

echo "[${COMPONENT_NAME}] app destroy started"

destroy_log_path="${INTEG_TEMP_DIR}/${COMPONENT_NAME}/destroy.txt"
# Empty the destroy log file in case it was non-empty
cp /dev/null "${destroy_log_path}"
for stack in $(cdk_stack_destroy_order); do
run_aws_interaction_hook

echo "[${COMPONENT_NAME}] -> [${stack}] stack destroy started"
npx cdk destroy --app cdk.out -e -f "${stack}" &>> "${destroy_log_path}"
echo "[${COMPONENT_NAME}] -> [${stack}] stack destroy complete"
done

# Clean up artifacts
rm -f "./cdk.context.json"
rm -rf "./cdk.out"

echo "[${COMPONENT_NAME}] app destroy complete"

return 0
}
3 changes: 2 additions & 1 deletion integ/package.json
@@ -68,7 +68,8 @@
"eslint-plugin-license-header": "^0.2.0",
"jest": "^26.6.3",
"pkglint": "0.32.0",
"ts-jest": "^26.5.6",
"typescript": "~4.2.4"
},
"dependencies": {
"@aws-cdk/aws-autoscaling": "1.104.0",
2 changes: 0 additions & 2 deletions integ/scripts/bash/cleanup.sh
@@ -25,5 +25,3 @@ for COMPONENT in **/cdk.json; do
done

rm -rf "$INTEG_ROOT/node_modules"
rm -rf "$INTEG_ROOT/stage"
rm -rf "$INTEG_ROOT/.e2etemp"
24 changes: 20 additions & 4 deletions integ/scripts/bash/deploy-infrastructure.sh
@@ -9,9 +9,25 @@ shopt -s globstar
# Deploy the infrastructure app, a cdk app containing only a VPC to be supplied to the following tests
INFRASTRUCTURE_APP="$INTEG_ROOT/components/_infrastructure"
cd "$INFRASTRUCTURE_APP"
mkdir -p "${INTEG_TEMP_DIR}/infrastructure"
echo "[infrastructure] deployment started"

# Handle errors manually
set +e

# Hide the deploy log unless something goes wrong (save the scrollback buffer)
npx cdk deploy "*" --require-approval=never &> "${INTEG_TEMP_DIR}/infrastructure/deploy.txt"
deploy_exit_code=$?

# If the deployment returned a non-zero exit code, output the deploy log
if [[ $deploy_exit_code -ne 0 ]]
then
echo "[infrastructure] deployment failed"
cat "${INTEG_TEMP_DIR}/infrastructure/deploy.txt"
else
echo "[infrastructure] deployment complete"
fi

cd "$INTEG_ROOT"

exit $deploy_exit_code
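The hide-the-log-unless-it-fails pattern used above can be distilled as follows; the failing command and temp log path are illustrative stand-ins for the `cdk deploy` invocation and its log file:

```shell
#!/bin/bash
# Sketch of the pattern above: capture a command's output to a log file,
# and only surface the log when the command fails. `false` stands in for
# a failing `npx cdk deploy "*" --require-approval=never`.
log_file=$(mktemp)

false &> "$log_file"
exit_code=$?

if [[ $exit_code -ne 0 ]]
then
  echo "deployment failed; log follows:"
  cat "$log_file"
else
  echo "deployment complete"
fi
rm -f "$log_file"
```

The caller then propagates `$exit_code` rather than exiting early, so cleanup still runs.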