TeamCity: archive debug logs if there are over 1000 of them #8507

Conversation

modular-magician
Collaborator

Description

This PR addresses issues like the one seen in this build:

[Screenshot 2024-10-23 at 10 31 02]

The plugin used for uploading artifacts to S3 has a limit of 1000 artifacts per build. When contacted, JetBrains support suggested archiving the files as a workaround. The downside is that this makes it harder to open a specific artifact in your web browser; instead you'd need to download the entire zip/tar archive, open it, and then find the log file you want.

The compromise discussed in the team chat was to archive logs only if a build exceeds the limit. This PR introduces that behaviour.
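For context, a minimal sketch of that conditional archiving logic is below. The working directory, archive name, and parameter handling are illustrative assumptions; the exact script is in the upstream magic-modules change.

```bash
#!/bin/bash
# Sketch only: the directory, archive name, and glob are assumptions for
# illustration; the real build step is defined in the upstream change.
set -euo pipefail

WORK_DIR="${1:-.}"                     # assumed: the agent's working directory
THRESHOLD=1000
ARCHIVE="debug-logs-archive.tar.gz"    # assumed archive name

cd "$WORK_DIR"

# Collect the debug log files produced by the nightly tests
mapfile -t debug_files < <(find . -maxdepth 1 -type f -name 'debug*')

if [ "${#debug_files[@]}" -ge "$THRESHOLD" ]; then
    echo "Found >=${THRESHOLD} debug artifacts; archiving before upload to S3"
    # Bundle the logs into one archive and delete the originals so the S3
    # uploader only picks up the single .tar.gz artifact
    tar -czf "$ARCHIVE" --remove-files "${debug_files[@]}"
else
    echo "Found <${THRESHOLD} debug artifacts; we won't archive them before upload to S3"
fi
```

Because the originals are deleted after archiving, the existing artifact rule still matches, but only the single archive gets uploaded.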

Testing

I used this PR to create a test project in TeamCity for manual testing of the bash script in the new build step. This PR has some commits from when I forced builds to create >1000 or <1000 debug*.txt files and then observed how the new build step reacted.

When there are >=1000 artifacts present

This commit GoogleCloudPlatform/magic-modules@7b966a7 shows how I forced builds to look like they had >1000 debug logs.
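Roughly, that forcing amounts to something like the following (the file names and count here are assumptions for illustration; the linked commit has the exact change):

```bash
# Illustration only: create 1100 dummy debug logs so the build crosses
# the 1000-artifact threshold
for i in $(seq 1 1100); do
    echo "dummy log ${i}" > "debug-dummy-${i}.txt"
done
```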

Here's a build in TeamCity showing the outcome:

In step 1, >1000 debug files are made:

[Screenshot 2024-10-23 at 10 59 22]

In step 2:

21:33:51 Step 2/2: Tasks after running nightly tests: archive artifacts(debug logs) if there are >=1000 before S3 upload (Command Line)
21:33:51   Starting: /opt/teamcity-agent/temp/agentTmp/custom_script8203854418285142512
21:33:51   in directory: /opt/teamcity-agent/work/d9c38c33567f381b
21:33:51   Post-test step - archive artifacts(debug logs) if there are >=1000 before S3 upload
21:33:51   Found >=1000 debug artifacts; archiving before upload to S3
21:33:51   Listing files matching the artifact rule value /opt/teamcity-agent/work/d9c38c33567f381b/debug*
21:33:51   debug-google-1-TerraformProviders_SarahTesting1000artifactFix_GOOGLE_NIGHTLYTESTS_GOOGLE_PACKAGE_ACCESSAPPROVAL-archive.tar.gz
21:33:51   Finished
21:33:51   Process exited with code 0

The result is that the artifacts tab contains only that .tar.gz file:

[Screenshot 2024-10-23 at 11 00 48]

When there are <1000 artifacts present

This build shows the opposite scenario, where I made the builds create only 10 debug files instead (GoogleCloudPlatform/magic-modules@bfbca51).

Step 2 now looks like this:

22:10:50 Step 2/2: Tasks after running nightly tests: archive artifacts(debug logs) if there are >=1000 before S3 upload (Command Line)
22:10:50   Starting: /opt/teamcity-agent/temp/agentTmp/custom_script15021977902611658096
22:10:50   in directory: /opt/teamcity-agent/work/d9c38c33567f381b
22:10:50   Post-test step - archive artifacts(debug logs) if there are >=1000 before S3 upload
22:10:50   Found <1000 debug artifacts; we won't archive them before upload to S3
22:10:50 

And the artifacts tab shows the 10 files unchanged:

[Screenshot 2024-10-23 at 11 04 14]

Release Note Template for Downstream PRs (will be copied)

See Write release notes for guidance.


Derived from GoogleCloudPlatform/magic-modules#12083

[upstream:b828c9b15196b197675e0b5a1bd05254e8a82198]

Signed-off-by: Modular Magician <[email protected]>
@modular-magician modular-magician merged commit d358981 into hashicorp:main Oct 24, 2024
4 checks passed
@modular-magician modular-magician deleted the downstream-pr-b828c9b15196b197675e0b5a1bd05254e8a82198 branch November 16, 2024 03:39