TeamCity: archive debug logs if there are over 1000 of them #12083

Conversation

@SarahFrench (Contributor) commented Oct 22, 2024

Description

This PR addresses issues like the one seen in this build:

[Screenshot 2024-10-23 at 10:31:02]

The plugin used for uploading artifacts to S3 has a limit of 1000 artifacts per build. After contacting JetBrains support, they suggested archiving files as a solution. The downside is that this makes it harder to open a specific artifact in your web browser; instead you'd need to download the entire zip/tar file, open it, and then find the log file you want.

The compromise discussed in the team chat was to archive logs only if a build exceeds the limit. This PR introduces that behaviour.
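
For reference, a minimal sketch of what such a conditional step can look like in bash (the threshold, glob pattern, and archive name here are illustrative, not the exact script added by this PR):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Illustrative sketch: archive debug logs only when their count would hit the
# S3 artifact plugin's 1000-file limit. File names and paths are assumptions.
ARTIFACT_LIMIT=1000
DEBUG_LOG_COUNT=$(find . -maxdepth 1 -name 'debug*.txt' | wc -l)

if [ "$DEBUG_LOG_COUNT" -ge "$ARTIFACT_LIMIT" ]; then
    echo "Found >=${ARTIFACT_LIMIT} debug artifacts; archiving before upload to S3"
    tar -czf debug-logs-archive.tar.gz debug*.txt
    rm debug*.txt  # leave only the archive for the artifact rule to pick up
else
    echo "Found <${ARTIFACT_LIMIT} debug artifacts; we won't archive them before upload to S3"
fi
```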

Testing

I used this PR to create a test project in TeamCity for manually testing the bash script in the new build step. That PR has some commits from when I forced builds to create >1000 or <1000 debug*.txt files to see how the new build step reacted.

When there are >=1000 artifacts present

This commit 7b966a7 shows how I forced builds to look like they had >1000 debug logs.
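
Forcing that state is essentially a matter of generating dummy files in an earlier step; something along these hypothetical lines (not the exact contents of 7b966a7):

```bash
# Hypothetical: create 1100 dummy debug logs so the archiving step sees >=1000 artifacts
for i in $(seq 1 1100); do
    echo "placeholder log content" > "debug-test-${i}.txt"
done
```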

Here's a build in TeamCity showing the outcome:

In step 1, >1000 debug files are made:

[Screenshot 2024-10-23 at 10:59:22]

In step 2:

21:33:51 Step 2/2: Tasks after running nightly tests: archive artifacts(debug logs) if there are >=1000 before S3 upload (Command Line)
21:33:51   Starting: /opt/teamcity-agent/temp/agentTmp/custom_script8203854418285142512
21:33:51   in directory: /opt/teamcity-agent/work/d9c38c33567f381b
21:33:51   Post-test step - archive artifacts(debug logs) if there are >=1000 before S3 upload
21:33:51   Found >=1000 debug artifacts; archiving before upload to S3
21:33:51   Listing files matching the artifact rule value /opt/teamcity-agent/work/d9c38c33567f381b/debug*
21:33:51   debug-google-1-TerraformProviders_SarahTesting1000artifactFix_GOOGLE_NIGHTLYTESTS_GOOGLE_PACKAGE_ACCESSAPPROVAL-archive.tar.gz
21:33:51   Finished
21:33:51   Process exited with code 0

The result is the artifacts tab contains only that .tar.gz file:

[Screenshot 2024-10-23 at 11:00:48]

When there are <1000 artifacts present

This build shows the opposite scenario, where I instead made the builds create only 10 debug files (bfbca51).

Step 2 now looks like this:

22:10:50 Step 2/2: Tasks after running nightly tests: archive artifacts(debug logs) if there are >=1000 before S3 upload (Command Line)
22:10:50   Starting: /opt/teamcity-agent/temp/agentTmp/custom_script15021977902611658096
22:10:50   in directory: /opt/teamcity-agent/work/d9c38c33567f381b
22:10:50   Post-test step - archive artifacts(debug logs) if there are >=1000 before S3 upload
22:10:50   Found <1000 debug artifacts; we won't archive them before upload to S3
22:10:50 

And the artifacts tab shows the 10 files unchanged:

[Screenshot 2024-10-23 at 11:04:14]

Release Note Template for Downstream PRs (will be copied)

See Write release notes for guidance.


@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

google provider: Diff ( 5 files changed, 63 insertions(+), 3 deletions(-))
google provider: Diff ( 5 files changed, 65 insertions(+), 4 deletions(-))
google provider: Diff ( 5 files changed, 96 insertions(+), 10 deletions(-))
google provider: Diff ( 6 files changed, 126 insertions(+), 40 deletions(-))
google provider: Diff ( 6 files changed, 127 insertions(+), 41 deletions(-))
google provider: Diff ( 6 files changed, 127 insertions(+), 41 deletions(-))
google provider: Diff ( 5 files changed, 66 insertions(+), 5 deletions(-))

@SarahFrench SarahFrench requested a review from shuyama1 October 23, 2024 10:07
@SarahFrench SarahFrench marked this pull request as ready for review October 23, 2024 10:07
@SarahFrench (Contributor, Author) commented Oct 23, 2024

Hi @shuyama1 - this PR addresses recent issues with artifacts for the Compute builds in TeamCity. The PR description links to some builds where I show the new build step in action.

Something annoying I noticed is that the TeamCity UI makes it look like you can open archived files in the UI:

[Screenshot 2024-10-23 at 11:09:30]

However, clicking to expand it causes a 500 error (you can try here):

[Screenshot 2024-10-23 at 11:11:34]

The error mentions "java.util.zip.ZipException: Not in GZIP format" but I think the file format is fine? I'm not sure if this is something I can resolve myself.

The file is still ok to be downloaded and opened locally though.
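
If it helps to rule out a corrupt archive, one way to sanity-check the downloaded file locally (not part of this PR, and the filename below is illustrative) is:

```bash
file debug-logs-archive.tar.gz              # should report "gzip compressed data"
tar -tzf debug-logs-archive.tar.gz | head   # list the archived logs without extracting
```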

@shuyama1 (Member) left a comment


Thank you for the fix!
