Conformance results for v1.20/k0s v0.10.0 #1310

Merged
merged 1 commit into cncf:master on Feb 20, 2021

Conversation

jnummelin
Copy link
Contributor

Signed-off-by: Jussi Nummelin [email protected]

Pre-submission checklist:

Please check each of these after submitting your pull request:

  • If this is a new entry, have you submitted a signed participation form?
  • Did you include the product/project logo in SVG, EPS or AI format?
  • Does your logo clearly state the name of the product/project and follow the other logo guidelines?
  • If your product/project is open source, did you include the repo_url?
  • Did you copy and paste the installation and configuration instructions into the README.md file in addition to linking to them?

cncf-ci added the release-documents-checked (All release documents are present and pass all checks) label on Feb 4, 2021
@cncf-ci
Copy link
Collaborator

cncf-ci commented Feb 4, 2021

Found v1.20 in logs

@cncf-ci
Copy link
Collaborator

cncf-ci commented Feb 4, 2021

This conformance request failed to include all of the required tests for v1.20

@cncf-ci
Copy link
Collaborator

cncf-ci commented Feb 4, 2021

The first test found to be missing was [sig-scheduling] SchedulerPredicates [Serial] validates that there is no conflict between pods with same hostPort but different hostIP and protocol [LinuxOnly] [Conformance]

@jnummelin
Copy link
Contributor Author

The first test found to be missing was [sig-scheduling] SchedulerPredicates [Serial] validates that there is no conflict between pods with same hostPort but different hostIP and protocol [LinuxOnly] [Conformance]

#1308 is seeing the same thing, although in their case the validation result has changed without them changing the test result data.

Feels like a bug in the bot validating the results 🤷

@taylorwaggoner
Copy link
Contributor

Thanks for pointing that out @jnummelin - we are looking into this and will get back to you!

@jnummelin
Copy link
Contributor Author

@taylorwaggoner thanks

riaankleinhans pushed a commit to ii/k8s-conformance-fork that referenced this pull request Feb 7, 2021
@riaankleinhans
Copy link
Collaborator

@jnummelin I looked at the failure message and replicated it in another repo.

The error is valid and there is a mismatch between the conformance.yaml file in the k/k master branch and the content of your junit file. This was caused by PR #98299 changing the conformance.yaml file.

Your data contains the test [sig-scheduling] SchedulerPredicates [Serial] validates that there is no conflict between pods with same hostPort but different hostIP and protocol [LinuxOnly] [Conformance], but that test was replaced by [sig-network] HostPort validates that there is no conflict between pods with same hostPort but different hostIP and protocol [LinuxOnly] [Conformance] in the PR mentioned.
Unfortunately, this will happen whenever conformance.yaml gets updated between pulling the data and submitting the results.
You can resolve this by pulling the latest data and re-running your tests.

Sorry for the confusing error message. The message has a logic error: it reports the test that fell out rather than the missing new test. (We are working on that.)
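For anyone who wants to sanity-check their results before resubmitting, here is a minimal, illustrative Go sketch of that comparison. It is not the real cncf-ci checker: the junit_01.xml and conformance.yaml file names, the yaml.v3 dependency, and the codename/release field names are assumptions based on the files discussed above, and the real checker also verifies that the required tests passed, not merely that they appear in the results.

```go
// checkresults.go - illustrative sketch only, not the actual cncf-ci checker.
// It lists the conformance.yaml codenames that do not appear in a Sonobuoy
// junit_01.xml, which is roughly the comparison behind the bot's error message.
package main

import (
	"encoding/xml"
	"fmt"
	"os"
	"strings"

	"gopkg.in/yaml.v3"
)

// testcase captures only what we need from the JUnit output: the test name.
type testcase struct {
	Name string `xml:"name,attr"`
}

// junitFile accepts either a <testsuite> root or a <testsuites> wrapper.
type junitFile struct {
	TestCases []testcase `xml:"testcase"`
	Suites    []struct {
		TestCases []testcase `xml:"testcase"`
	} `xml:"testsuite"`
}

// entry models one conformance.yaml item; the field names are assumptions.
type entry struct {
	Codename string `yaml:"codename"`
	Release  string `yaml:"release"`
}

// normalize collapses whitespace so minor spacing differences between the
// curated list and the ginkgo-generated names do not cause false misses.
func normalize(s string) string {
	return strings.Join(strings.Fields(s), " ")
}

func main() {
	junitRaw, err := os.ReadFile("junit_01.xml")
	if err != nil {
		panic(err)
	}
	var jf junitFile
	if err := xml.Unmarshal(junitRaw, &jf); err != nil {
		panic(err)
	}
	ran := map[string]bool{}
	for _, tc := range jf.TestCases {
		ran[normalize(tc.Name)] = true
	}
	for _, s := range jf.Suites {
		for _, tc := range s.TestCases {
			ran[normalize(tc.Name)] = true
		}
	}

	yamlRaw, err := os.ReadFile("conformance.yaml")
	if err != nil {
		panic(err)
	}
	var required []entry
	if err := yaml.Unmarshal(yamlRaw, &required); err != nil {
		panic(err)
	}

	missing := 0
	for _, e := range required {
		if !ran[normalize(e.Codename)] {
			fmt.Println("missing:", e.Codename)
			missing++
		}
	}
	fmt.Printf("%d of %d required tests not found in the results\n", missing, len(required))
}
```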

@riaankleinhans
Copy link
Collaborator

@jnummelin It seems to be a bug.
We are trying to sort it out as soon as possible.

@hh
Copy link
Collaborator

hh commented Feb 10, 2021

I think we found the issue upstream. Our validation compares the tests in your logs to a curated list of tests in the main branch: test/conformance/testdata/conformance.yaml

A 1.16 test was changed/renamed/relocated and accidentally tagged as updated for 1.20. However, it should have been updated for inclusion in 1.21.

@heyste created a PR to update the metadata, and we look forward to it being merged by #sig-scheduling, #sig-networking, and #sig-arch:

kubernetes/kubernetes#98940

I'm asking for priority here, so watch this space, and thanks for your patience. Once this merges we should be able to re-validate against the correct / updated list of tests for 1.20.
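A rough, hypothetical illustration of the release-tagging point above: if the required set for a given review is derived from each entry's release field in conformance.yaml, then an entry accidentally retagged from v1.16 to v1.20 falls out of or into the wrong required set. The `codename`/`release` field names and the "earliest listed release" rule are assumptions, not the bot's actual logic.

```go
// requiredfor.go - hypothetical sketch of selecting the required conformance
// tests for a given release from conformance.yaml. It illustrates why an entry
// tagged with the wrong release (1.20 instead of 1.21) can break validation
// of v1.20 submissions whose results predate the rename.
package main

import (
	"fmt"
	"os"
	"strconv"
	"strings"

	"gopkg.in/yaml.v3"
)

type entry struct {
	Codename string `yaml:"codename"` // field names are assumptions
	Release  string `yaml:"release"`  // e.g. "v1.16" or "v1.16, v1.21"
}

// minorOf parses "v1.N" into N; returns -1 on anything unexpected.
func minorOf(v string) int {
	parts := strings.Split(strings.TrimPrefix(strings.TrimSpace(v), "v"), ".")
	if len(parts) < 2 {
		return -1
	}
	n, err := strconv.Atoi(parts[1])
	if err != nil {
		return -1
	}
	return n
}

// requiredFor returns the codenames whose earliest listed release is at or
// below the release under review.
func requiredFor(entries []entry, reviewMinor int) []string {
	var out []string
	for _, e := range entries {
		first := strings.Split(e.Release, ",")[0]
		if m := minorOf(first); m >= 0 && m <= reviewMinor {
			out = append(out, e.Codename)
		}
	}
	return out
}

func main() {
	raw, err := os.ReadFile("conformance.yaml")
	if err != nil {
		panic(err)
	}
	var entries []entry
	if err := yaml.Unmarshal(raw, &entries); err != nil {
		panic(err)
	}
	// For a v1.20 submission, a renamed test retagged as v1.20 instead of
	// v1.21 would be required here even though result sets generated before
	// the rename cannot contain it.
	for _, name := range requiredFor(entries, 20) {
		fmt.Println(name)
	}
}
```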

@jnummelin
Copy link
Contributor Author

@hh thanks for the update

@riaankleinhans
Copy link
Collaborator

@jnummelin I believe we figured out the issue causing the PR checks to fail.
On 16 Jan, after the 1.20 release was cut, two commits changed the 1.20 conformance.yaml file.
This caused a mismatch between what you get out of Sonobuoy and what the bot checks against in the v1.20 conformance.yaml file.
A PR was created to revert these commits, and that should stop the pain we are experiencing.
We will discuss the topic at the next conformance meeting to see how the historical conformance.yaml file should be protected against unwanted changes.
Once the PR merges, I believe the bot checks will label your PR correctly.
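One way the protection idea above could look, sketched hypothetically (this is not a description of how the bot actually fetches the list): read conformance.yaml from the release branch being reviewed rather than from master, so later edits on master cannot change what an already generated v1.20 result set is checked against. The raw.githubusercontent.com URL pattern and the release-1.20 branch name are assumptions about an obvious fetch path, not the bot's implementation.

```go
// fetchlist.go - hypothetical sketch: fetch the curated conformance test list
// from a pinned release branch instead of master, so post-release edits to
// the master copy cannot affect checks for an existing submission.
package main

import (
	"fmt"
	"io"
	"net/http"
)

func fetchConformanceYAML(branch string) ([]byte, error) {
	// Raw GitHub URL for the curated list; the path is the one referenced
	// in the discussion above.
	url := fmt.Sprintf(
		"https://raw.githubusercontent.com/kubernetes/kubernetes/%s/test/conformance/testdata/conformance.yaml",
		branch)
	resp, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return nil, fmt.Errorf("fetching %s: %s", url, resp.Status)
	}
	return io.ReadAll(resp.Body)
}

func main() {
	// Pinning to release-1.20 rather than master avoids picking up
	// post-release edits like the ones discussed in this thread.
	data, err := fetchConformanceYAML("release-1.20")
	if err != nil {
		panic(err)
	}
	fmt.Printf("fetched %d bytes of conformance.yaml\n", len(data))
}
```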

@cncf-ci
Copy link
Collaborator

cncf-ci commented Feb 15, 2021

Automatically verified as having all required tests present and passed

@jnummelin
Copy link
Contributor Author

@Riaankl thanks for addressing this, I now see all the proper labels in place.

@taylorwaggoner taylorwaggoner merged commit e0f3f09 into cncf:master Feb 20, 2021
tnorlin pushed a commit to tnorlin/k8s-conformance that referenced this pull request Dec 6, 2023
devidask27 pushed a commit to platform9/k8s-conformance that referenced this pull request Feb 20, 2024
tnorlin pushed a commit to tnorlin/k8s-conformance that referenced this pull request Oct 25, 2024
tnorlin pushed a commit to tnorlin/k8s-conformance that referenced this pull request Dec 13, 2024