VSTest process being killed/failing often for UWP and WASDK test hosts #7937
Comments
Hi @michael-hawker. Thank you for reporting; we will investigate this.
Hey @michael-hawker. The behavior you described may sometimes be caused by OOM errors. May I ask you to extend your workflow with a task that always runs at the end and outputs system events, e.g. using PowerShell (a sketch follows below)? Also, it would be very useful for us to try to reproduce this in a fork. Is that possible? What credentials or secrets have to be defined?
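A minimal sketch of such a step, since the original snippet wasn't preserved here; it assumes the intent is to print recent Error/Warning entries from the System event log, and the step name is illustrative:

```yaml
- name: Dump system events
  # always() keeps this step running even when earlier steps fail
  if: ${{ always() }}
  shell: powershell
  run: |
    # Print the 50 most recent Error/Warning entries from the System event log
    Get-EventLog -LogName System -EntryType Error,Warning -Newest 50 |
      Format-Table TimeGenerated, Source, EventID, Message -AutoSize
```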
@vpolikarpov-akvelon not familiar with that. I was able to run into the issue locally in release mode in one case, so there may be something going on with .NET Native; investigating a bit more on my side too. Or that's an orthogonal issue...
We've been tracking down other known halts to our test process, but this was definitely an interim case where the build eventually succeeded on this run: https://github.com/CommunityToolkit/Windows/actions/runs/5684332115/job/15413161632?pr=157#step:18:895 Where it failed previously here:
I tried collecting a crash dump. The test suite fails on the test task. If you want to inspect the dump yourself, you should add a couple of steps to your workflow. Add this before the test step:

```yaml
- name: Enable User-Mode Dumps collecting
  shell: powershell
  run: |
    # Create the folder Windows Error Reporting will write dumps into
    New-Item '${{ github.workspace }}\CrashDumps' -Type Directory
    Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps' -Name 'DumpFolder' -Type ExpandString -Value '${{ github.workspace }}\CrashDumps'
    Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps' -Name 'DumpCount' -Type DWord -Value '10'
    # DumpType 2 = full user-mode dump
    Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps' -Name 'DumpType' -Type DWord -Value '2'
```

And this at the end of the workflow to upload the created dumps as artifacts:

```yaml
- name: Artifact - CrashDumps
  uses: actions/upload-artifact@v3
  if: ${{ always() }}
  with:
    name: CrashDumps-${{ matrix.platform }}
    path: './CrashDumps'
```
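While debugging, it may also be worth verifying that the registry values actually landed, since WER won't write local dumps if the LocalDumps key isn't set up; this extra step is a suggestion, not part of the instructions above:

```yaml
- name: Verify LocalDumps settings
  shell: powershell
  run: |
    # Print the configured Windows Error Reporting LocalDumps values
    Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps'
```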
Hey @michael-hawker. Do you have any news on this issue? Are you still experiencing these problems? Do you have any new information that could help the investigation? We are at a dead end currently.
Thanks for checking in @vpolikarpov-akvelon, we're still noticing issues. I've added the crash dump collection to our CI in one of my PRs that's currently facing issues, though I'm having trouble digging into the UWP minidump, as my .NET Core CLR version doesn't match the one from the CLI, so I'm not sure how to install or resolve the right one to dig deeper; any suggestions? Seems like the dump used... I added a comment at microsoft/vstest#2952 (comment) to hopefully poke that along, about the test process failing more gracefully and providing more information about failures from the test process itself when it does crash. I also raised the issue with the platform team; the error you pointed out seemed to come from within the system's UI dll, but it didn't seem familiar to them, so it will need more investigation. Hoping that if I get two dumps failing in the same place it'll help to pass them along, though I'd like to crack the managed stack open on the UWP one first.
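As an aside, here is a sketch of one way to get at the managed stack for the WASDK (CoreCLR-based) host dump using dotnet-dump; this is a suggestion rather than something from this thread, it won't work on the .NET Native UWP dump, and the dump file name is made up:

```powershell
# Install the analysis tool and open the dump (file name is illustrative)
dotnet tool install --global dotnet-dump
dotnet-dump analyze '.\CrashDumps\testhost.exe.1234.dmp'
# At the interactive analyze prompt:
#   threads    -> list managed threads
#   clrstack   -> managed stack of the current thread
#   pe         -> print the last-thrown managed exception
```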
Was able to get dumps/stacks again on both WinUI 2/3 pipelines even after ignoring that test (not sure how the executing test was pulled from the stack before, though); however, I think a different test, based on the last test that passed, gets the same overflow message in the... The test here doesn't really do much beyond what the previously passing test does; I think it just happens to be the one where whatever buffer is overflowing fills up based on the number of tests run, considering it's the same stack and exception we saw earlier. Do builds with merges run on different runners than PR builds?
There is no difference between the runners that handle on-push and on-pr builds. If you suspect that failures depend on the amount of resources available, you may try running your build on larger runners.
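For reference, moving a job to a larger runner is just a `runs-on` change; the label below is a placeholder, since larger-runner labels are whatever the organization assigned when provisioning them:

```yaml
jobs:
  test:
    # 'windows-8-cores' is a hypothetical label; substitute the label
    # configured for your organization's larger runner group
    runs-on: windows-8-cores
```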
@vpolikarpov-akvelon good to know. Yes, we've been waiting a while for the .NET Foundation to be approved for larger runners so we can try that. FYI @ChrisSfanos
Hi @michael-hawker - I think we can move you to the DNF GitHub Enterprise to allow access. We can investigate.
@ChrisSfanos I believe we should already be connected to that? I'll ping you offline so we can follow-up.
… it's just random based on number of tests run...) Need to comment out as Ignore ignored... see CommunityToolkit/Tooling-Windows-Submodule#121 Related to investigation, see info in actions/runner-images#7937
Hey @michael-hawker. Do you have any updates on this? Did you test your workflow on larger runners?
Well, I'm closing this issue for now due to inactivity. Feel free to contact our team through an internal channel (e.g. Teams) if you still need help with this issue.
@vpolikarpov-akvelon was just about to respond: the larger runners didn't help (it took longer to see it hit again, but it did just happen this morning, since we flipped over to them). It happened in another random test (a string converter test). It really seems like a hiccup in the test process or the platform or something. I'll follow up with the platform team and the dumps I provided them.
Description
It's not 100% consistent, but it's happening frequently and re-running our pipeline isn't a guarantee. It really kicked up this past week with the change to the environment, so that seems to be a factor, especially since it's occurring for both UWP and WASDK based test projects. We haven't changed anything in our test infrastructure code.
We hadn't had any troubles like this until the environment changeover.
Bad runs:
https://github.com/CommunityToolkit/Windows/actions/runs/5537722051/jobs/10106880349
https://github.com/CommunityToolkit/Windows/actions/runs/5544940047/jobs/10124367762
https://github.com/CommunityToolkit/Windows/actions/runs/5514189775/jobs/10053173416
https://github.com/CommunityToolkit/Labs-Windows/actions/runs/5602439673/jobs/10247729757
Sometimes it runs fine:
https://github.com/CommunityToolkit/Windows/actions/runs/5523957535/jobs/10075613144
Maybe it's both the 20230630.1.0 and 20230706.1.0 images, and it's just getting worse with the newer one?
Going back over more than a couple of weeks in our builds I don't see this issue at all on 20230612.1:
https://github.com/CommunityToolkit/Windows/actions/runs/5310225807/jobs/9611943142
https://github.com/CommunityToolkit/Labs-Windows/actions/runs/5336453990/jobs/9671187354
We've made a few changes, like upgrading from the .NET 6 SDK to the .NET 7 SDK, but we see the errors consistently both before and after that change. We didn't see any failures on 20230612.1 VMs. We haven't changed any of the tests being run in this time.
Platforms affected
Runner images affected
Image version and build link
We're seeing this across repos but they have the same image:
Is it a regression?
20230612.1
Expected behavior
The VSTest process should be able to finish tests normally.
Actual behavior
We're seeing the test process fail for both our UWP and WASDK test runs:
https://github.com/CommunityToolkit/Labs-Windows/actions/runs/5602439673/jobs/10249333579?pr=418#step:21:477
https://github.com/CommunityToolkit/Labs-Windows/actions/runs/5602439673/jobs/10249333759?pr=418#step:21:558
I have one PR which is on 20230716.1.0 and just refuses to get past testing: https://github.com/CommunityToolkit/Labs-Windows/actions/runs/5602439673/attempts/1 (on its 4th attempt at the moment)
Repro steps
Attempt to build either repo in a GitHub Actions runner using the main workflow:
https://github.com/CommunityToolkit/Windows
https://github.com/CommunityToolkit/Labs-Windows