
Flaky test - k8seventsreceiver - TestNewReceiver #7952

Status: Closed
jpkrohling opened this issue Feb 17, 2022 · 11 comments
Labels: bug (Something isn't working), flaky test (a test is flaky)
jpkrohling (Member) commented Feb 17, 2022

Seen here: https://github.com/open-telemetry/opentelemetry-collector-contrib/runs/5209488349?check_suite_focus=true

make[2]: Entering directory '/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/k8seventsreceiver'
go test -race -v -timeout 300s --tags="" ./...
=== RUN   TestLoadConfig
--- PASS: TestLoadConfig (0.00s)
=== RUN   TestDefaultConfig
--- PASS: TestDefaultConfig (0.00s)
=== RUN   TestFactoryType
--- PASS: TestFactoryType (0.00s)
=== RUN   TestCreateReceiver
--- PASS: TestCreateReceiver (0.00s)
=== RUN   TestK8sEventToLogData
--- PASS: TestK8sEventToLogData (0.00s)
=== RUN   TestUnknownSeverity
--- PASS: TestUnknownSeverity (0.00s)
=== RUN   TestNewReceiver
E0216 00:57:41.677750   12900 runtime.go:78] Observed a panic: "invalid memory address or nil pointer dereference" (runtime error: invalid memory address or nil pointer dereference)
goroutine 7 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic({0x20d56a0, 0x32f9530})
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/runtime/runtime.go:74 +0xe6
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc00033f898})
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/runtime/runtime.go:48 +0xb0
panic({0x20d56a0, 0x32f9530})
	/opt/hostedtoolcache/go/1.17.6/x64/src/runtime/panic.go:1047 +0x266
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch.func1(0xc00009a1c0, 0xc000224090, 0xc000102c60, 0xc00033fb00)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:319 +0x127a
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch(0xc00009a1c0, 0xc000102c60)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:361 +0x2d6
k8s.io/client-go/tools/cache.(*Reflector).Run.func1()
	/home/runner/go/pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:221 +0x45
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc00005fed8)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:155 +0x82
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0003921c0, {0x2557820, 0xc000100320}, 0x1, 0xc000102c60)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:156 +0xcf
k8s.io/client-go/tools/cache.(*Reflector).Run(0xc00009a1c0, 0xc000102c60)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:220 +0x296
k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:56 +0x3f
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:73 +0x7f
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:71 +0xdf
panic: runtime error: invalid memory address or nil pointer dereference [recovered]
	panic: runtime error: invalid memory address or nil pointer dereference

goroutine 7 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc00033f898})
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/runtime/runtime.go:55 +0x145
panic({0x20d56a0, 0x32f9530})
	/opt/hostedtoolcache/go/1.17.6/x64/src/runtime/panic.go:1047 +0x266
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch.func1(0xc00009a1c0, 0xc000224090, 0xc000102c60, 0xc00033fb00)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:319 +0x127a
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch(0xc00009a1c0, 0xc000102c60)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:361 +0x2d6
k8s.io/client-go/tools/cache.(*Reflector).Run.func1()
	/home/runner/go/pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:221 +0x45
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc00005fed8)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:155 +0x82
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0003921c0, {0x2557820, 0xc000100320}, 0x1, 0xc000102c60)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:156 +0xcf
k8s.io/client-go/tools/cache.(*Reflector).Run(0xc00009a1c0, 0xc000102c60)
	/home/runner/go/pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:220 +0x296
k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:56 +0x3f
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:73 +0x7f
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
	/home/runner/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:71 +0xdf
FAIL	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/k8seventsreceiver	0.084s
FAIL
make[2]: *** [../../Makefile.Common:46: test] Error 1
make[2]: Leaving directory '/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/k8seventsreceiver'
make[1]: *** [Makefile:156: for-all-target-./receiver/k8seventsreceiver] Error 2
make[1]: Leaving directory '/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib'
make: *** [Makefile:72: gotest] Error 2

cc @dmitryax

dmitryax (Member) commented Feb 20, 2022

Seems like the same error as #6986

It's very hard to debug because the panic is recovered in the k8s client library and the original stack trace is lost. I'll take another look later.

Please post any new occurrences here. We can disable the test if it becomes annoying.

mx-psi added the "flaky test" label Feb 21, 2022
dmitryax self-assigned this Apr 11, 2022
dmitryax (Member) commented:
I'm going to work on it this week

jpkrohling (Member, Author) commented Apr 12, 2022

This has the same underlying cause as #9002.

jpkrohling (Member, Author) commented:
I'm closing this, as this might have been addressed by #9332.

dmitryax (Member) commented:
The tests are still skipped. I believe we need to re-enable them to verify.

dmitryax (Member) commented:
Never mind, I got confused. They are not skipped.


6 participants