
Grafana Oncall remote Grafana instance authentication / installation problem #4500

Closed
atillaqb opened this issue Jun 11, 2024 · 1 comment

Comments


atillaqb commented Jun 11, 2024

What went wrong?

What happened:
OnCall plugin installation fails when the engine is installed via the Helm chart and connected to a remote Grafana OSS instance.
Grafana OSS setup:

  • version: 10.4.2 - installed through Grafana Operator.
  • Grafana OSS installed on Kubernetes cluster 1.
  • Grafana OSS exposed by Ingress controller and SELF-SIGNED Certificate Authority (we are running our own).
  • Grafana OSS has an address: https://grafana.kubernetes1.example.com

Grafana Oncall OSS setup:

  • Grafana OnCall OSS installed on Kubernetes cluster 2.
  • Grafana OnCall OSS exposed via a LoadBalancer-type service (no Ingress controller) on port 8080, configured through the Helm chart's service properties.
  • DNS resolution is configured to point at the load balancer created by this service.
  • Oncall DNS address: oncall.kubernetes2.example.com (Without HTTPS).
  • Oncall Port: 8080.

Grafana Oncall Helm values:

base_url: "oncall.kubernetes2.example.com:8080"
base_url_protocol: http
externalGrafana:
  url: "https://grafana.kubernetes1.example.com"

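Worth double-checking before installing: the chart combines `base_url_protocol` and `base_url` into the URL Grafana will be told to call. A minimal sketch to confirm what the engine actually advertises (release name and values file name below are hypothetical):

```shell
# URL pieces taken from the Helm values above.
BASE_URL="oncall.kubernetes2.example.com:8080"
PROTO="http"
echo "Grafana should reach the engine at: ${PROTO}://${BASE_URL}"

# Render the chart locally and inspect the BASE_URL the engine pods receive
# (requires the grafana Helm repo to be added; names may need adjusting).
helm template oncall grafana/oncall -f values.yaml 2>/dev/null | grep -i 'base_url' || true
```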
What did you expect to happen:
The Grafana OnCall plugin installs correctly.

How do we reproduce it?

  1. Install Grafana OSS on Kubernetes cluster 1.
  2. Expose Grafana OSS through Ingress controller.
  3. Install Grafana Oncall OSS on Kubernetes cluster 2 via Helm chart.
  4. OnCall OSS is already exposed by the LoadBalancer service on port 8080.
  5. Try to install Grafana Oncall OSS plugin 1.6.2 on Grafana OSS on Kubernetes 1.
  6. Get error: An unknown error occurred when trying to install the plugin. Verify OnCall API URL, http://oncall.kubernetes2.example.com:8080, is correct?
    Refresh your page and try again, or try removing your plugin configuration and reconfiguring.
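The 60-second hang behind that error can be reproduced without the UI. A hedged sketch (the endpoint path appears verbatim in the Grafana engine logs; an unauthenticated POST will not succeed, but the timing distinguishes a hung handler from a fast error):

```shell
# Hit the install endpoint directly and time it; host/port from the setup above.
INSTALL_URL="http://oncall.kubernetes2.example.com:8080/api/internal/v1/plugin/self-hosted/install"

# -m 70 caps the wait just past the 60 s at which Grafana gave up (status 499).
# An immediate 4xx means the engine is answering; ~60-70 s means the handler hangs.
curl -s -o /dev/null -w 'HTTP %{http_code} after %{time_total}s\n' -m 70 -X POST "$INSTALL_URL" || true
echo "Tested: $INSTALL_URL"
```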

Grafana OnCall Version

v1.6.2

Product Area

Auth, Helm

Grafana OnCall Platform?

Kubernetes

User's Browser?

Edge latest

Anything else to add?

Logs from Grafana OSS when I'm clicking on connect in Grafana Oncall plugin install page:

logger=context userId=0 orgId=1 uname= t=2024-06-11T08:28:49.224952231Z level=error msg="Error from access control system" error="failed to authenticate user in target org: API key does not belong to target org" accessErrorID=ACE6304270508
logger=accesscontrol t=2024-06-11T08:28:49.225285882Z level=warn msg="No entity set for access control evaluation"
logger=context userId=0 orgId=1 uname= t=2024-06-11T08:28:49.225811145Z level=error msg="Request error" error="runtime error: invalid memory address or nil pointer dereference" stack="runtime/panic.go:261 (0x566b217)\nruntime/signal_unix.go:861 (0x566b1e5)\ngithub.jparrowsec.cn/grafana/grafana/pkg/services/accesscontrol/accesscontrol.go:170 (0x9bff842)\ngithub.jparrowsec.cn/grafana/grafana/pkg/services/navtree/navtreeimpl/admin.go:101 (0x9bfed48)\ngithub.jparrowsec.cn/grafana/grafana/pkg/services/navtree/navtreeimpl/navtree.go:158 (0x9c0573b)\ngithub.jparrowsec.cn/grafana/grafana/pkg/api/index.go:73 (0x9287077)\ngithub.jparrowsec.cn/grafana/grafana/pkg/api/index.go:256 (0x9288a44)\ngithub.jparrowsec.cn/grafana/grafana/pkg/api/response/web_hack.go:40 (0x6bd738f)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:131 (0x6bc9c4e)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/services/contexthandler/contexthandler.go:137 (0x92ac941)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 
(0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/render.go:44 (0x6bca7dd)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:137 (0x6bc9cf2)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/middleware/csrf/csrf.go:66 (0x889ce3a)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/middleware/recovery.go:180 (0x9284a3d)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/middleware/loggermw/logger.go:72 (0x889de56)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/middleware/request_metrics.go:75 (0x706f935)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/middleware/request_tracing.go:88 (0x9285715)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/middleware/requestmeta/request_metadata.go:66 (0x92aba95)\nnet/http/server.go:2136 (0x59c6728)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/context.go:52 (0x6bc8736)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/router.go:155 (0x6bcbead)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/router.go:221 (0x6bcc9f4)\ngithub.jparrowsec.cn/grafana/grafana/pkg/web/macaron.go:163 (0x6bca11c)\nnet/http/server.go:2938 (0x59c940d)\nnet/http/server.go:2009 (0x59c52f3)\nruntime/asm_amd64.s:1650 (0x568a720)\n"
logger=context userId=0 orgId=1 uname= t=2024-06-11T08:28:49.225846098Z level=info msg="Request Completed" method=GET path=/d/000000042/archiving status=302 remote_addr=10.93.11.94 time_ms=14 duration=14.362911ms size=24 referer= handler=/d/:uid/:slug status_source=server
logger=context userId=1 orgId=1 uname=admin t=2024-06-11T08:28:50.010904682Z level=info msg="Request Completed" method=POST path=/api/plugin-proxy/grafana-oncall-app/api/internal/v1/plugin/self-hosted/install status=499 remote_addr=10.93.11.233 time_ms=60002 duration=1m0.002770559s size=0 referer=https://grafana.mgmt.int.questback.com/plugins/grafana-oncall-app handler=/api/plugin-proxy/:pluginId/* status_source=downstream
logger=infra.usagestats t=2024-06-11T08:28:51.256381811Z level=info msg="Usage stats are ready to report"

Logs at the SAME time in Grafana Oncall OSS Engine pod:

2024-06-11 08:28:52 Tue Jun 11 08:28:51 2024 - *** HARAKIRI ON WORKER 5 (pid: 18, try: 1) ***
2024-06-11 08:28:52 HARAKIRI: -- syscall> 42 0x9 0x7fff22b4ed90 0x10 0x0 0x0 0x0 0x7fff22b4ece8 0x7f70f81d4f63
2024-06-11 08:28:52 HARAKIRI: -- wchan> 0
2024-06-11 08:28:52 Tue Jun 11 08:28:51 2024 - HARAKIRI !!! worker 5 status !!!
2024-06-11 08:28:52 Tue Jun 11 08:28:51 2024 - HARAKIRI [core 0] 10.93.55.57 - POST /api/internal/v1/plugin/self-hosted/install since 1718094470
2024-06-11 08:28:52 Tue Jun 11 08:28:51 2024 - HARAKIRI !!! end of worker 5 status !!!
2024-06-11 08:28:52 DAMN ! worker 5 (pid: 18) died, killed by signal 9 :( trying respawn ...
2024-06-11 08:28:52 Respawned uWSGI worker 5 (new pid: 19)

Curl from Grafana OSS pod on Kubernetes1 to Grafana Oncall endpoint on kubernetes2:

grafana-deployment-6675986687-2vcv4:/usr/share/grafana$ curl http://oncall.kubernetes2.example.com:8080
Ok
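This curl only proves the Grafana → OnCall direction. The HARAKIRI lines above show uWSGI killing the worker after 60 s while it was serving the install request, which matches Grafana's 499: the engine's handler hung, plausibly while calling back to Grafana over HTTPS and failing against the self-signed CA. A hedged check in the opposite direction (namespace and deployment name are hypothetical):

```shell
GRAFANA_URL="https://grafana.kubernetes1.example.com"

# From inside the engine pod: if the container does not trust the custom CA,
# this should fail TLS verification, pointing at the likely hang.
kubectl -n oncall exec deploy/oncall-engine -- \
  curl -sS --max-time 10 "$GRAFANA_URL/api/health" || true
```

If verification fails there, one conventional fix for a Python-based engine is mounting the CA bundle into the pod and pointing the standard `REQUESTS_CA_BUNDLE` environment variable at it (untested here).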
@vashishkov

@atillaqb Hello. Did you solve this? I have the same issue on v1.8.13 with the same setup.
