Cannot Debug context canceled Error #63

Closed
YiannisGkoufas opened this issue Mar 5, 2020 · 5 comments

Comments

@YiannisGkoufas commented Mar 5, 2020

Hi,

I have been getting this error from the kube-rbac-proxy sidecar and I am really struggling to understand the reason behind it. kube-rbac-proxy runs as a sidecar in this DaemonSet:

apiVersion: extensions/v1beta1
kind: DaemonSet
metadata:
  labels:
    app: node-exporter
  name: node-exporter
spec:
  revisionHistoryLimit: 10
  selector:
    matchLabels:
      app: node-exporter
  template:
    metadata:
      labels:
        app: node-exporter
    spec:
      containers:
      - args:
        - --web.listen-address=127.0.0.1:9101
        - --path.procfs=/host/proc
        - --path.sysfs=/host/sys
        - --collector.filesystem.ignored-mount-points=^/(dev|proc|sys|var/lib/docker/.+)($|/)
        - --collector.filesystem.ignored-fs-types=^(autofs|binfmt_misc|cgroup|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|mqueue|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|sysfs|tracefs)$
        - --no-collector.wifi
        image: registry.redhat.io/openshift3/prometheus-node-exporter:v3.11
        imagePullPolicy: Always
        name: node-exporter
        resources: {}
        terminationMessagePath: /dev/termination-log
        terminationMessagePolicy: File
        volumeMounts:
        - mountPath: /host/proc
          name: proc
        - mountPath: /host/sys
          name: sys
        - mountPath: /host/root
          mountPropagation: HostToContainer
          name: root
          readOnly: true
      - args:
        - --v=10
        - --logtostderr=true
        - --secure-listen-address=:9100
        - --upstream=http://127.0.0.1:9101/
        - --tls-cert-file=/etc/tls/private/tls.crt
        - --tls-private-key-file=/etc/tls/private/tls.key
        - --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_RSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256
        image: registry.redhat.io/openshift3/ose-kube-rbac-proxy:v3.11
        imagePullPolicy: Always
        name: kube-rbac-proxy
        ports:
        - containerPort: 9100
          hostPort: 9100
          name: https
          protocol: TCP
        resources:
          limits:
            cpu: 20m
            memory: 40Mi
          requests:
            cpu: 10m
            memory: 20Mi
        terminationMessagePath: /dev/termination-log
        terminationMessagePolicy: File
        volumeMounts:
        - mountPath: /etc/tls/private
          name: node-exporter-tls
      dnsPolicy: ClusterFirst
      hostNetwork: true
      hostPID: true
      nodeSelector:
        beta.kubernetes.io/os: linux
      restartPolicy: Always
      schedulerName: default-scheduler
      securityContext: {}
      serviceAccount: node-exporter
      serviceAccountName: node-exporter
      terminationGracePeriodSeconds: 30
      tolerations:
      - operator: Exists
      volumes:
      - hostPath:
          path: /proc
          type: ""
        name: proc
      - hostPath:
          path: /sys
          type: ""
        name: sys
      - hostPath:
          path: /
          type: ""
        name: root
      - name: node-exporter-tls
        secret:
          defaultMode: 420
          secretName: node-exporter-tls
  updateStrategy:
    rollingUpdate:
      maxUnavailable: 1
    type: RollingUpdate

The logs I can see in the pod are these:

I0305 15:58:58.660116   52302 main.go:186] Valid token audiences: 
I0305 15:58:58.660216   52302 main.go:248] Reading certificate files
I0305 15:58:58.660264   52302 reloader.go:98] reloading key /etc/tls/private/tls.key certificate /etc/tls/private/tls.crt
I0305 15:58:58.660465   52302 main.go:281] Starting TCP socket on :9100
I0305 15:58:58.857782   52302 main.go:288] Listening securely on :9100
I0305 15:59:23.358521   52302 request.go:1017] Request Body: {"kind":"TokenReview","apiVersion":"authentication.k8s.io/v1","metadata":{"creationTimestamp":null},"spec":{"token":"TOKEN"},"status":{"user":{}}}
I0305 15:59:23.358722   52302 round_trippers.go:423] curl -k -v -XPOST  -H "User-Agent: kube-rbac-proxy/v0.0.0 (linux/ppc64le) kubernetes/$Format" -H "Accept: application/json, */*" -H "Content-Type: application/json" -H "Authorization: Bearer  BEARER" 'https://172.30.0.1:443/apis/authentication.k8s.io/v1/tokenreviews'
I0305 15:59:23.886863   52302 round_trippers.go:443] POST https://172.30.0.1:443/apis/authentication.k8s.io/v1/tokenreviews 201 Created in 528 milliseconds
I0305 15:59:23.886914   52302 round_trippers.go:449] Response Headers:
I0305 15:59:23.886922   52302 round_trippers.go:452]     Cache-Control: no-store
I0305 15:59:23.886928   52302 round_trippers.go:452]     Content-Type: application/json
I0305 15:59:23.886934   52302 round_trippers.go:452]     Content-Length: 1287
I0305 15:59:23.886939   52302 round_trippers.go:452]     Date: Thu, 05 Mar 2020 15:59:23 GMT
I0305 15:59:23.887001   52302 request.go:1017] Response Body: {"kind":"TokenReview","apiVersion":"authentication.k8s.io/v1","metadata":{"creationTimestamp":null},"spec":{"token":"TOKEN"},"status":{"authenticated":true,"user":{"username":"system:serviceaccount:openshift-monitoring:prometheus-k8s","uid":"464a5b41-00a8-11ea-b65b-40f2e95c5cac","groups":["system:serviceaccounts","system:serviceaccounts:openshift-monitoring","system:authenticated"]}}}
I0305 15:59:23.957965   52302 proxy.go:199] kube-rbac-proxy request attributes: attrs=0
I0305 15:59:23.958155   52302 request.go:1017] Request Body: {"kind":"SubjectAccessReview","apiVersion":"authorization.k8s.io/v1","metadata":{"creationTimestamp":null},"spec":{"nonResourceAttributes":{"path":"/metrics","verb":"get"},"user":"system:serviceaccount:openshift-monitoring:prometheus-k8s","groups":["system:serviceaccounts","system:serviceaccounts:openshift-monitoring","system:authenticated"],"uid":"464a5b41-00a8-11ea-b65b-40f2e95c5cac"},"status":{"allowed":false}}
I0305 15:59:23.958291   52302 round_trippers.go:423] curl -k -v -XPOST  -H "User-Agent: kube-rbac-proxy/v0.0.0 (linux/ppc64le) kubernetes/$Format" -H "Authorization: Bearer BEARER" -H "Accept: application/json, */*" -H "Content-Type: application/json" 'https://172.30.0.1:443/apis/authorization.k8s.io/v1/subjectaccessreviews'
I0305 15:59:23.959469   52302 round_trippers.go:443] POST https://172.30.0.1:443/apis/authorization.k8s.io/v1/subjectaccessreviews 201 Created in 1 milliseconds
I0305 15:59:23.959482   52302 round_trippers.go:449] Response Headers:
I0305 15:59:23.959490   52302 round_trippers.go:452]     Cache-Control: no-store
I0305 15:59:23.959498   52302 round_trippers.go:452]     Content-Type: application/json
I0305 15:59:23.959507   52302 round_trippers.go:452]     Content-Length: 575
I0305 15:59:23.959514   52302 round_trippers.go:452]     Date: Thu, 05 Mar 2020 15:59:23 GMT
I0305 15:59:23.959536   52302 request.go:1017] Response Body: {"kind":"SubjectAccessReview","apiVersion":"authorization.k8s.io/v1","metadata":{"creationTimestamp":null},"spec":{"nonResourceAttributes":{"path":"/metrics","verb":"get"},"user":"system:serviceaccount:openshift-monitoring:prometheus-k8s","groups":["system:serviceaccounts","system:serviceaccounts:openshift-monitoring","system:authenticated"],"uid":"464a5b41-00a8-11ea-b65b-40f2e95c5cac"},"status":{"allowed":true,"reason":"RBAC: allowed by ClusterRoleBinding \"prometheus-k8s\" of ClusterRole \"prometheus-k8s\" to ServiceAccount \"prometheus-k8s/openshift-monitoring\""}}
2020/03/05 15:59:33 http: proxy error: context canceled
I0305 15:59:53.357908   52302 proxy.go:199] kube-rbac-proxy request attributes: attrs=0
2020/03/05 16:00:03 http: proxy error: context canceled
I0305 16:00:23.258756   52302 proxy.go:199] kube-rbac-proxy request attributes: attrs=0
2020/03/05 16:00:33 http: proxy error: context canceled
I0305 16:00:53.258463   52302 proxy.go:199] kube-rbac-proxy request attributes: attrs=0
2020/03/05 16:01:03 http: proxy error: context canceled

This error seems to be coming from the golang.org/x/net library, but I cannot figure out how to investigate it further.
The strange thing is that on one of the nodes, which is identical in terms of setup/configuration, the proxy works without a problem.
I would be very grateful if someone could help me with this.
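
A note on where that line comes from: the "http: proxy error: ..." prefix appears to be emitted by httputil.ReverseProxy in Go's net/http/httputil package (which kube-rbac-proxy wraps for the upstream), and its default error handler only prints the error string. One way to get more detail would be a custom ErrorHandler. The sketch below is a stand-alone, hypothetical test program, not kube-rbac-proxy's actual code; it only borrows the upstream address from the node-exporter args above and serves plain HTTP instead of TLS:

package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// Upstream address taken from the node-exporter args above (hypothetical test setup).
	upstream, err := url.Parse("http://127.0.0.1:9101/")
	if err != nil {
		log.Fatal(err)
	}

	proxy := httputil.NewSingleHostReverseProxy(upstream)

	// Replace the default "http: proxy error: <err>" line with one that also
	// records the client address and the state of the inbound request context.
	proxy.ErrorHandler = func(w http.ResponseWriter, r *http.Request, err error) {
		log.Printf("proxy error for %s %s from %s: %v (request context: %v)",
			r.Method, r.URL.Path, r.RemoteAddr, err, r.Context().Err())
		w.WriteHeader(http.StatusBadGateway)
	}

	log.Fatal(http.ListenAndServe(":9100", proxy))
}

If the context error shows up as context.Canceled and the remote address belongs to the Prometheus pod, that would point at the scraper closing the connection (for example a scrape timeout) rather than at the proxy or node-exporter itself.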

@YiannisGkoufas (Author)

Just an additional comment: although the DaemonSet YAML I posted uses the registry.redhat.io/openshift3/ose-kube-rbac-proxy:v3.11 image, I have also built and tested kube-rbac-proxy myself from master and at version 0.4.1, with exactly the same behavior.

@brancz (Owner) commented Mar 17, 2020

This log line means that the client cancelled the request; there's not much we can/should do here, I think.
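
For anyone who wants to see that mechanism outside a cluster, here is a minimal, self-contained sketch using only the standard library; the upstream delay and client timeout are made-up values. When the client gives up before the upstream answers, the reverse proxy logs exactly "http: proxy error: context canceled":

package main

import (
	"context"
	"net/http"
	"net/http/httptest"
	"net/http/httputil"
	"net/url"
	"time"
)

func main() {
	// Upstream that is slower than the client is willing to wait.
	upstream := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Second)
		w.Write([]byte("ok"))
	}))
	defer upstream.Close()

	target, _ := url.Parse(upstream.URL)
	front := httptest.NewServer(httputil.NewSingleHostReverseProxy(target))
	defer front.Close()

	// The client cancels after 100ms; the proxy's outbound request inherits
	// the now-cancelled inbound request context and fails with context.Canceled,
	// which is logged as "http: proxy error: context canceled".
	ctx, cancel := context.WithTimeout(context.Background(), 100*time.Millisecond)
	defer cancel()
	req, _ := http.NewRequestWithContext(ctx, http.MethodGet, front.URL, nil)
	_, _ = http.DefaultClient.Do(req) // this error is expected

	// Give the proxy a moment to log before the program exits.
	time.Sleep(500 * time.Millisecond)
}

In the DaemonSet above the client is presumably Prometheus, so a scrape timeout or a connection dropped between Prometheus and the node would surface on the proxy side in exactly this way, which might also explain why an otherwise identically configured node behaves differently.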

@wenhuwang

Is there any resolution for this question? I am seeing the same issue with the kube-rbac-proxy:0.4.1 image.

# kubectl -n xxx logs -f  node-exporter-bvqhf kube-rbac-proxy 
2021/11/16 05:57:55 http: proxy error: context canceled
2021/11/16 05:57:55 http: proxy error: context canceled
2021/11/16 05:57:55 http: proxy error: context canceled
2021/11/16 05:57:55 http: proxy error: context canceled
2021/11/16 05:57:55 http: proxy error: context canceled
2021/11/16 05:57:55 http: proxy error: context canceled
2021/11/16 05:57:55 http: proxy error: context canceled
2021/11/16 05:57:55 http: proxy error: context canceled

@ibihim (Collaborator) commented Apr 25, 2022

This is a stale issue. If there is no request to keep it open within a month, I would like to close it.

@ibihim (Collaborator) commented Jun 7, 2022

I will close this issue as stale. Feel free to reopen it if it is still a problem.

ibihim closed this as completed Jun 7, 2022