prometheus-adapter: apiserver was unable to write a JSON response: http2: stream closed

What happened? I deployed the Prometheus adapter with kube-prometheus and got the error below in the prometheus-adapter pod log (repeating every 30 seconds):

I0506 11:28:29.190304       1 adapter.go:93] successfully using in-cluster auth
I0506 11:28:29.624391       1 serving.go:273] Generated self-signed cert (/var/run/serving-cert/apiserver.crt, /var/run/serving-cert/apiserver.key)
I0506 11:28:30.131352       1 serve.go:96] Serving securely on [::]:6443
E0506 11:28:53.628657       1 writers.go:149] apiserver was unable to write a JSON response: http2: stream closed
E0506 11:28:53.628687       1 status.go:64] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0506 11:28:53.628699       1 writers.go:149] apiserver was unable to write a JSON response: http2: stream closed
E0506 11:28:53.630557       1 writers.go:149] apiserver was unable to write a JSON response: http2: stream closed
E0506 11:28:53.630851       1 status.go:64] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0506 11:28:53.632975       1 status.go:64] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}

Did you expect to see something different? No errors

How to reproduce it (as minimally and precisely as possible): kubectl apply -f manifests
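The discovery request that triggers the error can also be issued by hand (a minimal check, assuming kubectl access to the cluster; it is the same GET /apis/custom.metrics.k8s.io/v1beta1 call that appears in the logs below):

# hit the adapter's aggregated API group directly
kubectl get --raw /apis/custom.metrics.k8s.io/v1beta1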

Environment

  • Prometheus Adapter version: v0.5.0. The same issue occurs with v0.6.0; with v0.7.0 the adapter logs a similar error and then panics. The error logs are:
I0507 04:28:55.121299       1 dynamic_serving_content.go:129] Starting serving-cert::/var/run/serving-cert/apiserver.crt::/var/run/serving-cert/apiserver.key
I0507 04:28:55.121318       1 tlsconfig.go:219] Starting DynamicServingCertificateController
I0507 04:28:55.121548       1 configmap_cafile_content.go:205] Starting client-ca::kube-system::extension-apiserver-authentication::client-ca-file
I0507 04:28:55.121558       1 shared_informer.go:197] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
I0507 04:28:55.221420       1 shared_informer.go:204] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file 
I0507 04:28:55.221677       1 shared_informer.go:204] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file 
E0507 04:29:23.979358       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0507 04:29:23.979384       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0507 04:29:23.980432       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0507 04:29:23.980764       1 runtime.go:76] Observed a panic: runtime error: invalid memory address or nil pointer dereference
goroutine 356 [running]:
k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1.1(0xc0033c0180)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/server/filters/timeout.go:108 +0x107
panic(0x1861bc0, 0x2a840f0)
	/usr/lib/golang/src/runtime/panic.go:679 +0x1b2
compress/gzip.(*Writer).Write(0xc003226000, 0xc003388f80, 0x71, 0x71, 0x30, 0x1920b00, 0xc0037a8d01)
	/usr/lib/golang/src/compress/gzip/gzip.go:168 +0x237
k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc001ebc0a0, 0xc003388f80, 0x71, 0x71, 0xc003388f80, 0x71, 0x71)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/handlers/responsewriters/writers.go:182 +0x54e
k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x1ab3a63, 0x10, 0x7fa9dc4292c8, 0xc002b481e0, 0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300, 0xc8, 0x1d597a0, 0xc001ebc050)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/handlers/responsewriters/writers.go:117 +0x389
k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x1d8e6e0, 0xc0005d50e0, 0x1d8e920, 0x2ac4d18, 0x0, 0x0, 0x0, 0x0, 0x7fa9dc6a8ed8, 0xc0001fc020, ...)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/handlers/responsewriters/writers.go:251 +0x555
k8s.io/apiserver/pkg/endpoints/discovery.(*APIVersionHandler).ServeHTTP(0xc003a22d80, 0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/discovery/version.go:81 +0x18f
k8s.io/apiserver/pkg/endpoints/discovery.(*APIVersionHandler).handle(...)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/discovery/version.go:77
github.com/emicklei/go-restful.(*Container).dispatch(0xc0005579e0, 0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300)
	/home/sur/src/redhat/go/pkg/mod/github.com/emicklei/go-restful@v2.9.5+incompatible/container.go:288 +0xa4f
github.com/emicklei/go-restful.(*Container).Dispatch(...)
	/home/sur/src/redhat/go/pkg/mod/github.com/emicklei/go-restful@v2.9.5+incompatible/container.go:199
k8s.io/apiserver/pkg/server.director.ServeHTTP(0x1ac198d, 0x1a, 0xc0005579e0, 0xc0004bebd0, 0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/server/handler.go:146 +0x4d3
k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/filters/authorization.go:64 +0x512
net/http.HandlerFunc.ServeHTTP(0xc000491dc0, 0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300)
	/usr/lib/golang/src/net/http/server.go:2007 +0x44
k8s.io/apiserver/pkg/server/filters.WithMaxInFlightLimit.func1(0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/server/filters/maxinflight.go:160 +0x5dc
net/http.HandlerFunc.ServeHTTP(0xc001cdc750, 0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300)
	/usr/lib/golang/src/net/http/server.go:2007 +0x44
k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/filters/impersonation.go:50 +0x1fe6
net/http.HandlerFunc.ServeHTTP(0xc000491e00, 0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646300)
	/usr/lib/golang/src/net/http/server.go:2007 +0x44
k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646200)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/filters/authentication.go:131 +0xa8c
net/http.HandlerFunc.ServeHTTP(0xc0004cf4a0, 0x7fa9dc6a8ed8, 0xc0001fc020, 0xc003646200)
	/usr/lib/golang/src/net/http/server.go:2007 +0x44
k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc0033c0180, 0xc001cd69c0, 0x1d93160, 0xc0001fc020, 0xc003646200)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/server/filters/timeout.go:113 +0xd0
created by k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/server/filters/timeout.go:99 +0x1cb

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Reactions: 19
  • Comments: 63 (18 by maintainers)

Most upvoted comments

I'm also facing the same issue. Is there any solution, please? prometheus-adapter v0.9.1 and Kubernetes v1.21.5. Error: apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}

We had the same issue on AKS version 1.21.2 and Prometheus adapter version v0.9.0.

Part of the log

E0602 00:10:39.074807 1 writers.go:117] apiserver was unable to write a JSON response: http2: stream closed
I0602 00:10:39.074834 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="11.302014ms" userAgent="Go-http-client/2.0" audit-ID="4a2e6a4b-d1ef-4da0-a8c0-4e84c27615ce" srcIP="10.254.1.1:51580" resp=200
E0602 00:10:39.074845 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}: http2: stream closed
E0602 00:10:39.074880 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="4a2e6a4b-d1ef-4da0-a8c0-4e84c27615ce"
I0602 00:10:39.074973 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="11.675588ms" userAgent="Go-http-client/2.0" audit-ID="283e653f-f7e2-482a-b410-01b2a72664a2" srcIP="10.254.1.1:51580" resp=200
E0602 00:10:39.074992 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="283e653f-f7e2-482a-b410-01b2a72664a2"
I0602 00:10:39.075137 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="11.828874ms" userAgent="Go-http-client/2.0" audit-ID="0f98e845-f456-4f57-9563-ed3e913d86fd" srcIP="10.254.1.1:51580" resp=200
E0602 00:10:39.075166 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="0f98e845-f456-4f57-9563-ed3e913d86fd"
E0602 00:10:39.075202 1 writers.go:117] apiserver was unable to write a JSON response: http: Handler timeout
E0602 00:10:39.076366 1 writers.go:130] apiserver was unable to write a fallback JSON response: http2: stream closed
I0602 00:10:39.076592 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="11.676924ms" userAgent="Go-http-client/2.0" audit-ID="0cd76a97-36a0-483a-ba67-86e9f246930c" srcIP="10.254.1.1:51580" resp=200
E0602 00:10:39.076633 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="0cd76a97-36a0-483a-ba67-86e9f246930c"
E0602 00:10:39.076673 1 writers.go:117] apiserver was unable to write a JSON response: http: Handler timeout
E0602 00:10:39.082087 1 writers.go:117] apiserver was unable to write a JSON response: http: Handler timeout
E0602 00:10:39.084555 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
E0602 00:10:39.085661 1 writers.go:130] apiserver was unable to write a fallback JSON response: http: Handler timeout
I0602 00:10:39.085962 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="22.647887ms" userAgent="Go-http-client/2.0" audit-ID="25cda6c8-d26f-4513-afbd-e78e40a474c2" srcIP="10.254.1.1:51580" resp=200
E0602 00:10:39.085986 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="25cda6c8-d26f-4513-afbd-e78e40a474c2"
E0602 00:10:39.085991 1 writers.go:117] apiserver was unable to write a JSON response: http: Handler timeout
E0602 00:10:39.087391 1 timeout.go:135] post-timeout activity - time-elapsed: 12.495042ms, GET "/apis/custom.metrics.k8s.io/v1beta1" result: <nil>
E0602 00:10:39.089336 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
E0602 00:10:39.090424 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
E0602 00:10:39.091687 1 timeout.go:135] post-timeout activity - time-elapsed: 16.498247ms, GET "/apis/custom.metrics.k8s.io/v1beta1" result: <nil>
E0602 00:10:39.093678 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
E0602 00:10:39.095855 1 writers.go:130] apiserver was unable to write a fallback JSON response: http: Handler timeout
E0602 00:10:39.096944 1 writers.go:130] apiserver was unable to write a fallback JSON response: http: Handler timeout
E0602 00:10:39.099086 1 writers.go:130] apiserver was unable to write a fallback JSON response: http: Handler timeout
E0602 00:10:39.100362 1 timeout.go:135] post-timeout activity - time-elapsed: 23.701592ms, GET "/apis/custom.metrics.k8s.io/v1beta1" result: <nil>
E0602 00:10:39.101407 1 timeout.go:135] post-timeout activity - time-elapsed: 26.394414ms, GET "/apis/custom.metrics.k8s.io/v1beta1" result: <nil>
E0602 00:10:39.102476 1 timeout.go:135] post-timeout activity - time-elapsed: 16.47346ms, GET "/apis/custom.metrics.k8s.io/v1beta1" result: <nil>
I0602 00:10:40.817469 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1?timeout=32s" latency="20.563964ms" userAgent="kube-controller-manager/v1.21.8+ee73ea2 (linux/amd64) kubernetes/f7310cc/system:serviceaccount:kube-system:namespace-controller" audit-ID="e3baa381-91d4-439c-8d82-a99d88a1df8c" srcIP="10.254.0.1:41644" resp=200
I0602 00:10:41.268466 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1/namespaces/master-cd/pods/%2A/spring_cloud_stream_binder_kafka_offset_Rollover?labelSelector=app%3Dcustomerbill-acbr%2Capp.kubernetes.io%2Finstance%3Dcustomerbill-appliedcustomerbillingrate" latency="38.893951ms" userAgent="kube-controller-manager/v1.21.8+ee73ea2 (linux/amd64) kubernetes/f7310cc/system:serviceaccount:kube-system:horizontal-pod-autoscaler" audit-ID="f9e0fcd8-ddf2-4acc-b302-18d6e943c6d2" srcIP="10.254.0.1:41644" resp=404
I0602 00:10:41.321660 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1/namespaces/master-cd/pods/%2A/Latency_CycleRollOver_seconds?labelSelector=app%3Dcustomerbill-acbr%2Capp.kubernetes.io%2Finstance%3Dcustomerbill-appliedcustomerbillingrate" latency="40.062263ms" userAgent="kube-controller-manager/v1.21.8+ee73ea2 (linux/amd64) kubernetes/f7310cc/system:serviceaccount:kube-system:horizontal-pod-autoscaler" audit-ID="a1cb6b73-5469-44d2-86d1-0ee37dc94e69" srcIP="10.254.0.1:41644" resp=404
I0602 00:10:42.236036 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1?timeout=32s" latency="22.410863ms" userAgent="cluster-policy-controller/v0.0.0 (linux/amd64) kubernetes/$Format/system:serviceaccount:openshift-infra:resourcequota-controller" audit-ID="b70a86dd-eacd-46af-865e-78f35d09841e" srcIP="10.254.1.1:52016" resp=200
I0602 00:10:42.667459 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1/namespaces/rtb-eu-mexico-cd3/pods/%2A/spring_cloud_stream_binder_kafka_offset_Rollover?labelSelector=app%3Dcustomerbill-acbr%2Capp.kubernetes.io%2Finstance%3Dcustomerbill-appliedcustomerbillingrate" latency="30.923405ms" userAgent="kube-controller-manager/v1.21.8+ee73ea2 (linux/amd64) kubernetes/f7310cc/system:serviceaccount:kube-system:horizontal-pod-autoscaler" audit-ID="f5aba336-7b1d-4306-85b2-a3c98a7ff890" srcIP="10.254.0.1:41644" resp=404
I0602 00:10:42.709278 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1/namespaces/rtb-eu-mexico-cd3/pods/%2A/Latency_CycleRollOver_seconds?labelSelector=app%3Dcustomerbill-acbr%2Capp.kubernetes.io%2Finstance%3Dcustomerbill-appliedcustomerbillingrate" latency="38.221175ms" userAgent="kube-controller-manager/v1.21.8+ee73ea2 (linux/amd64) kubernetes/f7310cc/system:serviceaccount:kube-system:horizontal-pod-autoscaler" audit-ID="6d3abd85-815b-45af-9080-ca768c726440" srcIP="10.254.0.1:41644" resp=404
I0602 00:10:45.403819 1 httplog.go:104] "HTTP" verb="GET" URI="/healthz" latency="649.027µs" userAgent="kube-probe/1.21" audit-ID="8ed9dd10-05ad-476e-b13d-3cb309888042" srcIP="10.254.37.1:42658" resp=200
I0602 00:10:45.404423 1 httplog.go:104] "HTTP" verb="GET" URI="/healthz" latency="114.015µs" userAgent="kube-probe/1.21" audit-ID="8b1e5600-63fc-4c33-b353-44d0ba13e7d0" srcIP="10.254.37.1:42656" resp=200
I0602 00:10:46.470888 1 httplog.go:104] "HTTP" verb="GET" URI="/openapi/v2" latency="4.046499ms" userAgent="" audit-ID="4fde3621-6b22-4cdb-a673-eb8d91df1c29" srcIP="10.254.0.1:41644" resp=304
I0602 00:10:46.473486 1 httplog.go:104] "HTTP" verb="GET" URI="/openapi/v2" latency="293.028µs" userAgent="" audit-ID="78d36042-d770-416a-a6a6-44b1ff3bfcb6" srcIP="10.254.2.1:38716" resp=304
I0602 00:10:46.474318 1 httplog.go:104] "HTTP" verb="GET" URI="/openapi/v2" latency="267.16µs" userAgent="" audit-ID="a77d7bf1-db19-44b4-b5b3-9b25263e06b0" srcIP="10.254.1.1:52016" resp=304
I0602 00:10:47.050865 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1?timeout=32s" latency="19.743576ms" userAgent="cluster-policy-controller/v0.0.0 (linux/amd64) kubernetes/$Format/system:serviceaccount:openshift-infra:resourcequota-controller" audit-ID="ed12733c-6927-4efc-8488-7d8d3b4ecf8e" srcIP="10.254.1.1:52016" resp=200
I0602 00:10:47.571096 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1?timeout=32s" latency="18.400355ms" userAgent="kube-controller-manager/v1.21.8+ee73ea2 (linux/amd64) kubernetes/f7310cc/system:serviceaccount:kube-system:namespace-controller" audit-ID="f7ae71cd-769a-43c7-a539-6394a4994af9" srcIP="10.254.0.1:41644" resp=200
E0602 00:10:48.087168 1 writers.go:117] apiserver was unable to write a JSON response: http2: stream closed
E0602 00:10:48.087521 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}: http2: stream closed
I0602 00:10:48.087227 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="34.856573ms" userAgent="Go-http-client/2.0" audit-ID="51c28451-f700-46c3-9acf-760360f85929" srcIP="10.254.0.1:41636" resp=200
E0602 00:10:48.087570 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="51c28451-f700-46c3-9acf-760360f85929"
E0602 00:10:48.089399 1 writers.go:130] apiserver was unable to write a fallback JSON response: http2: stream closed
I0602 00:10:48.090544 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="38.178148ms" userAgent="Go-http-client/2.0" audit-ID="52b5d8e9-8f5f-4c59-96ae-60b842f32806" srcIP="10.254.0.1:41636" resp=200
E0602 00:10:48.090589 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="52b5d8e9-8f5f-4c59-96ae-60b842f32806"
E0602 00:10:48.090685 1 timeout.go:135] post-timeout activity - time-elapsed: 3.414355ms, GET "/apis/custom.metrics.k8s.io/v1beta1" result: <nil>
E0602 00:10:48.092758 1 writers.go:117] apiserver was unable to write a JSON response: http: Handler timeout
E0602 00:10:48.094491 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
E0602 00:10:48.181403 1 writers.go:130] apiserver was unable to write a fallback JSON response: http: Handler timeout

Please re-open the issue

@benjaminhuo @daniel-habib I had the same error because the Prometheus Adapter chart was creating default rules. You can avoid that by using this chart configuration:

# Default value is true
rules.default=false

Here is my configuration:

prometheus:
  url: http://prometheus-server.prometheus.svc
  port: 80

image:
  tag: v0.7.0

rbac:
  create: true

logLevel: 10

rules:

# This one is important and will remove the default rules.
  default: false
  external:
  - seriesQuery: '{__name__=~"^buildkite_queues_scheduled_jobs_count$"}'
    resources:
      overrides:
        job: {resource: "namespace"}
        queue: {resource: "pod"}
    name:
      matches: ""
      as: "buildkite_queues_scheduled_jobs_count"
    metricsQuery: sum(<<.Series>>{<<.LabelMatchers>>}) by (<<.GroupBy>>)

With this, the only configuration that gets created is:

cat /etc/adapter/config.yaml

externalRules:
- metricsQuery: sum(<<.Series>>{<<.LabelMatchers>>}) by (<<.GroupBy>>)
  name:
    as: buildkite_queues_scheduled_jobs_count
    matches: ""
  resources:
    overrides:
      job:
        resource: namespace
      queue:
        resource: pod
  seriesQuery: '{__name__=~"^buildkite_queues_scheduled_jobs_count$"}'
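For reference, values like the above can be applied when installing the chart; a minimal sketch, assuming the prometheus-community/prometheus-adapter Helm chart and that the values shown are saved as values.yaml:

# install/upgrade the adapter with the custom values (default rules disabled)
helm upgrade --install prometheus-adapter prometheus-community/prometheus-adapter \
  --namespace monitoring -f values.yaml
# rules.default=false can also be passed directly with: --set rules.default=false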

The same problem: K8s v1.26.3, Helm, Prometheus, prometheus-adapter. The metric is present in Prometheus, the adapter is running and shows the external metric, but no value. Logs:

I1005 07:54:13.175138 1 httplog.go:132] "HTTP" verb="LIST" URI="/apis/external.metrics.k8s.io/v1beta1/namespaces/default/kafka_topic_partition_current_offset" latency="3.342849ms" userAgent="kube-controller-manager/v1.26.3 (linux/amd64) kubernetes/f18584a/system:serviceaccount:kube-system:horizontal-pod-autoscaler" audit-ID="4e5d5ac2-0411-4a4d-aab4-108d56119aa3" srcIP="10.XX.XX.128:10332" resp=404
…
1005 07:54:19.321095 1 writers.go:122] apiserver was unable to write a JSON response: http2: stream closed
E1005 07:54:19.321115 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}: http2: stream closed
I1005 07:54:19.321209 1 panic.go:884] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="11.018902ms" userAgent="Go-http-client/2.0" audit-ID="65d37eb7-0d5d-4aff-865d-adcc5349c9b4" srcIP="10.XX.XX.128:5200" resp=200
E1005 07:54:19.321233 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="65d37eb7-0d5d-4aff-865d-adcc5349c9b4"
E1005 07:54:19.321509 1 writers.go:122] apiserver was unable to write a JSON response: http2: stream closed
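To check whether the adapter actually returns a value for that external metric (a hedged suggestion, reusing the same URI that appears in the log above), the external metrics API can be queried directly:

# a 404 here, as in the log, means no adapter rule/series matches the metric name
kubectl get --raw "/apis/external.metrics.k8s.io/v1beta1/namespaces/default/kafka_topic_partition_current_offset"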

Running v0.8.3 and I bumped into this issue too

E0201 21:20:16.900344       1 writers.go:107] apiserver was unable to write a JSON response: http2: stream closed
E0201 21:20:16.902991       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0201 21:20:16.905160       1 writers.go:120] apiserver was unable to write a fallback JSON response: http2: stream closed
E0201 21:20:16.905531       1 writers.go:107] apiserver was unable to write a JSON response: http2: stream closed
E0201 21:20:16.907333       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0201 21:20:16.908808       1 writers.go:120] apiserver was unable to write a fallback JSON response: http2: stream closed
E0201 21:20:46.912815       1 writers.go:107] apiserver was unable to write a JSON response: http2: stream closed
E0201 21:20:46.912848       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0201 21:20:46.913995       1 writers.go:101] apiserver was unable to close cleanly the response writer: http2: stream closed
E0201 21:20:46.914030       1 writers.go:120] apiserver was unable to write a fallback JSON response: http2: stream closed
E0201 21:20:46.914354       1 writers.go:101] apiserver was unable to close cleanly the response writer: http2: stream closed
E0201 21:20:46.915218       1 writers.go:101] apiserver was unable to close cleanly the response writer: http2: stream closed
E0201 21:20:46.916360       1 writers.go:101] apiserver was unable to close cleanly the response writer: http2: stream closed
E0201 21:20:46.916582       1 writers.go:101] apiserver was unable to close cleanly the response writer: http2: stream closed

Still happening on version 0.10.0 with K8s api-server 1.23.5

Same here: DigitalOcean managed Kubernetes 1.23, prometheus-adapter v0.10.0.

@brancz @DirectXMan12 @daniel-habib Sorry for the late response. I ran some tests using kube-prometheus. The tests were done on k8s v1.16.7, with the same problem on k8s v1.17:

  • If the kube-prometheus stack is deployed in the default namespace, which is monitoring, even with custom metrics enabled (import 'kube-prometheus/kube-prometheus-custom-metrics.libsonnet'), there is no panic and no error like "apiserver was unable to write a JSON response: http2: stream closed".
  • If the kube-prometheus stack is deployed in a different namespace by changing examples/kustomize.jsonnet without enabling custom metrics, everything is fine. No panic and no error logs.
  • If the kube-prometheus stack is deployed in a different namespace with custom metrics enabled (import 'kube-prometheus/kube-prometheus-custom-metrics.libsonnet'), then there is a panic and error messages like the ones below. And if I change the prometheus-adapter image from v0.7.0 to v0.6.0, the panic goes away and only error logs like "apiserver was unable to write a JSON response: http2: stream closed" remain.

So to me it seems that this is related to the prometheus-adapter custom metrics functionality and the namespace. I am wondering if there is something special about the default monitoring namespace?
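One way to see which namespace and Service the aggregated API actually points at (a quick check, not taken from the original thread) is to inspect the registered APIService object:

# .spec.service.namespace and .spec.service.name must match where the adapter Service
# was really deployed, otherwise the aggregated discovery requests fail
kubectl get apiservice v1beta1.custom.metrics.k8s.io -o yaml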

@daniel-habib did you deploy prometheus adapter in a different namespace instead of the default monitoring namespace?

I0601 08:03:05.113483       1 shared_informer.go:204] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file 
I0601 08:03:05.113483       1 shared_informer.go:204] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file 
E0601 08:03:23.519665       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0601 08:03:23.519722       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0601 08:03:23.521181       1 writers.go:118] apiserver was unable to write a fallback JSON response: http2: stream closed
E0601 08:03:23.526186       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0601 08:03:23.527525       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0601 08:03:23.527708       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0601 08:03:23.528301       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0601 08:03:23.528639       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0601 08:03:23.528732       1 writers.go:118] apiserver was unable to write a fallback JSON response: http2: stream closed
E0601 08:03:23.528970       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0601 08:03:23.529429       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0601 08:03:23.530499       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0601 08:03:23.530852       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0601 08:03:23.531980       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0601 08:03:23.533048       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0601 08:03:23.533268       1 writers.go:105] apiserver was unable to write a JSON response: http2: stream closed
E0601 08:03:23.534207       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0601 08:03:23.535325       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0601 08:03:23.536514       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}
E0601 08:03:23.537787       1 runtime.go:76] Observed a panic: runtime error: invalid memory address or nil pointer dereference
goroutine 698 [running]:
k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1.1(0xc0003a3b00)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/server/filters/timeout.go:108 +0x107
panic(0x1861bc0, 0x2a840f0)
	/usr/lib/golang/src/runtime/panic.go:679 +0x1b2
compress/gzip.(*Writer).Write(0xc000218dc0, 0xc003384100, 0x71, 0x71, 0x30, 0x1920b00, 0xc0009bad01)
	/usr/lib/golang/src/compress/gzip/gzip.go:168 +0x237
k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc00043c3c0, 0xc003384100, 0x71, 0x71, 0xc003384100, 0x71, 0x71)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/handlers/responsewriters/writers.go:182 +0x54e
k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x1ab3a63, 0x10, 0x7f32b6596520, 0xc000471400, 0x7f32b6596490, 0xc0004b6e20, 0xc0002b0100, 0xc8, 0x1d597a0, 0xc00043c2d0)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/handlers/responsewriters/writers.go:117 +0x389
k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x1d8e6e0, 0xc00054ecc0, 0x1d8e920, 0x2ac4d18, 0x0, 0x0, 0x0, 0x0, 0x7f32b6596490, 0xc0004b6e20, ...)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/handlers/responsewriters/writers.go:251 +0x555
k8s.io/apiserver/pkg/endpoints/discovery.(*APIVersionHandler).ServeHTTP(0xc0002ee9c0, 0x7f32b6596490, 0xc0004b6e20, 0xc0002b0100)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/discovery/version.go:81 +0x18f
k8s.io/apiserver/pkg/endpoints/discovery.(*APIVersionHandler).handle(...)
	/home/sur/src/redhat/go/pkg/mod/k8s.io/apiserver@v0.17.3/pkg/endpoints/discovery/version.go:77
github.com/emicklei/go-restful.(*Container).dispatch(0xc00067d4d0, 0x7f32b6596490, 0xc0004b6e20, 0xc0002b0100)
	/home/sur/src/redhat/go/pkg/mod/github.com/emicklei/go-restful@v2.9.5+incompatible/container.go:288 +0xa4f
github.com/emicklei/go-restful.(*Container).Dispatch(...)
	/home/sur/src/redhat/go/pkg/mod/github.com/emicklei/go-restful@v2.9.5+incompatible/container.go:199
k8s.io/apiserver/pkg/server.director.ServeHTTP(0x1ac198d, 0x1a, 0xc00067d4d0, 0xc0004cc690, 0x7f32b6596490, 0xc0004b6e20, 0xc0002b0100)

I am seeing this issue in prometheus adapter v0.8.4 with Kubernetes API server 1.20.

@s-urbaniak We also had the same issue with v0.8.2 on K8s 1.19, and after seeing this ticket we updated to v0.8.3, but we still have the issue. Here is a stack trace from the pod logs:

E0331 17:15:16.060777       1 writers.go:107] apiserver was unable to write a JSON response: http: Handler timeout
E0331 17:15:16.060820       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}
E0331 17:15:16.071111       1 writers.go:120] apiserver was unable to write a fallback JSON response: http: Handler timeout
I0331 17:15:16.075749       1 trace.go:205] Trace[42445098]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/gtp-microdemo/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 16:59:31.654) (total time: 944421ms):
Trace[42445098]: ---"Listing from storage done" 944406ms (17:15:00.060)
Trace[42445098]: [15m44.42162848s] [15m44.42162848s] END
I0331 17:15:24.200171       1 trace.go:205] Trace[1886749641]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh-ingestion/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:15:22.204) (total time: 1995ms):
Trace[1886749641]: ---"Listing from storage done" 1995ms (17:15:00.200)
Trace[1886749641]: [1.995692268s] [1.995692268s] END
I0331 17:16:28.283211       1 trace.go:205] Trace[1252433410]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/gtp-microdemo/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:16:27.416) (total time: 866ms):
Trace[1252433410]: ---"Listing from storage done" 866ms (17:16:00.283)
Trace[1252433410]: [866.604453ms] [866.604453ms] END
I0331 17:16:28.782621       1 trace.go:205] Trace[263673455]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/monitoring/pods,user-agent:kubectl/v1.19.2 (darwin/amd64) kubernetes/f574309,client:10.10.102.99 (31-Mar-2021 17:16:27.933) (total time: 849ms):
Trace[263673455]: ---"Listing from storage done" 848ms (17:16:00.781)
Trace[263673455]: [849.331452ms] [849.331452ms] END
I0331 17:16:57.976854       1 trace.go:205] Trace[654156552]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/istio-system/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:16:57.471) (total time: 504ms):
Trace[654156552]: ---"Listing from storage done" 504ms (17:16:00.976)
Trace[654156552]: [504.865599ms] [504.865599ms] END
I0331 17:16:59.765504       1 trace.go:205] Trace[523023790]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh-ingestion/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:16:59.174) (total time: 590ms):
Trace[523023790]: ---"Listing from storage done" 590ms (17:16:00.765)
Trace[523023790]: [590.850447ms] [590.850447ms] END
I0331 17:16:59.976617       1 trace.go:205] Trace[616592834]: "List" url:/apis/metrics.k8s.io/v1beta1/pods,user-agent:popeye/v0.0.0 (linux/amd64) kubernetes/$Format,client:10.10.102.101 (31-Mar-2021 17:16:59.076) (total time: 900ms):
Trace[616592834]: ---"Listing from storage done" 890ms (17:16:00.966)
Trace[616592834]: [900.481732ms] [900.481732ms] END
E0331 17:18:12.196488       1 writers.go:107] apiserver was unable to write a JSON response: http: Handler timeout
E0331 17:18:12.196504       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}
E0331 17:18:12.235174       1 writers.go:120] apiserver was unable to write a fallback JSON response: http: Handler timeout
I0331 17:18:12.280018       1 trace.go:205] Trace[1668894547]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/monitoring/pods,user-agent:k9s/v0.0.0 (linux/amd64) kubernetes/$Format,client:10.10.101.97 (31-Mar-2021 17:02:27.173) (total time: 945078ms):
Trace[1668894547]: ---"Listing from storage done" 945022ms (17:18:00.196)
Trace[1668894547]: [15m45.078843727s] [15m45.078843727s] END
I0331 17:19:34.186849       1 trace.go:205] Trace[1396407138]: "List" url:/apis/metrics.k8s.io/v1beta1/pods,user-agent:gatekeeper/v3.3.0 (linux/amd64) 201a78d/2021-01-28T03:05:52Z,client:10.10.102.95 (31-Mar-2021 17:19:32.535) (total time: 1651ms):
Trace[1396407138]: ---"Listing from storage done" 1642ms (17:19:00.177)
Trace[1396407138]: [1.651356299s] [1.651356299s] END
I0331 17:20:03.259899       1 trace.go:205] Trace[1784165450]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/gtp-microdemo/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:20:02.275) (total time: 984ms):
Trace[1784165450]: ---"Listing from storage done" 984ms (17:20:00.259)
Trace[1784165450]: [984.184748ms] [984.184748ms] END
I0331 17:22:44.763736       1 trace.go:205] Trace[131303228]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/gtp-microdemo/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:22:43.483) (total time: 1280ms):
Trace[131303228]: ---"Listing from storage done" 1280ms (17:22:00.763)
Trace[131303228]: [1.280376113s] [1.280376113s] END
I0331 17:22:44.777457       1 trace.go:205] Trace[524107899]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/monitoring/pods,user-agent:kubectl/v1.19.2 (darwin/amd64) kubernetes/f574309,client:10.10.101.97 (31-Mar-2021 17:22:43.393) (total time: 1384ms):
Trace[524107899]: ---"Listing from storage done" 1383ms (17:22:00.776)
Trace[524107899]: [1.384360786s] [1.384360786s] END
I0331 17:23:45.265705       1 trace.go:205] Trace[241829995]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/monitoring/pods,user-agent:k9s/v0.0.0 (darwin/amd64) kubernetes/$Format,client:10.10.102.99 (31-Mar-2021 17:23:43.832) (total time: 1433ms):
Trace[241829995]: ---"Listing from storage done" 1432ms (17:23:00.264)
Trace[241829995]: [1.433295225s] [1.433295225s] END
I0331 17:24:43.877158       1 trace.go:205] Trace[1732416884]: "List" url:/apis/metrics.k8s.io/v1beta1/pods,user-agent:gatekeeper/v3.3.0 (linux/amd64) 201a78d/2021-01-28T03:05:52Z,client:10.10.102.95 (31-Mar-2021 17:24:42.831) (total time: 1045ms):
Trace[1732416884]: ---"Listing from storage done" 1034ms (17:24:00.866)
Trace[1732416884]: [1.045716597s] [1.045716597s] END
I0331 17:25:18.161790       1 trace.go:205] Trace[1497350969]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/gtp-microdemo/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:25:17.377) (total time: 784ms):
Trace[1497350969]: ---"Listing from storage done" 783ms (17:25:00.161)
Trace[1497350969]: [784.043687ms] [784.043687ms] END
I0331 17:25:18.859718       1 trace.go:205] Trace[1090184026]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/istio-system/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:25:18.174) (total time: 685ms):
Trace[1090184026]: ---"Listing from storage done" 685ms (17:25:00.859)
Trace[1090184026]: [685.195566ms] [685.195566ms] END
I0331 17:26:24.024595       1 trace.go:205] Trace[246303844]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh-ingestion/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:26:22.627) (total time: 1397ms):
Trace[246303844]: ---"Listing from storage done" 1396ms (17:26:00.024)
Trace[246303844]: [1.397010926s] [1.397010926s] END
I0331 17:26:47.941322       1 trace.go:205] Trace[187874685]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:26:47.322) (total time: 619ms):
Trace[187874685]: ---"Listing from storage done" 618ms (17:26:00.940)
Trace[187874685]: [619.259002ms] [619.259002ms] END
I0331 17:27:14.922687       1 trace.go:205] Trace[242867928]: "List" url:/apis/metrics.k8s.io/v1beta1/pods,user-agent:popeye/v0.0.0 (linux/amd64) kubernetes/$Format,client:10.10.102.101 (31-Mar-2021 17:27:14.345) (total time: 576ms):
Trace[242867928]: ---"Listing from storage done" 563ms (17:27:00.909)
Trace[242867928]: [576.927576ms] [576.927576ms] END
I0331 17:28:20.211666       1 trace.go:205] Trace[1215380840]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:28:18.183) (total time: 2028ms):
Trace[1215380840]: ---"Listing from storage done" 2028ms (17:28:00.211)
Trace[1215380840]: [2.028598026s] [2.028598026s] END
I0331 17:29:24.915566       1 trace.go:205] Trace[1609092681]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:29:23.858) (total time: 1057ms):
Trace[1609092681]: ---"Listing from storage done" 1057ms (17:29:00.915)
Trace[1609092681]: [1.05713702s] [1.05713702s] END
I0331 17:29:25.967903       1 trace.go:205] Trace[467415965]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:29:24.934) (total time: 1033ms):
Trace[467415965]: ---"Listing from storage done" 1033ms (17:29:00.967)
Trace[467415965]: [1.033591807s] [1.033591807s] END
I0331 17:29:27.023691       1 trace.go:205] Trace[1826137458]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/gtp-microdemo/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:29:25.998) (total time: 1024ms):
Trace[1826137458]: ---"Listing from storage done" 1024ms (17:29:00.023)
Trace[1826137458]: [1.024849876s] [1.024849876s] END
I0331 17:29:30.063713       1 trace.go:205] Trace[1758491021]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:29:27.040) (total time: 3022ms):
Trace[1758491021]: ---"Listing from storage done" 3022ms (17:29:00.063)
Trace[1758491021]: [3.022757294s] [3.022757294s] END
I0331 17:29:33.103677       1 trace.go:205] Trace[1511053346]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:29:30.081) (total time: 3022ms):
Trace[1511053346]: ---"Listing from storage done" 3022ms (17:29:00.103)
Trace[1511053346]: [3.022594816s] [3.022594816s] END
I0331 17:29:36.175544       1 trace.go:205] Trace[1829734186]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:29:33.128) (total time: 3047ms):
Trace[1829734186]: ---"Listing from storage done" 3047ms (17:29:00.175)
Trace[1829734186]: [3.047347074s] [3.047347074s] END
I0331 17:29:36.979417       1 trace.go:205] Trace[1423252229]: "List" url:/apis/metrics.k8s.io/v1beta1/nodes,user-agent:k9s/v0.0.0 (darwin/amd64) kubernetes/$Format,client:10.10.101.97 (31-Mar-2021 17:29:29.816) (total time: 7162ms):
Trace[1423252229]: [7.162527934s] [7.162527934s] END
E0331 17:29:36.979839       1 runtime.go:76] Observed a panic: runtime error: index out of range [0] with length 0
goroutine 99430 [running]:
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1.1(0xc003ffcf60)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:106 +0x113
panic(0x1c7caa0, 0xc00161f8e0)
	/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/directxman12/k8s-prometheus-adapter/vendor/sigs.k8s.io/metrics-server/pkg/api.(*nodeMetrics).getNodeMetrics(0xc0000eca00, 0xc00356e700, 0x18, 0x20, 0x18, 0x20, 0x0, 0x0, 0xaa2ae7)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/sigs.k8s.io/metrics-server/pkg/api/node.go:212 +0x51c
github.com/directxman12/k8s-prometheus-adapter/vendor/sigs.k8s.io/metrics-server/pkg/api.(*nodeMetrics).List(0xc0000eca00, 0x20538a0, 0xc0035fbda0, 0xc0026f1050, 0x0, 0x0, 0x202a220, 0xc0026f1050)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/sigs.k8s.io/metrics-server/pkg/api/node.go:93 +0x38a
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ListResource.func1(0x20484a0, 0xc00011b0a0, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/handlers/get.go:277 +0xfaf
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints.restfulListResource.func1(0xc0035fbce0, 0xc0035dc1c0)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1190 +0x91
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc0035fbce0, 0xc0035dc1c0)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:449 +0x2d5
github.com/directxman12/k8s-prometheus-adapter/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0002d94d0, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/github.com/emicklei/go-restful/container.go:294 +0x65a
github.com/directxman12/k8s-prometheus-adapter/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/github.com/emicklei/go-restful/container.go:204
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x1db05ae, 0x1a, 0xc0002d94d0, 0xc000098770, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x5de
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x165
net/http.HandlerFunc.ServeHTTP(0xc000335f80, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x59a
net/http.HandlerFunc.ServeHTTP(0xc000987c00, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186
net/http.HandlerFunc.ServeHTTP(0xc000987c40, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.WithMaxInFlightLimit.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:184 +0x4cf
net/http.HandlerFunc.ServeHTTP(0xc000335fb0, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x165
net/http.HandlerFunc.ServeHTTP(0xc000406000, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x23dd
net/http.HandlerFunc.ServeHTTP(0xc000987c80, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186
net/http.HandlerFunc.ServeHTTP(0xc000987cc0, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x165
net/http.HandlerFunc.ServeHTTP(0xc000406030, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186
net/http.HandlerFunc.ServeHTTP(0xc000987d00, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x165
net/http.HandlerFunc.ServeHTTP(0xc000406090, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e500)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x6d2
net/http.HandlerFunc.ServeHTTP(0xc000903ef0, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e500)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e400)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38a
net/http.HandlerFunc.ServeHTTP(0xc000987d40, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e400)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc003ffcf60, 0xc00058c660, 0x2053f60, 0xc00011b098, 0xc00356e400)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:111 +0xb8
created by github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:97 +0x1cc

goroutine 99429 [running]:
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apimachinery/pkg/util/runtime.logPanic(0x1a98d40, 0xc000fc03c0)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:74 +0x95
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0xc0033a7c98, 0x1, 0x1)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:48 +0x89
panic(0x1a98d40, 0xc000fc03c0)
	/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP(0xc00058c660, 0x2048620, 0xc0035dc150, 0xc00356e400)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:117 +0x448
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.WithWaitGroup.func1(0x2048620, 0xc0035dc150, 0xc00356e300)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/waitgroup.go:59 +0x137
net/http.HandlerFunc.ServeHTTP(0xc0004060c0, 0x2048620, 0xc0035dc150, 0xc00356e300)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithRequestInfo.func1(0x2048620, 0xc0035dc150, 0xc00356e200)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/requestinfo.go:39 +0x269
net/http.HandlerFunc.ServeHTTP(0xc000406120, 0x2048620, 0xc0035dc150, 0xc00356e200)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithWarningRecorder.func1(0x2048620, 0xc0035dc150, 0xc00356e100)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/warning.go:35 +0x1a7
net/http.HandlerFunc.ServeHTTP(0xc00058c680, 0x2048620, 0xc0035dc150, 0xc00356e100)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithCacheControl.func1(0x2048620, 0xc0035dc150, 0xc00356e100)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/cachecontrol.go:31 +0xa8
net/http.HandlerFunc.ServeHTTP(0xc00058c6c0, 0x2048620, 0xc0035dc150, 0xc00356e100)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.withRequestReceivedTimestampWithClock.func1(0x2048620, 0xc0035dc150, 0xc00356e000)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/request_received_time.go:38 +0x1a7
net/http.HandlerFunc.ServeHTTP(0xc000406150, 0x2048620, 0xc0035dc150, 0xc00356e000)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/httplog.WithLogging.func1(0x203b3a0, 0xc00011b088, 0xc001829f00)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/httplog/httplog.go:91 +0x322
net/http.HandlerFunc.ServeHTTP(0xc00058c720, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.withPanicRecovery.func1(0x203b3a0, 0xc00011b088, 0xc001829f00)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/wrap.go:70 +0xe6
net/http.HandlerFunc.ServeHTTP(0xc00058c760, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server.(*APIServerHandler).ServeHTTP(0xc000406180, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/handler.go:189 +0x51
net/http.serverHandler.ServeHTTP(0xc0008dc000, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/usr/local/go/src/net/http/server.go:2843 +0xa3
net/http.initALPNRequest.ServeHTTP(0x20538a0, 0xc003bb74d0, 0xc000a82380, 0xc0008dc000, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/usr/local/go/src/net/http/server.go:3415 +0x8d
github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2.(*serverConn).runHandler(0xc000b2a780, 0xc00011b088, 0xc001829f00, 0xc0024f0fa0)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2/server.go:2152 +0x8b
created by github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2.(*serverConn).processHeaders
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2/server.go:1882 +0x505
E0331 17:29:36.979876       1 wrap.go:58] apiserver panic'd on GET /apis/metrics.k8s.io/v1beta1/nodes?limit=1
http2: panic serving 192.168.19.128:58779: runtime error: index out of range [0] with length 0
goroutine 99430 [running]:
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1.1(0xc003ffcf60)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:106 +0x113
panic(0x1c7caa0, 0xc00161f8e0)
	/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/directxman12/k8s-prometheus-adapter/vendor/sigs.k8s.io/metrics-server/pkg/api.(*nodeMetrics).getNodeMetrics(0xc0000eca00, 0xc00356e700, 0x18, 0x20, 0x18, 0x20, 0x0, 0x0, 0xaa2ae7)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/sigs.k8s.io/metrics-server/pkg/api/node.go:212 +0x51c
github.com/directxman12/k8s-prometheus-adapter/vendor/sigs.k8s.io/metrics-server/pkg/api.(*nodeMetrics).List(0xc0000eca00, 0x20538a0, 0xc0035fbda0, 0xc0026f1050, 0x0, 0x0, 0x202a220, 0xc0026f1050)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/sigs.k8s.io/metrics-server/pkg/api/node.go:93 +0x38a
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ListResource.func1(0x20484a0, 0xc00011b0a0, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/handlers/get.go:277 +0xfaf
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints.restfulListResource.func1(0xc0035fbce0, 0xc0035dc1c0)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1190 +0x91
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc0035fbce0, 0xc0035dc1c0)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:449 +0x2d5
github.com/directxman12/k8s-prometheus-adapter/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0002d94d0, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/github.com/emicklei/go-restful/container.go:294 +0x65a
github.com/directxman12/k8s-prometheus-adapter/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/github.com/emicklei/go-restful/container.go:204
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x1db05ae, 0x1a, 0xc0002d94d0, 0xc000098770, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x5de
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x165
net/http.HandlerFunc.ServeHTTP(0xc000335f80, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x59a
net/http.HandlerFunc.ServeHTTP(0xc000987c00, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186
net/http.HandlerFunc.ServeHTTP(0xc000987c40, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.WithMaxInFlightLimit.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:184 +0x4cf
net/http.HandlerFunc.ServeHTTP(0xc000335fb0, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x165
net/http.HandlerFunc.ServeHTTP(0xc000406000, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x23dd
net/http.HandlerFunc.ServeHTTP(0xc000987c80, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186
net/http.HandlerFunc.ServeHTTP(0xc000987cc0, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x165
net/http.HandlerFunc.ServeHTTP(0xc000406030, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186
net/http.HandlerFunc.ServeHTTP(0xc000987d00, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x165
net/http.HandlerFunc.ServeHTTP(0xc000406090, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e600)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e500)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x6d2
net/http.HandlerFunc.ServeHTTP(0xc000903ef0, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e500)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f7a69cf6748, 0xc00011b098, 0xc00356e400)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38a
net/http.HandlerFunc.ServeHTTP(0xc000987d40, 0x7f7a69cf6748, 0xc00011b098, 0xc00356e400)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc003ffcf60, 0xc00058c660, 0x2053f60, 0xc00011b098, 0xc00356e400)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:111 +0xb8
created by github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:97 +0x1cc

goroutine 99429 [running]:
github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2.(*serverConn).runHandler.func1(0xc00011b088, 0xc0033a7f8e, 0xc000b2a780)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2/server.go:2145 +0x16f
panic(0x1a98d40, 0xc000fc03c0)
	/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0xc0033a7c98, 0x1, 0x1)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:55 +0x10c
panic(0x1a98d40, 0xc000fc03c0)
	/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP(0xc00058c660, 0x2048620, 0xc0035dc150, 0xc00356e400)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:117 +0x448
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.WithWaitGroup.func1(0x2048620, 0xc0035dc150, 0xc00356e300)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/waitgroup.go:59 +0x137
net/http.HandlerFunc.ServeHTTP(0xc0004060c0, 0x2048620, 0xc0035dc150, 0xc00356e300)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithRequestInfo.func1(0x2048620, 0xc0035dc150, 0xc00356e200)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/requestinfo.go:39 +0x269
net/http.HandlerFunc.ServeHTTP(0xc000406120, 0x2048620, 0xc0035dc150, 0xc00356e200)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithWarningRecorder.func1(0x2048620, 0xc0035dc150, 0xc00356e100)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/warning.go:35 +0x1a7
net/http.HandlerFunc.ServeHTTP(0xc00058c680, 0x2048620, 0xc0035dc150, 0xc00356e100)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithCacheControl.func1(0x2048620, 0xc0035dc150, 0xc00356e100)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/cachecontrol.go:31 +0xa8
net/http.HandlerFunc.ServeHTTP(0xc00058c6c0, 0x2048620, 0xc0035dc150, 0xc00356e100)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters.withRequestReceivedTimestampWithClock.func1(0x2048620, 0xc0035dc150, 0xc00356e000)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/endpoints/filters/request_received_time.go:38 +0x1a7
net/http.HandlerFunc.ServeHTTP(0xc000406150, 0x2048620, 0xc0035dc150, 0xc00356e000)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/httplog.WithLogging.func1(0x203b3a0, 0xc00011b088, 0xc001829f00)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/httplog/httplog.go:91 +0x322
net/http.HandlerFunc.ServeHTTP(0xc00058c720, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters.withPanicRecovery.func1(0x203b3a0, 0xc00011b088, 0xc001829f00)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/filters/wrap.go:70 +0xe6
net/http.HandlerFunc.ServeHTTP(0xc00058c760, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/usr/local/go/src/net/http/server.go:2042 +0x44
github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server.(*APIServerHandler).ServeHTTP(0xc000406180, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/k8s.io/apiserver/pkg/server/handler.go:189 +0x51
net/http.serverHandler.ServeHTTP(0xc0008dc000, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/usr/local/go/src/net/http/server.go:2843 +0xa3
net/http.initALPNRequest.ServeHTTP(0x20538a0, 0xc003bb74d0, 0xc000a82380, 0xc0008dc000, 0x203b3a0, 0xc00011b088, 0xc001829f00)
	/usr/local/go/src/net/http/server.go:3415 +0x8d
github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2.(*serverConn).runHandler(0xc000b2a780, 0xc00011b088, 0xc001829f00, 0xc0024f0fa0)
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2/server.go:2152 +0x8b
created by github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2.(*serverConn).processHeaders
	/go/src/github.com/directxman12/k8s-prometheus-adapter/vendor/golang.org/x/net/http2/server.go:1882 +0x505
I0331 17:29:39.247672       1 trace.go:205] Trace[1791678536]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/gtp-microdemo/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:29:36.218) (total time: 3029ms):
Trace[1791678536]: ---"Listing from storage done" 3028ms (17:29:00.247)
Trace[1791678536]: [3.029111942s] [3.029111942s] END
I0331 17:29:40.278318       1 trace.go:205] Trace[1859038666]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/gtp-microdemo/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:29:39.264) (total time: 1014ms):
Trace[1859038666]: ---"Listing from storage done" 1013ms (17:29:00.278)
Trace[1859038666]: [1.01408658s] [1.01408658s] END
I0331 17:29:40.279599       1 trace.go:205] Trace[1556301804]: "List" url:/apis/metrics.k8s.io/v1beta1/nodes,user-agent:k9s/v0.0.0 (linux/amd64) kubernetes/$Format,client:10.10.101.97 (31-Mar-2021 17:29:36.748) (total time: 3531ms):
Trace[1556301804]: ---"Listing from storage done" 3530ms (17:29:00.279)
Trace[1556301804]: [3.53102746s] [3.53102746s] END
I0331 17:30:28.489485       1 trace.go:205] Trace[1653574570]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/gtp-microdemo/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:30:27.688) (total time: 801ms):
Trace[1653574570]: ---"Listing from storage done" 801ms (17:30:00.489)
Trace[1653574570]: [801.10832ms] [801.10832ms] END
I0331 17:33:13.093205       1 trace.go:205] Trace[559269233]: "List" url:/apis/metrics.k8s.io/v1beta1/pods,user-agent:gatekeeper/v3.3.0 (linux/amd64) 201a78d/2021-01-28T03:05:52Z,client:10.10.102.95 (31-Mar-2021 17:33:12.365) (total time: 727ms):
Trace[559269233]: ---"Listing from storage done" 717ms (17:33:00.083)
Trace[559269233]: [727.412719ms] [727.412719ms] END
I0331 17:35:28.459511       1 trace.go:205] Trace[1745800610]: "List" url:/apis/metrics.k8s.io/v1beta1/namespaces/tinyeh/pods,user-agent:kube-controller-manager/v1.19.8 (linux/amd64) kubernetes/fd5d415/system:serviceaccount:kube-system:horizontal-pod-autoscaler,client:10.10.103.142 (31-Mar-2021 17:35:27.079) (total time: 1380ms):
Trace[1745800610]: ---"Listing from storage done" 1380ms (17:35:00.459)
Trace[1745800610]: [1.380278869s] [1.380278869s] END


It would be nice to reopen the issue if possible.

Note: I do not know if it is relevant, but we installed it with kube-prometheus release-0.7 and customized the prometheus-adapter to point to v0.8.3 with jsonnet.

@stafot thank you! Yes, it seems that this is the same issue, and it should be fixed (famous last words) in the next bump of the apimachinery dependencies.

Hello, sorry, but we see the same error on v0.9.3 with Kubernetes 1.21 and 1.22.

At pod restart there are no errors and pod CPU utilization is fine, but after a few minutes the errors start again and the pod begins spiking CPU regularly. These spikes are entirely attributable to the adapter. Is anyone else seeing this behavior?

Still happening on version 0.10.0 with Azure AKS running Kubernetes 1.24.6.

As I have mentioned multiple times in the past, this issue will remain closed because the original problem (the panic) was fixed. If anyone is interested in having the spammy logs investigated, feel free to open another issue, but I don't think we have the resources to investigate it today.

We are experiencing this error as well on EKS 1.21 with prometheus-adapter:v0.9.1. Also, we have to allocate a huge amount of memory (4G in our production environment) for each prometheus-adapter pod to avoid OOMs. Not sure if this error has anything to do with that.

Same here. This is on OVH-managed Kubernetes.

I0621 20:13:07.663059 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="22.508197ms" userAgent="Go-http-client/2.0" audit-ID="ccebd973-4fbf-4885-bfb2-c4e9812f0827" srcIP="10.2.0.0:47978" resp=200
I0621 20:13:07.675854 1 httplog.go:104] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="39.416013ms" userAgent="Go-http-client/2.0" audit-ID="64db2e56-39e1-41a8-84cf-aca4927b270f" srcIP="10.2.0.0:47978" resp=200
E0621 20:13:07.677675 1 writers.go:117] apiserver was unable to write a JSON response: http2: stream closed
E0621 20:13:07.677718 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}: http2: stream closed
I0621 20:13:07.677751 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="23.814641ms" userAgent="Go-http-client/2.0" audit-ID="f5880006-ea7c-4853-890c-34d4fb246f04" srcIP="10.2.0.0:47978" resp=200
E0621 20:13:07.677773 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="f5880006-ea7c-4853-890c-34d4fb246f04"
E0621 20:13:07.677987 1 writers.go:117] apiserver was unable to write a JSON response: http2: stream closed
I0621 20:13:07.678005 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="31.430253ms" userAgent="Go-http-client/2.0" audit-ID="791069d5-72f1-4a9a-bcd3-d1fd8a1d06f6" srcIP="10.2.0.0:47978" resp=200
E0621 20:13:07.678015 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="791069d5-72f1-4a9a-bcd3-d1fd8a1d06f6"
E0621 20:13:07.679133 1 writers.go:130] apiserver was unable to write a fallback JSON response: http2: stream closed
E0621 20:13:07.680319 1 timeout.go:135] post-timeout activity - time-elapsed: 2.51136ms, GET "/apis/custom.metrics.k8s.io/v1beta1" result: <nil>
E0621 20:13:07.682498 1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http2: stream closed"}: http2: stream closed
I0621 20:13:07.688191 1 panic.go:965] "HTTP" verb="GET" URI="/apis/custom.metrics.k8s.io/v1beta1" latency="40.195178ms" userAgent="Go-http-client/2.0" audit-ID="349af75e-dcb4-49b3-8eb3-39d032607b8e" srcIP="10.2.0.0:47978" resp=200
E0621 20:13:07.688209 1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/custom.metrics.k8s.io/v1beta1" audit-ID="349af75e-dcb4-49b3-8eb3-39d032607b8e"
E0621 20:13:07.688228 1 writers.go:130] apiserver was unable to write a fallback JSON response: http2: stream closed
E0621 20:13:07.689398 1 timeout.go:135] post-timeout activity - time-elapsed: 11.367445ms, GET "/apis/custom.metrics.k8s.io/v1beta1" result: <nil>
E0621 20:13:07.689775 1 writers.go:111] apiserver was unable to close cleanly the response writer: http: Handler timeout
E0621 20:13:07.692788 1 timeout.go:135] post-timeout activity - time-elapsed: 4.544575ms, GET "/apis/custom.metrics.k8s.io/v1beta1" result: <nil>

Still happening on v0.10.0 (Helm chart v4.2.0) with a kubeadm-managed cluster on Kubernetes v1.24.15.

The solution proposed here does remove the error from the logs, but kubectl top still does not work and fails with error: Metrics API not available.

Any update?

Still happening

If you are seeing this in your logs and your prometheus-adapter pod is in a CrashLoopBackOff, then you probably need to increase the amount of memory allocated to prometheus-adapter. Out of the box, the Helm chart does not request enough to handle the default rule set it ships with.
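For anyone hitting that OOM/CrashLoopBackOff variant, here is a minimal sketch of a values override for the prometheus-community/prometheus-adapter Helm chart. It assumes the chart exposes the usual resources value; the numbers are illustrative starting points, not recommendations.

```yaml
# values.yaml -- illustrative resource settings for the prometheus-adapter chart.
# Assumes the chart exposes a standard `resources` value; tune the numbers to the
# size of your cluster and the rule set you actually use.
resources:
  requests:
    cpu: 250m
    memory: 512Mi
  limits:
    memory: 2Gi
```

Applied with something like helm upgrade --install prometheus-adapter prometheus-community/prometheus-adapter -f values.yaml. Trimming the default rule set (see the sketch further down) also reduces memory pressure considerably.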

The original issue mentioned a panic which was fixed by https://github.com/kubernetes/kubernetes/pull/94589 and then vendored in https://github.com/kubernetes-sigs/prometheus-adapter/pull/352.

We investigated the spammy logs in the past but haven't been able to pinpoint exactly where the issue comes from. I summarized the status of the investigation in https://github.com/kubernetes-sigs/prometheus-adapter/issues/292#issuecomment-767444375.

Are you sure that the fix you mentioned is sufficient?

The panic doesn't happen anymore, so yes, it is. That said, the logs are still present, but so far I haven't seen any disruption caused by them, nor have I received any reports of one. So I am fairly confident that they are harmless.

/cc @DirectXMan12 @brancz I have the exact same problem (1 runtime.go:76] Observed a panic: runtime error: invalid memory address or nil pointer dereference, goroutine 698 [running]:) while trying to deploy prometheus-adapter on a kops cluster. I used the manifests from kube-prometheus and tried to deploy it in the prometheus and kube-system namespaces with no success. I previously had the same setup on EKS and it worked as expected; a main difference is that on EKS we used the monitoring namespace. Any ideas on how I could debug this?

@benjaminhuo @daniel-habib I had the same error because the Prometheus Adapter chart was creating its default rules. You can avoid that with this chart configuration: (…)
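For reference, a minimal sketch of what such a configuration might look like, assuming the prometheus-community/prometheus-adapter chart's rules.default and rules.custom values; the custom rule below is purely illustrative (not part of the original comment) and should be replaced with the metrics you actually need.

```yaml
# values.yaml -- disable the chart's bundled default rules and define only what you need.
# The rule below is an illustrative example; substitute your own series and queries.
rules:
  default: false
  custom:
    - seriesQuery: 'http_requests_total{namespace!="",pod!=""}'
      resources:
        overrides:
          namespace: {resource: "namespace"}
          pod: {resource: "pod"}
      name:
        matches: "^(.*)_total$"
        as: "${1}_per_second"
      metricsQuery: 'sum(rate(<<.Series>>{<<.LabelMatchers>>}[2m])) by (<<.GroupBy>>)'
```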

I’m running Kubernetes 1.18.3 and was brought here by the same issue. Your fix @shardulsrivastava works wonders.

Thank you!