kubelogin: error: You must be logged in to the server (Unauthorized)

I followed the Keycloak documentation, but can't really seem to make it work. Keycloak is set up as per the docs, and when I run the command below, it looks like I'm getting the response that I should.

kubectl oidc-login get-token -v1 \
 --oidc-issuer-url=https://keycloak-domain.org/auth/realms/kubernetes \
 --oidc-client-id=kubernetes \
 --oidc-client-secret=secret-goes-here
...
I0927 21:37:02.504973   32273 get_token.go:81] the ID token has the claim: aud=kubernetes
I0927 21:37:02.504991   32273 get_token.go:81] the ID token has the claim: groups=[kubernetes:admin]
I0927 21:37:02.505037   32273 get_token.go:81] the ID token has the claim: sub=uuid-goes-here
I0927 21:37:02.505052   32273 get_token.go:81] the ID token has the claim: iss=https://keycloak-domain.org/auth/realms/kubernetes
...

The kube-apiserver is configured:

$ cat /etc/kubernetes/manifests/kube-apiserver.yaml
...
    - --oidc-client-id=kubernetes
    - --oidc-groups-claim=groups
    - --oidc-issuer-url=https://keycloak-domain.org/auth/realms/kubernetes
    - --oidc-username-claim=email
...
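Since --oidc-username-claim=email tells the apiserver to derive the username from the email claim, the ID token has to actually contain that claim. One way to check is to grep the claims from kubelogin's verbose output (a sketch; kubelogin logs to stderr, and the client secret is a placeholder):

# List the claims kubelogin reports for the ID token; if no email= claim
# shows up, the apiserver cannot map a username from this token.
kubectl oidc-login get-token -v1 \
 --oidc-issuer-url=https://keycloak-domain.org/auth/realms/kubernetes \
 --oidc-client-id=kubernetes \
 --oidc-client-secret=secret-goes-here 2>&1 \
 | grep 'the ID token has the claim'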

I applied the following ClusterRoleBinding to the cluster.

kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: keycloak-admin-group
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: cluster-admin
subjects:
- kind: Group
  apiGroup: rbac.authorization.k8s.io
  name: kubernetes:admin
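To rule out RBAC as the culprit, the binding can be tested with impersonation from an admin kubeconfig (a sketch; the user name is an arbitrary placeholder, since only the group matters here):

# Should answer "yes" if the group is bound to cluster-admin
kubectl auth can-i get pods --as=test-user --as-group=kubernetes:admin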

And I added the following to my kubeconfig file, which I have exported with export KUBECONFIG=./kubeconfig:

...
contexts:
- context:
    cluster: green-bird-3416
    user: keycloak
  name: keycloak@green-bird-3416
current-context: keycloak@green-bird-3416
kind: Config
preferences: {}
users:
- name: keycloak
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      command: kubelogin
      args:
      - get-token
      - --oidc-issuer-url=https://keycloak-domain.org/auth/realms/kubernetes
      - --oidc-client-id=kubernetes
      - --oidc-client-secret=secret-goes-here

It generates a cache file at ~/.kube/cache/oidc-login/d721553ba91f6078f86a5cb2caa2f78eb4d27898b238dfad310b87f01ecdd117 with what looks like correct content.

But when I try to execute kubectl commands, I just get:

$ kubectl get pods
You got a valid token until 2019-09-27 21:50:29 +0200 CEST
error: You must be logged in to the server (Unauthorized)
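Raising kubectl's verbosity shows the HTTP round trip, including the 401 response from the apiserver, which can help narrow down whether a token is being sent at all (output shape varies by kubectl version):

# -v=8 logs the requests and response bodies, including the Unauthorized reply
kubectl get pods -v=8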

What am I missing here?

About this issue

  • State: open
  • Created 5 years ago
  • Comments: 16 (2 by maintainers)

Most upvoted comments

I got the same issue with 1.14.8 (kops) at first, but then I found what was wrong with my settings.

  • if you have --oidc-username-claim=email on the kube-apiserver, you need to add --oidc-extra-scope=email to the kubelogin args.

My final working configuration looks like this:

  # kops cluster spec: the kube-apiserver OIDC settings
  kubeAPIServer:
    oidcIssuerURL: https://accounts.google.com
    oidcClientID: xxx.apps.googleusercontent.com
    oidcUsernameClaim: email

# kubeconfig: the user entry that runs kubelogin as a kubectl plugin
users:
- name: google
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      args:
      - oidc-login
      - get-token
      - --oidc-issuer-url=https://accounts.google.com
      - --oidc-client-id=xxx.apps.googleusercontent.com
      - --oidc-client-secret=xxx
      - --oidc-extra-scope=email
      - --oidc-extra-scope=profile
      command: kubectl
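Translated back to the Keycloak setup from the question, the same fix would look roughly like this (a sketch; issuer, client ID, and secret are copied from the question):

users:
- name: keycloak
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      command: kubelogin
      args:
      - get-token
      - --oidc-issuer-url=https://keycloak-domain.org/auth/realms/kubernetes
      - --oidc-client-id=kubernetes
      - --oidc-client-secret=secret-goes-here
      # requests the claim that --oidc-username-claim=email expects
      - --oidc-extra-scope=email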

I encountered this issue before. You may need to clear the cache, especially if you have tried many different configurations.

What I did was:

rm -rf ~/.kube/cache
rm -rf ~/.kube/http-cache

And then it initiates a new login flow once you use the user again, and eventually it worked perfectly fine.
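If you'd rather not wipe the whole kubectl cache, removing just the kubelogin token cache (the directory shown in the question) should be enough:

# Drops only the cached OIDC tokens; the next kubectl call triggers a fresh login
rm -rf ~/.kube/cache/oidc-login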

FYI, this is the user entry configured in my ~/.kube/config:

- name: oidc-cluster-admin
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      args:
      - oidc-login
      - get-token
      - --oidc-issuer-url=<REDACTED>
      - --oidc-redirect-url-hostname=<REDACTED>
      - --oidc-client-id=<REDACTED>
      - --oidc-client-secret=<REDACTED>
      - --oidc-extra-scope=email
      - --certificate-authority=/tmp/ca.pem
      command: kubectl
      env: null
      provideClusterInfo: false

Hi everyone! I had the same original issue; I'm using authentication with Keycloak as the IdP. Authentication via the browser works, but I receive the message below (log level 1) from the kubectl --user=oidc get nodes command.

I0923 17:16:30.416277 35800 get_token.go:107] you already have a valid token until 2021-09-23 17:21:28 +0200 CEST
I0923 17:16:30.416287 35800 get_token.go:114] writing the token to client-go
error: You must be logged in to the server (Unauthorized)

From the Kubernetes API pod, the error is the same as the one explained by @int128:

1 authentication.go:53] Unable to authenticate the request due to an error: invalid bearer token

The kubectl oidc-login setup command completes and returns the token.

My environment:

Kubernetes version 1.19.6, deployed by Kubespray

------- API CONFIG FILE

- --oidc-issuer-url=https://keycloak.localdomain.lan/auth/realms/Kubernetes
- --oidc-client-id=kubernetes
- --oidc-ca-file=/etc/kubernetes/ssl/localdomain.lan.pem

------- KUBECONFIG FILE

- name: oidc
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      args:
      - oidc-login
      - get-token
      - --oidc-issuer-url=https://keycloak.localdomain.lan/auth/realms/Kubernetes
      - --oidc-client-id=kubernetes
      - --oidc-client-secret=SECRETID
      - --insecure-skip-tls-verify
      - -v1
      command: kubectl
      env: null
      provideClusterInfo: false
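One thing worth checking with a private domain like this: the certificate Keycloak presents has to chain to the CA given to the apiserver via --oidc-ca-file, and a mismatch there is one common cause of invalid bearer token errors. A quick inspection (a sketch; hostname taken from the config above):

# Print issuer/subject/validity of the certificate the OIDC issuer serves
openssl s_client -connect keycloak.localdomain.lan:443 \
  -servername keycloak.localdomain.lan </dev/null 2>/dev/null \
  | openssl x509 -noout -issuer -subject -dates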

Has anyone already found and solved this issue? Thanks in advance for any help!

@hbceylan just the Google OAuth client ID, and rolling-update the cluster after the config change.

It seems the kube-apiserver does not accept the token. Would you check the kube-apiserver log?

# tail the log
kubectl logs -n kube-system --tail=10 -f kube-apiserver-ip-xxxxxxxx

# try API access
kubectl get pods

Some message should appear like:

E1009 09:26:54.912586       1 authentication.go:65] Unable to authenticate the request due to an error: invalid bearer token
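If the log is noisy, filtering for authentication failures while reproducing the error can help (pod name placeholder as above):

# Show only token-authentication failures from the apiserver log
kubectl logs -n kube-system --tail=200 kube-apiserver-ip-xxxxxxxx | grep -i 'unable to authenticate'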