ceph-csi: kubernetes: Unable to attach or mount volumes: timed out waiting for the condition
Describe the bug
PVC mounting
Every time a pod wants to use a volume, there is an unpredictable delay (anywhere from 1 to 15 minutes) before the mount succeeds.
Environment details
k8s v1.16, docker v18.6.3, csi-rbd v2.1.1 (latest available), Ubuntu 18.04.4 LTS, kernel 4.15.0-101-generic
Ceph packages:
ceph-13.2.2-0.el7.x86_64
ceph-base-13.2.2-0.el7.x86_64
ceph-common-13.2.2-0.el7.x86_64
ceph-deploy-2.0.1-0.noarch
ceph-iscsi-config-2.6-2.6.el7.noarch
ceph-mds-13.2.2-0.el7.x86_64
ceph-mgr-13.2.2-0.el7.x86_64
ceph-mon-13.2.2-0.el7.x86_64
ceph-osd-13.2.2-0.el7.x86_64
ceph-radosgw-13.2.2-0.el7.x86_64
ceph-release-1-1.el7.noarch
ceph-selinux-13.2.2-0.el7.x86_64
libcephfs2-13.2.2-0.el7.x86_64
python-cephfs-13.2.2-0.el7.x86_64
Steps to reproduce
Restart any pod that uses an RBD-backed PVC; the delay occurs every time.
Actual results
After an unpredictable wait the volume mounts, and from then on everything works as expected.
What determines that wait time?
Where is it established?
Is there a way to parameterize it?
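Regarding the questions above: the kubelet log further down shows retry intervals of 500ms and 2m2s ("durationBeforeRetry"), which looks consistent with the kubelet applying exponential backoff to failed volume operations. This is an inference from the log values, not a confirmed diagnosis; a minimal sketch of such a schedule, with the start/cap values taken from the log:

```python
# Assumed retry schedule (inferred from "durationBeforeRetry 500ms" and
# "durationBeforeRetry 2m2s" in the kubelet log; not a confirmed constant).
INITIAL_RETRY_S = 0.5   # first observed retry delay
MAX_RETRY_S = 122.0     # largest observed retry delay (2m2s)

def backoff_schedule(attempts):
    """Delay before each retry: doubles on every failure, capped at the max."""
    delay = INITIAL_RETRY_S
    schedule = []
    for _ in range(attempts):
        schedule.append(delay)
        delay = min(delay * 2, MAX_RETRY_S)
    return schedule

print(backoff_schedule(10))
# [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0, 122.0, 122.0]
```

If this is the mechanism at work, a pod that keeps failing its early mount attempts quickly reaches the ~2-minute ceiling between retries, which would explain multi-minute waits even after the underlying problem clears.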
Expected behavior
The pod should mount the volume within 30 seconds to 1 minute.
Logs
Since the issue is in PVC mounting, complete logs of the relevant containers are attached below.
describe pod
Type Reason Age From Message
---- ------ ---- ---- -------
Normal Scheduled <unknown> default-scheduler Successfully assigned git/gitlab-gitaly-0 to node00b.workspace.domain.com
Normal SuccessfulAttachVolume 8m4s attachdetach-controller AttachVolume.Attach succeeded for volume "pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053"
Warning FailedMount 6m21s kubelet, node00b.workspace.domain.com Unable to attach or mount volumes: unmounted volumes=[repo-data], unattached volumes=[custom-ca-certificates default-token-nsd89 gitaly-config init-gitaly-secrets gitaly-secrets repo-data etc-ssl-certs]: timed out waiting for the condition
Warning FailedMount 4m6s kubelet, node00b.workspace.domain.com Unable to attach or mount volumes: unmounted volumes=[repo-data], unattached volumes=[gitaly-secrets repo-data etc-ssl-certs custom-ca-certificates default-token-nsd89 gitaly-config init-gitaly-secrets]: timed out waiting for the condition
Warning FailedMount 112s kubelet, node00b.workspace.domain.com Unable to attach or mount volumes: unmounted volumes=[repo-data], unattached volumes=[default-token-nsd89 gitaly-config init-gitaly-secrets gitaly-secrets repo-data etc-ssl-certs custom-ca-certificates]: timed out waiting for the condition
Normal Pulled 42s kubelet, node00b.workspace.domain.com Container image "registry.gitlab.com/gitlab-org/build/cng/alpine-certificates:20171114-r3" already present on machine
Normal Created 42s kubelet, node00b.workspace.domain.com Created container certificates
Normal Started 42s kubelet, node00b.workspace.domain.com Started container certificates
Normal Pulling 40s kubelet, node00b.workspace.domain.com Pulling image "busybox:latest"
Normal Pulled 38s kubelet, node00b.workspace.domain.com Successfully pulled image "busybox:latest"
Normal Created 38s kubelet, node00b.workspace.domain.com Created container configure
Normal Started 38s kubelet, node00b.workspace.domain.com Started container configure
Normal Pulled 37s kubelet, node00b.workspace.domain.com Container image "registry.gitlab.com/gitlab-org/build/cng/gitaly:v12.9.3" already present on machine
Normal Created 37s kubelet, node00b.workspace.domain.com Created container gitaly
Normal Started 36s kubelet, node00b.workspace.domain.com Started container gitaly
kubelet
May 26 15:28:56 node00b.workspace.domain.com kubelet[64419]: E0526 15:28:56.862824 64419 goroutinemap.go:150] Operation for "/var/lib/kubelet/plugins/rbd.csi.ceph.com/csi.sock" failed. No retries permitted until 2020-05-26 15:30:58.86276487 +0200 CEST m=+9429.926617333 (durationBeforeRetry 2m2s). Error: "RegisterPlugin error -- failed to get plugin info using RPC GetInfo at socket /var/lib/kubelet/plugins/rbd.csi.ceph.com/csi.sock, err: rpc error: code = Unimplemented desc = unknown service pluginregistration.Registration"
May 26 15:29:16 node00b.workspace.domain.com kubelet[64419]: E0526 15:29:16.411411 64419 nestedpendingoperations.go:270] Operation for "\"kubernetes.io/csi/rbd.csi.ceph.com^0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2\"" failed. No retries permitted until 2020-05-26 15:29:16.911291367 +0200 CEST m=+9327.975143870 (durationBeforeRetry 500ms). Error: "Volume has not been added to the list of VolumesInUse in the node's volume status for volume \"pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053\" (UniqueName: \"kubernetes.io/csi/rbd.csi.ceph.com^0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2\") pod \"gitlab-gitaly-0\" (UID: \"4c4a25eb-361c-418f-a08e-814757e0e62c\") "
May 26 15:29:16 node00b.workspace.domain.com kubelet[64419]: I0526 15:29:16.411538 64419 reconciler.go:207] operationExecutor.VerifyControllerAttachedVolume started for volume "default-token-nsd89" (UniqueName: "kubernetes.io/secret/4c4a25eb-361c-418f-a08e-814757e0e62c-default-token-nsd89") pod "gitlab-gitaly-0" (UID: "4c4a25eb-361c-418f-a08e-814757e0e62c")
May 26 15:29:16 node00b.workspace.domain.com kubelet[64419]: I0526 15:29:16.411695 64419 reconciler.go:207] operationExecutor.VerifyControllerAttachedVolume started for volume "gitaly-secrets" (UniqueName: "kubernetes.io/empty-dir/4c4a25eb-361c-418f-a08e-814757e0e62c-gitaly-secrets") pod "gitlab-gitaly-0" (UID: "4c4a25eb-361c-418f-a08e-814757e0e62c")
csi-rbdplugin
I0527 15:03:50.497825 63711 mount_linux.go:173] Cannot run systemd-run, assuming non-systemd OS
I0527 15:03:50.497864 63711 mount_linux.go:174] systemd-run failed with: exit status 1
I0527 15:03:50.497884 63711 mount_linux.go:175] systemd-run output: Failed to create bus connection: No such file or directory
I0527 15:03:50.497976 63711 utils.go:165] ID: 38504 GRPC response: {"usage":[{"available":99309441024,"total":105554829312,"unit":1,"used":6228611072},{"available":6553490,"total":6553600,"unit":2,"used":110}]}
I0527 15:03:59.681382 63711 utils.go:159] ID: 38505 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0527 15:03:59.681414 63711 utils.go:160] ID: 38505 GRPC request: {}
I0527 15:03:59.683842 63711 utils.go:165] ID: 38505 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0527 15:03:59.689145 63711 utils.go:159] ID: 38506 GRPC call: /csi.v1.Node/NodeGetVolumeStats
I0527 15:03:59.689187 63711 utils.go:160] ID: 38506 GRPC request: {"volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2","volume_path":"/var/lib/kubelet/pods/447ed8b7-5c7b-4331-b5e9-3af6ed931be4/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount"}
I0527 14:55:25.930189 63711 mount_linux.go:173] Cannot run systemd-run, assuming non-systemd OS
I0527 14:55:25.930236 63711 mount_linux.go:174] systemd-run failed with: exit status 1
I0527 14:55:25.930264 63711 mount_linux.go:175] systemd-run output: Failed to create bus connection: No such file or directory
I0527 14:55:25.930382 63711 utils.go:165] ID: 38437 GRPC response: {"usage":[{"available":11681529856,"total":21003583488,"unit":1,"used":9305276416},{"available":1308647,"total":1310720,"unit":2,"used":2073}]}
I0527 14:55:39.993978 63711 utils.go:159] ID: 38438 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC call: /csi.v1.Node/NodeUnpublishVolume
I0527 14:55:39.994021 63711 utils.go:160] ID: 38438 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC request: {"target_path":"/var/lib/kubelet/pods/4c4a25eb-361c-418f-a08e-814757e0e62c/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount","volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2"}
I0527 14:55:39.995313 63711 mount_linux.go:238] Unmounting /var/lib/kubelet/pods/4c4a25eb-361c-418f-a08e-814757e0e62c/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount
I0527 14:55:40.018902 63711 nodeserver.go:520] ID: 38438 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: successfully unbound volume 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 from /var/lib/kubelet/pods/4c4a25eb-361c-418f-a08e-814757e0e62c/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount
I0527 14:55:40.018954 63711 utils.go:165] ID: 38438 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC response: {}
I0527 14:55:40.107871 63711 utils.go:159] ID: 38439 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0527 14:55:40.107903 63711 utils.go:160] ID: 38439 GRPC request: {}
I0527 14:55:40.109017 63711 utils.go:165] ID: 38439 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0527 14:55:40.117340 63711 utils.go:159] ID: 38440 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC call: /csi.v1.Node/NodeUnstageVolume
I0527 14:55:40.117370 63711 utils.go:160] ID: 38440 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC request: {"staging_target_path":"/var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount","volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2"}
I0527 14:55:40.118463 63711 mount_linux.go:238] Unmounting /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
I0527 14:55:41.361192 63711 nodeserver.go:609] ID: 38440 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 successfully unmounted volume (0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2) from staging path (/var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2)
I0527 14:55:41.361476 63711 utils.go:165] ID: 38440 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC response: {}
I0527 14:55:57.705413 63711 utils.go:159] ID: 38441 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0527 14:55:57.705453 63711 utils.go:160] ID: 38441 GRPC request: {}
I0527 14:55:57.706772 63711 utils.go:165] ID: 38441 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0527 14:55:57.722224 63711 utils.go:159] ID: 38442 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC call: /csi.v1.Node/NodeStageVolume
I0527 14:55:57.722274 63711 utils.go:160] ID: 38442 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC request: {"secrets":"***stripped***","staging_target_path":"/var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount","volume_capability":{"AccessType":{"Mount":{"fs_type":"ext4","mount_flags":["discard"]}},"access_mode":{"mode":1}},"volume_context":{"clusterID":"16042995-ff80-4561-8d47-70d5a80ff4ea","imageFeatures":"layering","pool":"rbd","storage.kubernetes.io/csiProvisionerIdentity":"1585758279174-8081-rbd.csi.ceph.com"},"volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2"}
I0527 14:55:57.727472 63711 rbd_util.go:585] ID: 38442 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 setting disableInUseChecks on rbd volume to: false
I0527 14:55:58.429413 63711 rbd_util.go:212] ID: 38442 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: status csi-vol-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 using mon 192.168.0.42:6789,192.168.0.43:6789,192.168.0.44:6789, pool rbd
W0527 14:55:58.556065 63711 rbd_util.go:234] ID: 38442 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: no watchers on csi-vol-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
I0527 14:55:58.556123 63711 rbd_attach.go:208] ID: 38442 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: map mon 192.168.0.42:6789,192.168.0.43:6789,192.168.0.44:6789
I0527 14:55:58.691865 63711 nodeserver.go:211] ID: 38442 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd image: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2/rbd was successfully mapped at /dev/rbd3
I0527 14:55:58.692042 63711 mount_linux.go:405] Attempting to determine if disk "/dev/rbd3" is formatted using blkid with args: ([-p -s TYPE -s PTTYPE -o export /dev/rbd3])
I0527 14:55:58.768536 63711 mount_linux.go:408] Output: "DEVNAME=/dev/rbd3\nTYPE=ext4\n", err: <nil>
I0527 14:55:58.768611 63711 mount_linux.go:405] Attempting to determine if disk "/dev/rbd3" is formatted using blkid with args: ([-p -s TYPE -s PTTYPE -o export /dev/rbd3])
I0527 14:55:58.797327 63711 mount_linux.go:408] Output: "DEVNAME=/dev/rbd3\nTYPE=ext4\n", err: <nil>
I0527 14:55:58.797364 63711 mount_linux.go:298] Checking for issues with fsck on disk: /dev/rbd3
I0527 14:55:58.929543 63711 mount_linux.go:394] Attempting to mount disk /dev/rbd3 in ext4 format at /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
I0527 14:55:58.929620 63711 mount_linux.go:146] Mounting cmd (mount) with arguments (-t ext4 -o _netdev,discard,defaults /dev/rbd3 /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2)
I0527 14:55:58.957602 63711 nodeserver.go:187] ID: 38442 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: successfully mounted volume 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 to stagingTargetPath /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
I0527 14:55:58.957672 63711 utils.go:165] ID: 38442 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC response: {}
I0527 14:55:58.966484 63711 utils.go:159] ID: 38443 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0527 14:55:58.966510 63711 utils.go:160] ID: 38443 GRPC request: {}
I0527 14:55:58.967433 63711 utils.go:165] ID: 38443 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0527 14:55:58.979237 63711 utils.go:159] ID: 38444 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC call: /csi.v1.Node/NodePublishVolume
I0527 14:55:58.979256 63711 utils.go:160] ID: 38444 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC request: {"staging_target_path":"/var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount","target_path":"/var/lib/kubelet/pods/447ed8b7-5c7b-4331-b5e9-3af6ed931be4/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount","volume_capability":{"AccessType":{"Mount":{"fs_type":"ext4","mount_flags":["discard"]}},"access_mode":{"mode":1}},"volume_context":{"clusterID":"16042995-ff80-4561-8d47-70d5a80ff4ea","imageFeatures":"layering","pool":"rbd","storage.kubernetes.io/csiProvisionerIdentity":"1585758279174-8081-rbd.csi.ceph.com"},"volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2"}
I0527 14:55:58.982381 63711 nodeserver.go:437] ID: 38444 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 target /var/lib/kubelet/pods/447ed8b7-5c7b-4331-b5e9-3af6ed931be4/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount
isBlock false
fstype ext4
stagingPath /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
readonly false
mountflags [bind _netdev discard]
I0527 14:55:58.984252 63711 mount_linux.go:173] Cannot run systemd-run, assuming non-systemd OS
I0527 14:55:58.984276 63711 mount_linux.go:174] systemd-run failed with: exit status 1
I0527 14:55:58.984290 63711 mount_linux.go:175] systemd-run output: Failed to create bus connection: No such file or directory
I0527 14:55:58.984312 63711 mount_linux.go:146] Mounting cmd (mount) with arguments (-t ext4 -o bind,_netdev /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 /var/lib/kubelet/pods/447ed8b7-5c7b-4331-b5e9-3af6ed931be4/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount)
I0527 14:55:58.989605 63711 mount_linux.go:146] Mounting cmd (mount) with arguments (-t ext4 -o bind,remount,_netdev,discard /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 /var/lib/kubelet/pods/447ed8b7-5c7b-4331-b5e9-3af6ed931be4/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount)
I0527 14:55:58.993929 63711 nodeserver.go:345] ID: 38444 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: successfully mounted stagingPath /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 to targetPath /var/lib/kubelet/pods/447ed8b7-5c7b-4331-b5e9-3af6ed931be4/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount
I0527 14:55:58.993977 63711 utils.go:165] ID: 38444 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC response: {}
I0527 14:56:02.735354 63711 utils.go:159] ID: 38445 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0527 14:56:02.735379 63711 utils.go:160] ID: 38445 GRPC request: {}
I0527 14:56:02.736539 63711 utils.go:165] ID: 38445 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0527 14:56:02.740903 63711 utils.go:159] ID: 38446 GRPC call: /csi.v1.Node/NodeGetVolumeStats
This excerpt may help:
>>> start, pod working
I0528 06:19:01.878800 32873 cephcsi.go:117] Driver version: v2.1.1 and Git version: 9022d899eb6fd464f0be33701d8160ecd1317467
I0528 06:19:01.879097 32873 cephcsi.go:144] Initial PID limit is set to 9830
I0528 06:19:01.879177 32873 cephcsi.go:153] Reconfigured PID limit to -1 (max)
I0528 06:19:01.879186 32873 cephcsi.go:172] Starting driver type: rbd with name: rbd.csi.ceph.com
I0528 06:19:01.881721 32873 mount_linux.go:173] Cannot run systemd-run, assuming non-systemd OS
I0528 06:19:01.881735 32873 mount_linux.go:174] systemd-run failed with: exit status 1
I0528 06:19:01.881750 32873 mount_linux.go:175] systemd-run output: Failed to create bus connection: No such file or directory
I0528 06:19:01.891810 32873 server.go:116] Listening for connections on address: &net.UnixAddr{Name:"//csi/csi.sock", Net:"unix"}
I0528 06:19:02.525063 32873 utils.go:159] ID: 1 GRPC call: /csi.v1.Identity/GetPluginInfo
I0528 06:19:02.525112 32873 utils.go:160] ID: 1 GRPC request: {}
I0528 06:19:02.527732 32873 identityserver-default.go:37] ID: 1 Using default GetPluginInfo
I0528 06:19:02.527764 32873 utils.go:165] ID: 1 GRPC response: {"name":"rbd.csi.ceph.com","vendor_version":"v2.1.1"}
I0528 06:19:03.993764 32873 utils.go:159] ID: 2 GRPC call: /csi.v1.Node/NodeGetInfo
I0528 06:19:03.993805 32873 utils.go:160] ID: 2 GRPC request: {}
I0528 06:19:03.995008 32873 nodeserver-default.go:58] ID: 2 Using default NodeGetInfo
I0528 06:19:03.995030 32873 utils.go:165] ID: 2 GRPC response: {"accessible_topology":{},"node_id":"node00b.workspace.domain.com"}
I0528 06:19:17.942621 32873 utils.go:159] ID: 3 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0528 06:19:17.942658 32873 utils.go:160] ID: 3 GRPC request: {}
I0528 06:19:17.943934 32873 utils.go:165] ID: 3 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0528 06:19:17.951057 32873 utils.go:159] ID: 4 GRPC call: /csi.v1.Node/NodeGetVolumeStats
I0528 06:19:17.951093 32873 utils.go:160] ID: 4 GRPC request: {"volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-05f077da-9b56-11ea-9049-22b91a6b11fb","volume_path":"/var/lib/kubelet/pods/452240eb-c473-4397-bae5-fbf0fdc51381/volumes/kubernetes.io~csi/pvc-3e50b5eb-9213-42a3-84da-53659634131d/mount"}
>>>
I0528 06:19:17.955706 32873 mount_linux.go:173] Cannot run systemd-run, assuming non-systemd OS
I0528 06:19:17.955758 32873 mount_linux.go:174] systemd-run failed with: exit status 1
I0528 06:19:17.955779 32873 mount_linux.go:175] systemd-run output: Failed to create bus connection: No such file or directory
I0528 06:19:17.955862 32873 utils.go:165] ID: 4 GRPC response: {"usage":[{"available":20940611584,"total":21003583488,"unit":1,"used":46194688},{"available":1310699,"total":1310720,"unit":2,"used":21}]}
I0528 06:19:29.439101 32873 utils.go:159] ID: 5 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0528 06:19:29.439141 32873 utils.go:160] ID: 5 GRPC request: {}
I0528 06:19:29.440333 32873 utils.go:165] ID: 5 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0528 06:19:29.446953 32873 utils.go:159] ID: 6 GRPC call: /csi.v1.Node/NodeGetVolumeStats
I0528 06:19:29.447084 32873 utils.go:160] ID: 6 GRPC request: {"volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-2d073fe9-9b4e-11ea-9ccf-ae03623255b0","volume_path":"/var/lib/kubelet/pods/9939130d-eda7-4fc9-9c0e-644059b2d2f9/volumes/kubernetes.io~csi/pvc-2e40f56c-24df-4ad0-9c80-a4f57a689eb3/mount"}
>>>
I0528 06:19:29.452068 32873 mount_linux.go:173] Cannot run systemd-run, assuming non-systemd OS
I0528 06:19:29.452122 32873 mount_linux.go:174] systemd-run failed with: exit status 1
I0528 06:19:29.452151 32873 mount_linux.go:175] systemd-run output: Failed to create bus connection: No such file or directory
I0528 06:19:29.452252 32873 utils.go:165] ID: 6 GRPC response: {"usage":[{"available":20940484608,"total":21003583488,"unit":1,"used":46321664},{"available":1310678,"total":1310720,"unit":2,"used":42}]}
I0528 06:19:57.950567 32873 utils.go:159] ID: 7 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0528 06:19:57.950611 32873 utils.go:160] ID: 7 GRPC request: {}
I0528 06:19:57.951775 32873 utils.go:165] ID: 7 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0528 06:19:57.958932 32873 utils.go:159] ID: 8 GRPC call: /csi.v1.Node/NodeGetVolumeStats
I0528 06:19:57.958974 32873 utils.go:160] ID: 8 GRPC request: {"volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-39ff5ac1-7a6e-11ea-9d12-f2eedd7cc4d2","volume_path":"/var/lib/kubelet/pods/43648d04-21e7-4488-a588-00826358a8c5/volumes/kubernetes.io~csi/pvc-801d01b4-d369-47b1-9eaf-683d9cb0b59e/mount"}
>>> delete pod and wait 6 minutes ...
NAME READY STATUS RESTARTS AGE
gitlab-gitaly-0 0/1 PodInitializing 0 6m39s
I0528 06:26:37.188358 32873 mount_linux.go:173] Cannot run systemd-run, assuming non-systemd OS
I0528 06:26:37.188392 32873 mount_linux.go:174] systemd-run failed with: exit status 1
I0528 06:26:37.188413 32873 mount_linux.go:175] systemd-run output: Failed to create bus connection: No such file or directory
I0528 06:26:37.188496 32873 utils.go:165] ID: 71 GRPC response: {"usage":[{"available":20940484608,"total":21003583488,"unit":1,"used":46321664},{"available":1310678,"total":1310720,"unit":2,"used":42}]}
I0528 06:26:37.756635 32873 utils.go:159] ID: 72 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC call: /csi.v1.Node/NodeUnpublishVolume
I0528 06:26:37.756659 32873 utils.go:160] ID: 72 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC request: {"target_path":"/var/lib/kubelet/pods/447ed8b7-5c7b-4331-b5e9-3af6ed931be4/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount","volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2"}
I0528 06:26:37.757339 32873 mount_linux.go:238] Unmounting /var/lib/kubelet/pods/447ed8b7-5c7b-4331-b5e9-3af6ed931be4/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount
I0528 06:26:37.803202 32873 nodeserver.go:520] ID: 72 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: successfully unbound volume 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 from /var/lib/kubelet/pods/447ed8b7-5c7b-4331-b5e9-3af6ed931be4/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount
I0528 06:26:37.803266 32873 utils.go:165] ID: 72 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC response: {}
I0528 06:26:37.881706 32873 utils.go:159] ID: 73 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0528 06:26:37.881755 32873 utils.go:160] ID: 73 GRPC request: {}
I0528 06:26:37.882818 32873 utils.go:165] ID: 73 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0528 06:26:37.888850 32873 utils.go:159] ID: 74 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC call: /csi.v1.Node/NodeUnstageVolume
I0528 06:26:37.888888 32873 utils.go:160] ID: 74 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC request: {"staging_target_path":"/var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount","volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2"}
I0528 06:26:37.889960 32873 mount_linux.go:238] Unmounting /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
I0528 06:26:39.104722 32873 nodeserver.go:609] ID: 74 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 successfully unmounted volume (0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2) from staging path (/var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2)
I0528 06:26:39.105072 32873 utils.go:165] ID: 74 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC response: {}
I0528 06:26:39.175564 32873 utils.go:159] ID: 75 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0528 06:26:39.175594 32873 utils.go:160] ID: 75 GRPC request: {}
I0528 06:26:39.176737 32873 utils.go:165] ID: 75 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0528 06:26:39.194572 32873 utils.go:159] ID: 76 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC call: /csi.v1.Node/NodeStageVolume
I0528 06:26:39.194621 32873 utils.go:160] ID: 76 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC request: {"secrets":"***stripped***","staging_target_path":"/var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount","volume_capability":{"AccessType":{"Mount":{"fs_type":"ext4","mount_flags":["discard"]}},"access_mode":{"mode":1}},"volume_context":{"clusterID":"16042995-ff80-4561-8d47-70d5a80ff4ea","imageFeatures":"layering","pool":"rbd","storage.kubernetes.io/csiProvisionerIdentity":"1585758279174-8081-rbd.csi.ceph.com"},"volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2"}
I0528 06:26:39.199564 32873 rbd_util.go:585] ID: 76 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 setting disableInUseChecks on rbd volume to: false
I0528 06:26:40.016566 32873 rbd_util.go:212] ID: 76 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: status csi-vol-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 using mon 192.168.0.42:6789,192.168.0.43:6789,192.168.0.44:6789, pool rbd
W0528 06:26:40.161445 32873 rbd_util.go:234] ID: 76 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: no watchers on csi-vol-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
I0528 06:26:40.161517 32873 rbd_attach.go:208] ID: 76 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: map mon 192.168.0.42:6789,192.168.0.43:6789,192.168.0.44:6789
I0528 06:26:40.291251 32873 nodeserver.go:211] ID: 76 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd image: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2/rbd was successfully mapped at /dev/rbd3
I0528 06:26:40.291436 32873 mount_linux.go:405] Attempting to determine if disk "/dev/rbd3" is formatted using blkid with args: ([-p -s TYPE -s PTTYPE -o export /dev/rbd3])
I0528 06:26:40.336386 32873 mount_linux.go:408] Output: "DEVNAME=/dev/rbd3\nTYPE=ext4\n", err: <nil>
I0528 06:26:40.336435 32873 mount_linux.go:405] Attempting to determine if disk "/dev/rbd3" is formatted using blkid with args: ([-p -s TYPE -s PTTYPE -o export /dev/rbd3])
I0528 06:26:40.367471 32873 mount_linux.go:408] Output: "DEVNAME=/dev/rbd3\nTYPE=ext4\n", err: <nil>
I0528 06:26:40.367501 32873 mount_linux.go:298] Checking for issues with fsck on disk: /dev/rbd3
I0528 06:26:40.440787 32873 mount_linux.go:394] Attempting to mount disk /dev/rbd3 in ext4 format at /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
I0528 06:26:40.440878 32873 mount_linux.go:146] Mounting cmd (mount) with arguments (-t ext4 -o _netdev,discard,defaults /dev/rbd3 /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2)
I0528 06:26:40.462742 32873 nodeserver.go:187] ID: 76 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 rbd: successfully mounted volume 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 to stagingTargetPath /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
I0528 06:26:40.462800 32873 utils.go:165] ID: 76 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC response: {}
I0528 06:26:40.473125 32873 utils.go:159] ID: 77 GRPC call: /csi.v1.Node/NodeGetCapabilities
I0528 06:26:40.473171 32873 utils.go:160] ID: 77 GRPC request: {}
I0528 06:26:40.475895 32873 utils.go:165] ID: 77 GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":2}}},{"Type":{"Rpc":{"type":3}}}]}
I0528 06:26:40.487780 32873 utils.go:159] ID: 78 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC call: /csi.v1.Node/NodePublishVolume
I0528 06:26:40.487821 32873 utils.go:160] ID: 78 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 GRPC request: {"staging_target_path":"/var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount","target_path":"/var/lib/kubelet/pods/829afc31-b65f-4170-9638-3724c2050de1/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount","volume_capability":{"AccessType":{"Mount":{"fs_type":"ext4","mount_flags":["discard"]}},"access_mode":{"mode":1}},"volume_context":{"clusterID":"16042995-ff80-4561-8d47-70d5a80ff4ea","imageFeatures":"layering","pool":"rbd","storage.kubernetes.io/csiProvisionerIdentity":"1585758279174-8081-rbd.csi.ceph.com"},"volume_id":"0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2"}
I0528 06:26:40.496156 32873 nodeserver.go:437] ID: 78 Req-ID: 0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2 target /var/lib/kubelet/pods/829afc31-b65f-4170-9638-3724c2050de1/volumes/kubernetes.io~csi/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/mount
isBlock false
fstype ext4
stagingPath /var/lib/kubelet/plugins/kubernetes.io/csi/pv/pvc-b5b268b6-4a99-4cda-9fd2-bf6b6c186053/globalmount/0001-0024-16042995-ff80-4561-8d47-70d5a80ff4ea-0000000000000010-5ddd037e-74b1-11ea-9d12-f2eedd7cc4d2
readonly false
mountflags [bind _netdev discard]
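Reading the klog timestamps above shows that once kubelet finally issues the CSI calls, the plugin maps and mounts the image in roughly 300 ms (06:26:40.161 to 06:26:40.462), so the multi-minute delay happens before kubelet ever calls the CSI driver. A minimal sketch for measuring such gaps, assuming klog's standard `Lmmdd hh:mm:ss.uuuuuu` header format (the year is supplied by hand because klog omits it):

```python
from datetime import datetime

def parse_klog_ts(line, year=2020):
    """Parse a klog header such as 'I0528 06:26:40.161517 ...' into a
    datetime. klog omits the year, so it has to be supplied."""
    date_part = line.split()[0][1:]   # strip the severity letter (I/W/E)
    time_part = line.split()[1]
    return datetime.strptime(f"{year}{date_part} {time_part}",
                             "%Y%m%d %H:%M:%S.%f")

# Timestamps taken from the log excerpt above:
map_start  = parse_klog_ts("I0528 06:26:40.161517 32873 rbd_attach.go:208]")
mount_done = parse_klog_ts("I0528 06:26:40.462742 32873 nodeserver.go:187]")
print((mount_done - map_start).total_seconds())  # ~0.3 s inside the plugin
```

This points the investigation at kubelet's volume manager retry/backoff behaviour rather than at ceph-csi itself.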
---
>>> Checking the pod again after about 7 minutes, the mount has finally succeeded:
Events:
Type Reason Age From Message
---- ------ ---- ---- -------
Normal Scheduled <unknown> default-scheduler Successfully assigned git/gitlab-gitaly-0 to node00b.workspace.domain.com
Warning FailedMount 4m50s kubelet, node00b.workspace.domain.com Unable to attach or mount volumes: unmounted volumes=[repo-data], unattached volumes=[etc-ssl-certs custom-ca-certificates default-token-nsd89 gitaly-config init-gitaly-secrets gitaly-secrets repo-data]: timed out waiting for the condition
Warning FailedMount 2m32s kubelet, node00b.workspace.domain.com Unable to attach or mount volumes: unmounted volumes=[repo-data], unattached volumes=[init-gitaly-secrets gitaly-secrets repo-data etc-ssl-certs custom-ca-certificates default-token-nsd89 gitaly-config]: timed out waiting for the condition
Normal Pulled 19s kubelet, node00b.workspace.domain.com Container image "registry.gitlab.com/gitlab-org/build/cng/alpine-certificates:20171114-r3" already present on machine
Normal Created 19s kubelet, node00b.workspace.domain.com Created container certificates
Normal Started 19s kubelet, node00b.workspace.domain.com Started container certificates
Normal Pulling 18s kubelet, node00b.workspace.domain.com Pulling image "busybox:latest"
Normal Pulled 16s kubelet, node00b.workspace.domain.com Successfully pulled image "busybox:latest"
Normal Created 16s kubelet, node00b.workspace.domain.com Created container configure
Normal Started 15s kubelet, node00b.workspace.domain.com Started container configure
Normal Pulled 14s kubelet, node00b.workspace.domain.com Container image "registry.gitlab.com/gitlab-org/build/cng/gitaly:v12.9.3" already present on machine
Normal Created 14s kubelet, node00b.workspace.domain.com Created container gitaly
Normal Started 14s kubelet, node00b.workspace.domain.com Started container gitaly
NAME READY STATUS RESTARTS AGE
gitlab-gitaly-0 1/1 Running 0 7m4s
Additional context
n/a
Thanks in advance.
About this issue
- Original URL
- State: closed
- Created 4 years ago
- Comments: 29 (12 by maintainers)
@i033653 Thanks for the confirmation, closing this issue.