harbor: docker push fail
Hello.
I created a robot account for Jenkins CI. Since then, there have been no docker login failures (see #10832).
Now there is a problem with docker push. It happens intermittently.
Here is the Jenkins log.
[2020-02-24T16:25:53.746Z] + docker push harbor.company.net/project/container
[2020-02-24T16:25:53.746Z] The push refers to repository [harbor.company.net/project/container]
[2020-02-24T16:25:53.746Z] 06c4036205f0: Preparing
[2020-02-24T16:25:53.746Z] 318c7183ebf4: Preparing
[2020-02-24T16:25:53.746Z] 89a4929bfb9c: Preparing
[2020-02-24T16:25:53.746Z] a210241fb7b0: Preparing
[2020-02-24T16:25:53.746Z] 307652cfaa21: Preparing
[2020-02-24T16:25:53.746Z] 4e6f1765500c: Preparing
[2020-02-24T16:25:53.746Z] 849a812dfffd: Preparing
[2020-02-24T16:25:53.746Z] 5559ed7737ee: Preparing
[2020-02-24T16:25:53.746Z] 550aefd43d00: Preparing
[2020-02-24T16:25:53.746Z] 27e45ca143e1: Preparing
[2020-02-24T16:25:53.746Z] dd7d5adb4579: Preparing
[2020-02-24T16:25:53.746Z] 4e6f1765500c: Waiting
[2020-02-24T16:25:53.746Z] 849a812dfffd: Waiting
[2020-02-24T16:25:53.746Z] 27e45ca143e1: Waiting
[2020-02-24T16:25:53.746Z] dd7d5adb4579: Waiting
[2020-02-24T16:25:53.746Z] 5559ed7737ee: Waiting
[2020-02-24T16:25:53.746Z] 550aefd43d00: Waiting
[2020-02-24T16:25:53.746Z] 307652cfaa21: Retrying in 5 seconds
[2020-02-24T16:25:53.746Z] 06c4036205f0: Retrying in 5 seconds
[2020-02-24T16:25:54.003Z] 318c7183ebf4: Retrying in 5 seconds
[2020-02-24T16:25:54.005Z] a210241fb7b0: Retrying in 5 seconds
[2020-02-24T16:25:54.005Z] 89a4929bfb9c: Retrying in 5 seconds
[2020-02-24T16:25:54.936Z] 307652cfaa21: Retrying in 4 seconds
[2020-02-24T16:25:54.936Z] 06c4036205f0: Retrying in 4 seconds
[2020-02-24T16:25:54.936Z] 318c7183ebf4: Retrying in 4 seconds
[2020-02-24T16:25:54.936Z] a210241fb7b0: Retrying in 4 seconds
[2020-02-24T16:25:54.936Z] 89a4929bfb9c: Retrying in 4 seconds
[2020-02-24T16:25:55.867Z] 307652cfaa21: Retrying in 3 seconds
[2020-02-24T16:25:55.868Z] 06c4036205f0: Retrying in 3 seconds
[2020-02-24T16:25:55.868Z] 318c7183ebf4: Retrying in 3 seconds
[2020-02-24T16:25:55.868Z] a210241fb7b0: Retrying in 3 seconds
[2020-02-24T16:25:55.868Z] 89a4929bfb9c: Retrying in 3 seconds
[2020-02-24T16:25:56.799Z] 307652cfaa21: Retrying in 2 seconds
[2020-02-24T16:25:56.800Z] 06c4036205f0: Retrying in 2 seconds
[2020-02-24T16:25:56.800Z] 318c7183ebf4: Retrying in 2 seconds
[2020-02-24T16:25:56.800Z] a210241fb7b0: Retrying in 2 seconds
[2020-02-24T16:25:56.800Z] 89a4929bfb9c: Retrying in 2 seconds
[2020-02-24T16:25:58.172Z] 06c4036205f0: Retrying in 1 second
[2020-02-24T16:25:58.173Z] 307652cfaa21: Retrying in 1 second
[2020-02-24T16:25:58.173Z] 318c7183ebf4: Retrying in 1 second
[2020-02-24T16:25:58.173Z] a210241fb7b0: Retrying in 1 second
[2020-02-24T16:25:58.173Z] 89a4929bfb9c: Retrying in 1 second
[2020-02-24T16:25:59.106Z] 06c4036205f0: Retrying in 10 seconds
[2020-02-24T16:25:59.106Z] 307652cfaa21: Retrying in 10 seconds
[2020-02-24T16:25:59.107Z] 318c7183ebf4: Retrying in 10 seconds
[2020-02-24T16:25:59.107Z] a210241fb7b0: Retrying in 10 seconds
[2020-02-24T16:25:59.107Z] 89a4929bfb9c: Retrying in 10 seconds
[2020-02-24T16:26:00.036Z] 06c4036205f0: Retrying in 9 seconds
[2020-02-24T16:26:00.037Z] 307652cfaa21: Retrying in 9 seconds
[2020-02-24T16:26:00.037Z] 318c7183ebf4: Retrying in 9 seconds
[2020-02-24T16:26:00.037Z] a210241fb7b0: Retrying in 9 seconds
[2020-02-24T16:26:00.037Z] 89a4929bfb9c: Retrying in 9 seconds
[2020-02-24T16:26:00.968Z] 06c4036205f0: Retrying in 8 seconds
[2020-02-24T16:26:00.968Z] 307652cfaa21: Retrying in 8 seconds
[2020-02-24T16:26:00.968Z] 318c7183ebf4: Retrying in 8 seconds
[2020-02-24T16:26:00.968Z] a210241fb7b0: Retrying in 8 seconds
[2020-02-24T16:26:00.968Z] 89a4929bfb9c: Retrying in 8 seconds
[2020-02-24T16:26:01.899Z] 06c4036205f0: Retrying in 7 seconds
[2020-02-24T16:26:01.900Z] 307652cfaa21: Retrying in 7 seconds
[2020-02-24T16:26:01.900Z] 318c7183ebf4: Retrying in 7 seconds
[2020-02-24T16:26:01.900Z] a210241fb7b0: Retrying in 7 seconds
[2020-02-24T16:26:01.900Z] 89a4929bfb9c: Retrying in 7 seconds
[2020-02-24T16:26:02.832Z] 06c4036205f0: Retrying in 6 seconds
[2020-02-24T16:26:02.832Z] 307652cfaa21: Retrying in 6 seconds
[2020-02-24T16:26:02.833Z] 318c7183ebf4: Retrying in 6 seconds
[2020-02-24T16:26:02.833Z] a210241fb7b0: Retrying in 6 seconds
[2020-02-24T16:26:02.833Z] 89a4929bfb9c: Retrying in 6 seconds
[2020-02-24T16:26:03.764Z] 06c4036205f0: Retrying in 5 seconds
[2020-02-24T16:26:03.764Z] 307652cfaa21: Retrying in 5 seconds
[2020-02-24T16:26:04.021Z] 318c7183ebf4: Retrying in 5 seconds
[2020-02-24T16:26:04.022Z] a210241fb7b0: Retrying in 5 seconds
[2020-02-24T16:26:04.022Z] 89a4929bfb9c: Retrying in 5 seconds
[2020-02-24T16:26:04.952Z] 06c4036205f0: Retrying in 4 seconds
[2020-02-24T16:26:04.952Z] 307652cfaa21: Retrying in 4 seconds
[2020-02-24T16:26:04.952Z] 318c7183ebf4: Retrying in 4 seconds
[2020-02-24T16:26:04.952Z] a210241fb7b0: Retrying in 4 seconds
[2020-02-24T16:26:04.952Z] 89a4929bfb9c: Retrying in 4 seconds
[2020-02-24T16:26:05.883Z] 06c4036205f0: Retrying in 3 seconds
[2020-02-24T16:26:05.884Z] 307652cfaa21: Retrying in 3 seconds
[2020-02-24T16:26:05.884Z] 318c7183ebf4: Retrying in 3 seconds
[2020-02-24T16:26:05.884Z] a210241fb7b0: Retrying in 3 seconds
[2020-02-24T16:26:05.884Z] 89a4929bfb9c: Retrying in 3 seconds
[2020-02-24T16:26:06.816Z] 06c4036205f0: Retrying in 2 seconds
[2020-02-24T16:26:06.817Z] 307652cfaa21: Retrying in 2 seconds
[2020-02-24T16:26:06.817Z] 318c7183ebf4: Retrying in 2 seconds
[2020-02-24T16:26:06.817Z] a210241fb7b0: Retrying in 2 seconds
[2020-02-24T16:26:06.817Z] 89a4929bfb9c: Retrying in 2 seconds
[2020-02-24T16:26:08.187Z] 06c4036205f0: Retrying in 1 second
[2020-02-24T16:26:08.188Z] 307652cfaa21: Retrying in 1 second
[2020-02-24T16:26:08.188Z] 318c7183ebf4: Retrying in 1 second
[2020-02-24T16:26:08.188Z] a210241fb7b0: Retrying in 1 second
[2020-02-24T16:26:08.188Z] 89a4929bfb9c: Retrying in 1 second
[2020-02-24T16:26:09.119Z] unauthorized: authentication required
I had run docker login successfully before the push.
Here is the harbor-clair log.
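For context, the pipeline runs roughly this sequence (a minimal sketch; the robot account name and the token variable are placeholders, not our real values):

echo "$ROBOT_TOKEN" | docker login harbor.company.net -u 'robot$jenkins' --password-stdin   # placeholder robot account and token
docker push harbor.company.net/project/container                                            # the push that fails intermittently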
{"Event":"could not download layer","Level":"warning","Location":"driver.go:130","Time":"2020-02-24 16:25:51.633354","error":"Get http://harbor-prod-harbor-core/v2/project/container/blobs/sha256:c2d332d9c1e7675bcb9dfae323b7dab73a1cd195602b3c4425c31f113991e846: dial tcp 10.96.61.194:80: connect: connection refused"}
{"Event":"failed to extract data from path","Level":"error","Location":"worker.go:122","Time":"2020-02-24 16:25:51.633479","error":"could not find layer","layer":"02332fa37e93f287f81e7e574674e5fa0a989f7356f81394b60f74f4a371e6e4","path":"http://harbor-prod-harbor-core/v2/project/container/blobs/sha256:c2d332d9c1e7675bcb9dfae323b7dab73a1cd195602b3c4425c31f113991e846"}
Here is the harbor-core log. Please ignore the timestamps; I am only pasting some of the errors here.
2020-02-25T06:55:49Z [ERROR] [/core/main.go:280]: Failed to parse SYNC_QUOTA: strconv.ParseBool: parsing "": invalid syntax
2020-02-25T06:55:50Z [ERROR] [/core/filter/security.go:244]: Failed to verify secret: failed to verify the secret: user does not exist, name: robotaccount
2020-02-25T06:56:27Z [ERROR] [/core/service/notifications/registry/handler.go:174]: registry notification: trigger scan when pushing automatically: error: 10409(conflict) : scan controller: scan : cause: error: 10409(conflict) : a previous scan process is Running
<snip>
2020/02/25 09:02:41 [D] [server.go:2774] | 127.0.0.1| 404 | 18.855152ms| match| HEAD /v2/project/container/blobs/sha256:b34f6cab83075934a1c8fa259d0543773a5f807e2822ef00c79ece4b3cfe8b8b r:/v2/*
2020-02-25T09:02:41Z [ERROR] [/core/filter/security.go:244]: Failed to verify secret: failed to verify the secret: user does not exist, name: robotaccount
<snip>
2020-02-25T09:03:17Z [ERROR] [/core/filter/security.go:244]: Failed to verify secret: failed to verify the secret: user does not exist, name: robotaccount
I think harbor-core restarted at that time. I use the scan-on-push option with Clair, and at that moment tens of images were being pushed to the same project, so Clair's load spiked suddenly. That may have increased core's load and caused core to restart or hang.
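One way I could check that hypothesis (a sketch only; the namespace and labels are assumptions based on our Helm install, and /api/health is the health-check endpoint that 1.10 provides as far as I know):

kubectl get pods -n harbor -l app=harbor,component=core   # look at the RESTARTS column for the core pod
curl -s https://harbor.company.net/api/health             # ask Harbor whether all components are healthy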
What is the problem?
Versions:
- harbor version: 1.10.0
Thanks,
We’re also facing this issue with Harbor version 1.10.0. We had no problems earlier, but we suddenly started seeing it on all of our push requests. It began three days ago, when our containers restarted due to a server reboot.
Our Harbor logs say:
May 4 09:48:12 172.18.0.1 core[6055]: 2020-05-04T07:48:12Z [ERROR] [/core/filter/security.go:244]: Failed to verify secret: failed to verify the secret: user does not exist, name: robot$CICD
But that robot account definitely exists. This happens to accounts that were created before our problems started, and after. The problem also occurs when using our CLI secrets. All containers appear to be up and running.
Notary logs the following:
May 4 10:13:47 172.18.0.1 notary-server[6055]: {"go.version":"go1.13.4","http.request.host":"FOOBAR:port","http.request.id":"FOOBAR","http.request.method":"GET","http.request.remoteaddr":"FOOBAR:PORT","http.request.uri":"FOOBAR.json","http.request.useragent":"Go-http-client/1.1","level":"info","msg":"metadata not found: You have requested metadata that does not exist.: No record found","time":"2020-05-04T08:13:47Z"}
We’ve tried restarting all containers again, but it doesn’t solve the issue. We’ve also tried turning image scanning on and off; that doesn’t solve it either.
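To rule out the account itself, the project’s robot accounts can also be listed through the API (a sketch; the hostname, project ID, and admin credentials below are placeholders, and /api/projects/{id}/robots is the 1.10 endpoint as far as I know):

curl -s -u 'admin:PASSWORD' https://FOOBAR/api/projects/1/robots   # confirm robot$CICD is listed for the project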
Met the same issue: unable to push images.
Kubernetes installation, Harbor version 1.10.2.
2020-07-23T01:37:41Z [ERROR] [/core/filter/security.go:244]: Failed to verify secret: failed to verify the secret: user does not exist, name: robot$ci-cd