moby: Hash sum mismatch

Steps to reproduce the issue:

  1. Running apt-get update

Describe the results you received: Running apt-get update on Debian Stretch just now results in

Err:2 https://apt.dockerproject.org/repo debian-stretch/main amd64 Packages
  Hash Sum mismatch

as well as

E: Failed to fetch https://apt.dockerproject.org/repo/dists/debian-stretch/main/binary-amd64/Packages.bz2  Hash Sum mismatch

I have cleaned the apt caches and tried again with the same result. Also, I’m not using a proxy.

Describe the results you expected: No error.

About this issue

  • Original URL
  • State: closed
  • Created 8 years ago
  • Reactions: 121
  • Comments: 90 (17 by maintainers)

Most upvoted comments

Hi everyone. I work at Docker.

First, my apologies for the outage. I consider our package infrastructure critical infrastructure, both for the free and commercial versions of Docker. It’s true that we offer better support for the commercial version (it’s one of its features), but that should not apply to fundamental things like being able to download your packages.

The team is working on the issue and will continue to give updates here. We are taking this seriously.

Some of you pointed out that the response time and use of communication channels seem inadequate; for example, the @dockerstatus bot did not mention the issue when it was detected. I share that opinion, but I don’t know the full story yet; the post-mortem will tell us for sure what went wrong. At the moment the team is focusing on fixing the issue and I don’t want to distract them from that.

Once the post-mortem identifies what went wrong, we will take appropriate corrective action. I suspect part of it will be better coordination between core engineers and infrastructure engineers (2 distinct groups within Docker).

Thanks and sorry again for the inconvenience.

For everybody raging over this downtime, here’s a cute deer picture to calm down and pass the time in the meanwhile: [deer picture]

To give an update; I raised this issue internally, but the people needed to fix this are in the San Francisco timezone, so they’re not present yet.

Does this mean that Docker – a major infrastructure company – does not have any on-call engineers available to fix this?

To all complainers and ragers:

There is a commercial, paid, and well-supported version of Docker.

FYI, this is the community version, supported on a best-effort basis and NO MORE.

We’re working on it, folks.

Can we agree that the hash sums are incorrect and that the repo needs administrative action?

We localized the cause of the issue, and it should be resolved now; please try again.

You may need to clear the apt cache first:

apt-get clean && apt-get update
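
If clearing the cache alone doesn’t help, note that apt-get clean only empties /var/cache/apt/archives; the downloaded index files that actually trigger the mismatch live in /var/lib/apt/lists. A blunter (but common) remedy, which simply forces all indexes to be re-fetched:

sudo rm -rf /var/lib/apt/lists/*
sudo apt-get update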

Accidents happen; it’s how we deal with them and the lessons we take forward that matter. Most of this thread seems to be rife with speculation. Thanks in advance to all Docker team members working on fixing this problem.

Docker repo maintainers. You need:

  • Automatic testing on changes
  • Healthchecks of your repo (see the sketch after this list)
  • Basically, monitoring and alarms
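
A healthcheck here could be as small as re-hashing the published index and comparing it against the signed metadata. A minimal sketch, using the URL from this thread (the exact paths listed inside InRelease depend on the repo layout, and the alert should be wired to your actual paging system):

dist=https://apt.dockerproject.org/repo/dists/ubuntu-xenial/main/binary-amd64
actual=$(curl -fsSL "$dist/Packages" | sha512sum | cut -d' ' -f1)
expected=$(curl -fsSL "$dist/InRelease" | awk '/^SHA512:/{s=1;next} s && $3=="Packages" {print $1; exit}')
[ "$actual" = "$expected" ] && echo "repo OK" || echo "ALERT: Packages checksum does not match InRelease"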

I hope this never happens again. This was causing production test and deployment issues here (on Travis CI), even though I’m not using a single Docker container in production. 😑

Here is a script for Ubuntu to get notified by a chime (plays audio file) when the repository checksums get updated, https://gist.github.com/simos/7ee8258ec17101e44bbfa93606694ede
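
The gist presumably boils down to polling apt until the mismatch disappears, then playing a sound. A rough equivalent sketch (the sound file path is an assumption, not taken from the gist):

while sudo apt-get update 2>&1 | grep -q 'Hash Sum mismatch'; do
    sleep 60   # poll once a minute
done
paplay /usr/share/sounds/freedesktop/stereo/complete.oga   # any chime will do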

I think there is not much to say other than that we should get an official response from Docker on this.

@mlafeldt guess you didn’t pay for 24/7 support.

USER POLL

The best way to get notified of updates is to use the Subscribe button on this page.

Please don’t use “+1” or “I have this too” comments on issues. We automatically collect those comments to keep the thread short.

The people listed below have upvoted this issue by leaving a +1 comment:

@ViGo5190

@therealmarv This problem shouldn’t affect your production or deployment pipelines anyway, since no one should rely on an Internet connection or an external repository to build and deploy software.
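
The usual way to decouple builds from upstream availability is a local mirror or caching proxy in front of external repos. As a sketch, with apt-cacher-ng running somewhere on your network (the host name is hypothetical, and note that HTTPS repos like this one are only tunnelled, not cached, by default), each client needs a single apt configuration line:

# /etc/apt/apt.conf.d/01proxy
Acquire::http::Proxy "http://apt-cacher.internal:3142";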

Such an interesting single point of failure for the Docker ecosystem.

@theluk the Experimental build is built from master currently

To give an update; I raised this issue internally, but the people needed to fix this are in the San Francisco timezone, so they’re not present yet.

As a temporary workaround, you can install docker 1.11.2-rc1 from the “test” repository; 1.11.2-rc1 is almost the same as the current release, apart from these three changes; https://github.com/docker/docker/pull/23164, https://github.com/docker/docker/pull/23169, and https://github.com/docker/docker/pull/23176

Those changes should not make a functional difference (and the last change only affects some corner-cases)

You can install the RC either by changing “main” to “test” in your APT repository configuration, or by using the install script;

curl -fsSL https://test.docker.com | sh
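
For the first option, something like this should work, assuming the entry lives in /etc/apt/sources.list.d/docker.list (and note a later comment says the component is spelled “testing”, not “test”):

sudo sed -i 's/ main$/ testing/' /etc/apt/sources.list.d/docker.list
sudo apt-get update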

Hoping to get this fixed ASAP

@thaJeztah so there is a different repo for commercial users that’s not broken?

I never knew Docker was a two-tier organisation where the user base is split between the haves and have-nots. Surely installing Docker is a global concern for everyone using the software, and therefore the support that “commercial” people get should also apply to the community. A paid tier is a good way for an organisation to make money, but it should start beyond the basics, like being able to install your software.

ETA?

I’ve tweeted @docker @dockerstatus (multiple times)… this is a major issue… surprised they’ve been so silent!

Same problem on Ubuntu Trusty

W: Failed to fetch https://apt.dockerproject.org/repo/dists/ubuntu-trusty/main/binary-amd64/Packages  Hash Sum mismatch

E: Some index files failed to download. They have been ignored, or old ones used instead.

Thanks for working on fixing this. It would be great if you’d publish the results of the post-mortem once fixed.

Probably shouldn’t be surprised, but it is shocking how many people risk their infrastructure with hard dependencies on external repos. I don’t even do that with my home systems.

And then complain about Docker having a single point of failure?

I’m sad that my deer picture got less +1s than the official answer.

I worked around this by downloading the new package manually and installing it using dpkg:

curl -OL https://apt.dockerproject.org/repo/pool/main/d/docker-engine/docker-engine_1.11.2-0~trusty_amd64.deb
sudo dpkg -i docker-engine*.deb
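
If you install out-of-band like this, it can also help to hold the package so a later apt-get upgrade against the (still broken) repo can’t replace it; apt-mark is the standard tool:

sudo apt-mark hold docker-engine
# later, once the repo is fixed:
sudo apt-mark unhold docker-engine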

@vadviktor does not work at Docker, nor does he maintain the project.

@vadviktor Best effort does not mean bringing everyone down. It means that small bugs and defects get looked at eventually. You still need to keep everything running under normal circumstances.

For people using Travis, I was able to fix it by doing the following:

before_install:
- sudo apt-get install -y libsystemd-journal0   # runtime dependency of docker-engine
- pushd /tmp
- curl -OL https://apt.dockerproject.org/repo/pool/main/d/docker-engine/docker-engine_1.10.2-0~trusty_amd64.deb
- sudo dpkg --force-all -i docker-engine*.deb   # install directly, bypassing the broken apt index
- docker -v
- popd

@krak3n yes, there are separate releases for the commercially supported version.

@jalawrence Docker is the tip of the iceberg… Did you hear about the recent problems with Node.js and one dev pulling a single package? I am pretty sure that most PHP developers using Composer - the de facto package manager for that platform - also do not store complete copies of all their site’s dependencies, and the fact that there have been no mishaps so far is more luck than anything. The problem is that everybody and their dog now depends on $world, and caching all the dependencies locally is a Sisyphean task. Shall I cache all of Debian, all of Packagist, all of CPAN, all of RubyGems, all of npm within a reverse proxy at my own expense? And then: if GitHub, Bitbucket, or Travis are down, what will my developers be able to do anyway? Do I want to go back to the days when I had to host all of that?

Thank you all for the reports; we’re very sorry for this. We’re looking into the details and the timeline of events that led to this, and we’ll make sure it doesn’t happen again.

I’m closing the issue, but of course feel free to let me know if you see any remaining quirks.

@vadviktor Is that the official position of Docker? Because I’d like to quote it.

@xuedong09 instead of “test” use “testing”

Yes, we need an ETA too; it’s pretty urgent - our complete Travis build chain is dead now -.-

It appears to be working for Ubuntu Xenial now.

Here are the relevant files for xenial: https://apt.dockerproject.org/repo/dists/ubuntu-xenial/main/binary-amd64/

InRelease      02-Jun-2016 11:06  2.6K
Packages       02-Jun-2016 02:38  4.8K
Packages.bz2   02-Jun-2016 02:38  1.7K
Packages.gz    02-Jun-2016 02:38  1.4K
Release        02-Jun-2016 03:43  1.7K
Release.gpg    02-Jun-2016 03:43   801

We can see that these files were regenerated earlier today. The checksums (hashes) for these files should match the ones listed in the signed InRelease file.

In the InRelease file (https://apt.dockerproject.org/repo/dists/ubuntu-xenial/main/binary-amd64/InRelease), it says that the file was generated on Thu, 02 Jun 2016 03:43:32 UTC. However, the timestamp shown by the web server is 02-Jun-2016 11:06.

Among the several possible causes of a Hash Sum mismatch, this one comes down to InRelease having been updated with wrong checksums. In addition, InRelease lists the Release file as being 0 bytes long.
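
You can reproduce that check by hashing the index locally and comparing it against what the signed file advertises. Roughly (URLs are the ones from this comment; output formats will vary):

curl -fsSL https://apt.dockerproject.org/repo/dists/ubuntu-xenial/main/binary-amd64/Packages | sha512sum
curl -fsSL https://apt.dockerproject.org/repo/dists/ubuntu-xenial/main/binary-amd64/InRelease | sed -n '/^SHA512:/,$p'

The two SHA512 values should agree; during the outage, they did not.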

Same on Debian Jessie: W: Failed to fetch https://apt.dockerproject.org/repo/dists/debian-jessie/main/binary-amd64/Packages Hash Sum mismatch

Up until now there have been workarounds (grab the .deb and install it with dpkg, temporarily switch to the testing repository, etc.). These are not permanent solutions.

A fix means that the source of this problem is resolved and we can mark this issue as Solved.

As posted earlier, you can use a script to get an audio notification as soon as the main Docker repositories are fixed: https://gist.github.com/simos/7ee8258ec17101e44bbfa93606694ede Other than that, there is not much to do.

Sadly, with this at the top of Hacker News, there are going to be billions of comments. Big thanks for the quick fix, @thaJeztah.

I wonder if we should lock this thread before they show up.

👍

Trusty appears to be back up

For ubuntu trusty (14.04), switching from the “main” to “testing” APT repository worked great for me.

@xuedong09 Just keep in mind that’s where we publish pre-release packages.

Thanks @crunis - that Travis fix works a treat.

I’m gobsmacked that this process isn’t automated, with the checksums calculated independently by separate Docker containers and, in the event of a disputed calculation between them, the upload held until a human can intervene (something like the sketch below).
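
Purely as an illustration of that idea, here are two independent environments hashing the same artifact and refusing to proceed on disagreement (image names and the artifact path are placeholders):

h1=$(docker run --rm -v "$PWD":/d debian:stable sha256sum /d/Packages | cut -d' ' -f1)
h2=$(docker run --rm -v "$PWD":/d alpine sha256sum /d/Packages | cut -d' ' -f1)
[ "$h1" = "$h2" ] || { echo "checksum dispute - holding upload for human review"; exit 1; }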

@mlafeldt Commercial support does; the open-source packages are served from separate infrastructure.

Same problem with Ubuntu Trusty on Travis CI:

W: Failed to fetch https://apt.dockerproject.org/repo/dists/ubuntu-trusty/main/binary-amd64/Packages  Hash Sum mismatch

Running apt-get -o Debug::pkgAcquire::Auth=true update on Ubuntu 14.04 yields

[Waiting for headers]201 URI Done: bzip2:/var/lib/apt/lists/partial/apt.dockerproject.org_repo_dists_ubuntu-trusty_main_binary-amd64_Packages
RecivedHash: SHA512:d6ca1f74e876031161d1abd6cf9ad0b45f60b19876468cfcf9cacd4956dfd13be43147227a8daa5536f1455bb75b353b178942bc1843d11f0188d00117483912
ExpectedHash: SHA512:d07a3f2c42a9b213e3f03f2f11c08154512baa9fbbaed19f3601865634b82cfdde0e65151a24e523017f29ecfd08a1dfc0af2c2117b025c46d683160892b0de6


https://apt.dockerproject.org/repo/dists/ubuntu-trusty/main/binary-amd64/Packages: 
Computed Hash: SHA512:d6ca1f74e876031161d1abd6cf9ad0b45f60b19876468cfcf9cacd4956dfd13be43147227a8daa5536f1455bb75b353b178942bc1843d11f0188d00117483912  
Expected Hash: SHA512:d07a3f2c42a9b213e3f03f2f11c08154512baa9fbbaed19f3601865634b82cfdde0e65151a24e523017f29ecfd08a1dfc0af2c2117b025c46d683160892b0de6

Same here!

Failed to fetch https://apt.dockerproject.org/repo/dists/ubuntu-trusty/main/binary-amd64/Packages  Hash Sum mismatch

Heh - got the catalog but the package is missing - guess I’ll have another coffee 😃 @shykes Thanks for the update - lousy way to start your morning… Hope the day gets better from here.

Thanks guys!