distributions: [ERROR] Node.js 20 on Debian Bullseye installation fails due to checksum error
When installing Node.js into a Docker container, the build fails with a hash/size mismatch error:
#0 23.91 E: Failed to fetch https://deb.nodesource.com/node_20.x/dists/bullseye/main/binary-amd64/Packages.gz File has unexpected size (777 != 776). Mirror sync in progress? [IP: 23.217.103.89 443]
#0 23.91 Hashes of expected file:
#0 23.91 - Filesize:776 [weak]
#0 23.91 - SHA256:b8ca63ac4fbe9dad6950b850a9258db453c68c7b1c60f25457b9709684154e47
#0 23.91 - SHA1:35c0928a4089a3064e90b1c3cd0eb3f90d96983f [weak]
#0 23.91 - MD5Sum:333f082eb4c6371f533f47acb4774153 [weak]
#0 23.91 Release file created at: Tue, 16 May 2023 15:52:17 +0000
Distribution Information:
- OS: Debian Bullseye
- Docker image: php:8.1-cli-bullseye
Node Version:
- Node: 20
To Reproduce: use the following installation process:
RUN curl -fsSL https://deb.nodesource.com/setup_20.x | bash - \
&& apt-get install -y nodejs \
&& npm install --global yarn
Expected behavior: Node.js, npm, and yarn install into the image.
About this issue
- Original URL
- State: closed
- Created a year ago
- Reactions: 27
- Comments: 82 (8 by maintainers)
Links to this issue
Commits related to this issue
- Update Dockerfile see this https://github.com/nodesource/distributions/issues/1576 — committed to Nearsure/github_actions by eacarras 10 months ago
- chore(docker): Try fixing node installation on the legacy image https://github.com/nodesource/distributions/issues/1576#issuecomment-1698012034 Signed-off-by: Frank Viernau <frank_viernau@epam.com> — committed to oss-review-toolkit/ort by fviernau 8 months ago
- chore(docker): Fixing install Node.js in the legacy image Building the legacy docker on CI does not work anymore because the way how Node.js has to be installed has changed, see [1]. So, migrate the ... — committed to oss-review-toolkit/ort by fviernau 8 months ago
- chore(docker): Fix installing Node.js in the legacy image Building the legacy docker on CI does not work anymore because the way how Node.js has to be installed has changed, see [1]. So, migrate the ... — committed to oss-review-toolkit/ort by fviernau 8 months ago
FYI, I’m indirectly affected by this through Playwright which uses nodesource in its Dockerfile: https://github.com/microsoft/playwright/blob/main/utils/docker/Dockerfile.focal#L12 (Edit: permalink)
If you’re using the install script, something like this works:
curl -sL https://deb.nodesource.com/setup_16.x | sed 's/https:\/\/deb.nodesource.com/http:\/\/deb.nodesource.com/g' | bash -
Not very comfortable with such a hacky method, but better than complete breakage?
I did several cache purges yesterday. I will contact our CDN provider.
Edge servers are mirrors of the origin server, and the edge server is typically chosen to be the one nearest to your client. It seems that some of these edge servers are out of sync with the origin, leading to the issue reported above, while others are working properly.
Different edge servers have different IPs, so if you find an edge server that works (try e.g. from a remote host, or via another provider), you can use that edge server in the region where you see the error.
The simplest way is to just put its IP and the domain name into your /etc/hosts file.
E.g. this was working for me:
23.216.155.10 deb.nodesource.com
I added it to the end of my /etc/hosts.
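For illustration, that workaround can be sketched as follows. This writes to a scratch copy of the hosts file for safety; in practice you would append the line to /etc/hosts itself (with sudo). The IP is the one from the comment above and may well be stale by the time you read this, so verify it first.

```shell
# Sketch of the /etc/hosts pinning workaround. Writes to a scratch copy for
# safety; in practice, append the line to /etc/hosts itself (with sudo).
# 23.216.155.10 is the edge-server IP from the comment above and may be stale.
HOSTS=./hosts.scratch
cp /etc/hosts "$HOSTS" 2>/dev/null || : > "$HOSTS"
printf '23.216.155.10 deb.nodesource.com\n' >> "$HOSTS"
grep 'deb.nodesource.com' "$HOSTS"
```

Remember to remove the entry once the CDN is back in sync, or apt will keep hitting that one edge server forever.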
@kay-ramme
And how would I go about doing that? I don’t know what that means. 😃
Same issue on Debian Bullseye and Node 16 LTS:
Hi guys, we’ve received some reports about this; I’m looking into it.
HTTPS is working again for Node 20.x, bookworm, arm64.
Is it possible that the origin servers are briefly serving outdated artifacts as part of the build process, resulting in a race condition with the edge servers? If a request is made for an affected file on an edge server during this period, it would presumably cache an outdated artifact.
That would also explain why HTTPS is more often affected than HTTP: since the script defaults to HTTPS, more systems are making requests to the HTTPS URL, so the race condition is more likely to arise.
I really dislike switching to http… any workarounds?
Have the same issue with Laravel Sail.
Still affected:
^ Same issue, was working a few hours ago.
Hi guys, sorry about this issue again. I’ve forced the cache purge again, and I’ll raise a ticket with our CDN vendor.
Can confirm that it’s working now. Thanks!
OK, that was useful. Would you mind trying it again in a few minutes over HTTPS? Thank you.
With this Dockerfile:
I get the same error when I try to build the image on GitLab CI (without any specific configuration).
That’s not actually going to fix it. It might work around it once or twice, but it’ll break again. There are instructions above, but you probably shouldn’t use them.
I’m seeing 775 and up-to-date timestamps in all of the following:
I’m still betting on this being a race condition with caching. If that’s the case, clearing the CDN’s cache is always just going to be a temporary fix.
Does the origin have multiple servers that auto-scale or similar? If so, do servers ever come online prior to ensuring that they’re up-to-date? That would cause this issue, and the CDN wouldn’t be to blame.
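The check apt is failing can be illustrated offline. A minimal sketch, using a fabricated Release excerpt and file with the sizes from the error above (real Release files list a checksum, size, and path per index file; the names `Release.sample` and `Packages.sample` are made up for the demo):

```shell
# Reproduce apt's file-size consistency check offline, with fabricated data
# matching the numbers in the error above (advertised 776 bytes, served 777).
cat > Release.sample <<'EOF'
SHA256:
 b8ca63ac4fbe9dad6950b850a9258db453c68c7b1c60f25457b9709684154e47 776 main/binary-amd64/Packages.gz
EOF
printf '%*s' 777 '' > Packages.sample   # the "served" file: 777 bytes
want=$(awk '/Packages.gz/ {print $2}' Release.sample)
have=$(wc -c < Packages.sample)
[ "$want" -eq "$have" ] || echo "size mismatch: expected $want, got $have"
```

Since the Release file and the index files are fetched separately, any window where they come from different origin states produces exactly this mismatch.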
It’s working fine now, with no change from my side and no workaround! Maybe it was a caching issue somewhere; I doubt it was my server, since I tried clearing the cache.
It is solved by editing /etc/apt/sources.list.d/nodesource.list and changing https to http in the URLs, thus:
deb [signed-by=/usr/share/keyrings/nodesource.gpg] http://deb.nodesource.com/node_16.x focal main
deb-src [signed-by=/usr/share/keyrings/nodesource.gpg] http://deb.nodesource.com/node_16.x focal main
And then typing:
apt update
apt upgrade
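The same edit can be scripted with sed. A sketch on a scratch copy (in practice you would run the sed line against /etc/apt/sources.list.d/nodesource.list with sudo, then apt update; the scratch filename is made up for the demo):

```shell
# Rewrite https -> http in a NodeSource apt source list. Shown on a scratch
# copy; in practice, target /etc/apt/sources.list.d/nodesource.list with sudo.
cat > nodesource.list.scratch <<'EOF'
deb [signed-by=/usr/share/keyrings/nodesource.gpg] https://deb.nodesource.com/node_16.x focal main
deb-src [signed-by=/usr/share/keyrings/nodesource.gpg] https://deb.nodesource.com/node_16.x focal main
EOF
sed -i 's|https://deb.nodesource.com|http://deb.nodesource.com|g' nodesource.list.scratch
cat nodesource.list.scratch
```

Since the packages themselves are GPG-signed (the signed-by keyring), dropping TLS here trades transport privacy for availability without giving up integrity checks.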
E: Failed to fetch https://deb.nodesource.com/node_16.x/dists/focal/main/binary-amd64/Packages.gz File has unexpected size (776 != 775). Mirror sync in progress? [IP: 130.206.192.15 443]
Hashes of expected file:
- Filesize:775 [weak]
- SHA256:0c6d8382f60bbb4bcc24ce922521673cd749c7530b2128b8dffb3e11297d1a15
- SHA1:1e1b1951e616fb84373ee32a26944b356a83149d [weak]
- MD5Sum:fdee69ee72123acebf6c975c753f6182 [weak]
Release file created at: Wed, 21 Jun 2023 21:26:29 +0000
E: Some index files failed to download. They have been ignored, or old ones used instead.
I am trying to install Node 16 on Ubuntu and am getting the same error.
Looks like it’s working everywhere I need (arm64 bullseye, amd64 bullseye, amd64 bookworm, armhf bullseye) 😄
Node 14.x reached EOL in April: https://nodejs.dev/en/about/releases/ I would assume it’s not supported. Although you shouldn’t, I’m sure you can get it working if you resolve the dependency problems, but that’s a separate issue; it’s not related to the HTTP/HTTPS staleness issue.
Much appreciated! Let me know if there’s any additional debugging info I can provide.
In the meantime, anyone experiencing this issue can probably work around it by using HTTP instead of HTTPS. This isn’t ideal, but the packages are still signed, so it should at least ensure package integrity.
I think this is probably a configuration thing. I’d say the system that does not fail is not using the weak file-size test; however, I haven’t been able to discover how to disable the file-size test, or even where it’s implemented 😦
The hash tests are more robust.
My guess is the file was edited after hashes were generated.
Interestingly, this doesn’t always happen. On one device (a host with Ubuntu 20.04) the following fails with the mentioned error message, while on another device (a host with Ubuntu 22.04) it works fine:
docker run --rm -it openjdk:11-jre-slim-buster bash