backstage: 🐛 Bug Report: zlib: unexpected end of file
📜 Description
We’ve started to see “zlib: unexpected end of file” errors in our logs. At first it was only in a single place, but now we are seeing it in another. Both come from places where we use the UrlReader: one with Bitbucket and the other with GitHub. We can’t reproduce it locally, but it happens in all of our deployed environments.
The error stack:
zlib: unexpected end of file, ZlibError: zlib: unexpected end of file
at Unzip.write (/app/node_modules/minizlib/index.js:154:22)
at Unzip.flush (/app/node_modules/minizlib/index.js:105:10)
at Unzip.end (/app/node_modules/minizlib/index.js:111:10)
at Unpack.end (/app/node_modules/tar/lib/parse.js:502:21)
at endFn (node:internal/streams/pipeline:425:11)
at process.processTicksAndRejections (node:internal/process/task_queues:77:11)
Others have reported this on Discord as well: https://discord.com/channels/687207715902193673/1108389526440771624
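For context on the stack above: it points into `tar` and `minizlib`, which suggests the failure happens while extracting the gzipped tarball that the SCM archive endpoint returns. Below is a minimal sketch of that kind of pipeline, not the actual Backstage internals; the archive URL and helper name are hypothetical.

```ts
// Minimal sketch (not the actual Backstage internals): streaming a gzipped
// tarball into `tar`, which decompresses via minizlib. If the gzip stream is
// truncated or malformed, the pipeline rejects with
// "ZlibError: zlib: unexpected end of file", matching the stack above.
import { pipeline } from 'node:stream/promises';
import fetch from 'node-fetch';
import * as tar from 'tar';

// The URL and target directory are hypothetical, for illustration only.
async function extractRepoArchive(archiveUrl: string, targetDir: string) {
  const response = await fetch(archiveUrl);
  if (!response.ok || !response.body) {
    throw new Error(`Failed to fetch archive: ${response.status}`);
  }
  // tar.x() returns a writable Unpack stream; gunzip happens inside it.
  await pipeline(response.body, tar.x({ cwd: targetDir, strip: 1 }));
}
```

The important part for this report is that the error comes from the decompression step of that download, not from our plugin logic.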
👍 Expected behavior
There should be no errors, and the UrlReader should work as expected.
👎 Actual Behavior with Screenshots
We are getting this error:
zlib: unexpected end of file, ZlibError: zlib: unexpected end of file
at Unzip.write (/app/node_modules/minizlib/index.js:154:22)
at Unzip.flush (/app/node_modules/minizlib/index.js:105:10)
at Unzip.end (/app/node_modules/minizlib/index.js:111:10)
at Unpack.end (/app/node_modules/tar/lib/parse.js:502:21)
at endFn (node:internal/streams/pipeline:425:11)
at process.processTicksAndRejections (node:internal/process/task_queues:77:11)
👟 Reproduction steps
Honestly, this isn’t one that we are able to reproduce. We’ve only seen it in our logs, and working back from there, every occurrence comes from using the UrlReader.
📃 Provide the context for the Bug.
This error is causing issues with a few internal plugins that require pulling files from various SCMs using the UrlReader, which was working fine before.
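To make that context concrete, here is a hedged sketch of how those internal plugins use the UrlReader; the repository URL and the file handling are illustrative, not our actual plugin code.

```ts
// Illustrative only: roughly how our internal plugins pull files via the
// UrlReader. readTree() fetches the target as a compressed archive under the
// hood, which is where the tar/minizlib stack above comes from.
import { UrlReader } from '@backstage/backend-common';

export async function loadConfigFiles(reader: UrlReader) {
  const tree = await reader.readTree(
    // Hypothetical repository path, standing in for our internal repos.
    'https://github.com/acme/internal-configs/tree/main/configs',
  );
  const files = await tree.files();
  return Promise.all(
    files.map(async file => ({
      path: file.path,
      content: (await file.content()).toString('utf8'),
    })),
  );
}
```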
🖥️ Your Environment
I can’t get all the details, but this started for us somewhere between 1.14.0-next.2 and 1.14.0.
👀 Have you spent some time to check if this bug has been raised before?
- I checked and didn’t find a similar issue
🏢 Have you read the Code of Conduct?
- I have read the Code of Conduct
Are you willing to submit a PR?
No, but I’m happy to collaborate on a PR with someone else
About this issue
- State: closed
- Created a year ago
- Comments: 37 (11 by maintainers)
We updated Backstage to the latest release version (as of this writing) and the error is gone; we didn’t pursue it further. Our instance was quite old (the last update was in April).
I just saw this issue occur in a new place: during a TechDocs build within the app.
OK, so I just did some digging around, and it looks like this issue has been brought to the Node team, but it’s actually zlib that’s causing it: the stream is malformed in the first place, and zlib is now a little more strict about how that’s handled.
https://github.com/nodejs/node/issues/46359 https://github.com/madler/zlib/issues/773
I wonder if this is something we need to bring up with GitHub instead, as it looks like their zlib compression is invalid?
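For what it’s worth, the stricter behavior is easy to see in isolation with plain Node.js zlib: gunzipping a gzip buffer whose trailer has been cut off fails with exactly this message. This is only an illustration of the error, not a claim about what GitHub’s archive endpoint is actually producing.

```ts
// Standalone illustration (no Backstage involved) of the error message:
// decompressing a gzip buffer with its trailer cut off throws
// "unexpected end of file" on current Node.js versions.
import { gzipSync, gunzipSync } from 'node:zlib';

const compressed = gzipSync(Buffer.from('a perfectly ordinary payload'));
// Drop the last 8 bytes (the CRC32 + ISIZE gzip trailer) to simulate a
// truncated/malformed stream.
const truncated = compressed.subarray(0, compressed.length - 8);

try {
  gunzipSync(truncated);
} catch (err) {
  console.error(err); // Error: unexpected end of file
}
```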
It is actually quite logical that it’s related, since TechDocs would also try to fetch the whole repo as an archive and pull the required part out of it (as far as my experiments go).
“Me too.” The funny thing is that locally (in dev mode) it works fine (from time to time the same template fails, maybe 5% of the time), in the container it’s fine, but once the container is deployed into Kubernetes the failure rate is 100%. I am still trying to figure out what could be the reason for this difference, but looking at the initial issue message (“We can’t reproduce it locally, but it happens in all of our deployed environments.”) it seems to fit.