parcel: Segmentation fault in alpine Docker on M1 Mac
🐛 bug report
Running Parcel in an Alpine Docker container on an M1 Mac results in a segmentation fault.
🎛 Configuration (.babelrc, package.json, cli command)
```dockerfile
FROM node:alpine
RUN apk add --no-cache build-base python3 && \
    apk add --no-cache gcc g++
RUN npm i -g parcel
CMD parcel
```
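To reproduce, build and run the image roughly like this (assuming the Dockerfile above is saved as `Dockerfile` in an otherwise empty directory; the tag name is arbitrary):

```sh
# Build on an M1 Mac; Docker defaults to the host's linux/arm64 platform
docker build -t parcel-segfault-repro .
# Running the container is where the segfault shows up
docker run --rm -it parcel-segfault-repro
```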
🤔 Expected Behavior
Parcel should run without crashing.
😯 Current Behavior
Parcel segfaults.
Backtrace from gdb (frames #2 through #21 are identical to #1 and are collapsed here):

```
#0  0x0000000000005ed0 in ?? ()
#1  0x0000ffff85c613f0 in mdb_env_open () from /usr/local/lib/node_modules/parcel/node_modules/lmdb-store/prebuilds/linux-arm64/node.abi102.node
...
#21 0x0000ffff85c613f0 in mdb_env_open () from /usr/local/lib/node_modules/parcel/node_modules/lmdb-store/prebuilds/linux-arm64/node.abi102.node
```
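For anyone who wants to capture the same trace, a rough sketch of doing it inside the container (the exact parcel entry-point path is an assumption and may differ in your image):

```sh
apk add --no-cache gdb
gdb --args node /usr/local/bin/parcel
# then, at the (gdb) prompt:
#   run
#   bt
```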
💁 Possible Solution
Possibly the prebuilt lmdb-store binary (the linux-arm64 `node.abi102.node` in the backtrace above) isn't compatible with running on the M1 under Alpine/musl.
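One way to narrow that down (just a suggestion, not something verified here) is to run the same image under amd64 emulation, which pulls different prebuilds:

```sh
# If the amd64 build works, the problem is specific to the linux/arm64 prebuilds
docker build --platform=linux/amd64 -t parcel-segfault-repro-amd64 .
docker run --rm -it --platform=linux/amd64 parcel-segfault-repro-amd64
```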
🔦 Context
💻 Code Sample
🌍 Your Environment
| Software | Version(s) |
|---|---|
| Parcel | 2.0.1 |
| Node | v17.2.0 |
| npm/Yarn | 8.1.4 |
| Operating System | Alpine |
About this issue
- State: open
- Created: 3 years ago
- Reactions: 1
- Comments: 24 (3 by maintainers)
FYI, I'm on parcel 2.7.0 (building from node:16-alpine) and still getting segfaults:

```
/node_modules/@parcel/optimizer-image/parcel-image.linux-arm64-musl.node: __crc32w: symbol not found
```
Any update on this?
@lorenzogrv Clone the repo, run `yarn` in the root, go into this directory https://github.com/parcel-bundler/parcel/tree/v2/packages/optimizers/image, and run `yarn build` or `yarn build-release`. That will create the file `parcel-image.darwin-x64.node` (or whatever target you're on).

The `napi` CLI will internally call `cargo build` (or some version of that), so you'll need to figure out how to set these ARM flags for Rust. Maybe something like https://rust-lang.github.io/packed_simd/perf-guide/target-feature/rustflags.html If you have figured out what has to be set, we can see how to set that flag conditionally only on ARM.
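For anyone trying this, a sketch of what that could look like (the `+crc` target feature is my guess at what provides `__crc32w`; not verified):

```sh
# From the repo root, after running `yarn` once:
cd packages/optimizers/image
# Hypothetical: enable the aarch64 CRC extension while building the native optimizer
RUSTFLAGS="-C target-feature=+crc" yarn build-release
```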
FROM node:18.12.1-alpine3.16 … Parcel 2.8 …

```
#87 35.81 /app/node_modules/@parcel/optimizer-image/parcel-image.linux-arm64-musl.node:
#87 35.81 __crc32w: symbol not found
```
I'm commenting here because #8790 was closed by @mischnic with instructions to follow up here.

After resolving a failing build on ARM64 Graviton2 caused by huge memory usage (see the comment at #5072), we hit the same "symbol not found" error as above.

Disabling image optimization works around the issue, but of course that isn't desirable as a permanent fix. It smells like something is wrong with Parcel's arm64-musl build. After some research, I found an interesting Q&A on Stack Overflow:

To me it seems some ARM flag is needed at compile time. I'd like to dive in, but some guidance on Parcel's internals would help so I can compile optimizer-image by my own means. Are there any docs where I can research further?
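For reference, the "disable image optimization" workaround mentioned above can be expressed in `.parcelrc` roughly like this (a sketch assuming the default config; defining the pipeline without `"..."` should drop the inherited `@parcel/optimizer-image`):

```json
{
  "extends": "@parcel/config-default",
  "optimizers": {
    "*.{jpg,jpeg,png}": []
  }
}
```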
Still happening with:
- Host: Apple M1 Mac, Monterey
- Parcel: 2.7.0
- Container from: node:18-alpine3.16
- Docker version 20.10.17, build 100c701

I'm going to use node:18-slim; I can't find any solution to the issue 😦
I was able to work around the latest iteration of this issue by disabling the image optimizer, since we aren't using it, but for those who need it the crash is still occurring in 2.3.2 with or without `PARCEL_WORKER_BACKEND=process`.

FYI, we're seeing the same issue as @sethsamuel. We were initially seeing segfaults when trying to run on M1 Macs, and are now encountering the same error.
Any updates or known workarounds here?