parcel: 🐛 `parcel watch` in 1.5.0 stuck on "Building someFile.js..."
I tried updating my project from parcel-bundler 1.4.1 to 1.5.0, but `parcel watch` never finishes. I removed the cache folder manually and started the watch command (see the exact CLI command below). It gets stuck at a seemingly random file, displaying `∞ Building someFile.js...` forever.
The `parcel build` command works, by the way.
🎛 Configuration (.babelrc, package.json, cli command)
.babelrc:
{
  "presets": ["env", "es2015", "react"],
  "plugins": [
    "transform-class-properties",
    "transform-object-rest-spread",
    "transform-decorators-legacy",
    "transform-es3-member-expression-literals",
    "transform-es3-property-literals"
  ],
  "env": {
    "production": {
      "plugins": ["transform-remove-debugger", "transform-remove-console"]
    }
  }
}
package.json (without name etc. but all deps & scripts):
{
  "dependencies": {
    "animate.css": "^3.5.2",
    "babel-polyfill": "^6.26.0",
    "flag-icon-css": "^2.8.0",
    "immer": "^0.3.0",
    "lodash": "^4.17.4",
    "material-ui": "^1.0.0-beta.12",
    "material-ui-icons": "^1.0.0-beta.10",
    "object.values": "^1.0.4",
    "pubsub-js": "^1.5.7",
    "rc-collapse": "^1.7.6",
    "react": "16.2.0",
    "react-dom": "16.2.0",
    "react-json-view": "^1.13.0",
    "react-onclickoutside": "^6.4.0",
    "react-redux": "5.0.6",
    "react-treebeard": "^2.0.3",
    "redux": "3.7.2",
    "redux-saga": "^0.15.6",
    "redux-thunk": "^2.2.0",
    "reselect": "^3.0.1",
    "workerize": "^0.1.2"
  },
  "devDependencies": {
    "babel-eslint": "7.2.3",
    "babel-jest": "^22.0.3",
    "babel-plugin-transform-class-properties": "^6.24.1",
    "babel-plugin-transform-decorators-legacy": "^1.3.4",
    "babel-plugin-transform-es3-member-expression-literals": "^6.22.0",
    "babel-plugin-transform-es3-property-literals": "^6.22.0",
    "babel-plugin-transform-object-rest-spread": "^6.26.0",
    "babel-plugin-transform-remove-console": "^6.8.5",
    "babel-plugin-transform-remove-debugger": "^6.8.5",
    "babel-preset-env": "^1.6.1",
    "babel-preset-es2015": "^6.24.1",
    "babel-preset-react": "^6.24.1",
    "cross-env": "^5.0.2",
    "enzyme": "^2.8.2",
    "eslint": "4.1.1",
    "eslint-config-react-app": "^2.0.1",
    "eslint-plugin-flowtype": "2.34.1",
    "eslint-plugin-import": "2.6.0",
    "eslint-plugin-jsx-a11y": "5.1.1",
    "eslint-plugin-react": "^7.5.1",
    "husky": "^0.14.3",
    "identity-obj-proxy": "^3.0.0",
    "jest": "22.1.2",
    "jest-css-modules": "^1.1.0",
    "lint-staged": "^6.0.0",
    "node-sass": "^4.5.3",
    "parcel-bundler": "1.5.0",
    "prettier": "1.10.2",
    "react-test-renderer": "^16.2.0",
    "redux-mock-store": "^1.2.3",
    "sinon": "^4.1.6"
  },
  "scripts": {
    "watch": "cross-env REACT_APP_VERSION=$npm_package_version parcel watch src/index.js --out-dir ./public/static --public-url ./static/",
    "test": "jest",
    "lint": "eslint ./src",
    "pretty": "prettier --write \"src/**/*.{js,css,md,scss}\"",
    "precommit": "lint-staged",
    "prepush": "npm run test --coverage"
  },
  "lint-staged": {
    "src/**/*.{js,css,md,scss}": [
      "prettier --write",
      "git add"
    ],
    "src/**/*.js": "eslint --max-warnings=0"
  },
  "jest": {
    "rootDir": "src",
    "moduleNameMapper": {
      "\\.(css|less|scss)$": "identity-obj-proxy"
    },
    "transform": {
      "^.+\\.js$": "babel-jest",
      "\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga)$": "<rootDir>/fileTransformer.js"
    },
    "setupFiles": [
      "<rootDir>/setupTests.js"
    ]
  }
}
Used cli command: `yarn watch`, which runs the `watch` script from the package.json above, i.e. `cross-env REACT_APP_VERSION=$npm_package_version parcel watch src/index.js --out-dir ./public/static --public-url ./static/`.
🤔 Expected Behavior
I expect the parcel watch command to finish building.
😯 Current Behavior
The command is getting stuck while building.
💁 Possible Solution
No idea, sorry.
🔦 Context
I will have to stay on version 1.4.1 until `parcel watch` works again in my project, as I depend heavily on it during development.
🌍 Your Environment
| Software | Version(s) |
| --- | --- |
| Parcel | 1.5.0 |
| Node | 8.9.3 |
| npm | 5.6.0 |
| yarn | 1.3.2 |
| Operating System | Microsoft Windows 10 Enterprise [Version 10.0.15063] |
About this issue
- Original URL
- State: closed
- Created 6 years ago
- Reactions: 1
- Comments: 18 (7 by maintainers)
Commits related to this issue
- win,pipe: fix IPC pipe deadlock This fixes a bug where IPC pipe communication would deadlock when both ends of the pipe are written to simultaneously, and the amount of data written exceeds the size ... — committed to libuv/libuv by piscisaureus 6 years ago
- win,pipe: fix IPC pipe deadlock This fixes a bug where IPC pipe communication would deadlock when both ends of the pipe are written to simultaneously, and the kernel pipe buffer has already been fill... — committed to libuv/libuv by piscisaureus 6 years ago
Actually fixed in #901; #900 is the issue. It was already fixed a while ago, but not yet merged. Probably just upvote the PR and it should get into the next patch, I guess 😃
@amykapernick Yes, it’s always random, but it’s not related to the call time, rather to the call count.
I tried your repository and found out that if we change the workerfarmOptions to this, it works:
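(The original snippet was not preserved in this thread. As a rough sketch, assuming the worker-farm options named in the description below — `maxCallTime` and `maxRetries` are real worker-farm options, but the exact values and how Parcel 1.x passes them in are assumptions — the change presumably looked something like this:)

```js
// Rough sketch, not the original snippet from this comment.
// worker-farm exposes maxCallTime and maxRetries, which match the behaviour
// described below (1 second timeout, unlimited retries); how Parcel 1.x
// wires these options in is assumed here.
const workerFarm = require('worker-farm');
const os = require('os');

const workerfarmOptions = {
  maxConcurrentWorkers: os.cpus().length, // assumption: one worker per CPU core
  maxCallTime: 1000,                      // give up on a worker call after 1 second
  maxRetries: Infinity                    // retry timed-out calls indefinitely
};

// Example usage (hypothetical worker module path):
// const workers = workerFarm(workerfarmOptions, require.resolve('./worker'));
```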
This lets a call to the remote worker time out after 1 second and retry an infinite number of times after a timeout. Apparently it doesn’t throw a workerTimeout error or any other error, so something inside Parcel’s code or the worker-farm code is freezing the individual workers.
After a bit more research: it’s always stuck on the same assets (mainly polyfills), so this bug has nothing to do with overworked workers or anything similar; it’s either an issue within some asset processing, or it’s just taking extremely long.
I’m having the same issue as niicojs on Windows. His solution of updating Parcel helps me sometimes…
P.S. 1.4.1 builds the project fast and without issues; 1.5.0+ gets stuck all the time and nothing seems to help anymore. P.P.S. 1.6.1 works stably, no hangs yet.
Still happening in 1.7.0… It’s frustrating.
Still in 1.7.0.
Alas, I have to do all of this on every startup of Parcel, or it fails to finish building about 75% of the time.
I’m having the same issue at each update.
I usually do this to solve it:
and it ends up working… I never looked closer, as I’m doing other stuff when it fails 😃
I sadly can’t share the project I’m experiencing the issue in, but I stripped a copy down to recreate the build config and structure, and the issue is reproducible in that stripped-down project.
@DeMoorJasper You can find the repo here: https://github.com/mBeierl/parcel-watch-issue