workers-sdk: 🐛 BUG: Unable to run wrangler dev since version 3.19.0
Which Cloudflare product(s) does this pertain to?
Wrangler core
What version(s) of the tool(s) are you using?
3.26.0
What version of Node are you using?
20.11.0
What operating system and version are you using?
macOS Sonoma 14.3
Describe the Bug
Observed behavior
Since version 3.19.0 of wrangler was released, we've no longer been able to use wrangler dev --remote. The dev server starts, but as soon as a page is viewed the following error is thrown:
✘ [ERROR] Error in ProxyController: Error inside ProxyWorker
{
name: 'Error',
message: 'internal error',
stack: 'Error: internal error'
}
Expected behavior
It should work as it did in version 3.18.0 and earlier.
Steps to reproduce
Because the code is internal, I cannot provide an exact example, but I'm happy to provide any debug logs that might be useful. The error happens 100% of the time: just run wrangler dev --remote and then open http://localhost:8787 in a browser (or press b).
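For clarity, the full reproduction from the worker directory is just:

wrangler dev --remote
# once the dev server is running, open http://localhost:8787 in a browser (or press b)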
In case it's relevant, we have these compatibility settings in wrangler.toml, although I can still reproduce the error after removing both of them.
compatibility_date = "2022-09-01"
compatibility_flags = ["url_standard"]
Potentially worth noting: I added a console.log to the exported fetch function to check whether our worker code ever runs, but it doesn't appear to (or at least the log message is never printed).
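For context, the debugging change was roughly the following shape (the names and response are illustrative placeholders, since the real handler code is internal):

export default {
  async fetch(request, env, ctx) {
    // Added for debugging: this line is never printed when the error occurs
    console.log("fetch invoked:", request.url);
    // ...rest of the (internal) handler logic
    return new Response("ok");
  },
};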
Please provide a link to a minimal reproduction
No response
Please provide any relevant error logs
I've redacted some parts that I didn't want public and replaced them with placeholder strings (e.g. <ACCOUNT>).
--- 2024-02-05T02:02:55.251Z debug
🪵 Writing logs to "/Users/lnewson/Library/Preferences/.wrangler/logs/wrangler-2024-02-05_02-02-55_173.log"
---
--- 2024-02-05T02:02:55.251Z debug
Failed to load .env file ".env": Error: ENOENT: no such file or directory, open '.env'
at Object.openSync (node:fs:581:18)
at Object.readFileSync (node:fs:457:35)
at tryLoadDotEnv (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:124257:72)
at loadDotEnv (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:124266:12)
at /Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:162342:20
at /Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:131131:16
at maybeAsyncResult (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:129352:44)
at /Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:131130:14
at /Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:129339:22
at Array.reduce (<anonymous>) {
errno: -2,
code: 'ENOENT',
syscall: 'open',
path: '.env'
}
---
--- 2024-02-05T02:02:55.257Z log
⛅️ wrangler 3.26.0
-------------------
---
--- 2024-02-05T02:02:55.260Z log
Running custom build: yarn test && yarn build
---
--- 2024-02-05T02:02:57.657Z debug
Retrieving cached values for userId from node_modules/.cache/wrangler
---
--- 2024-02-05T02:02:57.657Z debug
Metrics dispatcher: Dispatching disabled - would have sent {"type":"event","name":"run dev","properties":{"local":false,"usesTypeScript":false}}.
---
--- 2024-02-05T02:02:57.657Z debug
Failed to load .env file "/Users/lnewson/site/cloudflare/worker/.dev.vars": Error: ENOENT: no such file or directory, open '/Users/lnewson/site/cloudflare/worker/.dev.vars'
at Object.openSync (node:fs:581:18)
at Object.readFileSync (node:fs:457:35)
at tryLoadDotEnv (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:124257:72)
at loadDotEnv (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:124266:12)
at getVarsForDev (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:166550:18)
at getBindings (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:167559:10)
at getBindingsAndAssetPaths (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:167515:20)
at getDevReactElement (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:167181:40)
at startDev (/Users/lnewson/site/cloudflare/worker/node_modules/wrangler/wrangler-dist/cli.js:167244:60)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
errno: -2,
code: 'ENOENT',
syscall: 'open',
path: '/Users/lnewson/site/cloudflare/worker/.dev.vars'
}
---
--- 2024-02-05T02:02:57.686Z debug
-- START CF API REQUEST: GET https://api.cloudflare.com/client/v4/accounts/<ACCOUNT>/workers/subdomain/edge-preview
---
--- 2024-02-05T02:02:57.686Z debug
HEADERS: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:57.686Z debug
INIT: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:57.686Z debug
-- END CF API REQUEST
---
--- 2024-02-05T02:02:57.824Z debug
[InspectorProxyWorker] handleProxyControllerIncomingMessage {"type":"reloadStart"}
---
--- 2024-02-05T02:02:58.264Z debug
-- START CF API RESPONSE: OK 200
---
--- 2024-02-05T02:02:58.264Z debug
HEADERS: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:58.264Z debug
RESPONSE: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:58.264Z debug
-- END CF API RESPONSE
---
--- 2024-02-05T02:02:58.265Z debug
-- START EXCHANGE API REQUEST: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:58.265Z debug
-- END EXCHANGE API REQUEST
---
--- 2024-02-05T02:02:58.396Z debug
-- START EXCHANGE API RESPONSE: OK 200
---
--- 2024-02-05T02:02:58.396Z debug
HEADERS: {}
---
--- 2024-02-05T02:02:58.396Z debug
RESPONSE: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:58.396Z debug
-- END EXCHANGE API RESPONSE
---
--- 2024-02-05T02:02:58.422Z log
Total Upload: 89.31 KiB / gzip: 18.90 KiB
---
--- 2024-02-05T02:02:58.423Z debug
-- START CF API REQUEST: POST https://api.cloudflare.com/client/v4/accounts/<ACCOUNT>/workers/scripts/dev_docs-prod/edge-preview
---
--- 2024-02-05T02:02:58.423Z debug
HEADERS: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:58.423Z debug
INIT: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:58.423Z debug
-- END CF API REQUEST
---
--- 2024-02-05T02:02:59.729Z debug
-- START CF API RESPONSE: OK 200
---
--- 2024-02-05T02:02:59.730Z debug
HEADERS: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:59.730Z debug
RESPONSE: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-02-05T02:02:59.730Z debug
-- END CF API RESPONSE
---
--- 2024-02-05T02:02:59.730Z debug
Checking if domain has Access enabled: <DOMAIN>
---
--- 2024-02-05T02:02:59.730Z debug
Access switch not cached for: <DOMAIN>
---
--- 2024-02-05T02:02:59.760Z debug
Caching access switch for: <DOMAIN>
---
--- 2024-02-05T02:02:59.766Z debug
Checking if domain has Access enabled: <DOMAIN>
---
--- 2024-02-05T02:02:59.766Z debug
Access switch not cached for: <DOMAIN>
---
--- 2024-02-05T02:02:59.905Z debug
Caching access switch for: <DOMAIN>
---
--- 2024-02-05T02:02:59.915Z debug
[InspectorProxyWorker] handleProxyControllerIncomingMessage {"type":"reloadComplete","proxyData":{"userWorkerUrl":{"protocol":"https:","hostname":"<DOMAIN>","port":"443"},"userWorkerInspectorUrl":{"protocol":"wss:","hostname":"<DOMAIN>","port":"","pathname":"/cdn-cgi/workers/preview/inspector"},"userWorkerInnerUrlOverrides":{},"headers":{"cf-workers-preview-token":"AU9fF1RhfYQtv5yO0s8Ywz5HjouYhxd7PuhyNQwgEukUkOchT9TiRIL5LyFnn2UkdcwBp9hQ8adrZ9tVFFOqFQk6UhaO8xU25YiYbse7yxoizSystaE8emx1pNajdTCfQ7NCzu3cl8GGofkr2eQ1G9H89vSNfX7tNBg_iJiBa0qjLqH0JZCV8rooU1K4cjso3mLyQ1SwxK5UDbQvhYpnteM-KuXCdqSR6NlwCk8KR19v9xf0Vfcg1_Js_Xa0x92-ZdVrzmn1TjcVhZjw5HX6d9C3sOpoA2QGnDbkCYIF1OnEsamnKwpUSD4QcjQSj2dj3QlAx1n_qhUpAkT2DS3pMHYceXvAzsNJZiQ7at8w1d-MSK8sQjerNeNoMSbDJ2jt7LE_mO9RAa9gO9X8D6oN96_01yqFb7W-6354eH7DnO-1aDBRL-ITgGOh697OK15GZeXytqZmHqzi01mf9fsGtFCV7ujE1yuUlUt_nXihCHK8aJ2_3t5cohAzazc_MC7eXlQONxbZyNZo2f0qz17-RfMYSN3m2pjr4Vwfhjv1iJt6nDyKEJ_JIw1rp2zizc6-LH-HamKeaRx7Pk8RvYe-r5Di1d62X-qDSxSTYJsoeN6mvCW7tBasqyMxklgzifvgouyzg2dt0LXfarVADFktCZ1cbaQv5Dhudwa0J1RVU7WZA8gEI8HNUkTr8tD37TEFkJGulRahM8Y4qMj0epsryPo43nW-6VmroawjMJCr7m6kU85vlsN4RSfed4RkyjM_Ki32SxLFs-acVa5WDPJvTE2zcCcdC9v0rZilm5srQNSQjR7WPjUXzVTr0qXSSZeuCRD5M5Uq_SHUSPidcu3YrtYUnagKg9KjC2IxkLqFAE25rZ0J1sASTTi3idmYqqryqHLwJckQgoQP3Mr2d5WgBQPEQ4jYj3uLrw-FITJDuYFQvah0wlDeXHVv2NPYz8PPisyVWxFHZx34HIyqxx-FkBW5MWLD2_oNQbLB01q-pxxaf36zx179vTquoNKI0LIEbsErfbi5K0L6DGukAJuqazg"},"liveReload":false,"proxyLogsToController":true}}
---
--- 2024-02-05T02:02:59.916Z debug
[InspectorProxyWorker] reconnectRuntimeWebSocket
---
--- 2024-02-05T02:02:59.917Z debug
[InspectorProxyWorker] NEW RUNTIME WEBSOCKET https://<DOMAIN>/cdn-cgi/workers/preview/inspector
---
--- 2024-02-05T02:02:59.918Z debug
[InspectorProxyWorker] SEND TO DEVTOOLS {"method":"Runtime.executionContextsCleared"}
---
--- 2024-02-05T02:03:00.523Z debug
workerd/util/symbolizer.c++:96: warning: Not symbolizing stack traces because $LLVM_SYMBOLIZER is not set. To symbolize stack traces, set $LLVM_SYMBOLIZER to the location of the llvm-symbolizer binary. When running tests under bazel, use `--test_env=LLVM_SYMBOLIZER=<path>`.
workerd/jsg/util.c++:281: error: e = kj/compat/tls.c++:221: failed: TLS peer's certificate is not trusted; reason = Hostname mismatch
stack: 1038a2c83 1038a32b7 104bd033f 1038aabe8 104bfabb8 104bfb000 1039c3a9b 1039c438b 1039c5a87 1039a319c 1039a9d44 1039bff20 102b978cc 103148bdc; sentryErrorContext = jsgInternalError
---
--- 2024-02-05T02:03:00.562Z error
✘ [ERROR] Error in ProxyController: Error inside ProxyWorker
{
name: 'Error',
message: 'internal error',
stack: 'Error: internal error'
}
---
--- 2024-02-05T02:03:00.563Z debug
=> Error contextual data: {
config: {
name: 'dev_docs-prod',
script: { contents: '' },
dev: {
server: [Object],
inspector: [Object],
urlOverrides: [Object],
liveReload: false
}
},
bundle: {
id: 0,
entry: {
file: '/Users/lnewson/site/cloudflare/worker/build/index.js',
directory: '/Users/lnewson/site/cloudflare/worker',
format: 'modules',
moduleRoot: '/Users/lnewson/site/cloudflare/worker/build',
name: 'dev_docs-prod'
},
path: '/Users/lnewson/site/cloudflare/worker/.wrangler/tmp/dev-dJD68W/index.js',
type: 'esm',
modules: [],
dependencies: {
'.wrangler/tmp/bundle-Y06h6i/checked-fetch.js': [Object],
'build/index.js': [Object]
},
sourceMapPath: '.wrangler/tmp/dev-dJD68W/index.js.map',
sourceMapMetadata: {
tmpDir: '/Users/lnewson/site/cloudflare/worker/.wrangler/tmp/bundle-Y06h6i',
entryDirectory: '/Users/lnewson/site/cloudflare/worker'
}
}
}
---
--- 2024-02-05T02:03:00.673Z debug
[InspectorProxyWorker] RUNTIME WEBSOCKET OPENED
---
--- 2024-02-05T02:03:00.674Z debug
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Runtime.enable","id":100000001}
---
--- 2024-02-05T02:03:00.674Z debug
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Debugger.enable","id":100000002}
---
--- 2024-02-05T02:03:00.674Z debug
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Network.enable","id":100000003}
---
--- 2024-02-05T02:03:00.704Z debug
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE {
method: 'Runtime.executionContextCreated',
params: {
context: {
id: 1,
origin: '',
name: 'Worker',
uniqueId: '-6374973205721458888.-7824425677934341909'
}
}
}
---
--- 2024-02-05T02:03:00.704Z debug
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE { id: 100000001, result: {} }
---
--- 2024-02-05T02:03:00.705Z debug
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE {
method: 'Debugger.scriptParsed',
params: {
scriptId: '3',
url: 'file:///Users/lnewson/site/cloudflare/worker/.wrangler/tmp/dev-dJD68W/index.js',
startLine: 0,
startColumn: 0,
endLine: 2667,
endColumn: 101,
executionContextId: 1,
hash: '8e99dbf3c9140b0e379af69d701de87f7741316c74b40a77c63d7831277d7288',
isLiveEdit: false,
sourceMapURL: 'index.js.map',
hasSourceURL: true,
isModule: true,
length: 91451,
scriptLanguage: 'JavaScript',
embedderName: 'index.js'
}
}
---
--- 2024-02-05T02:03:00.705Z debug
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE {
id: 100000002,
result: { debuggerId: '-6861462484141182344.8585576275922833579' }
}
---
--- 2024-02-05T02:03:00.706Z debug
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE { id: 100000003, method: 'Network.enable', result: {} }
---
--- 2024-02-05T02:03:03.066Z debug
Not Implemented Error: ConfigController#teardown
---
--- 2024-02-05T02:03:03.066Z debug
Not Implemented Error: BundlerController#teardown
---
--- 2024-02-05T02:03:03.066Z debug
Not Implemented Error: LocalRuntimeController#teardown
---
--- 2024-02-05T02:03:03.066Z debug
Not Implemented Error: RemoteRuntimeController#teardown
---
--- 2024-02-05T02:03:03.066Z debug
ProxyController teardown
---
Here's an extract of some additional logs, captured with --log-level debug, from when the error occurs:
[wrangler-ProxyWorker:inf] GET /cdn-cgi/ProxyWorker/pause 204 No Content (0ms)
-- START CF API RESPONSE: OK 200
HEADERS: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
RESPONSE: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
-- END CF API RESPONSE
-- START EXCHANGE API REQUEST: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
-- END EXCHANGE API REQUEST
-- START EXCHANGE API RESPONSE: OK 200
HEADERS: {}
RESPONSE: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
-- END EXCHANGE API RESPONSE
Total Upload: 89.31 KiB / gzip: 18.90 KiB
-- START CF API REQUEST: POST https://api.cloudflare.com/client/v4/accounts/<ACCOUNT>/workers/scripts/dev_docs-prod/edge-preview
HEADERS: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
INIT: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
-- END CF API REQUEST
-- START CF API RESPONSE: OK 200
HEADERS: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
RESPONSE: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
-- END CF API RESPONSE
Checking if domain has Access enabled: <DOMAIN>
Access switch not cached for: <DOMAIN>
Caching access switch for: <DOMAIN>
Checking if domain has Access enabled: <DOMAIN>
Access switch not cached for: <DOMAIN>
Caching access switch for: <DOMAIN>
[InspectorProxyWorker] handleProxyControllerIncomingMessage {"type":"reloadComplete","proxyData":{"userWorkerUrl":{"protocol":"https:","hostname":"<DOMAIN>","port":"443"},"userWorkerInspectorUrl":{"protocol":"wss:","hostname":"<DOMAIN>","port":"","pathname":"/cdn-cgi/workers/preview/inspector"},"userWorkerInnerUrlOverrides":{},"headers":{"cf-workers-preview-token":"<TOKEN>"},"liveReload":false,"proxyLogsToController":true}}
[InspectorProxyWorker] reconnectRuntimeWebSocket
[InspectorProxyWorker] NEW RUNTIME WEBSOCKET https://<DOMAIN>/cdn-cgi/workers/preview/inspector
[InspectorProxyWorker] SEND TO DEVTOOLS {"method":"Runtime.executionContextsCleared"}
[wrangler-ProxyWorker:inf] GET /cdn-cgi/ProxyWorker/play 204 No Content (1ms)
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Runtime.enable","id":100000001}
[InspectorProxyWorker] RUNTIME WEBSOCKET OPENED
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Debugger.enable","id":100000002}
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Network.enable","id":100000003}
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE {
method: 'Runtime.executionContextCreated',
params: {
context: {
id: 1,
origin: '',
name: 'Worker',
uniqueId: '<UNIQUE_ID>'
}
}
}
[InspectorProxyWorker] reconnectRuntimeWebSocket
[InspectorProxyWorker] NEW RUNTIME WEBSOCKET https://<DOMAIN>/cdn-cgi/workers/preview/inspector
[InspectorProxyWorker] SEND TO DEVTOOLS {"method":"Runtime.executionContextsCleared"}
[wrangler-ProxyWorker:inf] GET /cdn-cgi/ProxyWorker/play 204 No Content (1ms)
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Runtime.enable","id":100000001}
[InspectorProxyWorker] RUNTIME WEBSOCKET OPENED
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Debugger.enable","id":100000002}
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Network.enable","id":100000003}
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE {
method: 'Runtime.executionContextCreated',
params: {
context: {
id: 1,
origin: '',
name: 'Worker',
uniqueId: '-9162096442528856549.1803397986076218078'
}
}
}
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE { id: 100000001, result: {} }
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE {
method: 'Debugger.scriptParsed',
params: {
scriptId: '3',
url: 'file:///Users/lnewson/site/cloudflare/worker/.wrangler/tmp/dev-WWcJsf/index.js',
startLine: 0,
startColumn: 0,
endLine: 2667,
endColumn: 101,
executionContextId: 1,
hash: '17ff3c2389e969cf0a7aa2fb10ad5bab2e5f24f34f6a8b1ea17be9ecaf52b816',
isLiveEdit: false,
sourceMapURL: 'index.js.map',
hasSourceURL: true,
isModule: true,
length: 91451,
scriptLanguage: 'JavaScript',
embedderName: 'index.js'
}
}
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE {
id: 100000002,
result: { debuggerId: '7465433172647803524.3059957525368977207' }
}
[InspectorProxyWorker] RUNTIME INCOMING MESSAGE { id: 100000003, method: 'Network.enable', result: {} }
[InspectorProxyWorker] SEND TO RUNTIME {"method":"Runtime.getIsolateId","id":100000004}
workerd/util/symbolizer.c++:96: warning: Not symbolizing stack traces because $LLVM_SYMBOLIZER is not set. To symbolize stack traces, set $LLVM_SYMBOLIZER to the location of the llvm-symbolizer binary. When running tests under bazel, use `--test_env=LLVM_SYMBOLIZER=<path>`.
workerd/jsg/util.c++:281: error: e = kj/compat/tls.c++:221: failed: TLS peer's certificate is not trusted; reason = Hostname mismatch
stack: 100f96c83 100f972b7 1022c433f 100f9ebe8 1022eebb8 1022ef000 1010b7a9b 1010b838b 1010b9a87 10109719c 10109dd44 1010b3f20 10028b8cc 10083cbdc; sentryErrorContext = jsgInternalError
workerd/io/worker.c++:1764: info: uncaught exception; source = Uncaught (in promise); exception = Error: internal error
workerd/io/io-context.c++:359: info: uncaught exception; exception = workerd/jsg/_virtual_includes/jsg/workerd/jsg/value.h:1334: failed: jsg.Error: internal error
stack: 1008bd45c 1008be23b 1022c433f 10078df1c 10078e4dc
workerd/io/worker.c++:1764: info: uncaught exception; source = Uncaught (in promise); exception = Error: internal error
workerd/io/io-context.c++:359: info: uncaught exception; exception = workerd/jsg/_virtual_includes/jsg/workerd/jsg/value.h:1334: failed: jsg.Error: internal error
stack: 1008bd45c 1008be23b 1022c433f 10078df1c 10078e4dc
✘ [ERROR] Error in ProxyController: Error inside ProxyWorker
{
name: 'Error',
message: 'internal error',
stack: 'Error: internal error'
}
About this issue
- State: closed
- Created 5 months ago
- Comments: 34 (22 by maintainers)
Thank you for the detailed reply.
For future reference, there is no data loss in this method, right? Since the bindings remain the same, it's just that the new worker is on a new route - ALL else remains the same. I just have to switch my custom route to the new worker, and I'm done.
If you rename the worker by simply renaming it in wrangler.toml, it will create a new Worker, leaving the old one behind. Instead you can rename the Worker in the CF dashboard. I had a worker called test_worker, which I managed to create using Wrangler (the CF dashboard does not allow this). I logged in to the dashboard, clicked on the Worker -> Manage -> Rename Worker. This allowed me to rename the Worker (the CF dashboard would not allow me to put an underscore in the name). Note that the workers.dev subdomain will change accordingly, in case you rely upon this. Note the warning that is given:
Now I would need to go and update the wrangler.toml with the new name so that next time I deploy from Wrangler it does not recreate the old Worker with the underscore in its name.
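For example (illustrative names only - the new name is whatever was chosen in the dashboard), the follow-up change is just the name field in wrangler.toml:

# wrangler.toml, before: the Worker originally created via Wrangler
name = "test_worker"

# wrangler.toml, after renaming in the dashboard (underscores are not accepted there)
name = "test-worker"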