anthropic-sdk-typescript: "Unexpected end of JSON input" when streaming on edge environments (Vercel Edge, Cloudflare Workers)
The SDK seems to operate fine when running in a Node.js environment, but when running in an edge runtime (browser-like environment) such as Vercel Edge or Cloudflare Workers, streaming gets cut off with the following exception:
Could not parse message into JSON:
From chunk: [ 'event: content_block_delta' ]
SyntaxError: Unexpected end of JSON input
at (node_modules/@anthropic-ai/sdk/streaming.mjs:58:39)
at (app/api/test/route.js:15:19)
at (node_modules/next/dist/esm/server/future/route-modules/app-route/module.js:189:36)
at (node_modules/next/dist/esm/server/future/route-modules/app-route/module.js:128:25)
at (node_modules/next/dist/esm/server/future/route-modules/app-route/module.js:251:29)
at (node_modules/next/dist/esm/server/web/edge-route-module-wrapper.js:81:20)
at (node_modules/next/dist/esm/server/web/adapter.js:157:15)
The error is coming from this block: https://github.com/anthropics/anthropic-sdk-typescript/blob/main/src/streaming.ts#L69-L84
The line content is:
{
event: 'content_block_delta',
data: '',
raw: [ 'event: content_block_delta' ]
}
Since the data is an empty string, the JSON parsing blows up. I can bypass this error if I modify the code to ignore empty strings, but that does not seem ideal.
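For reference, the bypass looks roughly like this. It's a minimal sketch, not the SDK's actual code; ServerSentEvent and tryParseSSEData are illustrative names:

```ts
// Minimal sketch of the workaround (illustrative names, not the SDK's internals):
// skip SSE messages whose data payload is empty instead of letting
// JSON.parse("") throw "Unexpected end of JSON input".
interface ServerSentEvent {
  event: string | null;
  data: string;
  raw: string[];
}

function tryParseSSEData(sse: ServerSentEvent): unknown | null {
  if (sse.data.trim() === "") {
    // This is exactly the case logged above: event present, data empty.
    return null;
  }
  return JSON.parse(sse.data);
}
```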
Reproduction repos:
I put the Streaming example from the Anthropic SDK README into a Vercel Edge function and a Cloudflare Workers function with the same failing result.
Note: the error occurs whether or not we use import "@anthropic-ai/sdk/shims/web".
Vercel Edge:
I’ve put together a sample repo using create-next-app and the example from your README: https://github.com/venables/anthropic-edge-stream-error
The file in question is app/api/test/route.ts. If you remove export const runtime = "edge", it works as expected.
The error will not occur when running locally, since the local environment is a Node.js environment, but when you deploy to Vercel (with runtime = "edge" still in the code), you will consistently get the error.
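The route itself is essentially the README streaming example dropped into an edge route handler. A rough sketch of what app/api/test/route.ts does, assuming the Messages streaming API; the model name and prompt are placeholders, and the exact call shape may differ from the repo:

```ts
import Anthropic from "@anthropic-ai/sdk";

export const runtime = "edge"; // removing this line makes the error go away

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

export async function GET(): Promise<Response> {
  // Streaming request, roughly as in the SDK README; the parse error surfaces
  // while iterating the stream on the edge runtime.
  const stream = await anthropic.messages.create({
    model: "claude-3-opus-20240229", // placeholder model
    max_tokens: 256,
    messages: [{ role: "user", content: "Say hello" }],
    stream: true,
  });

  let text = "";
  for await (const event of stream) {
    if (event.type === "content_block_delta" && event.delta.type === "text_delta") {
      text += event.delta.text;
    }
  }
  return new Response(text);
}
```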
Cloudflare Workers
If you want to reproduce this locally, you can do so with Wrangler and Cloudflare Workers, which spin up a real edge-like environment on your machine.
I created a sample repository here, using Hono as the router: https://github.com/venables/anthropic-stream-error-cf
The file in question here is src/index.ts
Running that locally and hitting the endpoint will fail.
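The Workers repro is the same idea behind a Hono route. A rough sketch of src/index.ts under the same assumptions (placeholder model and binding names, not necessarily the repo's exact code):

```ts
import { Hono } from "hono";
import Anthropic from "@anthropic-ai/sdk";

type Bindings = { ANTHROPIC_API_KEY: string };

const app = new Hono<{ Bindings: Bindings }>();

app.get("/", async (c) => {
  const anthropic = new Anthropic({ apiKey: c.env.ANTHROPIC_API_KEY });

  // Same streaming call as the Vercel example; running this under
  // `wrangler dev` and hitting the endpoint reproduces the parse error.
  const stream = await anthropic.messages.create({
    model: "claude-3-opus-20240229", // placeholder model
    max_tokens: 256,
    messages: [{ role: "user", content: "Say hello" }],
    stream: true,
  });

  let text = "";
  for await (const event of stream) {
    if (event.type === "content_block_delta" && event.delta.type === "text_delta") {
      text += event.delta.text;
    }
  }
  return c.text(text);
});

export default app;
```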
This fix was released in v0.17.0.
OK, I fixed it. @nyacg was on the right track: the issue is in the LineDecoder class in streaming.ts.
Specifically, the decode() method was ported directly from the Python implementation, but it misses an important behavioral difference between JavaScript's split() method and Python's splitlines() method.
The error happens when decode() receives any input that ends in a newline, e.g.:
event: content_block_delta\r\n
OR
\r\n
In both of these cases, JavaScript's split() method adds an extra empty string to the end of the lines array, whereas Python's splitlines() method does not. Those empty lines are then passed through to the next layer of SSE decoders, which is what causes this issue. This is also why the previous patch doesn't work: it dropped tokens, because the extra empty line caused whole data packets to be ignored.
It’s got nothing to do with the edge environment itself; I suspect some network configuration in edge runtimes produces smaller SSE packets, which makes the issue more noticeable.
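To make the difference concrete, here is a small illustration (the split regex is illustrative, not necessarily the SDK's exact one; Python's behavior is shown in comments):

```ts
// JavaScript: split() on a string that ends in a newline produces a trailing
// empty string.
console.log("event: content_block_delta\r\n".split(/\r\n|\r|\n/));
// -> [ 'event: content_block_delta', '' ]

console.log("\r\n".split(/\r\n|\r|\n/));
// -> [ '', '' ]

// Python's splitlines() does not produce that trailing empty element:
//   "event: content_block_delta\r\n".splitlines()  ->  ['event: content_block_delta']
//   "\r\n".splitlines()                            ->  ['']
```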
For now, you can patch the package quite easily to account for the different split() behavior and align it with Python's splitlines() behavior. I already created a patch for my llm-api lib if anyone wants the patch files. Commit for patch
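Since the diff itself isn't inlined here, this is a minimal sketch of the idea behind such a patch, not the exact change that shipped: drop the trailing empty element that split() produces when the decoded text ends in a newline.

```ts
// Sketch of the fix: emulate Python's splitlines() by removing the trailing
// empty string that String.prototype.split() adds when the input ends with a
// newline. (Illustrative helper, not the SDK's exact implementation.)
function splitLines(text: string): string[] {
  const lines = text.split(/\r\n|\r|\n/);
  if (lines.length > 0 && lines[lines.length - 1] === "") {
    lines.pop();
  }
  return lines;
}

// splitLines("event: content_block_delta\r\n")  -> ["event: content_block_delta"]
// splitLines("data: {...}\r\n\r\n")             -> ["data: {...}", ""]
```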
Fixed in https://github.com/anthropics/anthropic-sdk-typescript/pull/312, which should be released shortly.
Ahh thank you so much for the detailed investigation and proposed patch @dzhng! We’ll test and port this over to our side ASAP.
Amazing turnaround time on this, thank you both @rattrayalex & @RobertCraigie
Doing some debugging myself, it looks like a potential bug in the LineDecoder.
Here's an extract of the logs where an error occurs. I'm printing the output of this.decodeText(chunk); in the LineDecoder. Note: the logs go from bottom to top.
We get the output "Gouda, its smoky", i.e. the " with" delta is dropped.
Generally the chunks that LineDecoder.decode gets fed look like event: content_block_delta d followed by ata: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":"<the delta>"}}, i.e. the chunk boundary falls in the middle of "data".
However, an error occurs when the first chunk is just event: content_block_delta (possibly with a trailing space). This leads to an SSE with an sse.data of an empty string, which would usually throw an error. If we just continue instead of throwing an error, we then get the next SSE with the delta after this one.