langchainjs: Stream ConversationalRetrievalQAChain is not working
I’m using a Pinecone vector store with the OpenAIChat model in a ConversationalRetrievalQAChain. It works as intended when I’m not streaming the output, but if I enable streaming on OpenAIChat as in the code snippet below, it fails with the following error: `TypeError: stream.getReader is not a function`.
```ts
import { OpenAIChat } from "langchain/llms/openai";
import { CallbackManager } from "langchain/callbacks";

const model = new OpenAIChat({
  streaming: true,
  // Log each token as it arrives from the model
  callbackManager: CallbackManager.fromHandlers({
    async handleLLMNewToken(token: string) {
      console.log(token);
    },
  }),
});
```
This is in a Next.js API route.
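For context, the route looks roughly like this (a simplified sketch: the `vectorStore` setup is omitted, and the handler shape assumes the Pages Router style):

```ts
import type { NextApiRequest, NextApiResponse } from "next";
import { ConversationalRetrievalQAChain } from "langchain/chains";

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse
) {
  // `model` is the streaming OpenAIChat configured above; `vectorStore` is a
  // PineconeStore initialized elsewhere (omitted here for brevity).
  const chain = ConversationalRetrievalQAChain.fromLLM(
    model,
    vectorStore.asRetriever()
  );

  // The error is thrown here once streaming is enabled on the model.
  const response = await chain.call({
    question: req.body.question,
    chat_history: [],
  });

  res.status(200).json(response);
}
```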
About this issue
- State: closed
- Created a year ago
- Comments: 15 (1 by maintainers)
I think I found out why it didn’t work: it comes down to mixing the native web `fetch` with `node-fetch`. See the details here: https://stackoverflow.com/questions/57664058/response-body-getreader-is-not-a-function.
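In short, the body returned by `node-fetch` is a Node `Readable` stream (an async iterable) with no `getReader()`, while the native `fetch` in Node 18+ returns a web-standard `ReadableStream`. A quick way to see the difference (a sketch, assuming Node 18+ with `node-fetch` installed):

```ts
import nodeFetch from "node-fetch";

// Native fetch (Node 18+): body is a web ReadableStream with getReader().
const nativeRes = await fetch("https://example.com");
console.log(typeof nativeRes.body?.getReader); // "function"

// node-fetch: body is a Node Readable stream; getReader() does not exist,
// which is exactly what triggers "stream.getReader is not a function".
const polyfillRes = await nodeFetch("https://example.com");
console.log(typeof (polyfillRes.body as any)?.getReader); // "undefined"
```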
Then I changed the `getBytes` function in the `event-source-parse` dependency file like this, problem solved.

@dqbd do you want to have a look at this one? It would be pretty strange if Next.js/Vercel didn’t support the ReadableStream web standard. Or is it that people are running this on Node 16 without realising it, and Node 16 doesn’t support ReadableStream?
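The exact patch isn’t shown above, but a compatibility shim along these lines (a sketch of the approach, not the original code) makes `getBytes` work with both stream flavours by falling back to async iteration when `getReader` is missing:

```ts
// Feed chunks from either a web ReadableStream (native fetch) or a
// Node Readable stream (node-fetch) into the SSE parser's callback.
export async function getBytes(
  stream: any,
  onChunk: (chunk: Uint8Array) => void
) {
  if (typeof stream.getReader === "function") {
    // Web ReadableStream path (native fetch on Node 18+ / browsers).
    const reader = stream.getReader();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      onChunk(value);
    }
  } else {
    // Node Readable path (node-fetch): Node streams are async iterable.
    for await (const chunk of stream) {
      onChunk(
        typeof chunk === "string" ? new TextEncoder().encode(chunk) : chunk
      );
    }
  }
}
```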
This solution is fine! Could the next version adapt to both the native browser-style fetch (Node v18+) and node-fetch (Node v16)?
This solves the issue. I had the same problem while trying to stream the response from the /experimental/autogpt function with Node v17.9.1 in my local environment. The `getReader()` function is not supported by the `node-fetch` package’s response body.
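As a workaround on Node versions where the response body is a Node stream, `Readable.toWeb` (added in Node 17, experimental there) can convert it into a web `ReadableStream` that does have `getReader()`. A sketch, assuming a `node-fetch` response:

```ts
import { Readable } from "node:stream";
import nodeFetch from "node-fetch";

const res = await nodeFetch("https://example.com");

// Convert the Node Readable body into a web-standard ReadableStream.
const webStream = Readable.toWeb(res.body as Readable);
const reader = webStream.getReader(); // now available
```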