langchainjs: Illegal invocation with `langchain/llms/bedrock/web` on Cloudflare Worker

import { Bedrock } from "langchain/llms/bedrock/web";

declare type Env = {
  ENVIRONMENT: "dev" | "production";
  BEDROCK_AWS_ACCESS_KEY_ID: string;
  BEDROCK_AWS_SECRET_ACCESS_KEY: string;
};

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const model = new Bedrock({
      model: "anthropic.claude-v2",
      region: "us-east-1",
      credentials: {
        accessKeyId: env.BEDROCK_AWS_ACCESS_KEY_ID,
        secretAccessKey: env.BEDROCK_AWS_SECRET_ACCESS_KEY
      },
      streaming: false,
      verbose: true
    });
    console.log(model);
    const text = "Human: What would be a good company name for a company that makes colorful socks?";
  
    console.log(text);
    const llmResult = await model.invoke(text);
  
    console.log(llmResult);
    return new Response(llmResult);
  },
};
Bedrock {
  lc_serializable: true,
  lc_kwargs: Object,
  lc_runnable: true,
  verbose: true,
  callbacks: undefined
  ...
}
Human: What would be a good company name for a company that makes colorful socks?
[llm/start] [1:llm:Bedrock] Entering LLM run with input: {
  "prompts": [
    "Human: What would be a good company name for a company that makes colorful socks?"
  ]
}
[llm/error] [1:llm:Bedrock] [18ms] LLM run errored with error: "Illegal invocation"
[mf:err] TypeError: Illegal invocation
    at Object.apply (/private/var/folders/wd/0gfsfmwx4b91h52nfbbfrb5m0000gn/T/tmp-831-V0NWW140FaH9/checked-fetch.js:28:18)
    at null.<anonymous> (/Users/dave/Work/langchain-workers/langchain-worker/node_modules/langchain/dist/llms/bedrock/web.js:204:105)
    at onFailedAttempt (/Users/dave/Work/langchain-workers/langchain-worker/node_modules/langchain/dist/util/async_caller.js:89:50)
    at RetryOperation._fn (/Users/dave/Work/langchain-workers/langchain-worker/node_modules/p-retry/index.js:50:18)
    at RetryOperation.attempt (/Users/dave/Work/langchain-workers/langchain-worker/node_modules/retry/lib/retry_operation.js:116:8)
    at null.<anonymous> (/Users/dave/Work/langchain-workers/langchain-worker/node_modules/p-retry/index.js:48:12)
    at [object Object]
    at pRetry (/Users/dave/Work/langchain-workers/langchain-worker/node_modules/p-retry/index.js:39:36)
    at queue.add.throwOnTimeout (/Users/dave/Work/langchain-workers/langchain-worker/node_modules/langchain/dist/util/async_caller.js:89:37)
    at run (/Users/dave/Work/langchain-workers/langchain-worker/node_modules/p-queue/dist/index.js:157:104)

About this issue

  • Original URL
  • State: closed
  • Created 8 months ago
  • Reactions: 1
  • Comments: 15 (8 by maintainers)

Most upvoted comments

🤖

Based on the error message you’re receiving, the issue may be related to the fetch call in the _request method. The fetch implementation in Cloudflare Workers may not support the signal option for aborting requests, a feature that browsers provide but that isn’t necessarily available in the Workers runtime.

Here’s the problematic code:

const response = await fetch(url, {
  method: "POST",
  headers,
  body: JSON.stringify(data),
  signal: options.signal, // This line might be causing the issue
});

To work around this, you can remove the signal option from the fetch call:

const response = await fetch(url, {
  method: "POST",
  headers,
  body: JSON.stringify(data),
  // signal: options.signal, // Remove this line
});

Please note that this will disable the ability to abort the request. If you need to support request cancellation, you’ll need to implement it in a different way that’s compatible with Cloudflare Workers.

This issue was also discussed in a similar context in the langchainjs repository. You can refer to the issue Add support for Cloudflare Workers / Vercel Edge Functions and the pull request Enable support for Browsers, Cloudflare Workers, Next.js Browser/Serverless/Edge for more details.

In the langchain/llms/bedrock/web module, the _request method uses the fetch function to send HTTP requests to the Bedrock service; _signedFetch builds on it to send signed requests. fetch is called with the URL of the Bedrock service and an options object whose method, headers, and body are all taken from the signed request.
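The flow described above can be sketched as follows. The SignedRequest shape and the signedFetch name are hypothetical, chosen for illustration; the real module's internals differ:

```typescript
// Hypothetical sketch of the signed-fetch flow described above
// (assumption: names and types are ours, not langchain's).
interface SignedRequest {
  method: string;
  headers: Record<string, string>;
  body: string;
}

async function signedFetch(
  url: string,
  signed: SignedRequest,
  fetchFn: typeof fetch = fetch   // injectable, like Bedrock's fetchFn option
): Promise<Response> {
  // method, headers and body are all taken from the signed request
  return fetchFn(url, {
    method: signed.method,
    headers: signed.headers,
    body: signed.body,
  });
}
```

Note that passing the default `fetch` around as a plain value like this is exactly where the receiver can get lost, which is what the error in this issue comes down to.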

I hope this helps! If you have any other questions or need further clarification, feel free to ask.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.


Thanks, I can confirm #3302 fixed it!

So this makes zero sense to me… but defining my own fetch function removes the error. (It’s super ugly though.)

import { Bedrock } from "langchain/llms/bedrock/web";

declare type Env = {
  ENVIRONMENT: "dev" | "production";
  BEDROCK_AWS_ACCESS_KEY_ID: string;
  BEDROCK_AWS_SECRET_ACCESS_KEY: string;
};

function myFetch(input: RequestInfo, init?: RequestInit<CfProperties<unknown>> | undefined): Promise<Response> {
  console.log("myFetch", input, init);
  return fetch(input, init);
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const model = new Bedrock({
      model: "anthropic.claude-v2",
      region: "us-east-1",
      credentials: {
        accessKeyId: env.BEDROCK_AWS_ACCESS_KEY_ID,
        secretAccessKey: env.BEDROCK_AWS_SECRET_ACCESS_KEY
      },
      streaming: false,
      verbose: true,
      fetchFn: myFetch
    });
    console.log(model);
    const text = "Human: What would be a good company name for a company that makes colorful socks?";
  
    console.log(text);
    const llmResult = await model.invoke(text);
  
    console.log(llmResult);
    return new Response(llmResult);
  },
};