langchainjs: All requests through langchain to OpenAI timeout

I'm trying to add langchainjs to my Node.js project, but whenever I call any function related to OpenAI, the request times out. I also downloaded the repo, configured it, and ran the examples, but the results were exactly the same: it just times out.

For example:

const model = new OpenAI({ temperature: 0.9, modelName: "gpt-3.5-turbo", timeout: 20 });
const response = await model.generate(["Tell me a joke."]);
console.log(response);

When using the OpenAI Node package directly, it works perfectly:

const configuration = new Configuration({
  organization: orgId,
  apiKey: process.env.OPENAI_API_KEY,
});
const openai = new OpenAIApi(configuration);
const response = await openai.createChatCompletion({
  model: 'gpt-3.5-turbo',
  temperature: 0,
  messages: [{ role: 'user', content: 'hi there' }],
});
console.log(response);

Hoping for some feedback. Thanks!

About this issue

  • State: closed
  • Created a year ago
  • Reactions: 3
  • Comments: 18

Most upvoted comments

Haha no worries

@nfcampos Get the f**k out of here 😡 😂 😂 . Obviously I assumed it to be seconds, and for some reason I initially added the timeout there because it seemed like things weren't working, so in my mind I was increasing the timeout rather than setting it to 20ms.

Managed to delve into the deepest depths of langchain for 1+ day just to find out my fatal mistake was the dumbest one …
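For anyone else landing here: the timeout option is in milliseconds, so a value of 20 aborts the request almost immediately, which looks exactly like every call timing out. A minimal sketch of the corrected constructor call, assuming you actually want roughly a 20-second timeout:

// timeout is in milliseconds, so 20000 ≈ 20 seconds rather than 20 ms
const model = new OpenAI({ temperature: 0.9, modelName: "gpt-3.5-turbo", timeout: 20000 });
const response = await model.generate(["Tell me a joke."]);
console.log(response);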

I am observing the same thing; even when running the openai example in the examples directory, I get a 404 response:

yarn run start src/llms/openai.ts

Request data: '{"model":"gpt-4","temperature":0.7,"top_p":1,"frequency_penalty":0,"presence_penalty":0,"n":1,"max_tokens":1000,"stream":false,"messages":[{"role":"user","content":"Question: What would be a good company name a company that makes colorful socks?\\nAnswer:"}]}', url: 'https://api.openai.com/v1/chat/completions' },

Response

response: { ok: false, status: 404, statusText: 'Not Found', headers: { 'alt-svc': 'h3=":443"; ma=86400, h3-29=":443"; ma=86400', 'cf-cache-status': 'DYNAMIC', 'cf-ray': '~removed~', connection: 'keep-alive', 'content-encoding': 'gzip', 'content-type': 'application/json; charset=utf-8', date: 'Tue, 18 Apr 2023 removed', server: 'cloudflare', 'set-cookie': '00000; path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None', 'strict-transport-security': 'max-age=15724800; includeSubDomains', 'transfer-encoding': 'chunked', vary: 'Origin', 'x-request-id': '~removed~' },

In your case, the issue might be that you need to change the example to use modelName: "gpt-3.5-turbo", since by default it uses gpt-4, which you might not have access to. For me, though, that is not the issue.
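If that is what's happening, a minimal sketch of the change, assuming the example builds the model with the same constructor shown above (the prompt is the one from the request log):

// Override the example's default gpt-4 with a model this API key can access
const model = new OpenAI({ temperature: 0.7, modelName: "gpt-3.5-turbo" });
const response = await model.generate([
  "What would be a good company name a company that makes colorful socks?",
]);
console.log(response);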