llm-workflow-engine: New chat title generation POST request hanging
I encountered a problem when calling the chatgpt.ask() function. My question is successfully sent to the OpenAI website, and the dialogue content (i.e., ChatGPT's response to my question) is generated correctly on the OpenAI platform. However, my local program is unable to retrieve the answer; while awaiting the response, it fails with playwright._impl._api_types.Error: Request context disposed. Can you help me understand and resolve this issue?
My test code:
from chatgpt_wrapper import ChatGPT
from chatgpt_wrapper.config import Config
config = Config()
config.set('browser.debug', True)
config.set('chat.model', 'legacy-paid')
bot = ChatGPT(config)
success, response, message = bot.ask("Can you tell me a joke?")
if success:
    print(response)
else:
    raise RuntimeError(message)
Traceback (most recent call last):
File "c:\Users\Administrator\Desktop\chatgpt-test\test.py", line 9, in <module>
success, response, message = bot.ask("Can you tell me a joke?")
File "C:\ProgramData\Anaconda3\lib\site-packages\chatgpt_wrapper\chatgpt.py", line 518, in ask
return self.async_run(self.agpt.ask(message, title=title))
File "C:\ProgramData\Anaconda3\lib\site-packages\chatgpt_wrapper\chatgpt.py", line 494, in async_run
return asyncio.get_event_loop().run_until_complete(awaitable)
File "C:\ProgramData\Anaconda3\lib\asyncio\base_events.py", line 647, in run_until_complete
return future.result()
File "C:\ProgramData\Anaconda3\lib\site-packages\chatgpt_wrapper\chatgpt.py", line 468, in ask
response = list([i async for i in self.ask_stream(message, title=title)])
File "C:\ProgramData\Anaconda3\lib\site-packages\chatgpt_wrapper\chatgpt.py", line 468, in <listcomp>
response = list([i async for i in self.ask_stream(message, title=title)])
File "C:\ProgramData\Anaconda3\lib\site-packages\chatgpt_wrapper\chatgpt.py", line 444, in ask_stream
await self._gen_title()
File "C:\ProgramData\Anaconda3\lib\site-packages\chatgpt_wrapper\chatgpt.py", line 204, in _gen_title
ok, json, response = await self._api_post_request(url, data)
File "C:\ProgramData\Anaconda3\lib\site-packages\chatgpt_wrapper\chatgpt.py", line 188, in _api_post_request
response = await self.page.request.post(url, headers=headers, data=data)
File "C:\ProgramData\Anaconda3\lib\site-packages\playwright\async_api\_generated.py", line 18313, in post
await self._impl_obj.post(
File "C:\ProgramData\Anaconda3\lib\site-packages\playwright\_impl\_fetch.py", line 248, in post
return await self.fetch(
File "C:\ProgramData\Anaconda3\lib\site-packages\playwright\_impl\_fetch.py", line 285, in fetch
return await self._inner_fetch(
File "C:\ProgramData\Anaconda3\lib\site-packages\playwright\_impl\_fetch.py", line 372, in _inner_fetch
response = await self._channel.send(
File "C:\ProgramData\Anaconda3\lib\site-packages\playwright\_impl\_connection.py", line 44, in send
return await self._connection.wrap_api_call(
File "C:\ProgramData\Anaconda3\lib\site-packages\playwright\_impl\_connection.py", line 419, in wrap_api_call
return await cb()
File "C:\ProgramData\Anaconda3\lib\site-packages\playwright\_impl\_connection.py", line 79, in inner_send
result = next(iter(done)).result()
playwright._impl._api_types.Error: Request context disposed.
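For context, Playwright raises "Request context disposed" when the APIRequestContext behind page.request has already been disposed (typically because its browser context or page was closed) before the POST completes. The minimal sketch below reproduces the same class of error outside the wrapper; the URL is a placeholder and the exact error text can vary between Playwright versions:

import asyncio
from playwright.async_api import async_playwright, Error as PlaywrightError

async def main():
    async with async_playwright() as p:
        browser = await p.chromium.launch()
        context = await browser.new_context()
        page = await context.new_page()
        request_ctx = page.request   # APIRequestContext tied to the browser context
        await context.close()        # closing the context also disposes request_ctx
        try:
            # Placeholder URL; any POST through a disposed context fails the same way
            await request_ctx.post("https://example.com/api", data={"ping": 1})
        except PlaywrightError as e:
            print(f"Playwright error: {e}")   # e.g. "Request context disposed."
        await browser.close()

asyncio.run(main())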
About this issue
- State: closed
- Created a year ago
- Comments: 18
@He2hiwei @cloudyskyy can you please try the latest release and see if that fixes the issue for you?
I’ve added a sensible timeout for the auto title generation, and put in better error handling – I think it will work fine in your cases.
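For reference, a guard along those lines might look roughly like the sketch below. It is only an illustration of the idea, not the actual chatgpt_wrapper code: the helper name, timeout value, and logging are hypothetical. The point is to bound the title-generation POST with a timeout and catch errors so a failed title request cannot break the main ask() flow.

import asyncio
from playwright.async_api import Error as PlaywrightError

async def _post_title_request(page, url, headers, data, timeout_ms=10_000):
    # Hypothetical helper: auto title generation is non-essential, so any
    # failure here is logged and swallowed instead of propagating to ask().
    try:
        response = await page.request.post(
            url, headers=headers, data=data, timeout=timeout_ms
        )
        if response.ok:
            return await response.json()
        print(f"title request failed with status {response.status}")
    except (PlaywrightError, asyncio.TimeoutError) as e:
        print(f"title request error: {e}")
    return None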
Yes, it’s kind of ‘solved’… I noticed a strange phenomenon: when I rolled back to chatgpt-wrapper 0.3.16, the error suddenly stopped and the Python API returned ChatGPT’s responses normally. Once I upgraded to 0.3.17 or higher, the problem reappeared.