Codex-CLI: Error "Codex CLI error: Invalid request - The model: `code-davinci-002` does not exist"

Edit 3: It looks like OpenAI shut down the Codex API (https://news.ycombinator.com/item?id=35242069). Apparently an e-mail went out about it, but I never received one.

Apparently the API is still available through Azure (https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/work-with-code), so using Azure might be an alternative.

Original:


I’m getting this error on every attempt to use the Codex CLI:

Codex CLI error: Invalid request - The model: `code-davinci-002` does not exist

I haven’t dug into the code at all, but I noticed that all of the OpenAI beta URLs have been removed. Most of the URLs in the install instructions redirect, but the engines listing URL returns a 404 (https://beta.openai.com/docs/engines/codex-series-private-beta). The new URL is https://platform.openai.com/docs/models/codex, and it still shows the Codex series as being in private beta.

Edit:

When I query the list of OpenAI engines available to my account, code-davinci-002 is not listed, so I reran the install script and selected a different engine that is available to me (gpt-3.5-turbo). After restarting PowerShell I’m still seeing the same error. I confirmed that the openaiapirc file was correctly updated with the new engine and that every instance of PowerShell had been shut down, but the error persists.
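
For reference, the openaiapirc file the setup script writes is a small INI-style file; mine looks roughly like this (values redacted, and the field names are from memory, so they may differ between versions):

    [openai]
    organization_id=org-xxxxxxxxxxxx
    secret_key=sk-xxxxxxxxxxxx
    engine=gpt-3.5-turbo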

Edit 2: I also updated the current_context.config file with the new engine. Now the error I receive is:

Codex CLI error: Invalid request - This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?
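
From what I can tell, this second error means the CLI is still sending requests to the completions endpoint, which chat models like gpt-3.5-turbo don’t accept; they have to go through the chat completions endpoint instead. A minimal sketch of the difference, assuming the pre-1.0 openai Python package the CLI uses (key and prompt are placeholders):

    import openai

    openai.api_key = "sk-..."  # placeholder

    # What the CLI currently sends: a completion request against v1/completions.
    # Chat-only models such as gpt-3.5-turbo reject this with the error above.
    openai.Completion.create(
        engine="gpt-3.5-turbo",
        prompt="# list files in the current directory\n",
        max_tokens=64,
    )

    # What chat models expect instead: a request against v1/chat/completions.
    openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "list files in the current directory"}],
        max_tokens=64,
    )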

Most upvoted comments

Support for Chat models (GPT 3.5/4) now works on my fork!

Feel free to use it here:

https://github.com/AntonOsika/CLI-Co-Pilot

The required changes in the code were small but non-obvious.

It seems code-davinci-002 and code-cushman-001 have been removed, but yes, at the same time the Codex CLI does not seem to support gpt-3.5-turbo either.

I’m still getting “cannot find openAI model” errors, even with gpt-3.5-turbo as my model. EDIT: That was while using the fork mentioned above: https://github.com/Lukas-LLS/Codex-CLI

Using the main branch, I get “Codex CLI error: Invalid request - This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?” when I use gpt-3.5-turbo as my model, and gpt-4 straight up doesn’t work.

I use this tool every day, so a bugfix would be great!

Do you know how to fix it?

Try updating the openai package with pip.
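
For example (assuming Python and pip are on your PATH):

    python -m pip install --upgrade openai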

I have already submitted PR #131.

@Lukas-LLS still having problems. Both gpt-4-32k and gpt-3.5-turbo return Cannot find OpenAI model errors. I have ChatGPT Plus.

[Screenshot: 2023-03-31 at 6:54:09 AM]

The only difference I can think of between what I did and the installation instructions is that I copied my OpenAI secret key from where I originally stored it, since the screenshot showing how to copy it directly from the OpenAI website seems outdated (I can no longer do that).

I can think of two possible causes for your problem:

  1. You might have an old version of the fork; at the time my fork was posted in this issue I was still working on it, and at that point it was not operational. (If you update to a newer version, make sure to run the cleanup script and then the setup again, because there are some changes that will break an older setup.)
  2. It is also possible that you still have the setup from https://github.com/microsoft/Codex-CLI. If that is the case, run the cleanup script from the original project before running the setup from the fork (a rough sketch follows this list). I don’t know how you migrated to the fork, so this is just a possibility.
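
Roughly, that migration would look like the following in PowerShell; the script paths below are an assumption based on the original repo’s layout, so check each repo’s README for the exact names and parameters:

    # 1. From the original microsoft/Codex-CLI checkout, remove the old shell integration
    #    (script path assumed):
    .\scripts\powershell_cleanup.ps1

    # 2. From a fresh clone of the fork, rerun its setup with your API key and organization id
    #    (parameters per the fork's README):
    .\scripts\powershell_setup.ps1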

And for gpt-4, make sure the model is actually available to you. For that, you must either have signed up on the waitlist at https://openai.com/waitlist/gpt-4-api (the waitlist gives you access to the gpt-4 model) or have a ChatGPT Plus subscription via the Upgrade to Plus button in the lower left corner of https://chat.openai.com/chat (ChatGPT Plus gives you access to the gpt-4-32k model).

If your issue still persists after these steps, let me know and I will look further into it.