openai-python: The official example for Function Calling doesn't work with SDK version 1.1.1

Expected behavior

The “Example with one function called in parallel” code from the documentation should correctly show how to use the function call feature.

Actual behavior

I get BadRequestError: Error code: 400 - {'error': {'message': "'content' is a required property - 'messages.1'", 'type': 'invalid_request_error', 'param': None, 'code': None}} while running the code.
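For context, the 400 is triggered by the assistant message being serialized with an explicit JSON null for content (the comments below trace this to pydantic 1.10.12). A minimal stdlib sketch with illustrative values — the tool-call id and arguments here are hypothetical stand-ins, not taken from the actual request:

```python
import json

# A stand-in for the assistant tool-call message as it was serialized
# (illustrative values; the real message carries the model's tool calls).
assistant_message = {
    "role": "assistant",
    "content": None,  # serialized as JSON null -> rejected by the API
    "tool_calls": [
        {
            "id": "call_abc123",  # hypothetical id
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "arguments": '{"location": "Boston"}',
            },
        }
    ],
}

body = json.dumps(assistant_message)
# The explicit null is what the API complains about:
# "'content' is a required property - 'messages.1'"
print('"content": null' in body)  # True
```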

Stack trace:

BadRequestError                           Traceback (most recent call last)
<ipython-input-73-3a60881757d5> in <cell line: 77>()
     75         )  # get a new response from the model where it can see the function response
     76         return second_response
---> 77 print(run_conversation())

5 frames
<ipython-input-73-3a60881757d5> in run_conversation()
     70                 }
     71             )  # extend conversation with function response
---> 72         second_response = openai.chat.completions.create(
     73             model="gpt-3.5-turbo-1106",
     74             messages=messages,

/usr/local/lib/python3.10/dist-packages/openai/_utils/_utils.py in wrapper(*args, **kwargs)
    297                         msg = f"Missing required argument: {quote(missing[0])}"
    298                 raise TypeError(msg)
--> 299             return func(*args, **kwargs)
    300 
    301         return wrapper  # type: ignore

/usr/local/lib/python3.10/dist-packages/openai/resources/chat/completions.py in create(self, messages, model, frequency_penalty, function_call, functions, logit_bias, max_tokens, n, presence_penalty, response_format, seed, stop, stream, temperature, tool_choice, tools, top_p, user, extra_headers, extra_query, extra_body, timeout)
    554         timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
    555     ) -> ChatCompletion | Stream[ChatCompletionChunk]:
--> 556         return self._post(
    557             "/chat/completions",
    558             body=maybe_transform(

/usr/local/lib/python3.10/dist-packages/openai/_base_client.py in post(self, path, cast_to, body, options, files, stream, stream_cls)
   1053             method="post", url=path, json_data=body, files=to_httpx_files(files), **options
   1054         )
-> 1055         return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
   1056 
   1057     def patch(

/usr/local/lib/python3.10/dist-packages/openai/_base_client.py in request(self, cast_to, options, remaining_retries, stream, stream_cls)
    832         stream_cls: type[_StreamT] | None = None,
    833     ) -> ResponseT | _StreamT:
--> 834         return self._request(
    835             cast_to=cast_to,
    836             options=options,

/usr/local/lib/python3.10/dist-packages/openai/_base_client.py in _request(self, cast_to, options, remaining_retries, stream, stream_cls)
    875             # to completion before attempting to access the response text.
    876             err.response.read()
--> 877             raise self._make_status_error_from_response(err.response) from None
    878         except httpx.TimeoutException as err:
    879             if retries > 0:
Versions

OpenAI SDK: 1.1.1
Python: 3.10.12

About this issue

  • Original URL
  • State: closed
  • Created 8 months ago
  • Comments: 19

Most upvoted comments

I’m experiencing the same issue with OpenAI 1.1.1 and Python 3.10.12.

A workaround is to change the None values in the response_message to acceptable ones:

if response_message.content is None:
    response_message.content = ""
if response_message.function_call is None:
    del response_message.function_call

Add this after the line response_message = response.choices[0].message (line 44)

EDIT: The proper fix is to upgrade pydantic with pip install --upgrade pydantic

@enochcheung @mikulskibartosz @cjpark-data @miwiley

I wasn’t able to reproduce the problem when running in a virtual environment, so I compared the output of pip freeze inside the virtual environment with the output outside it.

Turns out pydantic==1.10.12 was the issue.

Running pip install --upgrade pydantic solves the problem.

@mikulskibartosz It won’t affect the results, since it is deleted only if it is None (see the if statement it’s wrapped in). The problem is that the library sends content and function_call as None, but the API doesn’t accept null for either. Setting content to an empty string when it is None and removing function_call altogether when it is None solves the issue.

EDIT: No need to remove function_call - I only needed it at first because I used the following:

response_message = dict(response.choices[0].message)
if response_message["content"] is None:
    response_message["content"] = ""

Without casting to a dict it is sufficient to set the content to an empty string if it is None
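Putting the two adjustments together, here is a minimal stand-alone sketch of the sanitizing step, using a plain dict in place of the SDK message object (the helper name sanitize_message is hypothetical):

```python
def sanitize_message(message: dict) -> dict:
    """Return a copy of the message that the API will accept.

    The endpoint rejects a null 'content', so replace None with "",
    and drop a 'function_call' entry that is None rather than send null.
    """
    cleaned = dict(message)
    if cleaned.get("content") is None:
        cleaned["content"] = ""
    # Only delete function_call when the key exists and is None
    if cleaned.get("function_call", "") is None:
        del cleaned["function_call"]
    return cleaned


# Example: the shape of assistant message that triggered the 400
msg = {"role": "assistant", "content": None, "function_call": None}
print(sanitize_message(msg))  # {'role': 'assistant', 'content': ''}
```

Messages that already have a string content (or no function_call key at all) pass through unchanged.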

Fixed in https://github.com/openai/openai-python/releases/tag/v1.2.3

(upgrading pydantic also works)

Outstanding! Cheers.

@unconv The patch is working, although I did need to remove the function call as well, since it was None; replacing it with “” did not work. To make the example work:

response_message = response.choices[0].message
tool_calls = response_message.tool_calls
response_message = dict(response.choices[0].message)
if response_message["content"] is None:
    response_message["content"] = ""
if response_message["function_call"] is None:
    del response_message["function_call"]