llama_index: [Bug]: index.as_chat_engine is throwing error openai.NotFoundError: Error code: 404
Bug Description
I am using version 0.10.6, using Settings as per documentation instead of ServiceContext with Azure OpenAI. it is working well for index.as_query_engine.
However, when I changed to chate engine, as below
chat_engine = index.as_chat_engine(chat_mode='best', system_prompt=system_prompt, verbose=True)
response = chat_engine.chat(prompt)
Version
0.10.6
Steps to Reproduce
follows the steps of this documentation page
chat_engine = index.as_chat_engine(chat_mode='best', system_prompt=system_prompt, verbose=True)
prompt = 'What is most interesting about this essay'
response = chat_engine.chat(prompt)
Relevant Logs/Tracbacks
2024-02-24 20:19:51.761 Uncaught app exception
Traceback (most recent call last):
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 535, in _run_script
exec(code, module.__dict__)
File "/Users/localuser/Documents/AnacondaProjects/genAI2/azure_llm/maintenance_chat_assistance_V2.py", line 282, in <module>
main(config)
File "/Users/localuser/Documents/AnacondaProjects/genAI2/azure_llm/maintenance_chat_assistance_V2.py", line 252, in main
response = chat_engine.chat(prompt)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/core/callbacks/utils.py", line 41, in wrapper
return func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/core/agent/runner/base.py", line 575, in chat
chat_response = self._chat(
^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/core/agent/runner/base.py", line 520, in _chat
cur_step_output = self._run_step(
^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/core/agent/runner/base.py", line 372, in _run_step
cur_step_output = self.agent_worker.run_step(step, task, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/core/callbacks/utils.py", line 41, in wrapper
return func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/agent/openai/step.py", line 572, in run_step
return self._run_step(
^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/agent/openai/step.py", line 447, in _run_step
agent_chat_response = self._get_agent_response(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/agent/openai/step.py", line 321, in _get_agent_response
chat_response: ChatResponse = self._llm.chat(**llm_chat_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/core/llms/callbacks.py", line 93, in wrapped_llm_chat
f_return_val = f(_self, messages, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/llms/openai/base.py", line 237, in chat
return chat_fn(messages, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/llama_index/llms/openai/base.py", line 296, in _chat
response = client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 663, in create
return self._post(
^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/openai/_base_client.py", line 1200, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/openai/_base_client.py", line 889, in request
return self._request(
^^^^^^^^^^^^^^
File "/opt/miniconda3/envs/genAI2/lib/python3.11/site-packages/openai/_base_client.py", line 980, in _request
raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'Unrecognized request argument supplied: tools', 'type': 'invalid_request_error', 'param': None, 'code': None}}
About this issue
- Original URL
- State: open
- Created 4 months ago
- Comments: 20
đ¤
Based on the context provided, it seems like youâre encountering an issue with the high-level API
index.as_chat_engine(). As a workaround, you can use the low-level APIs to set up the chat engine for the Azure OpenAI model in LlamaIndex version 0.10.6. Hereâs how you can do it:AzureOpenAIMultiModalclass by providing the necessary parameters, including theengineparameter which corresponds to the custom name you chose for your deployment when you deployed a model on Azure OpenAI. Ensure you have set the required environment variables (OPENAI_API_VERSION,AZURE_OPENAI_ENDPOINT, andAZURE_OPENAI_API_KEY).Authentication: The class handles authentication automatically using the provided
use_azure_adflag. Ifuse_azure_adis set toTrue, it will use Microsoft Entra ID (formerly Azure AD) for token authentication. The_get_credential_kwargsmethod prepares the necessary keyword arguments for authentication, including refreshing the Azure AD token if needed.Making Requests: To make requests to the Azure OpenAI API, you can use the low-level
httpx.Clientdirectly with the credentials and endpoints configured in theAzureOpenAIMultiModalinstance. You will need to manually construct the request payload according to the Azure OpenAI API documentation and use thehttp_clientattribute of yourAzureOpenAIMultiModalinstance to send requests.This approach allows you to manually handle chat requests and responses using the Azure OpenAI API, bypassing the high-level API issues you encountered. Remember to replace placeholders like
"your_engine_name","https://YOUR_RESOURCE_NAME.openai.azure.com/", and"your_model_name"with your actual deployment details.As for your question about the role of the
index.as_chat_engine()high-level API in the LlamaIndex codebase, itâs not explicitly mentioned within the provided context. However, it can be inferred that the role of a high-level API resemblingindex.as_chat_engine()would be to facilitate the creation or interaction with a chat engine instance, specifically theContextChatEngineclass in this case. TheContextChatEngineclass is designed to use a retriever to fetch context, set this context in the system prompt, and then utilize a Large Language Model (LLM) to generate a response, aiming for a fluid chat experience. This involves retrieving relevant information based on the userâs message, processing this information, and then generating a response that is contextually aware, leveraging the capabilities of the underlying LLM and the retrieved context to provide meaningful interactions.You can find more details in the llama-index-core/llama_index/core/chat_engine/context.py file.
I hope this helps! If you have any more questions or need further clarification, feel free to ask.
Sources
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a đ if this is helpful and đ if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
đ¤
Based on the issues youâve encountered and the context provided, it seems like the error youâre experiencing is due to the tool named âfunctionsâ not being recognized. This could be due to a misconfiguration or a typo in the toolâs name.
There are two similar solved issues in the LlamaIndex repository that might be helpful:
[Bug]: Agent sometimes tries to use tools that do not exist and then crashes suggests adding a check to ensure that the tool exists before trying to call it. This involves catching the
ValueErrorand skipping the tool call if the tool does not exist, to prevent a crash. Another user suggested telling the OpenAI agent to stop prepending âfunctions.â to the actual tool name in the system prompt, which resolved a similar issue for them.[Bug]: #2129 was closed as resolved but I am still having the same error even with a completely new install and container environment, and others are reporting the same error on discord discusses a similar error and might provide some additional context or solutions.
As for setting up the chat engine using the low-level APIs, you can follow the pattern demonstrated in the
test_from_openai_messages_function_calling_azurefunction. This function shows how to manually convert messages from the Azure OpenAI format to the LlamaIndex format using fixtures that simulate the Azure OpenAI message structure and the corresponding LlamaIndex chat messages.Hereâs a simplified example:
Unfortunately, I couldnât find any specific changes between LlamaIndex version 0.10.6 and the latest version v0.10.13.post1 that could potentially resolve the
ValueError: Tool with name functions not founderror. Itâs possible that the answer may be available elsewhere or I could have missed it.I hope this helps! If you have any more questions or need further clarification, feel free to ask.
Sources
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a đ if this is helpful and đ if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
đ¤
Based on the error traceback you provided, it seems like the
get_function_by_namefunction is trying to find a tool named âfunctionsâ but itâs not finding any, hence theValueError: Tool with name functions not found.This could be due to a few reasons:
toolslist.To resolve this issue, you need to ensure that the tool with the name âfunctionsâ exists in the
toolslist. Hereâs how you can do it:Check the
toolslist to see if the tool with the name âfunctionsâ exists. You can print out thetoolslist to see all the available tools and their names.If the tool exists, make sure that the name youâre providing matches exactly with the name of the tool in the
toolslist. Remember that Python is case-sensitive.Hereâs a sample code snippet to print out the
toolslist:This will print out the names of all the tools in the
toolslist. You can then check if the tool youâre trying to access exists and if the name youâre providing matches exactly with the name of the tool.In addition, I found two similar issues in the LlamaIndex repository that might be helpful:
The first issue suggests modifying the code to include a check that ensures the tool exists before trying to call it. The second issue doesnât provide a specific solution for the error, but it might be worth checking out the discussion for any insights.
I hope this helps! If you have any other questions or need further clarification, feel free to ask.
Sources
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a đ if this is helpful and đ if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
đ¤
Based on the context provided, it seems like youâre encountering an
openai.NotFoundError: Error code: 404error due to an unrecognized request argument âtoolsâ. This error typically occurs when the requested resource is not found. In this case, it seems like the OpenAI model âgpt-4â youâre trying to use is not available or not found.As of the current OpenAI API version
2023-12-01-preview, there is no âgpt-4â model available. The available models are âgpt-3â, âdavinciâ, âcurieâ, âbabbageâ, and âadaâ.To fix this issue, you should use an available model. Hereâs how you can modify your code:
Please replace âgpt-3â with the model you have access to and try again. You can refer to the chat_engine_best.ipynb example for more details.
Regarding the changes between LlamaIndex library version 0.10.6 and the latest version v0.10.12, I wasnât able to find specific changes that could potentially resolve the
openai.NotFoundError: Error code: 404error. However, itâs generally recommended to use the latest version of the library to benefit from the latest features and bug fixes.If you continue to experience issues, please provide more details about your setup and the exact error message youâre seeing. This will help in diagnosing the problem more accurately.
Sources
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a đ if this is helpful and đ if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
đ¤
Hey @ramakrse! Great to see you diving into another challenge with LlamaIndex. Whatâs cooking this time?
Based on the error logs youâve provided, it seems like the
toolsargument in thechatmethod of thellama_index/agent/openai/step.pyfile is causing the issue. Thetoolsargument is a list ofBaseToolinstances used to perform various operations within the LlamaIndex framework. If the function name provided does not exist in thetoolslist, aValueErroris raised, which could be interpreted as a 404 error in a web context.To fix the error, ensure that the tool name provided exists in the
toolslist. Here is the relevant code:You can find this in the llama-index-legacy/llama_index/legacy/agent/openai/step.py file.
Additionally, there are a few similar issues in the LlamaIndex repository that have been resolved. You might find the discussions and solutions helpful:
These issues suggest using the correct Azure API version, the latest version of llama-index, and ensuring that your AzureOpenAI endpoint is set up correctly.
Unfortunately, I couldnât find any changes made in the LlamaIndex codebase between version 0.10.6 and the latest version 0.10.12 that could potentially resolve the
openai.NotFoundError: Error code: 404error. It might be worth trying to update to the latest version to see if the issue persists.If you continue to experience this issue, could you provide more information about your setup? Specifically, the tool names youâre using and how youâre setting up your
toolslist could be helpful for further troubleshooting.Sources
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a đ if this is helpful and đ if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
đ¤
Based on the context provided, it seems like youâre encountering a
ValueError: Tool with name functions not founderror when trying to set up a chat engine using theindex.as_chat_engine()method in LlamaIndex version 0.10.6. As a workaround, you can use the low-level APIs to directly instantiate one of the chat engine classes provided in thellama_index.core.chat_enginemodule.Hereâs an example of how you can set up the
SimpleChatEngine:You can replace
SimpleChatEnginewith any of the other available chat engines (CondenseQuestionChatEngine,ContextChatEngine,CondensePlusContextChatEngine) depending on your specific requirements.This approach bypasses the need to use a higher-level API that might be causing the
ValueError: Tool with name functions not founderror, by directly utilizing the classes defined for chat engine functionality.Unfortunately, I couldnât find specific documentation detailing the differences between the four chat engine classes (
SimpleChatEngine,CondenseQuestionChatEngine,ContextChatEngine,CondensePlusContextChatEngine) in thellama_index.core.chat_enginemodule. Itâs possible that the differences lie in how each class processes and responds to messages, but without more information, itâs difficult to say for certain.If you encounter any issues or need further clarification, please provide more details so I can assist you better.
Sources
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a đ if this is helpful and đ if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
đ¤
I apologize for the confusion. It seems there might have been a misunderstanding. As the maintainer of LlamaIndex, I strive to provide accurate and relevant recommendations. Could you please provide more details about your specific requirements and the context in which AzureOpenAIMultiModal was recommended? This will help me better understand your needs and provide you with the most appropriate guidance.
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a đ if this is helpful and đ if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
đ¤
Based on the error traceback you provided, it seems like the
OpenAIAgentWorkeris trying to call a tool named âfunctionsâ that it canât find. This is likely causing theValueError: Tool with name functions not foundyouâre encountering.The
get_function_by_namefunction, which is raising this error, is designed to retrieve a specific tool from a list of tools by its name. If the tool is not found in the list, it raises aValueError. This suggests that the tool named âfunctionsâ is not included in the list of tools youâre providing to theOpenAIAgentWorkerinstance, or thereâs a typo or case-sensitivity issue with the tool name.There are a few potential solutions to this issue based on similar solved issues in the LlamaIndex repository:
Modify the code to include a check that ensures the tool exists before trying to call it. This will prevent the agent from crashing when it tries to call a non-existent tool. You can refer to the solution provided in the issue #7170 for more details.
Ensure youâre using the correct Azure API version. Update the
api_versionto â2023-07-01-previewâ or the latest version supported by your Azure deployment. Verify that your AzureOpenAI endpoint is set up correctly and that it accepts thetoolsandtool_choicearguments. You can refer to the solution provided in the issue #9195 for more details.Openai updated their API. The correct kwarg is
tool_choice="query_engine_tool". You can refer to the solution provided in the issue #9315 for more details.Update LlamaIndex to the latest version if you havenât already. The latest version of LlamaIndex is v0.10.12.
If none of these solutions work, please provide more specific information about how youâre initializing and using the
OpenAIAgentWorkerand the tools youâre providing. This will help us better understand the issue and provide a more precise solution.Sources
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a đ if this is helpful and đ if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.