langchain: SQLDatabaseChain: "azure-openai-chat LLM cannot be loaded"
System Info
Unable to load a model logged using mlflow:
```python
import mlflow

logged_model = 'runs:/runid/sql_database_chain'

# Load model as a PyFuncModel.
loaded_model = mlflow.pyfunc.load_model(logged_model)
```

Versions:
- mlflow -> 2.7.2.dev0
- langchain -> 0.0.297
- langchain-experimental -> 0.0.20
Exception:
```
ValueError                                Traceback (most recent call last)
File <command-594169214443775>:5
      2 logged_model = 'runs:/8998b4fd57c743fe8e0dae9a19ca5155/sql_database_chain'
      4 # Load model as a PyFuncModel.
----> 5 loaded_model = mlflow.pyfunc.load_model(logged_model)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/mlflow/pyfunc/__init__.py:637, in load_model(model_uri, suppress_warnings, dst_path)
    635 data_path = os.path.join(local_path, conf[DATA]) if (DATA in conf) else local_path
    636 try:
--> 637     model_impl = importlib.import_module(conf[MAIN])._load_pyfunc(data_path)
    638 except ModuleNotFoundError as e:
    639     if conf[MAIN] == _DATABRICKS_FS_LOADER_MODULE:

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/mlflow/langchain/__init__.py:778, in _load_pyfunc(path)
    773 """
    774 Load PyFunc implementation for LangChain. Called by ``pyfunc.load_model``.
    775 :param path: Local filesystem path to the MLflow Model with the ``langchain`` flavor.
    776 """
    777 wrapper_cls = _TestLangChainWrapper if _MLFLOW_TESTING.get() else _LangChainModelWrapper
--> 778 return wrapper_cls(_load_model_from_local_fs(path))

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/mlflow/langchain/__init__.py:807, in _load_model_from_local_fs(local_model_path)
    804 model_type = flavor_conf.get(_MODEL_TYPE_KEY)
    805 loader_arg = flavor_conf.get(_LOADER_ARG_KEY)
--> 807 return _load_model(
    808     lc_model_path,
    809     model_type,
    810     loader_arg,
    811     agent_model_path,
    812     tools_model_path,
    813     agent_primitive_path,
    814     loader_fn_path,
    815     persist_dir,
    816 )

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/mlflow/langchain/__init__.py:660, in _load_model(path, model_type, loader_arg, agent_path, tools_path, agent_primitive_path, loader_fn_path, persist_dir)
    658     model = _RetrieverChain.load(path, **kwargs).retriever
    659 else:
--> 660     model = load_chain(path, **kwargs)
    661 elif agent_path is None and tools_path is None:
    662     model = load_chain(path)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:595, in load_chain(path, **kwargs)
    593     return hub_result
    594 else:
--> 595     return _load_chain_from_file(path, **kwargs)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:622, in _load_chain_from_file(file, **kwargs)
    619     config["memory"] = kwargs.pop("memory")
    621 # Load the chain from the config now.
--> 622 return load_chain_from_config(config, **kwargs)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:585, in load_chain_from_config(config, **kwargs)
    582     raise ValueError(f"Loading {config_type} chain not supported")
    584 chain_loader = type_to_loader_dict[config_type]
--> 585 return chain_loader(config, **kwargs)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:369, in _load_sql_database_chain(config, **kwargs)
    367 if "llm_chain" in config:
    368     llm_chain_config = config.pop("llm_chain")
--> 369     chain = load_chain_from_config(llm_chain_config)
    370     return SQLDatabaseChain(llm_chain=chain, database=database, **config)
    371 if "llm" in config:

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:585, in load_chain_from_config(config, **kwargs)
    582     raise ValueError(f"Loading {config_type} chain not supported")
    584 chain_loader = type_to_loader_dict[config_type]
--> 585 return chain_loader(config, **kwargs)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:41, in _load_llm_chain(config, **kwargs)
     39 if "llm" in config:
     40     llm_config = config.pop("llm")
---> 41     llm = load_llm_from_config(llm_config)
     42 elif "llm_path" in config:
     43     llm = load_llm(config.pop("llm_path"))

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/llms/loading.py:19, in load_llm_from_config(config)
     16 config_type = config.pop("_type")
     18 if config_type not in type_to_cls_dict:
---> 19     raise ValueError(f"Loading {config_type} LLM not supported")
     21 llm_cls = type_to_cls_dict[config_type]
     22 return llm_cls(**config)

ValueError: Loading azure-openai-chat LLM not supported
```
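The root cause is visible in the last frame: `load_llm_from_config` pops the serialized `_type` key from the config and looks it up in a registry dict, raising when the key is absent. A minimal sketch of that dispatch pattern (the registry entries here are illustrative placeholders, not the real LLM classes):

```python
# Sketch of the dispatch in langchain/llms/loading.py that produces the error:
# the serialized config's "_type" selects an LLM class from a registry dict,
# and "azure-openai-chat" is simply not a key in that registry.
type_to_cls_dict = {"openai": dict, "huggingface_hub": dict}  # illustrative entries only


def load_llm_from_config(config):
    config = dict(config)  # avoid mutating the caller's dict
    config_type = config.pop("_type")
    if config_type not in type_to_cls_dict:
        raise ValueError(f"Loading {config_type} LLM not supported")
    llm_cls = type_to_cls_dict[config_type]
    return llm_cls(**config)


try:
    load_llm_from_config({"_type": "azure-openai-chat"})
except ValueError as err:
    print(err)  # Loading azure-openai-chat LLM not supported
```

So any serialized chain whose LLM config carries `_type: azure-openai-chat` will fail to deserialize until that type is registered.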
Who can help?
na
Information
- The official example notebooks/scripts
- My own modified scripts
Related Components
- LLMs/Chat Models
- Embedding Models
- Prompts / Prompt Templates / Prompt Selectors
- Output Parsers
- Document Loaders
- Vector Stores / Retrievers
- Memory
- Agents / Agent Executors
- Tools / Toolkits
- Chains
- Callbacks/Tracing
- Async
Reproduction
na
Expected behavior
It should be able to load the model.
About this issue
- Original URL
- State: open
- Created 9 months ago
- Comments: 15
🤖
Hello,
The error message you’re encountering suggests that the LangChain version you’re using (0.0.297) does not support the azure-openai-chat language model.
To resolve this issue, you can use the AzureOpenAI class provided in the LangChain framework. This class is a wrapper for the Azure OpenAI chat completion API and can be used to interact with the azure-openai-chat model. Here is an example of how you can use the AzureOpenAI class:
Please replace `"35-turbo-dev"` with your actual deployment name and `"2023-05-15"` with your actual API version. You need to have the `openai` Python package installed and the following environment variables set (or passed to the constructor in lower case):

- `OPENAI_API_TYPE` (default: `azure`)
- `OPENAI_API_KEY`
- `OPENAI_API_BASE`
- `OPENAI_API_VERSION`
- `OPENAI_PROXY`
The above code is just an example and the actual implementation may vary depending on the specific requirements of the azure-openai-chat model.
In your context, it seems like you are trying to load the model using the MlflowLogger class in LangChain. If you are loading a model logged with mlflow, you might be facing this issue because the MlflowLogger class uses the `mlflow.langchain.log_model(chain, "langchain-model")` method to log the model, and that method might not be compatible with the azure-openai-chat model. You might need to modify the MlflowLogger class to make it compatible.

Additionally, the LangChain framework uses the `openai` Python package for interacting with the OpenAI API. If you are trying to load the azure-openai-chat model, ensure that the `openai` package is installed, that the necessary environment variables are set, and that the installed `openai` version is compatible with the azure-openai-chat model.

As for the changes in LangChain version 0.0.300, I couldn't find specific information about changes that would resolve the "Loading azure-openai-chat LLM not supported" error. It would be helpful if you could provide more information about the changes you made or the specific features you used in version 0.0.300.
I hope this helps! If you have any other questions or need further clarification, feel free to ask.
Best, Dosu
Sources
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.