langchain: create_extraction_chain does not work with other LLMs? Replicate model fails to load
System Info
LangChain 0.0.281
Who can help?
No response
Information
- The official example notebooks/scripts
- My own modified scripts
Related Components
- LLMs/Chat Models
- Embedding Models
- Prompts / Prompt Templates / Prompt Selectors
- Output Parsers
- Document Loaders
- Vector Stores / Retrievers
- Memory
- Agents / Agent Executors
- Tools / Toolkits
- Chains
- Callbacks/Tracing
- Async
Reproduction
Use these lines of code:
```python
from langchain.chains import create_extraction_chain
from langchain.llms import Replicate

# Schema describing the fields to extract from the note.
schema = {
    "properties": {
        "visit": {"type": "string"},
        "date": {"type": "string"},
        "gender": {"type": "string"},
        "age": {"type": "integer"},
    }
}

inp = """This 23-year-old white female presents with complaint of allergies.
She used to have allergies when she lived in Seattle but she thinks they are worse here.
In the past, she has tried Claritin, and Zyrtec. Both worked for short time but then seemed to lose effectiveness. """

llm = Replicate(
    model="a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5",
    input={"temperature": 0.75, "max_length": 500, "top_p": 1},
)

chain = create_extraction_chain(schema, llm)
chain.run(inp)
```
```
OutputParserException                    Traceback (most recent call last)
<ipython-input-9-5e77f11609b2> in <cell line: 72>()
     70 )
     71 chain = create_extraction_chain(schema, llm)
---> 72 chain.run(inp)["data"]

8 frames
/usr/local/lib/python3.10/dist-packages/langchain/output_parsers/openai_functions.py in parse_result(self, result)
     21         generation = result[0]
     22         if not isinstance(generation, ChatGeneration):
---> 23             raise OutputParserException(
     24                 "This output parser can only be used with a chat generation."
     25             )

OutputParserException: This output parser can only be used with a chat generation.
```
Expected behavior
Structured JSON based on the schema
About this issue
- Original URL
- State: closed
- Created 10 months ago
- Reactions: 3
- Comments: 16 (2 by maintainers)
🤖
Hello,
Thank you for reaching out and providing detailed information about the issue you’re facing.
The `create_extraction_chain` function is designed to work with specific large language models (LLMs), and it seems like the `Replicate` model you're trying to use might not be fully compatible with it. The function relies on specific methods and output formats (OpenAI-style function calling and an output parser that expects a chat generation) that are only available in certain chat models, and not in the `Replicate` model.
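For comparison, here is a minimal sketch of the setup the chain was designed around, using an OpenAI chat model with the same `schema` and `inp` from the reproduction above. The `ChatOpenAI` model choice is an assumption for illustration, not something from the original report:

```python
from langchain.chains import create_extraction_chain
from langchain.chat_models import ChatOpenAI  # assumes OPENAI_API_KEY is set

# create_extraction_chain builds on OpenAI function calling, so a chat model
# that supports function calling is expected here.
chat_llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
chain = create_extraction_chain(schema, chat_llm)
chain.run(inp)  # returns a list of dicts with the schema's properties
```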
As for the `Replicate` model failing to load, it could be due to a missing or incorrect `REPLICATE_API_TOKEN`. The `Replicate` model requires the `REPLICATE_API_TOKEN` environment variable to be set with your API token. If the token is missing or incorrect, the model will fail to load and the chain will raise an error.
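One way to set the token from inside a notebook, as a quick sketch (the placeholder value is obviously not a real token):

```python
import os

# Replace the placeholder with your actual Replicate API token.
os.environ["REPLICATE_API_TOKEN"] = "<your-replicate-api-token>"
```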
Another possible reason for the `Replicate` model failing to load could be the absence of the `replicate` Python package. The `Replicate` model requires the `replicate` Python package to be installed; if it's not installed, an `ImportError` will be raised.

Please ensure that `REPLICATE_API_TOKEN` is correctly set and the `replicate` Python package is installed. You can install the `replicate` package using pip:
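```
pip install replicate
```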
If you've already done these steps and are still facing the issue, it would be helpful if you could provide more information about your environment, such as the Python version you're using and the list of installed packages (`pip freeze`).

I hope this helps! Let me know if you have any other questions or if there's anything else I can do to assist you.
Best, Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Has anyone solved this issue?
Same question here in 2023.
Code
ISSUE
```
    raise OutputParserException(
langchain_core.exceptions.OutputParserException: This output parser can only be used with a chat generation.
```
The extraction use case docs have been revamped: https://python.langchain.com/docs/use_cases/extraction
Please see the documentation there for recommendations on how to do extraction with LangChain.
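For anyone landing here later, the revamped docs favor chat models with structured output over the old `create_extraction_chain`. A rough sketch in that style, assuming a chat model that supports structured output (the `ChatOpenAI` model and the field descriptions below are illustrative, not quoted from the docs):

```python
from typing import Optional

from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_openai import ChatOpenAI

class Visit(BaseModel):
    """Details extracted from a clinical visit note."""
    visit: Optional[str] = Field(default=None, description="Reason for the visit")
    date: Optional[str] = Field(default=None, description="Date of the visit")
    gender: Optional[str] = Field(default=None, description="Patient gender")
    age: Optional[int] = Field(default=None, description="Patient age in years")

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
structured_llm = llm.with_structured_output(Visit)
structured_llm.invoke(inp)  # returns a Visit instance populated from the text
```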
It was indeed the problem. I managed to make it work with the Kor library, which is just another way to create output parsers (a rough sketch follows at the end of this thread).
Thanks a lot for your response & help.
Really appreciate it !
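For reference, a rough sketch of what a Kor-based extraction chain can look like with a non-OpenAI LLM. The field descriptions and the single example below are illustrative assumptions; `llm` and `inp` are the objects defined in the reproduction above:

```python
from kor.extraction import create_extraction_chain
from kor.nodes import Object, Text, Number

# Describe the fields to extract; descriptions and the example are illustrative.
schema = Object(
    id="visit_info",
    description="Information about a patient visit.",
    attributes=[
        Text(id="visit", description="Reason for the visit"),
        Text(id="date", description="Date of the visit"),
        Text(id="gender", description="Patient gender"),
        Number(id="age", description="Patient age in years"),
    ],
    examples=[
        (
            "This 50-year-old male presents with a persistent cough.",
            [{"visit": "persistent cough", "gender": "male", "age": 50}],
        )
    ],
)

# Kor builds its own prompt and parser, so it can be used with LLMs that do not
# support OpenAI-style function calling (such as the Replicate wrapper here).
chain = create_extraction_chain(llm, schema)
chain.run(inp)["data"]
```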