chainlit: Llama Index Integration Broken?
The llama-index integration example from the Chainlit docs (https://docs.chainlit.io/integrations/llama-index) throws LookupError: <ContextVar name='emitter' at 0x7f180cba02c0>; I'm fairly sure it was working a few weeks ago.
My source code is identical to the example's. Full error message:
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/context.py", line 20, in get_emitter
    return emitter_var.get()
LookupError: <ContextVar name='emitter' at 0x7f180cba02c0>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ibicdev/anaconda3/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/home/ibicdev/anaconda3/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/llama_index/llms/base.py", line 133, in wrapped_llm_chat
    event_id = callback_manager.on_event_start(
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/llama_index/callbacks/base.py", line 80, in on_event_start
    handler.on_event_start(event_type, payload, event_id=event_id, **kwargs)
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/llama_index/callbacks.py", line 44, in on_event_start
    Message(
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/message.py", line 176, in __init__
    super().__post_init__()
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/message.py", line 33, in __post_init__
    self.emitter = get_emitter()
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/context.py", line 22, in get_emitter
    raise ChainlitContextException()
chainlit.context.ChainlitContextException: Chainlit context not found
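For context, the initial LookupError in the traceback is the standard behaviour of contextvars across threads: llama-index fires the callback from a worker thread (Thread-2 above), and a ContextVar set in the main thread is not visible from a freshly spawned thread. A minimal stdlib sketch (the emitter_var name here is illustrative, not Chainlit's actual code), including the generic workaround of copying the caller's context into the worker:

```python
import contextvars
import threading

# Illustrative stand-in for Chainlit's emitter ContextVar (no default value).
emitter_var = contextvars.ContextVar("emitter")
results = {}

def worker(key):
    try:
        results[key] = emitter_var.get()
    except LookupError:
        # This is the same failure mode as Chainlit's get_emitter().
        results[key] = "LookupError"

emitter_var.set("my-emitter")

# A plain thread starts with an empty context, so .get() raises.
t = threading.Thread(target=worker, args=("plain",))
t.start()
t.join()

# Generic fix: snapshot the caller's context and run the worker inside it.
ctx = contextvars.copy_context()
t2 = threading.Thread(target=ctx.run, args=(worker, "copied"))
t2.start()
t2.join()

print(results)  # {'plain': 'LookupError', 'copied': 'my-emitter'}
```

This suggests the regression is about where llama-index runs the callback, not about Chainlit's handler itself.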
About this issue
- State: closed
- Created a year ago
- Reactions: 2
- Comments: 15 (5 by maintainers)
The issue is with the latest llama-index version. If you try with 0.7.17 it should be working. Will investigate what the breaking change is.

Works with llama-index==0.7.17. Thanks!

@willydouhard It's not working for me either.
UI is stuck at the LLM step:

I am using chainlit==0.6.1.

Just did a bisect of llama_index and the faulty PR is https://github.com/jerryjliu/llama_index/pull/7112
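Until the regression is fixed upstream, the workaround confirmed in this thread is to pin llama-index to the last known-good release, e.g. in requirements.txt (version taken from the comments above; later releases may have resolved this):

```text
# requirements.txt — last llama-index release reported working in this thread
llama-index==0.7.17
```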
@Skisquaw can you please open another issue for your problem?
I’m getting a different issue. I use the above in a clean Python 3.9 conda env with /storage deleted, and the index is not working properly. Try this:

User 03:50:10 PM: who is Pat Gelsinger?
Chatbot 03:50:10 PM: Based on the given context information, there is no mention of Pat Gelsinger. Therefore, it is not possible to determine who Pat Gelsinger is based on this information alone.