chainlit: Llama Index Integration Broken?

The llama-index integration example from the Chainlit docs (https://docs.chainlit.io/integrations/llama-index) throws LookupError: <ContextVar name='emitter' at 0x7f180cba02c0>. I'm pretty sure it was working a few weeks ago.

My source code is identical to the example. Full error message:

Exception in thread Thread-2:
Traceback (most recent call last):
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/context.py", line 20, in get_emitter
    return emitter_var.get()
LookupError: <ContextVar name='emitter' at 0x7f180cba02c0>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ibicdev/anaconda3/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/home/ibicdev/anaconda3/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/llama_index/llms/base.py", line 133, in wrapped_llm_chat
    event_id = callback_manager.on_event_start(
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/llama_index/callbacks/base.py", line 80, in on_event_start
    handler.on_event_start(event_type, payload, event_id=event_id, **kwargs)
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/llama_index/callbacks.py", line 44, in on_event_start
    Message(
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/message.py", line 176, in __init__
    super().__post_init__()
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/message.py", line 33, in __post_init__
    self.emitter = get_emitter()
  File "/home/ibicdev/anaconda3/lib/python3.9/site-packages/chainlit/context.py", line 22, in get_emitter
    raise ChainlitContextException()
chainlit.context.ChainlitContextException: Chainlit context not found

About this issue

  • State: closed
  • Created a year ago
  • Reactions: 2
  • Comments: 15 (5 by maintainers)

Most upvoted comments

The issue is with the latest llama-index version. If you pin llama-index to 0.7.17 it should work. I will investigate what the breaking change is.
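Until the breaking change is fixed, a sketch of the workaround is to pin the last known-good release (the 0.7.17 version number comes from the comment above; adjust for your environment):

```shell
# Downgrade llama-index to the last release known to work with the
# Chainlit integration, per the maintainer's comment.
pip install "llama-index==0.7.17"
```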

Works with llama-index==0.7.17. Thanks!

@willydouhard It’s not working for me either.

❯ chainlit run .\llama_index.py -w
2023-08-06 16:10:11 - Created default config file at C:\Users\siddh\OneDrive\Desktop\Code\chainlit\apps\llama_index\.chainlit\config.toml
2023-08-06 16:10:15 - Loading all indices.
2023-08-06 16:10:15 - Your app is available at http://localhost:8000
Exception in thread Thread-1:
Traceback (most recent call last):
  File "C:\Users\siddh\miniconda3\envs\py38\lib\site-packages\chainlit\context.py", line 20, in get_emitter
    return emitter_var.get()
LookupError: <ContextVar name='emitter' at 0x0000016F83D4F4A0>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\siddh\miniconda3\envs\py38\lib\threading.py", line 932, in _bootstrap_inner
    self.run()
  File "C:\Users\siddh\miniconda3\envs\py38\lib\threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\siddh\miniconda3\envs\py38\lib\site-packages\llama_index\llms\base.py", line 133, in wrapped_llm_chat
    event_id = callback_manager.on_event_start(
  File "C:\Users\siddh\miniconda3\envs\py38\lib\site-packages\llama_index\callbacks\base.py", line 80, in on_event_start
    handler.on_event_start(event_type, payload, event_id=event_id, **kwargs)
  File "C:\Users\siddh\miniconda3\envs\py38\lib\site-packages\chainlit\llama_index\callbacks.py", line 45, in on_event_start
    Message(
  File "C:\Users\siddh\miniconda3\envs\py38\lib\site-packages\chainlit\message.py", line 168, in __init__
    super().__post_init__()
  File "C:\Users\siddh\miniconda3\envs\py38\lib\site-packages\chainlit\message.py", line 35, in __post_init__
    self.emitter = get_emitter()
  File "C:\Users\siddh\miniconda3\envs\py38\lib\site-packages\chainlit\context.py", line 22, in get_emitter
    raise ChainlitContextException()
chainlit.context.ChainlitContextException: Chainlit context not found

The UI is stuck at the LLM step (screenshot omitted).

I am using chainlit==0.6.1

Just did a bisect of llama_index and the faulty PR is https://github.com/jerryjliu/llama_index/pull/7112
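This is consistent with the tracebacks above: the error is raised in "Thread-2"/"Thread-1", not the main thread. Chainlit stores its emitter in a ContextVar, and a thread started with threading.Thread begins with an empty context, so a ContextVar set in the main thread is not visible there. A minimal sketch of the failure mode (and the copy_context workaround) — the variable names here only mirror Chainlit's, they are not its actual API:

```python
import contextvars
import threading

# Stand-in for chainlit's emitter ContextVar (hypothetical name).
emitter_var = contextvars.ContextVar("emitter")
emitter_var.set("emitter-from-main-thread")

results = []

def get_emitter():
    try:
        results.append(emitter_var.get())
    except LookupError:
        # A fresh thread has an empty context, so get() fails,
        # just like chainlit.context.get_emitter() in the traceback.
        results.append("LookupError: context not found")

# Plain thread: the ContextVar set above is not propagated.
t = threading.Thread(target=get_emitter)
t.start()
t.join()

# Workaround: snapshot the caller's context and run the target inside it.
ctx = contextvars.copy_context()
t = threading.Thread(target=lambda: ctx.run(get_emitter))
t.start()
t.join()

print(results)  # ['LookupError: context not found', 'emitter-from-main-thread']
```

So a library change that moves the LLM call (and its callbacks) into a worker thread would break any callback handler that relies on ContextVars, which matches the symptom here.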

@Skisquaw can you please open another issue for your problem?

I’m getting a different issue. I ran the above in a clean Python 3.9 conda env with /storage deleted, and the index is not working properly. Try this:

User 03:50:10 PM
who is Pat Gelsinger?

Chatbot 03:50:10 PM
Based on the given context information, there is no mention of Pat Gelsinger. Therefore, it is not possible to determine who Pat Gelsinger is based on this information alone.