ComfyUI-N-Nodes: AttributeError: 'Logger' object has no attribute 'fileno'

Got this error when trying to run inference.

Traceback (most recent call last):
  File "C:\sdComfyUI\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\ComfyUI\execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "C:\sdComfyUI\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\ComfyUI\execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "C:\sdComfyUI\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\ComfyUI\execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "C:\sdComfyUI\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-N-Nodes\py\gptcpp_node.py", line 50, in load_gpt_checkpoint
    llm = Llama(model_path=ckpt_path,n_gpu_layers=gpu_layers,verbose=False,n_threads=n_threads, n_ctx=4000, )
  File "C:\sdComfyUI\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\python_embeded\lib\site-packages\llama_cpp\llama.py", line 319, in __init__
    with suppress_stdout_stderr():
  File "C:\sdComfyUI\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\python_embeded\lib\site-packages\llama_cpp\utils.py", line 11, in __enter__
    self.old_stdout_fileno_undup = sys.stdout.fileno()
AttributeError: 'Logger' object has no attribute 'fileno'

I was able to build llama.cpp from source using CUDA Toolkit 12.1.

About this issue

  • State: closed
  • Created 10 months ago
  • Comments: 15 (4 by maintainers)

Most upvoted comments

Regarding the error: “AttributeError: ‘Logger’ object has no attribute ‘fileno’”, it occurred in my case due to the “ComfyUI-Manager” custom node. When I commented out line 41 in the “prestartup_script.py” file within the “ComfyUI-Manager”, the error disappeared.

What is on line 41? My prestartup_script.py looks like:

    41 else:
    42     self.sync_write(message)

The prestartup_script.py file has changed in recent updates of ComfyUI-Manager. To make the nodes work properly, you now need to comment out lines 75 and 76:

    #sys.stdout = Logger(True)
    #sys.stderr = Logger(False)
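Commenting those lines out disables ComfyUI-Manager's log capture entirely. A narrower workaround (a sketch, not code from either project; `real_std_streams` is a hypothetical helper) is to swap the interpreter's original streams back in only around the call that needs a real file descriptor:

```python
import contextlib
import sys

@contextlib.contextmanager
def real_std_streams():
    """Temporarily restore the interpreter's original stdout/stderr,
    which do have fileno(), then put the replaced streams back."""
    saved_out, saved_err = sys.stdout, sys.stderr
    sys.stdout, sys.stderr = sys.__stdout__, sys.__stderr__
    try:
        yield
    finally:
        sys.stdout, sys.stderr = saved_out, saved_err

# Hypothetical use around the failing constructor from the traceback:
# with real_std_streams():
#     llm = Llama(model_path=ckpt_path, n_gpu_layers=gpu_layers, verbose=False)
```

This keeps ComfyUI-Manager's logging intact everywhere else while letting llama-cpp-python's fd-level redirection succeed inside the `with` block.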

Essentially, what’s happening is that ComfyUI-Manager is ‘hijacking’ the sys.stdout object and replacing it with its own Logger class, which lacks the fileno() method. As far as I understand, there’s little that Nuked can do to mitigate this issue, since it’s llama.cpp that requires the fileno() method.
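The failure can be reproduced in isolation with a stub that wraps stdout but forgets fileno() (illustrative only; the real Logger class in ComfyUI-Manager is different):

```python
import sys

# Minimal stand-in for the kind of wrapper ComfyUI-Manager installs
# over sys.stdout (illustrative; not the actual Logger class).
class BrokenLogger:
    def __init__(self, stream):
        self._stream = stream
    def write(self, message):
        return self._stream.write(message)
    def flush(self):
        self._stream.flush()

# llama-cpp-python's suppress_stdout_stderr() calls sys.stdout.fileno();
# on such a wrapper that raises AttributeError, as in the traceback above.
try:
    BrokenLogger(sys.__stdout__).fileno()
except AttributeError as e:
    print("reproduced:", e)

# A wrapper that delegates fileno() to the underlying stream avoids the crash:
class FixedLogger(BrokenLogger):
    def fileno(self):
        return self._stream.fileno()

print(FixedLogger(sys.__stdout__).fileno())  # the real stdout file descriptor
```

Delegating fileno() would be a fix on the ComfyUI-Manager side; from the node's side the only options are the workarounds above.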

It works now!! Thank you to everyone who creates these tools and makes it possible for us to use them.

Thanks for the tip, it works now. I can confirm that I’m now able to use GGML v3 and GGUF models.

[screenshot: ggufWorks2]

Works on a fresh install of ComfyUI with only these two custom nodes installed, and it works with GGML v3 LLMs.

[screenshot: extra2nodes]

[screenshot: ggmlWorks]

I’ll investigate what causes it to break when all these other custom nodes are installed.

[screenshot: otherCustomNodes]

Thanks for your work though. I was looking to be able to use local LLMs along with ComfyUI for so long, and you’ve delivered.