proxy.py: [Core] Default send buffer size must be configurable

I noticed that queueing a huge response in a plugin's "handle_client_request()" method is very slow.

# Imports assumed for a recent proxy.py release; exact paths may differ by version.
from typing import Optional

from proxy.http.parser import HttpParser
from proxy.http.responses import okResponse

# Methods below belong to an HttpProxyBasePlugin subclass.

def handle_client_request(self, request: HttpParser) -> Optional[HttpParser]:
    if some_condition:  # placeholder for the plugin's own routing check
        self.handle_request_locally()
        return None  # returning None tells proxy.py not to forward the request upstream
    # else access the remote resource
    return request

def handle_request_locally(self) -> None:
    with open('my_file', 'rb') as f:
        file_data = f.read()
    self.client.queue(
        okResponse(
            file_data,
            {b'Content-Type': b'application/octet-stream'},
            conn_close=True,
        ),
    )

Connecting to the proxy from a browser and downloading a huge file this way results in very slow handling of the data. My question: is there a bottleneck somewhere in proxy.py, or is this simply the wrong approach? Even when proxy.py runs on localhost, so no network transfer is involved, the browser estimates the transfer will take hours or days.
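
One variation worth trying (my own sketch, not something from proxy.py's docs): queue the response head once, then queue the file in fixed-size chunks instead of building one giant okResponse payload. The chunk size, the memoryview wrapping, and the connection-close framing are all assumptions about recent proxy.py versions; with no Content-Length header and Connection: close, the browser reads the body until the connection closes.

CHUNK_SIZE = 64 * 1024  # hypothetical chunk size

def handle_request_locally(self) -> None:
    # Hand-rolled response head: Connection: close and no Content-Length,
    # so the body is delimited by the connection closing.
    self.client.queue(memoryview(
        b'HTTP/1.1 200 OK\r\n'
        b'Content-Type: application/octet-stream\r\n'
        b'Connection: close\r\n'
        b'\r\n',
    ))
    with open('my_file', 'rb') as f:
        while chunk := f.read(CHUNK_SIZE):
            # Each queued buffer stays small, rather than one buffer
            # holding the entire file.
            self.client.queue(memoryview(chunk))

This still reads the whole file up front, but it avoids a single contiguous buffer of the full file size; whether it also improves throughput depends on how proxy.py's flush loop slices its send buffer internally.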

About this issue

  • State: closed
  • Created 2 years ago
  • Comments: 16 (12 by maintainers)

Most upvoted comments

@achen353 Kindly check the finding by @softhub-software-development, maybe it can help you with #1044 too. Can you try it and let us know? Thank you!

@abhinavsingh Thank you for the advice. I’ve been attempting to get access to our team’s public server for testing. I’ll definitely try this out and keep you posted.

Setting DEFAULT_MAX_SEND_SIZE to 128k results in the error message “OSError when flushing buffer to client”. Increasing it from 16k to 64k seems OK.

I noticed that increasing DEFAULT_MAX_SEND_SIZE in proxy/common/constants.py from 16k to 128k improves performance significantly.
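
For context, not from the thread itself: the constant being discussed lives in proxy/common/constants.py and looks roughly like the excerpt below (the exact line may differ across versions). The comments above describe editing this value in place; reassigning it at runtime from outside may not take effect, since consuming modules typically bind the name at import time, which is presumably why the issue title asks for a proper configuration flag instead.

# proxy/common/constants.py (excerpt, approximate)
DEFAULT_MAX_SEND_SIZE = 16 * 1024  # bytes flushed to a connection per send(); 64 * 1024 was reported safe above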