fastapi: A running background task will block new requests from the OpenAPI view when using a metrics exporter middleware

Describe the bug

To Reproduce

Steps to reproduce the behavior with a minimal, self-contained file:

  1. Create a file (e.g. main.py) with:
from time import sleep

from fastapi import FastAPI, BackgroundTasks
from starlette_exporter import PrometheusMiddleware, handle_metrics


app = FastAPI()
app.add_middleware(PrometheusMiddleware)
app.add_route("/metrics", handle_metrics)


def task(seconds):
    print("Starting task, sleep", seconds)
    sleep(seconds)
    print("Done task, sleep", seconds)


@app.get("/{seconds}")
async def test(seconds: int, background_tasks: BackgroundTasks):
    print(f"In route with seconds={seconds}")
    background_tasks.add_task(task, seconds=seconds)
    return f"{seconds} hello world!"
  2. Open the OpenAPI docs view and click “Try it out”.
  3. Send a request with 10 seconds (or any other value long enough to notice).
  4. The response will return immediately.
  5. Immediately send another request; it will block until the background task finishes.
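
To run the reproduction outside the docs view, a minimal sketch, assuming the file is saved as main.py and uvicorn (the server used in FastAPI’s tutorials) is installed:

import uvicorn

if __name__ == "__main__":
    # Serve the reproduction app; the docs view is then at http://127.0.0.1:8000/docs
    uvicorn.run("main:app", host="127.0.0.1", port=8000)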

Expected behavior

Requests shouldn’t be blocked by background tasks

Environment

  • OS: Linux
  • FastAPI version: 0.52.0
  • Python version: 3.7.7

Additional context

At first, I thought about opening this issue in the starlette-prometheus package, but then I tried it with starlette-exporter, as you can see in the example, and it still happens. I don’t know why it reproduces via the OpenAPI view but not via curl, and I don’t have the knowledge to investigate this issue.

About this issue

  • State: closed
  • Created 4 years ago
  • Reactions: 15
  • Comments: 15 (2 by maintainers)

Most upvoted comments

I have hit a similar issue, but I think this bug is actually a specific case of https://github.com/encode/starlette/issues/919. So a Starlette bug, not a FastAPI bug.

Same problem here: with GZipMiddleware enabled, the request is blocked until the background task finishes. If I disable it, it works as expected.

The curious thing is that the log shows that the request was correctly processed, so I initially assumed it was a reverse proxy problem.

I don’t use any middleware but it happens on my app too. Why does it happen?

@ycd that seems to relate to apps that use middleware extending BaseHTTPMiddleware, and to my knowledge my app isn’t using that:

(Pdb) app.middleware_stack
<starlette.middleware.errors.ServerErrorMiddleware object at 0x1095e3970> # ServerErrorMiddleware doesn't extend BaseHTTPMiddleware
(Pdb) app.user_middleware
[]

Seems like my issue was actually due to my database: trying Postgres instead of SQLite made it work fine.
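
For anyone wanting to run the same check as the pdb session above, a quick sketch (the helper name is hypothetical) that lists any installed user middleware built on BaseHTTPMiddleware, which the linked Starlette issue implicates:

from starlette.middleware.base import BaseHTTPMiddleware


def base_http_middlewares(app):
    # app.user_middleware holds starlette.middleware.Middleware entries;
    # .cls is the middleware class that will be instantiated, and the
    # @app.middleware("http") decorator registers BaseHTTPMiddleware here.
    return [
        m.cls for m in app.user_middleware
        if isinstance(m.cls, type) and issubclass(m.cls, BaseHTTPMiddleware)
    ]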

I’m experiencing a similar issue with the elastic apm-agent-python: running a background task blocks the request that initiated it from returning. Without the APM middleware, the response returns immediately as expected, and the background task runs separately until it finishes.

For the time being I’ve resorted to:


from asyncio import run
from threading import Thread

def _async_run(*args, **kwargs):
    # Thread entry point: pull the coroutine function back out of kwargs
    # and run it to completion on a fresh event loop in this thread.
    target = kwargs.pop("__target")
    run(target(*args, **kwargs))


def run_in_thread(target, *args, **kwargs):
    # Fire-and-forget: start the coroutine in a daemon thread so the response
    # isn't held up by the middleware waiting on background work.
    kwargs["__target"] = target
    thread = Thread(target=_async_run, args=args, kwargs=kwargs, daemon=True)
    thread.start()


@router.get("/")
def my_route():
    run_in_thread(my_long_running_coroutine, arg1, arg2, kwarg1="ok")

I do imagine this has an impact on APM/metrics, as these tasks are likely not being traced. There’s a good chance they’re instrumentable, if that’s necessary.
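
A sketch of an alternative for purely async work, assuming the route itself is async: schedule the coroutine directly on the running event loop instead of spawning a thread. Like the thread approach, the scheduled task is not awaited by the middleware stack (and likely not traced either); the names below are stand-ins, not part of any library API.

import asyncio

from fastapi import APIRouter

router = APIRouter()

# Keep strong references so scheduled tasks aren't garbage-collected mid-flight.
_pending_tasks = set()


async def my_long_running_coroutine(arg1, arg2, kwarg1=None):
    await asyncio.sleep(10)  # stand-in for the real async work


@router.get("/")
async def my_route():
    # Schedule the coroutine on the running event loop and return immediately;
    # unlike BackgroundTasks, it is never awaited by the middleware stack.
    task = asyncio.create_task(my_long_running_coroutine(1, 2, kwarg1="ok"))
    _pending_tasks.add(task)
    task.add_done_callback(_pending_tasks.discard)
    return {"status": "scheduled"}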

Same here, could verify the issue with this code:

from typing import Optional
import time
from fastapi import FastAPI, BackgroundTasks, Request
from fastapi.middleware.gzip import GZipMiddleware
app = FastAPI()

def lol():
    time.sleep(15)

app.add_middleware(GZipMiddleware, minimum_size=1000)

@app.middleware("http")
async def add_process_time_header(request: Request, call_next):
    start_time = time.time()
    response = await call_next(request)
    process_time = time.time() - start_time
    response.headers["X-Process-Time"] = str(process_time)
    return response

@app.get("/")
def read_root(background_tasks: BackgroundTasks):
    background_tasks.add_task(lol)
    return {"Hello": "World"}


@app.get("/items/{item_id}")
def read_item(item_id: int, q: Optional[str] = None):
    return {"item_id": item_id, "q": q}
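
If BaseHTTPMiddleware really is the culprit (the @app.middleware("http") decorator above is built on it, and the linked Starlette issue points there), a sketch of the same X-Process-Time header written as a pure ASGI middleware, which avoids BaseHTTPMiddleware entirely; this is only a workaround sketch and may not explain the GZipMiddleware report above:

import time

from fastapi import FastAPI


class ProcessTimeMiddleware:
    # Pure ASGI middleware: no BaseHTTPMiddleware involved.
    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        if scope["type"] != "http":
            await self.app(scope, receive, send)
            return

        start_time = time.time()

        async def send_wrapper(message):
            if message["type"] == "http.response.start":
                process_time = time.time() - start_time
                # Raw ASGI headers are (bytes, bytes) pairs.
                message.setdefault("headers", []).append(
                    (b"x-process-time", str(process_time).encode())
                )
            await send(message)

        await self.app(scope, receive, send_wrapper)


app = FastAPI()
app.add_middleware(ProcessTimeMiddleware)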

No, the first request is handled correctly and fast. The second one is stuck in the OpenAPI interface and doesn’t even get to the route function. I tried both sync and async tasks.