ClickHouse: Memory limit (total) exceeded issues with wait_end_of_query=1
Using version a2ed8267add1cfc28fa2ca17f7227a816921e7d7: if I run a query like `SELECT 1` in a loop with `wait_end_of_query=1`, after roughly 7000 iterations every query fails with:
```
DynamicQueryHandler: Code: 241, e.displayText() = DB::Exception: Memory limit (total) exceeded: would use 14.40 GiB (attempt to allocate chunk of 4198133 bytes), maximum: 14.40 GiB, Stack trace (when copying this message, always include the lines below):
```
If `wait_end_of_query` is not specified, the problem does not occur.
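A minimal reproduction sketch of the loop described above (assuming a local ClickHouse server listening on the default HTTP port 8123; the function name and iteration count are illustrative):

```shell
#!/usr/bin/env bash
# Hypothetical repro loop: send trivial queries over the HTTP interface,
# with wait_end_of_query=1 forcing server-side buffering of each result.
repro() {
    local url='http://127.0.0.1:8123/?wait_end_of_query=1'
    local i
    for i in $(seq 1 10000); do
        # Once the tracked total memory crosses the server limit,
        # requests start failing with the Code: 241 error shown above.
        curl -s --data 'SELECT 1' "$url" >/dev/null || {
            echo "request failed at iteration $i"
            return 1
        }
    done
}
```

Dropping `wait_end_of_query=1` from the URL makes the loop run without tripping the limit, matching the report above.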
About this issue
- Original URL
- State: closed
- Created 4 years ago
- Comments: 23 (5 by maintainers)
Commits related to this issue
- Fix memory accounting via HTTP interface

  ```shell
  function perf_test() {
      time yes '127.1:8123/?wait_end_of_query=1' | head -n10000 | xargs -P10000 curl -s -d 'select 1' | grep -x -c 1
  }
  function server() ...
  ```

  — committed to azat/ClickHouse by azat 4 years ago
- Backport #11920 to 20.5: Fix using current database while checking access rights. (#11963) * Simple github hook * Add concurrent benchmark to performance test After the main test, run queries f... — committed to ClickHouse/ClickHouse by abyss7 4 years ago
Yes, this is kind of expected behaviour.

You can add the following into `config.xml`; to verify, take a look into the server logs.
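The config snippet from the original comment did not survive extraction; a plausible reconstruction, assuming the `max_server_memory_usage` setting mentioned below (the value here is illustrative, not from the original comment):

```xml
<!-- config.xml fragment (illustrative, reconstructed) -->
<!-- Hard cap on total server memory usage, in bytes; 0 lets ClickHouse
     derive the cap from max_server_memory_usage_to_ram_ratio instead. -->
<max_server_memory_usage>0</max_server_memory_usage>
```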
And yes, it only pops up now because `max_server_memory_usage` was added in 20.4 (#10421, #10362).

P.S. This memory-tracking accounting is reset every 90 seconds (AFAIR), but because these queries are so fast (they are no-ops), the reset does not help here.