infi.clickhouse_orm: Very long inserts and MemoryError
Bulk inserts are very slow via the ORM. I insert 1k rows into a table. Here is how the code is timed:
```python
t2 = time.time()
statsd.timing(statsd_key_prefix + '.import_clickhouse_model_objects.convert', t2 - t1)
for model_group in convert_items.values():
    settings.CLICKHOUSE_DB.insert(model_group, batch_size=batch_size)
t3 = time.time()
```
I know that the convert_items dict contains only one item, so in fact only the insert is measured.
batch_size is 1000 here, and there are 1000 elements in the list.
I've also tried batch_size=10000, but the query failed with a MemoryError on the backend side…
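One way to keep per-request memory bounded regardless of the input size is to chunk the rows yourself before handing them to `insert()`. A minimal sketch of such a chunking helper (the `chunks` function and the commented usage are my own illustration, not part of infi.clickhouse_orm):

```python
from itertools import islice

def chunks(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Hypothetical usage with the ORM's Database.insert from the snippet above:
# for batch in chunks(model_group, 1000):
#     settings.CLICKHOUSE_DB.insert(batch, batch_size=1000)
```

This keeps each call small, so a backend MemoryError on one oversized request cannot occur.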
The ClickHouse server is not loaded at all (8 cores, 16 GB memory); htop shows practically no activity.
Here is the graph, where you can see inserts taking 60-100 seconds… Very, very slow.

Another query in my code does an insert-from-select via settings.CLICKHOUSE_DB.raw(query). It inserts 200 rows and executes in only 1-2 seconds.
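The insert-from-select is fast because the data never leaves the server: no rows are serialized through Python. A sketch of building such a query (the table and column names are hypothetical, only `raw()` is from the report above):

```python
def insert_from_select(target_table, source_table, columns):
    """Build an INSERT ... SELECT statement that copies `columns`
    from `source_table` into `target_table`, entirely server-side."""
    cols = ', '.join(columns)
    return 'INSERT INTO {} ({}) SELECT {} FROM {}'.format(
        target_table, cols, cols, source_table)

query = insert_from_select('stats_daily', 'stats_raw', ['date', 'user_id', 'value'])
# The resulting string would then be run via settings.CLICKHOUSE_DB.raw(query)
```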
About this issue
- Original URL
- State: closed
- Created 7 years ago
- Comments: 17 (17 by maintainers)
Most upvoted comments
Awesome
+1
M1ha-Shvn on Apr 6, 2017