requests: iter_content slow with large chunk size on HTTPS connection

import requests

response = requests.get(url, stream=True)  # url as in the original report
for content in response.iter_content(100 * 2 ** 20):  # 100 MiB chunks
    pass

This pegs a CPU core at 100% and drops throughput below 1 MB/s.
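A minimal sketch of the same read loop with a much smaller chunk size, run against an in-memory stream rather than a live HTTPS response (the `iter_chunks` helper and the 10 MiB body are illustrative stand-ins, not part of requests):

```python
import io

def iter_chunks(stream, chunk_size):
    """Yield successive chunks from a file-like object, mirroring the
    shape of Response.iter_content (illustrative helper only)."""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Simulate a 10 MiB response body and read it in 64 KiB chunks.
body = io.BytesIO(b"\x00" * (10 * 2 ** 20))
total = sum(len(c) for c in iter_chunks(body, 64 * 2 ** 10))
print(total)  # 10485760
```

With a real response, the analogous change is simply passing a smaller value (e.g. `64 * 2 ** 10`) to `iter_content`.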

About this issue

  • State: closed
  • Created 8 years ago
  • Comments: 22 (11 by maintainers)

Most upvoted comments

So @pdknsk, after further investigation prodded by @njsmith, you’ve actually stumbled onto two bugs that are wasting a lot of CPU and electricity.

The first is that CFFI should be using a more efficient allocation strategy. The second, and much more important, is that for allocations between roughly 128 kB and 125 MB, macOS (and, presumably, iOS/watchOS/tvOS as well) forcibly pages in and zeroes memory when it doesn't need to, wasting a ton of CPU in the process.

Nicely done!
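The zero-fill cost described in that comment can be probed with a rough timing sketch. Timings are platform-dependent and the pathological page-in behaviour is specific to macOS, so the numbers here are illustrative only; the allocation sizes chosen just bracket the reported region:

```python
import time

def alloc_time(nbytes):
    """Time a single zero-filled allocation of nbytes.

    bytearray(n) is zero-filled in CPython, so this forces the pages to
    be backed with zeroed memory; how expensive that is depends on the OS.
    """
    start = time.perf_counter()
    buf = bytearray(nbytes)
    elapsed = time.perf_counter() - start
    return len(buf), elapsed

small, t_small = alloc_time(64 * 2 ** 10)    # 64 KiB: below the problem region
large, t_large = alloc_time(100 * 2 ** 20)   # 100 MiB: inside the reported region
print(small, large)  # 65536 104857600
```

On an affected macOS version the per-byte cost of the large allocation would be visibly worse; elsewhere the two should scale roughly linearly.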