requests: Possible Memory Leak
I’m crawling a large number of different URLs with the requests library and noticed that the process consumes more and more RAM over time. Essentially all I do is call this repeatedly from multiple threads:
r = requests.get(url=url, timeout=timeout)
content = r.text
When I comment out the second line, the issue does not occur. Am I using the library fundamentally wrong, or could this actually be a bug?
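For reference, here is a minimal sketch of a commonly suggested mitigation for this pattern (not a fix confirmed in this thread). One frequently reported cause of growing memory with `r.text` is that, when the server declares no encoding, requests falls back to charset detection over the whole body; reading `r.content` and decoding it yourself avoids that pass, and using context managers guarantees the response and session release their sockets. The names `fetch` and `urls`, the fallback encoding, and the worker count are all illustrative assumptions:

import concurrent.futures

import requests

def fetch(url, timeout=10):
    # Hypothetical worker: scoping a Session per call keeps connection
    # pooling contained and closes its sockets when the block exits.
    with requests.Session() as session:
        with session.get(url, timeout=timeout) as r:
            # r.content skips the charset-detection pass that r.text
            # runs when the server sends no encoding header; decode
            # with the declared encoding or an assumed fallback.
            return r.content.decode(r.encoding or "utf-8", errors="replace")

# Example driver, assuming `urls` is an iterable of URL strings.
urls = ["https://example.com"]
with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    for text in pool.map(fetch, urls):
        pass  # process each page's text here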
About this issue
- State: closed
- Created 7 years ago
- Comments: 23 (9 by maintainers)
I’m using Python 2.7.15 and requests 2.21.0 and encountering the exact same memory leak.