subfinder: Could not run enumeration (too many open files error)
What’s the problem (or question)?
While performing an automatic scan, the following problem occurred:
[FTL] Could not run enumeration: open /11/miningpools.cloud.txt: too many open files
After this message Subfinder stopped working.
How can we reproduce the issue?
- Start Subfinder with the following command:
cat '/URLsToScan.txt' | 'go/bin/subfinder' -config '/Config/subfinder/config.yaml' -t 100 -timeout 1 -silent -oD /11/ | tee /URLsToScanDONEsubfinder.txt
After 1034 URLs the error occurred. I don’t know whether it is a problem in Subfinder or, again, a problem with my internet connection.
What are the running context details?
- Installation method (e.g. pip, apt-get, git clone or zip/tar.gz): GO111MODULE=on go get -v github.com/projectdiscovery/subfinder/cmd/subfinder
- Client OS: Linux
- Program version (see banner): 2.3.4
- Exception traceback (if any):
[FTL] Could not run enumeration: open /11/miningpools.cloud.txt: too many open files
About this issue
- State: closed
- Created 4 years ago
- Reactions: 1
- Comments: 19 (9 by maintainers)
@bauthard - It seems like we need fdmax?
Looks like a different error, it is not related to file descriptors limit.
Could you please paste the complete command?
It’s time to use fdmax, but IMO it should be used as an option. Thoughts?
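For reference, here is a minimal sketch of what an fdmax-style helper does on Linux, using only the standard library. This is not the actual fdmax package API, and the 65535 target is an assumption for illustration:

```go
package main

import (
	"fmt"
	"syscall"
)

// raiseFdLimit bumps the soft RLIMIT_NOFILE up toward the requested
// target, capped at the hard limit, which is roughly what an opt-in
// "fdmax" style helper would do. Sketch only; the target value and
// error handling are assumptions.
func raiseFdLimit(target uint64) error {
	var rl syscall.Rlimit
	if err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rl); err != nil {
		return err
	}
	if target > rl.Max {
		target = rl.Max // cannot exceed the hard limit without privileges
	}
	rl.Cur = target
	return syscall.Setrlimit(syscall.RLIMIT_NOFILE, &rl)
}

func main() {
	if err := raiseFdLimit(65535); err != nil {
		fmt.Println("could not raise fd limit:", err)
	}
}
```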
Oh, I see, sorry.
Hi @vzamanillo @bauthard,
Thanks for the update. Unfortunately I can’t test it at the moment because I have problems with my internet connection; see the following tweet: https://twitter.com/zero_dot1/status/1306782690696466435 If my internet works properly again, I will test it immediately. Thanks a lot for your work, I appreciate it very much.
@vzamanillo Yes, here is a list of 1261 mixed URLs. File: testurls.txt
@ZeroDot1 Could you please give me a URL test file?
A quick workaround would be to split the URL lists into several smaller files. However, this means a significant amount of additional work when maintaining the lists. With other tools, e.g. Findomain, I don’t have this problem; I successfully tested a URL list with 50,000 URLs without any interruption.
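As an illustration of that splitting workaround, here is a hedged Go sketch; the chunk size (500 lines) and the output file names (chunk-0.txt, chunk-1.txt, …) are assumptions for the example, not anything Subfinder prescribes:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
)

// splitList writes lines from path into sequential chunk files of at
// most size lines each, so that each chunk can be fed to subfinder
// separately. Names and chunk size are illustrative assumptions.
func splitList(path string, size int) error {
	in, err := os.Open(path)
	if err != nil {
		return err
	}
	defer in.Close()

	var out *os.File
	count, chunk := 0, 0
	scanner := bufio.NewScanner(in)
	for scanner.Scan() {
		if count%size == 0 {
			if out != nil {
				out.Close() // close the previous chunk before opening the next
			}
			out, err = os.Create(fmt.Sprintf("chunk-%d.txt", chunk))
			if err != nil {
				return err
			}
			chunk++
		}
		fmt.Fprintln(out, scanner.Text())
		count++
	}
	if out != nil {
		out.Close()
	}
	return scanner.Err()
}

func main() {
	if err := splitList("URLsToScan.txt", 500); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```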
While researching for the CoinBlockerLists I tried a lot of tools and solutions. Subfinder is the only solution that can deliver the most subdomains, that’s why I use Subfinder a lot. Subfinder is definitely very useful for my work on the CoinBlockerLists, and I really appreciate the work you guys do. Thank you very much.
Thank you very much, @ZeroDot1
I don’t know if the right thing is to do something about this at the development level, because it is an OS ulimit issue. I think using some technique to bypass the ulimit is not correct because it is not legitimate; it is up to the user to choose whether to increase it or not.
@ZeroDot1 Can you try to reproduce this with the latest release?
Can we test this against the new 2.4 release?
We have to check that all the resources used in each source are released correctly before using fdmax; they may not all be released correctly at the moment. I think it is a good exercise: there are some Golang gotchas with defer, e.g. it does not work as expected inside iteration blocks.
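To make that defer gotcha concrete, here is a small illustrative sketch (function and variable names are hypothetical): defer runs at function exit, not at the end of each loop iteration, so every file stays open until the surrounding function returns, which is exactly how a long enumeration can exhaust the descriptor limit.

```go
package main

import "os"

// processAllLeaky keeps every file open until the function returns,
// because deferred calls only run at function exit, not per iteration.
func processAllLeaky(paths []string) error {
	for _, p := range paths {
		f, err := os.Open(p)
		if err != nil {
			return err
		}
		defer f.Close() // BUG: accumulates one open descriptor per path
		// ... read from f ...
	}
	return nil
}

// processAll wraps each iteration in its own closure so the deferred
// Close runs as soon as that file has been handled.
func processAll(paths []string) error {
	for _, p := range paths {
		if err := func(p string) error {
			f, err := os.Open(p)
			if err != nil {
				return err
			}
			defer f.Close() // runs at the end of this closure, per file
			// ... read from f ...
			return nil
		}(p); err != nil {
			return err
		}
	}
	return nil
}

func main() {}
```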