euscan: robots.txt, timeout, user-agent, ...
- Add a blacklist for robots.txt; we *want* to scan sourceforge (see the sketch below)
- Set a user-agent that doesn't look like a browser
- Handle timeouts more carefully
- If brute force detects too many versions, avoid infinite loops
- Handle redirections more carefully

Signed-off-by: Corentin Chary <corentincj@iksaif.net>
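For illustration, the robots.txt blacklist and non-browser user-agent from the first two points could look roughly like the sketch below. This is a minimal sketch assuming only the Python standard library; ROBOTS_TXT_BLACKLIST, USER_AGENT and robots_txt_allows are hypothetical names for this sketch, not the actual euscan code.

import urllib.robotparser
from urllib.parse import urlparse

# Hosts whose robots.txt is deliberately ignored: sourceforge forbids
# generic crawlers, but we *want* to scan it for new upstream versions.
# ROBOTS_TXT_BLACKLIST is a hypothetical name used only in this sketch.
ROBOTS_TXT_BLACKLIST = ("sourceforge.net",)

# A user-agent that identifies the scanner instead of mimicking a browser.
USER_AGENT = "euscan"

def robots_txt_allows(url, user_agent=USER_AGENT):
    """Return True if robots.txt permits fetching url, or if the host is
    blacklisted, in which case robots.txt is ignored on purpose."""
    host = urlparse(url).hostname or ""
    if host.endswith(ROBOTS_TXT_BLACKLIST):
        return True  # blacklisted host: scan regardless of robots.txt
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://%s/robots.txt" % host)
    try:
        rp.read()
    except OSError:  # an unreachable robots.txt should not block the scan
        return True
    return rp.can_fetch(user_agent, url)

Matching on the host suffix rather than the full URL keeps sourceforge's many project subdomains covered by a single blacklist entry.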
@@ -172,6 +172,10 @@ def brute_force(cpv, url):
 
         result.append([url, version])
 
+    if len(result) > CONFIG['brute-force-false-watermark']:
+        output.einfo("Broken server detected ! Skipping brute force.")
+        return []
+
     if CONFIG["brute-force-recursive"]:
         for v in helpers.gen_versions(list(components), CONFIG["brute-force"]):
             if v not in versions and tuple(v) not in done: