Every site in the world!

03/29/2006 07:02 am
I was downloading a site that kept lots of files on different servers. So I set it to basically spider without restrictions, figuring I'd catch the external links. I also added "link" and "links" to the HTML filtering page.

But I wound up blocking site after site after site. I must have added a thousand URLs to "Skip the following URLs while downloading" and there's no end in sight. What should I do?
Oleg Chernavin
03/29/2006 07:02 am
Maybe do the following: restrict downloading to the starting server, and then allow up to X levels on other servers?
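
The idea behind that suggestion can be sketched as a crawl rule: follow links on the starting server without limit, but once a link leaves that server, only go X hops further before stopping. This is not Offline Explorer's actual code, just an illustrative sketch of the logic; the toy link graph and function names are made up for the example.

```python
from collections import deque
from urllib.parse import urlparse

def crawl(start, get_links, external_levels=1):
    """Breadth-first crawl: unlimited depth on the starting server,
    but at most `external_levels` hops once we leave it."""
    start_host = urlparse(start).netloc
    seen = {start}
    queue = deque([(start, 0)])  # (url, hops taken away from the starting server)
    order = []
    while queue:
        url, ext_depth = queue.popleft()
        order.append(url)
        for link in get_links(url):
            if link in seen:
                continue
            if urlparse(link).netloc == start_host:
                next_depth = 0  # back on the starting server: counter resets
            else:
                next_depth = ext_depth + 1
                if next_depth > external_levels:
                    continue  # too many levels off-site, skip it
            seen.add(link)
            queue.append((link, next_depth))
    return order

# Hypothetical link graph standing in for real pages:
graph = {
    "http://a.com/":   ["http://a.com/p1", "http://b.com/x"],
    "http://a.com/p1": ["http://c.com/y"],
    "http://b.com/x":  ["http://d.com/z"],  # two levels off-site: skipped
    "http://c.com/y":  [],
    "http://d.com/z":  [],
}
downloaded = crawl("http://a.com/", lambda u: graph.get(u, []), external_levels=1)
```

With `external_levels=1`, the crawl picks up the off-site pages linked directly from a.com but never follows links from those off-site pages onward, so no endless chain of external sites to skip by hand.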