Some websites randomly downloading wikipedia
|Jerry||06/02/2009 04:52 am|
|So here are the 2 websites I am trying to download.
Towards the end of downloading either of them, I notice that OE starts downloading random pages and media from Wikipedia and Wikimedia. It just keeps going and going, and eventually it tries to download every wiki-related main site, even the non-English ones. As far as I know, there are no links of any kind on either of these sites to any wiki site. I tried adding "http://*wiki*" to the URL filter settings and hoped that would stop the download, but it didn't. What can I do to stop downloading from the wiki sites?
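As an aside, a glob-style pattern like "http://*wiki*" would normally be expected to match any URL containing "wiki" in it. A minimal Python sketch of that kind of wildcard matching (illustrative only; the sample URLs are made up, and Offline Explorer's actual filter semantics may differ):

```python
from fnmatch import fnmatch

# Glob-style matching: '*' matches any run of characters.
pattern = "http://*wiki*"

print(fnmatch("http://en.wikipedia.org/wiki/Main_Page", pattern))  # True
print(fnmatch("http://www.infiltration.org/", pattern))            # False
```

So the pattern itself looks right for catching wiki URLs; the question is where in OE's settings it takes effect.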
|Oleg Chernavin||06/02/2009 05:46 am|
|The easiest way is to allow downloading only from the starting server in URL Filters - Server section. Also, please see if File Filters categories use "Load using URL Filters settings" in the Location box.
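Conceptually, restricting a download to the starting server means comparing each candidate URL's hostname against the hostname of the starting URL. A minimal sketch of that check (illustrative only, with made-up example URLs; this is not Offline Explorer's actual code):

```python
from urllib.parse import urlparse

def same_server(start_url: str, candidate_url: str) -> bool:
    # Keep a URL only if its hostname matches the starting URL's hostname.
    return urlparse(start_url).hostname == urlparse(candidate_url).hostname

start = "http://www.example.com/index.html"
print(same_server(start, "http://www.example.com/page2.html"))        # True: kept
print(same_server(start, "http://en.wikipedia.org/wiki/Main_Page"))   # False: skipped
```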
|Jerry||06/02/2009 11:33 pm|
|> The easiest way is to allow downloading only from the starting server in URL Filters - Server section. Also, please see if File Filters categories use "Load using URL Filters settings" in the Location box.
> Best regards,
> Oleg Chernavin
> MP Staff
Now I am having a slightly different problem. I want to download http://www.infiltration.org/, but when I do, it also downloads http://infiltration.org/, which I do not need since it has the same data. So I then tell it to download only from the server http://www.infiltration.org/, yet it still downloads http://infiltration.org/.
|Oleg Chernavin||06/03/2009 07:17 am|
|Please open the Properties dialog - Parsing section, click URL Substitutes and add the rule:
This will make the download correct with no duplicates.
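The idea behind a URL substitution rule is to rewrite one hostname to another before downloading, so both spellings of the site collapse into one copy. A minimal sketch of that rewrite, using the hostnames from this thread (illustrative only; OE's URL Substitutes dialog does this internally with its own rule syntax):

```python
from urllib.parse import urlparse, urlunparse

def substitute(url: str, old_host: str, new_host: str) -> str:
    # Rewrite the hostname if it matches, so duplicate mirrors
    # like infiltration.org / www.infiltration.org become one server.
    parts = urlparse(url)
    if parts.hostname == old_host:
        parts = parts._replace(netloc=new_host)
    return urlunparse(parts)

print(substitute("http://infiltration.org/mission.html",
                 "infiltration.org", "www.infiltration.org"))
# -> http://www.infiltration.org/mission.html
```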