How do I get rid of duplicate downloaded files?

James_K
07/10/2009 03:36 pm
As I am downloading or after I have downloaded a website, can Offline Explorer Pro search for duplicate files, keep just one copy, and change the links to point to that file?
Oleg Chernavin
07/10/2009 07:39 pm
This is not implemented yet. Can you give me a few examples of duplicate files with different URLs? I will see if there is a workaround.

Thank you!

Best regards,
Oleg Chernavin
MP Staff
James_K
07/10/2009 07:57 pm
Sometimes I will download http://www.website.com/ and it will also download http://website.com/, which is a duplicate site. I know there is a URL filter for that kind of situation, but I would not know beforehand whether a site will behave that way, and I would have to go through every directory of every site I download to double-check, then re-download if necessary.
Oleg Chernavin
07/11/2009 05:38 am
Please open Project Properties - Parsing - URL Substitutes and add the rule:

URL:
http://website.com/
Replace:
http://website.com/*
With:
http://www.website.com/*

This will get rid of all duplicates when you download your site.
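For anyone curious what such a rule does, here is a minimal Python sketch of a trailing-`*` prefix substitution of this kind. This is only an illustration of the general idea, not Offline Explorer's actual code, and the assumption that `*` simply matches the rest of the URL is mine:

```python
def substitute_url(url, replace_pattern, with_pattern):
    """Rewrite url if it matches replace_pattern; a trailing '*' matches any tail."""
    prefix = replace_pattern.rstrip("*")
    target = with_pattern.rstrip("*")
    if url.startswith(prefix):
        # Keep the part matched by '*' and graft it onto the new prefix.
        return target + url[len(prefix):]
    return url  # non-matching URLs pass through unchanged

print(substitute_url("http://website.com/images/logo.png",
                     "http://website.com/*",
                     "http://www.website.com/*"))
# -> http://www.website.com/images/logo.png
```

Note that URLs already starting with `http://www.website.com/` do not match the `http://website.com/*` prefix, so they are left alone.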

Oleg.