Suggestion: Flatten Structures

Author Message
Thomas Kunka 03/29/2006 07:01 am
It would be nice to have an option to ignore the directory structure of the website and dump everything into one project directory. Some sites have "changing" paths for security, so as the program crawls through the directories, it ends up downloading a duplicate set of files in another directory, and Offline Explorer Pro can't tell that the files are the same, even with the options selected not to download duplicate files.
Oleg Chernavin 03/29/2006 07:01 am
Thank you for the suggestion. I see what you mean, but the only way to make a flat directory is via the File | Export feature. Keeping the original directories is important for Offline Explorer, because it restores the original URLs from the filenames when updating a Project.

Best regards,
Oleg Chernavin
MetaProducts corp.
Thomas Kunka 03/29/2006 07:01 am
Yeah, most of the time you would do it as you suggest, but there are cases where the dynamically changing directories make the entire download pointless: it will repeatedly run through the whole site and never finish. You could keep the paths unchanged in the download list but dump all the files into one directory when they are saved, and perhaps show a message bar indicating that the same files appear to be downloaded again because they already exist, to give the user an idea whether they are in some sort of loop. Maybe also have an alternative naming schema for when the size or other descriptors match, so if there are different files with the same name in different paths, you would get them all and the program would not think they are duplicates.
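The alternative naming schema suggested above could work by deriving each flat filename from a short hash of the file's original directory path, so that same-named files from different directories never collide and the original path stays recoverable in principle. A minimal sketch in Python (the function `flat_name` is hypothetical, not part of Offline Explorer):

```python
import hashlib
from urllib.parse import urlparse

def flat_name(url: str) -> str:
    """Map a URL to a collision-safe flat filename by prefixing a
    short hash of its directory path, so 'a/page.html' and
    'b/page.html' get distinct names in one directory."""
    path = urlparse(url).path
    directory, _, filename = path.rpartition("/")
    prefix = hashlib.sha1(directory.encode()).hexdigest()[:8]
    return f"{prefix}_{filename or 'index.html'}"
```

For example, `flat_name("http://example.com/a/page.html")` and `flat_name("http://example.com/b/page.html")` both end in `_page.html` but carry different prefixes, so neither overwrites the other when saved flat.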

tk


> Thank you for the suggestion. I see what you mean, but the only way to make a flat directory is via the File | Export feature. Keeping the original directories is important for Offline Explorer, because it restores the original URLs from the filenames when updating a Project.
>
> Best regards,
> Oleg Chernavin
> MetaProducts corp.
Oleg Chernavin 03/29/2006 07:01 am
The biggest problem is that many sites use similar filenames for different files, so it would be really hard to tell whether the files actually differ. Comparing each file's contents against multiple others would greatly slow down the download.
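Full pairwise content comparison is indeed expensive, but hashing each downloaded file reduces the duplicate check to a single dictionary lookup per file. A minimal sketch of the idea (the `DedupStore` class is hypothetical, not Offline Explorer's actual implementation):

```python
import hashlib

class DedupStore:
    """Keep at most one copy per unique file content, regardless of
    which directory path the content was downloaded from."""

    def __init__(self):
        self.seen = {}  # SHA-256 digest -> path of the first copy kept

    def add(self, path: str, data: bytes) -> bool:
        """Return True if this content is new and should be saved,
        False if an identical file was already stored."""
        digest = hashlib.sha256(data).hexdigest()
        if digest in self.seen:
            return False  # duplicate content under a different path
        self.seen[digest] = path
        return True
```

Two files with the same digest are almost certainly identical, so the crawler could skip the second copy (or count it, to drive the loop-warning message suggested earlier) without ever comparing file contents byte by byte.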

Maybe you could give me an example of a site that changes its directories all the time, and I will try to come up with a solution?

Oleg.