Website download running very long and very large
|Kevin||12/16/2015 10:50 am|
|I am archiving a website for a client using Offline Explorer Pro.
The download has been running for days and queued files are in the hundreds of thousands.
I have URL Filters > Server set to "Load files only within the starting server", and URL Filters > Directories set to "Load only files from the starting directory and below".
Is there any way to perform a "list only" test to see how large a project is without performing the actual downloads?
Is there any guide to streamlining a download, or best practices for Offline Explorer Pro projects? The docs are a little lacking.
I mostly want to ensure that I am not causing myself heartburn on this download.
|Oleg Chernavin||12/16/2015 08:37 pm|
|Yes, it is possible to "list" a website - there is a "Build Site Map" mode of the Download button (the drop-down arrow below it). However, it is not that useful, because it simply downloads the website without images and video files. The downloaded files are not saved; they are only displayed in the Project Map tab.
You would later have to download everything again.
I would suggest pausing the download and using the Queue tab to examine what kinds of URLs are listed there. Perhaps the Project started to download HTML forms, where there can be hundreds of possible control combinations.
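A quick sketch of why forms inflate a queue: each independent form control multiplies the number of distinct URLs a crawler can generate. The control names and values below are purely hypothetical, just to show the arithmetic:

```python
from itertools import product

# Hypothetical form controls on a single results page (names and values
# are assumptions for illustration; real sites may have far more).
form_controls = {
    "sort": ["name", "date", "size"],
    "page": [str(n) for n in range(1, 11)],
    "view": ["list", "grid"],
}

# A crawler that submits every combination generates one URL per combination.
urls = [
    "http://example.com/results?"
    + "&".join(f"{k}={v}" for k, v in zip(form_controls, combo))
    for combo in product(*form_controls.values())
]

print(len(urls))  # 3 * 10 * 2 = 60 distinct URLs from just three controls
```

With three small controls that is already 60 queued URLs; a site-wide search form or calendar widget can push this into the hundreds of thousands, which matches the queue growth described above.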
Or the download could have gone to external websites and started pulling in much more than you wanted.
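To see what those two URL Filters are meant to do, here is a rough sketch of the scoping check in Python. This is only an approximation of the intended behavior under an assumed starting URL, not Offline Explorer's actual implementation:

```python
from urllib.parse import urlparse

START = "http://example.com/archive/index.html"  # hypothetical starting URL

def within_scope(url: str, start: str = START) -> bool:
    """Approximate the two filters: same server as the starting URL,
    and a path under the starting directory (a sketch, not the
    program's real logic)."""
    s, u = urlparse(start), urlparse(url)
    start_dir = s.path.rsplit("/", 1)[0] + "/"  # e.g. "/archive/"
    return u.netloc == s.netloc and u.path.startswith(start_dir)

print(within_scope("http://example.com/archive/sub/page.html"))  # True
print(within_scope("http://other.com/archive/page.html"))        # False (other server)
print(within_scope("http://example.com/blog/post.html"))         # False (outside directory)
```

If the Queue tab shows many URLs that a check like this would reject, the filters are probably not configured the way you intended.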
All sites are different, so it is hard to give universal recommendations. We will have more predefined Projects and tasks in version 7.0 of Offline Explorer, and the help file there is completely rewritten.
You may send me your Project settings - perhaps I will have thoughts on what to improve. Select the Project, press Ctrl+C on the keyboard, and paste it into an e-mail message.