I am downloading a very large site that has many external links (I don't want anything external), so I set URL Omissions for the links I could find (and tested them — that works), and I checked ONLY images, then only jpg and jpeg (that works too).
Checked "Load only files in the starting server" (seems OK).
Set the level limit to 10 (I think you can get anywhere on the site in fewer than 10 clicks).
It ultimately queues up over 2,100,000 files — I believe that is the number of HTML files it will have to evaluate, because when it reached the 400,000 mark it had actually saved 50,603 pictures. The problem is that this is such a long process — can I have save points in case the PC bombs? For example, last night I think I got logged out (the site might time out long sessions) and the program shut down with no error message, nothing. When I went back in, thinking I could resume, I basically had to start over, so it is again running like crazy. Is there any way to suspend and say, "save everything here, so if the PC bombs out I can restart from this point"?
Regarding the download: in most cases it is better to use the URL Filters - Server and Directory sections together with the "Load only from the starting..." option. It is often also helpful to use keyword filtering to specify exactly which links you want to get.
If you have to repeat this download in the future, you can send me more details about the site and what you want to load, and I will advise you on the Project settings.
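As a rough illustration of the same set of restrictions (stay on the starting server, keep only jpg/jpeg, limit depth to 10, skip excluded paths, and survive an interrupted run), here is a sketch using GNU wget. This is not the program's own syntax — the URL and the excluded directory are placeholders, not taken from this thread.

```shell
# Sketch of the equivalent limits with GNU wget (placeholder URL and paths):
#   --recursive --level=10    follow links up to 10 clicks deep
#   --domains example.com     stay on the starting server; external links are skipped
#   --accept jpg,jpeg         keep only .jpg/.jpeg files (HTML pages are still
#                             fetched to find links, then deleted)
#   --exclude-directories     rough analogue of URL Omissions
#   --no-clobber              skip files already on disk, so an interrupted
#                             download can be restarted without re-saving everything
wget --recursive --level=10 --domains example.com \
     --accept 'jpg,jpeg' --exclude-directories='/ads' \
     --no-clobber https://example.com/
```

The `--no-clobber` flag is what gives the crude "save point" behavior asked about above: files already on disk are not fetched again, so restarting the command after a crash continues roughly where it left off.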