downloading a huge web site with recursive links

Charles Whittle 09/14/2006 03:27 pm
Greetings!

I am thinking about buying Offline Explorer and would like to know if it can handle
a really huge web site with many recursive links. I collect photos for my screen saver and
have found several sites that have some great shots for free:

http://community.webshots.com
http://flickr.com
http://photobucket.com

to name just a few. I have been using Aaron's WebVacuum, but it gets bogged down after
going into one of these sites for 4 or 5 levels from the starting URL. Nearly every page has
links that go back to the home page as well as onward to more pages with more pictures.
As you might imagine, the number of links in the queue grows rapidly; the WebShots site
has more than 400 million photos! My computer has a 3.3 GHz processor with 2.5 gigabytes
of RAM along with about 2 terabytes of online storage. How should Offline Explorer be
configured? Would another one of your products be better suited for this task? Thanks!
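Just to be concrete about the behavior I'm hoping for, this is roughly the kind of crawl I'd expect under the hood: a visited-URL set plus a level limit, so the links that point back to the home page never re-enter the queue. This is only a rough Python sketch of the general idea, not any particular product's logic; get_links here is a made-up placeholder for whatever extracts links from a downloaded page.

    from collections import deque

    def crawl(start_url, get_links, max_level=5):
        # Breadth-first crawl limited to max_level hops from the start URL.
        seen = {start_url}
        queue = deque([(start_url, 0)])
        while queue:
            url, level = queue.popleft()
            # ... download and save the page at `url` here ...
            if level >= max_level:
                continue
            for link in get_links(url):
                if link not in seen:        # dedup keeps back-links from piling up
                    seen.add(link)
                    queue.append((link, level + 1))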

Aloha,
Charles Whittle
Oleg Chernavin 09/15/2006 04:14 am
Downloading such huge sites is not an easy task. I would suggest Offline Explorer Enterprise for that, because it is optimized to use less memory. Also, enable directory overload protection in the Options dialog - File Locations section.
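The idea behind that setting is to keep any single folder from accumulating millions of saved files by spreading them across subdirectories. The snippet below is only an illustration of that general technique, not Offline Explorer's actual implementation; the function name and the 1000-files-per-folder figure are made up for the example.

    import os

    def bucketed_path(root, filename, counter, files_per_dir=1000):
        # Put every block of 1000 files into its own numbered subfolder,
        # e.g. file number 123456 goes into <root>/123/.
        bucket = counter // files_per_dir
        folder = os.path.join(root, str(bucket))
        os.makedirs(folder, exist_ok=True)
        return os.path.join(folder, filename)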

Best regards,
Oleg Chernavin
MP Staff