Your product and the download mystery

Author Message
Greg 03/29/2006 07:01 am
Hi

I've tried your product before with limited success, either because the older versions had fewer features or because I didn't know how to use it correctly. I'm beginning to suspect the latter.

I initially tried to download a site and got only a handful of images from the first page. My thinking was that if I wanted to download only the images of a site, not the HTML thumbnails, I would adjust the settings so those wouldn't be downloaded, such as unchecking htm and html in the text list. But then I read the forums and got the impression that to be sure you get all the files you want from a site, the safest thing is to have no filters at all. Is that correct? Because when I did that, the site downloaded. It's a site that uses image maps for links, which stumps most of your competitors, but OE figured it out.

And is it also correct that your program will not simply dump all files into one folder without subdirectories? After using your program to download a site, should a user go through and delete the files they don't want, then copy and paste the files into a single folder if they just want an easy way to view images? And if so, does that cause the entire site to be downloaded again the next time one uses OE on the same site? I hope I'm making sense. Please correct me if I'm wrong.

I'm almost sold on the product. Next I'm going to try a site that stumps every spider/downloader I've tried. It is an adult site; most of the sites I download are bondage/fetish-oriented. Is it okay to ask for help with it, or would you prefer I didn't? If I can see that it will download even this site, I'll buy it.
Oleg Chernavin 03/29/2006 07:01 am
Greg,

Thank you for asking.

1. Yes, if you have no Filters, then Offline Explorer will get any link it sees. This is safe, but you risk getting the whole Internet's contents onto your computer. What I myself do quite often is start downloading with no filters and a quite high Level, but when the first URL starts loading I hit the F9 key immediately. Once the first URL is loaded, I go to the Queue tab to see which URLs Offline Explorer is going to download. I simply abort the URLs I don't want and let the download continue (F9 again). I can also hit F9 again to let another 10 URLs load and then see what other addresses are in the queue.

From what Offline Explorer finds, it is easy to set up Filters to exclude unwanted URLs.
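The idea of pruning a download queue against exclusion patterns can be sketched in Python. This is only an illustration of the concept, not Offline Explorer's actual filter logic; the URLs and patterns below are made up for the example.

```python
from fnmatch import fnmatch

# Hypothetical exclusion patterns, in the spirit of Offline Explorer's
# URL Filters (an analogy only, not the product's real syntax).
exclude = ["*/ads/*", "*.yahoo.com/*"]

# A pretend download queue, like the one shown on the Queue tab.
queue = [
    "http://example.com/gallery/photo1.jpg",
    "http://ads.example.com/ads/banner.gif",
    "http://www.yahoo.com/index.html",
]

# Keep only URLs that match none of the exclusion patterns.
kept = [u for u in queue if not any(fnmatch(u, p) for p in exclude)]
print(kept)  # only the gallery image survives
```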

2. Regarding saving all files in one folder: yes, Offline Explorer creates subdirectories matching the original site. But you can use File | Export to place them all in a flat directory. All links in the Web pages will be corrected, so the site will still be browsable offline.

3. You can repeat downloads while keeping the downloaded files, and use Export to copy them to another folder. This does waste some disk space. Alternatively, you can use URL Filters to exclude certain files you don't want to download anymore.

4. Please feel free to ask questions. I am glad to help!

Best regards,
Oleg Chernavin
MetaProducts corp.
Greg 03/29/2006 07:01 am

So let me ask you this. If I press F9 and then abort URLs from the queue, does that not carry over for future use? In other words, those same URLs would load next time, right? Do you mean "disable from loading"? And is it better to have URLs in "disable from loading" or in the server filters' exclude list?

And what if I just want to make sure some sites never load, like any URL with "yahoo" in it? Yahoo has many domains, so I just want to block yahoo dot anything. And what about real.com?
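The kind of hostname matching this question implies, blocking yahoo under any top-level domain plus real.com, can be sketched as follows. This is a generic illustration of the matching problem, not how Offline Explorer's server filters are implemented.

```python
import re
from urllib.parse import urlparse

# Block yahoo.<anything> (including subdomains and multi-part TLDs
# like yahoo.co.uk) and real.com literally. The boundary (^|\.)
# prevents false hits such as "notyahoo.com" or "surreal.com".
blocked = re.compile(r"(^|\.)yahoo\.[a-z.]+$|(^|\.)real\.com$")

def is_blocked(url: str) -> bool:
    host = urlparse(url).hostname or ""
    return bool(blocked.search(host))

print(is_blocked("http://www.yahoo.co.uk/"))      # True: cross-TLD Yahoo caught
print(is_blocked("http://media.real.com/x"))      # True: real.com subdomain caught
print(is_blocked("http://example.com/yahoo.html"))  # False: "yahoo" in path, not host
```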

And let me make sure I have this part right. If I uncheck htm and html files in the filter, that would not work well, because the spider wouldn't be going through web pages to find images and other content, right?

And finally, I'd like to give you a site, plus my username/password. This site seems to outwit every spider, but maybe you can impress me with your skills, if you would be so kind. The site uses Java buttons, some sort of scheme with multiple IP addresses between the webpage and the images, .cfm files instead of .html files, and both a popup and an HTML login; very good security.
Greg 03/29/2006 07:01 am
And what types of filters would prevent links pages from being accessed? Something like excluding the following: /links/, link*.htm*? (Would the asterisks make sure to catch link.htm and links.htm, and both names with an html extension?)
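Greg's guess about the wildcard can be checked with fnmatch-style matching, which is used here only as a stand-in; Offline Explorer's own filter syntax may differ.

```python
from fnmatch import fnmatch

# Does "link*.htm*" catch all four filename variants Greg lists?
pattern = "link*.htm*"
names = ["link.htm", "links.htm", "link.html", "links.html", "blink.htm"]
matches = [n for n in names if fnmatch(n, pattern)]
print(matches)  # all four, but not "blink.htm" (pattern is anchored at "link")
```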

Also, it seems my disabling and excluding wound up downloading each linked website as a folder with one file in it. Why was that?
Oleg Chernavin 03/29/2006 07:01 am
OK. I got your e-mail and have already answered it.

Oleg.