However, I got many more files than I actually wanted (about 2,000 or 3,000 too many).
Is there a way for me to enter multiple URLs at one time (about 27,000) and have the program download only those specific pages and nothing else? I have all the URLs in a text document, or listed as an .html page.
You can enter the following URL in the program:
{:file=c:\path\file.txt}
Specify the correct path and filename of the text file with the links, with one URL per line in the file.
Set the Level setting to 0 so that only the listed pages are downloaded and their links are not followed.
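For anyone wanting to do the same thing outside the program, the steps above can be sketched as a small script: read one URL per line from a text file and fetch only that page, following none of its links (the equivalent of Level 0). The file and directory names here are placeholders, not anything from this thread:

```python
# Sketch: download only the pages listed in a text file (one URL per
# line), without following any links on those pages.
from pathlib import Path
from urllib.parse import quote, urlparse
from urllib.request import urlopen

def download_list(list_file: str, out_dir: str = "pages") -> None:
    Path(out_dir).mkdir(exist_ok=True)
    for line in Path(list_file).read_text().splitlines():
        url = line.strip()
        if not url:
            continue  # skip blank lines in the list file
        # Fetch the page itself and nothing else (no link following).
        with urlopen(url) as resp:
            data = resp.read()
        # Derive a safe local filename from the URL path.
        name = quote(urlparse(url).path.strip("/") or "index", safe="") + ".html"
        Path(out_dir, name).write_bytes(data)
```

With ~27,000 URLs you would likely also want retries and a polite delay between requests, which are omitted here for brevity.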
Best regards,
Oleg Chernavin
MP Staff
I intend to begin the download this coming weekend. I own the website; it runs a vBulletin forum that I personally use as a "searchable" news-aggregator database. I intend to download each thread as text-only .html.
If problems arise at time of download, I will return here and explain.
Merci Buckets!
Don't let your mouse byte yuh!
Oleg.