Getting complete list of URLs

Scratch
03/04/2006 09:44 pm
Hi, I'm wondering if there's a way to get a complete list of URLs from a site WITHOUT actually downloading any of it? In other words, maybe queue it completely before downloading? Is this possible?

Any suggestions would be appreciated!

Scratch
Scratch
03/04/2006 10:09 pm
Duh. Never mind!
Scratch
03/05/2006 02:08 am
Actually, I retract the duh. I'd like some help after all. ;-)
Oleg Chernavin
03/06/2006 04:20 am
Well, this is a problem, because to get all links from a site, you have to load all Web pages and many other files (scripts, Flash, styles, Java classes, etc.) first. I would suggest loading the site without images. Then enable images and use Ctrl+F5 to get the missing files. Then press F9 to pause, wait for the parsing process to complete, and copy the links that still have to be loaded from the Queue tab.

Best regards,
Oleg Chernavin
MP Staff
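To see why the pages themselves must be downloaded before a full URL list exists, here is a minimal Python sketch of the underlying idea: a page's links only become known once its HTML has been fetched and parsed, but the referenced files (images, scripts, etc.) can be recorded without being downloaded. This is only an illustration of the general technique, not Offline Explorer's actual code, and the example page and URLs below are made up.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Records every URL a page references, without fetching any of them."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        # href covers <a> and <link>; src covers <img> and <script>.
        for name, value in attrs:
            if name in ("href", "src") and value:
                # Resolve relative references against the page's own URL.
                self.links.add(urljoin(self.base_url, value))

# A made-up page standing in for one downloaded HTML file.
page = """<html><body>
<a href="/about.html">About</a>
<img src="logo.png">
<script src="/js/app.js"></script>
</body></html>"""

collector = LinkCollector("http://example.com/")
collector.feed(page)
print(sorted(collector.links))
```

A crawler that "queues before downloading" would repeat this for every HTML page it discovers, fetching only pages (e.g. by checking the Content-Type) while merely recording the other URLs it finds, which is essentially what the load-without-images approach above achieves.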
Scratch
03/06/2006 09:57 am
Okay, Oleg. Thank you for responding. :-)

scratch


> Well, this is a problem, because to get all links from a site, you have to load all Web pages and many other files (scripts, Flash, styles, Java classes, etc.) first. I would suggest loading the site without images. Then enable images and use Ctrl+F5 to get the missing files. Then press F9 to pause, wait for the parsing process to complete, and copy the links that still have to be loaded from the Queue tab.
>
> Best regards,
> Oleg Chernavin
> MP Staff
Oleg Chernavin
03/07/2006 05:37 am
You are welcome!

Oleg.
Fred
03/08/2006 08:18 pm
I just used the "New Project Wizard" and, on the last page, chose the "Generate site map" radio button. Only first-level pages were downloaded, and a map was provided for the next level(s).

I have yet to try the advanced settings.

I hope this helps.

-- Fred