Hi, I'm wondering if there's a way to get a complete list of URLs from a site WITHOUT actually downloading any of it? In other words, maybe queue it completely before downloading? Is this possible?
Any suggestions would be appreciated!
Scratch
Actually, I retract the duh. I'd like some help after all. ;-)
Well, this is a problem, because to get all links from a site, you have to load all Web pages and many other files (scripts, Flash, styles, Java classes, etc.) first. I would suggest loading the site without images. Then enable images and use Ctrl+F5 to get the missing files. Then press F9 to pause, wait for the parsing process to complete, and then you can copy the links that still have to be loaded from the Queue tab.
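To illustrate the point outside of Offline Explorer: a link list can only be built by parsing the HTML pages, but the non-HTML files never need to be read. Here is a rough standard-library Python sketch of that idea; the start URL and the extension list are placeholders, and this is only an illustration, not Offline Explorer's own code:

import urllib.parse
import urllib.request
from html.parser import HTMLParser

START = "http://example.com/"  # placeholder start page
SKIP = (".jpg", ".jpeg", ".png", ".gif", ".css", ".js", ".swf", ".class")

class LinkParser(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start):
    host = urllib.parse.urlparse(start).netloc
    seen, queue = set(), [start]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        # Record images/scripts/etc. in the list without fetching them.
        if url.lower().endswith(SKIP):
            continue
        try:
            with urllib.request.urlopen(url) as resp:
                # Only read (download) the body for HTML pages.
                if "text/html" not in (resp.headers.get("Content-Type") or ""):
                    continue
                parser = LinkParser()
                parser.feed(resp.read().decode("utf-8", errors="replace"))
        except Exception:
            continue
        for link in parser.links:
            absolute = urllib.parse.urljoin(url, link).split("#")[0]
            if urllib.parse.urlparse(absolute).netloc == host:
                queue.append(absolute)
    return sorted(seen)

if __name__ == "__main__":
    for u in crawl(START):
        print(u)

Running it prints every same-host URL that was discovered, including the skipped binaries, which is roughly what you see on the Queue tab after parsing finishes.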
Best regards,
Oleg Chernavin
MP Staff
Okay, Oleg. Thank you for responding. :-)
scratch
> Well, this is a problem, because to get all links from a site, you have to load all Web pages and many other files (scripts, Flash, styles, Java classes, etc.) first. I would suggest loading the site without images. Then enable images and use Ctrl+F5 to get the missing files. Then press F9 to pause, wait for the parsing process to complete, and then you can copy the links that still have to be loaded from the Queue tab.
>
> Best regards,
> Oleg Chernavin
> MP Staff
I just used the "New Project Wizard" and, on the last page, chose the "Generate site map" radio button. Only first-level pages were downloaded, and a map was provided for the next level(s).
I have yet to try the advanced settings.
I hope this helps.
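One more thought, separate from the wizard: if the site happens to publish a sitemap.xml, a short script can list its URLs with no crawling at all. A rough Python sketch (the URL is a placeholder, and sites without a sitemap won't work this way):

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "http://example.com/sitemap.xml"  # placeholder URL
LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

with urllib.request.urlopen(SITEMAP) as resp:
    tree = ET.parse(resp)

# A <urlset> lists pages directly; a <sitemapindex> lists more
# sitemaps, each of which can be fetched the same way.
for loc in tree.iter(LOC):
    print(loc.text)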
-- Fred