Save only the queue URLs

Author Message
userChris 10/06/2010 10:51 pm
Using the log, I can save all the URLs of the downloaded files to a text file, but the URLs that have already been parsed and are still in the queue are not saved.

Is there a way to save the URLs that are in the queue?
Oleg Chernavin 10/07/2010 07:47 am
This is possible in the Offline Explorer Enterprise edition - download the site and use the Tools - Make Google SiteMap button.
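For reference, a Google SiteMap follows the standard sitemaps.org XML protocol; a minimal sketch of what such a file looks like (the URLs and dates here are placeholders, not output from the program):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page of the site -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-10-07</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/page.html</loc>
  </url>
</urlset>
```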

Best regards,
Oleg Chernavin
MP Staff
userChris 10/07/2010 01:14 pm
Understood, and I appreciate the help, but I want to save the site's URLs to a text file without having to download the files, saving only the URLs of the site.

I would like to save the URL addresses shown in the queue. With the log I can save the downloaded URLs to a text file, but I would like to do the same with the URLs that are in the queue, without downloading any pages beyond those URLs. The site I plan to download is very large, which is why I am asking.

Thank you for your attention.
Oleg Chernavin 10/07/2010 02:16 pm
You can use Download - Site Map first - to get only HTML files, plus links to images and other files that contain no links. This way, the download will be minimal.

If you need the list of links from only one page, not the whole site, then create a Project with that page's URL, start its download and press F9 immediately. Then use the Queue tab to save the list of links to a text file.

userChris 10/08/2010 11:29 am
I wish I could save only the queue to a text file, without downloading any pages: just parse the pages and save the URLs from the queue.

If there is a way to download even less than the site's HTML, or to pause or delay the file downloads while continuing to parse the files in the queue, please guide me.

As a suggestion for the next version: an option to map the site using only its URLs.

Oleg Chernavin 10/08/2010 11:31 am
This feature already exists: Download - Site Map.

It is impossible to make a site map without downloading the site, or at least all of its pages, because the pages contain the links to all of the site's images, files and pages. No program can guess what links a page contains without downloading that page first.

userChris 10/08/2010 04:49 pm
Understood, I appreciate the help.
Oleg Chernavin 10/09/2010 04:04 pm
You are welcome!