Is it possible to save the URLs that are in the queue?
I would like to save the URL addresses shown in the queue. With the Log I can save the downloaded URLs to a text file, but I would like to do the same with the URLs that are still in the queue, without downloading any pages beyond the starting URL. The site I plan to download is very large, which is why I am asking.
Thank you for your attention.
If you need the list of links from one page only, not from the whole site, create a Project with that page's URL, start the download, and press F9 immediately. Then use the Queue tab to save the list of links to a text file.
If there is a way to download files even smaller than the site's HTML, or to pause or delay the download of files while still being able to review the queue, please guide me.
A suggestion for the next version: add an option to map the site using only its URLs, without downloading the pages.
It is impossible to build a site map without downloading the site, or at least all of its pages, because the pages themselves contain the links to all of the site's images, files, and other pages. No program can guess what links a page contains without downloading that page first.
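To illustrate why this is so, here is a minimal sketch in Python's standard library (not how any particular downloader is implemented): the links on a page exist only inside its HTML, so they can be extracted only after that HTML has been fetched.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href/src attribute values found in a page's HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

# Hypothetical page content -- a crawler only sees this after downloading:
html = '<a href="/page2.html">next</a> <img src="logo.png">'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # → ['/page2.html', 'logo.png']
```

A crawler repeats this step for every discovered page, which is why mapping a site necessarily means downloading at least all of its HTML pages.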