|Laura||05/20/2009 06:19 pm|
|Hi, I'm evaluating the Offline Explorer Pro tool. Can I use this software to generate a list of external links on a website?
What I'd like to do, for example, is download a website (a blog, say) and obtain a list of all the outbound links on that site (to other blogs, news articles, etc.). It looks like I can download and browse the outbound links using this software.
I'd also like to be able to copy and paste a list of all those external links into a text file or spreadsheet. Is that possible with this software, and if so, how would I do it?
(I looked into TextPipe Pro for "data mining", but the price tag is very high for what seems to me to be a straightforward task.)
Thanks for your help!
|Oleg Chernavin||05/21/2009 06:15 am|
|I think the following hint will work. First, download the Project with the "Load from the starting server only" setting. When the download is complete, change its Properties - URL Filters - Server section: check the "Load up to 1 links on other servers" setting. Click the OK button, press Ctrl+F5 to get the missing files, and then press F9 to pause.
Please wait for the parsing process (shown in the Status bar) to complete - Offline Explorer will go through all the loaded files and collect the external links to be loaded. Then switch to the Queue tab, where you will see all the collected links. They will not be loaded because of the pause mode (F9).
You may press Ctrl+A to select all the links, right-click them and choose Copy URL, then paste into Notepad or another application. Alternatively, skip the selection step and simply use "Save to text file" in the context menu of the Queue.
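If you ever want to do the same thing outside Offline Explorer, the underlying idea - scan the downloaded HTML files and keep only links whose host differs from the site's own - can be sketched in a few lines. This is just an illustrative stand-alone example using Python's standard library, not part of Offline Explorer; the host name and HTML input are placeholders:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collects every href target found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def external_links(html_text, own_host):
    """Return links in html_text that point to a host other than own_host.

    Relative links (no host part) are treated as internal and skipped.
    """
    parser = LinkCollector()
    parser.feed(html_text)
    return [link for link in parser.links
            if urlparse(link).netloc and urlparse(link).netloc != own_host]

# Example with a hypothetical host name:
page = ('<a href="http://other.example/post">a</a>'
        '<a href="/local/page">b</a>'
        '<a href="http://myblog.example/archive">c</a>')
print(external_links(page, "myblog.example"))
```

Running this over each saved HTML file and writing the combined list to a text file would give roughly the same result as the Queue export described above.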
I hope this will work for you.