Downloading a Wiki Page?

Neila
10/11/2006 02:08 am
I'm trying to download an entire wiki (like Wikipedia). To access the content, you need to enter a login and password. Under Advanced, I've tried entering the login/password both for HTML forms and for secure Web and FTP files. I can't get the content pages regardless of what I do (e.g. changing the level to 3 or 4). Instead I'm getting 200 or 300 arbitrary files that I don't need at all; it just seems to be the skeleton of the website.

In a nutshell, how do I copy this wiki?
Oleg Chernavin
10/14/2006 09:46 am
It should be easy to download this type of site.

You need to browse to the logon page of the site using the internal browser of Offline Explorer Pro.

If you need to download the site immediately and only once, you can proceed with the logon and begin downloading the desired pages using Offline Explorer Pro. The program will reuse the session cookies from the internal browser's logged-on session.
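To illustrate what "reusing the session cookies" means, here is a minimal sketch in Python using only the standard library. The URL and form-field names are hypothetical examples of a typical wiki login form; Offline Explorer handles all of this internally, this just shows the underlying mechanism.

```python
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

# One cookie jar shared by all requests plays the role of the
# internal browser's session.
jar = CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

# Submitting the login form (hypothetical URL and field names); the
# server's Set-Cookie response would land in `jar`.
login_data = urllib.parse.urlencode(
    {"wpName": "myuser", "wpPassword": "mypass"}
).encode()
# opener.open("https://wiki.example.com/Special:UserLogin", data=login_data)

# Any later request through the same opener sends the stored cookie,
# so protected pages come back with the logged-in content.
# page = opener.open("https://wiki.example.com/Main_Page").read()
```

The key point is that the login and the later page requests share one cookie store; a fresh request without those cookies gets the skeleton pages you saw.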

You can also record the logon form contents in a Project, so that Offline Explorer Pro will know how to log itself on whenever you wish to download the site. This is useful when you want to schedule the site download or perform it later, or if you want to update the downloaded site in the future.

Once you have entered your username and password on the logon page in the internal browser, press and hold the Alt + Ctrl keys, click the Logon (or Submit) button in the Web form, and then release the keys. You should get a new Project that contains the Web form information recorded in the URL field.

Adjust the Project settings as you wish (set the Level and other parameters) and click the OK button to save the Project. You may begin downloading at any time.

Note: The form recording method is supported only in the Pro and Enterprise editions of Offline Explorer.

Best regards,
Oleg Chernavin
MP Staff
MCHAL
11/19/2006 03:26 pm
> It should be easy to download this type of site.
> [...]


Okay, the download is feasible, but the Wikipedia search engine does not work offline. Why is that?
Oleg Chernavin
11/20/2006 03:35 am
This is because the wiki search engine uses an internal database on the server, which cannot be downloaded. You can use the Search Contents feature in the Edit menu instead.
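In other words, any offline search has to scan the saved files themselves rather than query the server's database. A minimal sketch of that idea (the directory layout and function name are hypothetical, not how Offline Explorer implements Search Contents):

```python
import os

def search_offline(root, term):
    """Return the saved HTML files under `root` whose text contains `term`.

    This is a plain linear scan of the downloaded files, the only thing
    an offline search can do once the server-side index is unavailable.
    """
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith((".html", ".htm")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as f:
                if term.lower() in f.read().lower():
                    hits.append(path)
    return hits
```

A scan like this is slower than the wiki's own search (which uses a prebuilt server-side index), but it works entirely from the downloaded copy.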

Oleg.