Parsing, Downloading, and Filters

John Gowing
07/25/2006 05:29 am
While we are on the subject of URL filtering, can you clarify whether my assumptions about how OE works are correct?

1. The site is scanned for web content pages (HTML, JS, CGI, etc.).
2. ALL these files are loaded (WITHOUT application of filters).
3. These files are then parsed to extract links to more web content or media files.
4. ALL web content continues to be loaded and parsed up to the Level limit (WITHOUT filtering).
5. Links extracted from web content are evaluated, filters are APPLIED, and the content is loaded according to the options set in the Project.

Are there any options other than Level which control what URLs are loaded for parsing?

Oleg Chernavin
07/25/2006 07:58 am
Well, it works differently. The first URL in the URLs field of the Project is always loaded. Offline Explorer extracts all links from it and rewrites them for offline browsing. Then every extracted link is tested against the Level limit, filters, and other Project settings. Only the links that pass these checks are added to the download queue.
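
The order Oleg describes corresponds to a standard filtered breadth-first crawl: the start URL is fetched unconditionally, and filters plus the level limit are applied to extracted links *before* they enter the download queue, not after. A minimal sketch of that logic (all names here, `crawl`, `fetch`, `extract_links`, `allowed`, are hypothetical illustrations, not Offline Explorer's actual API):

```python
from collections import deque

def crawl(start_url, fetch, extract_links, allowed, max_level):
    """Breadth-first crawl. The start URL is always downloaded;
    every extracted link is filtered BEFORE being queued."""
    queue = deque([(start_url, 0)])   # (url, level) pairs
    seen = {start_url}                # avoid downloading a URL twice
    downloaded = []
    while queue:
        url, level = queue.popleft()
        page = fetch(url)             # the queued URL is downloaded unconditionally
        downloaded.append(url)
        if level >= max_level:        # Level limit: do not go deeper
            continue
        for link in extract_links(page):
            # filters are applied here, before the link joins the queue
            if link not in seen and allowed(link):
                seen.add(link)
                queue.append((link, level + 1))
    return downloaded
```

For example, with a filter that rejects `.exe` links, a rejected page is never downloaded at all, and anything reachable only through it is never even parsed. This matches the reply above: there is no phase where "ALL" files are loaded without filtering.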

Best regards,
Oleg Chernavin
MP Staff