In this case, they are news sites, where every news item has an ID number. I want to store only the news files, not all the others.
As I understand it, if I set filters to store only "news.php" filenames, the site will not be spidered correctly, because files like index.php and other similar files will not be downloaded.
How can I do this?
Don't you think it would be interesting to have a StoreOnlyFilenames=file command?
Another nice solution would be a "StoreOnlyIncludedFiles" command. In this case, you could specify the list of files to store in the "included files" section, but ALL files would still be parsed.
I think my need is not very unusual... on all the sites I download, I see many "glue" files with no interesting information in them. The interesting information is only in certain files like "filename?ID=xxxxx", where xxxxx is the news number...
Thank you very much,
> Well, the URLs field of the Project supports the DeleteAfterParsing= command, but you would have to tell it to remove the files of almost all combinations.
> Best regards,
> Oleg Chernavin
> MP Staff
I hope in the near future...
> I understand, but we didn't plan this yet.
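In the meantime, one workaround is to let the project download and parse everything, then prune the saved folder afterwards, keeping only the news files. Below is a minimal sketch; the folder layout, the `prune` helper name, and the assumption that saved filenames still contain the `ID=` part are all hypothetical, not Offline Explorer specifics:

```python
import re
from pathlib import Path

# Hypothetical keep-pattern: saved news pages whose filename still
# carries the ID, e.g. "news.php@ID=12345" or "news.php?ID=12345"
# (some tools replace "?" with "@" when saving — verify for your setup).
KEEP = re.compile(r"news\.php[@?]ID=\d+")

def prune(root: str) -> list[str]:
    """Delete every file under root whose name does not match KEEP.
    Returns the list of removed paths, which is handy for a dry run."""
    removed = []
    for path in Path(root).rglob("*"):
        if path.is_file() and not KEEP.search(path.name):
            removed.append(str(path))
            path.unlink()
    return removed
```

This keeps the "download all, store only news" behavior the original request asks for: the glue files (index.php and friends) are still fetched and parsed during the crawl, and are only discarded afterwards.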