I'd like to know how to filter this kind of URL: 'http://www.site.com/area/city-2713//area/city-2713/'
The URL repeats the segment '/area/city-2713'; the difference is that the second occurrence starts with // instead of /.
My question is how to set up a filter so that OE only captures http://www.site.com/area/city-2713 and not http://www.site.com/area/city-2713//area/city-2713/, because they lead to the same page.
Also, OE captures every URL like 'http://www.site.com/area/city-2713//area/city-2713/' three times and doesn't seem to detect it as a duplicate.
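I don't know whether OE's URL Filters accept regular expressions, so purely as an illustration of the pattern involved (not OE syntax), here is a Python sketch that detects and collapses this kind of doubled path:

```python
import re
from urllib.parse import urlsplit, urlunsplit

# Matches a path whose tail is "<X>/<X>", where X itself starts with "/".
# The leading "/" of the repeated X is exactly what produces the "//".
DOUBLED = re.compile(r'^(/.+?)/\1/?$')

def normalize(url):
    """Collapse 'http://host/a/b//a/b/' back to 'http://host/a/b'."""
    parts = urlsplit(url)
    m = DOUBLED.match(parts.path)
    if m:
        parts = parts._replace(path=m.group(1))
    return urlunsplit(parts)
```

For example, `normalize('http://www.site.com/area/city-2713//area/city-2713/')` yields 'http://www.site.com/area/city-2713', while URLs without the doubled tail pass through unchanged.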
By the way, I also want to say thank you for the previous answer; I must say this company's support is legendary.
I think it should be enough.
One more: how do I filter this URL: http://www.site.com/area/home-city-new-755/home-city-new-755/
I only want to save http://www.site.com/area/home-city-new-755, not http://www.site.com/area/home-city-new-755/home-city-new-755/.
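This second case repeats the last segment without a doubled slash, so the pattern is slightly different. Again, this is only a Python illustration of the pattern, not something OE itself uses:

```python
import re
from urllib.parse import urlsplit, urlunsplit

# Matches a path ending in ".../<seg>/<seg>/" and keeps a single "<seg>".
REPEATED_TAIL = re.compile(r'^(.*/)([^/]+)/\2/?$')

def strip_repeated_tail(url):
    """Collapse 'http://host/area/x/x/' to 'http://host/area/x'."""
    parts = urlsplit(url)
    m = REPEATED_TAIL.match(parts.path)
    if m:
        parts = parts._replace(path=m.group(1) + m.group(2))
    return urlunsplit(parts)
```

Here `strip_repeated_tail('http://www.site.com/area/home-city-new-755/home-city-new-755/')` gives 'http://www.site.com/area/home-city-new-755'.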
Also, please try unchecking the "Suppress server errors" box in the Properties dialog, Parsing section. Maybe this will help to get rid of such weird URLs altogether.
I already suspended to file, and it generated a queue of 600,000+ URLs with many strange URLs inside.
Do you think it's better for me to start from the beginning with "Suppress server errors" unchecked, or can I resume from the file with "Suppress server errors" unchecked and have OE automatically filter all the strange URLs inside the .wdqh file (or is there some other way)?
What I mean by duplicate:
These 2 URLs lead to the same page, but when I open http://www.site.com/area/home-city-new-755/home-city-new-755/ the page has no CSS/template, just a jumble of text without borders, graphics, etc.
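If it helps to see why the styling breaks: my guess (an assumption, I haven't checked the site's markup) is that the page references its stylesheet with a relative path, which then resolves against the bogus extra path depth. A quick illustration with a hypothetical `style.css`:

```python
from urllib.parse import urljoin

# The same relative stylesheet reference resolves differently
# under the correct URL and under the duplicated-path URL:
good = urljoin('http://www.site.com/area/home-city-new-755', 'style.css')
bad = urljoin('http://www.site.com/area/home-city-new-755/home-city-new-755/',
              'style.css')
print(good)  # http://www.site.com/area/style.css
print(bad)   # http://www.site.com/area/home-city-new-755/home-city-new-755/style.css
```

The second resolved URL likely doesn't exist on the server, so the browser gets no stylesheet and you see bare unstyled text.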
Another question: last time, when you gave me the exe fix for the restore function, it worked faster with 300,000+ files, but when I tried it on a boe file with 600,000+ files, the speed dropped by half.
So I'd like to know: does restore speed really depend on how many files are inside (and maybe other factors), or could you make a fix so that restore speed stays the same no matter how many files are inside?