I have noticed in recent versions that, whenever I have to use the "save multiple files" option because the download considers each file similarly named, I "lose" files/pages that I should have received. My number of connections defaults to 10. If I "dumb down" the number of connections all the way to 1, then all the files are received -- but the process is horribly slow :-(
I have tried other numbers of connections between 1 and 10, but they all seem to lose files and fail to download the site entirely -- unless I set the number to 1.
This only happens when the files being downloaded all have the same name and need to be incremented -- for example, they are all named search.asp and OEP adds (1)...(999) to each file to uniquely name them. Sites where the files are all uniquely named at download run fine at any number of connections!
I have noticed similar occurrences in Mass Downloader -- when running a download on sites where the file names are the same and it renames them by adding a sequence number, I get many files marked in red as not downloaded.
I am sure this is due to the server(s) being queried and not to the OEP or MD products, but I am wondering if there is an alternative to try rather than limiting the downloads to a single connection?
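For what it is worth, here is a small sketch of one way same-named files could get lost under concurrent connections. This is purely hypothetical -- the `Renamer` class and the lock are my own illustration, not OEP's or MD's actual code: the point is that picking the next free `search(N).asp` suffix has to be serialized, or two connections can choose the same name and one file silently overwrites the other.

```python
import threading

# Hypothetical illustration (NOT OEP's actual code): many connections save
# files that share one base name, so each writer must pick the next free
# "search(1).asp", "search(2).asp", ... suffix. Without the lock below, two
# threads could read the same free suffix at once and one download would
# silently overwrite the other -- one plausible way pages get "lost".
class Renamer:
    def __init__(self):
        self.lock = threading.Lock()
        self.used = set()

    def next_name(self, base="search", ext=".asp"):
        # Serialize the pick-a-suffix step so no two threads get the same name.
        with self.lock:
            n = 0
            name = base + ext
            while name in self.used:
                n += 1
                name = f"{base}({n}){ext}"
            self.used.add(name)
            return name

def simulate(connections=10, pages=100):
    """Simulate `connections` parallel workers saving `pages` same-named files."""
    renamer = Renamer()
    saved = []
    append_lock = threading.Lock()

    def worker(count):
        for _ in range(count):
            name = renamer.next_name()
            with append_lock:
                saved.append(name)

    threads = [threading.Thread(target=worker, args=(pages // connections,))
               for _ in range(connections)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return saved

files = simulate()
print(len(files), len(set(files)))  # all 100 saved names are unique
```

With the lock in place all 100 names come out unique even at 10 connections; at 1 connection the race cannot happen at all, which would match the behavior I am seeing.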
Thanks -- both products are AWESOME!!!
Steve
I am sorry for the late answer. Can you please tell me more about the URLs that disappear after downloading -- what is the site name, link, etc.?
Thank you.
Best regards,
Oleg Chernavin
MetaProducts corp.
POST=StartRow={:1..29531|10}&TotalRows=29537&Search=11746678&HomeDeliveryID=0&MinPrice=0&MaxPrice=100000000&Addr=&City=ALL&Keyword=&PropertyID=&SortColumn=price&SortOrder=DESC&MinBedroom=0&MaxBedroom=100&MinBathroom=0&MaxBathroom=100&MinSqFootage=0&MaxSqFootage=100000&Style=ALL&SearchDesc=&FrequencyCd=0&AltEmailAddr=&x=62&y=11
IgnoreLogOutLinks
Referer=http://boulder.homesincolorado.com/mlssearch.asp
SetCookie=WEBTRENDS_ID=198.160.96.7-62329584.29577596; __utm1=3103069480.1059007022; __utm2=1059007022; WTLGuestSearch=MaxPrice=100000000&City=ALL&MinPrice=0&PropertyID=&Neighborhood=ALL&MinBedroom=0&Addr=&Style=ALL&MaxBedroom=100&PostalCd=&FeatureList=&Keyword=&PropTyp=ALL&MaxSqFootage=100000&MinBathroom=0&County=ALL&MaxBathroom=100&StartDt=&MinSqFootage=0; ASPSESSIONIDQARBBQQR=JEKDAGLCDGGKHCKLEEICPPMA
Level Limit is checked and set to 1.
Under File Copies, "Keep old files" is checked and set to 99999.
In File Filters, the checked categories are Text, User Defined and Other.
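To show the scale involved, here is my reading of the `{:1..29531|10}` macro in the POST template above -- I am assuming it expands into one request per StartRow value in steps of 10 (the exact macro semantics are an assumption on my part, not documented behavior):

```python
import re

# Hypothetical expansion of the {:start..end|step} URL macro seen in the
# POST data above. My assumption: it generates one request per value of
# StartRow = 1, 11, 21, ..., stepping by 10 up to 29531.
def expand_macro(template):
    m = re.search(r"\{:(\d+)\.\.(\d+)\|(\d+)\}", template)
    start, end, step = (int(g) for g in m.groups())
    for value in range(start, end + 1, step):
        yield template[:m.start()] + str(value) + template[m.end():]

urls = list(expand_macro("StartRow={:1..29531|10}&TotalRows=29537"))
print(len(urls))   # 2954 requests
print(urls[0])     # StartRow=1&TotalRows=29537
print(urls[-1])    # StartRow=29531&TotalRows=29537
```

If that reading is right, the Project queues close to 3,000 result pages from a single template, all hitting the same search.asp name.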
The symptom is that all the URLs show in the Queue tab, but when the download is complete, the download directory has only about 7,000 files and not the 20,000+. When I rerun the download to another directory with the number of connections set to 1, I get all the desired pages. I suspect that I am either flooding the target server with the default 10 connections or that there is some dependency on the order/sequence of the pages being requested and I am not getting them all.
This has happened consistently on web sites where I need to keep old files because the downloaded file name is the same and would otherwise cause an overwrite of the previously downloaded file(s).
It is important to note that, when connections is set all the way down to 1, the sites all download properly, but at a painfully slow pace.
Any suggestions you can make would be appreciated!
Steve
> Dear Sir,
>
> I am sorry for the late answer. Can you please tell me more about the URLs that disappear after downloading - what is the site name, link, etc.?
>
> Thank you.
>
> Best regards,
> Oleg Chernavin
> MetaProducts corp.
To overcome this, please go to the Options dialog | File Locations section, check the "Prevent directories from overloading" box and click the OK button. Please remove all Project files and start the download again.
Oleg.
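The idea behind that option can be sketched roughly like this. The bucket size, folder names, and layout below are assumptions for illustration, not OEP's actual scheme: rather than writing tens of thousands of files into one folder, files are spread across numbered subfolders so no single directory grows past the file system's comfortable limits.

```python
# Hypothetical sketch of the "Prevent directories from overloading" idea.
# MAX_PER_DIR and the d0000/d0001/... naming are assumptions, not OEP's
# actual layout: file number `index` goes into a numbered bucket folder so
# no one directory accumulates tens of thousands of entries.
MAX_PER_DIR = 2000  # assumed cap; legacy FAT32 tops out at 65,534 entries per directory

def bucketed_path(root, index, filename):
    """Place file number `index` into root/d0000, root/d0001, ... buckets."""
    bucket = index // MAX_PER_DIR
    return f"{root}/d{bucket:04d}/{filename}"

print(bucketed_path("site", 0, "search.asp"))
print(bucketed_path("site", 4321, "search(7).asp"))
```

The trade-off Steve raises below is real for a scheme like this: the per-folder (1), (2), ... suffixes restart in each bucket, so merging everything back into a single directory afterwards requires another renaming pass.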
I thought about that option, but I don't understand why it works in a single directory if I set the number of connections down to 1 versus 10? I would have thought the directory limit would have been an issue, but the downloads were complete as expected! Putting the data into separate directories makes it harder to deal with the final output as a single entity -- especially when the file names repeat under each folder ((1), (2), (3) ... (999)).
Let me know if you have any other ideas -- in the meantime I will try to rerun using the limit-of-files option!
Thanks,
Steve
> I think I know what the problem is. When you download the Project, it places too many files in a single directory. The Windows file system has a limit on the number of files per directory. That's why you can't find many of the downloaded files there.
>
> To overcome this, please go to the Options dialog | File Locations section, check the "Prevent directories from overloading" box and click the OK button. Please remove all Project files and start the download again.
>
> Oleg.