Demonoid.me Archival Website Rip and Database
|Paul Limoges||07/14/2011 09:57 am|
|I am going to try to accomplish this project again after two years of inactivity on my part; I am hoping some of the bugs in the program have been fixed since then.
My question is: how do I run a project so that, when I get an error message, I can give you all the details of my settings and logs?
I forgot how to turn on logging, and I am wondering if I should just send you the template file in a message when I run into the error that I know will occur about three days into the project. I want to get this fixed so that I can continue on to other sites like isoHunt and h33t.com.|
|Paul Limoges||07/14/2011 10:36 am|
|What are the correct directories to start downloading from if all you want is the torrents and the HTML details page that goes with each of them?|
|Paul Limoges||07/14/2011 11:31 am|
|How do I download the HTML and torrent files from these pages?
Only this page and up to page 200 have torrents; for some reason I can't browse torrents beyond that. Demonoid only allows the first 10,000 torrents to be browsed; after that, the search returns the error "No Torrents Found".
I added this line to the URL Filters - Directory tab, and I am now downloading a lot more files, but I am not getting any torrents or HTML. I have "Load files only from the starting directory and below" checked.
But when I do it like this:
URL Filters - Filename
I get everything I need, but only 75 torrents' and HTML pages' worth, whereas each page on the website lists 50 torrents.
What page should I start the project on in order to download all the torrents and detailed HTML pages?
Once I get past this issue, all I have to do is figure out how to access more than the 10,000 torrents allowed, even though there are web pages for over 380,000 torrents listed. It might have something to do with them switching from demonoid.com to demonoid.me.
|Oleg Chernavin||07/14/2011 11:54 am|
I think you need to increase the Level setting; this will help get more files. Regarding the errors after three days of downloading: please check how much memory (RAM) is used by Offline Explorer. Its limit is 2 GB.
Turning on logging (Ctrl+Q) usually decreases program performance significantly and increases the memory used.
|Paul Limoges||07/14/2011 04:15 pm|
|I've got it unchecked, meaning it will download all of the pages, but even when I tried it with a limit of 200 (for the 200 pages of accessible torrents), it still downloads only 175 files or so.
I actually had this working a long time ago with demonoid.com and was able to get 80,000 torrents; that was before I used the directory overload protection option, which is now on all the time. But now I also need the HTML details pages that go with them. At the moment, though, I can't even get it to download any torrents.
If you could give it a try, please see if you can get it to download any torrents at all.
|Paul Limoges||07/14/2011 05:34 pm|
|I can't get it to download any torrents from these directories:
All it does is download from these directories:
If you need my account username and password to bypass the visitor limit and log in, just ask and I'll send you a private message; just give me an email address.
|Oleg Chernavin||07/15/2011 08:51 am|
|OK. I sent you the e-mail using the address you left here.
|Ive sent you the information||07/15/2011 03:37 pm|
|All the information you need is in the email I just sent you. It's Friday at 3:34 PM. Hope to hear back from you soon.|
|Paul Limoges||07/19/2011 06:52 pm|
|I guess the simple question is: how do I search and save the results from the search engine? Is there a way to access their database without the search feature? Other similar sites I plan to back up like this one have disabled the ability to browse through all the site's torrents, and all that remains is a search engine. Am I right to guess that it's connected to a SQL database, or what technology do they use to query and store the torrents?
The site I am talking about is www.isohunt.com, but once I understand how to do that one I will probably be able to do all the rest.
I think this www.demonoid.me site may be broken because they haven't fully switched over to the new domain yet.
|Oleg Chernavin||07/20/2011 04:31 am|
|Yes, such sites use databases for fast searching, sorting, etc. In most cases they have forums that contain all the links to the torrents. If not, you can use their search feature and search for the letter "a"; this will return almost all results.
If possible, do the search in the Internal Browser of Offline Explorer and, when clicking the Search button, hold down the Alt+Ctrl keys to record the search in a Project. You can then adjust the Project and download it.
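To illustrate the search-for-"a" approach, here is a minimal sketch of building the list of paginated search-result URLs that a recorded Project would then crawl. The `?query=...&page=N` URL scheme is an assumption for illustration only; the real parameter names and paths on any given site will differ and should be copied from the URL the recorded search actually produces.

```python
# Sketch: enumerate paginated search-result URLs for a broad query.
# The URL pattern below is hypothetical; substitute the pattern the
# site's own search form generates.

def search_page_urls(base, query="a", pages=200):
    """Build the list of search-result page URLs to feed into a crawl."""
    return [f"{base}/files/?query={query}&page={n}" for n in range(1, pages + 1)]

urls = search_page_urls("http://www.demonoid.me", pages=3)
for u in urls:
    print(u)
```

A list like this can be pasted into a project's starting addresses, so the crawler walks every result page rather than relying on the browsable directory listing that stops at 10,000 torrents.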
|Ive sent you the information||07/20/2011 01:41 pm|
|Did you get any results from the project yet?|
|Oleg Chernavin||07/21/2011 04:49 am|
|Sure. I replied to your e-mail a few days ago.
|Ive sent you the information||07/21/2011 01:39 pm|
|Can you send me a file that I can import into the program with the correct settings that you used? Then I'll adjust it to fit my needs. I'm up to 281 torrents, and I am noticing now that when the program stops adding to that directory, the site has in fact disabled me from downloading torrents.
Maybe there is a speed limit. I'll try doing one or two downloads at a time.
It would be great if you could send me the file to import with the settings that you used.
|Oleg Chernavin||07/21/2011 02:44 pm|
|Please simply add the Included Filename Keyword:
P.S. I restored the Project that I adjusted:
|Paul Limoges||07/23/2011 09:12 am|
|It seems that even when I downloaded the torrents manually, the site disabled my ability to download torrents after 340 of them and locked me out for 24 hours.
Is there a way I can run my own indexer site that will gather torrents for me? That is all I want to do in the first place, because this does not seem to be working.
I'm sure we will figure it out soon; stay in contact with me.
|Paul Limoges||07/23/2011 05:53 pm|
|I haven't been able to get it to run past 30 minutes with these settings, and I just wanted to confirm that demonoid.com does lock me out for 24 hours if I download more than 340 torrents manually in under an hour.
Is there any way you can limit how many torrents the program will download, just like a "bot" or "script" would be designed to do? Maybe you can add a setting.
Downloading torrents (the .torrent files only, not their contents) needs more attention in your program.
Sometimes, even when I disable loading via the BitTorrent protocol, it still downloads the entire torrent's contents. Maybe it is picking up the magnet link that sits right next to the download link for the torrent in every row.
|Oleg Chernavin||07/26/2011 04:28 pm|
|I tested again with the above settings; torrent contents were not downloaded at all. The protocol is disabled in the Project and it works correctly.
I would suggest disabling torrent file downloads in the settings, downloading the whole site, then enabling those files again and using the Download Missing Files mode to build the queue of those files. Also, set Offline Explorer to use one download connection and set a large delay between downloads to avoid downloading files too quickly.
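The "one connection, large delay" advice can be sketched as a throttled sequential fetch loop. This is an illustration of the rate-limiting idea, not Offline Explorer's implementation; the 340-per-hour lockout figure comes from the observation above, and `fetch` stands in for whatever HTTP routine actually retrieves each file.

```python
import time

def polite_delay(max_per_hour=300):
    """Seconds to sleep between downloads to stay under the hourly cap.

    The cap defaults to 300/hour (a 12-second gap), comfortably below
    the ~340/hour threshold at which the lockout was observed.
    """
    return 3600.0 / max_per_hour

def download_all(urls, fetch, delay=None):
    """Fetch each URL strictly one at a time, pausing `delay` seconds
    between requests (a single-connection, throttled crawl)."""
    delay = polite_delay() if delay is None else delay
    results = []
    for i, url in enumerate(urls):
        if i:                      # no pause before the very first request
            time.sleep(delay)
        results.append(fetch(url))
    return results
```

With `max_per_hour=300`, a full day's run retrieves about 7,200 files, so the 200 accessible pages of torrents would finish well within the first day without tripping the hourly limit.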