Siterip of pictures where direct access to picture content is forbidden - You don't have permission to access /images on this server.
|Mr Smith||06/28/2016 01:22 pm|
|I am trying to do a rip of the photos from:
however the photos are not hosted at the above site - the photos are hosted at: http://art.cafimg.com/images/
The siterip cannot begin at http://art.cafimg.com/images/ as I "don't have permission to access /images on this server."
What settings should I use to download the images and not the whole internet?
Here is a sample URL:
|Oleg Chernavin||06/28/2016 07:26 pm|
|It is quite easy to do. You need to download from the starting server or site only and allow images to be downloaded from any site.
This is simple in the File - Tasks wizard. If you already created a Project, select it, click the Properties button, open the File Filters - Images section and select the "Load from any site" in the Location box.
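In other words, page links are followed only on the starting server, while image files are accepted from any host (such as art.cafimg.com). A minimal Python sketch of that policy — illustrative only, with hypothetical helper names, not Offline Explorer's actual logic:

```python
from urllib.parse import urlparse

START_HOST = "www.comicartshop.com"  # the starting server from this thread

def should_follow_page(url: str) -> bool:
    """Follow HTML page links only on the starting server."""
    return urlparse(url).hostname == START_HOST

def should_load_image(url: str) -> bool:
    """'Load from any site': accept image files regardless of host."""
    return urlparse(url).path.lower().endswith((".jpg", ".jpeg", ".png", ".gif"))

print(should_follow_page("http://www.comicartshop.com/gallery.asp"))  # True
print(should_follow_page("http://art.cafimg.com/images/"))            # False
print(should_load_image("http://art.cafimg.com/images/12345.jpg"))    # True
```

The key point is that the two checks are independent: restricting pages to one host does not restrict where images may come from.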
|Mr Smith||06/29/2016 08:11 am|
|The problem is that this site links to many other sites, and my previous attempts resulted in millions of queued links. My download directory has 497 folders of links/URLs linked from that site. The site "art.cafimg.com" is there, however only a small number of images were downloaded. Could you take a look at "http://www.comicartshop.com"?
I have the starting web address as:
Level limit: disabled
Download from any website: enabled
Download from the starting server or domain: disabled
Maybe I could restrict the downloads via the "url filters/servers/included keywords" function?
|Oleg Chernavin||06/29/2016 08:20 am|
|Yes, this is also possible. Let's do it this way. Allow downloading from all servers in the URL Filters - Server section.
URL Filters - Directory - add to the Included list:
All File Filters categories - select "Load using URL Filters" in their Location boxes.
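The combined effect of these filter settings can be sketched in Python. The actual Included-list keywords were not quoted in this thread, so the list below is a hypothetical placeholder:

```python
# Hypothetical stand-ins; the real Included-list entries were not shown above.
INCLUDED_DIR_KEYWORDS = ["comicartshop", "cafimg"]

def passes_url_filters(url: str) -> bool:
    """All servers are allowed, but the URL must match an included keyword.

    With "Load using URL Filters" selected in every File Filters category,
    this one check governs all file types.
    """
    return any(kw in url.lower() for kw in INCLUDED_DIR_KEYWORDS)

print(passes_url_filters("http://art.cafimg.com/images/12345.jpg"))  # True
print(passes_url_filters("http://unrelated.example/banner.jpg"))     # False
```

This keeps the crawl open to the image host while cutting off the "whole internet" of unrelated linked sites.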
If you need anything else, please let me know.
|Mr Smith||07/11/2016 12:59 pm|
|Could you take a look at this image and tell me if these numbers seem correct?
Also, is it possible to exclude "searchresults.asp" from being processed?
Maybe the software has been unable to get past "searchresults.asp". Then again, it is a big website, so perhaps there is no problem and I just need to keep waiting.
|Oleg Chernavin||07/11/2016 09:35 pm|
|Yes, it is easy to disable them. Add this keyword to the URL Filters - Filename section, Excluded filename keywords list:
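As a quick illustration of what an excluded filename keyword does (a sketch, not Offline Explorer's exact matching rules):

```python
# The keyword being excluded in this thread.
EXCLUDED_FILENAME_KEYWORDS = ["searchresults.asp"]

def is_excluded(url: str) -> bool:
    """Skip any URL that contains an excluded filename keyword."""
    return any(kw in url.lower() for kw in EXCLUDED_FILENAME_KEYWORDS)

print(is_excluded("http://www.comicartshop.com/searchresults.asp?page=2"))  # True
print(is_excluded("http://art.cafimg.com/images/12345.jpg"))                # False
```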
|Mr Smith||07/12/2016 07:25 am|
|Is this correct:
Also, should I pause the project and then resume it for the changes to take effect? It seems that "searchresults.asp" is still being processed.
|Oleg Chernavin||07/12/2016 08:29 am|
|Yes, this is correct. To remove them from the Download Queue, switch to it, click the Select By Mask button, enter this keyword and abort all found items.
New URLs like this will not be added any more.
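Excluding the keyword only stops new URLs from being queued; items already in the Download Queue must be aborted separately. Conceptually, the Select By Mask step works like this (a sketch with sample queue entries):

```python
# Sample queue contents for illustration.
queue = [
    "http://www.comicartshop.com/searchresults.asp?page=2",
    "http://art.cafimg.com/images/12345.jpg",
]
mask = "searchresults.asp"

# Abort (drop) every queued item matching the mask; the rest keep downloading.
queue = [url for url in queue if mask not in url]
print(queue)  # ['http://art.cafimg.com/images/12345.jpg']
```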
|Mr Smith||07/12/2016 10:06 am|
|I went to the Queue, pressed "Select All" and then "Abort and Disable". I then entered the custom mask, which aborted the remaining queued links and completed the project. However, zero images were downloaded, which perhaps means I made a mistake - this is just a guess. It seemed that the software did not find or process the image URLs at "http://art.cafimg.com/images/" from the original URL "http://www.comicartshop.com".
Could you attempt to download a single image from "http://art.cafimg.com/images" via "http://www.comicartshop.com"
and then tell me what settings you used?
|Oleg Chernavin||07/12/2016 04:43 pm|
|OK. It looks like the following Project settings work. Select the whole text starting from the [Object] line, copy to clipboard, switch to Offline Explorer and press Ctrl+V on keyboard:
|Mr Smith||07/15/2016 12:13 am|
|The settings you provided were a success. This is excellent software and I have received excellent customer support; you are an exceptional credit to MetaProducts®|
|Oleg Chernavin||07/25/2016 07:38 pm|
|It is very nice to hear! Thank you very much!
|Mr Smith||08/04/2016 06:59 pm|
|Could you tell me how you determined to set the depth of the rip to ten?
From the results of the rip, I'm quite sure this was not an arbitrary decision.
|Oleg Chernavin||08/04/2016 07:23 pm|
|I looked at some typical artists to see how many clicks on links might be necessary to get to the deepest picture.
However, I didn't notice that some authors have over 500 images. Since a page contains 18 pictures, getting them all could require a level closer to 28 (500 / 18 ≈ 27.8 pages).
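The back-of-the-envelope arithmetic, assuming each "next page" click adds one level to the crawl depth:

```python
import math

images_per_page = 18
max_images = 500  # some artists have over 500 images

# Number of gallery pages needed to cover every image.
pages_needed = math.ceil(max_images / images_per_page)
print(pages_needed)  # 28
```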