My friend is in a fraternity and they just had their week here on campus. A lot of his frat brothers have uploaded their pics to a website, and now my friend wants to get all the pics from that site and save them to a CD or other media. Problem is, there are over 500 pics, and nobody wants to go through them one at a time and right-click > Save As for over 500 photos. I remember somebody once showing me a way to download a large number of photos from a site, but I just don't quite remember it. Does anybody know what I am talking about? If so, could you please post how to do this. Thanks.
Do you have FrontPage? Start FrontPage. Click New -> Website, Import From URL. Fill in the blanks. FrontPage will download all the images along with all the pages to a local website. Don't have FrontPage? There's also another program, called wget, that can get ALL LINKED FILES... I back up *important* websites with it. Make sure you use the "recursive" option so it downloads all the linked files.
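If you go the wget route, something like this would do it (the URL here is made up, put in the real gallery address):

wget -r -l 2 -np -nd -A jpg,jpeg,png -P ./fratpics http://example.com/fratweek/

-r is recursive, -l 2 stops it after following links two levels deep, -np keeps it from wandering up to the parent directory, -nd dumps everything into one folder instead of recreating the site's directory tree, -A jpg,jpeg,png keeps only image files, and -P ./fratpics is the folder it saves into.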
If you use Mozilla Firefox, there is a nifty extension called DownThemAll! (yeah, the exclamation mark is part of the name). You can configure it to download to the directory of your choice.
You could also try HTTrack. It will download a whole website for you for offline viewing. You could download the site with it, then just search for all the pictures in the folder(s) it saved to, and consolidate them into one folder.
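If you end up with the command-line version of HTTrack, a rough sketch would be (the URL and folder names are just placeholders):

httrack "http://example.com/fratweek/" -O ./fratmirror "+*.jpg" "+*.jpeg" "+*.png"

-O sets the output folder, and the "+*.jpg" style filters tell it which file types to keep.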
Thanks for all the options, guys. I have FrontPage and I use Firefox, so those look like my two best choices. Thanks again.
I love this thing. Is there any way to set it up so it will go to the next page? The pics are arranged 20 per page and there are 27 pages of pics, so is there any way to do that, or do I just have to go page by page?
Which version is it? In the one I have, there's an option to limit the download or import to the page you call (or the top-level website) plus so many more levels. Uncheck the option (don't put a limit), or set the limit to where you know it will end. A level is each time you click through to another page, hierarchically following links. Also, there may be a query string in the URL that you could change so that more pictures display per page. In this BBS, you can modify the query (the stuff after the ? in the URL) to change the output. See if there is something like "per_page=20" or "s=20" in the URL that you can change. I'm glad I could be of help, Señor Pun-cito.
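For example, if the gallery URL looked something like this (completely made-up address):

http://example.com/gallery.php?album=3&page=1&per_page=20

you could try bumping the per_page value up:

http://example.com/gallery.php?album=3&page=1&per_page=600

If the site honors it, all 500+ pics show up on one page and a single grab gets everything.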
There is no query string in the URL anywhere, so I guess 20 is the max. Where is the limit checkbox you are talking about?
Mine is FrontPage 2000. Let me try FrontPage on my work machine, and if it is different I will post it tomorrow. G'night, yo.
This is pretty good too. As my question stated before, is there any way to go through all the pages, or do I have to do it page by page?
If the images are named in some sort of order (image01.jpg, image02.jpg, ..., imageN.jpg), then you can try mediaqueue. If they all have totally different names, it won't help you, though. Also, if you go to http://www.download.com/ and search for "website download", there are a bunch of programs that will do what you want.
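If the names really are sequential, you don't even need a special program; a quick shell loop with wget will do it (the URL pattern and the count of 540 are guesses, adjust them to match the real filenames):

for i in $(seq -w 1 540); do wget -P ./fratpics http://example.com/fratweek/pics/image$i.jpg; done

seq -w pads the numbers with leading zeros, so this asks for image001.jpg through image540.jpg.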
Try wget like I told you. It's really, really fast. If you have a Linux system, it's already installed. You MUST have enough disk space, though.
You could also try Nettransport (from xi-soft), which has two options for doing what you want. a) Use the batch downloader. You will need to determine the path of the files and set up a batch download, but this only works well if the files are labeled in a sequential fashion. b) Use the Site Explorer, which lets you navigate the file system of a web site and just select all the files you want to download, and would let you grab multiple pages. Really useful, but it doesn't always work, depending on the site's security.