Use wget to download links in a file | A file with a list of links

Date: 2012-07-02 17:25:43 +00:00


While working on my Mac, I needed to download a set of files whose URLs I had collected in a file.

Of course I first tried to do it with curl, which ships with Mac OS X, but I found out that the right tool for the job is wget.

You can install wget using MacPorts; if you are on Linux, you probably already have it installed.
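
With MacPorts set up, installing it is a single command:

sudo port install wget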

Now, suppose you have a list of URLs in a file, something like this:

http://www.somesite.com/file1.pdf
http://www.somesite.com/file2.pdf
http://www.othersite.com/other-file.pdf

And so on. Let's suppose those links are in a file called url-list.txt and you want to download all of them. Simply run:

wget -i url-list.txt
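
If a transfer gets interrupted partway through, you can rerun the same command with the -c (continue) flag, and wget will resume any partially downloaded files instead of starting over:

wget -c -i url-list.txt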

In my case I had built the list by cutting and pasting URLs from my browser while reading the files, and since the files were big, I knew they were already sitting in the office cache server, so I pointed wget at the proxy.
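
wget honors the standard http_proxy environment variable, so one way to route it through the cache is the following sketch (proxy.example.com:3128 is a placeholder; substitute your own proxy's address and port):

# Tell wget to fetch everything through the cache/proxy server
export http_proxy="http://proxy.example.com:3128"
wget -i url-list.txt

Since the files were already cached, the downloads came through at local-network speed instead of going out to the internet again.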