This tutorial explains how to take a URL, export every link to a specific file type (pdf, jpg, mp3, wav, or any other extension you want) into a list, and download all of those links in Linux. In my example, I have a web page with more than 20 links to pdf files. Instead of downloading them individually and manually, this script downloads all of them at one time and gives me a list of each link.

You need to have lynx and wget installed before running this script. To install them, run the command for your distribution:

Ubuntu: sudo apt-get install lynx-cur wget

openSUSE: sudo zypper install lynx wget

Save the following text as link-dl.sh and execute it by running "sh link-dl.sh":
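The article's original script is not reproduced here, so what follows is a minimal sketch of what such a script might look like. The script name link-dl.sh comes from the article; the URL and extension arguments, the `filter_links` helper, and the `pdf-links.txt` output file name are my assumptions, not the author's exact code:

```shell
#!/bin/sh
# link-dl.sh -- hypothetical sketch: list and download every link with a
# given extension from a web page.
# Usage: sh link-dl.sh <url> [extension]   (extension defaults to pdf)

# Extract matching links from `lynx -dump -listonly` output read on stdin.
# That output has one numbered reference per line, e.g. "1. http://host/a.pdf".
filter_links() {
    ext="$1"
    grep -i "\.${ext}$" | awk '{print $2}'
}

if [ "$#" -ge 1 ]; then
    url="$1"
    ext="${2:-pdf}"                # file extension to match; pdf by default
    lynx -dump -listonly "$url" \
        | filter_links "$ext" \
        | tee "${ext}-links.txt" \
        | wget -i -                # wget reads the URL list from stdin
fi
```

The `tee` in the pipeline is what produces the list mentioned above: every matching URL is saved to a text file (here `pdf-links.txt` for the default extension) at the same time it is handed to wget for download.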
