Download Every File of a Certain Extension on a Site

Today I was editing some video for YouTube and looking for free stock music. I ended up at a site I’ve seen many times before, DanoSongs.com, but had never really used, since finding a song there means working through a long list, playing each track in an embedded player, and then downloading the one you want. What I wanted instead was to download every MP3 into the folder where I keep my stock music, so that from inside my video editing software I could preview the tracks along with the rest. So here’s the command I used:

wget -r -l1 -H -t1 -nd -N -np -A.mp3 -erobots=off http://www.danosongs.com/

This is of course done with wget, a powerful command-line utility for Linux (also available on macOS and Windows). To tweak the command, all you need to do is replace the URL with the page you’d like to scrape, then replace “mp3” with whatever file type you’d like to grab. You could just as easily scrape a page for all of its .png or .docx files.
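If you’re curious what each flag does, here’s the same command broken down, with the URL and extension pulled out into variables so it’s easy to adapt (the `echo` makes this a dry run; drop it to actually download):

```shell
# Set these to your target page and file type.
URL="http://www.danosongs.com/"
EXT="mp3"

# -r          recurse into linked pages
# -l1         but only one level deep
# -H          span to other hosts (files may sit on a CDN)
# -t1         try each file only once
# -nd         don't recreate the site's directory tree locally
# -N          skip files that haven't changed since the last run
# -np         never ascend to the parent directory
# -A".$EXT"   accept only files ending in this extension
# -e robots=off  ignore robots.txt (use responsibly)
echo wget -r -l1 -H -t1 -nd -N -np -A".$EXT" -e robots=off "$URL"
```

Swap `EXT="mp3"` for `png`, `docx`, or anything else, and point `URL` at the page you want to harvest.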

Enjoy


2 Responses to “Download Every File of a Certain Extension on a Site”

  1. Thunderflash says:

    An alternative (which works with this example page) is the Firefox plugin DownloadHelper, which does the same thing with media, although this technique seems to be more versatile.

  2. Rootslogin says:

    Saved my day, thanks!
