
I often need to download sets of links from a webpage, and it's very inefficient to click the links one by one. For example, at the moment I want to download the lecture notes on this webpage. Is there some way I can batch-download them? I am using Linux and Firefox, in case that information is useful.

ManUtdBloke

1 Answer


This command will download each PDF linked from the page.
It's quick and dirty and could probably be optimized (especially since it uses both curl and wget):

for file in $(curl -s https://ocw.mit.edu/courses/mathematics/18-102-introduction-to-functional-analysis-spring-2009/lecture-notes/ | grep '\.pdf' | cut -d'"' -f2); do wget "https://ocw.mit.edu/$file"; done
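
If you'd rather not mix curl and wget, a wget-only sketch using its recursive mode could look like the line below. This assumes the PDFs are linked directly from that page and served from the same host; if they live elsewhere you may also need -H (span hosts) with a -D domain list, in which case the curl/grep approach above may be simpler.

wget -r -l 1 -nd -np -A pdf https://ocw.mit.edu/courses/mathematics/18-102-introduction-to-functional-analysis-spring-2009/lecture-notes/

Here -r -l 1 follows links one level deep, -np keeps wget from ascending to parent directories, -nd saves everything into the current directory, and -A pdf keeps only files ending in .pdf.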
Mikael Kjær