
First of all, excuse me for asking this way, but I didn't know another way to express what I need. If anyone can edit the question, I would be very pleased.

Often, while searching for materials on the internet, I end up on pages (repositories) that contain all the information I need, but I don't know how to download all of those files and folders. It is possible to download them one by one, but doing that for a big repository is almost impossible. For example, these URLs: the openSUSE repository or PrimeFaces.

Can anyone help please?

Lazy Badger
  • 3,714

2 Answers


Perhaps wget. There is even wget for Windows.

And here is curby's wget-mirror, which I use:

#   -N      don't get unless newer than local
#   -l inf  infinite recursion
#   -r      recursive get
#   -np     don't get parent
#   -p      get prerequisites
wget -N -l inf -r -np -p "$1"
Dan D.
  • 6,342

Try this tool:

http://www.httrack.com/

It is a freeware tool that allows you to recursively download an entire website (including images, linked files, etc.) and make a local copy.