Is there a way or tool to get all links of a website? Just the links; I'm not looking to create a local copy of or download the website. Example: links to all questions posted on Super User. Platform: Windows 7, Ubuntu 14.04.
1 Answer
Sorry for keeping you waiting. I have uploaded my program here.
The program is still in a very early phase, so most features do not work yet, but it does grab all links to other pages on the website.
It needs Java to run; you should be able to double-click the file and a UI will load. In the SearchW box (in the GUI), type the website address, e.g. http://google.com or http://bbc.co.uk.
Then you can copy and paste all the links as they are printed. (I still need to implement an export feature, but you'll be able to copy the links for the moment.)
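For anyone who would rather script this than use a GUI, here is a minimal sketch of the same link-grabbing idea in Java using the jsoup library. To be clear, jsoup and the class name LinkGrabber are my assumptions for illustration; this is not how the program above is implemented.

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class LinkGrabber {
    public static void main(String[] args) throws Exception {
        // URL to scan; pass it as the first argument, e.g. http://bbc.co.uk
        String url = args.length > 0 ? args[0] : "http://google.com";

        // Fetch and parse the page (jsoup handles the HTTP request and HTML parsing)
        Document doc = Jsoup.connect(url).get();

        // Select every anchor tag that has an href attribute
        for (Element link : doc.select("a[href]")) {
            // abs:href resolves relative links against the page's base URL
            System.out.println(link.attr("abs:href"));
        }
    }
}
```

Compile and run with the jsoup jar on the classpath, e.g. `javac -cp jsoup.jar LinkGrabber.java` then `java -cp .:jsoup.jar LinkGrabber http://bbc.co.uk`; the exact jar name depends on the jsoup version you download. Note this only lists links on the page you give it; crawling an entire site (e.g. every Super User question) would mean following those links recursively.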
Let me know if you have any issues! And if you like it, I will (once it's in a decent state) post a link to my repo, where you'll be able to download newer versions.
benscabbia