
7/6/2015

windows 7 - Download all PDF links in a web page? - Super User



Download all PDF links in a web page?


Do you know good software to download all PDF links in a web page?
The operating system is Windows 7.
windows-7 · pdf · download · download-manager

edited Mar 20 '11 at 20:41 by studiohack

asked Mar 20 '11 at 20:20 by iAsk
3 Answers

You can use wget and run a command like this:


wget --recursive --level=1 --no-directories --no-host-directories --accept pdf http://example.com

Or with the short options:


wget -r -l 1 -nd -nH -A pdf http://example.com

UPDATE: Since your update says you are running Windows 7, use wget for Windows from a
cmd prompt.
UPDATE 2: For a graphical solution, though it may be overkill since it gets other file types
too, there is DownThemAll.
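For readers curious what the --accept pdf flag actually filters, here is a rough Python sketch of the idea (an approximation for illustration, not wget's actual implementation — wget also supports comma-separated lists and glob patterns):

```python
from urllib.parse import urlparse
import posixpath

def accepted(url, suffixes=("pdf",)):
    """Rough approximation of wget's -A/--accept suffix test:
    keep a URL only when the last component of its path ends in
    one of the given suffixes."""
    name = posixpath.basename(urlparse(url).path)
    return any(name.endswith("." + s) for s in suffixes)

print(accepted("http://example.com/paper.pdf"))   # True
print(accepted("http://example.com/index.html"))  # False
```

This is also why the initial .html page itself is fetched for link extraction but not kept on disk.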
edited Feb 28 at 20:41 by Gordon Gustafson

answered Mar 20 '11 at 20:33 by Kevin Worthington

Thank you Kevin for your advice; wget looks good. Anyway, I would prefer 'graphical' software, not
command line. :) - iAsk Mar 20 '11 at 21:09
This rejects even the initial .html page. Has it ever been tested? - dan3 Jan 21 at 18:28
The question asks about downloading all PDF links, so yes, the initial .html page will be ignored.
- Kevin Worthington Jan 21 at 21:48
Is there a possibility to do the same thing in Windows 7 using PowerShell? - Benedikt Dichgans yesterday

1. In your browser, press Ctrl+Shift+J (Chrome) or Ctrl+Shift+K (Firefox) to open the
JavaScript console, and enter:

var pdflinks = [];
Array.prototype.map.call(document.querySelectorAll("a[href$=\".pdf\"]"),
    function(e, i) { if (pdflinks.indexOf(e.href) == -1) { pdflinks.push(e.href); } });
console.log(pdflinks.join(" "));
This will return in the console:
"http://superuser.com/questions/tagged/somepdf1.pdf"
"http://superuser.com/questions/tagged/somepdf2.pdf"
"http://superuser.com/questions/tagged/somepdf3.pdf"
2. Now use wget with the command-line form wget url1 url2 ...
Copy and paste the output, open a console, type wget, press the right mouse button to paste your
clipboard content, and press Enter.
To use a download file instead, join the lines with newlines and invoke wget as follows: wget -i
mydownload.txt
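The conversion from the console's space-separated output to a newline-separated file for wget -i can be scripted; a small Python sketch (the URLs and the file name mydownload.txt are just the answer's example values):

```python
# Turn the space-separated URL list printed by the console snippet
# into a newline-separated file that `wget -i` can consume.
urls = "http://example.com/a.pdf http://example.com/b.pdf".split()

with open("mydownload.txt", "w") as f:
    f.write("\n".join(urls))

# Then run: wget -i mydownload.txt
```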

Note that most other (GUI) download programs also accept being called with a space-separated list
of URLs.
Hope this helps. This is how I generally do it. It is faster and more flexible than any extension
with a graphical UI that I have to learn and remain familiar with.


edited Nov 30 '14 at 18:15 by Franck Dernoncourt

answered Nov 7 '13 at 13:28 by Lo Sauer

Better yet, console.log('"' + pdflinks.join('" "') + '"') -- otherwise you don't actually get
quoted URLs. - dan3 Jan 21 at 18:29

If you want to stay in the browser, I've written a web extension for exactly this purpose. I'm
working on adding the ability to save scholarly article PDFs with properly formatted titles, but if
you just want to download them all, it's perfect for this.
It's called Tab Save, and it is on the Chrome Web Store here. You don't even have to input the list
of URLs if you just open them all in tabs (but for large numbers of files this might slow a
computer down, so I added the option to add your own).
answered May 26 '14 at 16:27 by Louis

http://superuser.com/questions/260087/download-all-pdf-links-in-a-web-page
