
I was mirroring a site using HTTrack (command-line version on Mac OS X's Terminal), when the mirroring suddenly stopped:

PANIC! : Too many URLs : >99999 [3031]f5641dz61e6fd4 (36896 bytes) - OK
Done.
Thanks for using HTTrack!

and then

* 
My-Names-iMac:~ username$ 

(The site I am mirroring has around 150'000 pages.) My problem is very similar to this one, although, as a beginner with command-line tools, I am not sure what I should type, and in what order, to resume the interrupted download from where it stopped without having to start from the beginning again.

After "username$", should I type

httrack i -#L1000000

or just

i -#L1000000

or

httrack 

THEN enter THEN

i

THEN enter THEN

-#L1000000?

And do I have to re-type the website address and the path to the download folder, as if I were starting a new HTTrack session?

I left the window open and didn't type anything yet because I didn't want to mess it up.

1 Answer


I just experienced a similar panic error-out message:

PANIC! : Too many URLs : >99995

I am trying this first:

httrack

A cache (hts-cache/) has been found in the directory
That means you can update faster the remote site(s)
OK to Update httrack?

Press <Y><Enter> to confirm, <N><Enter> to abort

I typed Y and pressed the Enter key.

It seems to be continuing from where it left off. It has been running for the last 15 minutes now; not sure if I will hit the same error again later.

A quick check of the files and sizes downloaded previously suggested that it did continue from where it left off.
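For anyone who wants to skip the interactive prompt, the non-interactive way to resume with a raised URL limit (along the lines of the asker's `-#L` idea) would look something like the sketch below. The folder path is an assumption; use whatever directory contains your mirror's `hts-cache/` folder.

```shell
# Assumed path: change into the folder that holds the mirror's hts-cache/ directory.
cd ~/websites/mysite

# -i resumes the interrupted mirror from the cache;
# -#L1000000 raises the URL limit (the PANIC fired at the default ~100000).
# No need to re-type the site address: it is stored in the cache.
httrack -i -#L1000000
```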

HTH whoever stumbles into this in the future! :)

Bob-bwsb88 (ran ok for 20min now)

Update 01: 35 min since it started and still running.

Observation: some downloads are marked with * at the beginning of the line, and others are marked with numbers like 1497/6451. I guess some new pages were added and existing pages got updated.

Update 02: 3hrs taken and about 500MB more downloaded. Tested some missing links/pages are now available offline! :)