I guess I'm not exactly sure what you're trying to do, but when I want to
get a local copy of a website I do this:

nohup wget -m http://www.someUrL.org &

Shawn
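If the goal is just the bookmarked pages rather than a whole-site mirror, one rough way to turn the bookmark file into a URL list is a grep/cut pipeline. This is only a sketch: the two-entry `bookmarks.html` below is made up for illustration, and a real Netscape bookmark file has more markup around the HREF attributes, but the same extraction should still apply:

```shell
# Hypothetical two-entry Netscape-style bookmark file, just for illustration:
cat > bookmarks.html <<'EOF'
<DT><A HREF="http://www.example.com/page1">Page 1</A>
<DT><A HREF="http://www.example.org/page2">Page 2</A>
EOF

# Pull every HREF target out into a plain list of URLs, one per line.
grep -io 'href="http[^"]*"' bookmarks.html | cut -d'"' -f2 > urls.txt
cat urls.txt
```

You could then hand the list to wget with `wget -p -k -i urls.txt`: -p fetches the images and stylesheets each page needs, and -k rewrites the links so the local copies display properly in a browser.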
On 12/2/05, Robert Persson <ireneshusband@××××××××.uk> wrote:
>
> I have been trying all afternoon to make local copies of web pages from a
> netscape bookmark file. I have been wrestling with httrack (through
> khttrack), pavuk and wget, but none of them work. httrack and pavuk seem to
> claim they can do the job, but they can't, or at least not in any way an
> ordinary mortal could be expected to work out. They do things like
> pretending to download hundreds of files without actually saving them to
> disk, crashing suddenly and frequently, and popping up messages saying that
> I haven't contributed enough code to their project to expect the thing to
> work properly. I don't want to do anything hideously complicated. I just
> want to make local copies of some bookmarked pages. What tools should I be
> using?
>
> I would be happy to use a windows tool in wine if it worked. I would be
> happy to reboot into Windows if I could get this job done.
>
> One option would be to feed wget a list of urls. The trouble is I don't
> know how to turn an html bookmark file into a simple list of urls. I
> imagine I could do it in sed if I spent enough time to learn sed, but my
> afternoon has gone now and I don't have the time.
>
> Many thanks
> Robert
> --
> Robert Persson
>
> "Don't use nuclear weapons to troubleshoot faults."
> (US Air Force Instruction 91-111, 1 Oct 1997)
>
> --
> gentoo-user@g.o mailing list
>
>

--
Shawn Singh