Gentoo Archives: gentoo-user

From: Helmut Jarausch <jarausch@××××××××××××××××.de>
To: gentoo-user@l.g.o
Subject: Re: [gentoo-user] maintaining clones
Date: Fri, 31 Jul 2009 07:59:02
Message-Id: tkrat.240c03e55845f0d0@igpm.rwth-aachen.de
In Reply to: Re: [gentoo-user] maintaining clones by Neil Bothwick
On 31 Jul, Neil Bothwick wrote:
> On Fri, 31 Jul 2009 09:07:14 +0200 (CEST), Helmut Jarausch wrote:
>
>> I have 4 identical machines; they only differ in the 2 files
>> /etc/conf.d/hostname
>> /etc/conf.d/net
>>
>> I'd like to maintain only one of them (updating
>> Gentoo up to several times a week)
>> and 'rsync' the other ones.
>>
>> Now, rsync'ing a live root filesystem is risky.
>> I don't see any problems for the FS holding /usr.
>
> The whole idea sounds a little risky. I'd use binary packages to keep the
> other machines up to date. Set FEATURES="buildpkg" in make.conf on each
> computer and set PKGDIR to a directory accessible by all over NFS. Run
> your normal emerge -u --whatever world on the first, then run the same
> with -k on the others. That way they all get the same updates but only
> the first has to compile them.
>
> I'd also set up distcc to reduce compile times, but that's a separate
> step.
>
Thanks for your help!
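
If I understand you correctly, your suggestion amounts to roughly the
following (the shared PKGDIR path is only an example):

  # /etc/make.conf on every machine
  FEATURES="buildpkg"
  PKGDIR="/mnt/packages"        # directory shared over NFS

  # on the first machine
  emerge -u world

  # on the other machines, reuse the binary packages built above
  emerge -uk world
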
I've done so in the past, but I've had bad experiences with it.
Unfortunately, Portage isn't that clever yet.

Many times (on the 'clones' as well) I had to block packages before
emerging and then unblock them again afterwards. I even had to unmerge
some packages temporarily and re-emerge them later on.

Sometimes I have to patch an ebuild (temporarily), and so on.

Not to mention updating baselayout.

So it would be much easier if I could effectively clone the
machines each day.
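
Concretely, I am thinking of something like the following on each clone
(the master host name and the exclude list are only examples):

  # pull the master's root filesystem but keep the two per-host files,
  # and skip the pseudo filesystems; running this against a live root
  # is exactly the risky part mentioned above
  rsync -aHAX --delete \
      --exclude=/etc/conf.d/hostname \
      --exclude=/etc/conf.d/net \
      --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp \
      master:/ /
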

Helmut.

--
Helmut Jarausch

Lehrstuhl fuer Numerische Mathematik
RWTH - Aachen University
D 52056 Aachen, Germany
