Roman Gaufman wrote:
> Personally, I find binary packages on gentoo perfect for my use. I
> have several computers on the network, all running gentoo with no gcc
> or any header files. I then have an offline machine that gets
> everything compiled and put on a shared /usr/portage for all the
> machines to just emerge -k simultaneously.
> That way when I emerge something, I can review the config files, add
> some custom options and some fancy polish and quickpkg will bundle
> everything together all pre-configured.
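For reference, that workflow would look roughly like this (the package name and
the NFS arrangement are just examples, and quickpkg's handling of /etc files
depends on your portage version and options):

```shell
# On the build machine: compile with the desired USE flags and config,
# then bundle the installed image into a binary package.
emerge app-editors/vim        # example package
quickpkg app-editors/vim      # writes a .tbz2 under $PKGDIR
                              # (default: /usr/portage/packages)

# On each gcc-less client, with /usr/portage shared (e.g. NFS-mounted)
# from the build machine:
emerge -k app-editors/vim     # -k / --usepkg installs the prebuilt package
```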
You're a very patient person, or your offline machine has at least four
3 GHz Xeons and 4 GB RAM, or you've never compiled something like
X and/or OpenOffice and some other nice little things ;-) .
What you're describing is a lucky, but I fear uncommon, situation:
you can afford an extra machine and you don't have different machines
and machine types to compile for. I think that a more common situation,
with different machines and machine types and production machines,
requires a big overhead either in CPU load or in scheduling, monitoring
and managing the compilations so as not to suffer from that CPU load.
I have done what you're describing, but in my case too it was with identical
machines and machine types, and the increase in CPU load was not a concern.
I think that in most cases, with smart monitoring, scheduling and
management, and with ccache and distcc, you can reach that goal even in
more common situations without degrading the business services the
machines provide, but there are still concerns both about security (gcc
installed everywhere to make distcc work) and about the work of managing,
monitoring, scheduling and controlling the CPU load and network.
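For what it's worth, a minimal distcc/ccache client setup would look
something like this (host addresses, cache size and job count are just
examples to adapt to your own network):

```shell
# /etc/make.conf fragment on a distcc/ccache client
FEATURES="distcc ccache"
CCACHE_SIZE="2G"
MAKEOPTS="-j8"        # roughly (local CPUs + remote CPUs), tune to taste

# The distcc helper hosts are configured separately, e.g.:
#   distcc-config --set-hosts "localhost 192.168.0.10 192.168.0.11"
# which is exactly where the security concern comes in: every host in
# that list needs gcc and a running distccd.
```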
Sorry for the beastly English, but I can write even worse French and also
very bad Italian ;-)