Gentoo Archives: gentoo-science

From: Alexey Shvetsov <alexxyum@×××××.com>
To: gentoo-science@l.g.o
Subject: Re: [gentoo-science] Re: Empi, should it be in the science overlay?
Date: Wed, 27 Feb 2008 09:18:19
Message-Id: df02a1a80802270118u6bae27eeydc8c79bc5bb92d24@mail.gmail.com
In Reply to: [gentoo-science] Re: Empi, should it be in the science overlay? by Justin Bronder
I can rewrite the ebuilds for mpich2 and mvapich2 to use them with empi and
eselect mpi.
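
As a rough sketch of what that conversion might look like (note: the eclass name, helper function, and eselect syntax below are my guesses at the API from Justin's description, not the real thing):

```shell
# Hypothetical fragment of sys-cluster/mvapich2/mvapich2-1.0.ebuild.
# "mpi" eclass and "mpi_dodir" are assumed names, not confirmed API.

inherit mpi                     # assumed eclass from the empi work

DESCRIPTION="MPI-2 implementation over InfiniBand"

src_install() {
    # Install under the implementation-specific root that empi
    # manages (crossdev-style), rather than straight into /usr.
    emake DESTDIR="${D}" install || die "emake install failed"
    mpi_dodir                   # assumed helper; name is a guess
}
```

Switching between installed implementations would then presumably be a matter of `eselect mpi set <implementation>`, via the eselect module mentioned below.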

2008/2/26, Justin Bronder <jsbronder@g.o>:
> On 26/02/08 09:44 -0800, Donnie Berkholz wrote:
> > On 10:45 Tue 26 Feb , Justin Bronder wrote:
> > > I've been spending the majority of my Gentoo-related time working on a
> > > solution to bug 44132 [1], basically trying to find a way to gracefully
> > > handle multiple installs of various MPI implementations at the same time in
> > > Gentoo. There's more information about the solution in my devspace [2], but
> > > a quick summary is that there is a new package (empi) that is much like
> > > crossdev, a new eselect module for empi, and a new eclass that handles both
> > > MPI implementations and packages depending on MPI.
> >
> > Is it enough like crossdev to share code, with perhaps a bit of
> > abstraction? Maintaining the same thing twice is rarely a good idea...
>
>
> They are similar in that they both use the same finagling with portage to
> install things to different locations, but it pretty much ends there. As
> far as sharing code goes, I can see maybe the symlink, portdir and /etc/portage
> stuff that might be shared. Given that crossdev is ~650 lines and empi is
> half that though, I'm of the opinion that it's not worth the effort. The
> majority of the work in empi is reading command-line arguments and testing to
> make sure preconditions are met.
>
>
> >
> > > So, I think I have pushed this work far enough along for it to actually be
> > > somewhat officially offered. My question, then, is where it should be
> > > located. There are several MPI packages in the science overlay already, so
> > > should I push this work there, or would it be more appropriate to make a
> > > new overlay specifically for hp-cluster?
> > >
> > > Future work related to this project will be getting all MPI implementations
> > > and dependent packages converted in the same overlay before bringing it up on
> > > -dev for discussion about inclusion in the main tree.
> > >
> > > I have no real preference either way, but the science team does already have
> > > an overlay :) Let me know what you think.
> >
> > Seems like people already committing cluster stuff to the sci overlay
> > could help; maybe they'll port packages, fix bugs, etc. With a new
> > overlay, we'd have to start from scratch, and I don't really see the
> > point.
>
>
> Pretty much sums up why I'm posting here :)
>
> --
>
> Justin Bronder
>
>

--
Gentoo GNU/Linux 2.6.23 Dual Xeon

Mail to
alexxyum@×××××.com
alexxy@××××××.ru
--
gentoo-science@l.g.o mailing list