This week's progress:
1) created an eclass function that properly supports parallel installs of
different versions of MPI software
2) implemented and tested the new eclass with different MPI implementations
3) reworked the solution for handling "/etc" configuration
4) started a framework for the eclass to build against arbitrary MPI
implementations upon installation
|
This will serve as a nice standard when building MPI software, with a
pretty easy implementation from the ebuild writer's perspective. Parallel
installs will be handy for MPI software in particular, where it takes quite
a bit of tinkering to get things to work.
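To illustrate the parallel-install idea (this is only a rough sketch, not the actual eclass code; the function name, path layout, and variables here are all hypothetical):

```shell
# Hypothetical sketch: give each MPI implementation/version pair its own
# install root so several can coexist side by side ("parallel install").
# The /usr/lib/mpi layout is illustrative, not the real eclass's layout.
mpi_install_root() {
    local impl="$1" ver="$2"
    echo "/usr/lib/mpi/${impl}-${ver}"
}

# e.g. two implementations installed in parallel, each in its own root:
mpi_install_root openmpi 4.1    # -> /usr/lib/mpi/openmpi-4.1
mpi_install_root mpich 4.2      # -> /usr/lib/mpi/mpich-4.2
```

The point is just that nothing version-specific lands in a shared path, so installs never collide.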
|
Next week's plans:
1) 2nd eclass: solidify the design and verify it with my mentors. I expect
the 2nd eclass to take a few weeks to complete.
2) properly implement the mpi-providers eclass for more MPI
implementations, and write patches accordingly
3) refactor/modularize the 1st eclass
4) start proper documentation (formatted in Markdown)
|
Next week I want to finalize the existing code and work hard to get the
next eclass (mpi-select) able to build arbitrary MPI software against
arbitrary MPI implementations. I will be revising my GSoC schedule
accordingly, since this 2nd phase was unaccounted for in my original
schedule, and this phase in particular will likely take up a good amount of
time.
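As a very rough sketch of what "selecting" an implementation could mean (purely illustrative; the function, variable names, and paths are hypothetical, not mpi-select's real interface):

```shell
# Hypothetical sketch: point a build at one MPI implementation by exporting
# its compiler-wrapper paths from a versioned install root. The root layout
# and variable names are assumptions for illustration only.
mpi_select() {
    local root="/usr/lib/mpi/$1"
    export MPI_ROOT="${root}"
    export MPICC="${root}/bin/mpicc"
    export MPICXX="${root}/bin/mpicxx"
    export PATH="${root}/bin:${PATH}"
}

mpi_select openmpi-4.1
echo "${MPICC}"    # -> /usr/lib/mpi/openmpi-4.1/bin/mpicc
```

An ebuild could then build the same package against whichever implementation was selected.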
|
Thanks, |
Michael |