Well, OK, I've made my own ebuild for openib-drivers v1.4 (attached).
I tried it, and it compiles except with the ehca flag (it won't compile
with that enabled). If anyone's interested, please check it and put it
into the overlay.
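In case it helps anyone trying the ebuild, the non-building ehca support can simply be kept off via package.use until it is fixed (a sketch; sys-infiniband/openib-drivers is an assumed category/name, adjust it to match wherever the attached ebuild is placed):

```shell
# /etc/portage/package.use
# Keep the non-building ehca USE flag disabled for the local ebuild.
# (The category/package name below is an assumption.)
sys-infiniband/openib-drivers -ehca
```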

I have another question: I have big PERFORMANCE problems with InfiniBand
on Gentoo, no matter whether I use the in-kernel drivers or the
openib-drivers ebuild. Here is what the OSU latency benchmark says.

First, OSU over Gigabit Ethernet (for comparison):
# OSU MPI Latency Test v3.1.1
# Size        Latency (us)
0             76.37
1             77.77
2             77.66
4             77.92
8             77.95
16            77.98

InfiniBand, with the drivers from the openib-drivers ebuild:
# OSU MPI Latency Test v3.1.1
# Size        Latency (us)
0             50.23
1             52.10
2             51.99
4             51.62
8             51.79
16            52.17

InfiniBand, with the drivers from kernel 2.6.27: the results are almost
the same, except that wherever the table above shows fifty-something
(5x.xx), the kernel drivers give forty-something (4x.xx).

InfiniBand on CentOS (installed from the OFED RPMs):

# OSU MPI Latency Test v3.1.1
# Size        Latency (us)
0             1.65
1             1.82
2             1.81
4             1.72
8             1.60
16            1.61

So the last one is perfect; InfiniBand works as expected on CentOS.

In my earlier tests the benchmarks suffered a terrible performance loss
over InfiniBand on Gentoo (the same with the OSU bandwidth benchmark:
500-750 MB/s on Gentoo vs. 1400 MB/s on CentOS).

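For reference, the OSU numbers above come from two-process runs along
these lines (a sketch; the hostnames and binary paths are placeholders,
and the flags assume Open MPI's mpirun syntax):

```shell
# Hypothetical two-node run; node01/node02 and the paths are placeholders.
mpirun -np 2 --host node01,node02 ./osu_latency
mpirun -np 2 --host node01,node02 ./osu_bw
```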
Any suggestions? Maybe some packages from this overlay are doing
something wrong?

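For what it's worth, the link and MPI build state on the Gentoo nodes
can be inspected like this (a sketch; it assumes Open MPI and the
standard OFED diagnostic tools are present, since a ~50 us zero-byte
latency looks more like traffic going over IPoIB than over native verbs):

```shell
# Check that the HCA port is Active and at the expected rate.
ibstat
ibv_devinfo | grep -i state     # expect PORT_ACTIVE

# Check that the MPI library was built with InfiniBand support
# (assumes Open MPI; for MVAPICH, inspect its configure flags instead).
ompi_info | grep btl            # the "openib" BTL should be listed
```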
--
Dept of Computational Biophysics & Bioinformatics,
Faculty of Biochemistry, Biophysics and Biotechnology,
Jagiellonian University,
ul. Gronostajowa 7,
30-387 Krakow, Poland.
Tel: (+48-12)-664-6380