Michael Orlitzky <mjo <at> gentoo.org> writes:

> Yes, and you can replace "spark-1.1.0" by ${P} in the path as well. The
> link that Bryan posted has a list of all of the variables that are
> available. You can go pretty crazy with some of them, but in this case
> the only other thing I would replace is "spark" by ${PN}.

OK, that's behind me now.

The build fails, so I figured I'd build it manually and then finish the
ebuild. I went to:

/var/tmp/portage/sys-cluster/spark-1.1.0/work/spark-1.1.0

and found no configure scripts.

The README.md has this:

    Spark is built on Scala 2.10. To build Spark and its example programs, run:

        ./sbt/sbt assembly

I did, and it looks like the manual compile worked:

[info] Packaging
/var/tmp/portage/sys-cluster/spark-1.1.0/work/spark-1.1.0/examples/target/scala-2.10/spark-examples-1.1.0-hadoop1.0.4.jar
...
[info] Done packaging.
[success] Total time: 786 s, completed Sep 20, 2014 3:04:22 PM

So I need to add commands to the ebuild to launch "./sbt/sbt assembly".

I've been all over man 5 ebuild and the devmanual, so I've probably seen
what to do but missed it.

Suggested reading (which section) or syntax is most welcome.
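
For what it's worth, here is a minimal sketch of what I think the phase
function would look like. This is only a guess on my part, not tested in
a real ebuild: it assumes the bundled sbt launcher in ${S} is acceptable
to run directly (which may not fly for a proper tree ebuild, since sbt
fetches jars from the network at build time, and the build would need
network access restrictions lifted or the jars pre-fetched):

```shell
# Sketch only: override the default src_compile phase to run the build
# command that README.md describes, from the unpacked source dir (${S}).
src_compile() {
        # die aborts the build with a message if sbt fails
        ./sbt/sbt assembly || die "sbt assembly failed"
}
```

If that's roughly right, the relevant devmanual section would be the one
on ebuild phase functions (src_compile in particular).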

James