On 12/04/2014 02:02 PM, Alex Brandt wrote:
> Hey James,
>
> I've removed the original content for length but I love the
> ideas you've put together for an overarching testing strategy.
>
> I've begun work on a small ebuild testing framework, etest [1],
> that I believe fits into your model quite well. It uses Docker
> images for isolation and repeatability. It allows developers to
> verify the install-time and some run-time behavior of ebuilds in
> an automated fashion. It's also extremely alpha, and I find new
> issues every time I use it.
>
> That all being said, I would love feedback, and if anyone is brave
> enough to use it I would love to start cataloging the issues that
> people find on GitHub.
|
This looks awesome. I don't want to commit too early, but after this
semester is over I think I can run it on a few ebuilds (inside Jenkins)
and help fix issues.
|
>
> James, if this doesn't fit your vision then I apologize for the
> tangential reply to your thread.
>
> [1] https://github.com/alunduil/etest
>
> Thanks,
>
|
It'd be cool if Gentoo had some sort of automated workflow (with
Jenkins, Buildbot, whatever) like this:

1. Receive a pull request
2. Detect the changed ebuilds
3. Test each changed ebuild with etest
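A rough sketch of what steps 2 and 3 might look like in a CI job. This is purely hypothetical: it assumes the pull request is available as a git branch of the ebuild repository, and the `etest` invocation at the end is a placeholder for whatever etest's actual CLI turns out to be.

```shell
#!/bin/sh
# Hypothetical pull-request check: map the files a pull request changed
# to the category/package directories whose ebuilds were touched, then
# run etest in each. The etest call itself is an assumption, not its
# documented interface.

# Print the unique category/package directory of every changed ebuild,
# reading newline-separated file paths on stdin.
pkg_dirs_of() {
    grep '\.ebuild$' | sed 's|/[^/]*\.ebuild$||' | sort -u
}

# In a real job the changed paths would come from git, e.g.:
#   git diff --name-only origin/master...HEAD | pkg_dirs_of
# Here we feed a sample diff instead:
printf 'app-misc/foo/foo-1.0.ebuild\nprofiles/README\n' |
pkg_dirs_of |
while read -r pkgdir; do
    echo "would run: (cd $pkgdir && etest)"
done
```

The only non-obvious piece is collapsing multiple ebuilds of the same package into one directory (`sort -u`), so a version bump that touches two `.ebuild` files still triggers a single test run.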

This will take a lot of work to set up, and depending on my workload
next semester, I would love to help out as much as possible.

Alec