Dirkjan Ochtman wrote:
> On Mon, Jun 27, 2011 at 15:08, Fabian Groffen <firstname.lastname@example.org> wrote:
>> On 27-06-2011 14:28:34 +0200, Dirkjan Ochtman wrote:
>> It would be nice when a similar technique could be implemented only
>> once, in a consistent way. In a way, multilib-portage can be considered
>> equal to one of the objectives of the python (and ruby) eclass:
>> multiple times compiling and installing for different ABIs.
> Yeah, but it'd be nice not to have to compile subversion three times
> just because we want the python bindings installed in three different
> Python environments.
You won't be able to prevent this with a general solution, only with a specialized solution inside
the build system, if you really want and need that.
Currently, if you want python bindings for three different python environments, you have to build
everything for all three environments, even if you only need a subset. Moving to a system similar to
ruby's, in the long term perhaps based on a general PM implementation, would at least allow users to
select the targets per package instead of only globally as now.
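To illustrate what per-package target selection could look like, here is a hypothetical sketch modeled on how RUBY_TARGETS works today (the PYTHON_TARGETS variable and the flag names are assumptions, not an existing mechanism):

```shell
# /etc/portage/make.conf -- hypothetical global default set of interpreters
PYTHON_TARGETS="python2_7 python3_2"

# /etc/portage/package.use -- per-package override narrowing the set,
# so subversion's bindings are only built for one interpreter:
dev-vcs/subversion python_targets_python2_7 -python_targets_python3_2
```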
> On Mon, Jun 27, 2011 at 17:53, Petteri Räty <email@example.com> wrote:
>> I like the ruby approach for the reason that it doesn't require users to
>> run update scripts like python-updater.
> Sure, but if that means the developers now have to bump every package
> in the tree when a new version of Python comes out, I'm not sure
> that's the best trade-off.
For multilib-portage, packages don't need to define the possible cross-compile targets; they have all
possible options in the USE flag list. Something similar could be done for this case.
Simplified: define the range of supported python versions in the dependency section and let the
eclass or PM sort out the rest (you of course need a bit more, but this should be the only
required dependency/python-version related part).
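As a rough sketch of that idea (hypothetical ebuild fragment; the PYTHON_COMPAT variable and the eclass name are assumptions to illustrate the point, not an existing implementation):

```shell
# Hypothetical ebuild fragment: the only python-version-related part
# the package itself would have to declare.
EAPI=4

# Range of interpreters the package is known to work with; the eclass
# or PM would expand this into per-target USE flags and dependencies.
PYTHON_COMPAT="python2_5 python2_6 python2_7"

inherit python-distutils

# Everything else (RDEPEND on the selected interpreters, looping the
# build and install phases per target) would come from the eclass.
```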