Gentoo Archives: gentoo-dev

From: Alexis Ballier <aballier@g.o>
To: gentoo-dev@l.g.o
Subject: Re: [gentoo-dev] HDEPEND (host dependencies for cross-compilation) for EAPI 5?
Date: Wed, 05 Sep 2012 13:30:02
Message-Id: 20120905102830.2a624c65@gentoo.org
In Reply to: Re: [gentoo-dev] HDEPEND (host dependencies for cross-compilation) for EAPI 5? by Rich Freeman
On Wed, 5 Sep 2012 08:46:13 -0400
Rich Freeman <rich0@g.o> wrote:

> On Wed, Sep 5, 2012 at 7:27 AM, Ciaran McCreesh
> <ciaran.mccreesh@××××××××××.com> wrote:
> > Uhm. O(n) == O(n/2). Anything assuming they're different is just
> > wrong.
>
> We're basically debating definitions. O notation is used to indicate
> how algorithms scale, and nobody uses O(n/2) and such as a result.
>
> An algorithm that is twice as slow is still twice as slow. That might
> or might not matter. However, this isn't the domain where O notation
> is used. You use O notation when your algorithm takes 30 seconds to
> run and you want to know what happens when the dataset doubles 3000
> times. It generally doesn't matter if the result is that it will take
> 1 trillion years to run or 2 trillion years. You care more about
> whether it will take minutes, hours, weeks, years, or whatever.
>
> I can't really think of any practical examples where multiplying the
> time to parse a list of maybe 50 items vs 5 lists of 10 items is going
> to make that big of a difference. They're just lines in a text file -
> your CPU can compare a few billion characters per second. Sure, if
> you add 75 layers of abstraction you might be able to find just the
> right point where a factor of 5 is going to make it intolerable but a
> factor of 1 is almost acceptable, but go ahead and add/remove a few
> layers and suddenly it is all fine or all horrible anyway. That is a
> bit contrived. That's why everybody ignores constant factors in O
> notation anyway.

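(A throwaway Python sketch of that point - the package atoms below are
made up, and this is an illustration, not a serious benchmark: at this
scale, parsing one 50-token string vs five 10-token strings is noise.)

import timeit

# One made-up dependency string of 50 tokens, and the same tokens
# split across five strings of 10.
one_list = " ".join("cat/pkg-%d" % i for i in range(50))
five_lists = [" ".join("cat/pkg-%d" % i for i in range(j, j + 10))
              for j in range(0, 50, 10)]

t1 = timeit.timeit(lambda: one_list.split(), number=100000)
t5 = timeit.timeit(lambda: [s.split() for s in five_lists],
                   number=100000)
print("one list of 50 tokens:   %.3fs" % t1)
print("five lists of 10 tokens: %.3fs" % t5)
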
Ewww. t_n = O(n) means "there exists a constant C such that
t_n <= C n"; it is just nonsensical to talk about O(n) vs O(n/2):
take 2C instead of C. If you care about the constant C, then don't
talk about O(). Measure it in terms of CPU cycles, for example, but
then it becomes dependent on a lot of things (compiler, hardware,
etc.).
However, it can be proved that (with a sane implementation) all these
differences scale only by a constant factor between two
implementations; this is why, when you talk about algorithms rather
than about execution time on given hardware, the O() asymptotic
estimates are the ones that make sense.
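(A rough Python sketch of that constant-factor claim - the numbers it
prints are illustrative, not a benchmark. Two implementations of the
same O(n) task differ by a roughly constant factor, and doubling n
roughly doubles both; the constant moves the curve without changing
its shape.)

import timeit

def slow_sum(xs):
    # Hand-rolled loop: same O(n) work as sum(), bigger constant.
    total = 0
    for x in xs:
        total += x
    return total

for n in (10**5, 2 * 10**5, 4 * 10**5):
    xs = list(range(n))
    t_fast = timeit.timeit(lambda: sum(xs), number=20)
    t_slow = timeit.timeit(lambda: slow_sum(xs), number=20)
    print("n=%7d  sum()=%.4fs  loop=%.4fs  ratio=%.1f"
          % (n, t_fast, t_slow, t_slow / t_fast))
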
O() does not say anything about execution time, but it gives a good
idea of how your algorithm will scale in the future: an O(n) algorithm
will be able to process twice as much data in the same time once your
hardware is twice as fast, while an O(2^n) one will only be able to
process one more item.
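(A back-of-the-envelope Python check of that last claim, assuming one
abstract "operation" per unit of work: hardware twice as fast buys
twice as many operations in the same wall-clock time. The budget
figure is arbitrary.)

budget = 10**6  # operations affordable on today's hardware

for name, cost, n in (("O(n)", lambda k: k, 10**6),
                      ("O(2^n)", lambda k: 2**k, 19)):
    # n is the largest input that fits the current budget; find the
    # largest m that fits twice the budget.
    m = n
    while cost(m + 1) <= 2 * budget:
        m += 1
    print("%-7s n=%d -> n=%d with 2x hardware" % (name, n, m))
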

A.