On 10.01.2014 19:19, Ciaran McCreesh wrote:
> On Fri, 10 Jan 2014 14:18:24 +0100
> René Neumann <lists@××××××.eu> wrote:
>> And again: What is needed is streamlining the algorithms (discussion
>> on that already started as far as I could notice). An algorithm in
>> O(n³) is always¹ worse than O(n). The constant factor added by the
>
> Full dependency resolution is NP-hard...

Though NP-hard does not necessarily mean 'unfeasible in practice'.

> If you really wanted to
> streamline the algorithms, you'd change the inputs slightly. Most of
> the complexity of doing a decent job of dependency resolution comes from
> dealing with crap input.

My intention really was not to tell anybody how the depres algorithms
should be improved. I just wanted to make a point against the 'Python is
the root of all the bad performance' claim.

>> ¹ For a large enough n.
>
> ...which could be larger than the number of atoms in the universe.
> There are real world cases where an algorithm has an O(n^3) step that
> takes a day, and an O(2^n) step that takes a second for most inputs. You
> shouldn't be using O() to make performance comparisons.
>

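To make that point concrete: a toy Python sketch with invented cost constants (both functions and their constants are made up for illustration), showing how far out the crossover between a cheap O(n³) step and an expensive O(n) step can sit:

```python
# Hypothetical cost models, purely for illustration.
def cubic_cost(n, c=1e-9):
    """A tight inner loop: O(n^3) but with a tiny constant."""
    return c * n ** 3

def linear_cost(n, c=1.0):
    """Heavy per-item work: O(n) but with a large constant."""
    return c * n

# Find (by doubling) the first n where the O(n^3) step actually
# becomes slower than the O(n) step under these constants.
n = 1
while cubic_cost(n) < linear_cost(n):
    n *= 2

print(n)  # with these constants, the crossover is past n = 16384
```

So whether 'large enough n' ever occurs depends entirely on the constants, which O() deliberately throws away.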
Point taken. I should've used 'in general' instead of 'always'. What I
had in mind when writing this were smaller problems, where a good
algorithm exists but people use their own naïve implementation/data
structure.
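A minimal Python sketch of that pattern (the package names and helper functions are made up): repeated membership tests against a list make the loop O(n·m), while building a set once brings it down to O(n + m) — same result, no clever algorithm required, just the right data structure.

```python
# Made-up data standing in for an installed-package database.
installed = ["pkg-%d" % i for i in range(10000)]

def count_installed_list(wanted, installed):
    # Naive: each 'in' on a list is an O(n) linear scan.
    return sum(1 for w in wanted if w in installed)

def count_installed_set(wanted, installed):
    # Build the set once; each 'in' is then O(1) on average.
    s = set(installed)
    return sum(1 for w in wanted if w in s)

wanted = ["pkg-1", "pkg-42", "no-such-pkg"]
print(count_installed_list(wanted, installed))  # 2
print(count_installed_set(wanted, installed))   # 2
```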
- René |