On Tue, 18 Feb 2014 22:53:12 +0400
the <the.guard@××××.ru> wrote:

> On 02/18/14 17:56, Gevisz wrote:
> > On Mon, 17 Feb 2014 23:30:42 -0600 Canek Peláez Valdés
> > <caneko@×××××.com> wrote:
> >
> >> On Mon, Feb 17, 2014 at 8:05 PM, Gevisz <gevisz@×××××.com>
> >> wrote: [ snip ]
> >>> How can you be sure if something is "large enough" if, as you
> >>> say below, you do not care about probabilities?
> >>
> >> By writing correct code?
> >
> > No, by arguing that fixing bugs in a 200K-line program is as easy
> > as fixing a bug in 20 10K-line programs. It is just not true; quite
> > the opposite.
> >
> >>>>> SysVinit code size is about 10 000 lines of code, OpenRC
> >>>>> contains about 13 000 lines, systemd — about 200 000
> >>>>> lines.
> >>>>
> >>>> If you take into account the thousands of lines of shell code
> >>>> that SysV and OpenRC need to match the functionality of
> >>>> systemd, they use even more.
> >>>>
> >>>> Also, again, systemd has a lot of little binaries, many of
> >>>> them optional. The LOC of PID 1 is actually closer to SysV
> >>>> (although still bigger).
> >>>>
> >>>>> Even assuming systemd code is as mature as sysvinit or
> >>>>> openrc (though I doubt this), you can calculate the
> >>>>> probabilities of segfaults yourself easily.
> >>>>
> >>>> I don't care about probabilities;
> >>>
> >>> If you do not care about (= do not know anything about)
> >>> probabilities (and mathematics in general), you are just
> >>> unable to understand that debugging a program with 200K lines
> >>> of code takes
> >>>
> >>> 200000!/(10000!)^20
> >>>
> >>> times more time than debugging 20 different programs with 10K
> >>> lines of code. You can try to calculate that number yourself,
> >>> but I am quite sure that if the latter can take, say, 20 days,
> >>> the former can take millions of years.
> >>>
> >>> It is all about probability! Or, to be more precise,
> >>> combinatorics.
> >>
> >> My PhD thesis (which I will defend in a few weeks) is in
> >> computer science, specifically computational geometry and
> >> combinatorics.
> >
> > It is even more shameful for you not to understand such simple
> > facts from elementary probability theory (which is mostly based
> > on combinatorics).
>
> TBH, I don't understand your estimate. Where did permutations come
> from? Are you comparing all the different combinations of lines of
> code?

I just wanted to convey that, if one program is n times longer than
another, it is not, in general, true that it will take only n times
more time to find a bug in it. The dependence here is nonlinear, with
much steeper growth than linear, because the number of possible ways
for the code to go wrong grows in proportion to the number of
permutations, not necessarily of lines, but at least of some other
units whose count is roughly proportional to the number of lines.
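For what it is worth, the ratio quoted above, 200000!/(10000!)^20, is
the multinomial coefficient counting the ways to split 200 000 lines
into 20 blocks of 10 000. Its sheer size can be estimated without ever
materializing the factorials, e.g. via the log-gamma function; a small
Python sketch (the function name and the bounds of the check are mine,
purely for illustration):

```python
from math import lgamma, log

def log10_multinomial(n, parts):
    """log10 of n! / (k1! * k2! * ...), computed with lgamma so the
    enormous factorials themselves are never evaluated."""
    assert sum(parts) == n
    return (lgamma(n + 1) - sum(lgamma(k + 1) for k in parts)) / log(10)

# the thread's comparison: one 200 000-line program
# vs. twenty independent 10 000-line programs
magnitude = log10_multinomial(200_000, [10_000] * 20)
print(f"200000!/(10000!)^20 is roughly 10^{magnitude:.0f}")
```

The value comes out around 10^260000, which at least illustrates how
explosively the count of orderings grows, whatever one thinks of it as
a model of debugging effort.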