I do not mean to attack your idea, but there is a simple question that
your design does not answer:

What happens when you get caught?

The whole point of your system is to protect the user's data if the CD
falls into other hands. But today's legal context requires you to hand
over the key.

- With your system on my livecd, I can legally get two years in Europe,
plus a huge fine, if I keep my password secret. So I will hand over the
key to get something less than those two years, and everyone would do
the same, even you, I guess. Your design is like writing on the CD
itself that it is encrypted: the first guy who boots it finds out that
you have something to hide!

So in the end, the whole design of such a system falls to pieces. I
mean, it's alright for the university or your friends, but it is simply
not enough against legal constraints, which is my main concern. Let's
face the worst enemy: cryptography simply fails to protect valuable
information because of yourself. Be realistic: will you really keep
your mouth shut when that legal voice offers the trade, "give us the
password or it's two years"?

- With my system on my livecd, the overall clear side is part of the
stealth concept: keep it looking as ordinary as possible from the
outside; the less change, the better. Then I just copy my binary under
some irrelevant name into /usr/bin and my container file somewhere in
/usr/lib, and there we go. From a forensic-analysis perspective it is
much harder to reveal the encrypted material: you first have to
actually find it (a few bytes among gigabytes). How can charges be
pressed without evidence? And when the evidence is found as a file, you
have several containers to fall back on: you give away a first, single
password that decrypts a different part than the one you are actually
protecting.
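
To make the fallback-container idea concrete, here is a minimal sketch
(my own illustration, not denyfs's actual on-disk format; the block
size, the SHA-256 slot mapping and the function name are all
assumptions) of how a password can select its own region inside one big
container of random-looking data:

```python
import hashlib

BLOCK = 4096  # assumed block size, purely illustrative


def container_offset(password: str, volume_size: int, container_size: int) -> int:
    """Deterministically map a password to a byte offset in the volume.

    Each password selects its own region; without the password, that
    region looks no different from any other patch of random data.
    """
    digest = hashlib.sha256(password.encode()).digest()
    # Number of block-aligned positions where the container could start.
    slots = (volume_size - container_size) // BLOCK
    slot = int.from_bytes(digest[:8], "big") % slots
    return slot * BLOCK


# The decoy password opens one region, the real one another; handing
# over the decoy password proves nothing about further containers.
decoy = container_offset("decoy password", 1 << 30, 64 << 20)
real = container_offset("real password", 1 << 30, 64 << 20)
print(decoy, real)
```

Because every region is indistinguishable from unused random filler,
giving away the decoy password does not even reveal whether a second
region exists.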

Finally, the comparison with TrueCrypt is unfair (to TrueCrypt),
because it can only nest up to two containers inside each other,
whereas denyfs goes up to 16 containers, each with its own password. It
is programmed that way, but again the setup has to take that 15% data
ratio into account, or you will start fighting statistics, which is
your worst enemy. TrueCrypt does not deal with this issue and never
will, because math cannot solve this problem: the probability of
collision increases exponentially with the number of containers and
with the size of the main container. And yes, denyfs does break every
so often, by overwriting some bits here and there, say 1 time in 10,
but that is exactly what makes it so powerful and non-reversible. In
the end it is a human choice: the user trades off the size of his main
container against the number of sub-containers he will create to fit
the specific amount of data that needs hiding. That is the missing
parameter in the problem, and the reason no such perfect system will
ever be implemented.

If you have containers A and B that do not know of each other's
existence, how can you ensure they won't overwrite bits of one another?
You simply can't! You fall back on statistics, and the only way to
optimize integrity is to use a huge main container and work on, say, 5%
of its size at most. On top of that, the 5% rate needs to drop each
time you create another sub-container.
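
The statistics argument can be put in numbers with a toy model (again
my own illustration, not denyfs's real math): if each container
scatters its blocks uniformly and independently across a large volume,
a block of one container is overwritten by another container with
probability roughly equal to that container's fill fraction:

```python
def expected_loss(fill_fractions, which):
    """Toy model: containers scatter their blocks uniformly and
    independently over a large volume. A block of container `which`
    survives container j with probability (1 - f_j), so the expected
    overwritten fraction is 1 minus the product of the survivals."""
    survive = 1.0
    for j, f in enumerate(fill_fractions):
        if j != which:
            survive *= 1.0 - f
    return 1.0 - survive


# Two containers at 5% each: each expects to lose about 5% of its blocks.
print(expected_loss([0.05, 0.05], 0))
# Sixteen containers sharing a 15% data ratio: each expects to lose
# about 13%, which is why the fill rate has to drop as sub-containers
# are added.
print(expected_loss([0.15 / 16] * 16, 0))
```

Under this model the only way to keep losses survivable is exactly the
trade described above: a huge main container with a tiny fill fraction
per sub-container.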

Cryptography is an open door to investigation; the first question is
"wtf does he want to hide?" If you can keep that thought from ever
forming in your "opponent's" mind, then it is very highly probable that
the material that needs to be hidden will stay hidden!

That's the key concept of military data flow over enemy lines (sounds
heavy, but it's true): today's military IT is largely inspired by an
Oxford professor, the one who wrote the first public implementation,
forked on ext2, called StegFS (he's easy to find). The point is to
spare messengers torture (physical and psychological) when they get
caught with encrypted data.

Anyway, I have made my point: the "best" technology for this job exists
and is available, and it does *not* follow your design, but something
much more subtle.

Good work on your stuff; it's fine for countries that do not yet have
IT laws like the ones mentioned (Africa, South America except Brazil,
and I'm sure a couple of penguins at the North Pole), but for European
and US citizens this kind of protection has no legal dimension, and it
will fall to pieces in a real governmental situation.

I'm looking forward to a merge into the official catalyst project.

Thanks for reading.


On 7/1/07, Nelson Batalha <nelson_batalha@××××.pt> wrote:
>
> I would like to quote these two statements:
>
> http://gentoo-wiki.com/SECURITY_System_Encryption_DM-Crypt_with_LUKS#Two_things_to_remember
>
> Thanks for your help, but:
>
> > It does not protect more the user while he uses it nor from
> > potential "after-use" trails.
>
> So? Was I supposed to release a complete secure solution right now? :P
>
> > Either you lose the livecd
> > along with your identity (or data that leads to your identity) and
> > you get caught or while using the software you get caught (like
> > your TOR connections have been detected). The only purpose and
> > advantage encryption would have is to
> > obfuscate some passwords like in the firefox example you gave.
>
> The idea is that with this livecd you're on the move: boot the cd, use
> tor and go away asap once finished. Make sure all your sensitive data
> is sent in a package just before leaving. If you lose the cd or
> someone looks at it, they won't suspect much.
>
> > The real solution to your problem would be to use a steganographic
> > layer ( http://en.wikipedia.org/wiki/Steganography[1] ) .
>
> It's not like I didn't remember steganography; read below.
>
> > You will not find much (I mean actual real software) besides some
> > linux-2.2 tweak over ext2 "proof-of-concept" (10years old
> > not stable unreliable)
>
> False? Look for TrueCrypt.
>
> > I think that encryption has nothing to do with hiding. In the
> > contrary, it is like a big flag standing saying "hey look at
> > me I got something to hide, come and get me!". It is just
> > obfuscating technology.
>
> With the crypt_silent option, how likely are you to be caught?
> Just put some binaries of emacs and so on in the root, and demonstrate
> in the fake root that that's what it is for. It is a good hiding
> technique I think, but not perfect.
>
> The thing is, given the low probability of being caught, whether the
> squashfs uses steganography or not, some large file would be there,
> and if they're good enough to realize it is a bootable livecd forcing
> a fake boot, then they're good enough to see that a big closed file
> is there.
>
> Unless one creates multiple hidden volumes inside this one, or just
> hides some files inside the root. But we're back to less usability
> and we're being forced to use truecrypt (I don't see a currently free
> maintained option).
>
> If we accept the Truecrypt restrictions (haven't read everything, but
> it's not gpl so I assume they're more restrictive :P), we could
> implement these several layers of encryption and increase
> functionality with some scripts hidden on a pen drive, for example.
> But putting programs like firefox+torplugin+tor+privoxy in them, and
> splitting them into small files, is a lot of work. This implementation
> is good enough for most cases. Also Luks is well maintained and GPL.
>
> > Now, from a legal point of view, being caught with an encrypted
> > material whether livecd or not in major countries
> > (UK,GER,FR,US,china) requires from you the decryption key
>
> Fine for me; don't do anything illegal in free countries. As for the
> China example, just do as in my second point and use the following
> idea: encrypt with luks as it is, and for the more sensitive files
> use steganography software in a separate volume (like a usb pen). If
> they ask you for the key, give it to them and show just some more
> innocent files you were hiding.
>
> It's better than having the cd almost all open, again, because you
> may lose it.
>
> Let me know if I'm wrong or if you have more ideas ;)
>
> Cheers,
> Nelson
> --
> gentoo-catalyst@g.o mailing list