
From: Rich Freeman <rich0@g.o>
To: gentoo-user@l.g.o
Subject: Re: [gentoo-user] why both /usr/lib and /usr/lib64 on a 64bit system?
Date: Mon, 15 Feb 2021 13:09:18
Message-Id: CAGfcS_kKgAuZ8TqBs6iq0hcxe=nZzSfsmxHkSz+Z1M1nGNptww@mail.gmail.com
In Reply to: Re: [gentoo-user] why both /usr/lib and /usr/lib64 on a 64bit system? by Walter Dnes
On Mon, Feb 15, 2021 at 3:17 AM Walter Dnes <waltdnes@××××××××.org> wrote:
>
> On Sun, Feb 14, 2021 at 06:09:58PM -0700, Grant Taylor wrote
> > On 2/14/21 10:51 AM, Jack wrote:
> > > I don't think you can completely get rid of it.
> >
> > My (long term) desire is to do away with /lib32 and /lib64, ultimately
> > only using /lib. Likewise for the other library directories in /usr or
> > wherever they are. I don't see a need for the specific bit variants in
> > the future.
>
> How long before we see /lib and /lib64 *AND* /lib128 ?

Well, anything is possible, but it seems unlikely. If it happens
soon, chances are multilib will still be a thing, so less stuff will
break than when amd64 was introduced. If it happens in a century when
we're all running no-multilib, then we'll be reinventing the wheel.

The main things that drove amd64, though, were:
* increasing the number of registers available
* allowing direct access to more than 4GB of RAM, the 32-bit limit
(or a fraction of that, depending on the OS design)

I suspect the first is less of a concern these days - compilers
generally only need so many registers, and when instructions are
added that need more register space they tend to come with registers
to accommodate them.

The second will become a concern when exabyte-scale working sets are
common. Note that current processors generally can't handle this much
address space, but the amd64 instruction set itself can (I think), so
the CPUs can continue to scale up.

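(Concretely: today's x86-64 chips typically implement 48, or 57 with
5-level paging, bits of virtual address even though pointers are 64
bits wide. On Linux you can read the implemented widths out of
/proc/cpuinfo; a minimal Python sketch, assuming the usual x86
format:)

  # Print how many address bits this CPU actually implements (Linux/x86).
  # Typical output: "address sizes : 39 bits physical, 48 bits virtual"
  with open("/proc/cpuinfo") as f:
      for line in f:
          if line.startswith("address sizes"):
              print(line.strip())
              break
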
RAM capacity doesn't really seem to be increasing in recent years -
I'm not sure whether that is more market-driven or a technological
limitation. RAM speed has improved somewhat, especially in niches
like GPUs. Computers with 1GB of RAM were a thing in Y2K, and today
it is pretty uncommon for a standard desktop to have more than 8GB;
if you want to cram more than about 128GB into a motherboard you
start needing enterprise-grade hardware. That isn't a very large
increase in 20 years - roughly a doubling every 3 years in terms of
max capacity. We're using 37 bits today (on desktops: 128GB = 2^37
bytes), so at 3 years per bit that is another ~80 years until we
exhaust 64 bits, assuming we continue to grow exponentially at the
same rate.

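Back-of-envelope in Python, in case anyone wants to check me (the
inputs are the rough figures above, not measurements):

  import math

  ram_2000 = 2**30   # ~1GB in a high-end Y2K desktop
  ram_2020 = 2**37   # ~128GB, roughly today's desktop ceiling

  doublings = math.log2(ram_2020 / ram_2000)  # 7 doublings in ~20 years
  years_per_bit = 20 / doublings              # ~2.9 years per address bit
  headroom = 64 - math.log2(ram_2020)         # 27 bits left
  print(f"~{headroom * years_per_bit:.0f} years to exhaust 64 bits")
  # prints "~77 years" - call it 80 - at the same exponential rate
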
Though you do have to think about what use cases actually need that
kind of working set. At 64-bit depth, 300dpi 3D graphics would
require about 200MB/in^3. If you had a house-sized VR space (20k
ft^3) rendered at that detail you'd need on the order of 7PB of RAM
to store a single frame, which is still only about 53 bits. Maybe if
you want a holodeck that 1000 people can play around in at once you'd
run into the 64-bit limit (of course you'd have a ton of IO issues to
fix long before then).

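Same caveat on the voxel numbers; a sketch with the assumed inputs
(300dpi, 8 bytes per voxel, a 20k ft^3 space):

  import math

  voxels_per_in3 = 300**3             # 300dpi in all three dimensions
  bytes_per_in3 = 8 * voxels_per_in3  # 64-bit depth -> ~216MB/in^3
  house_in3 = 20_000 * 12**3          # 20k ft^3 in cubic inches
  frame_bytes = bytes_per_in3 * house_in3
  print(f"{bytes_per_in3 / 1e6:.0f} MB/in^3, "
        f"{frame_bytes / 1e15:.1f} PB/frame "
        f"(~{math.log2(frame_bytes):.0f} bits)")
  # prints "216 MB/in^3, 7.5 PB/frame (~53 bits)"; a thousand such
  # holodecks is ~7.5 EB, nudging the ~16 EB that 64 bits can address
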
So, that makes me wonder what the practical requirements are in order
to implement The Matrix. :) Of course, if you're sticking people in
it maybe you can borrow some of their own memory capacity and
processing abilities to drive it. Kind of makes you wonder why you'd
even need the human brains in the first place if you're able to deal
with that kind of data in a simulation...

--
Rich