Free software has a problem.
Posted: Tue May 13, 2025 5:32 pm
Hello. I didn't know where to post this message. Most internet forums - especially Finnish ones - are full of snobs and cyberbullies who make it impossible to have a civilized discussion about this matter. I noticed that MX Linux still plans to support 32-bit targets in the future, so I thought this forum might be a good place.
To the subject at hand. I have noticed that most computer programs grow significantly in binary size when they are compiled for a 64-bit target compared to a 32-bit one. Since the default operand size in x86 is 32 bits in both 32- and 64-bit mode, the difference comes mostly from the larger pointers (plus the extra instruction prefixes that 64-bit code needs). In my experience 64-bit programs are usually about 15% larger than 32-bit programs; the exact difference depends on how heavily pointers are used.
The data also needs more memory in 64-bit mode if it contains pointers. In some use cases, where the dynamic data is mostly pointer tables (spreadsheets are probably a good example), the program can need twice as much memory in 64-bit mode as it does in 32-bit mode.
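A minimal sketch of what I mean, in Rust (my own toy example, and the exact numbers depend on the compiler's layout choices): a record that consists almost entirely of pointers roughly doubles in size when it is compiled for a 64-bit target.

[code]
use std::mem::size_of;

// Hypothetical "spreadsheet cell" made almost entirely of pointers.
struct Cell {
    formula: Option<Box<str>>, // fat pointer: address + length (2 words)
    value: Option<Box<f64>>,   // thin pointer (1 word)
    next: Option<Box<Cell>>,   // thin pointer (1 word)
}

fn main() {
    // 4 bytes when built for a 32-bit target, 8 bytes for a 64-bit one.
    println!("pointer: {} bytes", size_of::<usize>());
    // Four pointer-sized words in total: roughly 16 bytes on 32-bit
    // and 32 bytes on 64-bit, i.e. about double.
    println!("cell:    {} bytes", size_of::<Cell>());
}
[/code]

Compile the same file for i686 and for x86_64 and compare the printed sizes; a table with millions of such cells scales accordingly.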
Today I browsed online computer shops and noticed that most new laptops still have only 4 GB of RAM, and desktop computers usually have 8 GB. (I also find it weird to use the word "only" about a memory amount of more than four billion octets.) That means they don't benefit much, if at all, from 64-bit code. Sure, some arithmetic is faster in 64-bit CPU mode - but the cost is too high if the system is already short of memory. If 64-bit programs and their in-memory data are on average some 15% larger than their 32-bit counterparts, then "downgrading" to a 32-bit Linux distribution on a computer with 8 GB of memory can easily free an entire gigabyte.
It should also be noted that before Windows 11 effectively made it a requirement, entry-level PCs didn't necessarily even have that 4 GB of memory.
And those are _new_ computers - old ones have even less memory, and many don't support 64-bit code at all. Those are the machines people usually use to try Linux for the first time. New computers also tend to be boot-locked to their factory-installed operating system more and more often: UEFI Secure Boot makes it harder, and sometimes impossible, to boot anything other than the Windows bootloader.
Many popular Linux distributions have now completely dropped support for 32-bit targets. Even Debian is dropping support for 32-bit x86, and the list of its supported platforms is again one entry shorter. Ubuntu has even dropped support for older 64-bit x86 CPUs. Considering that Ubuntu has become the best-known beginner-friendly Linux distribution, this means that fewer people are going to use Linux - instead they will just throw their old computer in the trash and buy a new one with Windows 11.
There are also many problems with programs that are at least partially written in Rust. The official Rust compiler treats SSE2 as part of the baseline for its i686 targets, so it freely emits SSE2 instructions there. The Pentium Pro does not have SSE2 - nothing older than the Pentium 4 does. The maintainers of the Rust compiler don't consider this a bug and aren't going to change it.
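For the record, there is a partial workaround that I know of (just a sketch from me, not an officially endorsed fix): Rust also ships an i586 target whose baseline does not include SSE2, so it can be used instead of the default i686 target when you build things yourself.

[code]
# i686-unknown-linux-gnu assumes SSE2; i586-unknown-linux-gnu does not.
rustup target add i586-unknown-linux-gnu
cargo build --release --target i586-unknown-linux-gnu

# Lowering the CPU baseline of the i686 target is also possible,
# but removing SSE2 there can affect the float ABI, so use with care:
RUSTFLAGS="-C target-cpu=pentium" cargo build --target i686-unknown-linux-gnu
[/code]

Of course that only helps when you are the one doing the compiling; it doesn't change what the distributions ship.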
Thanks to Rust and all the compile-time safety checks its compiler performs, building Firefox or SeaMonkey's experimental version now requires 32 gigabytes of memory. That rather ruins the whole idea of free software: you technically have the right to modify the program, but building your modified version requires such a powerful computer that you are stuck with the "official" binary anyway. It's almost funny how the "rust people" just don't see a problem here. Safety is their number one priority, no matter the cost.
Many software developers don't even seem to acknowledge that the average computer actually out there and in use is not very powerful. For example, on one IRC channel someone claimed that already in 1999 almost everyone had 1 GB of memory, which is ridiculous. These snobby software developers, who thrive on destroying old stuff, are creating a corporate-friendly computing monoculture where everyone just has to consume and buy more new stuff.
My opinion is that the developers of free software should target the hardware that actually exists out there. That's the only way free software can actually reach more users. The current direction leads to a situation where free software can only be run on "specialized" computers that cannot be bought from normie PC stores.