The free software has a problem.
Hello. I didn't know where to post this message. Most internet forums - especially Finnish ones - are full of snobs and cyberbullies who make it impossible to have a civilized discussion about this matter. I noticed that MX Linux still plans to support 32-bit targets into the future, so I thought that this forum might be a good place.
To the subject at hand. I have noticed that most computer programs gain a significant increase in binary size when they are compiled for a 64-bit target, compared to their size for a 32-bit target. Since the default operand size in x86 is 32 bits in both 32- and 64-bit mode, this size difference comes largely from the pointer size. Usually 64-bit programs seem to be about 15 % larger than 32-bit programs. The exact difference depends on how heavily pointers are used.
The data also needs more memory in 64-bit mode if it contains pointers. In certain use cases, where the dynamic data is mostly just pointer tables (spreadsheets probably being a good example), the program can require twice as much memory in 64-bit mode as it needs in 32-bit mode.
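Here is a minimal sketch of how to see this yourself, assuming a Linux host with a Rust toolchain and its 32-bit target installed (the commands in the comments are the standard rustup/rustc ones, but check them on your own system):

use std::mem::size_of;

fn main() {
    // Pointer width is what changes between the targets:
    // 8 bytes on x86_64, 4 bytes on i686.
    println!("pointer: {} bytes", size_of::<*const u8>());
    println!("usize:   {} bytes", size_of::<usize>());

    // A "spreadsheet-like" table of one million cell pointers:
    // roughly 8 MB on a 64-bit target, roughly 4 MB on a 32-bit one.
    let cells: Vec<*const u8> = vec![std::ptr::null(); 1_000_000];
    println!("cell table: {} bytes", cells.len() * size_of::<*const u8>());
}

// Assumed build commands for comparing the two targets:
//   rustc -O size_demo.rs                                  # native 64-bit
//   rustc -O --target i686-unknown-linux-gnu size_demo.rs  # 32-bit build
// Comparing the stripped binaries of a larger, pointer-heavy program
// gives a rough feel for the code-size difference described above.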
Today I browsed online computer shops and noticed that most new laptops still have only 4 GB of RAM. Desktop computers usually have 8 GB. (I also find it weird to use the word "only" when speaking about a memory amount of more than four billion octets.) That means that they don't benefit much, if at all, from 64-bit code. Sure, some arithmetic calculations are faster in 64-bit CPU mode - but the cost is just too high if the system is already low on memory. If 64-bit programs are on average 15 % larger than 32-bit programs, then "downgrading" to a 32-bit Linux distribution on a computer with 8 GB of memory can easily free an entire gigabyte of memory - 15 % of a heavily used 8 GB is already more than a gigabyte.
It should also be noted that before Windows 11 became the baseline, most entry-level PCs didn't necessarily even have that 4 GB of memory.
And those are _new_ computers - old ones have even less memory, and many don't support 64-bit code at all. Those are the computers that people usually use to try Linux for the first time. Also, new computers are more and more often boot-locked into their factory-installed operating system. UEFI Secure Boot makes it harder, or sometimes even impossible, to boot anything other than the Windows bootloader.
Many popular Linux distributions have now completely dropped support for 32-bit targets. Even Debian is dropping support for 32-bit x86 targets, and the list of its supported computing platforms is again one entry shorter. Ubuntu even dropped support for older 64-bit x86 CPUs. Considering that Ubuntu has achieved the status of the most well-known beginner-friendly Linux distribution, this means that fewer people are going to use Linux - instead they will just throw their old computers in the trash and buy new ones with Windows 11.
There are also many problems with programs that are at least partially written in Rust. The official Rust compiler has a bug where it always emits SSE2 instructions in i686 code. The Pentium Pro does not have SSE2 - nothing older than the Pentium 4 does. The maintainers of the Rust compiler aren't going to fix that bug.
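As far as I know, the usual workaround is to build for the i586 target, which does not assume SSE2, or to subtract the feature flags by hand. A sketch, assuming standard rustup/rustc usage (verify the exact target names on your own system):

// Prints what the compiler assumes about the target at compile time.
fn main() {
    println!("compiler assumes SSE2: {}", cfg!(target_feature = "sse2"));
}

// Assumed commands:
//   rustup target add i586-unknown-linux-gnu
//   rustc --target i586-unknown-linux-gnu main.rs      # no SSE2 baseline
//   rustc --target i686-unknown-linux-gnu \
//         -C target-feature=-sse,-sse2 main.rs         # features stripped
// The i686-unknown-linux-gnu target enables SSE2 by default, which is
// exactly the behaviour described above.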
Thanks to Rust and all the compile-time safety checks that its compiler makes, compiling Firefox or SeaMonkey's experimental version now requires 32 gigabytes of memory. That kind of ruins the whole idea of free software: if you technically have the right to modify the program, but building the modified version requires a far more powerful computer, then you are stuck with the "official" program binary anyway. It's almost funny how the "Rust people" just don't see a problem here. Safety is the number one priority to them, no matter the cost.
Many software developers don't even seem to acknowledge the fact that an average computer that actually exists and is in use is not very powerful. For example, in one IRC channel someone made a claim that already in 1999 almost everyone had 1 GB of memory, which is ridiculous. These snobby software developers, who thrive on destroying old stuff, are creating a corporate-friendly computing monoculture where everyone just has to consume and buy more new stuff.
My opinion is that the developers of free software should target hardware that actually exists out there. That's the only way free software can actually reach more users. The current direction leads to a situation where free software can only be run on "specialized" computers that cannot be bought from normie PC stores.
- trawglodyte
- Posts: 113
- Joined: Tue Feb 13, 2024 7:35 am
Re: The free software has a problem.
Out of curiosity I scrolled best-selling laptops on Newegg and 16G was the most common, with 32G or higher also popular. There were a few 8G, but nothing 4G.
- DukeComposed
- Posts: 1389
- Joined: Thu Mar 16, 2023 1:57 pm
Re: The free software has a problem.
Remember kids, DRAM is the only measurement of computing. When choosing a new computer, you only have one question to consider, because you want the one with more Gee Bees.
trawglodyte wrote: Tue May 13, 2025 5:50 pm
Out of curiosity I scrolled best-selling laptops on Newegg and 16G was the most common, with 32G or higher also popular. There were a few 8G, but nothing 4G.
This rant started out fairly benign, if not reductive, arguing that 64-bit pointers are bigger than 32-bit pointers, but then uses that as a strawman to divert OP's ire at its real target: "snobby software developers" who are guilty of the sin of... supporting new hardware.
samwdpckr wrote: Tue May 13, 2025 5:32 pm
Many software developers don't even seem to acknowledge the fact that an average computer that actually exists and is in use is not very powerful. For example, in one IRC channel someone made a claim that already in 1999 almost everyone had 1 GB of memory, which is ridiculous. These snobby software developers, who thrive on destroying old stuff, are creating a corporate-friendly computing monoculture where everyone just has to consume and buy more new stuff.
My opinion is that the developers of free software should target hardware that actually exists out there.
Yep. Those evil programmers just keep writing code that runs on the stuff you're most likely to find on Newegg. How fiendish!
While I can appreciate OP's underlying message of anti-consumerism and keeping old hardware around even though it isn't the newest or the fastest model anymore, saying "free software has a problem" purely because of The Innovator's Dilemma makes OP come across as misguided and more than a little unhinged. It's no wonder other forums haven't been receptive to whatever point this argument is trying to make.
If you don't want to upgrade your hardware? Don't. The fact that new software will come along to support new hardware doesn't hurt you too much because, thanks to the fact it's free and open source, the old stuff will keep working on your old machine.
Eventually the browser will break -- incompatible security parameters with modern web servers, eventually security bugs won't get patched, and eventually your machine won't be able to peacefully coexist online anymore. I should know, I recently played around with a Windows XP install that is patently unusable on the modern Internet. I didn't complain about this too much for the same reason I didn't cry when my trusty old graphing calculator died 15 years after I'd bought it.
So OP isn't necessarily wrong, per se, but definitely misguided. Is the argument that we should stop fabricating new chips? Perhaps he's arguing we should refuse to write new software for new processors altogether. If we're going to get stuck in the past, how do we pick which year of the past in which to get stuck? I hope it's either before the Pentium F00F bug, or after that but before the first generation of Centrino processors. Maybe we should just let OP pick which retroarchitecture to make the new, eternal standard for the world, because he clearly knows what's best for society. I can see the headlines now: "COMPUCOMMUNISM SAVES THE WORLD: Standard PC Specs Written In Stone, No New Upgrades, Ever... Or Else!"
P.S. That old Windows XP setup I mentioned? It runs the vintage .EXE I wanted it to run beautifully, and it will continue to do so for quite some time. I've taken to writing more Win32 assembly software for it, too, and that's been a fun endeavor I think more people should try doing. Retrocomputing is a wonderful hobby anyone can enjoy, if they wait long enough. If they don't let the bitterness of time, wear, and obsolescence get the better of them.
- trawglodyte
- Posts: 113
- Joined: Tue Feb 13, 2024 7:35 am
Re: The free software has a problem.
The OP claimed that "most" new laptops being sold had 4G of RAM. This was an easy claim to fact-check, which I did, and found NONE of the best-selling laptops on Newegg had 4G; in fact MOST had 16G, with 32G being the next most common. Nobody claimed this was the only consideration, so your snark is unwarranted. My intention was only to fact-check one claim and show just how far from reality it was. I'm under no obligation to respond to every other claim made.
DukeComposed wrote: Tue May 13, 2025 6:25 pm
Remember kids, DRAM is the only measurement of computing. When choosing a new computer, you only have one question to consider, because you want the one with more Gee Bees.
Re: The free software has a problem.
While I really think there are some points to running applications on the lean side... there is no question that quite a lot of software needs to grow larger to incorporate more functionality. Like everything - there is a balance.
I have been outfitting computer users with 8gb of ram LONG before people said it was 'required'.. and 16gb of ram has been my 'minimum' build for YEARS! When we talk about windows, the bloat on the OS takes over 4gb just to get legs under it, and HAS taken over 4gb since before win10 was alive ... ancient history on ram I'm afraid ;-/
As for how it all works - I strongly suggest that you have a look at how compilers, libraries and languages work, as all have gotten larger and larger and not needlessly. Users want more, and providing more comes with a price tag. Add in the bloat that many things add to cover multiple platforms, os's, etc etc.. and you start to see why things "get larger".
And... as with most things - cars for example... there are some people very happy to keep running that 1979 Datsun or 1989 Dodge Ram pickup... But there are more people that want that newer model and the features that it provides.
*QSI = Quick System Info from menu (Copy for Forum)
*MXPI = MX Package Installer
*Please check the solved checkbox on the post that solved it.
*Linux -This is the way!
Re: The free software has a problem.
We are trashing (*ahem* "recycling") 64-bit capable machines from like 15 years ago...
We are not going to support 32-bit if Debian is not supporting it, and as far as I know they don't even plan to build 32-bit kernels for Trixie, and they won't have a 32-bit installer.
Re: The free software has a problem.
Hi,
If I'm not mistaken, Debian Trixie will not provide installers for 32-bit, but 32-bit packages will still be present in Debian's repo, so in the case of MX (and others) with their own installer there is at least the potential to continue with 32-bit ISOs. I would also mention that I would guess VERY few developers test their application code on 32-bit systems any more, so even if you have the good fortune that it still compiles, that is probably not a guarantee that there aren't other problems to be found in the moldy code.
*Side notes..
*posted at the same time as Adrian
Re: The free software has a problem.
Where were you looking? When I go to Amazon's web site, their $500 desktop towers/laptops start with 16GB of RAM. Under $300 will get you at least 8GB of RAM.
samwdpckr wrote: Tue May 13, 2025 5:32 pm
Today I browsed online computer shops and noticed that most new laptops still have only 4 GB of RAM. Desktop computers usually have 8 GB.
This is my Fluxbox . There are many others like it, but this one is mine. My Fluxbox is my best friend. It is my life.
I must master it as I must master my life. Without me, my Fluxbox is useless. Without my Fluxbox, I am useless.
Re: The free software has a problem.
I recommend looking at stores where most people (read: normies) would actually buy a computer.
trawglodyte wrote: Tue May 13, 2025 5:50 pm
Out of curiosity I scrolled best-selling laptops on Newegg and 16G was the most common, with 32G or higher also popular. There were a few 8G, but nothing 4G.
I think you are the one using a strawman argument here.
DukeComposed wrote: Tue May 13, 2025 6:25 pm
but then uses that as a strawman to divert OP's ire at its real target: "snobby software developers" who are guilty of the sin of... supporting new hardware.
This has nothing to do with the innovator's dilemma. Supporting new hardware does not mean breaking compatibility with other types of hardware targets.
DukeComposed wrote: Tue May 13, 2025 6:25 pm
While I can appreciate OP's underlying message of anti-consumerism and keeping old hardware around even though it isn't the newest or the fastest model anymore, saying "free software has a problem" purely because of The Innovator's Dilemma makes OP come across as misguided and more than a little unhinged. It's no wonder other forums haven't been receptive to whatever point this argument is trying to make.
Old machines need security updates too.
DukeComposed wrote: Tue May 13, 2025 6:25 pm
If you don't want to upgrade your hardware? Don't. The fact that new software will come along to support new hardware doesn't hurt you too much because, thanks to the fact it's free and open source, the old stuff will keep working on your old machine.
That's exactly what the corporate-friendly monoculture is.
DukeComposed wrote: Tue May 13, 2025 6:25 pm
Eventually the browser will break -- incompatible security parameters with modern web servers, eventually security bugs won't get patched, and eventually your machine won't be able to peacefully coexist online anymore.
You just keep coming up with new strawmen.
DukeComposed wrote: Tue May 13, 2025 6:25 pm
So OP isn't necessarily wrong, per se, but definitely misguided. Is the argument that we should stop fabricating new chips? Perhaps he's arguing we should refuse to write new software for new processors altogether. If we're going to get stuck in the past, how do we pick which year of the past in which to get stuck? I hope it's either before the Pentium F00F bug, or after that but before the first generation of Centrino processors. Maybe we should just let OP pick which retroarchitecture to make the new, eternal standard for the world, because he clearly knows what's best for society. I can see the headlines now: "COMPUCOMMUNISM SAVES THE WORLD: Standard PC Specs Written In Stone, No New Upgrades, Ever... Or Else!"
Re: The free software has a problem.
I looked at Verkkokauppa.com, a Finnish computer hardware store - which is probably not where normies would buy a computer. The places where they are more likely to buy one sell computers with even smaller amounts of memory.
siamhie wrote: Tue May 13, 2025 7:08 pm
Where were you looking? When I go to Amazon's web site, their $500 desktop towers/laptops start with 16GB of RAM. Under $300 will get you at least 8GB of RAM.
samwdpckr wrote: Tue May 13, 2025 5:32 pm
Today I browsed online computer shops and noticed that most new laptops still have only 4 GB of RAM. Desktop computers usually have 8 GB.