I’ve often heard that the reason Windows has suffered from bloat, and why so much has been built on top of ancient underlying technologies, is partly to ensure compatibility with old software.
If something like Windows 11 requires specific hardware in order to install, why does it still need to accommodate archaic devices and software?
Would it not be preferable for Microsoft to start from scratch with a considerably more efficient, cut-down OS for newer devices (similar to Apple’s macOS transition from Intel to Apple Silicon), and simply provide security updates for the legacy operating systems still in use on un-upgradable hardware?


Worth considering that there’s less need for backwards compatibility with Linux binaries because most Linux software is open source: if the maintainer is gone, the end user can recompile or update it for a modern Linux. A lot of legacy Windows software is still in use with no source available, so Windows has to support it for the businesses that depend on it. In other words, it’s a cultural difference too. Linux seems pretty good at supporting things users actually use, like old hardware.
Not disagreeing with you btw, just my thoughts on why that difference exists.