I’ve often heard that the reason Windows has suffered from bloat, and why so much has been built on top of ancient underlying technologies, is partly to ensure compatibility with old software.
If something like Windows 11 requires specific hardware to install, why does it need to maintain compatibility with archaic devices and software?
Would it not be preferable for Microsoft to start from scratch with a considerably more efficient, cut-down OS for newer devices (similar to Apple’s macOS transition from Intel to Apple Silicon), and just provide security updates for the legacy operating systems still in use on un-upgradable hardware?


No, because it’s old software running on new hardware. Modern Windows ships with a ton of code so that “old stuff” still works; it even favors bundling the runtime code for old frameworks in the installation rather than new ones. That’s why you need to install modern .NET 10 etc. yourself after a fresh Windows installation when installing new software, yet it runs old software out of the box. Companies aren’t running dinosaur code on old computers, they’re running dinosaur code on modern computers.
If I remember right, Microsoft said they’re dropping support for a lot of the old .NET stuff at least, so we’ll see whether it happens and whether companies get mad or finally update.
Ah right, I’d assumed old hardware because you’d said “upgrade aging infrastructure”.