I’ve often heard that the reason Windows has suffered from bloat, and why so much of it is built on top of ancient underlying technologies, is partly to ensure compatibility with old software.
If something like Windows 11 requires specific hardware in order to install it, why does it still need to maintain compatibility with archaic devices/software?
Would it not be preferable for Microsoft to start from scratch with an OS that is considerably more efficient and cut-down for newer devices, similar to Apple’s macOS transition from Intel to Apple Silicon, and just provide security updates for the legacy operating systems that would remain in use on un-upgradable hardware?


There’s not a big financial incentive to do that.
Microsoft will remove stuff when it actually gets in the way.
If it’s easier to leave something in and not have to touch dozens of other programs/services, then they will.
They might mark it as deprecated and start planning a suitable replacement. Or they might just mark it as deprecated and kick the can down the road.
When enough of the services that relied on that deprecated thing have been touched due to other updates, then they might look at actually actioning the deprecation.
But if it doesn’t actively break the thing they are currently working on, the cost overhead of ripping it out is insane.
There might be other dev teams working on features that rely on or leverage the thing marked as deprecated. Maybe the deprecation happened towards the end of that other team’s development cycle, at which point actually removing the thing might invalidate their entire project.
And maybe they rip it out, and it turns out one of their large clients (or a large portion of the user base) was relying on it.
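
For what it’s worth, “deprecating” something usually just means flagging the old API so new code gets warned off it while existing code keeps working. A minimal sketch of what that can look like in C++ (the function names here are made up for illustration):

```cpp
#include <cstdio>

// Hypothetical old API: kept in place so existing callers still compile
// and run; the attribute only makes the compiler warn anyone who calls it.
[[deprecated("Use WidgetCreateEx instead")]]
void WidgetCreate() {
    std::puts("old path still works");
}

// Hypothetical replacement the deprecation message points people towards.
void WidgetCreateEx() {
    std::puts("new path");
}

int main() {
    WidgetCreate();    // compiles and runs, just emits a deprecation warning
    WidgetCreateEx();
    return 0;
}
```

That’s the cheap half of the job. The expensive half is the removal, which is where all the problems above come from.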
Addressing technical debt is always hard to justify, but it always makes a better project.
If management doesn’t care about a better project, they will prioritise features and things that make money.