>This problem exists on Linux as well, but I'd rather target Windows here, since Linux can still happily run on even the shittiest of boxes in the modern day.
Broken Promises
Twenty years ago, we were promised a future where computers would get faster, more efficient, and easier to use. Hardware delivered on that promise spectacularly. A modern CPU has thousands of times more processing power than a Pentium 4. RAM that cost $200 for 256MB now costs $30 for 32GB. Storage went from spinning disks to solid-state drives that are orders of magnitude faster.
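Do the per-gigabyte math on that RAM comparison and the gap gets even starker: roughly $800 per gigabyte then versus under a dollar now. A quick back-of-the-envelope check (using the rough retail figures above, not a market survey):

```python
# Back-of-the-envelope: price per gigabyte of RAM, then vs. now.
old_price, old_gb = 200, 256 / 1024   # ~$200 for 256MB
new_price, new_gb = 30, 32            # ~$30 for 32GB

old_per_gb = old_price / old_gb       # $800.00 per GB
new_per_gb = new_price / new_gb       # ~$0.94 per GB

print(f"Then: ${old_per_gb:.2f}/GB, now: ${new_per_gb:.2f}/GB, "
      f"roughly {old_per_gb / new_per_gb:.0f}x cheaper")
```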
So why does using a computer in 2025 often feel worse than it did in 2005?
The answer is simple: while hardware engineers kept their promises, software developers broke theirs.
Programmers suck
Let's start with the most obvious example: Windows itself.
Windows XP could run comfortably on 512MB of RAM. You could boot up, run Office, browse the web, and still have memory to spare. Windows 7 raised the bar but remained reasonable - 2GB was plenty for most users.
Then came Windows 10 and 11. These operating systems struggle with less than 8GB of RAM, and 16GB is becoming the new baseline. What exactly are we getting for that 32x increase in memory usage? A few translucent effects (worse than Aero Glass)? A Cortana nobody asked for? Mandatory telemetry services?
The core functionality - managing files, running programs, displaying windows - hasn't fundamentally changed. Yet somehow Microsoft needs 16 times more memory to accomplish the same basic tasks.
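Don't take my word for it. Here's a minimal sketch that prints total RAM in use plus the ten hungriest processes - it happens to use the third-party psutil library (`pip install psutil`), but any process monitor will tell the same story:

```python
# Minimal sketch: total RAM in use, plus the ten hungriest processes.
# Requires the third-party psutil package (pip install psutil).
import psutil

mem = psutil.virtual_memory()
print(f"In use: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB "
      f"({mem.percent}%)")

procs = []
for p in psutil.process_iter(['name', 'memory_info']):
    info = p.info['memory_info']
    if info is not None:  # AccessDenied shows up as None
        procs.append((info.rss, p.info['name']))

for rss, name in sorted(procs, key=lambda t: t[0], reverse=True)[:10]:
    print(f"{rss / 2**20:8.0f} MiB  {name}")
```

Run it on a freshly booted machine, before you've opened anything, and see how much memory is already spoken for.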
This isn't progress. It's waste masquerading as innovation.
Telemetry, spyware, and other BS
Here's a real example from monitoring network traffic on a Windows 10 machine:
Steam, opened by accident and left running for less than a minute, generated hundreds of outbound connections. Not for downloading games or updates - just for existing. Community features, marketplace data, friend status updates, advertising content - all transmitted constantly whether you use these features or not.
Windows itself isn't much better. Fire up a network monitor and watch your OS phone home dozens of times per minute. Sending what? Microsoft claims it's "anonymous telemetry" for improving user experience. But the user experience of Windows has arguably gotten worse over the past decade, so what exactly is all this data accomplishing?
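You don't need fancy tooling to catch this, either. A rough sketch that counts established outbound connections per process - again assuming psutil is installed, and note that Windows may need an elevated prompt to show connections owned by other users' processes:

```python
# Rough sketch: count established outbound connections per process.
import collections
import psutil

counts = collections.Counter()
for conn in psutil.net_connections(kind='inet'):
    # Only count connections that actually reach out somewhere.
    if conn.status == psutil.CONN_ESTABLISHED and conn.raddr and conn.pid:
        try:
            counts[psutil.Process(conn.pid).name()] += 1
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            counts[f"pid {conn.pid}"] += 1

for name, n in counts.most_common(15):
    print(f"{n:4d}  {name}")
```

Run it once with Steam closed and once with it open, and compare the counts.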
Your computer isn't working for you anymore. It's working for Microsoft, Steam, and whatever other companies have decided they need a direct line to your system.
Microsoft doesn't actually use their own operating system
[Screenshot: a mismatched explorer.exe error dialog in Windows 11]
A friend sent me this screenshot. Can you see what's wrong here?
This is an error in Windows 11:
>Windows Vista error symbol
>Win9x unthemed OK button
>Content of window is Vista-white while it's 9x grey below and above the window's contents
>Home - File Explorer:
>but also explorer.exe - System Error, just in case you didn't know what it was
>The most important part of the Windows experience will randomly error at a moment's notice
Windows 7 had something modern versions lack: consistency. Every dialog box followed the same design language. Menus worked predictably. The control panel was organized logically.
Windows 10 and 11? Good luck. Some settings live in the new "Settings" app with its touch-friendly interface designed for tablets nobody uses. Other settings still live in the old Control Panel. Some settings exist in both places but control different things. Some settings have been removed entirely.
This isn't user-hostile by accident. This is the result of developers prioritizing their own convenience over user experience. It's easier to build new interfaces than fix old ones. It's easier to add telemetry than optimize performance. It's easier to assume users have infinite RAM than write efficient code.
Modern programmers are ungrateful, or at least not truly skilled enough
We have computers that are literally thousands of times more powerful than the machines that sent humans to the moon. Yet somehow, opening a text editor takes longer than it did in 1995.
Why does a chat application need to run a full Chromium browser instance? Why does a simple text editor consume 200MB of RAM? Why do we need gigabytes of storage for what used to be kilobyte programs?
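You don't have to guess at these numbers. Here's a sketch that sums one application's resident memory across all of its processes (Electron apps typically spawn several) - "Code.exe" is just a placeholder name, swap in whatever editor you use:

```python
# Sketch: total resident memory for one application, summed across
# all of its processes. "Code.exe" is a placeholder process name.
import sys
import psutil

target = sys.argv[1] if len(sys.argv) > 1 else "Code.exe"
total = 0
for p in psutil.process_iter(['name', 'memory_info']):
    if p.info['name'] == target and p.info['memory_info'] is not None:
        total += p.info['memory_info'].rss

print(f"{target}: {total / 2**20:.0f} MiB resident")
```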
The answer isn't technical complexity. The underlying computing tasks haven't become dramatically harder. The answer is that modern software development culture has abandoned efficiency as a virtue.
"Memory is cheap," they say. "CPUs are fast," they justify. "Users won't notice," they assume.
But users do notice. They notice when their brand-new laptop feels sluggish. They notice when their battery dies after three hours. They notice when their "high-end" gaming rig struggles to run basic applications smoothly.
What We Lost
In chasing the illusion of modernization, we've sacrificed real value:
Stability: Windows 7 could run for months without rebooting. Windows 10 forces updates that break systems.
Privacy: Your OS didn't use to spy on you (unless you count forced IE integration in 98+ and the Customer Experience Improvement Program). Now it's the default.
Control: Users could disable features they didn't want. Now essential functions are tied to bloatware.
Efficiency: Programs used to be optimized for the hardware available. Now they assume unlimited resources.
Predictability: Interfaces used to work consistently. Now every update changes the UI for no apparent reason.
The Real Cost
This isn't just about nostalgia or resistance to change. There are real costs to this regression:
Economic: People replace perfectly functional hardware because software has become too bloated to run on it.
Environmental: Working computers become e-waste because they can't handle modern bloatware.
Productivity: Time is wasted navigating inconsistent interfaces and waiting for overpowered machines to perform simple tasks.
Security: Complexity creates vulnerabilities. Every unnecessary service is another attack vector.
The Path Forward
The solution isn't to go back to Windows XP (though honestly, it's tempting). The solution is to demand better from software developers, and if they can't deliver, REPLACE their inferior software with optimized alternatives.
We need to stop accepting "that's just how modern software works" as an excuse for poor engineering. We need to value efficiency, stability, and user control over flashy features and data collection.
Hardware engineers held up their end of the bargain. They gave us incredible performance improvements year after year. It's time for software developers to do the same.
The next time someone tells you that you need to upgrade your hardware to run modern software, ask them: what exactly is that software doing that requires so much more power than it did ten years ago?
The answer might surprise you. Or more likely, disappoint you.
Your computer should work for you, not against you.