Unless you’ve been living under a rock lately (good for you, can I join you?), you’ve probably noticed that computer components have gotten expensive. Like, really, really expensive, thanks to the ongoing chip shortage. That’s certainly annoying if you’re a computer hobbyist or gamer, though it’s arguably the least of our concerns right now. But the question is, do you even really need a new computer? Well, depending on what you’re doing with it, probably not.
A number of years ago, I coined the term “‘good enough’ plateau” after noticing the direction desktop computers were heading. Hardware was still keeping pace with Moore’s Law, which, roughly speaking, means transistor counts (and with them, performance) double every two years. Software, however, had by and large stopped taking advantage of these improvements. For the most common computing tasks, such as office work, email, browsing the internet, and streaming video, a new top-of-the-line model was unlikely to noticeably outperform a cheaper, entry-level one, or even a system that was a few years old. Most computers on the market had become “good enough” as far as everyday use was concerned.
To really put this into perspective, the first Athlon 64 was released ten years after the first Pentium. The Pentium was a 32-bit chip that topped out at a blistering 66MHz, while the Athlon 64 was a 64-bit processor that launched at up to 2.2GHz. That’s a gain of over 3,200% on clock speed alone, before counting any new optimizations or instructions. In 2003, when the Athlon 64 came out, few people would have been satisfied using an original Pentium as their daily driver. They were just too old and too slow to keep up with modern software, especially at a time when multimedia and Web 2.0 were really exploding.
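For the curious, the back-of-the-envelope math behind that figure (clock speed only, ignoring IPC improvements and new instruction sets) works out like this:

$$\frac{2.2\ \text{GHz}}{66\ \text{MHz}} = \frac{2200}{66} \approx 33\times \quad\Longrightarrow\quad (33 - 1) \times 100\% \approx 3{,}200\%$$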
Today, a Core i5-11400 is about three times faster than 2011’s Core i5-2400 in terms of single-core performance, yet I’d be willing to bet your average home user wouldn’t be able to tell the difference. In fact, the one thing that really tends to hold back decade-old computers nowadays isn’t the CPU or GPU, but storage. Mechanical hard drives are sluggish, so programs on these systems take a lot longer to load, giving the impression that the entire machine is slower overall. However, this is an issue that can be easily rectified by replacing your spinning platters with drop-in 2.5″ SSDs. With the exception of ultra-low-end machines, a decade-old computer is still perfectly usable today, including running the latest productivity software and operating systems without much trouble, if any at all. (Windows 11 being the exception, but more on that later.)
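If you’re not sure whether an older machine is still running on spinning rust, a quick check is all it takes. Here’s a minimal sketch, assuming a Linux box with sysfs (Windows users can see the same information under Task Manager’s Performance tab), that lists each block device and whether it reports itself as rotational:

```python
#!/usr/bin/env python3
"""Minimal sketch: report whether each block device is a spinning hard
drive or solid-state storage, using the Linux sysfs "rotational" flag."""
from pathlib import Path

def list_drives() -> None:
    for dev in sorted(Path("/sys/block").iterdir()):
        flag = dev / "queue" / "rotational"
        if not flag.exists():
            continue  # skip devices that don't expose the flag
        spinning = flag.read_text().strip() == "1"
        print(f"{dev.name}: {'HDD (spinning platters)' if spinning else 'SSD / flash'}")

if __name__ == "__main__":
    list_drives()
```

If the system drive comes back as an HDD, a cheap 2.5″ SSD swap will do more for perceived speed than any CPU upgrade.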
Things are similar over on the mobile front. In real-world applications, most users aren’t going to see a noticeable difference between a brand-new flagship, a mid-range model, or a flagship that’s a few years old.
Now, power users are of course going to be the exception here. But even with gaming, we haven’t seen a big spike in processing demand. You can still comfortably game on a five-year-old mid-range GPU, at 1080p and 60fps no less. Indeed, most people do, if Steam’s hardware survey is to be believed. Even the launch PS4 and Xbox One are still putting up a valiant fight, with new AAA titles regularly released for them. 4K might be out of reach for these players, but as we noted in a previous article, 4K tends to be overkill for most applications anyway.
The fact is that consumer computing technology has slowed a great deal over the past ten years, with less innovation overall. Even previously competitive fields like mobile ARM have started to hit the barrier of diminishing returns, at least outside the small niche of users who genuinely require high-performance hardware. Really, the only things limiting the lifespans of these systems are artificial barriers put up by hardware and software makers.
Speaking of which, I said I would circle back to Windows 11. The operating system is little more than Windows 10 with a new coat of paint. However, Microsoft has blocked it from being installed on a wide range of computers, including their own. My Surface Pro 5, for example, is not compatible because its Kaby Lake CPU is deemed “out of date”. This tablet still does everything I need it to, and plenty fast. I even back-door installed Windows 11 on it, and using the OS was fine. Well, aside from Windows 11 being garbage, but that’s another discussion for another time. It’s also worth noting that notoriously stingy Apple still supports iPads from 2014.
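For what it’s worth, one route for that kind of back-door upgrade is a registry override Microsoft itself documents for machines with an “unsupported” TPM or CPU; you can set it by hand in regedit or script it. Below is a rough Python sketch of the idea, entirely at your own risk, and not necessarily the exact route I took:

```python
# Rough sketch: set the Microsoft-documented registry value that allows
# in-place Windows 11 upgrades on machines with an "unsupported" TPM or CPU.
# Requires an elevated (Administrator) Python process on Windows.
import winreg

KEY_PATH = r"SYSTEM\Setup\MoSetup"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_WRITE) as key:
    winreg.SetValueEx(key, "AllowUpgradesWithUnsupportedTPMOrCPU", 0,
                      winreg.REG_DWORD, 1)
```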
Now, if I were a cynical person, which I most certainly am, I’d say Microsoft did this to gently prod people into upgrading. Much like how Apple deliberately slowed older iPhones to cajole you into buying a new one (sorry, to “preserve the batteries”). If people aren’t buying new hardware, they aren’t buying new operating system licenses, which means Microsoft is losing out on profits. Because I guess Bill Gates doesn’t own enough farmland, and wants more tenants so he can play ye olde lord of the manor. Or something like that.
The truth is, most computers made in the last ten years, and most phones made in the last five, will be more than enough for your average ham-and-egg user. There’s really no need to be upgrading every two years, unless of course you actually need the performance. But how many of us really do?