Why does hardware get slower with time? I have been a PC owner since 1990, and every computer I have owned became really, really slow after 3-4 years (even with a full system reinstall). This is the case with Windows PCs, and it is also the case with Apple hardware. Why is this happening? Can it be avoided?
12 Answers
There are a few effects here:
- Your perception of how fast the computer should be is changing. When you first get new hardware you have something concrete to compare it against - the old hardware. This gives you an empirical measure of the speed improvement. As time goes by, your memory of how slow the old hardware was fades, and all you have left to compare against is how fast the current hardware felt recently.
- New versions of software come out which add new features, either to extend functionality or to make use of the new hardware. The new version is, almost by definition, a larger program than before, which takes up more resources and thus makes your hardware run a little slower.
- Accumulation of drivers, programs/tasks running in the background, etc. Each additional driver or background task takes up a little more of every resource - hard disk space, memory, CPU cycles and so on. None of them is large on its own, but the effect is cumulative. People expect modern programs to update themselves, so there are extra tasks running that you aren't even aware of. The longer you have the computer, the more of these programs you are likely to have installed (a quick way to see what is running is sketched at the end of this answer).
When taken together they give the impression that the hardware is slowing down.
There may be other effects due to wear and tear on the hardware (disk fragmentation, memory latency) too.
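To get a feel for how much this accumulation adds up on a given machine, here is a minimal sketch that lists the largest resident processes. It assumes the third-party psutil package is installed; the 20-entry cutoff is just an arbitrary choice for the example.

```python
# Sketch: list running processes sorted by memory use, to see how much the
# accumulated background tasks add up to. Assumes the third-party
# "psutil" package is installed (pip install psutil).
import psutil

procs = []
for p in psutil.process_iter(["name", "memory_info"]):
    mem = p.info.get("memory_info")
    if mem is None:
        continue  # could not read this process (access denied, zombie, ...)
    procs.append((mem.rss, p.info.get("name") or "?"))

# Print the 20 biggest resident-memory consumers.
for rss, name in sorted(procs, key=lambda t: t[0], reverse=True)[:20]:
    print(f"{rss / (1024 * 1024):8.1f} MB  {name}")
```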
Sometimes it IS the hardware, especially with laptops. Modern processors have circuitry to protect them from overheating, and will deliberately reduce the CPU speed if the core temperature gets too hot (or also to save power when demand is low and you're running on batteries - Intel calls the feature "SpeedStep" on their processors). If you notice your fan running all the time or the machine getting excessively hot around the cooling fan outlet, your computer's "airways" may have become clogged with dust.
I had a Dell Latitude that ran like new after I opened it up and removed a quarter-inch-thick "sponge" of dust from between the fan and the heat sink. Dell actually has downloadable service instructions on their website that explain all the steps to open up the machine and get inside for this kind of service. If you're not comfortable with this, you probably have a techie friend who'll help you out. It's definitely worth the risk if you're planning to get rid of the machine otherwise!
If you think this might be what's happening on your machine, try downloading a utility like "SpeedFan" that allows you to check the temperature of your CPU as well as other components. With this app, you can graph the temperatures when you first start the machine. If they start climbing quickly and never seem to decrease, you can bet cooling is an issue. In my case, I also used a free app called "CS Fire Monitor" to show me the actual speed of my processor and I found that once it got hot, it was dropping to less than half speed. There's lots of good freeware out there that will show you this kind of information; just Google "CPU Temp Freeware" or "CPU Speed Freeware" or something along those lines and you'll find all sorts of options.
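If you'd rather script this than install a GUI tool, here is a rough sketch of the same idea using the third-party psutil package: it samples CPU load and clock frequency (and temperature, where the operating system exposes sensors to psutil, which is mainly on Linux). The sample count and interval are arbitrary.

```python
# Sketch: log CPU load and frequency (and temperature where available)
# every couple of seconds, to spot thermal throttling. Assumes the
# third-party "psutil" package; sensors_temperatures() only exists on
# some platforms (mainly Linux), so it is looked up defensively.
import time
import psutil

for _ in range(30):                      # roughly a minute of samples
    line = f"load={psutil.cpu_percent(interval=None):5.1f}%"
    freq = psutil.cpu_freq()             # may be None on some platforms
    if freq is not None:
        line += f"  freq={freq.current:7.1f} MHz"
    temps = getattr(psutil, "sensors_temperatures", lambda: {})()
    for name, entries in temps.items():
        if entries:
            line += f"  {name}={entries[0].current:.0f}C"
    print(line)
    time.sleep(2)
```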
Hopefully, this will save a few people from replacing or throwing away decent hardware that just needs some respiratory therapy!
When I have run benchmarks (both trivial ones like bogomips and more serious ones like Dhrystone and Whetstone) on five-to-eight-year-old hardware, I have always found that it turned in the same results as when it was new. (Always on Linux and Mac OS boxen, BTW.)
I have less experience with hard drives, but I did test one fast-wide SCSI-2 drive about five years on (with hdparm) and got numbers comparable to the original spec.
So, I think it is mostly, as others have said, a combination of new expectations and heavier software.
That said, I do currently have a PowerBook G4 which could use testing, as it sure feels slower now than it used to. The suggestion above that clock throttling may come into play if the cooling system gets fouled is a good one.
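For anyone who wants a repeatable number to compare against years later, here is a crude sketch of a fixed CPU workload timed with a high-resolution clock. It is nothing like a real Dhrystone or Whetstone run, just a stand-in you can re-run on the same machine and OS later.

```python
# Sketch: a crude, repeatable CPU timing test (a fixed mix of integer and
# floating-point work), reporting the best of five runs.
import math
import time

def workload(n=2_000_000):
    total = 0.0
    for i in range(1, n):
        total += math.sqrt(i) / (i % 97 + 1)   # mixed int/float work
    return total

times = []
for _ in range(5):
    start = time.perf_counter()
    workload()
    times.append(time.perf_counter() - start)

print(f"best of 5 runs: {min(times):.3f} s")
```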
Page's Law ;)
Wirth's law, also known as Page's law, Gates' law and May's law, is a computing adage which states that software is getting slower more rapidly than hardware becomes faster.
Some slow-down is caused by hard disk fragmentation, whose cure is defragmentation.
This is defined as:
File system fragmentation, sometimes called file system aging, is the inability of a file system to lay out related data sequentially (contiguously), an inherent phenomenon in storage-backed file systems that allow in-place modification of their contents. It is a special case of data fragmentation. File system fragmentation increases disk head movement, or seeks, which are known to hinder throughput. The correction for existing fragmentation is to reorganize files and free space back into contiguous areas, a process called defragmentation.
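As a rough illustration of why seeks hurt, the sketch below writes a scratch file and then times reading it sequentially versus in a shuffled block order. The file name and sizes are arbitrary, and OS caching will blur the numbers on a warm cache, so treat it only as a toy model of the access pattern a fragmented file forces on the drive.

```python
# Sketch: compare one sequential pass over a file with reading the same
# blocks in shuffled order (roughly what fragmentation forces a disk to
# do). OS caching will hide much of the difference; illustration only.
import os
import random
import time

PATH = "frag_demo.bin"          # hypothetical scratch file
BLOCK = 1024 * 1024             # 1 MB blocks
BLOCKS = 256                    # ~256 MB total

with open(PATH, "wb") as f:
    for _ in range(BLOCKS):
        f.write(os.urandom(BLOCK))

def read_in_order(order):
    start = time.perf_counter()
    with open(PATH, "rb") as f:
        for i in order:
            f.seek(i * BLOCK)
            f.read(BLOCK)
    return time.perf_counter() - start

sequential = list(range(BLOCKS))
scattered = sequential[:]
random.shuffle(scattered)

print(f"sequential read: {read_in_order(sequential):.2f} s")
print(f"scattered read:  {read_in_order(scattered):.2f} s")
os.remove(PATH)
```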
On Windows there is another reason: the Windows Registry.
The Windows Registry is a database that stores settings and options for Microsoft Windows operating systems. It contains information and settings for hardware, operating system software, most non-operating system software, and per-user settings. The registry also provides a window into the operation of the kernel, exposing runtime information such as performance counters and currently active hardware.
Over time, the registry accumulates junk and also needs to be cleaned out and optimized.
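One concrete place where this accumulation is easy to see is the auto-start ("Run") keys. Here is a small Windows-only sketch, using the standard-library winreg module, that lists what is registered to start with the machine or with your account:

```python
# Sketch: list the per-user and machine-wide "Run" keys in the Windows
# registry, one place where auto-start entries accumulate over time.
# Windows only; uses the standard-library winreg module.
import winreg

RUN_KEYS = [
    (winreg.HKEY_CURRENT_USER, r"Software\Microsoft\Windows\CurrentVersion\Run"),
    (winreg.HKEY_LOCAL_MACHINE, r"Software\Microsoft\Windows\CurrentVersion\Run"),
]

for root, path in RUN_KEYS:
    try:
        key = winreg.OpenKey(root, path)
    except OSError:
        continue  # key missing or not accessible
    with key:
        i = 0
        while True:
            try:
                name, value, _ = winreg.EnumValue(key, i)
            except OSError:
                break  # no more values under this key
            print(f"{path}: {name} -> {value}")
            i += 1
```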
Another explanation is that newer versions of the operating system are usually more bloated, and so slower. This means that just by installing the latest OS version or patches you may, after a few years, suddenly notice that your computer is now slower and that it is time to invest in new hardware that can efficiently support the requirements of the latest version of your operating system.
You get used to the speed and it no longer feels fast.
For example, I had a customer who had a routine (which they regarded as downtime) that took over an hour on an old computer. When they upgraded their computer, the process took five minutes, which made them very happy for a while.
Fast forward a few years and they now complain about this routine taking five minutes. And every time they complain, they genuinely seem to have forgotten about the time when it took an hour.
There's a certain amount of perception at play, but if you're actually measuring a reduction in performance, I'd look to the moving parts in the system.
"Moving parts," you ask, "what moving parts?"
Two easy categories to check: fans and disk drives. Fans are obvious, but in addition to the fan itself, make sure the airflow and cooling are unobstructed to ensure that interior component temperatures are also where they were when the box was new. Disks are a little more subtle, but a deteriorating disk can cut down dramatically on performance while appearing to work. See if the disk benchmarks match new performance, or if the error count is up dramatically.
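If you want a quick number to compare against the drive's spec (or an old measurement), a sequential read of a large existing file is a reasonable first check; here is a sketch, with a hypothetical file path you would substitute. A cold-cache run is the meaningful one, and SMART error counters (e.g. via smartmontools) are worth checking separately.

```python
# Sketch: quick-and-dirty sequential read throughput check. Compare the
# result against the drive's spec or a measurement taken when it was new.
import time

PATH = "/path/to/some/large/file"   # hypothetical path, substitute your own
CHUNK = 8 * 1024 * 1024             # 8 MB reads

start = time.perf_counter()
total = 0
with open(PATH, "rb") as f:
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        total += len(data)
elapsed = time.perf_counter() - start

print(f"read {total / 1e6:.0f} MB in {elapsed:.1f} s "
      f"({total / 1e6 / elapsed:.0f} MB/s)")
```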
While they don't really move, they're the moral equivalent: cable connectors. Check every detachable end of each cable. Unplug it, make sure it's clean, plug it back in, and make sure it's tight.
Most (if not all) benchmarks aren't reliable for measuring OS snappiness. Unless the benchmark is some USB-to-USB rig that controls the UI of another computer, emulating a mouse and keyboard, the execution paths will be entirely different. The slowness in PCs that I know of arises from driver/security updates, which can also update firmware (and you don't know whether a firmware update shipped in a driver persists or not). So the only true apples-to-apples comparison is to buy two computers and never connect one of them to the internet or update its drivers after the first install, but preserve it for later comparison using such an external benchmarking tool.
I started suspecting all benchmarks when I found a case where the benchmark returned "all good" numbers while some hardware issue was causing the mouse pointer to freeze and jump around, and the system was actually only barely controllable - clearly benchmarks aren't affected by some low-level things that can affect e.g. the snappiness and controllability of the PC.
(A slightly different but similar case: even though a Q6600 benchmarked about the same as a dual core of equivalent GHz, I noticed responsiveness was clearly lower. Back then this was explained as the Windows Vista scheduler not being good with 4 cores. The point being: just as most benchmarks that report FPS would not detect the tiny jitters a user would feel, the PC benchmarks the tech press uses don't measure things like interrupt-to-process latency, or show the full statistics of it rather than just some average.)
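As a toy illustration of the average-versus-worst-case point, the sketch below asks for a 1 ms sleep a couple of thousand times and reports how late the wake-ups actually were. It is not a DPC-latency measurement, just a way to see that an average can look fine while the tail you feel as stutter is much worse.

```python
# Sketch: request a 1 ms sleep repeatedly and measure how late the
# wake-ups are. The average can look fine while the worst case (the
# stutter you actually feel) is far larger.
import time

REQUESTED = 0.001   # 1 ms
overshoots = []
for _ in range(2000):
    start = time.perf_counter()
    time.sleep(REQUESTED)
    overshoots.append(time.perf_counter() - start - REQUESTED)

overshoots.sort()
avg = sum(overshoots) / len(overshoots)
print(f"average overshoot: {avg * 1000:.3f} ms")
print(f"99th percentile:   {overshoots[int(len(overshoots) * 0.99)] * 1000:.3f} ms")
print(f"worst:             {overshoots[-1] * 1000:.3f} ms")
```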
Edit: And if you're doing such a setup with an untouched reference PC, then if it has a battery and/or is ever powered, the hardware maker could cheat by running an LFO (low-frequency oscillator) to covertly obsolete the hardware, e.g. by slowing down some operation that benchmarks don't benchmark. A better-than-usual games-press benchmark would be to run e.g. DOSBox, emulators, or latency measurements inside VMware/Hyper-V, as that taxes the CPU in more complex ways than usual.
Edit 2: And if they really wanted to, they could put in something that physically ages, or some ultra-low-power counter with a capacitor or tiny battery charged at the factory. So even if you never power the device, they could make it slower with time. This kind of thing could be a liability if someone found it, but it wouldn't really matter unless it were made illegal and the fines were enough to put them out of business.
Perhaps it's purely down to your perception.
3-4 years ago it was sparkling-new hardware, faster than the previous generation, and therefore it felt very fast.
In the 3-4 years since then, no doubt you have used computers with better hardware, so even if you do a clean install on the old machine, your experience on newer hardware will leave you with a lackluster impression of the old machine.
Or do you have empirical evidence that the machine actually performs slower?
I believe some driver updates these days may also update firmware on the related device. There are also potential CPU microcode updates, though those are rare.
I've seen some popular diagnostic/benchmark tools claim things worked at normal speed, yet there was some kind of low-level driver/hardware issue that caused the mouse pointer to crawl and jump. At the time I didn't know about measuring DPC latency - that tool probably would have indicated there was an issue.
The point is, it's possible for things to slow down in a way that makes the machine feel slower but doesn't show up in the kind of tools casual PC users use.
If someone wants to dig into this, I think they should have two identical computers, with one of them never connected to the net and never getting updates or new drivers installed. Time both computers using an external timer (and check the time against NTP just to be sure). Then, after 4 years, time both again; if there's a difference, clone the disk from the non-connected computer to the connected one and try again. Also check for any firmware version changes, etc. Edit: And when I say "time", I mean timing some custom task, not using an existing benchmark. Both GPU and CPU vendors have been caught gaming known benchmarks, according to AnandTech and a few other sites I've read over the past years.
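A minimal sketch of that "time a custom task" idea: run a fixed, deterministic workload and append the result with a date stamp to a log, so the same machine can be re-measured years later. The workload and the file name here are arbitrary choices, not any standard benchmark.

```python
# Sketch: time a fixed, deterministic workload and append the result to a
# CSV log, so the same machine can be re-measured and compared years later.
import csv
import datetime
import hashlib
import time

def fixed_task():
    # Hash a fixed amount of data; any deterministic workload will do.
    data = b"x" * 1_000_000
    h = hashlib.sha256()
    for _ in range(500):
        h.update(data)
    return h.hexdigest()

start = time.perf_counter()
fixed_task()
elapsed = time.perf_counter() - start

with open("timing_log.csv", "a", newline="") as f:
    csv.writer(f).writerow([datetime.date.today().isoformat(), f"{elapsed:.3f}"])
print(f"fixed task took {elapsed:.3f} s (appended to timing_log.csv)")
```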
The answer is that it is not getting slower. I have had computers since the 1970s, and I kept them all too. I can boot up my late-1990s P3 with a Voodoo 3 video card, benchmark Quake 2, and still get the same FPS it got back then.
You might not be comparing apples to apples. You need to use the exact same software, not new software such as new games on older hardware; otherwise you run into slower software and drivers that are not compatible with the new features of newer games.
Hardware in general does not slow down. The software can, and hard drives can get slower with fragmentation.
A fresh install of the same OS version will be just as fast as it ever was. Newer versions of the OS might be slower on old hardware. Always compare apples to apples.
Actually this is not a technical problem, but rather a human brain problem. This may surprise you, but let me explain; I have a good basis for what I say.
Part of the problem is how software updates and patches are applied, but I don't think that is the core of the problem.
Hardware has actually gotten significantly faster over the years, but software's ability to load it down has increased at an even faster rate, giving both the perception and, in some cases, the reality that things are slower.
For example, my first Z-80 box had a clock speed of 1 MHz. Now my development platform runs at 2.66 GHz, over 2000 times faster. I don't recall exactly, but all of CP/M fit in about 16 KB. Now Windows is who knows how big, but much, much bigger. It uses many layers of abstraction which get amazing things done in a more general way, but these layers take their toll on performance.
Let me get back to the human brain. What is well understood is that software engineers have for many years said, and believed with some good reason, that hardware would just keep getting faster and faster, so software didn't need to be careful about optimization. So programmers did things to get features working quickly, at the cost of speed, thinking that the hardware people would take care of that problem. So the updates and patches are done with the thinking that they are temporary, i.e. short term.
It is short-term, micro thinking applied to a long-term, macro problem.
I read an interesting book many years ago in which a couple of scientists laid out this short-term versus long-term human thinking problem and did some experiments on a wide range of humans to see how they make these tradeoffs. Their book is New World New Mind, and the authors are Paul Ehrlich and Robert Ornstein. I would call it the most important book I have read in the past 20 years because it provided a solid framework for how we solve the problem.
What they noted was that the human brain evolved in a time when making short-term decisions made sense: live for the moment and the day, and don't think too much about the future, because it just wasn't worth it. So our gut sense of things, which we often use to make decisions, is a very old part of the brain and not well suited to many modern problems. And the brain has had no realistic time to evolve as the world has rapidly changed with population growth and technology's impact on things.
What professors Ehrlich and Ornstein discovered was that very smart, well-educated Ph.D.s and janitors alike made the same mistakes when presented with short-term versus long-term problems - not something we generally think is the case.
One very good and compelling example of how this same problem is playing out in the world today has to do not with the hardware environment, but with its big brother: the whole darn environment in which we live. We humans generally make the mistake of living for today, for the moment, but the reality is that global warming is upon us exactly because we have not allowed for it or taken measures to deal with it. It's the software slowing down the hardware all over again, just in a different context.
Ornstein and Ehrlich suggested that we might make more correct decisions by basing them not on our gut instinct but on data and statistics. So, for example, if software engineers had statistics on how fast their software was bloating relative to how fast the hardware was getting faster, they might make better decisions about what to include, what to leave out, and how much to optimize algorithms. In other words, if they used actual data to make decisions rather than their gut instinct.
Thank you for the good question. Sometimes the simple questions are the best, I think. It gave me the opportunity to consider this from a new angle. I had never before seen the parallel between the hardware/software issue and the human context.