A task manager is such a basic requirement of a multitasking OS (Windows can't really be considered a multiuser OS) that it's stunning M$ didn't include one as a standard feature from the start.
All good tips here. I'd also add:
- If the hard drive is mechanical, replace it with an SSD.
- Run CrystalDiskInfo and make sure none of your drives are on the way out. Slow storage is a performance killer and might lead to data loss.
- If the machine is a high-end (gaming, workstation) machine with a heavy CPU heatsink, redo the thermal paste if you haven't touched it in years. It should help with those boost clocks, especially if the machine has been transported, which can crack old paste that has dried out.
- DO NOT neglect fans on laptops. Get them cleaned out.
CrystalDiskInfo is good. More BIOS/UEFI firmwares should include SMART status checks at boot, imho... the ones that automatically warn of failing drives at boot-up are awesome. Tools that can run the drive's SMART self-test (I like Parted Magic) are even better if you're doing offline diagnostics, but rarely needed.
SSD and maxing out the RAM are huge improvements. If the motherboard has PCIe, try an NVMe drive in an adapter card; make sure it's an x4 adapter, not x1. You might not be able to boot from it, depending on the BIOS, but it'll be faster than a SATA-connected SSD. Larger NVMe drives also tend to be faster than smaller ones.
I've seen discouraging stuff about SSDs on YT, but I suppose I'll have to drag myself out of my cave eventually and get one. Some software even requires an SSD now.
@@dannygjk They're worth the money, imho: about 10x the speed of an HDD on my old systems. Leave some unassigned blocks and make sure partitions are "aligned", i.e. that they start on evenly divisible sectors, basically. I usually just specify start points at multiples of a gig.
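For the curious, the alignment check amounts to verifying that the partition's byte offset lands on a boundary; 1 MiB is the usual SSD-friendly convention, though your partitioning tool may use something else. A small sketch (the function name and defaults are just illustrative):

```python
def is_aligned(start_sector: int, sector_size: int = 512,
               boundary: int = 1024 * 1024) -> bool:
    """True if the partition's byte offset falls on the given
    boundary (1 MiB by default, the common SSD-friendly alignment)."""
    return (start_sector * sector_size) % boundary == 0

# The modern default start of sector 2048 is 1 MiB-aligned;
# the old DOS-era start of sector 63 is not.
print(is_aligned(2048))  # True
print(is_aligned(63))    # False
```

Starting partitions at multiples of a gig, as above, trivially satisfies any power-of-two boundary.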
@@dannygjk They are totally worth it. An SSD can even turn an old machine into a nice, snappy one again for all the common tasks like web browsing, email, office work, etc. For gaming or other 3D or heavy high-res video work, the graphics card will be a major factor too, of course.
Single biggest performance increase I have seen on older machines is replacing any magnetic disk with an SSD. That one change can extend the usable lifespan of the machine for several years, especially if you also max out the amount of RAM that the motherboard can support.
I totally agree! My youngest daughter has her big sister's 2012 MacBook. Big sister said it was too slow. A SATA SSD and maxing out the RAM means it is still usable in 2022! I've told quite a few people to replace their HDD with an SSD, and they've been happy afterwards.
If the SSD does not come with cloning software then be prepared to buy such software to copy over your old HD data to the new SSD. I did not have much success with free cloning software.
As a person who has a few older computers, including two 30 year old PCs still running DOS and a 25 year old one on Win 95, I'll add that another problem that crops up as computers age is that magnetic media is not permanent. Over time, as the magnetic image decays, disks have a harder and harder time reading some sectors accurately. Disks always store extra bits, error correcting code (or ECC), that allows transparent fixes of errors, at least, until there are too many bit errors. There is a product that I have used for many years that addresses this problem, Spinrite, from Gibson Research. It is like Scandisk, but much more thorough, and much, much slower. It picks up the data from each sector, stores a variety of patterns to the disk (0000, 1111, 01010, etc) to test it, and if the sector passes, it rewrites the original data back (or moves it elsewhere), making it once again freshly written, and less prone to data bit errors. The entire process takes about 3-10 hours for most discs, depending on the disk size and transfer rate (120 hours is the longest I have seen), but it often makes strange random errors go away. The most amazing use of it I ever made was on an NT machine that wouldn't boot and went to the BSOD. After 19 hours of Spinrite, Spinrite reported that it had repaired 57,000 disk errors. I thought "Ha, this thing is dead forever." I turned it on, and it booted, and ran for 5 years with no more problems. Obviously, this is only for magnetic media, and should not be used on SSDs...
@AlfredScudiero Good question. They were used to price orders and function as a cash register. They performed the function just fine, and as I neared retirement, there was less and less point to spending money to upgrade them. I sold the business July 1, and the new buyer elected to replace them, which cost him about $10,000, but also added ongoing expenses of about $5000 a year. Yes, they can do some things that the old DOS system couldn't do, but they also open up new problems. To hack my data, a hacker would have had to enter my building, for example, as it was not connected to anything. Had I been younger, I probably would have replaced them 4-5 years ago, which was about the time my competitors moved up from their DOS-based systems. Keeping DOS systems running was getting harder each year anyway, and in particular, the disk drives were going to fail, probably within a year. ;)
@@DavesGarage Hi Dave, could you explain how File Explorer calculates the size of the files in a folder? I can't seem to replicate it without recursively summing the file sizes in the folder. Is it possible to grab a folder-size property without redoing File Explorer's work?
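As far as I know, NTFS doesn't store a precomputed recursive directory total, so summing the file sizes really is what Explorer's Properties dialog effectively does. A minimal sketch of that recursion, using only the Python standard library:

```python
import os

def folder_size(path: str) -> int:
    """Recursively sum the sizes of all files under path, which is
    effectively what Explorer has to do too: walk the tree and add
    up each file's size."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # skip files that vanish or deny access mid-walk
    return total
```

Note this counts logical file sizes; Explorer's "Size on disk" figure additionally accounts for cluster rounding and compression, which this sketch ignores.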
@@albi2k88 First, Dave did explicitly note that optical drives (i.e. CDs, DVDs and Blu-rays) are a very different animal, but since you are unlikely to be running slow because of an optical drive, it's rather moot anyway. I'm pretty sure that most drives use variable geometry too, but as someone else noted: just replace them with SSDs (which use less power!) and leave the rotating media in its slow-lane past. Every older computer whose rotating disk I've replaced with an SSD has seemed a lot faster after the switch.
Speaking as an experienced teacher, I have to say that this is an excellent instructive video. Very fast paced, but easy to listen to. Clear, direct and understandable. I think I actually got 90% of it on the first pass. No annoying music or distracting graphics. I will definitely be tuning in often. Gigathanks.
See, Bruce, you're an experienced teacher with English as your mother tongue, and you caught only 90% and need at least a second pass. English isn't even my second language, yet I caught everything on the first pass, and it's all as you said: fast paced, no annoying music or distracting graphics. But what was I expecting to see? To hear, rather... from a guy who worked for Microsoft? Everything I heard is already available online via Google.
@@TroyQwert "In Google"? Google is merely a tool through which to access information from other sources, some of which might be unreliable or outdated. From this video, I got a bunch of reliable information from a person who I **know** knows this software inside and out. I've learned a bunch of it before, as someone who maintains dozens of old computers in a high school and who's trying to squeeze the last bit of life out of them, so I've definitely searched high and low for tips like this. But I still felt this was a good use of my time because I learned some extra details about *why* things work the way they do, which deepens my understanding (and makes me better at my job).
Depending on the distro and the age of the hardware, this is the answer. Almost anything besides Ubuntu, with Xfce or MATE, will work great even on 20-year-old hardware. Ubuntu is slow because of snaps. GNOME, Cinnamon, and KDE are lighter than Windows, but are still too heavy for a mechanical drive.
Some examples of hardware slowing down:
- Your computer may have been tuned in the BIOS (XMP, for example) and then the BIOS gets reset to default values.
- Your SSD is too full, so it has to do extra work to find space to save new data.
- A component may actually be failing, but the computer hides the retries, so it simply seems slow.
- You may need the correct driver, which was removed at some point, and now the computer is making do with a basic one. I found a user running a machine with the wrong network driver, which gave them a 100 Mbps link when it should have been 1000. They were also using the Windows VGA driver when a much better graphics driver was available. A gamer would have been unable to game, but all that really happened is the computer felt slower to use.
I agree with replacing the old mechanical drive with an SSD. I run several old machines including laptops. The single most important thing I did for speed was to swap to an SSD in each of them. Those old computers would take 3 or more minutes to start up and several more to become useable. Each one now starts and is perfectly useable within 30 seconds and everything loads 10 times faster.
@@satunnainenkatselija4478 If your PC takes that long to boot, your boot drive and/or CPU must be complete trash. Also, as noted in this video, you could have way too many startup programs. I have a Gen 3 NVMe SSD with a Ryzen 7600 and Windows 11, and my boot time is about 15-20 seconds from BIOS splash screen to all startup programs opening. Your PC or OS install has some major issues.
@@satunnainenkatselija4478 Most of the time, old computers take long to boot because newer software is just too much for them. It's not that programmers make it slower on purpose; they actually do the opposite.
@@Silverhazey_ That's the thing: old laptops have weak CPUs by today's standards, that's the point. You can basically revive these old PCs to run smoothly when you swap the HDD for an SSD. For 99% of PC users, that's good enough.
How old are you, 12 going on 30? That line is extremely overused. Are you able to think for yourself, or does everything you think and do come from what you read on YouTube?
He's inexperienced. He says that dust and clogged fans causing thermal throttling is the only way he knows that hardware can slow with age. That is false, as is the assumption that software is nearly always the culprit.
Good info. I find these are the most important:
1. Updates: Windows, browsers, apps.
2. RAM: adding some RAM may improve speed.
3. Hard drive space: some of the users I support save too much on the C: drive and it gets full.
4. Malware.
Many users don't pay any attention to these.
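Point 3 is easy to check automatically. A small sketch that flags a nearly full drive; the 15% default is just a common rule of thumb, not any official Windows threshold:

```python
import shutil

def drive_nearly_full(path: str = "/", min_free_ratio: float = 0.15) -> bool:
    """True when the volume holding `path` has less than
    min_free_ratio of its capacity free."""
    usage = shutil.disk_usage(path)
    return usage.free / usage.total < min_free_ratio
```

On Windows you'd call it as `drive_nearly_full("C:\\")`, e.g. from a scheduled script that nags the user before the drive fills up.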
True re the RAM. First machine I had was a loan from brother in law a 486 DX-66 with 4MB RAM and a 200MB hard drive. Once I installed Windows 95 on it, it had about 35MB space left and I had to constantly install and uninstall games to play them. Put in a CD-Reader and another 4MB RAM and a second hard drive and I was in heaven. LOL.
Way easier to let the pros do all that for me. Seriously why hasn't this become the norm? Many areas it has like social media. But why do we labor with silly tools like Drive? the almost impossible local network arrangements? antique printers? cd burners and boxes of install discs?
@@akale2620 Idk what you are using that's so special. The majority of backroom servers and internet infrastructure run Linux. M$ may own the desktop, but now all you need is a browser for most things. I have been using Linux for decades and NEVER had a problem. I run 7 servers at home, all Linux. All the software I use is open source, FREE, and works perfectly. You might want to try it.
@@rty1955 Oh, I know servers etc. run on it. But it simply didn't work for me. All we needed was to take a few printouts for family tax filing. Had Win10 that never worked for me, so I installed Zorin OS instead. Turns out the printer I have isn't supported by the manufacturer on Linux, and the open-source drivers didn't work because there aren't any for my specific model. Couldn't play any torrented games on it either, old games like Age of Kings and Mass Effect 1. Switched back to Win7.
@@akale2620 Did you try the games under Wine on Linux? I don't use my computers to play games. I do electronic design work & 3D modeling. I use email, word processing, personal accounting, spreadsheets, etc. I have been running Linux since 1996 and have never used Windoze since.
I applaud you and your wife's parenting by keeping your kid's computers out of their rooms so you can keep an eye on what they are looking at on the internet. You are a rare breed my friend.
I think if my parents did that my life would have gone nowhere (I got into IT security and I'm a computer engineer and in college now), but my dad didn't work for Microsoft so it went about as far as "computers are nifty" before I had to take matters into my own hands and start tinkering. At one point he got mad 'cause he found out I had the source code for like 1400 computer viruses, but he never found out I was wandering around other people's computers because back then you got on the Internet and your IM program gave me your IP and I could just type \\ip\C$ in the box and look at your hard drive. Good job Windows, I guess Dave didn't write that part.
Just giving each kid a dedicated desktop PC of their own makes a huge difference. Most kids are simply addicted to phones (and silly apps) with no alternatives. Even the gaming community differs between PC games and phone games.
This is the first time I've even seen one of your videos and I'm impressed. Extremely articulate, well spoken, and obviously very knowledgeable. Thanks for the info. Will try this on my older secondary desktop.
Dave, there's a third scenario to why I watched that video. I came across your channel a few weeks ago after you published a video on Task Manager. After I watched it, I opened another one, about pointers. Your channel is absolutely fantastic. I hope you're having as much fun doing these videos as we do watching them. I missed such content here on YT🎉
@@johngillanders9694 For all who do not like Linux and its thousands of forks and clones: you may try out Android-x86 :) I used it on an old Phenom X4 and it runs fast and stable.
It's really amusing to see the amount of interest in something that has become so ubiquitous, yet mundane: the Windows operating system. Your channel is a treasure trove of information and entertainment pertaining to all of the little thoughts and questions that pop into my head while using my PC. Thanks for all of your hard work throughout the years!
Also, if you are running a mechanical hard drive, clone it to an SSD and old machines will perk up significantly. I use Macrium Reflect; I find it does an excellent job at making an exact clone. Don't forget to also allocate some empty space (usually around 10% of total disk volume) at the end of the disk for over-provisioning. (I also highly recommend Samsung SSDs for speed and reliability.)
This is the single thing that will have the biggest effect on an old computer. I've done it on my old laptops and it made a big difference. I did it on my old desktop, which still had a CPU overclocked to a relatively high speed, and it made a really big improvement on that computer too.
About 15-18 years ago I switched to an Apple system because I was having to spend so much time doing maintenance, cleaning crap out of the system, maybe an hour or two a week, usually on weekends. I am/was a journalist/researcher/copyeditor and used the internet in some odd places. Very little of that time is wasted with my Mac mini. Anyway, one cost is I still feel uneasy with my lack of familiarity with Apple operating systems. I don't need to kludge around in them, but in a couple of cases that caused big problems that would not have happened with Windows, or I could have worked through them. But I was amazed that after all these years, I understood precisely, or had used an earlier version of, every fix you mentioned! When you pointed to old versions of a file, I remembered the file! Thanks, I enjoyed this.
Bonus bonus item: clean the physical machine. If the heatsink for your GPU or CPU is covered in a blanket of dust, you might encounter thermal throttling. Blowing that dust off can actually make your computer faster.
I keep the cold packs that come with my "old man" medications. I pack the laptop in them when we start each recording session (wife is a singer). Go ahead, fan. Just TRY and spin. I don't care what Louis Rossmann says..
Amen, Amen, Amen. I had one customer whose desktop was literally crammed full of dust stuck to layers of nicotine; I actually took the time to take pictures of it while cleaning. The thing would power up but shut down from the overtemp in about 30 seconds. I used the pics to educate him on one of the other hazards of smoking and to justify the additional time charged for the "repair". And that was really all that was wrong with the unit. Once cleaned, it ran beautifully. (Many other bad cases, but that was the worst.) So many people don't realize just how much dust, cat hair and other airborne gunk those fans suck in, and how it affects the performance and lifespan of the system.
All current hard drives use zoned geometry: the outer tracks have more sectors per track than the inner tracks. Each zone has a set number of sectors per track, and as the zones progress toward the center of the disk surface, that number is reduced. This prevents the waste of space that would occur if every track held only the number of sectors that can be reliably recorded on the innermost tracks. This zoning system started sometime around when the IDE/ATA LBA (logical block addressing) system came into widespread use. Early IDE drives emulated a fixed sectors-per-track geometry for compatibility with older system BIOSes and software.
Exactly! And at that time, the first tracks became the outermost ones, in order to hold more sectors per (physical) track, and more importantly, more sectors per revolution and thus per second, making the first tracks faster than the last ones. Drives can have five or more different sectors-per-track zones, but at a certain point, adding another zone would gain a negligible number of sectors. The "different angular velocity" fallacy is probably a mix-up with CD/DVD drives. HDDs don't have that, but the differences in _linear_ velocity matter. (HDDs with variable RPM exist, but that's done to save power when there's little I/O, e.g. if they stay on during the night as many servers do.)
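The zone-speed difference described above is easy to put numbers on: at a fixed RPM, one full track of sectors passes under the head per revolution, so sequential throughput scales with the zone's sectors per track. The sector counts below are illustrative, not from any particular drive:

```python
def zone_mb_per_s(sectors_per_track: int, rpm: int = 7200,
                  sector_size: int = 512) -> float:
    """Sequential throughput of one zone: sectors per track times
    sector size times revolutions per second."""
    revs_per_second = rpm / 60
    return sectors_per_track * sector_size * revs_per_second / 1e6

outer = zone_mb_per_s(1500)  # hypothetical outer zone: 92.16 MB/s
inner = zone_mb_per_s(800)   # hypothetical inner zone: 49.15 MB/s
```

Same RPM, same angular velocity, yet the outer zone moves nearly twice the data per second, which is exactly why the first (outermost) tracks benchmark faster.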
@@DavesGarage Given that this is an issue, perhaps it's a case where Microsoft could follow the approach used by some other OSes (and some Linux distros): the OS and all its "important files" are read-only, perhaps even sitting on a separate read-only partition, and updated as a unit. I'm personally not a fan of the approach, not for my own systems (Windows 10 for MSFS, Arch for other games, and OpenBSD for the laptop because I need to compensate for my lack of beard growth), but since MS ships an OS specifically targeting "normies" it could be a thing. Or is this something that simply wouldn't work with how Windows is built? If nothing else, it seems like it would reduce the number of times I have to remote into a family member's laptop to fix things. :P
This video was a good kick up the backside to actually clear out my drives. Over 3TB cleared, my decade old HDD emptied for good. I don't know about performance improvements but it's definitely satisfying to see all the bloat gone. Thanks Dave, I needed this :)
Thank you, Dave! I'm autistic myself, and am inspired by you in a huge way to make sure my non-verbal son can enjoy the gift of his life to an equally amazing degree! Take care Dave, and thank you so much.
Wow, thanks! Raising a non-verbal child takes a lot of patience; you're in my thoughts and prayers! Just remember the love is there even if it's hard for ASD folks to show it!
@@DavesGarage Dave, I am sorry for the late reply; I was somewhat intimidated by the prospect of conveying the depth of my gratitude for the wisdom you've shared and for the inspiring accomplishments you've achieved with your savant-like skills. I am humbled by the humility on display. I assure you I learned much from my own youth as a verbally impeded individual, and I have been able to engage with a world far more receptive and committed to showing us that we are people, too. I am almost forty and he is almost five, and I remember to warmly show him the love that I know is critical to his healthy development and to his appreciation of who he sees looking back at him in the mirror. I am so lucky to be his father, and Tanner is so dear to me. He smiles when he wakes up, and he is so happy to be alive. He's incredibly intelligent, although it carries a penalty I think is mutually known between you and me. As a father, thank you for the kind words reinforcing a reality this parent can attest to being amazing, as long as one stays receptive to the somewhat unique expressiveness we have. Take care, Dave, and thanks again!
I love it! And I mostly agree with it. I would however have liked two things. First, a bigger warning that defragging is for spinning rust only. Like a seizure-warning, epileptic-fit-inducing, flashing-lights-and-sounds big warning. Because not everyone is an expert, and forcing a defrag on SSDs is bad.

And second, an actual in-depth look at how the performance of your SSD degrades over time, and how each one is a bit... different. SLC is generally problem free, but low capacity. MLC, TLC, and QLC (and soon PLC) have... things... to watch out for. Storing more than one bit per cell, their first write is faster than subsequent writes, so write performance varies. Also they tend to cache a portion of their capacity as SLC. As long as you don't fill the drive up, you can enjoy that cache speed. Fill them up, however, and the speed goes bye-bye. Sometimes deleting files gets you back that speed, sometimes not. Etc.

SSDs are... complex, and can be a H U G E factor in why your modern PC slows down with age. A cohesive analysis of all the complexities to be aware of, and what can be done, would be a nice part 2, IMHO. You could also look at how some anti-malware packages are "better" than others at slowing down your system, if you need more time to fill. Maybe go into background tasks? Bloatware from folks like Dell?
SSDs have to balance write cycles; i.e. to ensure that certain frequently used areas like directories don't wear out soon, they have to do some extra book-keeping and move those areas around. That "moving around" isn't trivial; simply put, they have to store information about "where sector 1 is today" to find the data, and _that_ information might change often, too. Long story short, defragging the SSD uses write cycles; a _full defragmentation would be a major waste of SSD lifetime_. A more selective mode, which leaves all but the most badly fragmented files untouched, isn't half as bad, but shouldn't be scheduled for regular maintenance either.

Yes, there are statistics which claim that defragging SSDs is important. They claim that a regular HDD went from 500 to 1,000 I/O operations per second and an SSD went from 8,000 to 10,000 (so the HDD gained 500 and the SSD gained a whopping 2,000), but that's not the metric you should look at. Why? Because you don't get up and say, "Today is a good day, I'm gonna do 100,000 I/Os"; the number of I/Os is usually defined by the stuff you do. What you should look at is how long those I/Os take. That'll be 200 vs. 100 seconds on the HDD, and 12.5 vs. 10 seconds on the SSD. So, defragging would save you almost 2 minutes on the HDD, but only 2.5 seconds on the SSD. Unfortunately, _some_ defragger makers know that very well, but use bad metrics to sell defraggers to SSD users, a very malignant practice.
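The arithmetic in that comment checks out; using the same (claimed) IOPS figures, the time saved for a fixed 100,000-I/O workload is:

```python
def io_seconds(num_ios: int, iops: float) -> float:
    """Time to complete num_ios operations at a given IOPS rate."""
    return num_ios / iops

IOS = 100_000
# HDD: 200 s before defrag, 100 s after -> 100 s saved
hdd_saved = io_seconds(IOS, 500) - io_seconds(IOS, 1_000)
# SSD: 12.5 s before, 10 s after -> only 2.5 s saved
ssd_saved = io_seconds(IOS, 8_000) - io_seconds(IOS, 10_000)
```

Same "IOPS gain looks bigger on the SSD" data, yet the wall-clock benefit is 40x larger on the HDD, which is the whole point about choosing the right metric.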
Not true at all. VSS works better if your disk is defragmented; it even does it on SSDs on a schedule. It's normal. Enjoy your YouTube KNOWLEDGE though. "A SSD is a block device it can and should be treated like a hard drive, it's faulty if you can't."
@@itstheweirdguy "A SSD is a block device it can and should be treated like a hard drive, it's faulty if you can't." Then I guess SSDs are faulty, according to you. Because while you certainly can do that, as in nothing stops you, you're also definitely shortening its lifespan. They have a limited number of writes. This is a well-documented fact of flash memory. Defragging a drive is moving data, eating up those limited writes. This is how flash memory has always worked. And with every "level" added to a flash cell (SLC->MLC->TLC->QLC) this gets significantly worse, because of the way these levels increase the capacity. So as you shorten your SSD's lifespan by defragging it, "Enjoy your YouTube KNOWLEDGE though." I hope your non-YouTube knowledge (whatever that is) brings you some solace when your SSD dies.
I had an insane performance increase when swapping from an HDD to an SSD. That said, there are a lot of other features you can turn off that boost performance, such as drop shadows or Windows Search. I also find that programs open way faster when launched one by one as needed, rather than all fighting each other for the same memory and disk reads at boot.
Remember the water-cooled CPUs in the early 2000s? Lol, water in a tower PC... EDO and non-EDO RAM, limited to 128MB, the Y2K bug threat... we have come a long way since 1999.
As someone who grew up with XP, the format option is what we did all the time. It's strange that you mention it as a last resort. We formatted our computers so much we memorized the Service Pack 2 key. Lol
Same here. If there's a problem that takes me more than two days to figure out, I generally say screw it and reinstall Windows. Might as well spend the third day setting up a system I know is fully working, rather than potentially still not having the issue solved and wasting yet another day.
Great solid top level advice Dave. And yes, in your opener I have been that person having to try and speed up relatives PCs. My favourites are 1) disable all but the necessary start-up items and 2) convince them to use Windows Defender and ditch 3rd party offerings. My other trick is to turn up with an SSD as a "present" and amaze them with the speed improvement after imaging the C: drive to the SSD. Love your work.
My Dell Optiplex 780 had become very slow. Upping the RAM from 4GB to 10GB helped, but not as much as I had hoped for. Then I found your vid and did all 5 fixes. Wow, what a huge difference!
Dave - I used to use WinDirStat to map out drive contents, but I've found that WizTree is much faster. It scans the drive's MFT instead of seemingly looping through all the files. While either will do the job, just thought I'd throw it out there in case you hadn't heard of it before. Have an awesome Friday!
Treesize (free) is another alternative that I've found to be pretty fast compared to WinDirStat. Not as fast as wiztree, especially for very large drives, though. But it's another option.
A brilliant session. Very well presented. I got so much from this too. We have such a machine in our family collection. You have prompted me to become rather thorough in my work to improve its performance. Much gratitude to you for jogging me into action and superbly providing me with a master class. Cheers
11:16 A note on WinDirStat: it's something of an old program and is somewhat slow. WizTree is a newer alternative and runs so much faster. It helps the file trimming go much quicker when you aren't waiting minutes between drive scans.
My preference is actually apps using Sunburst graphs. I find them easier to read. I don't use Windows so I am not sure what's available in that department, but I know they existed in the past.
Great video and insight for Windows users! Although I feel sorry for Dave's iMac. I just upgraded my 2015 21" iMac from LMDE 4 to LMDE 5 (short for Linux Mint Debian Edition); the computer is now faster, more reliable and has better functionality, and it will keep running like that for the next 3 years (minimum), regardless of how many programs I install. That will never happen with macOS or Windows, unheard of.
I've upgraded all my machines at work with SSDs, whether SATA or NVMe, and I never hear people complaining about their machine being slow anymore. Now I can spend more time watching Dave's videos haha
not if its running linux. do not install linux's /root /home to SSD! put each of these guys on a separate partition to a HDD instead. linux is so much faster than winX that the difference doesn't matter (linux's throughput will still be faster). there's a REASON why every server on the planet is running HDD (well almost every server is also running linux and the people managing these guys ain't stupid).
@@leecowell8165 Clearly you are delusional. Speed differences tend to be down to the actual program implementation. There are speed differences between OS, though generally not huge, and nothing anywhere near enough to make up for HDD snailspeed. SSD can easily be 50x faster in sequential speed, where HDD are good at, and thousands of times faster in random access, where HDD are horrible at. Linux can't magically change the laws of physics. It is physically impossible for an HDD to outperform even budget SSDs. Also, every server on the planet is not running HDD. There is a server hosting platform, Linode, they have been hosting on SSDs as long as I have used them, 8 years, and not sure how long before that. Many servers still use HDD, because they are cheaper per gigabyte. Many server purposes don't require high speed data access, such as archiving, and so there is still a purpose. For many servers though, they are switching partially or exclusively to SSDs, since costs have come down. I remember going to a conference, and learned Facebook was partially switching to SSDs... this was 9 years ago. I can agree that Linux has many server advantages over Windows, but HDDs is not one of them.
Just stumbled accross this and found it all really fascinating and well presented. Happy to discover at the end a bit of your background. You must have known my Father's brother Ken Dye who ran the MS usability lab in the 90's. He was a wonderful, beloved uncle who passed away early last year.
If your computer has an HDD, run Defrag every month or so. Another thing a lot of people overlook is checking the Services tab to see what's running automatically or delayed; try setting some of the services to Manual. This can result in a faster boot time, and the program will still work when you try to open it. Opening up the laptop or desktop and cleaning out the dust can reduce thermal throttling by restoring proper airflow.
Bonus bonus tip: if Explorer seems to be getting sluggish (taking longer or even freezing up while switching folders, generating thumbnails, dragging and dropping files, etc.) there's a decent chance that your drive is in the process of failing. (Especially if you have multiple drives and these symptoms only present themselves on one of them.) The reason being that Explorer needs to wait longer while the drive retries the read operation potentially several times before either getting a result or bailing out and presenting you with an error after all.
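As a rough illustration of that retry symptom, one way to spot it without any special tooling is to time sequential reads of a large file and count the outliers; the chunk size and 0.5 s stall threshold below are arbitrary choices for illustration, not drive-vendor figures:

```python
import time

def slow_read_fraction(path: str, chunk_bytes: int = 1 << 20,
                       stall_seconds: float = 0.5) -> float:
    """Read the file sequentially and report the fraction of chunk
    reads that stalled past stall_seconds. Near zero on a healthy
    drive; a drive retrying weak sectors shows long outliers."""
    slow = total = 0
    with open(path, "rb") as f:
        while True:
            start = time.monotonic()
            data = f.read(chunk_bytes)
            if not data:
                break
            total += 1
            if time.monotonic() - start > stall_seconds:
                slow += 1
    return slow / total if total else 0.0
```

This only samples one file, and OS caching can mask a marginal drive, so it's a hint at best; SMART data from the drive itself is the more reliable check.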
Is that the ONLY file manager avail under windows? Does it have a split panel like Nemo or Nautilus over on linux? No probably not, huh? Doesn't windows check the drives at startup? No probably not, huh? Yeah linux is slow as hell to start but there's a reason why because it catches shit like this.
@@leecowell8165 I think Windows actually has a program that scans drive health, though. Although I'm not sure if you can use it to set an alarm.
@@generalardi That's because that isn't the way it is done. Enable S.M.A.R.T. monitoring on your drives so you know when they are failing. ZFS (as one example) on *nix actively scrubs the disk in the background, looking for and repairing disk blocks that don't match their checksum.
@@leecowell8165 Another Linux sectarian... > Is that the ONLY file manager avail under windows? Does it have a split panel like Nemo or Nautilus over on linux? No probably not, huh? Windows does have those, even more of them and more functional than Linux's. Among the most widely known are FAR, Total Commander, and Directory Opus. But in the latest Windows versions Explorer keeps getting better and better, so people gradually abandon the 3rd-party ones. Even me, a TC user since the days it was called Windows Commander, sometimes I prefer Explorer because it has become more convenient. Compare all that to Linux distributions' native file managers. They're just trash. > Doesn't windows check the drives at startup? No probably not, huh? It may be unbelievable to you, but YES, Windows does! In case any problems are suspected, be it an ungraceful shutdown or a "dirty" flag in the FS, Windows runs `autochk` during boot. > Yeah linux is slow as hell to start but there's a reason why because it catches shit like this. Aaaaand despite that, I personally know some mates who got their storage corrupted while using a Linux distribution! I personally experienced storage corruption after Ubuntu installed updates! What an irony: neither is something I ever experienced on Windows. Just facts proving your points are bullshit.
One other option to reclaim disk space (I often do this around the same time as running chkdsk and sfc) is to run dism with the check health switches to remove any cruft I can from the sxs directory that grows to gargantuan sizes over the years as updates and patches are installed.
I love how authoritative this channel is. No hand-waving or shrugging: "I trust Windows Defender because the guy who runs its development worked for me and he's really good"
As an IT professional this is fantastic information in particular for helping relatives (as Dave mentions). He dispels some common misconceptions and helps prioritize repairs. Grateful for this content.
I have a 2015 MacBook Pro with 16 GB of RAM that I bought in January 2019; it has the Intel CPU. The problem isn't "computers," it's the massive sewage pile of Windows bloat BS. My 2015 MacBook Pro seems/feels as fast as my 2023 MacBook Pro with 32 GB of RAM and an M2 Pro chip. It's *Windows* computers that slow down. I haven't owned a Windows PC since 2013. I love their upgradability, but Windows is certified junk.
@@rohitnijhawan5281 Because Mac doesn't use a registry that has to be kept resident in memory. Linux is much the same; config files or plist files that are read as needed are far superior to registry-style management. The whole time he was talking about registry-related issues, all I could think was: this is a Windows-specific issue.
Dave, two tools you didn't mention are AUTORUNS, to edit startups, and PROCESS EXPLORER, to get a better detailed look at what is in memory and the process tree attached to the executables. But a good cleanup video for the medium tech-inclined user.
Yeah, big up for the various Sysinternals utils! These two are the most generally useful, but for digging deeper into specific issues, the rest of the suite is gold too. Which is why µS bought the Winternals dev studio and brought it into the corp in 2006.
Here's a quick maintenance tip. As a preventative measure, when downloading files from the internet to an HDD, if you have a second drive, transfer the file via cut/paste to the other drive, then cut/paste again to its final destination. This prevents the file from being left fragmented by the download process and makes future defrags a lot faster. I found this out on my own when downloading large files, scanning the disk for fragmentation, and seeing massive amounts of it. Then I discovered that simply moving the file between drives acts as a form of defragging. Test it out yourself.
Finally a video that is 100% actual facts! I've been studying computers my whole life and have had to argue with people who claim silicon degrades over time and runs slower because of it. Your clock doesn't slow down if the silicon degrades; it might become unstable (BSOD) under load or not work at all, but it won't slow down.

Usually the actually slow computers are the ones with 4GB RAM and an HDD running the latest version of Windows, where Windows Update is constantly active, the Modules Installer Worker never stops hammering the HDD (making paging impossible too), and everything takes minutes to respond, even just opening a context menu. If you can't get 8GB RAM or an SSD or both, using Windows 7 or 8.1 will give a lot more performance, but won't get security updates.

Using the latest version of major Windows doesn't always help. I noticed that newer Windows 10 and Windows 11 builds have a far more CPU-intensive Explorer and DWM, and even on a fresh installation they leave a dual-core CPU at 100% on idle while seemingly accomplishing nothing with the calculations being done. Different Windows 10 builds have different levels of idle performance, with slightly older versions usually giving a little more (1809 vs 22H2, for example). I guess Microsoft just added more bloat to Windows 10 once their focus shifted to Windows 10X and 11.

Disabling Windows Update is not a good idea, but it can speed up an old computer with an HDD by a lot because it won't constantly be modifying the component store. Windows Defender is less intensive, but it can also slow down file downloads significantly. Uninstalling apps isn't an efficient approach if you know how to use Task Manager to see which processes are actually running; focus on uninstalling or disabling the processes that actually use your resources.
Modern software is built for SSDs and faster computers, with less focus on performance on old hardware. If you want to use old hardware you can, but to make the most of it you might need older versions of software, which means giving up security; you'll have to be more careful about what software you run and which websites you visit if you go that route. Connecting to the internet should be fine on any machine (with IPv4) as long as you don't send requests to malicious websites and don't have open ports with the firewall disabled. IPv6 is a different story, and also harder for malicious actors to scan constantly.
You didn't spend much time on thermal throttling but I've found it to be significant with my old laptop. Laptops are used on your lap, or laying on the sofa, or the bed, and their intakes can bring in bits of lint. My laptop had a very small air cooler and right in front of that air cooler was a big dust bunny. I removed the fan, removed the dust bunny, and the computer didn't thermal throttle as bad. In most situations I don't think this would affect a big desktop computer as much unless it is in a pretty bad environment.
A laptop was NEVER intended for that, and the manuals will clearly tell you NOT to use it on a sofa, bed, or anything carpeted. If you're doing that you're hurting the device outright, which is a completely different issue entirely, I'm afraid. ^^;;
Hardware wise, I think you nailed it Dave. Cooling is a big thing - esp on laptops. Very often when a laptop battery gets old, a lot of the charging current ends up as heat in the battery; this - possibly in conjunction with clogged vents or fans - can cause thermal throttling. I have revived several laptops for people by cleaning the airways and replacing the battery after noting that the old (usually original) battery was almost too hot to touch! Completely agree about that hard drive "outer edges" thing - it's rubbish cos (a) of the factors you cite and (b) cos on modern drives all access is via LBNs (Logical Block Numbers) at the OS driver level, and at the cluster level above that. How LBNs map to physical drive surface locations is a per-manufacturer/per-model decision implemented in the drive's internal firmware, and it varies quite a lot. In short, it may not use a "start from the middle and work out" algorithm. All that HDD optimisation stuff built into early Unix systems (and for all I know early DOS and Windows 3.1 systems) used intimate knowledge of the disk drive hardware - average seek times, head switch times, cylinder read rates, rotational latency minimisation, read/write clustering - to get the best possible performance out of hard drives which, even with all that stuff, was pretty poor by today's standards. Thankfully most of that optimisation moved into drive firmware in the late 1990s, and around the same time they started including thousands of spare blocks to transparently replace any main blocks that became unreliable with age. So now (aside from manufacturer diagnostics) hard drives are black boxes offering a huge bucket of perfect 512-byte storage blocks, each with a unique LBN, and that's about all that anything at the OS level knows, or needs to know.
About the only optimisation you can do now at the OS level is to change the cluster size to suit your use of the storage (video files = big clusters, thousands of tiny files = tiny clusters). In short, anyone trying to tell you that you can affect physical placement of your files, is probably trying to sell you something!
@@gorak9000 Agreed that the registry was not a great idea. However, Windows contains one hell of a lot of other features which are. As a long long time Unix user I was so glad to discover PowerShell for example - a very powerful command line environment with unified syntaxes and excellent documentation which nevertheless takes on board all the key goodies of Unix shells.
Today, as you noted, drives use LBN. I still support some non-Windows servers that use 300GB drives. Since no one actually MAKES 300GB drives anymore, if a system won't accept the 600GB drives we have in the logistics system, a drive gets destroked: a larger drive gets firmware that just makes it look like the old 300GB drive. Most of the systems WILL take the larger drive posing as a 300GB one. It doesn't really help much, though, as they are usually mirrored, so unless it's the second of a pair being replaced, the volume is still based on the 300GB drive it was originally built upon.
It is amazing how much thermal throttling can affect performance. Even with a tiny improvement in cooling/heat transfer, the machine can run maybe 12% faster, which doesn't seem like a lot, but it is noticeable.
Yeah, people are STUPID. They run these things on pillows instead of hard surfaces, insane stuff like that. I rewired my Aspire to keep that fan running 100% of the time; however, it's plugged in almost all the time anyway. Heat is thy enemy.
@@leecowell8165 Yeah, I remember seeing a friend sat with a soft rug on her lap and the laptop on top of that. When I asked why the rug: "Oh, cos the laptop runs so hot it burns my leg." Doh!
@@crankshaft3612 so do spinning platters of rust. Motor bearings die. Integrated electronics cease to function. It's a crap shoot. But yeah, SSDs die too.
@@crankshaft3612 I bought my SSD in 2012 and am still using it after ten years. If you don't write a lot of data to them, they can last for ages, especially the SLC ones.
Something I found throughout the years: Many of the SATA cables that I got with motherboards/drives back when it was a brand new standard got glitchy over time(also wobbly in their connectors although I'm not sure they weren't like that new). Sometimes loading something would just stall for a few seconds or even hang the system. If the system decided to "hang" moving the cables around a bit would often get it moving again. Even if all seemed fine drive benchmarks would sometimes be "slow" because the controllers fell back to the lowest speed they could as they were getting too much garbled data. On HDD's the actual disks would make clicking/chirping noises (seagate drives here). If you run into something like that, check the UDMA_CRC_Error_Count (or something similar) SMART stats. If it's high, replace the cable before replacing the disk to get everything running smooth again.
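As a rough sketch of that check (assuming smartmontools is installed; in the classic ATA attribute table the raw value is the last column, and `crc_error_count` / `fetch_crc_errors` are made-up helper names):

```python
import subprocess
from typing import Optional

def crc_error_count(attr_table: str) -> Optional[int]:
    """Find UDMA_CRC_Error_Count in a `smartctl -A` attribute table.
    In the classic ATA layout the raw value is the last column."""
    for line in attr_table.splitlines():
        if "UDMA_CRC_Error_Count" in line:
            return int(line.split()[-1])
    return None  # attribute not reported by this drive

def fetch_crc_errors(device: str) -> Optional[int]:
    """Query a live drive (typically needs root)."""
    out = subprocess.run(["smartctl", "-A", device],
                         capture_output=True, text=True).stdout
    return crc_error_count(out)
```

A nonzero count that keeps climbing points at the cable or connector, not the platters, which is exactly why swapping the cable first is the cheap fix.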
@@geonerd Yeah, I noticed the performance issues and went digging. At first glance the disks were fine, but I found a lot of kernel log messages about re-initializing SATA connections and lowering the speeds due to errors. Ever since then I've kept a bunch of extra cables around and replace them every now and then. Kinda makes me wish for those good old IDE cables that, once plugged in, could be hell to remove. I'd rather have issues pulling them out than have them coming loose or crapping out and creating weird glitchy behavior.
Just went through a month of hell moving to larger drives and having errant boot problems as Win 10 decided the offering was insufficient and a full check disk was in order (and Win 10 isn't kind enough to tell you which disk(s) it is unhappy with, and only ten seconds to skip the process after the spinning circle does some incantation for an unknown length of time). Two bad cables (and brand new!) and cursing windows move to hide the sausage (seriously considering moving to dual boot Win 7 and Solus). Oh, and the old trope of some disks being happier in a particular orientation still holds even today (some REALLY don't like being vertical).
@@wayland7150 I've found them to be quite unreliable. I've had ones that were fine for 5-10 years and then started acting up even though they weren't touched and the system hadn't been moved. It's weird.
The 1st change I tend to make when asked to "speed up" someones machine (with a HDD) would be to image it onto a higher capacity quality SSD. That provides a cold backup and a 30-ish-X IOPS bump. Basically a do no harm update that tends to be cost effective. Followed by all of the steps in this video of course. RAM isn't always accessible, and when it is it may not be available or have sane pricing.
Yeah this^. Reinstall is preferable imho but imaging is a good way to preserve everything. Generally not even worried about boosting the capacity these days - a lot of people I've seen use a tiny percent of their drive and do everything online. Often customers say a smaller drive is even ok if it's cheaper.
@@jjjacer I always forget how utterly unusable Windows 10 in for several minutes after boot on a mechanical disk, and every time I go to Task Manager all "WTF?!?!" it's because the spinning rust is pegged at 100%. SSD all the things.
@@QualityDoggo I've had customers with 2 TB HDD that was failing (because laptop HDDs last about 5 minutes) and they only used ~80GB including the OS, software, and update files. I think I installed a 256GB SSD for them and the system worked good as new.
What happens if the machine already has an SSD? Or, to put it another way, SSDs can only be a partial fix to NTFS fragmentation problems. Sure, you don’t have to worry about seek delays, but if you have to access the same data via separate smaller reads/writes versus one big transfer, the multiple smaller transfers are still going to be slower. And remember, the NTFS defragger had to be disabled on SSDs, because it was shortening their life.
Hard drives do use constant angular velocity, but the number of sectors per track does change: fewer sectors per track on the inner part of the drive, more on the outer tracks. Check the documentation for Seagate drives; disk-to-head transfer rates are quoted for the outer track. IBM described the change in sectors per track as "notches". Even back when disk drives were described using tracks, sectors, and heads, the drives had the smarts to translate that to a block number and locate the data. Data is written to and read from the drive at different speeds to keep the bit density constant over the surface. 3.5-inch hard disks could have as much as a 3-fold advantage in speed, plus reduced head seeking, at the outer edge of the drive compared to the innermost track. I think this started with ATA/IDE drives.
@@DavesGarage I second what @andrewscott1451 has written. Some two decades ago I wrote a simple terminal proggie for Linux (using ncurses) that reads a disk drive sequentially and displays a progress bar, plus an instantaneous transfer rate gauge. You can read the whole block device start to end. If you do that, you'll find the drive is indeed faster at the start (= along the outer edge) and gets gradually slower. In the late noughties, the typical 3.5" desktop drives of the era would start at about 105 MBps and gradually slow down to maybe 40 MBps towards the end. This is a known fact. The inner geometry is not published in the datasheets, but the observable behavior is universal across spinning rust vendors and models. The max/min/avg transfer rate develops coarsely with drive capacity - more precisely, with data density per square inch (or whatever units you are used to) - and the dependency contains a square root! Guess why. Hint: the bits along the track get denser, but the tracks are also narrower and closer together. The ratio between the sequential rate at the start and at the end had a notable correlation with the radius of the tracks at the start and end. Note that "enterprise" drives (Seagate Cheetah) used a narrower "ring" on those platters, and therefore this ratio of outer/inner transfer rates was also lower. My HDD test proggie can read and write either sequentially or using random access - which also gives funny results. Such as: over the last two decades or so, while disk drive capacity kept growing, the "fully random seeking capability" has remained pretty much constant, at about 75 random seeks (IOps) per second for a desktop 3.5" drive at 7200 RPM. Enterprise drives can do more, maybe up to 300 (steady) IOps with TCQ/NCQ enabled and preferably also with transaction reordering in the host-side write-back queue. Notebook drives can do about 60. That was before we got blessed by Shingled Magnetic Recording :-(
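The same measurement is easy to reproduce. Here's a minimal sketch in the same spirit as that ncurses proggie (Python here; `sequential_read` is an invented name, the 4 MB chunk size is an arbitrary choice, and reading a raw device such as /dev/sda needs root):

```python
import time

def sequential_read(path: str, chunk_mb: int = 4):
    """Read a file or block device start to end, printing a running
    throughput figure. Returns (total_bytes, avg_MB_per_s). On a raw
    HDD device the rate visibly sags toward the end of the disk."""
    chunk = chunk_mb * 1024 * 1024
    total = 0
    start = time.monotonic()
    with open(path, "rb", buffering=0) as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
            elapsed = time.monotonic() - start
            rate = total / elapsed / 1e6 if elapsed else 0.0
            print(f"\r{total / 1e6:9.1f} MB  {rate:8.1f} MB/s", end="", flush=True)
    print()
    elapsed = time.monotonic() - start
    return total, (total / elapsed / 1e6 if elapsed else 0.0)
```

On an SSD the curve stays flat; on spinning rust you should see the gradual drop described above.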
I used to install so much on my Windows 98 machine back in the day that I knew the ins and outs of the registry. I never ran an antivirus and was willing and able to do a complete reinstall religiously about once a month... after evaluating all those juarez... lol. Sometimes stuff just bloats, like you said. I love how you said exactly what I was thinking once again, Dave! THANKS man. Maybe I'll join your Discord or something where we can actually talk about some things, but for now I'll post my praise here. At least for me, I really dig your channel and persona. Keep on rockin, man.
A lot of OEM thermal paste degrades and/or pumps out so it's a good idea to do a sanity check on CPU temperatures with HWiNFO and Prime95 and repaste if temps get out of control. (by out of control I mean thermal shutdown, throttling below base clock speed, or reaching a steady state temperature over 90C with stock power limits (12900K excluded))
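As a trivial companion to that temperature sanity check, here's a hedged sketch: `thermal_verdict` is a made-up helper that takes (label, degrees C) samples you'd collect under load, e.g. exported from HWiNFO logging or gathered via `psutil.sensors_temperatures()` on Linux, and applies the rough 90C steady-state figure mentioned above (adjust per CPU):

```python
def thermal_verdict(readings, throttle_at=90.0):
    """Given (label, degrees_C) samples taken under sustained load,
    flag anything suggesting the cooling/paste needs attention.
    The 90C default is a rough rule of thumb, not a spec value."""
    hot = [(label, t) for label, t in readings if t >= throttle_at]
    if hot:
        worst = max(hot, key=lambda r: r[1])
        return f"repaste/clean suspected: {worst[0]} at {worst[1]:.0f}C"
    return "temperatures look OK"
```

The point is just to compare against a threshold you trust for your specific CPU rather than eyeballing a wall of sensor numbers.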
This video is addressed to a different level of user. If you haven't changed the thermal paste in older laptops many times before, it would be safer and less time-consuming to have a repair shop handle both the diagnostics and the procedure itself. Most of the time the diagnostics will be free no matter the result.
Replacing thermal paste isn't something I've had to do often in the repair industry, maybe once or twice. Practically every case of thermal runaway I have seen was due to vents clogged with dust and debris, or the occasional bad fan. For that reason it's still not a bad idea to do what you said and check temps, but I'd start with the simplest solutions (checking dust and fan performance) before deeming the thermal paste bad. As for the thermal paste itself: yes, there is good and bad paste, but it gets to the point where it will look "dry" without that being a solid indicator of whether the paste has actually "gone bad." The thermal paste industry is kind of gimmicky in that it spends a ton of marketing money convincing people that paste does in fact go bad regularly. Even the metric used to measure performance is watts (of heat dissipation) per square meter, and scaled down to the size of a CPU, a difference of 10 watts per square meter will be hardly noticeable, if at all. That's not counting the fact that once the HSF is squeezed against the CPU, the thermal paste doesn't actually cover it end to end, only the microscopic crevices created by the manufacturing methods; maybe a total coverage of around 5mm is done with paste. I once used RadioShack-branded "silicone thermal" paste, which cost me 35 cents for a 2-ounce tube, on an Athlon II X4 (I replaced the stock paste because I had to undo it to test the CPU on a friend's board). It worked fine for 2 years and probably would have gone longer if I hadn't upgraded the cooler for the sake of overclocking, at which point I opted to use the new cooler's included paste. Regular silicone paste is about the worst you can get performance-wise.
@@BrunodeSouzaLino Only in OCing, and in cases where people run their computers for literal months while clogged up with dust and cigarette smoke residue, have I seen thermal paste go bad enough to really matter. Thermal "pump-out" is actually the issue, not it drying up (which turns the paste into a powder, often visible on the video card or the bottom of the case). Modern PC CPUs generally don't run hot enough at stock to do that. As for drying out: apply some paste and check back in 2 months. It'll likely be dry, so does that mean you should replace paste every 2 months? And yes, I have had paste dry up so badly that pulling and twisting on the HSF while removing it pulled the CPU out of a latched socket, yet monitoring software showed no thermal issues prior to that. Generally when people replace thermal paste they address other issues without realizing it, including removing dust and reseating fan connectors, and then attribute those gains to the paste. Which leads me to my last point: telling users via a social media platform to just replace their paste without investigating much simpler issues first actually does them a disservice, as they're more likely to damage something tearing their PCs apart with no prior experience just because some tech bro told them to on RUclips. The one thing replacing thermal paste as a first jab at diagnosis helps is the bottom line of the companies that produce it. It's all about the money. Converting old PCs to SSDs, on the other hand, especially when the HDD is on its way out, always gets me noticeable performance gains.
Totally agree that Defender is all you need. Malware can be very odd. Had a case back in the 90s. A user got reprimanded for visiting a ton of porn sites. The router monitoring software said so. I knew very well she wouldn't do that at all, particularly at work. I checked the browser history, but nothing, which I expected. While I sat at her computer the monitored hits kept coming. Turns out there was some malware doing web gets to porn sites, with a referrer code in the get, to generate revenue. The content of the sites went straight to null, so no-one was the wiser.
I can't rely on any application from MS because they are just as likely to dump it and fail to support it. I'm thinking mostly of email programs, which have gone through too many changes. And because MS thinks annoying users is a good practice. The same applies to Defender. It is apparently quite good at the moment, but a few years ago it went from a good above-95% virus detection rate to only 50%, according to AV-Comparatives. At that time AV programs had to be reliable, as so many viruses were in the wild.
If you never compare, you'll never know the difference. I have a desktop at work, and work remotely. Both home and work systems have Outlook opened on the same account. ESET anti virus on the home computer regularly detects phishing and malware in Outlook inbox, that the PC at work does not, which only has Defender. With some products, that could be from false indicators, but these are genuine phishing emails that are getting flagged by ESET.
Hi Dave! New subscriber, glad you're here. Just for fun I thought I'd mention one other way that hardware can degrade over time--the breakdown of thermal compound on the CPU/GPU. Granted, that's beyond the scope of most folks and your system suggestions here are top-notch. :) Thanks for this video!
Quite true, cleaning off the old and putting on fresh can help, but usually you don't notice till after the computer warms up. When that does happen, it's a good idea to clean the fans and blow out the dust too. I have 2 cats, needs to be done more often!
100%. So many laptop/PC's I see people buy are put in cold areas and left off which causes the cheap thermal grease to dry out. Removing the old grease and using a higher quality grease that doesn't dry out is the solution.
Nothing here can't be done by the average person. Anyone with two hands and a screwdriver should be able to open their PC case, clean the fans, check the state of the thermal paste, and clean off and re-apply new paste if necessary. If you are capable of cleaning your house, you're capable of cleaning your PC, arguably one of your most valuable possessions.
Ages ago a friend called me up and asked me to please come take his computer. He was just tired of dealing with it because it ran so poorly; he assumed it was because it had a Cyrix CPU. Five minutes on the bench and I knew why: it had the minimum amount of RAM it could possibly run with, was running Windows Me, and had a winmodem (I did say ages ago). I maxed out the memory, installed NT, and replaced the winmodem with a US Robotics serial modem. I used that thing for at least another 5 years in my lab. Nowadays when a system seems slow I replace the drive with an SSD and max out the memory; when that doesn't cut it anymore I install Linux and use it for other things. I still have systems running that are >15 years old. I drive them until something major gives out.
I used to have dial-up internet. In those days the best modem I ever owned was a US Robotics. They were simply the best dial-up modems ever built.
I prefer Linux because Windows is getting to a point where they dictate what software you can run, and besides, let's face it, it's full of spyware... Also love that Linux is open source and free. Two thumbs up for Linux.
I have a slight issue with #2. While what you say is generally correct, I've run into trouble more than once by updating my NVidia drivers. While NVidia CLAIMED they were stable, it turns out they were not as stable as they said. You might want to wait a week or two before installing them and make sure no bugs come out of the update. And no, this isn't common, but it happens enough that I gave up trying to keep NVidia drivers up to date as soon as humanly possible.
Why update when things are working fine? I postpone and delay Windows updates as much as possible. On my phone I learnt it the hard way when I went from ios 15 to 16. I regretted after a month and couldn’t rollback. Unless the new update fixes any issues or provides a good reason to update, I don’t bother these days.
True, haven't updated my Nvidia drivers since september 2022. And yes, everything runs just fine. I highly doubt I'll jump from 45fps to 60fps in Red Dead Redemption 2 by simply installing whatever newest driver there's available now. Mandatory "if it ain't broke, don't fix it" here.
Having been a PC repair guy, I've cleaned countless machines for customers to make them more responsive. Half the time I ended up upgrading the system anyway, because an intermediate upgrade doesn't cost as much as a new machine but does offer a respectable improvement in raw speed. And the customer saves on not having to have all their apps and data transferred to the new machine (you wouldn't believe the percentage of people not capable of this), which can take a couple of hours at €50-€80 per hour.
Very cool. How do you manage the apps they want to transfer over? I'm always a bit wary of doing a nuke and then they say they wanted some app and now their details are gone.
@@jackkraken3888 If you don't want to risk losing data, install the fresh OS on a temporary drive. Then transfer data from the old OS installation, check that you have everything, then clone the new OS over the old one.
@@jackkraken3888 Windows 7 has an app to transfer data from old to new machine. The app is also available in windows 10 under the old name (win 7 transfer app or something like that, been a while) It's a little less flexible than manually copying stuff, but it does take a lot of work out of hands, giving more time to sip coffee, check phone etc :)
There are several factors in the storage subsystem. You have mentioned FS fragmentation - that's a classic in its own right, even ignoring all context. Note that as programs get bigger, or in general as storage requirements grow, more "clusters" get allocated on disk, the tree structures of metadata also need to grow, and the trees get deeper and take longer to walk on any access. That may mean walking a tree in memory, and that's the optimistic scenario; in the less favourable one, you spend seeks walking the metadata tree on disk.

Now when a filesystem gets over 75% full (a general ballpark figure), fragmentation starts to climb rather steeply. At say 90% full, fragmentation skyrockets, because it becomes statistically very difficult to find contiguous sequences of blocks to allocate for larger files.

Spinning rust had a limited IOps capacity, but what it had could be depended upon, because the mapping from the outside LBA address space to the internal tracks (of variable capacity) was relatively straightforward and sequential. For comparison, with SSDs you have two additional layers of mapping/indirection below the outside LBA address space: the space of Flash pages (or lines) and, underneath, the space of erase blocks. The LBA sectors get dynamically allocated to pages (lines), and these in turn get allocated from erase blocks. To modify data in a particular sector you need to write a whole line, and to modify a line you may need to erase a whole erase block. To do that, you may need to shuffle other, unrelated lines in that same erase block out to other erase blocks: a simple write may get "amplified" in a rather gross way. The Flash SSD also needs to do wear leveling, i.e. not let individual erase blocks pick up disproportionately high erase counts.

Flash SSDs are not a silver bullet; you get what you pay for. They can bog down under a long-lasting sequential (or random) write load if they need to catch up on some internal janitoring. And on the spinning-rust side of things, you get "shingled" recording... You are free to pick your poison.
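That 75%/90% fullness claim is easy to get a feel for with a toy model. This is not NTFS's actual allocator, just uniform-random cluster allocation with invented function names, but it shows how the largest contiguous free extent collapses as a volume fills:

```python
import random

def largest_free_run(bitmap):
    """Length of the longest contiguous run of free (False) clusters."""
    best = cur = 0
    for used in bitmap:
        cur = 0 if used else cur + 1
        best = max(best, cur)
    return best

def simulate(clusters=10_000, fill=0.5, seed=1):
    """Mark a random `fill` fraction of clusters as used, mimicking a
    volume after years of churn, then measure the biggest free extent
    a new large file could occupy without fragmenting."""
    rng = random.Random(seed)
    bitmap = [False] * clusters
    for i in rng.sample(range(clusters), int(clusters * fill)):
        bitmap[i] = True
    return largest_free_run(bitmap)
```

Real allocators try to be smarter than uniform-random, but once free space is scarce there simply are no big holes left, which is why large files written to a nearly full disk come out shredded.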
A great and concise collection of info here. But the fact alone that you created Task Manager gets a new sub. We're in the presence of a legend here people!
Seeing the other comments, I’m clearly out of my league with your video. I’m a 73 yr old woman who understood about every 12th word you said. I anticipated jumping onto the video on my iPad and sitting at my desk and zipping right thru speeding up my desktop computer! For example: you were going thru Tip #1 and I thought, pretty soon he’s going to put up a few screens to show how to do what he’s talking about. However, very quickly you were saying, “Ok, now on to Tip #2……….” You have clearly geared this video to people who have a pretty decent knowledge of computers already. I was hoping for more of a beginner class; maybe a whole video on each tip separately with no fluff and get RIGHT TO YOUR POINT! The title of your video sounded so simple, but got very complicated and mystifying very quickly! I don’t doubt for a minute that you’re very good at what you do, but do you have videos that are much more basic? Or can you recommend beginner’s videos on this subject? No insult is intended here. My ex-husband was IT, my brother is an IT specialist (out of state), and my son is an IT guy (out of my city and too busy). Only my son comes even close to be able to talk to me at my level and I’m pretty certain that I’m not stupid. That’s why I turn to these RUclips videos for help. I just need to be spoken to on my level. Any ideas? I can’t afford to hire someone to speed up my computer! Thank you! I love your flashy computer! 😊
Dave just casually rocking some of the best hardware on the market. Also, his book is amazing and after 30 years of "Why am I like this" I am getting tested myself in a couple months.
Ah well, after 50 years stumbling in the dark, I found out, and realized that although it had been a handicap I had still managed to get somewhere. But it explained a lot as to how my life had unfolded. (Most of the time I considered myself an alien.) Good luck with your journey.
Most HDDs in the modern age do have variable geometry (ZBR), packing more sectors into the outer tracks - that's part of the reason why CHS addressing was replaced by LBA. So, outer tracks should experience better performance due to having to seek to the next one less often. Directory placement at the middle is generally for seek performance - any advantage of having it near the outer edge would soon be lost otherwise. Minor revisions being better - sometimes. I remember Windows XP having zippy performance when you installed the base product, but then you'd put the Service Packs on and it'd get more sluggish with each one (and this is before any other additional software or great time elapsing). Hibernation file - fast startup needs it, since modern Windows versions have the option to just shut most of the OS down and write the kernel off to the hiberfile. It's also a must-have if you have a UPS. Macs - the best thing you can do for one of them is to put Windows on it. But in general they'll be a bit slower than a same-spec'd generic Intel machine.
True. I had a drive that was failing, so I did a full disk check on it where the disk is written fully with a pattern and then read. The writing speed was significantly faster in the beginning than at the end, because the writing started from the outer tracks.
@@DavesGarage Paradoxically, a system with RAID-0 SSDs and 32GB of RAM definitely has blazing cached storage, but surprisingly the system does not feel mind-blowingly snappy, right? 😳 There is a remaining 20% bottleneck in the core that has prevented MS architecture from beating Linux, iOS, Android clean... that's a lot of missed business!
@@DavesGarage I recently tested a new 3.5" HDD by writing it completely until it was full. Write speeds started out at just over 200MB/s, then became progressively slower, reaching around 100MB/s at the end. I'm guessing the file system prefers to use the outer tracks when available.
@@tiarkrezar Can't find it, but I'm pretty sure LTT did a video where they created 3 partitions on a drive where they each literally had different overall performance due to where the partitions were physically on the disc. I think they got similar numbers as you and I think their proposed use-case was that you put the OS and "important" files on that faster partition, junk on the slowest one and everything else in the middle one. I guess something like a poor man's "small boot SSD plus a separate scratch disc for other stuff". Granted, the sharp decline in SSD pricing over the years probably just makes this a pretty silly exercise in 2022 :)
Great content Dave! High quality, well explained, reputable and trustworthy! Thank you very much for your work and service, Sir! You've earned a fan, an admirer and a permanent subscriber!
The ultimate solution: wipe the hard drive and reload a fresh copy of the OS and apps. That is essentially a factory reset that gives you exactly what you had when you bought it. Then do a disk image backup. Next time it gets slow, wipe the drive and restore the disk image backup. Voila, super fast computer again.
Thanks, forgot to mention you need to backup your personal data files before wiping your drive! But you do that regularly already, right? Segregating your personal data files (documents, pictures, etc) to an external drive or partition is a great idea.
A fresh OS install is better than a factory reset. A factory reset will come with all the preinstalled bloatware and will need all the subsequent updates to be installed.
Your vote of confidence in Defender has me thinking about ditching Avast, which seems to just get increasingly cumbersome, spammy and dopey, even though certainly it works fine as far as its core function is concerned. My performance on my Windows laptops is already reasonable, but getting rid of even more bloat sounds pretty cool. Thanks!
Hi Dave, Always fun watching your content. I can honestly say after 31 years in the IT industry I always learn bits n bobs from your content. thank you. I wonder if you could do a video on software tweaks. There used to be tons of Registry hacks we used when I was a desktop engineer (a while ago!) and it would be nice if we had your wisdom on the same, such as PowerToys or the like. Regards, TJ
Believe it or not, I still have the original "PowerToys" for 95/98, and only a few of them will still work on Windows 11 after some slight changes to the INF file. I'm not so much a fan of the new PowerToys though.
Thanks as always, Dave! Having HFA In big tech and being pretty good a masking, I was always concerned about the stigma that might be associated with disclosing that I’m neurodivergent. The openness with which you discuss your own ASD has really helped me to embrace that part of myself. Looking forward to reading your book one of these days!
@@Darxide23 Indeed, as someone who suspects he's possibly on the spectrum myself (high-functioning Autism/Asperger's - though I know the term's been deprecated since 2013, I still feel it should be used in some way to denote those that are high, or ultra-high, functioning, as Asperger's is similar to, but not totally identical to, autism). That said, I have a comorbidity of sorts, CRS (Congenital Rubella Syndrome), by being born a prenatal Rubella baby in 1965, 4 years before the Rubella vaccine came along, and 6 years before the MMR vaccine.
@@johnhpalmer6098 Yea, I wasn't a fan of so many things being rolled into ASD, either. If you hear that someone is on the Autism Spectrum, that can mean almost anything. It gives almost zero information.
@@Darxide23 Yeah, I can see that, but reading up on all that is considered for Autism, there is a lot to it and they all share some of the traits, that is especially true for the brain and how it has more neurons than most neurotypicals tend to have. That said, very high functioning, and I mean, what would be considered borderline Asperger types may not need to mask nearly as much in daily life as others on the spectrum, thus may not get diagnosed as readily, which was more likely for older folk now in their 40's and up, but still struggle mightily in many ways to this day, which can affect how we get/keep work for starters.
I agree with all your recommendations. You sort-of touched on this BUT - The BIGGEST improvement one can make with an older machine is to replace the old rotary drive with a SSD. I didn't hear you explicitly say this.
I remember computer magazine c’t writing a HDD test program that showed different speeds on different parts of the disk. That was 100% a thing. But with many platters that gives you more of a sawtooth pattern depending on logical to physical sector mapping
The heads are always over the same sector number of each platter so it seems to me the fastest and best way to write data on the lowest level (physical) is to write to all heads simultaneously.
@@IrocZIV Logical drives have mappings that are often nothing like a perfect physical separation of partitions. The only sectors I know of that are always in the middle of the partition, logically and physically, belong to the NTFS Master File Table (MFT). And even then, a failing sector in the MFT can cause parts of it to be physically relocated somewhere outside of that region, causing an extra long head move when reading it. Bottom line: the speed difference between an "inner" and "outer" partition is not significant. The physical mapping can differ greatly from the logical mapping, depending on the health state of the physical platters. Sectors fail regularly on a lot of drives; we just don't notice because the drives have advanced sector recovery/relocation capabilities. A quick check of the C5 value in the drive's SMART log will tell immediately if the drive is failing or not. Pending sectors are really bad: it means a sector failed without notice and the drive could not recover the data on it.
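That C5 check can be scripted. A minimal sketch that pulls attribute 197 (0xC5, Current_Pending_Sector) out of the ATA attribute table printed by smartmontools' `smartctl -A` - the sample text and its exact layout are assumptions, so treat this as illustrative:

```python
# Sample `smartctl -A` ATA attribute table (layout assumed from smartmontools;
# check your own drive's actual output). Attribute 197 is 0xC5,
# Current_Pending_Sector: sectors the drive failed to read and hopes to remap.
SAMPLE = """\
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      UPDATED  WHEN_FAILED RAW_VALUE
  5 Reallocated_Sector_Ct   0x0033   100   100   036    Pre-fail  Always       -       0
197 Current_Pending_Sector  0x0012   100   100   000    Old_age   Always       -       8
198 Offline_Uncorrectable   0x0010   100   100   000    Old_age   Offline      -       0
"""

def pending_sectors(smart_text):
    """Return the raw value of attribute 197 (0xC5), or 0 if not listed."""
    for line in smart_text.splitlines():
        fields = line.split()
        if fields and fields[0] == "197":
            return int(fields[-1])
    return 0

count = pending_sectors(SAMPLE)
if count > 0:
    print(f"WARNING: {count} pending sector(s) - back up and replace this drive")
```

In practice you would feed it the live output of `smartctl -A /dev/sdX`; any non-zero raw value is the "back up now" signal described above.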
@@paulmichaelfreedman8334 On my current HDD I get 187 MB/sec at the beginning, 78 MB/sec at the end, almost 60% slower. I've seen this on every HDD I've used in the past decade.
A couple things to add: dried-up thermal paste causing thermal throttling and needing a repaste, and system drives nearing the end of their lives needing replacement by cloning to new drives, especially newer, faster ones.
Why would one need a repaste? I've never heard of that! Once your CPU is installed, how in the hell is the paste gonna go to shit? If you're being throttled, it's being caused by some other issue.
@@leecowell8165 "I've never heard of that" You have now. It dries out. Some types faster than others. Open up a 6+ year old computer that has never had its paste serviced and I bet the paste you find will be dry, hard and crumbly. Some of the most premium pastes dry out really fast compared to the cheaper ones, but deliver better performance that makes the cost and effort worth it, for example to recreational overclockers.
For example, a laptop CPU's thermal paste can suffer from something called the pump-out effect. Due to the small direct die contact area with the cooler, the paste will, over lots of heat-up and cool-down cycles, leech out of the CPU die area, which can cause thermal throttling. Also, because people carry laptops around, the thermal paste can shift to other places, causing uneven contact. For desktops, if a computer is stressed for a long time, the thermal paste might become dry as its ingredients slowly evaporate, causing uneven contact between the IHS and the cooler. There are solutions for both situations, such as Honeywell's PTM7950, which is a phase-change thermal material: solid at room temperature, it becomes less viscous when heated up. The interesting part is that it stays in place after cooldown and has longevity designed for the service life of your machine; it is used in vehicles, where electronics maintenance is infrequent. I have used this thermal solution in my laptop for 2 years and so far have not seen thermal degradation. I hope this clears up some ideas about what might cause thermal issues in a computer. @leecowell8165
I'm really glad to have found your channel. You're the perfect middle ground. By that I mean, you don't dumb things down so far that I feel like I'm in a 3rd grade classroom; however, you don't use so much technical jargon that I have no hope of ever following a single sentence you speak. Kudos for finding the happy space for folks like me. I've got enough experience with PCs and other technology that I can't be called a novice or beginner, but I'm not at the master or expert level either, not by a mile lol. I find your way of explaining things absolutely perfect for the way I comprehend them, and I don't feel like an idiot after watching your videos. Like I said, I'm very glad to have run into your channel, you get my thumb and my sub, and regular interaction from here on out. Great content, thank you for it!
With regards to the "angular velocity" theory: I know there is a similar principle at play with optical media (which also spins), and the placement of certain files can greatly influence loading times. This heavily influenced the console development (DVD organization) of The Elder Scrolls IV: Oblivion. As it relates to a HDD, I'm not sure if angular speed has much influence, though I can say, at the very least, physical storage location on the platter does affect read speeds. We're aware of fragmentation and its effects, though I've also noticed performance improvements in reading files written to a nearly fresh (recently wiped) installation. That data can be consistently accessed far quicker than had it been written only slightly after a few other files on a clean install, lending itself to carefully choosing those very first files.
@@Shalmaneser1 The latter was in reference to a fresh Linux install. Must be one hell of a fulfilling life you lead, one in which you so ignorantly jump to false conclusions without any substantiating facts driving said deductions. No wonder the ladies love you...
Constant Angular Velocity sector placement on HDD became obsolete 30+ years ago. It's all Zone Bit Recording everywhere, with more sectors and much higher linear speed in outer tracks. So physical placement of files on disk has significant effect on performance. Any disk benchmark (like HD Tune or Victoria) can show that e.g. read speed goes from 150 MBps in the outer tracks down to 75 MBps in the inner tracks, or 50% performance hit.
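That inner/outer gradient follows from simple geometry: at a fixed RPM, sequential throughput scales roughly with sectors per track, which under ZBR scales roughly with track radius. A back-of-the-envelope model (the radii and the outer-track speed are illustrative assumptions, not from any drive's datasheet):

```python
# Toy zoned-bit-recording model: at constant RPM, sequential throughput is
# proportional to sectors per track, which scales roughly with track radius.
OUTER_MM = 46.0     # outermost data track radius on a 3.5" platter (assumed)
INNER_MM = 21.0     # innermost data track radius (assumed)
OUTER_MBPS = 190.0  # measured outer-track sequential speed (assumed)

def throughput(radius_mm):
    """Scale the outer-track speed linearly with radius."""
    return OUTER_MBPS * radius_mm / OUTER_MM

inner = throughput(INNER_MM)
print(f"outer: {OUTER_MBPS:.0f} MB/s, inner: {inner:.0f} MB/s "
      f"({100 * (1 - inner / OUTER_MBPS):.0f}% slower)")
# -> outer: 190 MB/s, inner: 87 MB/s (54% slower)
```

A crude linear model, but it lands right in the range of the measurements people report in this thread (187 down to 78 MB/s, 200 down to 100 MB/s).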
So, you're the dude! I've been wanting to say thanks to you for decades about the Task Manager. There isn't a day that goes by that I don't use it for one reason or another, and it's helped me save countless systems from viruses, rootkits and other malware, as well as find and tune performance issues on countless others. Sure, there are plenty of tools in the arsenal, but the one that is used along with all of them has been the Task Manager (along with its sidekick, Performance Monitor).
I'm originally from the Amiga world and was dealing with PCs (286's and DOS at the time, mid-80s to early 90s) while in the Air Force. I remember finding an OEM copy of Windows 2 in one of the boxes and, out of curiosity, installing it on one of the spare systems in the shop (which was a classroom; I was an instructor by then). Once it was installed, I was severely dumbfounded at how awful it ran compared to my Amiga 2000 at home. If I could get two programs loaded, they would try to beat the crap out of each other through the curtains of the "Windows" until one, or both, of them would give up, crash, or hang up the whole shebang. I wasn't impressed at all, so I went back to playing with the Amiga and a 286 system that I'd made a trade for, playing around with the "Upper Memory" and dialing up BBS systems and such. Windows 3 and 3.1 were the game changers that got me into my first hobby build of a 386DX system (at a blazing 33 MHz, with a Turbo button, a huge partitioned 80 MB hard drive, and 4 megs of RAM - my buddies were still swapping floppies on their Commodore 64s and playing Pong on their Ataris). It's been both a blast and a major PITA (at times) ever since.
I, too, try to keep systems out of the scrap piles for neighbors, relatives and such folks, but have to get away from it every now and then or I get pestered half to death, or end up getting stuck with a lot of (sometimes piles of) obsolete stuff from them. Thanks again, from the bottom of my heart! You got my Sub!
With older spinning drives partitioning used to help with reducing the distance the head had to move - so using 20 or 40% of the disk could result in significant performance improvements if the data fit well within that size - the rest of the disk could be used for data that was rarely accessed
@@0MoTheG Not "only" linear to distance - but it did offer a significant improvement versus random IO placement of the same amount of data across an entire physical platter. I ran tests with Iometer to work out the optimal setups (and serial IO operations become the equivalent of random operations when they are done in parallel). In particular, this benefited the minimum IO rate under multi-threaded load with low queue depths, and lowered the latency (at the expense of capacity, but this was more performance oriented). Use cases: shared databases or data stores, or shared multi-user systems - but the same thing actually benefits personal computers (installing the OS on less than the total disk size), because so much parallel IO and fragmentation eventually occurs.
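The short-stroking gain is easy to estimate: if seek time is roughly proportional to head travel, confining a partition to a fraction of the tracks shrinks the average travel by that same fraction. A quick Monte Carlo sketch (uniform random access is an assumption; real workloads are skewed):

```python
import random

# Expected head travel between two uniform random positions on a span of
# width w is w/3, so a partition covering only the outer 20% of the tracks
# cuts average seek distance roughly 5x.
def avg_seek(span, trials=100_000, seed=1):
    """Mean head travel between two uniform random track positions in [0, span]."""
    rng = random.Random(seed)
    return sum(abs(rng.uniform(0, span) - rng.uniform(0, span))
               for _ in range(trials)) / trials

full = avg_seek(1.0)    # whole platter: expect ~0.333
short = avg_seek(0.2)   # 20% short-stroke partition: expect ~0.067
print(f"avg travel, full disk: {full:.3f}  short-stroked: {short:.3f}  "
      f"improvement: {full / short:.1f}x")
```

This only models head travel; the real-world gain is smaller than 5x because settle time and rotational latency don't shrink, but it shows why the minimum IO rate under seek-heavy load improves so much.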
I just dumped Windows and installed Linux. Install any Linux of your choice, but I've been running Linux Mint for more than 5 years - and it runs at the same speed today as it did when I installed it for the first time. I'm running SSD, so hard disk seek times were never an issue. A major reason for phones and tablets slowing down (not computers/laptops etc.) is flash wear. When the CPU writes to flash memory and the memory is new, it can write to any flash page with no issues. As the flash wears out, the CPU attempts to write to a page and it doesn't work, so it has to try a new page (that is not already in use). Thus the CPU spends longer finding somewhere to store the data in flash than it does to actually store it. This is unavoidable with flash memory devices. They wear out.
I daily drove linux for years. It's great on slower machines, but I find windows 11 runs better than linux on my current rig, so I just switched to WSL to run my linux stuff. I suppose nvidia drivers are the main reason though.
Computer hardware doesn't get physically slower; the amount of crap that runs in the background increases over time, and if you use a faster PC at work, the perception of speed means your crusty home PC now feels slow. With SSDs becoming standard, the need to defrag is becoming obsolete.
"Computer hardware doesn't get physically slower". Not true. A failing HDD will be significantly slower to use. It's one of the early red flags to look for. Additionally, if you aren't regularly cleaning your case, a buildup of dust will cause the system to run hotter than it previously did, which usually leads to it throttling earlier than before. So you'll notice it being slower, more often.
If we're talking about older machines, they may well have only spinning disk(s). Technically the spinning disk hasn't gotten slower over time, but upgrading to an SSD will obviously be a huge speedup. Maybe a hardware upgrade doesn't fall into the category of "fixing" an old computer, but Dave did also mention more RAM in passing.
This is *DEFINITELY* the case with Mac OS. Somewhere after around Mavericks, I think it was assumed you had an SSD, and the base OS's performance on spinning media is _terrible._ Entirely conjecture here, but it seems like they just optimized the caching and queueing algorithms to prioritize the no-penalty random-access nature of solid state media. Accidentally clicking on an update notification that launches the App Store could be a 2-3 minute wait on perfectly cromulent hardware like a Mac Mini, but running from a 2.5" HDD. Pop in an SSD -- even with the same OS install, nothing at all changed except the underlying disk -- and it's a couple seconds. I once heard that the performance increase from moving to HDD to SSD was so big "it'll make you mad at how slow it was before." Absolutely.
SSD drives may be quicker, but I find their higher prices, lower capacity and limited lifespans to be more than a little prohibitive. Platter drives tend to have greater capacity, last much longer and are much cheaper to obtain, clone and replace. And yes, I frequently make clone copies of my drives as backups, since they can and do occasionally fail over time, and it's always a good idea to back up when you can... Sooo... I'll stick to platter drives for now, regardless of the speed boosts SSD drives may offer.
It does work. I upgraded a computer from 2010 to an SSD and as far as program launches and simple tasks it's on par with my newer 12 core machine. Granted not playing any AAA modern games on it but as far as web browsing and watching movies it was a welcome improvement.
Great to have another pro recommending MS's own security. Years ago we had an IT security professor from a university visiting, and he also stated that the built-in software worked fine without subscribing to an external one. If you need detailed parental control etc., external software might be a good option, but with Defender and "sound paranoia when downloading", I feel as safe as I can be on my computer, where I'm the only user.
Defender has had very flaky success in tests. Sometimes it shines, sometimes it bombs. Offline detection is usually not good. I also recommend doing the registry tweak to enable sandboxing.
@@dingdong2103 I had only MS Defender for several years, with no problems from either obvious malware or the system resource drain that could indicate malware, so I'm happy with it. I'm sure the tests can find weaknesses in any security software. An important thing is to stay aware and not blindly trust anything, saying "I am protected so I can do anything". That's my take on IT security. I think that in general, one problem nowadays is that it seems more important to FEEL safe, rather than actually be as safe as possible.
I'd say this is just a windows thing, on Linux you don't need to deal with the registry and most package managers are pretty good at uninstalling software. Plus if you're running more minimalist distros you can decide what extra programs start at boot. Windows has definitely gotten better at this over the last couple of decades though
Isn't "reinstall" just a longer way to spell Linux? Sorry, that's a bad joke. I've been on since Win 3.11 WFW and have already decided to move to Linux in the face of the Windows 11 disaster.
I'll keep Windows, thank you for reminding me that Linux is still...in existence. something about those long string command line instructions just spells "user friendly"
@@runnergo1398 plenty of good plug'n'play Linux distros exist now. It's almost like the Linux community is Embracing, Extending, and (hopefully) Extinguishing any reason to run Windows.
Even needing to learn how to configure some aspects of Redhat 5 manually in the 90's and figuring out which software to replace the things I had used on Windows was less of a problem than dealing with Windows itself. Nearly 25 years later, and I'm glad I don't have to put up with the problems I hear from Windows users: no spyware, no bloatware, no malware, no corporate overseer to spy on my computer and tell me what I can and can not do with it or the software, efficient, reliable and smooth running system with far fewer issues and no slowing down. If I don't like the way the interface works, I can replace and/or change it. Never going back!
I always keep the last 3 versions of packages in case I ever have to downgrade. I have root on an SSD and my home is on the HDD. It's faster if it's all on an SSD, but I didn't want a perfectly good HDD going to waste. I generally don't update until there is a new kernel, then I clean the cache out at the same time. Boot time on mine is 5 seconds from the GRUB screen to login, as I use very few system daemons and only XFCE with no desktop manager. I'm its only user, so why bother?
@@haplozetetic9519 Once Win 7 no longer works due to planned obsolescence, I am done with Windows. 8 was pure crap, and they just keep piling on the BS. No, I do not want ultra-integrated cross-linked accounts; no, I do not consent to your data harvesting; no, I will not tolerate a cloud-based OS.
There's a few things worth mentioning, but on my Win 8.1 Pro I found disabling the Store and Superfetch helped considerably. It may not be as noticeable on newer machines, but they were a pain on my system, which holds over 800 GB of files. Of course I swapped my boot drive to an SSD, but 100% usage during startup for minutes at a time was BS. Upping the RAM to 16 GB only gave me more flexibility with graphics-intensive software, and I could run more windows on 3 screens without maxing anything out.
I tried running Win 10 on my machine, and it ran like a slug. A 4 GHz 4-core shouldn't be beholden to MS's whims; the updates were tedious and always seemed to occur when I needed to get actual work done. So I unrolled it. My daughter's PC runs Win 10 fine, but I'm still pissed off, and recently she filled it as well. When you have a stable machine and MS Update continues to f*ck it up, it sucks... so once stable, I recommend turning it off too. Then come the registry danglers... the amount of fluff is still horrendous. We don't need cloud services either; I found several software programs that were sending/receiving data they shouldn't have had access to, including MS. Not even turning all the privacy settings on was keeping MS out of my system. They can keep their telemetry off my system, thx. Yes, MY system.
I put MS Server on it recently, although I was already running XAMPP very stably, and omg, the startup went to hell again. Other than that, the last thing that dismayed me was DirectX 12 requiring Win 10/11... so I have to play my Christmas present games on a different PC than my dev PC. As a society I think we are at the point of having PCs dedicated to certain uses... the all-inclusive PCs seem to be going away. I went shopping for a new PC, and the prevalence of S versions of Windows is kinda unsettling. Yes, our house has many PCs, 4 of which stay on 24/7... they don't hibernate and have lovely screensavers. And they function well despite... the industry.
Just be aware CPU GHz isn’t everything… a modern 2 GHz chip is much faster than an old AMD FX chip; many of those were running upwards of 4 GHz and were notoriously slow.
Modern hardware effectively auto-overclocks itself, so you *can* find it spends more time at base clock and less at boost clock (or lower boost clocks) with age. This *usually* comes from aged thermal paste, if it is from silicon degradation, that manifests as system instability rather than slow-downs, as the CPU generally cannot detect its own failure.
@@KnutBluetooth He was referring to degradation. Generally speaking, the CPU will not tell you whether it is stable with regard to its clock speeds and voltage. That said, modern CPUs don't degrade much, and it likely won't be an issue for the vast majority of people. I've been running my 5800X at 1.4 V for a year for an all-core overclock, for example. It has not degraded any noticeable amount; it still goes stable and unstable at the same voltage/frequency points it did when I was dialing it in.
@@tyf.4399 That's not correct. The smaller the process a CPU is made with, the faster it will degrade. Since older CPUs are made with a larger process, they last much longer. This is also why they use older-generation CPUs for space applications: they are inherently more reliable and resistant to degradation. I've never heard of an 8086 or a 6502 that broke down due to degradation yet. What you can do to mitigate this is keep it as cool as possible by not overclocking it and using a cooler designed for CPUs that generate more heat.
@@KnutBluetooth We're talking about consumer chips in consumer builds. A modern or older CPU will outlive its useful life. This idea that degradation makes any difference in the lifespan of a CPU in the real world is actually just nonsensical. Modern CPUs generally run cooler, but degradation isn't based on temps. Other factors could kill a chip from temp, but raw degradation is nearly purely voltage. Old CPUs degrade and so do new ones. But the tl;dr is that it really just doesn't matter: a CPU from the last 20 years, or the next 20, will outlive the applications it was designed for. Degradation should be the last thing anyone is concerned about.
Thanks Dave. I learned a lot, especially about Task Manager, which I have always used to end tasks that were in a 100% CPU loop! I have a question that does relate to Windows performance. Over many years I have been annoyed that Windows does not seem to natively prioritise its tasks. I mean that it doesn't differentiate between tasks that have a screen component (i.e. you are directly interacting with that task (say, Word)) and tasks that are running in the background - foreground v background, I guess. A couple of examples: rendering a video, or a security product running a full disk scan, versus a user trying to start YouTube and waiting forever. I see that you can set the priority of a task, but why doesn't Windows do this automatically? I am miffed about this because in the 1980s I programmed DEC minicomputers that were way less capable than my current desktop and successfully hosted, say, 4 high-speed data entry operators plus as many concurrent background tasks as you like, such as compiling COBOL programs or running reports, without any degradation of the data entry tasks. In other words, the users who needed the attention of the CPU and RAM got prioritised over the other tasks that were not time critical. Is there a way to get these results with Windows 11? Thanks.
I'm pretty sure that Windows does this. The priority boost for foreground windows is just not reflected as a different priority value in Process Explorer.
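For what it's worth, the effect of favouring the foreground task can be shown with a deliberately exaggerated toy strict-priority scheduler. This is only a sketch of the idea: Windows actually lengthens the foreground thread's scheduling quantum (plus temporary priority boosts) rather than starving background work, and the task names and the +2 boost below are made up:

```python
# Toy strict-priority scheduler: each tick runs the highest-priority runnable
# task, round-robin among ties. A "foreground boost" (+2, an assumption) lets
# an interactive task win the CPU over equal-priority background work.
def run(tasks, boost=None, ticks=99):
    """tasks maps name -> base priority; returns CPU ticks each task got."""
    cpu = {n: 0 for n in tasks}
    names = list(tasks)
    for t in range(ticks):
        eff = {n: tasks[n] + (2 if n == boost else 0) for n in names}
        top = max(eff.values())
        ready = [n for n in names if eff[n] == top]
        cpu[ready[t % len(ready)]] += 1   # round-robin within a priority level
    return cpu

tasks = {"word_processor": 8, "video_render": 8, "disk_scan": 8}
print("no boost:   ", run(tasks))                          # 33 ticks each
print("fg boosted: ", run(tasks, boost="word_processor"))  # foreground gets all 99
```

The toy version starves background tasks completely, which is exactly why real schedulers use quantum stretching and decaying boosts instead of a permanent priority bump.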
I have to disagree that the next version of Windows always runs better than the previous. For example, Windows 2000 was painfully slow to boot on a Pentium Pro, while NT 4 worked great on said hardware. Numerous other examples exist, so it's really dependent on the hardware involved.
Yeah, he did say that new major versions will be slower because they have more features, and that the point releases are faster - and then shot himself in the foot by saying to always install the new versions, which obviously fall into the "makes it slower" category. With graphics drivers maybe you can get away with that, but certainly not with OSes.
Agreed. When you think about "office machines" 25 years ago running NT 4 and Office with like 32 MB of RAM, one would think that doing the same tasks on current hardware would be blazing fast, but it seems like something went really wrong along the way... Certainly, you didn't have to wait 20+ minutes back then for Windows to boot because of software updates. And a blue screen meant bad RAM/hardware, almost 100% of the time. It doesn't help that Windows 10 has a lot of "stuff" preinstalled, like Xbox stuff (even on the server OS!), application "offerings" (advertising) on your start menu, and who knows what. That's what finally made me switch to Linux on my home desktop. It's very fast, stable, secure, can update while running without a restart (except for kernel updates), had the concept of a "software store" before anything else, and respects your privacy. The only thing that can be a problem is lack of support from a hardware manufacturer or software developer (for example: Nvidia doesn't seem to care about Linux, but AMD certainly does). Other than that, it's just great. Linux Mint and MX Linux are good beginner-friendly choices to give new life to older PCs, or even to install on a new PC!
That's not what he said. He said major changes tend to be, as you say, slower, and minor changes fix the slowdown. Listen more carefully! Examples from my IT career: Service packs for Win95, WinVista, Win7 and of course Win8.1. For Win10, which wasn't fit for purpose in 2015, major work has been done under the bonnet to streamline its performance, but they spoil a lot of this with the Store App bloat and nasty Metro nonsense, so...
@@Blitterbug At 6:07 for example, Dave's experience with OSR2 being faster than Win95 retail and XP being faster than NT 4 is the complete opposite of what I've experienced.
Love your channel Dave. I was always fascinated by inside stories of the people developing things that impact our lives and technology, be it a new type of engine or an OS I sank countless hours into using. I'm using old machines with a clean OS and I still observe a bit of slowdown, even though I avoid installing new software. This is mostly on very old ones from the XP era; a newer machine from 2011 with Win 7 and high specs still works super fast.
As always, Dave, great content and an interesting topic! My number one tip on this issue: if you're getting a new computer, take your old computer off the internet - that is, never allow it to connect to the internet again, and use it for offline applications only. Once you've made this decision, you can uninstall any antivirus software application you have on it. This will give you a massive speed improvement on that old machine, particularly if you are using AVG/Avast, Sophos, or Norton.
If only my old laptop still worked... I could *_barely_* afford the computer that I use now [I bought it with the extra pandemic relief money that I received from the federal government for not working due to illness in early June 2020, which I wouldn't have received otherwise]. I only bought it because my old one wouldn't boot any longer, and it wasn't worth having a technician fix the old one, since that would probably have cost several hundred dollars; at that point it's better to just get a brand new computer.
I haven't installed an antivirus for 16+ years now and haven't had a single problem. Most malware and virus problems I experienced came from cookies. Automatically removing cookies after closing the browser, except the ones you trust, stopped my computer from slowing down. I've had every antivirus, tried them all out; after I started removing cookies, no antivirus ever found another thing, so I stopped installing them for good. My PCs run the same as the day I put them together: no slowdowns, no antivirus.
Any particular antivirus you would recommend on an internet connected machine that isn't AVG/Avast, Sophos or Norton (if they slow machines down), please? Thanks, :-)
Your point about Windows defender is a very interesting one. I think the problem is there are so many other tools and pieces of software where Microsoft gives you the bare minimum with the OS that the automatic assumption is you also need to search for a better option for something as critical as antivirus.
it kind of depends on each user's habits... i haven't seen a detection on my systems in years. everything seems cleaner to me these days. when i migrated to win10 i asked one of my old friends who's still active in the business what solution he's using and i went w/ his recommendation: for home use, just WD.
Shoot, I was reading an article about you last year - YOU ARE A HERO, SIR, THANK YOU SO MUCH FOR TASK MANAGER! I've loved that baby since day one. It was absolutely crucial for me in cleaning up the sort of virus problems people would get back in the mid-00s (along with a few different antivirus programs) - the sort where you'd be lucky to get 30 seconds of OS run time after boot before the whole system crashed. God bless you, sir!
Thanks for the tips. I have been using most of them for years now and am quite skilled at keeping old computers functional. For people who can change components (meaning the hard drive is replaceable because it isn't soldered to the motherboard), I would also suggest changing a magnetic drive to an SSD if possible. Using Task Manager on my current computer, I notice a big difference between a magnetic drive and even a lowly SATA SSD: the drive-access bottleneck is greatly relieved. My setup has a 2 TB magnetic hard drive for most of my game storage, a 1 TB NVMe SSD for VR game storage, and a 500 GB SATA SSD as my main boot drive. Even in my mother's older laptop, I simply switched her hard drive for an SSD and she saw a noticeable speed increase on a Pentium processor, and this was a true reaction, as I secretly swapped the drive and to this day still haven't told her. I kept 2 Windows Vista laptops running for 8 years, and I only stopped using one of them because something in the sound card was damaged and causing BSODs, and the ports were thoroughly trashed from accidents while headphones were plugged in. Though I don't know what keeps happening with my own computer: I find I have to do a clean install of Windows once a year because something gets majorly borked, and I strongly suspect the NVIDIA graphics drivers. I also believe in keeping things up to date, but I like to choose when I update, waiting until I have time to troubleshoot. Many years ago (Windows XP days), an update pushed out a driver for Intel processors, but I was running an AMD Athlon (can't remember exactly which one), so the update completely prevented my computer from booting.
I had to contact Compaq for further assistance where they provided me procedures on how to disable the particular update. I have run into other update issues but that was the most memorable one.
Dave casually mentions that he created task manager as a side project and blows everyone’s collective mind
He makes sure we won't forget it, actually, but that's OK.
Task manager was my favorite thing about NT4.
You must be new here 😉
It's such a basic requirement of a multitasking OS (Windows can't really be considered a multiuser OS) that it's stunning M$ didn't include some sort of task manager as a standard feature.
@@axelBr1 I remember Windows 95/98's task manager, it was great!
All good tips here. I'd also add:
- If the hard drive is mechanical replace it with an SSD
- Run CrystalDiskInfo and make sure none of your drives are on the way out. Slow storage is a performance killer and might lead to data loss
- If the machine is high end (gaming, workstation) with a heavy CPU heatsink, redo the thermal paste if you haven't touched it for years. Should help with those boost clocks, especially if the machine has been transported, which can cause old paste to crack if it has dried out.
- DO NOT neglect fans on laptops. Get them cleaned out
CrystalDiskInfo is good. More BIOS/UEFI bootups should include SMART-status checks imho... the ones that automatically warn of failing drives at boot up are awesome.
Also, tools that can run the drive's SMART-Self-Test (I like partedmagic) are even better if you're doing offline diagnostics but rarely needed.
SSD and maxing out the RAM are huge improvements. If the motherboard has PCIe, try an NVMe disk in an adapter card, make sure it's an x4 adapter, not x1. Might not be able to boot from it, depending on the BIOS, but it'll be faster than a SATA-connected SSD. Larger NVMe sticks are faster than smaller ones, also.
I saw discouraging stuff about SSD's on YT but I suppose I'll have to drag myself out of my cave eventually and get one. Some software even requires an SSD now.
@@dannygjk They're worth the $, imho, about 10x the speed of hdd, on my old systems. Leave some unassigned blocks and make sure partitions are "aligned," start on evenly divisible sectors, basically. I usually just specify start points at multiples of a gig.
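The "aligned partitions" advice above boils down to simple arithmetic. Here's a minimal sketch (the helper name is made up for illustration; the 1 MiB boundary is the convention most modern partitioning tools default to):

```python
# Sketch of partition alignment: round a proposed start sector up to the
# nearest 1 MiB boundary (2048 sectors of 512 bytes). Illustrative only.
SECTOR_SIZE = 512
ALIGNMENT_SECTORS = (1024 * 1024) // SECTOR_SIZE  # 2048 sectors = 1 MiB

def aligned_start(proposed_sector: int) -> int:
    """Round a start sector up to the next 1 MiB boundary."""
    remainder = proposed_sector % ALIGNMENT_SECTORS
    if remainder == 0:
        return proposed_sector
    return proposed_sector + (ALIGNMENT_SECTORS - remainder)

# Legacy tools started the first partition at sector 63, which is misaligned:
print(aligned_start(63))    # 2048
print(aligned_start(2048))  # 2048 (already aligned)
```

An aligned partition start keeps filesystem blocks from straddling the SSD's internal pages, which is why misaligned partitions can cost noticeable write performance.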
@@dannygjk they are totally worth it. Can even turn an old machine into a nice snappy one again for all various common tasks like web browsing, email, office tasks etc.
For gaming, other 3D work, or heavy high-res video, the graphics card will be a major factor too, of course.
Single biggest performance increase I have seen on older machines is replacing any magnetic disk with an SSD. That one change can extend the usable lifespan of the machine for several years, especially if you also max out the amount of RAM that the motherboard can support.
I've found that this, combined with a CPU upgrade, made my 12-year-old laptop seem much, much younger and snappier.
I just did this! On my 6-year-old laptop I upgraded the RAM from 8 GB to 16 GB, and tomorrow I'll get an NVMe SSD!
You're not wrong, but W10 is simply unusable on spinning rust. Windows 7 and earlier were *not* that way, even current Linux distros aren't that way.
I totally agree! My youngest daughter has her big sister's 2012 MacBook. Big Sister said it was too slow. A SATA SSD and maxing out the RAM mean it is still usable in 2022!
The number of people's machines where I've suggested replacing the HDD with an SSD, and who've been happy afterwards, is quite a few.
If the SSD does not come with cloning software then be prepared to buy such software to copy over your old HD data to the new SSD. I did not have much success with free cloning software.
As a person who has a few older computers, including two 30 year old PCs still running DOS and a 25 year old one on Win 95, I'll add that another problem that crops up as computers age is that magnetic media is not permanent. Over time, as the magnetic image decays, disks have a harder and harder time reading some sectors accurately. Disks always store extra bits, error correcting code (or ECC), that allows transparent fixes of errors, at least, until there are too many bit errors. There is a product that I have used for many years that addresses this problem, Spinrite, from Gibson Research. It is like Scandisk, but much more thorough, and much, much slower. It picks up the data from each sector, stores a variety of patterns to the disk (0000, 1111, 01010, etc) to test it, and if the sector passes, it rewrites the original data back (or moves it elsewhere), making it once again freshly written, and less prone to data bit errors. The entire process takes about 3-10 hours for most discs, depending on the disk size and transfer rate (120 hours is the longest I have seen), but it often makes strange random errors go away. The most amazing use of it I ever made was on an NT machine that wouldn't boot and went to the BSOD. After 19 hours of Spinrite, Spinrite reported that it had repaired 57,000 disk errors. I thought "Ha, this thing is dead forever." I turned it on, and it booted, and ran for 5 years with no more problems. Obviously, this is only for magnetic media, and should not be used on SSDs...
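The refresh technique described above (read a sector, exercise it with test patterns, then rewrite the original data) can be modeled in a few lines. This is a toy simulation of the idea only, not Spinrite's actual implementation, and it treats a "disk" as a plain list of in-memory sectors:

```python
# Toy model of the magnetic-refresh technique: for each sector, save the
# data, stress the sector with test patterns, then rewrite the original
# so the magnetic image is freshly written. NOT for use on real hardware.
PATTERNS = [b"\x00" * 16, b"\xff" * 16, b"\x55" * 16, b"\xaa" * 16]

def refresh_disk(disk: list) -> list:
    for i, sector in enumerate(disk):
        original = bytes(sector)       # pick up the data from the sector
        for pattern in PATTERNS:       # store a variety of test patterns
            disk[i] = bytearray(pattern)
        disk[i] = bytearray(original)  # rewrite the original data back
    return disk

disk = [bytearray(b"hello sector 16!"), bytearray(b"another sector..")]
before = [bytes(s) for s in disk]
refresh_disk(disk)
assert [bytes(s) for s in disk] == before  # data survives the refresh
```

The key property, as the comment says, is that the data ends up byte-identical but freshly written; on a real drive that rewrite is what restores signal strength on fading sectors.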
@AlfredScudiero Good question. They were used to price orders and function as a cash register. They performed that function just fine, and as I neared retirement, there was less and less point to spending money to upgrade them. I sold the business July 1, and the new buyer elected to replace them, which cost him about $10,000, but also added ongoing expenses of about $5,000 a year. Yes, they can do some things that the old DOS system couldn't do, but they also open up new problems. To hack my data, a hacker would have had to enter my building, for example, as it was not connected to anything.
Had I been younger, I probably would have replaced them 4-5 years ago, which was about the time my competitors moved up from their DOS-based systems. Keeping DOS systems running was getting harder each year anyway, and in particular, the disk drives were going to fail, probably within a year. ;)
Thanks for the information on Spinrite. My machine now has a 250 GB SSD for system files, but I still keep my 2 TB HDD since I can't afford a bigger SSD yet.
Why in the world would you use WIndows 95? It was the worst OS ever!
Timestamps:
0:00 - Intro
0:29 - Why Computers Slow Down
4:06 - Tip #1: Unused Software Bloat
5:51 - Tip #2: Software Updates
7:13 - Tip #3: Startup Apps and Services
9:10 - Bonus Tip: Storage Device Problems
11:10 - Tip #4: System Cleaning & Hygiene
13:20 - Bonus Tip: Proper Malware Removal
14:10 - Tip #5: System File Repair
15:02 - Outro
Thanks! I'll add them to the desc!
@@DavesGarage Hi Dave could you explain how file explorer calculates the size of the files in a folder. I can't seem to replicate it without recursively summing the file sizes in the folder. Is it possible to grab the folder size property without re-doing the work of file explorer?
@@MrRozburn Nope! There's no "subtotal" for subfolders, so it must re-sum recursively every time you ask!
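In other words, any folder-size query has to redo a full recursive walk of the tree every time. A rough sketch of that work (not Explorer's actual code, just the same class of traversal):

```python
import os

def folder_size(path: str) -> int:
    """Recursively sum file sizes: the work Explorer must redo each query."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            filepath = os.path.join(dirpath, name)
            try:
                total += os.path.getsize(filepath)
            except OSError:
                pass  # file vanished or is inaccessible; skip it
    return total
```

This is also why asking for the size of a huge folder on spinning rust takes so long: every directory entry has to be visited again, since there's no stored subtotal to consult.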
@@albi2k88 First, Dave did explicitly note that optical drives (i.e. CDs, DVDs and Blu-rays) are a very different animal, but since you're unlikely to be running slow because of an optical drive, it's rather moot anyway. I'm pretty sure most drives use variable geometry too, but as someone else noted: just replace them with SSDs (which use less power!) and leave the rotating media in its slow-lane past.
All of the older computers whose rotating disks I replaced with SSDs seem a lot faster after the switch.
Dude, just install Linux!
Speaking as an experienced teacher, I have to say that this is an excellent instructive video. Very fast paced, but easy to listen to. Clear, direct and understandable. I think I actually got 90% of it on the first pass. No annoying music or distracting graphics. I will definitely be tuning in often. Gigathanks.
See, Bruce, you're an experienced teacher with English as your mother tongue and you caught only 90% and you require at least a second pass. English for me is not even the second language. Though I caught everything on the first pass and that's all that you mentioned: fast paced, no annoying music or distracting graphics. But what was it I was expecting to see? To hear, rather... from a guy who worked for Microsoft? All I heard is available online in Google.
@@TroyQwert "In Google"? Google is merely a tool through which to access information from other sources, some of which might be unreliable or outdated. From this video, I got a bunch of reliable information from a person who I **know** knows this software inside and out. I've learned a bunch of it before, as someone who maintains dozens of old computers in a high school and who's trying to squeeze the last bit of life out of them, so I've definitely searched high and low for tips like this. But I still felt this was a good use of my time because I learned some extra details about *why* things work the way they do, which deepens my understanding (and makes me better at my job).
@@frisbeepilot, good for you! Way to go!
@@TroyQwert Isn't pretty much everything on YouTube also found on Google, so what's your point?
Completely agree with you @bruce
I have an old PC (Built it in 2010). It runs Linux just as fast as the day it was put together. Still happy with it.
Depending on the distro and the age of the hardware, this is the answer. Anything besides Ubuntu, paired with Xfce or MATE, will work great even on 20-year-old hardware. Ubuntu is slow because of snaps. GNOME, Cinnamon, and KDE are lighter than Windows, but are still too heavy for a mechanical drive.
Got one here from 2010 running Win10 just fine, 8GB ram with an SSD, i5 CPU. I don't do much on PC so I'll use it till it dies. lol
@@Tyler-Kearney KDE has for quite some time been lighter than even Xfce! Facts, please!
@@afriquelesud hm. I'll have to look into that. Always been a cinnamon fan myself
Xactly. This slowing down to a crawl is a Micro$oft issue
3:27 Eventually an HDD can also accumulate failing sectors and slow down, because the drive has to retry hard-to-read sectors more than usual.
Specifically this is what I was wondering.
Some examples of hardware slowing down:
Your computer may have been tuned in the BIOS (XMP, for example) and then the BIOS gets reset to default values.
Your SSD is too full, so it has to do extra work to find space to save new data.
A component may actually be failing, but the computer is hiding the retries, so it simply seems slow.
You may need a specific driver that was removed at some point, and now the computer is making do with a basic driver.
I found a user who had been running a machine with the wrong network driver, which gave them a 100 Mbps network when it should have been 1000. They were also using the Windows VGA driver when the graphics card had a much better driver available. If they were a gamer they would not have been able to game, but all that really happened was the computer felt slower to use.
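The "SSD is too full" item on that list is easy to check yourself. A small sketch using Python's standard library (the 10% threshold is a common rule of thumb, not an official figure from any SSD vendor):

```python
import shutil

def free_percent(path: str = ".") -> float:
    """Percentage of free space on the volume containing `path`."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.free / usage.total

# Rule of thumb (an assumption, not a vendor spec): keep roughly 10-20% of
# an SSD free so the controller has spare blocks for wear leveling and
# garbage collection.
if free_percent(".") < 10.0:
    print("Volume is over 90% full; expect slower SSD writes.")
```

`shutil.disk_usage` works the same on Windows and Linux, so the check is portable to whichever old machine you're reviving.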
Ohh! I like the XMP revert. Wish I'd thought of that. Same with overprovisioning on the SSD.
Had that happen to me. Only discovered late at night with everything else in the office switched off. Then I could hear the drive doing retries.
@@crankshaft3612 The dreaded sound of a hard drive doing retries.
those are not retries those are screams of pain the drive is dying and begging for mercy, time for SSD
Has anybody mentioned the bloating of files, and how to debloat them?
I agree with replacing the old mechanical drive with an SSD. I run several old machines including laptops. The single most important thing I did for speed was to swap to an SSD in each of them. Those old computers would take 3 or more minutes to start up and several more to become useable. Each one now starts and is perfectly useable within 30 seconds and everything loads 10 times faster.
My Kingston SSD feels slower now after 1 year of mild usage (mostly watching videos)
@@satunnainenkatselija4478 What are you talking about
@@satunnainenkatselija4478 If your PC takes that long to boot, your boot drive and/or CPU must be complete trash. Also, as noted in this video, you could have way too many startup programs. I have a Gen 3 NVMe SSD with a Ryzen 7600 and Windows 11, and my boot time is about 15-20 seconds from BIOS splash screen to all startup programs opening. Your PC or OS install has some major issues.
@@satunnainenkatselija4478 Most of the time, old computers take long to boot because new software is just too much for them. It's not that programmers make it slower on purpose; they actually do the opposite.
@@Silverhazey_ That's the thing: old laptops have shit CPUs by today's standards, that's the point. You can basically revive these old PCs to run smoothly when you swap the HDD for an SSD. For 99% of PC users that's good enough.
Dave is a walking talking technical computer encyclopedia, I love listening to his knowledge, experience, and opinions
It's all well-known information. Absolutely nothing new; 20 years ago we used to get this info in online articles.
@@zochbuppet448 L+Ratio, you’re super jealous lol
How old are you, 12 going on 30? That line is extremely overused. Are you able to think for yourself, or does everything you think and do come from what you read on YouTube?
It's too bad he went to work for such a shitty company which should have been shut down under RICO laws 30 years ago.
He's inexperienced. He says that dust and clogged fans causing heat throttling is the only way he knows of that hardware can slow with age.
That is simply false, as is the assumption that software is nearly always the culprit.
Good info. I find these are the most important: 1. Updates: Windows, browsers, apps. 2. RAM: adding some RAM may improve speed. 3. Hard drive space: some of the users I support save too much on the C: drive and it gets full. 4. Malware. Many users don't pay any attention to these.
True re the RAM. First machine I had was a loan from brother in law a 486 DX-66 with 4MB RAM and a 200MB hard drive. Once I installed Windows 95 on it, it had about 35MB space left and I had to constantly install and uninstall games to play them. Put in a CD-Reader and another 4MB RAM and a second hard drive and I was in heaven. LOL.
@@Niiixxxx massive boost :p
5. Install Linux over Windows.
Way easier to let the pros do all that for me. Seriously, why hasn't this become the norm? In many areas it has, like social media. But why do we labor with silly tools like Drive, almost-impossible local network arrangements, antique printers, CD burners, and boxes of install discs?
I fixed my older machine from running really slow. I removed windows and installed Linux. Now it runs super fast!
Lol but does anything actually work on Linux
@@akale2620 idk what you are using that's so special. The majority of backroom servers and internet are Linux. M$ may own the desktop, but now all you need is a browser for most things
I have been using Linux for decades and NEVER had a problem.
I run 7 servers at home, all Linux
All the software I use is open source, FREE, and works perfectly.
You might want to try it
@@rty1955 ohh I know servers etc run on it. But it simply didn't work for me.
All we needed was to print a few pages for the family tax filing. Had Win10 that never worked for me, so I installed Zorin OS instead. Turns out the printer I have isn't supported by the manufacturer on Linux, and the open-source drivers didn't work because my specific model doesn't have any. Couldn't play any torrented games on it either: old games, Age of Kings and Mass Effect 1. Switched back to Win7.
@@akale2620 did u try the games under Linux running Wine?
I don't use my computers to play games. I do electronic design work & 3D modeling. I use email, word processing, personal accounting, spreadsheets, etc. I have been running Linux since 1996 and have never used Windoze since.
@@rty1955 I couldn't figure out how.
Nothing could be downloaded from Zorin's app store.
I applaud you and your wife's parenting by keeping your kid's computers out of their rooms so you can keep an eye on what they are looking at on the internet. You are a rare breed my friend.
I think if my parents did that my life would have gone nowhere (I got into IT security and I'm a computer engineer and in college now), but my dad didn't work for Microsoft so it went about as far as "computers are nifty" before I had to take matters into my own hands and start tinkering. At one point he got mad 'cause he found out I had the source code for like 1400 computer viruses, but he never found out I was wandering around other people's computers because back then you got on the Internet and your IM program gave me your IP and I could just type \\ip\C$ in the box and look at your hard drive. Good job Windows, I guess Dave didn't write that part.
Literally why does it matter
Just giving each kid a dedicated desktop PC of their own makes a huge difference. Most kids are just addicted to phones (and silly apps) without alternatives. Even the gaming community is different for PC games vs. phone games.
This is the first time I've even seen one of your videos and I'm impressed. Extremely articulate, well spoken, and obviously very knowledgeable. Thanks for the info. Will try this on my older secondary desktop.
Dave, there's a third scenario to why I watched that video. I came across your channel a few weeks ago after you published a video on Task Manager. After I watched it, I opened another one, about pointers. Your channel is absolutely fantastic. I hope you're having as much fun doing these videos as we do watching them. I missed such content here on YT🎉
“Real computers don’t get slow, they get relegated to full time Space Invader platforms”
Most laptops aren't even good boat anchors.
Doom and Panzer II
Like my Amber PET! :-)
Yes, Batocera Linux is an excellent way to turn an old machine into a retro gaming console.
@@johngillanders9694 For all who do not like Linux and its thousands of forks and clones: you may try out Android-x86 :) I used it on an old Phenom X4 and it runs fast and stable.
It's really amusing to see the amount of interest in something that has become so ubiquitous, yet mundane: the Windows operating system. Your channel is a treasure trove of information and entertainment pertaining to all of the little thoughts and questions that pop into my head while using my PC. Thanks for all of your hard work throughout the years!
You must not have been around in the late 1990s. Windows' market share is down to 68%. Back then it was over 95%. Nadella has ruined Microsoft.
Also, if you are running a mechanical hard drive clone it to an SSD and old machines will perk up significantly. I use Macrium's Reflect. I find it does an excellent job at doing an exact clone. Don't forget to also allocate some empty space, (usually around 10% of total disk volume), at the end of the disk for over provisioning. (I also highly recommend Samsung SSD's for speed and reliability.)
This is the single change that will have the most effect on an old computer. I've done it on my old laptops and it made a big difference. I did it on my old desktop, which still had a CPU overclocked to a relatively high speed, and it made a really big improvement on that computer as well.
Another effect of an SSD is that Windows changes its paging and caching algorithms. I think it uses less RAM with an SSD installed.
Did you say SSD old man? Lycom PCIe NVMe card with an NVMe drive is cheap and gets you booting at 5.5Gb/s
Agreed on SSD, and Macrium is my favourite cloner!
This comment aged worse than Biden.
About 15-18 years ago I switched to an Apple system because I was having to spend so much time doing maintenance, cleaning crap out of the system; maybe an hour or two a week, usually on weekends. I am/was a journalist/researcher/copyeditor and used the internet in some odd places. Very little of that time is wasted with my Mac mini.
Anyway, one cost is I still feel uneasy with my lack of familiarity with Apple operating systems. I don't often need to kludge around in them, but in a couple of cases that lack of familiarity caused big problems that would not have happened with Windows, or that I could have worked through. But I was amazed that after all these years, I understood precisely, or had used an earlier version of, every fix you mentioned! When you pointed to old versions of a file, I remembered the file!
Thanks, I enjoyed this.
Bonus bonus item: clean the physical machine. If the heatsink for your GPU or CPU is covered in a blanket of dust, you might encounter thermal throttling. Blowing that dust off can actually make your computer faster.
I keep the cold packs that come with my "old man" medications. I pack the laptop in them when we start each recording session (wife is a singer). Go ahead, fan. Just TRY and spin. I don't care what Louis Rossmann says..
Amen, Amen, Amen. I had one customer whose desktop was literally crammed full of dust stuck to layers of nicotine; I actually took the time to take pictures while cleaning it. The thing would power up but shut down from overtemp in about 30 seconds. I used the pics to educate him on one of the other hazards of smoking and to justify the additional time charged for the "repair". And that was really all that was wrong with the unit. Once cleaned, it ran beautifully. (Many other bad cases, but that was the worst.)
So many people don't realize just how much dust, cat hair and other airborne material those fans suck in, and how it affects the performance and lifespan of the system.
3:17
All current hard drives use zoned geometry: the outer tracks have more sectors per track than the inner tracks. Each zone has a set number of sectors, and as the zones progress toward the center of the disk surface, the number of sectors per track is reduced. This prevents the waste of space that would occur from limiting every track to the number of sectors that can be reliably recorded on the innermost tracks.
This zoning system started sometime around when the IDE/ATA LBA (logical block addressing) system came into widespread use. Early IDE drives emulated fixed sectors per track geometry for compatibility with older system BIOS and software.
Exactly! And at that time, the first tracks became the outermost ones, in order to hold more sectors per (physical) track, and more importantly, more sectors per revolution and thus per second, making the first tracks faster than the last ones. Drives can have 5 different sectors-per-track zones, possibly even more, but at a certain point adding another zone would gain a negligible number of sectors.
The "different angular velocity" fallacy is probably a mix-up with CD/DVD drives. HDDs don't have that, but the differences in _linear_ velocity matter. (HDDs with variable RPM exist, but that's done to save power when there's little I/O, e.g. if they stay on during the night as many servers do.)
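The outer-vs-inner track speed difference falls out of simple arithmetic: at constant RPM, more sectors per track means more bytes pass under the head per revolution. A sketch with made-up zone numbers (the sector counts below are illustrative, not from a real drive):

```python
# Why outer tracks are faster at constant RPM: more sectors per track means
# more bytes per revolution. Zone sector counts here are invented examples.
RPM = 7200
SECTOR_BYTES = 512
REVS_PER_SECOND = RPM / 60  # 120 revolutions per second

def track_throughput(sectors_per_track: int) -> float:
    """Sequential throughput of one track in MB/s (illustrative model)."""
    return sectors_per_track * SECTOR_BYTES * REVS_PER_SECOND / 1e6

outer = track_throughput(2000)  # outer zone: more sectors per track
inner = track_throughput(1200)  # inner zone: fewer sectors per track
print(f"outer {outer:.1f} MB/s vs inner {inner:.1f} MB/s")
```

Note the angular velocity is identical for both tracks; only the linear density along the track differs, which is exactly the point the comment above makes.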
As a long time PC Tech I thank you for repeatedly reminding people not to delete stuff they don't know or understand. The advice you give is spot on!
Thanks! You definitely don't want the "delete it all and let God sort it out" approach ;-)
@@DavesGarage Given that this is an issue, perhaps it's a case where Microsoft could follow the approach used by some other OSes (and some Linux distros): OS and all its "important files" are read-only - perhaps even sitting on a separate read-only partition - and updated as a unit.
I'm personally not a fan of the approach, not for my own systems (Windows 10 for MSFS, Arch for other games, and OpenBSD for the laptop because I need to compensate for my lack of beard growth), but since MS ships an OS specifically targeting "normies" it could be a thing. Or is this something that simply wouldn't work with how Windows is built? If nothing else, it seems like it would reduce the number of times I have to remote into a family member's laptop to fix things. :P
Too true. Read only for important files?
This video was a good kick up the backside to actually clear out my drives. Over 3TB cleared, my decade old HDD emptied for good. I don't know about performance improvements but it's definitely satisfying to see all the bloat gone.
Thanks Dave, I needed this :)
Thank you, Dave! I'm autistic myself, and am inspired by you in a huge way to make sure my non-verbal son can enjoy the gift of his life to an equally amazing degree! Take care Dave, and thank you so much.
Wow, thanks! Raising a non-verbal child takes a lot of patience; you're in my thoughts and prayers! Just remember the love is there even if it's hard for ASD folks to show it!
@@DavesGarage Dave, I am sorry for the late reply; I was somewhat intimidated by the prospect of conveying the depth of my gratitude for the wisdom you've shared and for the inspiring accomplishments you've achieved with your own savant-like skills. I am humbled by the humility on display. I assure you I learned much from my own youth as a verbally impeded individual, and I have been able to engage with a world far more receptive and cohesively committed to showing us that we are people, too. I am almost forty and he is almost five; I remember to warmly show him the love that I know is critical to his healthy development and to his appreciation of who he sees looking back at him in the mirror. I am so lucky to be his father, and Tanner is so dear to me. He smiles when he wakes up, and he is so happy to be alive. He's incredibly intelligent, although it carries a penalty I think is mutually known between you and me. As a father, thank you for the kind words reinforcing a reality this parent can attest to: it stays amazing as long as one is receptive to the somewhat unique expressiveness we have. Take care, Dave, and thanks again!
I love it! And I mostly agree with it. I would however have liked two things. First, a bigger warning that defragging is for spinning rust only; like a seizure-warning, epileptic-fit-inducing, flashing-lights-and-sounds big warning, because not everyone is an expert and forcing a defrag on SSDs is bad. Second, an actual in-depth look at how the performance of your SSD degrades over time and how each one is a bit... different. SLC is generally problem free, but low capacity. MLC, TLC, and QLC (and soon PLC) have... things... to watch out for. Storing more than one bit per cell, their first write is faster than subsequent writes, so write performance varies. They also tend to cache a portion of their capacity as SLC: as long as you don't fill the drive up, you can enjoy that cache speed; fill it up, however, and the speed goes bye-bye. Sometimes deleting files gets you that speed back, sometimes not. SSDs are... complex, and can be a HUGE factor in why your modern PC slows down with age. A cohesive analysis of all the complexities to be aware of and what can be done would be a nice part 2, IMHO. You could also look at how some anti-malware packages are "better" than others about slowing down your system, if you need more time to fill. Maybe go into background tasks? Bloatware from folks like Dell?
SSDs have to balance write cycles, i.e. to ensure that certain frequently used areas like directories don't wear out soon, they have to do some extra book-keeping and move those areas around. That "moving around" isn't trivial; simply put they have to store information "where sector 1 is today" to find the data, and _that_ information might change often, too.
Long story short, defragging the SSD uses write cycles; using the _full defragmentation would be a major waste of SSD lifetime_ . A more selective mode, which leaves all but the most badly fragmented files untouched, isn't half as bad, but shouldn't be scheduled for regular maintenance either.
Yes, there are statistics which claim that defragging SSDs is important. They claim that a regular HDD went from 500 to 1000 I/O operations and SSD went from 8,000 to 10,000 (so the HDD gained 500 and the SSD gained a whopping 2000), but that's not the metric you should look at.
Why? Because you don't get up and say, "Today is a good day, I'm gonna do 100,000 I/Os" -- the number of I/Os is usually defined by the stuff you do. What you should look at is how long those I/Os take. That'll be 200 vs. 100 seconds on the HDD, and 12.5 vs. 10 seconds on the SSD. So, defragging would save you almost 2 minutes on the HDD, but only 2.5 seconds on the SSD.
Unfortunately, _some_ defragger makers know that very well, but use bad metrics to sell defraggers to SSD users -- a very malignant practice.
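That comparison is easy to sanity-check. A minimal sketch, using only the workload size and IOPS figures quoted in the comment above (these are the commenter's illustrative numbers, not benchmarks):

```python
# Back-of-envelope: time to finish a fixed workload of 100,000 I/Os
# at the quoted IOPS figures (HDD: 500 -> 1000 after defrag,
# SSD: 8,000 -> 10,000). The workload size is the "100,000 I/Os"
# from the example above.
WORKLOAD_IOS = 100_000

def seconds_for(iops: int) -> float:
    """Wall-clock seconds to finish the workload at a given IOPS rate."""
    return WORKLOAD_IOS / iops

hdd_saved = seconds_for(500) - seconds_for(1_000)     # 200 - 100 = 100 s
ssd_saved = seconds_for(8_000) - seconds_for(10_000)  # 12.5 - 10 = 2.5 s
```

The absolute I/O count gained looks better on the SSD, but the time saved (what the user actually experiences) is 40x larger on the HDD.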
Not true at all. VSS works better if your disk is defragmented, it even does it on SSD's on a schedule. It's normal. Enjoy your RUclips KNOWLEDGE though. A SSD is a block device it can and should be treated like a hard drive, it's faulty if you can't.
@@itstheweirdguy "A SSD is a block device it can and should be treated like a hard drive, it's faulty if you can't."
Then I guess SSDs are faulty, according to you. Because while you certainly can do that, as in nothing stops you, you're also definitely shortening its lifespan. They have a limited number of writes. This is a well documented fact of flash memory. Defragging a drive is moving data, eating up those limited writes. This is how flash memory has always worked. And with every "level" added to a flash cell (SLC->MLC->TLC->QLC) this gets significantly worse because of the way these levels increase the capacity. So as you shorten your SSD's lifespan by defragging it, "Enjoy your RUclips KNOWLEDGE though." I hope your non-RUclips knowledge (whatever that is) brings you some solace when your SSD dies.
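The "limited writes" point can be put into rough numbers. A hypothetical sketch (the drive spec, defrag volume, and schedule below are all made-up but plausible figures, not measurements):

```python
# Hypothetical endurance math: how much of an SSD's rated write endurance
# a defrag schedule consumes. Assumed specs: a 500 GB TLC drive rated for
# 300 TBW (terabytes written), with a weekly scheduled defrag that
# rewrites 50 GB of fragmented data.
TBW_RATING_TB = 300
DEFRAG_GB_PER_WEEK = 50
WEEKS_PER_YEAR = 52

yearly_defrag_tb = DEFRAG_GB_PER_WEEK * WEEKS_PER_YEAR / 1000  # 2.6 TB/year
endurance_share = yearly_defrag_tb / TBW_RATING_TB             # under 1%/year
```

Even where the share looks small, it is pure wear with zero payoff, since an SSD has no seek time to save; on cheaper low-TBW drives, or with heavier fragmentation, the fraction grows accordingly.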
I had an insane performance increase when swapping from an HDD to an SSD. That said, there are a lot of other features you can turn off that boost performance, such as drop shadows or Windows Search. I also find that programs open way faster by opening them one by one as needed rather than them all fighting each other for the same memory and disk reads upon boot up.
Remember the water-cooled CPUs in the early 2000s? lol, water in a tower PC.. EDO and non-EDO RAM, limited to 128MB, the Y2K bug threat.. we have come a long way since 1999
wow thanks capt obvious
As someone who grew up with xp the format option is what we did all the time. It's strange that you mention it as a last resort. We formatted our computers so much we memorized the service pack 2 key. Lol
❤ xp
Ya, Every now and then... you are right😅
Same here. If there's a problem that takes me more than two days to figure out, I generally say screw it and reinstall Windows. Might as well use the third day setting up a system I know is fully working, rather than potentially still not having the issue solved and wasting yet another day.
dont tell me.. fcckgw or something like that xD
I was on win 98 till probably 2016
Great solid top level advice Dave. And yes, in your opener I have been that person having to try and speed up relatives PCs. My favourites are 1) disable all but the necessary start-up items and 2) convince them to use Windows Defender and ditch 3rd party offerings. My other trick is to turn up with an SSD as a "present" and amaze them with the speed improvement after imaging the C: drive to the SSD. Love your work.
My Dell Optiplex 780 had become very slow. Upping the RAM from 4GB to 10GB helped, but not as much as I had hoped for. Then I found your vid and did all 5 fixes. Wow, what a huge difference!
Dave - I used to use WinDirStat to map out drive contents, but I've found that WizTree is much faster. It scans the MFT of the machine instead of seemingly looping through all the files. While either will do the job, just thought I'd throw it out there in case you hadn't heard of it before. Have an awesome Friday!
Treesize (free) is another alternative that I've found to be pretty fast compared to WinDirStat. Not as fast as wiztree, especially for very large drives, though. But it's another option.
TreeSize as well
WinDirStat is open source... and good enough. The other alternatives are binary only distributions.
A brilliant session. Very well presented. I got so much from this too. We have such a machine in our family collection. You have prompted me to become rather thorough in my work to improve its performance. Much gratitude to you for jogging me into action and superbly providing me with a master class. Cheers
hi, just upgrade to an SSD if you don't have one. A 1TB drive is down to 60 bucks last I looked. You will be amazed at the speed difference.
11:16 A note on WinDirStat: It's something of an old program and is somewhat slow. Wiztree is a newer variant and runs so much faster. Helps the file trimming go much faster when you aren't waiting minutes between drive scans
My preference is actually apps using Sunburst graphs. I find them easier to read. I don't use Windows so I am not sure what's available in that department, but I know they existed in the past.
Wiztree is sooo fast, I still prefer Jam Software's Treesize, just find it more readable, even though it's slower.
Great video and insight for Windows users! Although I feel sorry for Dave's iMac. I just upgraded my 2015 21" iMac from LMDE 4 to LMDE 5 (short for Linux Mint Debian Edition); the computer is now faster, more reliable and has better functionality, and it will keep running like that for the next 3 years (minimum), regardless of how many programs I install. That will never happen with Mac OS or Windows, unheard of.
Revo uninstaller will make installed software and all traces thereof, disappear like it was never installed.
If anyone hasn't done so already, the single biggest upgrade you can do is replace a hard drive as boot device with an SSD.
Absolutely! If you're still rocking rotating rust, get an SSD now!
I've upgraded all my machines at work with SSDs whether they be SATA or Nvme and never hear about people complaining about their machine being slow anymore. Now I can spend more time watching Dave's videos haha
even better is a second SSD for the PAGE file, Dave can explain why.
not if its running linux. do not install linux's /root /home to SSD! put each of these guys on a separate partition to a HDD instead. linux is so much faster than winX that the difference doesn't matter (linux's throughput will still be faster). there's a REASON why every server on the planet is running HDD (well almost every server is also running linux and the people managing these guys ain't stupid).
@@leecowell8165 Clearly you are delusional. Speed differences tend to be down to the actual program implementation. There are speed differences between OS, though generally not huge, and nothing anywhere near enough to make up for HDD snailspeed. SSD can easily be 50x faster in sequential speed, where HDD are good at, and thousands of times faster in random access, where HDD are horrible at. Linux can't magically change the laws of physics. It is physically impossible for an HDD to outperform even budget SSDs.
Also, every server on the planet is not running HDD. There is a server hosting platform, Linode, they have been hosting on SSDs as long as I have used them, 8 years, and not sure how long before that. Many servers still use HDD, because they are cheaper per gigabyte. Many server purposes don't require high speed data access, such as archiving, and so there is still a purpose. For many servers though, they are switching partially or exclusively to SSDs, since costs have come down. I remember going to a conference, and learned Facebook was partially switching to SSDs... this was 9 years ago.
I can agree that Linux has many server advantages over Windows, but HDDs is not one of them.
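The "thousands of times faster in random access" claim from this thread holds up to a ballpark check. A sketch with rough, order-of-magnitude figures (assumed, not benchmarked):

```python
# Rough random-access comparison. A 7200 RPM desktop HDD manages on the
# order of ~75 random IOPS (dominated by seek + rotational latency);
# even a budget SATA SSD manages tens of thousands. Both figures are
# ballpark assumptions for illustration.
HDD_RANDOM_IOPS = 75
SSD_RANDOM_IOPS = 50_000

speedup = SSD_RANDOM_IOPS / HDD_RANDOM_IOPS  # hundreds of times faster
```

Sequential throughput differences are far smaller than this, which is why HDDs remain viable for archival workloads while feeling unusable as boot drives.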
Just stumbled accross this and found it all really fascinating and well presented. Happy to discover at the end a bit of your background. You must have known my Father's brother Ken Dye who ran the MS usability lab in the 90's. He was a wonderful, beloved uncle who passed away early last year.
If your computer has a HDD, then run defrag every month or so. Another thing you could do that a lot of people overlook is checking your Services tab and seeing what's set to run automatically or delayed. Try to see if you can set some of the services to manual. This can result in faster boot times, and the program will still work once you try to open it. Opening up the laptop or desktop and cleaning out dust can reduce thermal throttling with proper air flow.
Bonus bonus tip: if Explorer seems to be getting sluggish (taking longer or even freezing up while switching folders, generating thumbnails, dragging and dropping files, etc.) there's a decent chance that your drive is in the process of failing. (Especially if you have multiple drives and these symptoms only present themselves on one of them.)
The reason being that Explorer needs to wait longer while the drive retries the read operation potentially several times before either getting a result or bailing out and presenting you with an error after all.
Is that the ONLY file manager avail under windows? Does it have a split panel like Nemo or Nautilus over on linux? No probably not, huh? Doesn't windows check the drives at startup? No probably not, huh? Yeah linux is slow as hell to start but there's a reason why because it catches shit like this.
@@leecowell8165 I think windows actually has a program that scans for the health of a drive, though. Although I'm not sure if you can use it to set an alarm.
@@leecowell8165 I'm not aware of any current (desktop) Linux distro that performs a full bad sector scan on startup.
@@generalardiThat's because that isn't the way it is done. Enable S.M.A.R.T monitoring on your drives so you know when they are failing. ZFS (as one example) on *nix actively scrubs the disk in the background looking for and repairing disk blocks that don't match their checksum.
@@leecowell8165 another Linups sectarian...
> Is that the ONLY file manager avail under windows? Does it have a split panel like Nemo or Nautilus over on linux? No probably not, huh?
Windows does have those. Even more and more functional than Linux does. Among the most widely known are FAR, Total Commander, Directory Opus...
But in the latest Windows versions Explorer is getting better and better. So people gradually abandon 3rd party ones. Even I, a TC user since the time it was called Windows Commander, sometimes prefer Explorer because it has become more convenient.
Compare all that to Linux distributions' native file managers. They're just trash.
> Doesn't windows check the drives at startup? No probably not, huh?
Probably it'll be unbelievable to you, but YES! Windows does! In case any problems are suspected be it ungraceful shutdown or a "dirty" flag in FS - Windows runs `autochk` during boot.
> Yeah linux is slow as hell to start but there's a reason why because it catches shit like this.
Aaaaand despite that I personally know some mates who got their storage corrupted while they use Linux distribution! I personally experienced storage corruption after Ubuntu installed updates! What an irony: neither the first nor the last I ever experienced on Windows.
Just facts proving your points are bullshit.
One other option to reclaim disk space (I often do this around the same time as running chkdsk and sfc) is to run dism with the check health switches to remove any cruft I can from the sxs directory that grows to gargantuan sizes over the years as updates and patches are installed.
I love how authoritative this channel is. No hand-waving or shrugging: "I trust Windows Defender because the guy who runs its development worked for me and he's really good"
As an IT professional this is fantastic information in particular for helping relatives (as Dave mentions). He dispels some common misconceptions and helps prioritize repairs. Grateful for this content.
I have a 2015 macbook pro with 16 GB of RAM. I bought it in January 2019. It has the intel CPU
The problem isn't "computers" it's the massive sewage pile of windows bloat BS.
My 2015 macbook pro seems / feels as fast as my 2023 macbook pro with 32 GB of RAM and M2 Pro chip.
It's *windows* computers that slow down. Since 2013 I haven't owned a windows PC. I love their upgradability but windows is certified junk.
@@rohitnijhawan5281 Because Mac doesn't use a registry that has to be kept resident in memory. Linux is much the same; config files or plist files that are read as needed are far superior to registry-style management. The whole time he was talking about registry related issues all I could think is... this is a Windows specific issue.
Can you timestamp when he gives info? I'm having trouble finding it.
Dave, two tools you didn't mention are "AUTORUNS" to edit startups & "PROCESS EXPLORER" to get a better detailed look at what is in memory and the process tree that is attached to the executables. But good cleanup video for the medium tech inclined user.
Yeah, big up for the various Sysinternals utils! These two are the most generally useful, but for digging deeper into specific issues, the rest of the suite is gold too. Which is why MS brought the Winternals dev studio into the corp in 2006.
Neuber's Security Task Manager can sometimes find and help quash a hidden malicious process.
Here's a quick maintenance tip. As a preventative measure, when downloading files from the internet to an HDD, if you have a second drive, transfer the file via cut/paste to the other drive then cut/paste again to its final destination. This will prevent the file from being fragmented from the download process and make future defrags a lot faster.
I found this out on my own when downloading large files, then running a scan on my disk for fragmentation and seeing massive amounts of it. Then I discovered that simply moving the file between drives is like a form of defragging. Test it out yourself.
Finally a video that is 100% actual facts! I've been studying computers my whole life and have had to argue with people who claim silicon degrades over time and runs slower because of it. I mean your clock doesn't slow down if the silicon degrades, it might just become unstable (BSOD) under load or not work at all but won't slow down.
Usually the actually slow computers are ones that have 4GB RAM and a HDD while running the latest version of Windows, where Windows Update is constantly active, the modules installer worker just doesn't stop using the HDD making paging also impossible and everything just takes minutes to respond even just opening a context menu. If you can't get 8GB RAM or an SSD or both, using Windows 7 or 8.1 will give a lot more performance but won't get security updates.
Using the latest version of major Windows doesn't always help. I noticed that newer Windows 10 and Windows 11's Explorer and DWM are way more CPU intensive, and even on a fresh installation leave a dual core CPU at 100% on idle while seemingly accomplishing nothing with those calculations being done.
The different Windows 10 builds have different levels of idle performance, with slightly older versions of Windows 10 usually giving a little bit more performance (1809 vs 22H2 for example). I guess Microsoft just added more bloat to Windows 10 when they were putting their focus on Windows 10X and 11 instead.
Disabling Windows Update is not a good idea but it can speed up an old computer with a HDD by a lot because it won't constantly be modifying the component store. Windows Defender is less intensive but it can also slow down the downloading of files significantly.
Uninstalling apps blindly isn't efficient. If you know how to use Task Manager to actually see which processes are running, you should focus on uninstalling or disabling the processes that actually use your resources.
Modern software is more built to work on SSDs and faster computers with less focus on that performance for old hardware.
If you want to use old hardware you can, but to make the most out of it you might need to use older versions of software which means you are giving up security, so you'll have to be more careful with what software you run and which websites you visit if you decide to do that. Connecting to the internet should be fine on any machine (with IPv4) as long as you don't send requests out to malicious websites and don't have your ports open and firewall disabled. IPv6 is a different story but also harder to constantly monitor by malicious people.
You didn't spend much time on thermal throttling but I've found it to be significant with my old laptop. Laptops are used on your lap, or laying on the sofa, or the bed, and their intakes can bring in bits of lint. My laptop had a very small air cooler and right in front of that air cooler was a big dust bunny. I removed the fan, removed the dust bunny, and the computer didn't thermal throttle as bad. In most situations I don't think this would affect a big desktop computer as much unless it is in a pretty bad environment.
true story...
sometimes refreshing the CPU/GPU thermal paste also yields improvements
A laptop was NEVER intended for that, and they will clearly tell you NOT to use it on a sofa, bed or anything carpeted... if you're doing that you're hurting the device outright, which is a completely different issue entirely I'm afraid. ^^;;
@@Arcanua at the very least LAPtops are intended for your LAP; although better than carpet, it's still not the same as a desktop, which is the point.
Hardware wise, I think you nailed it Dave. Cooling is a big thing - esp on laptops. Very often when a laptop battery gets old, a lot of the charging current ends up as heat in the battery, this - possibly in conjunction with clogged vents or fans - can cause thermal throttling. I have revived several laptops for people by cleaning the airways and replacing the battery after noting that the old (usually original) battery was almost too hot to touch!
Completely agree about that hard drive "Outer edges" thing - it's rubbish cos (a) of the factors you cite and (b) cos on modern drives all access is via LBNs (Logical Block Numbers) at the OS driver level and at the cluster level above there. How the mapping of LBNs map to physical drive surface locations is a per-manufacturer/per-model decision implemented in the drive's internal firmware and varies quite a lot. In short, it may not use a "Start from the middle and work out" algorithm.
All that HDD optimisation stuff that was built into early Unix systems (and for all I know early DOS and Windows 3.1 systems) and used intimate knowledge of the disk drive hardware, such as average seek times, head switch times, cylinder read rates, rotational latency minimisation and read or write clustering to get the best possible performance out of hard drives which, even with all that stuff, was pretty poor by today's standards. Thankfully all that optimisation stuff moved into drive firmware in the late 1990s for the most part and around the same time, they started including thousands of spare blocks to transparently replace any main blocks that became unreliable with age.
So now (aside from manufacturer diagnostics) hard drives are black boxes offering a huge bucket of perfect 512 byte storage blocks, each with a unique LBN and that's about all that anything at the OS level knows, or needs to know. About the only optimisation you can do now at the OS level is to change the cluster size to suit your use of the storage (video files = big clusters, thousands of tiny files = tiny clusters). In short, anyone trying to tell you that you can affect physical placement of your files, is probably trying to sell you something!
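The one optimisation left, cluster size, is a simple space/throughput trade-off. A sketch of the slack-space side of it (the file counts and sizes below are illustrative, not from the comment):

```python
# Sketch of the cluster-size trade-off: a file always occupies a whole
# number of clusters, so the unused tail of its last cluster is wasted
# ("slack"). Big clusters suit a few huge files; tiny files multiply
# the waste.
import math

def allocated_bytes(file_size: int, cluster_size: int) -> int:
    """Bytes actually consumed on disk for one file at a given cluster size."""
    return math.ceil(file_size / cluster_size) * cluster_size

# 10,000 tiny 2 KB files:
small_clusters = 10_000 * allocated_bytes(2_048, 4_096)   # ~40 MB allocated
large_clusters = 10_000 * allocated_bytes(2_048, 65_536)  # ~655 MB allocated
```

Same 20 MB of actual data either way; the 64K-cluster layout burns over 600 MB extra, which is why "thousands of tiny files = tiny clusters" is the right rule of thumb.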
@@gorak9000 Agreed that the registry was not a great idea. However, Windows contains one hell of a lot of other features which are. As a long long time Unix user I was so glad to discover PowerShell for example - a very powerful command line environment with unified syntaxes and excellent documentation which nevertheless takes on board all the key goodies of Unix shells.
Today, as you noted, drives use LBN. I still support some non-Windows servers that use 300gb drives. Since no one actually MAKES 300gb drives anymore, often if a system won't use the 600gb drives we have in the logistics system, a drive will be destroked. I.e. a larger drive has firmware that just makes the drive look like the old 300gb drive.
Most of the systems WILL take the destroked larger drive. It doesn't really help as they are usually mirrored, so unless it's the second of a pair being replaced, the volume is still based on the 300gb drive originally built upon.
It is amazing how much thermal throttling can affect performance. Even with a tiny improvement in cooling/heat transfer, the machine can run maybe 12% faster, which doesn't seem like a lot, but it is noticeable.
yeah people are STUPID. they run these things on pillows instead of hard surfaces insane stuff like that. I rewired my aspire to keep that fan running 100% of the time. however its plugged in almost all the time anyway. heat is thy enemy.
@@leecowell8165 Yeah, I remember seeing a friend sat with a soft rug on her lap and the laptop on top of that. When I asked why the rug: "Oh cos the laptop runs so hot it burns my leg" - Doh!
#1 upgrade for an older computer is: replace the HDD with an SSD.
Many moons ago I would speed up PCs by upgrading hard drives for people.
Yup, that's actually what I had to do for my wife's iMac. Had a "fusion" drive, but thanks to thunderbolt I was able to install a fast external ssd!
Sure but SSD's have a limited life span. Prepare to image and replace.
@@crankshaft3612 so do spinning platters of rust. Motor bearings die. Integrated electronics cease to function. It's a crap shoot. But yeah, SSDs die too.
@@crankshaft3612 I bought my SSD in 2012, still using it after ten years. If you don't write a lot of data on them, they can last for ages, specially the SLC ones.
This guy is very well spoken, & far too knowledgeable for me. I'm new to computers, so I will just look & listen.
Something I found throughout the years: Many of the SATA cables that I got with motherboards/drives back when it was a brand new standard got glitchy over time (also wobbly in their connectors, although I'm not sure they weren't like that new). Sometimes loading something would just stall for a few seconds or even hang the system. If the system decided to "hang", moving the cables around a bit would often get it moving again. Even if all seemed fine, drive benchmarks would sometimes be "slow" because the controllers fell back to the lowest speed they could, as they were getting too much garbled data. On HDDs the actual disks would make clicking/chirping noises (Seagate drives here).
If you run into something like that, check the UDMA_CRC_Error_Count (or something similar) SMART stats. If it's high, replace the cable before replacing the disk to get everything running smooth again.
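Checking that attribute can be scripted. A sketch of pulling the raw value out of a smartmontools-style table; the sample text below is fabricated, and while real `smartctl -A` output follows this general column layout, verify against your own tool's format:

```python
# Illustrative parser for a smartmontools-style SMART attribute table.
# SAMPLE_SMART is made-up data in the general shape of `smartctl -A` output.
SAMPLE_SMART = """\
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      RAW_VALUE
  5 Reallocated_Sector_Ct   0x0033   100   100   010    Pre-fail  0
199 UDMA_CRC_Error_Count    0x003e   200   200   000    Old_age   1472
"""

def crc_error_count(smart_output: str) -> int:
    """Return the raw UDMA_CRC_Error_Count value, or 0 if not present."""
    for line in smart_output.splitlines():
        fields = line.split()
        if len(fields) >= 2 and fields[1] == "UDMA_CRC_Error_Count":
            return int(fields[-1])  # RAW_VALUE is the last column
    return 0
```

A high and growing raw value here points at the cable or connector, not the platters, which is exactly the "replace the cable before the disk" advice above.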
Yup. After a few rounds of "WTF?" I finally learned to unplug/plug (exercise) the SATA connectors every few months.
@@geonerd yeah, I noticed the performance issues, and then went digging as at first glance the disks were fine, and found a lot of messages in the kernel log relating to re-initializing SATA connections and lowering the speeds due to errors. Ever since then I've just had a bunch of extra cables and replace them every now and then.
Kinda makes me wish back to those good old IDE cables that once plugged in could be hell to remove. I'd rather have issues pulling them out/off than having issues with them coming loose/crapping out creating weird glitchy behavior.
Just went through a month of hell moving to larger drives and having errant boot problems as Win 10 decided the offering was insufficient and a full check disk was in order (and Win 10 isn't kind enough to tell you which disk(s) it is unhappy with, and only ten seconds to skip the process after the spinning circle does some incantation for an unknown length of time).
Two bad cables (and brand new!) and cursing windows move to hide the sausage (seriously considering moving to dual boot Win 7 and Solus).
Oh, and the old trope of some disks being happier in a particular orientation still holds even today (some REALLY don't like being vertical).
Crikey, I wonder if some of the problems I have suffered with Windows 10 was actually just a SATA cable.
@@wayland7150 I've found them to be quite unreliable. I've had ones that were fine for 5-10 yrs and then started acting up even though they weren't touched and the system hadn't been moved. It's weird.
The 1st change I tend to make when asked to "speed up" someones machine (with a HDD) would be to image it onto a higher capacity quality SSD. That provides a cold backup and a 30-ish-X IOPS bump. Basically a do no harm update that tends to be cost effective. Followed by all of the steps in this video of course.
RAM isn't always accessible, and when it is it may not be available or have sane pricing.
this especially with most OS's now days hammering the disk forever
Yeah this^. Reinstall is preferable imho but imaging is a good way to preserve everything. Generally not even worried about boosting the capacity these days - a lot of people I've seen use a tiny percent of their drive and do everything online. Often customers say a smaller drive is even ok if it's cheaper.
@@jjjacer I always forget how utterly unusable Windows 10 in for several minutes after boot on a mechanical disk, and every time I go to Task Manager all "WTF?!?!" it's because the spinning rust is pegged at 100%.
SSD all the things.
@@QualityDoggo I've had customers with 2 TB HDD that was failing (because laptop HDDs last about 5 minutes) and they only used ~80GB including the OS, software, and update files. I think I installed a 256GB SSD for them and the system worked good as new.
What happens if the machine already has an SSD?
Or, to put it another way, SSDs can only be a partial fix to NTFS fragmentation problems. Sure, you don’t have to worry about seek delays, but if you have to access the same data via separate smaller reads/writes versus one big transfer, the multiple smaller transfers are still going to be slower.
And remember, the NTFS defragger had to be disabled on SSDs, because it was shortening their life.
Hard drives do use constant angular velocity, but the number of sectors per track does change: fewer sectors per track at the inner part of the drive, more sectors per track at the outer tracks. Check documentation for Seagate drives; disk-to-head transfer rates are quoted for the outer track. IBM described the change in sectors per track as "notches". Even back when disk drives were described using tracks, sectors and heads, the drives had the smarts to translate that to a block number and locate the data on the drive. Data is written and read from the drive at different speeds to allow the bit density to remain constant over the surface of the drive. 3.5 inch hard disks could have as much as a 3-fold advantage in both speed and reduced head seek at the outer edge of the drive as opposed to the innermost track. Think this started with ATA/IDE drives.
Can you send me a link showing variable geometry on Seagates? Or let me know whcih model you're thinking of? I'm curious!
@@DavesGarage Not Seagate in particular but very in-depth article titled "HDD inside: Tracks and Zones"
@@DavesGarage I second what @andrewscott1451 has written. Some 2 decades ago, I wrote a simple terminal proggie for Linux (using ncurses) that reads a disk drive sequentially and displays a progress bar, and also an instantaneous transfer rate gauge. You can read the whole block device start to end. If you do that, you'll find out that the drive is indeed faster at the start (= along the outer edge) and gets gradually slower. In the late noughties, the typical 3.5" desktop drives of the era would start at about 105 MBps and gradually slow down to maybe 40 MBps towards the end. This is a known fact. The inner geometry is not published in the datasheets, but the observable behavior is universal, across spinning rust vendors and models. The max/min/avg transfer rate develops coarsely with drive capacity - more precisely, with data density per square inch (or what units you are used to) - and the dependency contains a square root! Guess why. Hint: the bits along the track get denser, but the tracks are also narrower and closer together.
The ratio between the sequential rate at the start and at the end had a notable correlation with the radius of the tracks at the start and end. Note that "enterprise" drives (Seagate Cheetah) were using a narrower "ring" on those platters, and therefore this ratio of outer/inner transfer rates was also lower.
My HDD test proggie can read and write either sequentially or using random access - which also gives funny results. Such as: over the last two decades or so, while the disk drive capacity used to grow, the "fully random seeking capability" has remained pretty much constant, at about 75 random seeks (IOps) per second - for a desktop 3.5" drive at 7200 RPM. Enterprise drives can do more, maybe up to 300 (steady) IOps with TCQ/NCQ enabled and preferably also with transaction reordering in the host-side write-back queue. Notebook drives can do about 60. That was before we got blessed by the Shingled Magnetic Recording :-(
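The outer-vs-inner throughput ratio in those observations follows from simple geometry. A sketch of the model (the platter radii below are hypothetical but plausible for a 3.5" drive; they are not from the comment):

```python
# Why outer tracks read faster: at constant RPM and roughly constant
# linear bit density, bits pass under the head at a rate proportional
# to track radius. Assumed usable band on a 3.5" platter: roughly
# 20 mm (inner) to 46 mm (outer) from the spindle.
OUTER_RADIUS_MM = 46.0
INNER_RADIUS_MM = 20.0

# Sequential rate ratio, outer edge vs. innermost track:
rate_ratio = OUTER_RADIUS_MM / INNER_RADIUS_MM  # ~2.3x
```

That ~2.3x is in the same range as the observed ~105 MBps start vs ~40 MBps end, and it also explains why enterprise drives that use a narrower ring of the platter show a smaller outer/inner ratio.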
i used to install so much on my windows 98 machine back in the day i knew the ins and outs of the registry, never ran an antivirus, and was willing and able to do a complete reinstall religiously about once a month... after evaluating all those juarez...... lol. sometimes stuff just bloats like you said. i love how you said exactly what i was thinking once again dave! THANKS man, maybe ill be joining up in your discord or something where we can actually talk about some things, but for now i will post my praise here. at least for me, i really dig your channel and persona. keep on rockin man.
A lot of OEM thermal paste degrades and/or pumps out so it's a good idea to do a sanity check on CPU temperatures with HWiNFO and Prime95 and repaste if temps get out of control. (by out of control I mean thermal shutdown, throttling below base clock speed, or reaching a steady state temperature over 90C with stock power limits (12900K excluded))
I just use Speccy to do a quick test and it once caught a PC which was a bit too hot while doing nothing.
This video is addressed to a different level of users. And if you haven't changed thermal paste in older laptops many times before, it would be safer and less time-consuming to have a repairman handle both diagnostics and the procedure itself. Most of the time diagnostics will be free no matter the result.
Replacing thermal paste isn't something I've had to do often in the repair industry, maybe once or twice. Practically all cases of thermal runaway I have seen were due to vents clogged with dust and debris, or the occasional bad fan. For that reason it's still not a bad idea to do what you said and check temps. But I'd start with the simplest solution first, checking dust and fan performance, before delving down into deeming the thermal paste bad.
As far as thermal paste itself goes, yes, there is good and bad thermal paste. But it all gets to the point where it will look "dry" while that's not a solid indicator of whether or not the paste has "gone bad." The thermal paste industry is kind of gimmicky in that they spend a ton of marketing money to convince people that it in fact does go bad regularly. Even the metric they use to measure performance is watts (of heat dissipation) per square meter, which when scaled down to the size of a CPU means a difference of 10 watts per square meter will be hardly noticeable, if it is at all. That's not including the fact that once the HSF is squeezed against the CPU, the thermal paste actually doesn't cover it from end to end, only the microscopic crevices created by the manufacturing methods. Maybe a coverage of around 5mm total is done with thermal paste.
I have used RadioShack branded "silicone thermal" paste, which only cost me 35 cents for a 2 ounce tube on an Athlon II X4 before (replaced stock paste because I had to undo it to test the CPU on a friend's board). Worked fine for 2 years, and probably would have gone longer if I hadn't upgraded the cooler for the sake of overclocking, so I opted to use its included thermal paste. Regular silicone paste is about the worst you can get performance-wise.
All thermal paste degrades over time. It's a consumable. Eventually, it dries up and stops performing as it should.
@@BrunodeSouzaLino Only in overclocking, and in cases where people run their computers for literal months while clogged with dust and cigarette smoke residue, have I seen thermal paste go bad enough to really matter. Thermal "pump-out" is actually the issue, not drying up (drying turns the paste into a powder, often visible on the video card or the bottom of the case). Modern PC CPUs at stock generally don't run hot enough to do that. As for drying out: apply some paste and check back in 2 months. It'll likely be dry, so does that mean you should replace paste every 2 months?
And yes, I have had paste dry up so badly that pulling and twisting on the hsf while removing it also caused the CPU to pull out of a latched socket before. Monitoring software showed no thermal issues prior to doing so.
Generally when people replace thermal paste they address other issues while not realizing it -- Including removing dust and reseating fan connectors. They then attribute those gains to the paste.
Which leads me to my last part: telling users via a social media platform to just replace their paste without investigating much simpler issues first actually does them a disservice, as they're more likely to damage something tearing their PCs apart that much with no prior experience, just because some tech bro told them to on YouTube.
One thing replacing thermal paste as a first jab at diagnosis does help is the bottom line of the companies that produce it. It's all about the money. Converting old PCs to SSDs, on the other hand, especially when they have an HDD that's on its way out, always gets me noticeable performance gains.
Totally agree that Defender is all you need. Malware can be very odd. Had a case back in the 90s. A user got reprimanded for visiting a ton of porn sites. The router monitoring software said so. I knew very well she wouldn't do that at all, particularly at work. I checked the browser history, but nothing, which I expected. While I sat at her computer the monitored hits kept coming. Turns out there was some malware doing web gets to porn sites, with a referrer code in the get, to generate revenue. The content of the sites went straight to null, so no-one was the wiser.
I can't rely on any application from MS because they are just as likely to dump it and fail to support it. I'm thinking mostly of email programs, which have gone through too many changes. And because MS thinks annoying users is a good practice. The same applies to Defender. It is apparently quite good at the moment, but a few years ago it went from a good detection rate of above 95% to only 50%, according to AV-Comparatives. At that time AV programs had to be reliable, as so many viruses were in the wild.
If you never compare, you'll never know the difference. I have a desktop at work, and work remotely. Both home and work systems have Outlook open on the same account. ESET antivirus on the home computer regularly detects phishing and malware in the Outlook inbox that the PC at work, which only has Defender, does not. With some products that could be false positives, but these are genuine phishing emails getting flagged by ESET.
Hi Dave! New subscriber, glad you're here. Just for fun I thought I'd mention one other way that hardware can degrade over time--the breakdown of thermal compound on the CPU/GPU. Granted, that's beyond the scope of most folks and your system suggestions here are top-notch. :) Thanks for this video!
Quite true, cleaning off the old and putting on fresh can help, but usually you don't notice till after the computer warms up. When that does happen, it's a good idea to clean the fans and blow out the dust too. I have 2 cats, needs to be done more often!
100%. So many laptops/PCs I see people buy are put in cold areas and left off, which causes the cheap thermal grease to dry out. Removing the old grease and using a higher quality grease that doesn't dry out is the solution.
Nothing that can't be done by the average person. Anyone with two hands and a screwdriver should be able to open their PC case, clean their fans, check the state of the thermal paste, and clean it off and apply new paste if necessary. If you are capable of cleaning your house, you're capable of cleaning your PC, arguably one of your most valuable possessions.
That's true. I use graphene sheets as the thermal transfer medium because it never breaks down, hardens or cracks.
@@BlackhatAudio cool! I hadn't heard of this but it's good to know.
I love the information you provide, but it would be much appreciated if you could show a demo of how things are done and the aftermath.
when did he provide info? was it after the 4 minute intro?
Ages ago a friend called me up and asked me to please come take his computer. He was just tired of dealing with it because it ran so poorly, he assumed it was because it had a cyrix CPU. 5 minutes on the bench and I knew why. It had the minimal amount of ram it could possibly run with, was running windows Me and had a winmodem (I did say ages ago). I maxed out the memory, installed NT and replaced the winmodem with a US Robotics serial modem. I used that thing for at least another 5 years in my lab.
Now a days when a system seems slow I replace the drive with an SSD and max out the memory, when that doesn't cut it anymore I install linux and use it for other things. I still have systems running that are >15 years old. I drive them until something major gives out.
I used to have dial-up internet. In those days the best modem I ever owned was a US Robotics modem. They were simply the best dial-up modems ever built.
I prefer Linux because Windows is getting to a point where they dictate what software you can run, and besides, let's face it, it's full of spyware... Also love that Linux is open source and free. Two thumbs up for Linux.
I have a slight issue with #2. While what you say is generally correct, I've run into trouble more than once by updating my NVIDIA drivers. While NVIDIA CLAIMED they were stable, it turns out that they were not as stable as they said. You might want to wait a week or two before installing them and make sure no bugs come out of the update. And no, this isn't common, but it happens enough that I gave up trying to keep NVIDIA drivers up to date as soon as humanly possible.
You're not wrong, but sometimes newer games don't give you the option.
I have a GTX 660 and it doesn't like most new drivers.
having an older card and also from finding out the hard way, that is a card I do not suggest updating when things are working fine.
Why update when things are working fine? I postpone and delay Windows updates as much as possible. On my phone I learnt it the hard way when I went from ios 15 to 16. I regretted after a month and couldn’t rollback. Unless the new update fixes any issues or provides a good reason to update, I don’t bother these days.
True, haven't updated my Nvidia drivers since september 2022. And yes, everything runs just fine. I highly doubt I'll jump from 45fps to 60fps in Red Dead Redemption 2 by simply installing whatever newest driver there's available now. Mandatory "if it ain't broke, don't fix it" here.
Having been a PC repair guy, I've cleaned countless machines for customers to make them more responsive. Half of the time I ended up upgrading the system anyway, because an intermediate upgrade doesn't cost as much as a new machine but does offer a respectable improvement in raw speed. And the customer saves on not having to have all their apps and data transferred to the new machine (you wouldn't believe the percentage of people not capable of this), which can take a couple of hours at €50 - €80 per hour.
Very cool. How do you manage the apps they want to transfer over? I'm always a bit wary of doing a nuke and then they say they wanted some app and now their details are gone.
@@jackkraken3888 If you don't want to risk losing data, install the fresh OS on a temporary drive. Then transfer data from the old OS installation, check that you have everything, then clone the new OS over the old one.
@@paulmichaelfreedman8334 Good idea. I think I might try that.
@@jackkraken3888 Windows 7 had an app to transfer data from an old machine to a new one. The app is also available in Windows 10 under the old name (the Win 7 transfer app or something like that, it's been a while). It's a little less flexible than manually copying stuff, but it does take a lot of work out of your hands, giving you more time to sip coffee, check your phone, etc. :)
@@paulmichaelfreedman8334 Oh the migration wizard thing. Yeah I need to do more research on that.
There are several factors in the storage subsystem.
You have mentioned FS fragmentation - that's a classic in its own right, even ignoring all context.
Note that as programs get bigger, or in general as the storage requirements grow, this also means more "clusters" allocated on disk, and the tree structures of metadata also need to grow, and the trees get deeper, and take longer to walk upon any access. Which may mean walking a tree in memory, and that's an optimistic scenario. In the less favourable scenario, you need to spend seeks to walk the metadata tree on disk.
Now when a filesystem gets over 75% full (a general ballpark figure), fragmentation starts to climb rather steeply. At say 90% full, fragmentation skyrockets, because it's statistically very difficult to find contiguous sequences of blocks to allocate for larger files.
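That fill-level effect can be illustrated with a toy simulation (purely illustrative: a random bitmap of used blocks, not a real allocator). As the used fraction rises, the largest contiguous free extent collapses, so larger files are forced to fragment:

```python
import random

def largest_free_run(bitmap):
    """Length of the longest contiguous run of free (False) blocks."""
    best = run = 0
    for used in bitmap:
        run = 0 if used else run + 1
        best = max(best, run)
    return best

def simulate(fill_fraction, n_blocks=10_000, seed=42):
    """Largest free extent on a toy disk where each block is
    independently used with probability `fill_fraction`."""
    rng = random.Random(seed)
    bitmap = [rng.random() < fill_fraction for _ in range(n_blocks)]
    return largest_free_run(bitmap)

# The largest contiguous free extent shrinks sharply past ~75% full:
for fill in (0.50, 0.75, 0.90):
    print(f"{fill:.0%} full -> largest free extent: {simulate(fill)} blocks")
```

Real filesystems cluster allocations rather than scattering them uniformly, so this understates how well they cope at moderate fill, but the trend at high fill is the same.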
Spinning rust had a limited IOps capacity, but what it had, could be depended upon - because the mapping from the outside LBA address space to the internal tracks (of variable capacity) was relatively straight-forward and sequential.
For comparison, with SSD's, you have two additional layers of mapping / indirection, below the outside LBA address space: you have the space of Flash pages (or lines) and underneath the space of Erase Blocks. The LBA sectors get dynamically allocated to pages=lines, and these in turn get allocated from erase blocks. To modify data in a particular sector, you need to write a whole line, and to modify a line, you may need to erase a whole erase block. To do that, you may need to shuffle other, unrelated lines in that same erase block, out to other erase blocks. A simple write may get "amplified" in a rather gross way. The Flash SSD also needs to do wear leveling = not let individual erase blocks to pick up disproportionately high erase counts...
Flash SSDs are not a silver bullet. You get what you pay for. They can bog down under a long-lasting sequential (or random) write load if they need to catch up on some internal janitoring.
On the spinning rust side of things, you get "shingled" recording...
You are free to pick your poison.
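The write amplification described a few paragraphs up can be sketched with a toy model (illustrative only; real FTLs use over-provisioning and garbage collection heuristics, so actual factors differ):

```python
def write_amplification(host_writes, valid_pages_per_block):
    """Flash pages physically written per host page written, in a toy
    FTL where every victim erase block still holds
    `valid_pages_per_block` live pages that must be relocated before
    the block can be erased and reused."""
    relocations = valid_pages_per_block - 1   # live pages copied out first
    flash_writes = host_writes * (1 + relocations)
    return flash_writes / host_writes

# An almost-full victim block (3 of 4 pages still valid) triples the
# physical writes behind each logical write:
print(write_amplification(1000, 3))  # → 3.0
```

This is why a nearly full, cheap SSD with little spare area slows down under sustained writes: every host write drags relocation traffic along with it.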
A great and concise collection of info here. But the fact alone that you created Task Manager gets a new sub. We're in the presence of a legend here people!
Thanks for the kind words!
One of the most useful tools on my system.
Seeing the other comments, I’m clearly out of my league with your video. I’m a 73 yr old woman who understood about every 12th word you said. I anticipated jumping onto the video on my iPad and sitting at my desk and zipping right thru speeding up my desktop computer! For example: you were going thru Tip #1 and I thought, pretty soon he’s going to put up a few screens to show how to do what he’s talking about. However, very quickly you were saying, “Ok, now on to Tip #2……….”
You have clearly geared this video to people who have a pretty decent knowledge of computers already. I was hoping for more of a beginner class; maybe a whole video on each tip separately with no fluff and get RIGHT TO YOUR POINT! The title of your video sounded so simple, but got very complicated and mystifying very quickly!
I don’t doubt for a minute that you’re very good at what you do, but do you have videos that are much more basic? Or can you recommend beginner’s videos on this subject? No insult is intended here. My ex-husband was in IT, my brother is an IT specialist (out of state), and my son is an IT guy (out of my city and too busy). Only my son comes even close to being able to talk to me at my level, and I’m pretty certain that I’m not stupid. That’s why I turn to these YouTube videos for help. I just need to be spoken to at my level. Any ideas? I can’t afford to hire someone to speed up my computer! Thank you! I love your flashy computer! 😊
Dave just casually rocking some of the best hardware on the market. Also, his book is amazing and after 30 years of "Why am I like this" I am getting tested myself in a couple months.
> 30 years of "Why am I like this"
dan, guess I was lucky and only my childhood was "ruined".
Ah well, after 50 years stumbling in the dark, I found out and realized that although it had been a handicap, I still had managed to get somewhere. But it explained a lot as to how my life had unfolded. (Most of the time I considered myself an alien.)
Good luck with your journey.
Most HDDs in the modern age do have variable geometry (ZBR), packing more sectors into the outer tracks - that's part of the reason why CHS addressing was replaced by LBA. So outer tracks should see better performance, since the head has to seek to the next track less often. Directory placement in the middle is generally for seek performance - any advantage of having it near the outer edge would soon be lost otherwise.
Minor revisions being better - sometimes. I remember Windows XP having zippy performance when you installed the base product but then you'd put the Service Packs on and it'd get more sluggish with each one (and this is before any other additional software or great time elapsing).
Hibernation file - fast startup needs it, since modern Windows versions have the option to just shut most of the OS down and write the kernel off to the hiberfile. It's also a must-have if you have a UPS.
Macs - the best thing you can do for one of them is to put Windows on it. But in general they'll be a bit slower than a same spec'd generic Intel machine.
True. I had a drive that was failing, so I did a full disk check on it, where the disk is written fully with a pattern and then read. The write speed was significantly faster in the beginning than at the end, because writing started from the outer tracks.
That's what I assumed - that LBA just hid variable geometry where it did exist, but I wonder if it can be shown to impact perf?
@@DavesGarage Paradoxically, a system with RAID-0 SSDs and 32GB of RAM definitely has blazing cached storage, but surprisingly the system does not feel mind-blowingly snappy, right? 😳
There is a remaining 20% bottleneck in the core that has prevented the MS architecture from cleanly beating Linux, iOS, and Android... that's a lot of missed business!
@@DavesGarage I recently tested a new 3.5" HDD by writing it completely until it was full. Write speeds started out at just over 200MB/s, then became progressively slower, reaching around 100MB/s at the end. I'm guessing the file system prefers to use the outer tracks when available.
@@tiarkrezar Can't find it, but I'm pretty sure LTT did a video where they created 3 partitions on a drive that each literally had different overall performance due to where the partitions were physically on the disk.
I think they got similar numbers as you and I think their proposed use-case was that you put the OS and "important" files on that faster partition, junk on the slowest one and everything else in the middle one.
I guess something like a poor man's "small boot SSD plus a separate scratch disc for other stuff". Granted, the sharp decline in SSD pricing over the years probably just makes this a pretty silly exercise in 2022 :)
Great video; a lot of technical information provided in an understandable format. Very watchable.
Great content Dave! High quality, well explained, reputable and trustworthy!
Thank you very much for your work and service, Sir! You've earned a fan, an admirer and a permanent subscriber!
The ultimate solution: wipe the hard drive and reload a fresh Copy of the OS and apps.
That is essentially a factory reset that gives you exactly what you had when you bought it.
Then do a disk image backup.
Next time it gets slow, wipe the drive and restore the disk image backup.
Voila, super fast computer again.
Thanks, forgot to mention you need to backup your personal data files before wiping your drive! But you do that regularly already, right?
Segregating your personal data files (documents, pictures, etc) to an external drive or partition is a great idea.
A fresh OS install is better than a factory reset. A factory reset will come with all the preinstalled bloatware and will need all the subsequent updates to be installed.
That’s why they say, Windows is a great OS -- if your time is worth nothing.
Your vote of confidence in Defender has me thinking about ditching Avast, which seems to just get increasingly cumbersome, spammy and dopey, even though certainly it works fine as far as its core function is concerned. My performance on my Windows laptops is already reasonable, but getting rid of even more bloat sounds pretty cool. Thanks!
Hi Dave, Always fun watching your content. I can honestly say after 31 years in the IT industry I always learn bits n bobs from your content. thank you. I wonder if you could do a video on software tweaks. There used to be tons of Registry hacks we used when I was a desktop engineer (a while ago!) and it would be nice if we had your wisdom on the same, such as PowerToys or the like. Regards, TJ
I'll make a note of it as an episode idea, thanks!
@@DavesGarage Using Linux will possibly make the biggest difference on an old PC.
Believe it or not, I still have the original "PowerToys" for 95/98, and only a few of them will still work on Windows 11 after some slight changes to the INF file.
I'm not so much of a fan of the new PowerToys though.
@@coper137 Especially if it's a dedicated machine like a guardian node.
Yes please please do a video on software tweaks tyvm
I really appreciate the style of your videos. They are clear, concise and entertaining! ❤
Thanks as always, Dave! Having HFA In big tech and being pretty good a masking, I was always concerned about the stigma that might be associated with disclosing that I’m neurodivergent. The openness with which you discuss your own ASD has really helped me to embrace that part of myself. Looking forward to reading your book one of these days!
Thanks! I don't use it as a label OR an excuse, but sometimes an explanation :-)
This kind of candor is needed for normalization.
@@Darxide23 Indeed, as someone who suspects I'm possibly on the spectrum myself (high-functioning autism/Asperger's, though I know the term has been deprecated since 2013; I still feel it should be used in some way to denote those who are high or ultra-high functioning, as Asperger's is similar to, but not totally identical to, autism). That said, I have a comorbidity of sorts, CRS (congenital rubella syndrome), from being born a prenatal rubella baby in 1965, 4 years before the rubella vaccine came along and 6 years before the MMR vaccine.
@@johnhpalmer6098 Yea, I wasn't a fan of so many things being rolled into ASD, either. If you hear that someone is on the Autism Spectrum, that can mean almost anything. It gives almost zero information.
@@Darxide23 Yeah, I can see that, but reading up on all that is considered for Autism, there is a lot to it and they all share some of the traits, that is especially true for the brain and how it has more neurons than most neurotypicals tend to have.
That said, very high-functioning people, and I mean what would be considered borderline Asperger's types, may not need to mask nearly as much in daily life as others on the spectrum, and thus may not get diagnosed as readily. That was especially likely for older folks now in their 40s and up, who still struggle mightily in many ways to this day, which can affect how we get and keep work, for starters.
I agree with all your recommendations. You sort-of touched on this BUT - The BIGGEST improvement one can make with an older machine is to replace the old rotary drive with a SSD. I didn't hear you explicitly say this.
I whole heartedly agree a new SSD and added ram can make very significant improvements to an old system
I remember computer magazine c’t writing a HDD test program that showed different speeds on different parts of the disk. That was 100% a thing. But with many platters that gives you more of a sawtooth pattern depending on logical to physical sector mapping
The heads are always over the same sector number of each platter so it seems to me the fastest and best way to write data on the lowest level (physical) is to write to all heads simultaneously.
Yeah, I used to always partition my drives, and use the "faster" partition for programs, and the other for storage.
@@IrocZIV Logical drives have mappings that are often nothing like a perfect physical separation of partitions. The only metadata I know of that is always in the middle of the partition, logically and physically, is the NTFS Master File Table (MFT). And even then, a failing sector in the MFT can cause parts of it to be physically relocated somewhere outside that region, causing an extra-long head move when reading it.
Bottom line: the speed difference between an "inner" and "outer" partition is not significant. The physical mapping can differ greatly from the logical mapping, depending on the health of the physical platters. Sectors fail regularly on a lot of drives; we just don't notice because drives have advanced sector recovery/relocation capabilities. A quick check of the C5 value in the drive's SMART log will tell you immediately whether the drive is failing. Pending sectors are really bad: it means a sector failed without notice and the drive could not recover the data on it.
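As a sketch of that SMART check: attribute 0xC5 is ID 197 (Current_Pending_Sector) in `smartctl -A` output, and its raw value can be pulled out with a few lines of Python. The sample text below is a hypothetical excerpt; real output varies between drives and firmware:

```python
def pending_sectors(smartctl_output):
    """Raw count of SMART attribute 197 / 0xC5 (Current_Pending_Sector)
    from `smartctl -A` text, or None if the attribute is absent."""
    for line in smartctl_output.splitlines():
        fields = line.split()
        if fields and fields[0] == "197":
            return int(fields[-1])  # RAW_VALUE is the last column
    return None

# Hypothetical excerpt of `smartctl -A /dev/sda` output:
sample = """\
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      UPDATED  WHEN_FAILED RAW_VALUE
  5 Reallocated_Sector_Ct   0x0033   100   100   010    Pre-fail  Always       -       0
197 Current_Pending_Sector  0x0012   100   100   000    Old_age   Always       -       8
"""
print(pending_sectors(sample))  # → 8
```

Any nonzero pending count is worth taking seriously: back up first, diagnose second.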
Yep, Dave is wrong on the HDD speed.
@@paulmichaelfreedman8334 On my current HDD I get 187 MB/sec at the beginning, 78 MB/sec at the end, almost 60% slower. I've seen this on every HDD I've used in the past decade.
Its always a joy to hear from people that know what they’re talking about. Very informative.
A couple things to add: dried up thermal paste causing thermal throttling and needing a repaste, and system drive's coming nearer the ends of their lives and replacing them by cloning to new drives, especially new faster ones.
Why would one need a repaste? I've never heard of that! Once your CPU is installed, how in the hell is the paste gonna go to shit? If you're being throttled, it's being caused by some other issue.
@@leecowell8165 "I've never heard of that" You have now. It dries out. Some types faster than others. Open up a 6+ year old computer that has never had its paste serviced and I bet the paste you find will be dry, hard and crumbly. Some of the most premium pastes dry out really fast compared to the cheaper ones, but deliver better performance that makes the cost and effort worth it, for example to recreational overclockers.
For example, a laptop CPU's thermal paste can suffer from something called the pump-out effect. Due to the small direct die contact area with the cooler, the paste will, over many heat-up and cool-down cycles, leach out of the CPU die area, which can cause thermal throttling. Also, because people carry laptops around, the thermal paste can shift to other places, causing uneven contact.
For desktops, if a computer is under stress for a long time, the thermal paste might become dry as its ingredients slowly evaporate, causing uneven contact between the IHS and the cooler.
There are solutions for both situations, such as Honeywell's PTM7950, which is a phase-change thermal material: it is solid at room temperature and becomes less viscous when heated. The interesting part is that it stays in place after cooldown and is designed to last the service life of your machine; it is used in vehicles, where electronics maintenance is infrequent. I've used this thermal solution in my laptop for 2 years and so far have not seen thermal degradation.
I hope this clears up some ideas that might cause thermal issues in a computer.
I'm really glad to have found your channel. You're the perfect middle ground. By that I mean, you don't dumb things down so far that I feel like I'm in a 3rd grade classroom, but you don't use so much technical jargon that I have no hope of ever following a single sentence you speak. Kudos for finding the happy space for folks like me. I've got enough experience with PCs and other technology that I can't be called a novice or beginner, but I'm not at the master or expert level either, not by a mile lol. I find your way of explaining things absolutely perfect for the way I comprehend them, and I don't feel like an idiot after watching your videos. Like I said, I'm very glad to have run into your channel. You get my thumb and my sub, and regular interaction from here on out. Great content, thank you for it!
With regards to the "angular velocity" theory:
I know there is a similar principle at play with optical media (which also spins), and the placement of certain files can greatly influence loading times. This heavily influenced the console development (DVD organization) of The Elder Scrolls IV: Oblivion.
As it relates to an HDD, I'm not sure if angular speed has much influence, though I can say, at the very least, that physical storage location on the platter does affect read speeds. We're aware of fragmentation and its effects, though I've also noticed performance improvements in reading files written to a nearly fresh (recently wiped) installation. That data can be consistently accessed far quicker than if it had been written only slightly later, after a few other files, on a clean install, lending itself to carefully choosing those very first files.
That's on Windows based systems. Try a *cough* real OS.
@@Shalmaneser1 The latter was in reference to a fresh Linux install.
Must be one hell of a fulfilling life you lead, one in which you so ignorantly jump to false conclusions without any substantiating facts driving said deductions. No wonder the ladies love you...
Constant Angular Velocity sector placement on HDD became obsolete 30+ years ago. It's all Zone Bit Recording everywhere, with more sectors and much higher linear speed in outer tracks.
So physical placement of files on disk does have a significant effect on performance. Any disk benchmark (like HD Tune or Victoria) can show that, e.g., read speed drops from 150 MB/s in the outer tracks to 75 MB/s in the inner tracks, a 50% performance hit.
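A back-of-the-envelope model of that ZBR falloff (assuming constant RPM and roughly constant linear bit density along the track, so sustained throughput scales with track radius):

```python
def zbr_speed(outer_speed_mbps, radius_ratio):
    """Estimated sustained read speed at a track whose radius is
    `radius_ratio` times the outermost data track's radius
    (0 < radius_ratio <= 1), at constant RPM and bit density."""
    return outer_speed_mbps * radius_ratio

# A 3.5" platter's innermost data track sits at very roughly half the
# outer radius, matching the ~50% slowdown benchmarks show:
print(zbr_speed(150, 0.5))  # → 75.0
```

The exact radius ratio varies by drive model; the point is just that throughput tracks linear velocity, which falls off toward the spindle.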
So, you're the dude! I've been wanting to say thanks to you for decades about the Task Manager. There isn't a day that goes by that I don't use it for one reason or another, and it's helped me save countless systems from viruses, rootkits and other malware, as well as find and tune performance issues on countless others. Sure, there are plenty of tools in the arsenal, but the one that is used along with all of them has been the Task Manager (along with its sidekick, Performance Monitor).
I'm originally from the Amiga world and was dealing with PCs (286s and DOS at the time, mid-80s to early 90s) while in the Air Force. I remember finding an OEM copy of Windows 2 in one of the boxes and, out of curiosity, installing it on one of the spare systems in the shop (which was a classroom; I was an instructor by then). Once it was installed, I was severely dumbfounded at how awful it ran compared to my Amiga 2000 at home. If I could get two programs loaded, they would try to beat the crap out of each other through the curtains of the "Windows" until one, or both, of them would give up, crash, or hang up the whole shebang. I wasn't impressed at all, so I went back to playing with the Amiga and a 286 system I'd made a trade for, playing around with the "Upper Memory" and dialing up BBS systems and such.
Windows 3 and 3.1 were the game changers that got me into my first hobby build of a 386DX system (at a blazing 33 MHz, with a Turbo button, a huge partitioned 80 MB hard drive, and 4 MB of RAM; my buddies were still swapping floppies on their Commodore 64s and playing Pong on their Ataris). It's been both a blast and a major PITA (at times) ever since. I, too, try to keep systems out of the scrap pile for neighbors, relatives and such folks, but I have to get away from it every now and then or I get pestered half to death or end up stuck with piles of obsolete stuff from them.
Thanks again, from the bottom of my heart! You got my Sub!
With older spinning drives partitioning used to help with reducing the distance the head had to move - so using 20 or 40% of the disk could result in significant performance improvements if the data fit well within that size - the rest of the disk could be used for data that was rarely accessed
If you’re still using non ssd drives, you deserve all bad things.
That advantage is not significant.
The time the head needs to move and settle is not linear to distance.
BUT it works and I have done it for decades.
@@0MoTheG Not "only" linear to distance - but it did offer a significant improvement versus random IO placement of the same amount of data across an entire physical platter - I ran tests with Iometer to work out the optimal setups
(And serial IO operations become the equivalent of random operations when they were done in parallel )
In particular this benefited the Minimum IO rate under multi threaded load with low queue depths and lowered the latency. (At the expense of capacity but this was more performance oriented)
Use cases: shared databases or data stores, or shared multi-user systems, but the same thing actually benefits personal computers (installing the OS on less than the total disk size), because so much parallel IO and fragmentation eventually occurs
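The short-stroking idea in this thread can be quantified with an idealized model (assumptions: request targets uniformly distributed over the used region, and seek time proportional to head travel distance, neither of which is exactly true on real drives):

```python
def avg_seek_fraction(used_fraction):
    """Mean head travel between two independent uniform random requests
    confined to the first `used_fraction` of the stroke, expressed as a
    fraction of a full stroke. For X, Y uniform on [0, L], E|X - Y| = L/3."""
    return used_fraction / 3

# Confining data to the outer 25% of the stroke cuts the mean seek
# distance by 4x versus using the whole disk:
print(avg_seek_fraction(1.0))
print(avg_seek_fraction(0.25))
```

Since settle time and rotational latency don't shrink with distance, the real-world gain is smaller than 4x, which matches the "significant but not linear" experience reported above.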
I just dumped Windows and installed Linux. Install any Linux of your choice, but I've been running Linux Mint for more than 5 years - and it runs at the same speed today as it did when I installed it for the first time. I'm running SSD, so hard disk seek times were never an issue.
A major reason for phones and tablets slowing down (not computers/laptops etc.) is flash wear. When the CPU writes to flash memory and the memory is new, it can write to any flash page with no issues. As the flash wears out, the CPU attempts to write to a page and it doesn't work, so it has to try a new page (that is not already in use). Thus the CPU spends longer finding somewhere to store the data in flash than it does to actually store it. This is unavoidable with flash memory devices. They wear out.
I daily drove linux for years. It's great on slower machines, but I find windows 11 runs better than linux on my current rig, so I just switched to WSL to run my linux stuff. I suppose nvidia drivers are the main reason though.
@@ElShogoso win 11 is worse than win 8 they say.. IDK as I only use it for gaming, Linux does most stuff for me.
Computer hardware doesn't get physically slower; the amount of crap that runs in the background increases over time, and if you use a faster PC at work, the perception of speed means your crusty home PC now feels slow. With SSDs becoming standard, the need to defrag is becoming obsolete.
Because defrag is useless with an SSD.
"Computer hardware doesn't get physically slower."
Not true. A failing HDD will be significantly slower to use. It's one of the early red flags to look for.
Additionally, if you aren't regularly cleaning your case, a buildup of dust will cause the system to run hotter than it previously did, which usually leads to thermal throttling kicking in earlier than before. So you'll notice it being slower more often.
Defrag isn’t just obsolete, but isn’t it BAD? SSDs have a finite number of rewrites before a sector can no longer be used.
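The finite-rewrite concern is real but easy to overstate; drives are rated in terabytes written (TBW), and a quick endurance estimate puts it in perspective (the TBW figure and daily write volume below are hypothetical examples):

```python
def ssd_lifetime_years(tbw: float, gb_written_per_day: float) -> float:
    """Years until the drive's rated terabytes-written budget is exhausted."""
    return tbw * 1000.0 / gb_written_per_day / 365.0

# e.g. a drive rated for 600 TBW, being written at 50 GB every day
print(round(ssd_lifetime_years(600, 50), 1))  # → 32.9 years
```

Even so, a defrag pass rewrites large amounts of data for zero benefit on an SSD (there's no seek penalty to optimize away), so it burns through that budget pointlessly.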
Not true; thermal throttling from being encased in dust is a physical cause. ;)
@@hxc7273 Yup! You got it.
Dave, you are such a great addition to YouTube! Computers don't get slower! You damn software engineers just get MORE creative. For that, WE LOVE YOU!
Computers get slow because of those crazy programmers making apps bigger
That's part of it!
If we're talking about older machines, they may well have only spinning disk(s). Technically the spinning disk hasn't gotten slower over time, but upgrading to an SSD will obviously be a huge speedup. Maybe a hardware upgrade doesn't fall into the category of "fixing" an old computer, but Dave did also mention more RAM in passing.
This is *DEFINITELY* the case with Mac OS. Somewhere after around Mavericks, I think it was assumed you had an SSD, and the base OS's performance on spinning media is _terrible._ Entirely conjecture here, but it seems like they just optimized the caching and queueing algorithms to prioritize the no-penalty random-access nature of solid state media.
Accidentally clicking on an update notification that launches the App Store could be a 2-3 minute wait on perfectly cromulent hardware like a Mac Mini, but running from a 2.5" HDD. Pop in an SSD -- even with the same OS install, nothing at all changed except the underlying disk -- and it's a couple seconds.
I once heard that the performance increase from moving to HDD to SSD was so big "it'll make you mad at how slow it was before." Absolutely.
Actually, they run dry or nearly dry bearings to get the RPMs; after a couple of years they are usually badly worn and won't spin as fast.
With an SSD and running Windows 10, even an older Core 2 Duo will run at acceptable speeds.
SSDs may be quicker, but I find their higher prices, lower capacity, and limited lifespans to be more than a little prohibitive; platter drives tend to have greater capacity, last much longer, and are much cheaper to obtain, clone, and replace.
And yes, I frequently make clone copies of my drives as backups, since they can and do occasionally fail over time, and it's always a good idea to back up when you can...
Sooo... I'll stick to platter drives for now, regardless of the speed boosts SSDs may offer.
It does work. I upgraded a computer from 2010 to an SSD and as far as program launches and simple tasks it's on par with my newer 12 core machine. Granted not playing any AAA modern games on it but as far as web browsing and watching movies it was a welcome improvement.
Great to have another pro recommending MS's own security. Years ago we had an IT security professor from a university visiting, and he also stated that the built-in software worked fine without subscribing to an external one. If you need detailed parental control etc., external software might be a good option, but with Defender and "sound paranoia when downloading", I feel as safe as I can be on my computer, where I'm the only user.
Defender has had very flaky success in tests. Sometimes it shines, sometimes it bombs. Offline detection is usually not good. I also recommend doing the registry tweak to enable sandboxing.
@@dingdong2103 I had only MSD for several years and no problems with either obvious malware or system resource problems that could indicate malware, so I'm happy about it. I'm sure the tests can find weaknesses in any security software. An important thing is to stay aware and not blindly trust anything, saying "I am protected so I can do anything". That's my take on IT security. I think that in general, one problem nowadays is that it seems more important to FEEL safe, rather than actually be as safe as possible.
Reapplying cooling paste between heatsink and CPU / GPU every couple of years is a must on old hardware.
I'd say this is just a windows thing, on Linux you don't need to deal with the registry and most package managers are pretty good at uninstalling software.
Plus if you're running more minimalist distros you can decide what extra programs start at boot.
Windows has definitely gotten better at this over the last couple of decades though
Isn't "reinstall" just a longer way to spell Linux? Sorry, that's a bad joke. I've been on since Win 3.11 WFW and have already decided to move to Linux in the face of the Windows 11 disaster.
I'll keep Windows, thank you for reminding me that Linux is still...in existence.
something about those long string command line instructions just spells "user friendly"
Linux seems to keep speed over the years compared to Windows. Same hardware, two different experiences.
They both have their issues. Most people just want to plug and play.
@@runnergo1398 plenty of good plug'n'play Linux distros exist now. It's almost like the Linux community is Embracing, Extending, and (hopefully) Extinguishing any reason to run Windows.
Even needing to learn how to configure some aspects of Redhat 5 manually in the 90's and figuring out which software to replace the things I had used on Windows was less of a problem than dealing with Windows itself. Nearly 25 years later, and I'm glad I don't have to put up with the problems I hear from Windows users: no spyware, no bloatware, no malware, no corporate overseer to spy on my computer and tell me what I can and can not do with it or the software, efficient, reliable and smooth running system with far fewer issues and no slowing down. If I don't like the way the interface works, I can replace and/or change it. Never going back!
I always keep the last 3 versions of packages in case I ever have to downgrade. I have root on an SSD and my home is on the HDD. It's faster if it's all on an SSD, but I didn't want a perfectly good HDD going to waste. I generally don't update until there is a new kernel; then I clean the cache out at the same time.
Boot time on mine is 5 seconds from the GRUB screen to login, as I use very few system daemons and only XFCE with no desktop manager. I'm its only user, so why bother?
@@haplozetetic9519 Once Win 7 no longer works due to planned obsolescence, I am done with Windows. 8 was pure crap, and they just keep piling on the BS. No, I do not want ultra-integrated cross-linked accounts; no, I do not consent to your data harvesting; no, I will not tolerate a cloud-based OS.
There are a few things worth mentioning, but on my Win 8.1 Pro PC I found disabling the Store and Superfetch helped considerably. It may not be as noticeable on newer machines, but they were a pain on my system, which holds over 800 GB of files. Of course I swapped my boot drive to an SSD, but 100% disk usage during startup for minutes at a time was BS. Upping the RAM to 16 GB only gave me more flexibility with graphics-intensive software, and I could run more windows on 3 screens without maxing anything out.
I tried running Win 10 on my machine, and it ran like a slug. A 4 GHz 4-core shouldn't be beholden to MS's whims; the updates were tedious and always seemed to occur when I needed to get actual work done. So I unrolled it. My daughter's PC runs Win 10 fine, but I'm still pissed off, and recently she filled it as well. When you have a stable machine and MS Update continues to f*ck it up, it sucks... so once stable, I recommend turning it off too.
Then come the registry danglers... the amount of fluff is still horrendous. We don't need cloud services either; I found several software programs that were sending/receiving data they shouldn't have had access to, including MS. Not even turning all the privacy settings on was keeping MS out of my system. They can keep their telemetry off my system, thx. Yes, MY system.
I put MS Server on it recently, although I was already running XAMPP very stably, and omg, the startup went to hell again. Other than that, the last thing that dismayed me was DirectX 12 requiring Win 10/11... so I have to play my Xmas present games on a different PC than my dev PC. As a society I think we are at the point of having PCs dedicated to certain uses... the all-inclusive PCs seem to be going away. I went shopping for a new PC, and the prevalence of S versions of Windows is kinda unsettling. Yes, our house has many PCs, 4 of which stay on 24/7... they don't hibernate and have lovely screensavers. And they function well despite... the industry.
Next video we need is “How to nuke Windows Update safely”
Just be aware CPU GHz isn't everything... a modern 2 GHz chip is much faster than an old AMD FX chip; many of those ran upwards of 4 GHz and were notoriously slow.
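The reason clock speed alone misleads is instructions per cycle (IPC): rough single-thread throughput is clock times IPC. A back-of-the-envelope sketch (the IPC numbers are purely illustrative, not measured figures for any real chip):

```python
def relative_performance(clock_ghz: float, ipc: float) -> float:
    """Rough single-thread throughput: clock speed times instructions per cycle."""
    return clock_ghz * ipc

old_high_clock = relative_performance(4.0, 1.0)  # high clock, low IPC
modern_chip = relative_performance(2.0, 3.0)     # half the clock, 3x the IPC
print(modern_chip > old_high_clock)  # → True
```

So a chip running at half the frequency can still do far more work per second if its IPC is high enough.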
linux
@@blakelapierre sadly it's fragmented; there are distros that are good for daily driving but still missing things Windows has. (I use Arch btw)
@@norbertnagy5514 try fedora
Dave you are the best. Brilliant human being. Thank you for all your awesome Videos.
Cheers From Sunny Warm Puerto Rico Island
Modern hardware effectively auto-overclocks itself, so you *can* find it spends more time at base clock and less at boost clock (or at lower boost clocks) with age. This *usually* comes from aged thermal paste; if it is from silicon degradation, it manifests as system instability rather than slowdowns, as the CPU generally cannot detect its own failure.
A CPU most definitely can detect its own failure, at least since the Pentium on x86. This is called MCE (Machine-Check Exception).
You can disable CPU throttling by setting your power plan to Performance.
@@KnutBluetooth He was referring to degradation. Generally speaking, the CPU will not tell you whether it is stable or not with regard to its clock speeds and voltage.
That said, modern CPUs don't degrade much. And likely won't be an issue for the vast majority of people.
I've been running my 5800X at 1.4 V for a year for an all-core overclock, for example. It has not degraded any noticeable amount; it's still stable and unstable at the same voltage/frequency points as when I was dialing it in.
@@tyf.4399 That's not correct. The more a CPU is made with a smaller process, the faster it will degrade. Since older CPUs are made with a larger process they last much longer. This is also why they use older generation CPUs for space applications because they are inherently more reliable and resistant to degradation. I've never heard of a 8086 or a 6505 that broke down due to degradation yet. What you can do to mitigate this is keep it as cool as possible by not overclocking it and using a cooler designed for CPUs that generate more heat.
@@KnutBluetooth We're talking about consumer chips in consumer builds. A modern or older CPU will outlive its useful life. This idea that degradation makes any difference in the lifespan of a CPU in the real world is actually just nonsensical.
Modern CPUs generally run cooler. But degradation isn't based on temps; other factors could kill a chip from temperature, but raw degradation is nearly purely about voltage.
Old CPUs degrade and so do new ones. But the tl;dr is that it really just doesn't matter. CPUs from the last 20 years and the next 20 will all outlive the applications they are designed for. Degradation should be the last thing anyone is concerned about.
Thanks Dave. I learned a lot, especially about Task Manager, which I have always used to end tasks that were in a 100% CPU loop! I have a question that does relate to Windows performance. Over many years I have been annoyed that Windows does not seem to natively prioritise its tasks. I mean that it doesn't differentiate between tasks that have a screen component (i.e. you are directly interacting with that task, say Word) and tasks that are running in the background; foreground versus background, I guess. A couple of examples: rendering a video, or a security product running a full disk scan, versus a user trying to start YouTube and waiting forever. I see that you can set the priority of a task, but why doesn't Windows do this automatically?
I am miffed about this because in the 1980s I programmed DEC minicomputers that were way less capable than my current desktop and successfully hosted say 4 high speed data entry operators and as-many-as you-like concurrent background tasks, such as compiling COBOL programs or running reports without any degradation of the data entry tasks. In other words the users who needed the attention of the CPU and Ram got prioritised over the other tasks that were not time critical. Is there a way to get these result with Windows 11? Thanks.
I'm pretty sure that Windows does this. The priority boost for foreground windows is just not reflected as a different priority value in Process Explorer.
I have to disagree that the next version of Windows always runs better than the previous. For example, Windows 2000 was painfully slow to boot on a Pentium Pro, while NT 4 worked great on said hardware. Numerous other examples exist, so it's really dependent on the hardware involved.
It's actually the opposite. Each distribution is larger than the previous one, and therefore slower on the same machine.
Yeah, he did say that new major versions will be slower because they have more features and that point releases are faster, and then shot himself in the foot by saying to always install the new versions, which obviously fall into the "makes it slower" category.
With graphics drivers maybe you can get away with it, but certainly not with OSes.
Agreed. When you think about "office machines" 25 years ago, running NT 4 and Office with like 32 MB of RAM, one would think that doing the same tasks on current hardware would be blazing fast, but it seems like something went really wrong along the way... Certainly, you didn't have to wait 20+ minutes back then for Windows to boot because of software updates. And a blue screen meant bad RAM/hardware, almost 100% of the time.
It doesn't help that Windows 10 has a lot of "stuff" preinstalled: Xbox stuff (even on the server OS!), application "offerings" (advertising) in your Start menu, and who knows what else. That's what finally made me switch to Linux on my home desktop. It's very fast, stable, and secure, can update while running without a restart (except for kernel updates), had the concept of a "software store" before anything else, and respects your privacy. The only thing that can be a problem is lack of support from a hardware manufacturer or software developer (for example, Nvidia doesn't seem to care about Linux, but AMD certainly does). Other than that, it's just great. Linux Mint and MX Linux are good beginner-friendly choices to give new life to older PCs, or even to install on a new PC!
That's not what he said. He said major changes tend to be, as you say, slower, and minor changes fix the slowdown. Listen more carefully! Examples from my IT career: Service packs for Win95, WinVista, Win7 and of course Win8.1. For Win10, which wasn't fit for purpose in 2015, major work has been done under the bonnet to streamline its performance, but they spoil a lot of this with the Store App bloat and nasty Metro nonsense, so...
@@Blitterbug At 6:07 for example, Dave's experience with OSR2 being faster than Win95 retail and XP being faster than NT 4 is the complete opposite of what I've experienced.
Best video on PC maintenance I've seen! Thank you.
Love your channel, Dave. I was always fascinated by inside stories of the people developing things that impact our lives and technology, be it a new type of engine or an OS I sank countless hours into using.
I'm using old machines with a clean OS and I still observe a bit of slowdown. I avoid installing new software as well. This is mostly on very old ones from the XP era; a newer machine from 2011 with high specs running Win 7 still works super fast.
As always, Dave, great content and an interesting topic! My number one experience to this issue is, if you're getting a new computer, take your old computer off the internet - that is, never allow it to connect to the internet again, and use it for offline applications only. Once you've made this decision, you can uninstall any antivirus software application you have on it. This will give you a massive speed improvement on that old machine, particularly if you are using AVG/Avast, Sophos, or Norton.
If only my old laptop was still functioning... I could *barely* afford the computer that I use now [I bought it with the (extra) pandemic relief money that I received from the federal government for not working due to illness in early June 2020, which I wouldn't have received otherwise]. I only bought it because my old one wouldn't boot any longer, and it wasn't worth getting fixed by a computer technician, as that would probably cost several hundred dollars; at that point it's better to get a brand new computer.
I haven't installed an antivirus for 16+ years now, and haven't had a single problem.
Most malware and virus problems I experienced came from cookies.
Automatically removing cookies after closing the browser, except the ones you trust, stopped my computer from slowing down.
I've had every antivirus, tried them all out; after I started removing cookies, I never found another thing with any of them.
So I stopped installing them for good. My PCs run the same as the day I put them together now: no slowdowns, no antivirus.
@@ThermaL-ty7bw Erm. Cookies aren't viruses (or viri if you prefer) and have little to do with any virus either. Just saying.
Any particular antivirus you would recommend on an internet connected machine that isn't AVG/Avast, Sophos or Norton (if they slow machines down), please? Thanks, :-)
Your point about Windows defender is a very interesting one. I think the problem is there are so many other tools and pieces of software where Microsoft gives you the bare minimum with the OS that the automatic assumption is you also need to search for a better option for something as critical as antivirus.
it kind of depends on each user's habits... i haven't seen a detection on my systems in years. everything seems cleaner to me these days.
when i migrated to win10 i asked one of my old friends who's still active in the business what solution he's using and i went w/ his recommendation: for home use, just WD.
Shoot, I was reading an article about you last year - YOU ARE A HERO, SIR, THANK YOU SO MUCH FOR TASK MANAGER! I've loved that baby since day one. It was absolutely crucial for me in cleaning up the sort of virus problems people would get back in the mid-00s (along with a few different antivirus programs) - the sort where you'd be lucky to get 30 seconds of OS run time after boot before the whole system crashed.
God bless you, sir!
Thanks for the tips. I have been using most of them for years now and am quite skilled at keeping old computers functional. For people who can change components (meaning the hard drive is replaceable because it isn't soldered to the motherboard), I would also suggest changing a magnetic drive to an SSD if possible. Using Task Manager on my current computer, I notice a big difference between a magnetic drive and even a lowly SATA SSD: the drive-access bottleneck is greatly relieved with the SSD. My setup has a 2 TB magnetic hard drive for most of my game storage, a 1 TB NVMe SSD for VR game storage, and a 500 GB SATA drive as my main boot drive. Even in my mother's older laptop, I simply switched her hard drive for an SSD and she saw a noticeable speed increase on a Pentium processor - and this was a true reaction, as I secretly swapped the drive and to this day still haven't told her.
I kept 2 Windows Vista laptops running for 8 years, and I only stopped using one machine because something in the sound card had damaged itself and was causing BSODs, and the ports were thoroughly trashed due to accidents while headphones were plugged in. I don't know what keeps happening with my own computer, though: I find I have to do a clean install of Windows once a year because something keeps majorly borking, and I strongly suspect the NVIDIA graphics drivers. I also believe in keeping things up to date, but I like to choose when I update. I like to wait until I have time to troubleshoot updates, because many years ago (Windows XP days) a driver update for Intel processors was sent out, but I was running an AMD Athlon processor (can't remember exactly which one), so the update completely prevented my computer from booting. I had to contact Compaq for assistance, and they provided procedures to disable that particular update. I have run into other update issues, but that was the most memorable one.