OK so I don't know if this is true, I just
thought of it. Why is it called booting and rebooting? What's it got to do with a boot? Well, you know how firefighter boots have
those straps on the side so they can pull them on more easily? Well, there used to be a saying that derived
from that, which is that people who were self-starters effectively pulled themselves up by their
own bootstraps. Which is of course physically impossible but
a clever visual. So when a computer loads itself up by starting
with just a few instructions that then load the rest of it, that became known as bootstrapping,
which has been shortened to just booting, and, when you start over, rebooting. Now look how much smarter you are in just
30 seconds! Imagine 12 minutes from now! ---
Hey, I'm Dave. Welcome to my shop. I'm Dave Plummer, a retired operating system
engineer from Microsoft going back to the MS-DOS and Windows 95 days, and who better
to look at why rebooting fixes so many problems than one of the programmers of Windows itself? That's why today, I'm going to take a peek
behind the curtains at rebooting. Why is the simple act of rebooting such a
perfect fix for so many different computer problems? It seems obvious at first, and yet, like so
many things, it's much more complicated and interesting once you really think about it. Why does your computer feel and act differently
when it's been freshly booted than it does when it's been on for a long time? Why can't computers just be like that all
the time? We're taking a deep dive into rebooting and
what it all means right here today on Dave's Garage. [Intro]
A new computer is awesome. A new computer is fast and snappy and has
lots of RAM and the hard disk has tons of free space. It might only be a few months or a year old
and it's built with all of the latest and greatest components. It's shiny and pretty and we like it. Because new computers are good. Old computers suck. Not only are they built out of boring old
components that no one even wants any more, they're slow and draggy and the hard disk
is almost completely full and fragmented. They take a long time to start up and a long
time to shut down and a long time to use in between. An old computer is no fun at all. And yet, all it takes to turn a new computer
into an old computer is maybe a year or so of use. Then you might reformat it and wipe it and
start over, which could extend its useful life a little bit. But this aging process takes time, and it doesn't
happen overnight. Things like filesystem fragmentation that
might accumulate can take weeks or months to happen before you can really even notice
them. The slide into middle age mediocrity is a
gradual one, but the pull is inexorable unless you're actively fighting it every day. But we all know it doesn't always take months
or years for a new computer to slow down. All it takes is some bad software and a little
time. The worse the software, the shorter the time
it takes. If you were to run a program like a word processor that simply never freed the memory it used along the way, and then kept actively using it to edit your document, let's say it would last about five minutes before it had used up all the free memory available on the system and came to a halt. The thing is, though, people don't actually
write software that poorly. It's a contrived example because nobody writes
a word processor that doesn't free any of the memory it allocates. But mistakes do happen. I could envision how someone might create
a word processor that frees all its memory properly except in some weird edge case where
they have a bug that they didn't know about. Maybe when there are alternate Canadian spellings
for a word like "Colour" or "Cheque", the program forgets to free those, and so it leaks
a little bit of memory every time you scroll past one of them. After a couple of hours of editing, let's
say, a great deal of memory has been leaked by the nefarious Canadian vowels. Once physical RAM is exhausted and you start
paging out to disk, your PC will become rather glacial, all brought to you by the
letter U. Now what happens to that memory? Is it gone forever? Or do we need to reboot the computer right
now to get it back? Actually, the answer is neither, because the
operating system is still in charge. It may not have any idea what a program intended
to do with the memory it allocated or why it allocated it, but the operating system,
be it Windows, Mac, or Linux, has dutifully recorded and kept track of every allocation
made in a long list. It's like a bank ledger of every outstanding
memory allocation that the process has ever made and not yet returned. When the program exits, the operating system
knows the program is done with that memory and can return it to general use. It runs through its list of memory that was
allocated to or by that program and returns it to the pool for everyone's use. Even if the program lost track of the memory
it was using, the system accounting will have an authoritative list. Of course, there are a few problems with this
approach. The first is that no one wants to have to
periodically stop, close, and restart the programs they are using. Another is that you don't know how much memory
the program might be using or might have leaked. And a third is that programs might not be
responsive enough to cooperate. What if a program allocated a bunch of memory,
leaked it, and then due to a bug or some other reason failed to respond appropriately to
user input? First off, you couldn't be sure which process
had allocated the memory, and second, you'd have no way of getting it back from it even
if you did. And that's where Task Manager comes in. In fact, that's one big reason why I wrote
it. Back in the early days of Windows NT, all
of the software programs that ran on NT were very new, and it was common to discover bugs
and defects in them. Programs could leak memory or handles or other
resources. Your machine might slow down and start to
act erratically, but unless it's obvious somehow, you'd have no real idea which process is misbehaving. Task Manager makes it easy to identify programs
that aren't playing nicely. If you sort by the memory used column only
to discover that your word processor is using gigabytes of memory even without a document
loaded, that's a good indication that the program has a memory leak or other defect. Same deal with handles and other resources
- anything that can be allocated can be leaked, and if it's a limited resource of some kind,
Task Manager will help you figure out which process is the culprit. Once you know which process is causing the problems, you can use Task Manager to end it, which will return everything back to the
system just as I described earlier, because the operating system has kept track of everything. But who's watching the watchers? By that I mean what if the operating system
itself leaks memory? Or more likely, some other piece of code that
gets to run in the special privileged mode known as Ring 0, like a driver does. In Ring 0, code is effectively above the law. If your video driver leaks memory, whether
it's system memory or video memory, it's gone "forever". Well, until you reboot. Aha! And now you know one of the very real reasons
that rebooting can be so beneficial - it forces the operating system itself to start over,
which means it starts with a clean slate: zero memory and resource leaks on the kernel side. Kernel-side leaks last until the next reboot, and that's a big factor in why rebooting can often make your
computer feel like new again. After resource leaks, the next most common
problems you'll find with application software are logic errors that cause it to either lock
up entirely or use a ton of CPU. Again, Task Manager rides to the rescue, this time in two ways. First, it allows you to identify which program
is using a disproportionate amount of CPU. By that I mean disproportionate to the amount
of work that it's doing. If Adobe Premiere is rendering video in the
background, it's normal to use all the CPU it can get. It's not normal for notepad to do so, however. CPU hogs can be a little more challenging
to detect if they're running on a single core and your system has many cores. If you have 16 cores and one program is CPU
bound, it could be wasting an entire core and yet using less than 7 percent of the total
available CPU, so it wouldn't be immediately obvious. That's why you should switch to the Task Manager
performance mode that shows you individual CPUs, and look for one core that's close to
100%. If you find one, your next task is to look in the details view for a process that's using about a core's worth of the total CPU. From there, once you've identified it, you can make
the decision of what to do next. By far your best option is to just close that
program, ideally after saving your work to a new file, just in case all that CPU usage
is a hint that there's corruption of some kind afoot in the process. No point in overwriting your existing good
copy with new bad data. Ask me how I know these things! If the program won't respond properly, that's
where Task Manager provides its second major service in the ability to terminate a program
against its wishes, or without its knowledge. Normally this is a bad idea as it doesn't
allow the program a chance to save any information at all. But if the program is bunged and you just
want your machine back, terminating the offending process can be a wise approach. As soon as you do, as discussed before, any
resources it's holding are made freely available to other programs again. Terminating the rogue program should return
your machine to normal again. And if not, there's always rebooting. This notion of rebooting to reclaim leaked
resources does not just extend to memory. It's also true of other constrained system
resources, such as handles and windows and sockets and file objects and so on. The only thing that doesn't get reset entirely
to factory original, of course, is your hard disk. Now the disk side of things is more than I
want to get into for this video, but odds are by now you have some inkling of how hard
disk fragmentation can slow down your disk access. When a disk becomes fragmented, data that
is normally used together winds up scattered all over the disk and so the heads have to
travel farther, with more changes of direction, taking more time to read it in. Having the data broken up and scattered around
the disk slows access, and that can be true of memory as well. Now, virtual memory acts as a layer of indirection above physical memory, but there are still physical pages with actual memory addresses. If memory references are close together, you have a greater chance of your CPU cache being effective than if they are randomly distributed
around in memory. The bigger problem with fragmentation, however,
is known as heap fragmentation. It's when there are no large, contiguous blocks
of memory left even though lots of memory is still free. To illustrate, I'm going to use a somewhat
theoretical example. Imagine a program that allocated all of the
free memory in the system in 1 megabyte chunks. Then it returned all but every 4th chunk. You'd have 75% of your system memory free,
but you would be unable to allocate any memory larger than 3 megabytes! You might have dozens of gigabytes free, but
allocating 4 megabytes would fail. That's an extreme example of heap fragmentation. Short of meticulously unwinding every one
of those allocations and/or terminating their owners, the only way to fix it is to reboot the
system. When you freshly boot a computer in a network
environment there are other benefits as well. Your computer's host name will be refreshed
in the DNS tables around your LAN. You'll get a chance at a new IP address, or
at least a new lease on your old one. Routing tables might be created or refreshed. Your default gateway is reset and so on, all
potentially fixing network issues that you might have been experiencing along the way. The list of things that get reset by a reboot
is long, and sometimes not obvious. For example, the clock is reloaded from the
battery-backed hardware clock. Since software clocks drift over time, your
time will be most accurate right after a reboot, when it's been synced to your hardware clock
or perhaps an SNTP server. When your system comes up freshly booted,
there are plenty of resources available. As long as you haven't unduly cluttered your
system with startup programs or contracted a virus, not much should be running yet and
there won't be any windows open. That, together with all the factors we've talked about so far, combines to make that "just booted" feeling a rather special one. Now if only you'd clean up the icons on your
desktop, you'd really be onto something. Over time, however, you'll find that even
a freshly booted PC can be slow due to factors ranging from disk fragmentation and registry
errors to malware, but those are topics for a future episode. If you want to see it, be please sure you're
subscribed to my channel! I'm not selling anything and I don't have
a Patreon, I'm just in this for the subs and likes, so please do me a solid and leave
me one of each before you go today! There is one thing you actually can buy, but
I donate any and all profits to autism research: the Dave's Garage Mug. A mug so amazing that it's literally guaranteed
to make your coffee actually taste better! Get yours now from the link in the video description. You're probably going to drink a warm beverage
at some point in your life anyway, so why not do it from a cool mug while helping a
kid? Thanks for joining me out here in the shop
today! In the meantime, and in-between time, I hope
to see you next time, right here in Dave's Garage.
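A footnote for the curious: the "bank ledger" bookkeeping described above can be sketched as a toy model. Everything here is invented for illustration — a real operating system tracks allocations in per-process kernel data structures, not a Python dict — but the principle is the same: even memory a program leaks is reclaimed the moment the process exits.

```python
# Toy model of the OS's per-process allocation "ledger".
# The numbers and names are made up; only the bookkeeping idea is real.

class ToyOS:
    def __init__(self, total_memory):
        self.free = total_memory
        self.ledger = {}                  # pid -> list of outstanding allocation sizes

    def alloc(self, pid, size):
        if size > self.free:
            raise MemoryError("out of memory")
        self.free -= size
        self.ledger.setdefault(pid, []).append(size)

    def exit_process(self, pid):
        # Even if the program leaked (never freed), the OS walks its ledger
        # and returns every outstanding allocation to the general pool.
        for size in self.ledger.pop(pid, []):
            self.free += size

os_ = ToyOS(total_memory=100)
# A hypothetical "word processor" that leaks 10 units every time
# it scrolls past a Canadian vowel and never frees any of it.
for _ in range(6):
    os_.alloc(pid=1234, size=10)
print(os_.free)            # 40 -- 60 units leaked, unreachable to the app itself
os_.exit_process(1234)
print(os_.free)            # 100 -- the OS reclaimed everything on process exit
```

Run it and you'll see the pool dip while the leaky "word processor" runs, then snap back to full the moment the process is torn down — which is exactly why ending a process in Task Manager gets your memory back.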
As to the question someone had about restarting vs rebooting, it SHOULD all be the same on a modern PC. On older hardware, a soft reset usually didn't clear memory and instead just caused the CPU to jump to the reset vector. Nowadays a soft reset on a PC is quite thorough.
Another thing that should be more common knowledge is that modern Windows 10 installations have Fast Startup enabled by default. If you habitually shut down your PC when you aren't using it, you would think that your system is operating on a "fresh boot" when in reality, it isn't.
I always make it a point to disable Fast Startup, regardless of whether the system is on an SSD or not.
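And circling back to the heap-fragmentation example from the video — 75% of memory free, yet nothing bigger than 3 megabytes available — here's a toy model of that arithmetic. The 16 MB "heap" and the list-of-slots representation are invented for illustration; a real allocator's bookkeeping is far more involved, but the outcome matches the example exactly.

```python
# Toy model of heap fragmentation: lots of memory free,
# but no contiguous run large enough for a big allocation.

HEAP_MB = 16
heap = [None] * HEAP_MB                 # one slot per megabyte; None = free

# A program grabs the whole heap in 1 MB chunks...
for i in range(HEAP_MB):
    heap[i] = "used"

# ...then returns all but every 4th chunk.
for i in range(HEAP_MB):
    if i % 4 != 3:
        heap[i] = None

def largest_contiguous_free(heap):
    """Length of the longest run of free slots -- the biggest allocation possible."""
    best = run = 0
    for slot in heap:
        run = run + 1 if slot is None else 0
        best = max(best, run)
    return best

free_mb = heap.count(None)
print(free_mb)                          # 12 -- 75% of the heap is free
print(largest_contiguous_free(heap))    # 3  -- yet a 4 MB allocation would fail
```

Twelve megabytes free, and still no way to hand out four contiguous ones — which is why, short of unwinding those allocations or ending their owners, a reboot is the reset button of last resort.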