Whether you’re copying files, loading a level in a game, or installing an app on a phone: it is 2021, and progress bars and loading time estimates still suck just as much as they did back when they looked like this. Why can’t computers just tell you how long something’s going to take?

Progress bars exist in all sorts of places all over the real world, from countdown lights on pedestrian crossings to the floor indicator on an elevator. Even a line of people queueing for something is a sort of progress bar. You know how far you’ve got to go, you know how fast the line’s moving, you can get an idea of how long it’ll take. More importantly, though, you can tell that the line is moving. There’s a big, obvious, physical sign that Things Are Happening and Progress Is Being Made, which means you’re more likely to be okay with the wait. Particularly if you can actually get an estimate of how long it’ll take, or even just how many people are in front of you.

But as for computers: well, they’re mysterious boxes. Without a progress bar, it’s impossible to tell whether a task is barely started, about to finish, or stuck in a loop it’ll never recover from. Or whether you’ve forgotten to hit the start button. Hence: the Progress Bar, and its slightly fancier cousin, the Loading Time Estimate.
The simplest possible progress bar takes the number of things to do, and shows how many of those are done. This video has a progress bar like that: there’s five minutes and ten seconds of video, and we’ve got through one minute and twenty-four seconds. That’s a pretty accurate progress bar right there.
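As a rough sketch of that idea (the video shows no code, so this is purely illustrative), the simplest progress bar is just "things done divided by things to do", rendered as a bar:

```python
# Minimal sketch of the simplest possible progress bar:
# it only knows how many items are done out of how many there are in total.
def render_progress(done: int, total: int, width: int = 40) -> str:
    fraction = done / total if total else 1.0
    filled = int(fraction * width)
    return f"[{'#' * filled}{'-' * (width - filled)}] {fraction:6.1%}"

# A five-minute-ten-second video that has played one minute twenty-four seconds
# is just 84 seconds "done" out of 310 seconds "total".
print(render_progress(84, 310))
```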
Of course, the video player has one job: display the frames of this video at a steady pace. Barring you pausing or rewinding, it’ll probably just carry on at this speed. But something like installing a program is a lot more complex and unpredictable. An installer might have to download files off the internet, then decompress those files, then save those files, along with reading or deleting stuff that’s already there, and maybe make some changes to system settings.

Now, in theory, a programmer could work out roughly how long each of those processes will take, and bias the progress bar accordingly. But each one of those steps will take different amounts of time on different machines. Downloading will depend on internet speed. Decompression will depend on how fast the processor is. Saving, reading, deleting will depend on how fast the device’s disk is, and the state of the files on it. And on some physical spinning disks, it’ll even depend on where exactly the 1s and 0s are stored. And as for making changes to system settings, well, that’s up to the operating system. It’s basically impossible to get a smooth progress bar for something like that.
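To illustrate the "bias the progress bar accordingly" idea in code: a hypothetical installer could give each stage a guessed weight and map per-stage progress onto one overall bar. The stage names and weights below are invented for the example; they are exactly the numbers that end up being wrong on real machines.

```python
# Hypothetical installer stages with guessed weights (fractions of the whole job).
# The weights are made up for illustration; in practice they are rough estimates
# that will be wrong by different amounts on different machines.
STAGES = [
    ("download",   0.50),
    ("decompress", 0.20),
    ("copy",       0.20),
    ("configure",  0.10),
]

def overall_progress(stage_index: int, stage_fraction: float) -> float:
    """Map progress within the current stage onto a single 0..1 overall bar."""
    completed = sum(weight for _, weight in STAGES[:stage_index])
    _, current_weight = STAGES[stage_index]
    return completed + current_weight * stage_fraction

# Halfway through decompression -> 0.50 + 0.20 * 0.5 = 60% on the overall bar.
print(f"{overall_progress(1, 0.5):.0%}")
```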
But let’s say you’ve got a progress bar that should be fairly steady. Like: copying 20,000 files from one place to another. Surely you can just… count the files as they go? Well, that’ll go wrong if one file’s much bigger than all the others. Okay, so maybe it should count the amount of data that’s been transferred. Sure: but that can also go wrong, because copying lots of small files takes longer than copying a few large ones. It takes a little bit of extra time to switch from copying one file to the next. So even for something that seems like a simple task, the bar could speed up and slow down.
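Here is a sketch of the "count the data, not the files" version of a copy job, with placeholder arguments and chunk size. It reports progress by bytes copied, but the per-file overhead (opening and closing files, updating directories) never shows up in the count, which is why thousands of tiny files still make the bar crawl:

```python
import os

def copy_with_progress(files, dest_dir, chunk_size=1024 * 1024):
    """Copy files, reporting progress by bytes transferred rather than by file count."""
    total_bytes = sum(os.path.getsize(f) for f in files) or 1
    copied = 0
    for src in files:
        dst = os.path.join(dest_dir, os.path.basename(src))
        with open(src, "rb") as fin, open(dst, "wb") as fout:
            while True:
                chunk = fin.read(chunk_size)
                if not chunk:
                    break
                fout.write(chunk)
                copied += len(chunk)
                # The time spent opening, closing and switching between files is
                # never counted here, so lots of small files still make the bar
                # move slower than the byte count alone would suggest.
                print(f"\rcopied {copied / total_bytes:6.1%}", end="", flush=True)
    print()
```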
Fine. Maybe you work out all of that in advance, you program it really carefully so that the progress bar can be as smooth as possible. Very few programmers think that’s really worth the effort, but let’s say you try. And then you add a “time remaining” display, based on how long it’s taken so far, and how much there is still to do. That will still go wrong. Because the speed of your transfer, or your download, or your complicated video render, all of those speeds could change at any time because of what the device is doing, or what the user’s doing.
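The naive "time remaining" display he describes is just arithmetic: if the job keeps the same average speed it has had so far, the remaining time is the elapsed time divided by the fraction done, minus the elapsed time. A minimal sketch, assuming the task can report its fraction done:

```python
import time

def naive_eta(start_time: float, fraction_done: float) -> float:
    """Seconds remaining, assuming the job keeps the average speed it has had so far."""
    elapsed = time.monotonic() - start_time
    if fraction_done <= 0:
        return float("inf")          # no progress yet, nothing to extrapolate from
    return elapsed / fraction_done - elapsed

# 30 seconds elapsed and 25% done predicts 90 seconds remaining --
# a prediction that breaks the moment the transfer speeds up or slows down.
```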
Maybe a desktop user switched to another window because they were bored, and now the computer’s got to deal with playing a game as well as rendering video. Maybe a phone user got a notification, and now the device is struggling to download a video clip that someone’s sent in a group chat in the background. Maybe the computer is starting to get too hot, so it slows down its processor to avoid being damaged by overheating. Or maybe the device just decided that now would be a really good time to do some maintenance in the background because… well, who knows why, apparently it just decides that sometimes.
One option is to take a "rolling average": rather than just naively estimating based on the time it’s taken for everything so far, you look at how much has been done in the last ten seconds, or twenty seconds, and extrapolate from just that bit. It’ll still probably be wrong, but it’ll be differently wrong.
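A sketch of that rolling-average estimator: keep only the last ten or twenty seconds of (time, progress) samples, work out the speed over just that window, and extrapolate. The window length and class shape here are arbitrary choices made for the example:

```python
import time
from collections import deque

class RollingEta:
    """Estimate time remaining from the progress made in the last `window` seconds only."""

    def __init__(self, window=10.0):
        self.window = window
        self.samples = deque()             # (timestamp, fraction_done) pairs

    def update(self, fraction_done):
        """Record a progress sample and return an ETA in seconds, or None."""
        now = time.monotonic()
        self.samples.append((now, fraction_done))
        while self.samples and now - self.samples[0][0] > self.window:
            self.samples.popleft()         # forget anything older than the window
        oldest_time, oldest_fraction = self.samples[0]
        dt = now - oldest_time
        df = fraction_done - oldest_fraction
        if dt <= 0 or df <= 0:
            return None                    # not enough recent progress to extrapolate
        return (1.0 - fraction_done) / (df / dt)
```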
Or, and I think this is why most programmers don’t spend too much time on this, you could accept that it doesn’t actually matter all that much. Progress bars don’t have to be perfectly accurate. Actually, a perfectly smooth progress bar does kinda imply there’s some trickery going on behind the scenes, like scammy web sites that make you artificially wait ten seconds for something to load, just so they can show you an ad in the meantime? I reckon progress bars should be just a little bit janky, because they reflect something that’s really happening.

And besides, the most important job of a progress bar is not to give an exact, precise estimate of how long is left. It’s great if they can, but the most important job is just to reassure the user that, yes, things are happening, progress is being made, calm down, don’t try to reboot the machine just because you think it’s cra--
Thank you, Tom! Finally acquitted!
Love how he starts saying 1 minute 24 seconds at 1:22 but finishes perfectly at 1:24
https://xkcd.com/612/
There are two programs that I use that most people don't, and I feel they're on opposite ends of the progress bar situation.
One is a very obscure image analysis tool that was written in Matlab by someone who's good at image processing but isn't a UI designer. It has no progress bars at all. You click a button to load a file and nothing happens. Because it's dealing with large images, it might be that it's taking a while to load the image and do the initial analysis, or it could be that the program didn't register the click. And for the next minute or two I develop a lot of stress wondering: did I actually click that button?
The other is Agisoft Metashape, which is photogrammetry software, and its progress bars are super detailed, spelling out which task in the larger operation is currently being performed, how many tasks are remaining, and a reasonably accurate estimate of how much time is remaining for that task. Which is nice, except the tasks often take hours and sometimes days; I've even had estimates that exceeded 2 weeks. And it's mostly reasonably accurate (as long as the entire computer doesn't crash), though of course "accurate" is relative, because being off by 1% on a 14-day progress bar means being off by about 3 hours and 20 minutes. And that only tells me the length of that task; a given operation may have 3 to 7 tasks before I can do anything again.
I'll just leave this here: https://www.youtube.com/watch?v=1V2SHW6Qx_E
One thing about the video progress bar is that it's time-based, yes, but the video decoder is doing extra things like throwing out frames in order to catch up with the time. The user shouldn't notice these frames being dropped if their computer is fast enough, but on slow computers you might notice the janky video.
I really expected Tom to talk about how a lot of progress bars are fake and don't actually indicate how much progress has been made, rather they're designed to set expectations for a long wait, then make the user feel like progress has been made and it's going faster than expected. A lot of them do this by moving slowly at the beginning and speeding up near the end.
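(For what it's worth, that "slow at the start, fast at the end" shaping is easy to produce by remapping the real fraction before it's displayed; the curve below is made up for illustration, not taken from any particular product.)

```python
def displayed_progress(real_fraction: float) -> float:
    """Show less progress than is real early on, then catch up near the end."""
    # Squaring the fraction keeps the bar pessimistic at the start
    # (0.5 real -> 0.25 shown) and makes it appear to speed up as it nears 1.0.
    return real_fraction ** 2
```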
It's true that making a "100% accurate" progress bar is near impossible, but it doesn't fully explain what we're really looking at today. This 99 Percent Invisible episode is a lot more informative. https://99percentinvisible.org/episode/wait-wait-tell-me/
Also: Depending on the task, progress bars can actually slow the underlying process being tracked too.
For example, imagine a parallel process doing work across multiple threads, where even the UI/progress bar has its own thread. You'd think updating the progress bar would be inexpensive, but you either pay the cost of the auditing/tracing itself or run into concurrency/locking issues where the workers push updates to the UI (and even a fast lock of a few milliseconds can add up, depending on the task and update interval).
I've run across this in the real world, particularly since a lot of programs/processes aren't nearly as concurrent as they could be (or have naive IPC with lock contention). Simply replacing the progress bar with an animation and taking out all the process hooks resulted in a user-perceivable improvement in loading times (and improved the maintainability of the codebase).
But let's say that you could, in effect, make a completely accurate and low-cost progress bar. The question then becomes whether the programmer-hours are better spent there, or on core functionality, or on actually reducing loading times instead. Because there's always an "instead": you can do this or you can do that, and progress bars are never going to win that war.
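(A sketch of one way around that contention, based on the commenter's description rather than any specific program: each worker writes progress into its own slot with no shared lock, and the UI thread polls on its own schedule instead of being pushed to by every worker. Worker and item counts are placeholders.)

```python
import threading
import time

NUM_WORKERS = 4
ITEMS_PER_WORKER = 50_000

# One slot per worker: no shared lock on the hot path,
# because each worker only ever writes its own entry.
progress = [0] * NUM_WORKERS

def worker(index: int) -> None:
    for i in range(ITEMS_PER_WORKER):
        # ... a unit of real work would happen here ...
        progress[index] = i + 1        # cheap bookkeeping, no lock, no UI call

def ui_loop(total: int) -> None:
    # The UI polls at its own interval instead of every worker pushing updates to it.
    while sum(progress) < total:
        print(f"\r{sum(progress) / total:6.1%}", end="", flush=True)
        time.sleep(0.1)
    print("\rdone   ")

threads = [threading.Thread(target=worker, args=(i,)) for i in range(NUM_WORKERS)]
for t in threads:
    t.start()
ui_loop(NUM_WORKERS * ITEMS_PER_WORKER)
for t in threads:
    t.join()
```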
This is why I think the two-bar system is the best: one shows the overall percentage done, and the other shows exactly what the computer is doing. Also, the blocks are much better than a line that fills up by the pixel.