Here is an interesting fact about the Windows System Clock. It is leap second aware! Starting a few years ago if there is a leap second (hasn't been one since 2016) the clock will actually show 61 seconds for that minute (if seconds are enabled). I worked on that feature. Dave may appreciate that in order to support the combability modes we needed to add a new flag to gflags. But Gflags was already using all the bits available. So Leap Seconds was the first bit for gflags2.
Why put it in gflags and not the clock global memory . Specifically the time of next leap could be a field in the internal timezone data (although loaded from elsewhere when applicable) Then the longlong to system time function can access the data without extra cache misses and the kernel time code can implement a way to keep the longlong time unidirectional during leaps .
It would show 60, right? Because usually after 59 there comes 00, and there's usually just one leap second so it would roll over after 60. At least from my vague "knowledge" of leap seconds.
@@johndododoe1411 I'm glad you asked. By default a process will not get leap seconds and will be in a compatibility mode. (In this mode the you will only get 60 seconds, but the 60th second will run twice as long to absorb the extra time.) This is because apps may crash if it gets more than 60 seconds in a minute. (Imagine an analog clock app trying to render that). But if your app is leap second compatible it can "opt in" by calling SetProcessInformation and setting the PROCESS_LEAP_SECOND_INFO_FLAG_ENABLE_SIXTY_SECOND flag. If you don't want to recompile your app you can use gflags to set that at runtime. The details for the leap seconds are stored somewhere in the memory of the kernel. If a new leap second is announced, it gets updated through Windows Update. You can use win32tm.exe to manually set leap seconds for testing purposes. Interestingly it was recently announced that no new leap seconds will be added after 2035.
@@wampacat6031 > Interestingly it was recently announced that no new leap seconds will be added after 2035. but.. why? surely we still need leap seconds so the date doesn't drift?
It's confusing that in Windows 10 you could just click on the system time to open the calendar view and see the time in seconds there whenever you need to, as a convenience, but in Windows 11 they took that away and now you have to either always see minutes only or always see minutes and seconds, with no convenience to switch from one to the other. It's a problem they created themselves.
Yeah the right side of task bar is the one thing changed in Win 11 that is just objectively worse than Win 10. And it infuriates me every time I have to change my audio output device.
@@d_9696 Oh that too but you can change that, regardless if I want to show the start menu or search for a program I just use the Win key on my keyboard
I’m an automation engineer. You can’t believe how useful for debugging is having a seconds display, so much I’ve been relying on third party solutions for that for a decade now
I never cared much about not having seconds in the task bar. What really annoyed me is Win11's removal of the seconds when you click on the time to expand it and see the calendar. I've often used that expanded view as a low effort stopwatch, and it was taken away in Win11. Great to hear seconds are coming back, but it'll be a couple years before that build will be available on $work computers!
Companies are constantly making garbage changes that nobody wants and only make everything worse. 😒 Changes like ROUNDED GD MF CORNERS ON EVERYTHING. 😠 - Whenever you find yourself wonder wtf a company changed something that made the app/site/program/game/product/etc. worse, it's usually because some jerk was selfishly inventing work for themselves to justify continuing to have a job indefinitely instead of just doing a job, finishing it, and moving on. YT devs do it every month. 😒 Oh, and it's also infuriating to be unable to see seconds on smartphones as well. The default clock app on Android will show it (if you know where to look), but Apple refuses to show the seconds, you need a third-party app to see it. 🤦
@@I.____.....__...__ uuh, rounded corners on windows are not a modern feature. Windows XP, Vista and 7 all had rounded corners. And it was existent on Mac since MacOS X. If anything, it's sharp corners that are a modern feature (or rather, a feature brought back from the dead), not rounded corners
Wow, Dave, I like your pretty pink fingernails. Jokes aside, I’m always happy when I see you’ve posted a new video. Not only are they extremely interesting, you often take me back to the days of Windows 3.1 or 95. Those were good times, indeed.
Hey I don't know if you've covered this already, but any idea why Microsoft search sucks so much? Like how is _Everything,_ a small app developed by one dude SO much superior to Windows search when the latter is developed by a billion dollar company and one of the few things that almost everyone still uses their operating system for: (finding the pdf they just downloaded so they can drag it into an email).
Two things come to mind. 1. Have you configured Indexing Options to cover all your common file locations? If Indexing Options are left to their default state, and you don't put all your personal files in your User folders (Photos, Videos, Documents, etc.), then you haven't configured Windows to index your files. It makes a world of difference in terms of search speed once Windows has indexed your files; I can hit Windows key, start typing the name of any file on any of my multiple drives that I want to access, and Windows will present it to me after having typed just a few letters or words; 2. Disabling web search in Windows Search also helps cut away the bloat and improve search time for local files. This can be done in Windows 10 using gpedit or via registry changes; not sure if it's still possible in Windows 11, I haven't upgraded to it, and won't consider it as long as they keep the retarded dumbed down right-click menu (while hiding away the full menu in a second click; this destroys my work flow on so many levels).
@@Cklodar There is an easy way to get the old context menu back by default. Just create a .reg file with this content: Windows Registry Editor Version 5.00 [HKEY_CURRENT_USER\Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}\InprocServer32] @="" and ofcourse run the reg file to import it.
@@gassie123indeed, that registry command is the first thing I run after completing every Win11 installation to get the full right-click menus back. It’s great.
MS is huge and things move incredibly slow now, everything is design by committee. They are so big they can just pump out turds and people will buy it. Look how many people still use Windows even though they hate it. Vote with your wallet.
I remember learning about virtual memory as a kid in the 90s... it was a magical idea. Around that same time, I remember dreaming of a hard drive with a gigabyte... I think I had a HUGE 16mb drive at the time. Now I have 128gb of RAM, terabytes of hard drive space, and that drive space is solid state that runs faster than the system memory used to run.
At Fermilab, in the Feynman Computing Center there is a display on storage tech over time. The large coffee table for the display itself is an 8Mb platter drive from the late 70s worth a few hundred thousand dollars. It progresses through time, showing how drives get larger in volume and smaller in size. It ends at ~2008 (we should update it) with examples of 128GB USB flash drives. Yesterday, I had a 2TB MicroSD card delivered to my door for just over $100. The future is rad.
i grew up on an Acer Aspire w/ a 1gb internal HDD. i would have to uninstall one game to play another. now when talking to my mother about data storage, i measure everything in "Acers."
That's a great story and background on the process behind decisions like this. From the user side they always ask "How hard can it be?" and would be surprised to know the real answer. I run into this pretty often in system support, and have heard several times "Why can't this do X? It's just an IF statement." IF only it were that simple.
And yet, the clock had seconds if you clicked on the tray, the calendar would show along with the full clock with seconds. Then windows 11 decided no more clock on the calendar.
@@jonorgames6596 You can by setting an additional clock in the settings, but it doesn't show seconds. Also, compared to the calendar clock which had very big text (it took the whole horizontal size of the calendar for the xx:xx:xx format) it is in a very small font
@@Aardwolf001 Same exact thought. I added seconds to my taskbar because there's nothing better for now. But that feature was the best. Every time I had to check if the minute just started or the :59 was about to turn to the whole hour, I tapped on my clock to see the calendar and I would get the full clock
Dave, thank you so much for such a detailed and incredible stories behind the "usual" Windows scenes. It is an honor, a pleasure and a huge luck having you here on RUclips telling all these stories with this passion and fun. Cheers and Kudos from another ex-MS engineer.
Longtime T-Clock user here. (I also remember struggling to keep the clock accurate because at some point it developed massive inaccuracy so I had to sync it to the router clock every 15 minutes or so.) - It is always nice to have options! And T-Clock AFAIU can also offer more format flexibility than any Windows, so I have seconds and date with weekday but without year in it and it is very compact in H an V. That seconds are a registry-only option in Windows 10 is another good example of the ugly mentality at Microsoft: Take features away, then advertise their return in the next version, and occasionally hide super-useful stuff, and then maybe also remove it because 'no one uses it' because you hid it. And they expect people to be such robots that they just go along with this mess.
In the same era, 92ish I was using macos 6. Showing seconds was invaluable to determine if the machine had crashed or not. Which was way more common than not. I used that all the way through the first versions of osx before I switched over to windows machines in the early 2000s so I could play games with my friends.
Video Request: REGEDIT Can you please make a dedicated Video about the Registry? When was it introduced? It probably started already with the first Windows Version. Or did MS-Dos. Furthermore, I have the following question: - How does a Windows Developer know which Key / Folder he should use to store it - Did developers @Microsoft or from other vendors sometimes messed up the registry? - Is there any documentation / Readme about it - While trying to turn the seconds on my Windows 10 Machine. I googled it. I was wondering how people find out which setting they can add to the registry. Trial and error? - Optional: Is there any safety switch that prevents an application with admin rights from writing / deleting Windows Keys which are essential And of course any other secret story about it ;)
Heh. This reminded me that Android development hit a snag when the UI was taking much more resources than it should to update the data transfer arrows on WI Fi and mobile connections
It isn't really necessary to do the whole formatting job every time. Incrementing and displaying a second count is the same regardless of any and all locale options. But the code is easier to write when the entire task including accessing the locale and formatting is repeated every time. We refer to this as "abstracting away performance." Most software is doing thousands of times as much work as it needs to because abstracting away performance is such a normal, universal thing to do. People want a job done, so they import a library that does it, and they ignore the fact that that library also does fifty other things as well none of which they need or which their code will ever benefit from, and the fact that that library depends on six other libraries which it needs only for one or two jobs each regardless of whether those other libraries are doing fifty MORE things each.
@@Felice_Enellen From my understanding it's like a routing table for the computer to find files and programms. I really don't know, but if i remove all registry keys for one of my programs It'll still be on the computer, but the computer won't run the programm.
@@Felice_Enellen HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced and create a new DWORD called "ShowSecondsInSystemClock" then set the value of it to 1
Meanwhile, there are endless annoying features going on constantly that you have to deep dive into settings to get them all to stop. The seconds issue makes some sense when taken on its own, but is comical when considering all the 'functions' windows seems to obliged to process until you figure out how to shut them off. It reminds me of New York State, when having budget arguments, will shut down a rest stop on a local highway 'to save money'. This rest stop has nothing but parking spots for cars and trucks and provides no services by the state at all. No buildings. They block it off supposedly to save money. It is clearly just done to annoy and inconvenience the public to make kind of point.
Dave, you are the best PR guy for MS of all time. I used to hate the company because of their buggy OSes, but your explanations of the problems with software development has softened my view toward them a lot. However, having said that, I still think Win7 is the best OS of all time and they need to steer the ship back in that direction. You can quote me on that, but I don't think it will carry much weight. Thank you for your service!
*Money.* Microsoft wants your data because that's where the money is at, and they'll manufacture justifications for that data being collected by presenting it obtusely as possible, as various data-dependent features you had never asked for, then say that disabling those features is a bad idea because it'll "[D]egrade your experience" _somehow._ Modern-day Microsoft can keep their network-burdening horsecrap right where it belongs; in the ass that should had dumped it int he toilet. While I enjoy Dave's commentary on Microsoft's decisions from an understanding and closure aspect, and _deeply_ appreciate the commitment of time he puts forth to produce videos like these, I'll stick with Linux systems for years to come - specifically, of the Arch variety, and _of that_ EndeavourOS. Stupid-simple, fun to play with, try it in a VM some time.
Oh Microsoft is still a god-awful company, God bless all the engineers working there doing the best they can though. Same thing in most places - lots of talented people on the ground floor, but it's all for naught when the ship is steers by morons in suits.
You can understand the reasons - more CPU and power used (the latter being important for laptops) - but it is ridiculous that the code for displaying seconds is present in the later versions of Windows, but no UI is present to enable the seconds! When I ran Windows, I'd use a third party task bar tweaker to enable the display of seconds, which I suspect just modified a registry value to turn on the seconds. What I think Windows should have done - if they didn't want a UI - is display seconds when it is on mains power and not display seconds when on battery power only. Of course, all these years, they should have had a UI to pick whether you want seconds or not. It should be noted that pretty well every Linux desktop UI for decades has allowed you to config whether you display seconds on the task bar clock or not (and defaults to not showing seconds, which I find an annoying default in this day and age, but at least I can change it!), so Microsoft has no excuse for taking 30 years for letting users have a choice. This is a big issue I have with Windows overall - its dismal lack of UI configurability.
I was fairly keen on the impact of showing seconds back when I was using a Mac with a G3 CPU on OS X (but with a Rage Pro graphics chip, therefore everything was basically being CPU rendered under Quartz), and when seconds were shown in the clock at the top, every second ticked, could be felt in Quartz's performance, like dragging a window, a "stutter" every second with the clock.
Yeah, 1 hz is a bit aggressive for a background GUI, even today. You can probably measurably affect your battery life by doing this. It's just not resolution I usually need and when I do it's fine to open up something else to do it.
I built a clock for an NLE that updated smoothly at 1/10 second resolution back in the OS 8 days with no noticeable lag. The bottleneck was always ATSUI text rendering, but even then it was fairly performant at fractions of a second. I guess I wasn't doing anything fancy, just requesting an update when the timer or other events fired, and then drawing the current time when the OS was ready. Of course, the draw request responsiveness was coupled with whatever the underlying video hardware was running at (back then it was 30hz) but as long as you could finish the draw within that time interval, it would work smoothly.
I properly won't bother enabling anyway, but it's a fascinating anecdote. And as one who sold and build Windows 95 systems, I perfectly understand that sacrifices needed to be made.
I really didnt care much about seconds display, but i had a hunch Dave might go on a tangent and was pleased with the virtual memory deacriptions. Learned about the page table today. I knew most of the other bits but tbat brings it all together. 👍
Your (former?) colleague Raymond Chen has discussed this topic more than once. On October 10th, 2003 ('Why doesn't the clock in the taskbar display seconds?) he pointed out just how heavy the impact was. It was in fact even harsher than you allude to in this video: the frequent updates were actually snowballing by keeping pages used for text rendering, taskbar window procedure and a lot of Explorer data structures from ever being paged out, all of which amounted to way more than just a few pages related to locale data remaining hot. On April 11th, 2022 ('Now that computers have more than 4MB of memory, can we get seconds on the taskbar?') he dug into the topic once more. Here he brings up the impact on Terminal servers and the cascading effect on many different connected clients, which also explains things like caret blinking not being a thing in that situation. But even for single user systems, it becomes a barrier because such periodic activity interferes with the ability of the CPU to enter a low power state. Which also loosely ties into my own frustration: something that constantly blinks or moves interferes with my concentration. It would be less of an issue nowadays (the clock takes up less screen real estate compared to back during the 640x480 days) bu constant changes will make those changes far more noticeable than the occasional surreptitious change. My interpretation of the entire situation is basically that the clock on the taskbar is like rolling coal: you get some of the attention, but everyone thinks you are an asshole for doing it. Just drive to your destination safely and get your job done, Mr. Taskbar!
The CPU being unable to enter a low power state is what angers me the most about the solution Microsoft has chosen by making the seconds available permanently. personall I take that over no seconds at all. But what I find baffeling is all the way back to windows 98 first edition (don't know about 95) you could show seconds either by double or single click on the clock. And I bet 99% of people who want seconds were just fine with that solution. Why didn't Microsoft bring that version back, that worked just fine for decades? Would run the code only when seconds are actually needed and let's the cpu sleep when they're not.
For me, the April 2022 blogpost just doesn't stick because of the sheer size of telemetry and "suggestions" (i.e. ads) crams into Windows releases since Win8. That stuff creates way more CPU wakes then once per second clock refresh + desktop repaint ever will. Fun exercise: Install a fresh Win10, finish setup/updates/drivers, open procmon from sysinternals and look what the system constantly does, quite noisy.
"periodic activity interferes with the ability of the CPU to enter a low power state", it might be possible to respond to this state and temporarily suspend updates of the time, also when the screen is turned off to save power. Please also see my reply to BlueSquid's popular comment.
In short, the code is wildly inefficient. Just the fact they're paging localization data every time the clock ticks makes me want to tear my hair out. CACHE CACHE CACHE
I failed out of computer science but everything you said sounds fake and reddity. I will agree it would be a harsh ui experience it's absolutely impossible for me to accept your cope. To believe that is to believe the same lies why a text editor is slower to respond on what is basically a modern super computer to a device 40 years ago. Devs don't want to admit they're bad. They have to iterate to keep their jobs. This over time degrades almost every user experience. Programming a clock is a high school intro level project 30 years ago.
I never took into consideration a clock could be so complicated yet simple at the same time. Would really like to see a break down of the builds comparing the old and new clocks. As always, great video bro!
it's part of the charm of listening to an old timer programmer. every small feature they made had been meticulously optimized because they have to at that time. I wouldn't really want to experience the headache of being in that place and time but listening to them does make you feel like they care a whole lot more
@@aronseptianto8142 True tech has changed, and I second the notion how much easier technology has become in some ways. I'm glad the oldest tech is just the ones sitting on the shelf, and not one I would play around on. Lol good old blue emachine.
@@aronseptianto8142 Yes, we had to optimize a lot in those days. In 1980, a lot of computers were restricted to 64K, including Data General's Micronova. I wrote a good portion of the OS for that and the Nova 4K in assembly; the rest in C. Debugging? I would do core dumps to spocked-fed paper -- which came out in octal -- and spread it out on the floor and went up and down hunting down the bugs. Yes, the instruction set was aligned with the octal nicely, so I could do the decompilation on sight. Nobody would do it that way today. There all sorts of robust debugging tools. But not back in 1980. I consider most of the programmers today spoiled brats. I had a guy tell me he did Ruby because Java was too hard. I felt like smacking him. LOL
Working with TS/Citrix for decades, I remember one of the early optimizations back when CPU and bandwidth were at a premium was to disable the clock completely to prevent sending out a screen update glyph for the time ticking over to every user every minute.
Does software like that not send out something every X seconds anyway? Just to keep the connection alive, so to speak, or to check if the user hasn't been disconnected for whatever reason? Seems like a pretty basic neccesity for a system that requires long-lived connections. A few extra bytes to update the clock could easily piggyback onto this keepalive signal. Or the other way around. It also bares the question: why is the clock drawn on the server in the first place, sending it over as bitmaps? Seems like a waste to me if the client can do it as well. I think Citrix at least, can actually do that: by not actually providing a full desktop, but only the application the user needs, just as any regular window on the user's otherwise off-the-shelf OS.
@@thany3 Not by default, no. That actually causes issues with some network gear that assumes no packets = dead session and they would terminate the session trying to be helpful. So there is an option keep-alive for that but it's a tiny packet and we don't usually set it to 60 seconds. With Citrix, especially in the Metaframe 1.0 or winframe days, everything was sent to the client as a screen update as parts of the screen changed. It was done in a very efficient way but still, that's the way it worked. These days it works a little different but screen glyphs are still the majority of the server to client traffic (provided they are not printing or moving a file from the server to their client via the Citrix protocol, which is an option). We could do published apps but desktops were and still are a popular option as well. It depends on the use case for the users. When we were trying to run sessions on low bandwidth connections (think 12kb/s cellular in the pre-2g days) every bit we could cull counted.
I used to use VNC to remote into my home computer, back in the Windows XP days. Something else XP did (optionally) was to provide a network activity monitor in the task tray. You would have the little icon of two computer monitors, which meant to imply network connectivity. One monitor would light up blue to show send activity, the other receive. When you're on a remote connection that sends screen updates over the network, changing the activity monitor caused a screen update which caused network traffic, which (potentially) caused the activity monitor to reflect that something had been sent (and ACKs received), which caused a screen update, which caused network traffic . . . . .
SO you're saying back in your days computers had runes? That's pretty based. @9Blu That's never gone all the way away. A faint neighbors wifi signal you've over gone your datacaps but you're with sprint before they got yeeted and they give you 10 kbs. 10kbs is an oasis in the desert compared to 0 internets. So you're saying I'll have this movie downloaded in only 3 days? I'd be all excited my gf would just glare at me lol.
Personally, I dont like seeing seconds on a standard clock. I remember using a hack to put seconds on the clock, thinking that would be nice. It wasnt. My eyes get drawn to anything on the screen that changes - including the clock' seconds. I reversed that hack real fast. If I need to see seconds, I will bring up a stopwatch program... which is very rare.
I can't see how your eyes could get drawn to some numbers changing in the corner of the screen, they're so small and discreet. I use a program called T-Clock to display the time with seconds in 20 point Calibri Bold font, as big as I can make it and still fit in the Windows 10 task bar, and I can't notice the numbers moving unless I'm looking almost directly at the clock.
"will be compared to the previous version and any slowdown will not be accepted" and then came 11 that was mostly new shell for 10 and yet it was allowed to be so much slower
Holy cow would I like for somebody like Dave to do a series on the weird hybrid architecture of Windows 95. What were VxDs? What did the kernel look like? My understanding is that it was based on an earlier abortive attempt at a 32-bit DOS and it is not in KERNEL32.DLL, it had something to do with VMM32.VXD which also had a bunch of drivers compiled into it during setup or something. It'd be really interesting to get a behind the scenes intro to how freakish the whole thing really was. Was there a team that worked on that kernel knowing full well that it was going to be thrown away five years later in favor of NT? Because we all knew that was going to happen eventually. There are bits and pieces in Raymond Chen's blog, but there's no semi-comprehensive talk that I have seen.
Freakish and delicately balanced. Probably why it was so easy to break. Comparing Win95 with NT4 back in the day it was like they were made by different companies in terms of reliability.
@@drozcompany4132 pretty much. I switched to NT 4 as soon as it was released, and while it was a relative pain to set up (no PnP) it was a night and day difference in day to day use. But you do get a sense that some very crafty stuff was done to make Windows 9x work as well as it did and I'd love to see background on all of that.
@@drozcompany4132 ... yes on all points, due to the simple fact that Microsoft wanted Windows 95 to be (a) an ambitious step forward in terms of new features, & (b) backwards compatible. It amuses me to no end that Microsoft & its fanboys were crowing that Win95 was "a true 32-bit operating system" ... because if that was true, it would've been practically impossible for it to run 16-bit software! Since it absolutely HAD to do just that, Win95 was built from the ground up as a newer version of the MS-DOS code base (DOS 7.0) coupled w/ all the code essentials to run Win3.x applications PLUS a newer version of the Win32s extension to run 32-bit applications PLUS the newer 32-bit graphical desktop shell ...and, if that sounds like the same old Win3.1x + Win32s setup hiding under a new 32-bit GUI coat of paint ... well, it was. By necessity, this was unavoidable in order to not break compatibility w/ yrs of support from the 3rd-party software developers for both DOS & Win3.x, but the result was that Win95 (& by extension Win98) was as every bit of a kludge as every Win3.x + Win32s installation, just buggier. If you can get a hold of it, the book "Inside Windows 95" is a fascinating read on the software structure & inner workings of Win95 as well as a rare glimpse into the "design philosophy" of the most ambitious Microsoft project at the time (aside from the entire NT program). The whole affair puts me in mind of two lines, one from "The New Hacker's Dictionary" & its definition of "kludge", the other a humorous misquote from "2001: A Space Odyssey" -- "It was terribly tempermental and prone to frequent breakdowns -- but oh, so clever!" "My God! It's FULL OF BUGS!!!"
Seconds are super useful when your system hangs. Unless you had a video or something playing, if your mouse and keyboard stop responding, it could be difficult to tell if it's a problem with input or if you've hanged up entirely. With seconds on the clock, you can immediately tell your system is toast if it stopped updating. Otherwise, you might have to wait a few minutes to be able to compare with an external clock
On the flip side, that's also IMHO an unmentioned reason not to include a clock with a seconds display (which should be supportable in some ways that minimize memory and CPU time, such as computing the display of everything but the seconds portion, and the range of times that should represent 0-59 seconds). The "problem" is that if people could see all the times their system backup task stopped running for awhile, they'd realize that their system isn't really as performant as they would like to believe.
From Windows '95 upto Windows 10, it is extremely rare (in my experience) for the mouse to quit being updated on-screen even with low memory / virtual memory killing performance otherwise. Sometimes it doesn't update at 60 frames per second, but almost always more than once per second. Updating the mouse requires redrawing what was uncovered in addition to drawing at the new position. Updating the clock display only requires writing to a fixed (not variable) portion of the screen once. The seconds display code doesn't need to call any fancy time-formatting (you can do that once per minute). It just needs to increment the seconds portion of the display. And if keeping the font-rendering / GDI stuff in memory is a problem, then just blit a pre-rendered bitmap to the screen (presumably that code is in RAM like the mouse blit software). You only need 10 digits to be pre-rendered. If each digit needs 8x8 pixels (64 total) and the display uses 32-bit color (4 bytes/pixel) then you would need 256 bytes per digit, and thus 2560 bytes for all ten digits. That would easily fit in a single 4K memory block. I think the software devs weren't trying very hard to code, but put a lot of effort into making excuses!
While this is a valid explanation to why seconds were not turned on by default, I still cannot grasp, why Windows 10 decided to make that feature a registry-only option and there wasn't a GUI way to turn it on. I mean, Windows is great, when it comes down to warning a user about anything in the world. That could be an "Information" button, or a "Yes/No" dialog, when it is turned on. Moreover, Windows 10 is a relatively recent OS, which was released 3 years later, than the NVME SSDs appeared on the market. So there really should have been an option. For example, ClockDummy software was available since the Windows 95 days and the license cost almost $30, so what was the real deal?
It's cool to hear about the hurdles they had to overcome and how they did it, especially since my experience with Windows was always focused on the bugs.
I remember my iMac G3, the plastic blue one, to be able to display seconds in the menu bar back then in 2000. I was very surprised that Windows was not able to do such a basic thing all the time, I mean there is a registry key you could change and Windows was able to show seconds, but there was no setting in the App or system preferences for this. Good think they finally added this option. I use the seconds on my Mac all the time, never experienced any problems with this even back in 2000
Well in 2000 maybe you already got a much faster computer than a 386 with 4mb of ram running windows 95. And seconds are nice but not something crucial. Even on a pentium machine with 16mb of ram from mid 90s, showing seconds should be a trivial task.
I've used seconds on the clock for years on both my Android phone and in Windows 10 with a regedit. You can see that it affects performance on the phone sometimes when you unlock the phone and immediately look at the clock it's sometimes frozen for a few seconds before it starts going
One thing that strikes me is why the clock would need to format and paint the whole time string every second. Every second the whole time string will be exactly the same except for the two digits of the second. What can be done, and I have actually done myself, is to have a secondary routine that only updates the seconds display. The whole displaying of the time once a minute is done as usual but then separately once every second the two digits for the seconds are drawn. I often do such things when using microcontrollers with very limited processing and limited display bandwidth, but in the past, like last century past, I also used it in PC and Amiga applications.
I was wondering the same, but we’re living in an age where such intricate designs are easy and well known. Probably was a nightmare to implement in older systems
@@SirKenchalotGraphic designer here... The numeric characters for fonts intended to be used in a UI should not have this issue for that reason and more. In fact, most good fonts intended for body type behave this way. This applies to the classic, default Microsoft fonts I can remember (Calibri, Tahoma, Segoe, MS Sans Serif...). Although how the type is rendered can affect the kerning and other behaviors but that shouldn't be a factor... For example in .NET DrawString() for graphical text is different then DrawText() with the text renderer used in explorer and there are various attributes for this. I can't say for sure how all languages would be affected but I'm fairly sure the UI would behave the same way. Back when they made it easy for users to change various fonts it could be problematic but obscure fonts had some issues anyway.
The whole time string may not be identical for a whole minute, it may be identical for either 10 seconds or 50 seconds only: The time format can be set to either "ss" (coded on 2 digits) or "s" (coded on 1 digit with no significant leading zero for the first 10 seconds from 0 to 9, then 2 digits from 10 to 59). Since the time string is displayed aligned to the right, every time there's a change in the number of digits displayed (seconds, but also minutes "m" instead of "mm" or hours "h" or "H" instead of "hh" or "HH"), then the time string needs to be drawn in a different position. Internationalisation (or localisation as they call it in software) and customization make a great worldwide product but are not without compromise...
It is probably easier to punt and just repaint the entire string in order to simplify. That also handles step changes which may occur in the middle of a minute. I agree those cases can be handled by logic to make repainting smart, but that itself takes CPU time and introduces opportunity to make an error
While working my Masters, for an OS course, I did a mini thesis on memory paging techniques. I chose the topic because I knew the one implemented in production code was least recently used. Still, even knowing the answer it was fascinating seeing the simulated effects in different sizes and numbers if swap pages.
Reminds me of the implementation difficulties with implementing CRON on new platforms. CRON implementations vary pretty widely between different hardware architectures, and as such, how it was able to and how well it was able to maintain that coherence differed. In theory, cron should be able to handle hundreds of thousands of schedules down to the second with only very minimal slew in the range of milliseconds. To do that and keep it performant, rather than check every system clock tick to see if one of these huge numbers of jobs are scheduled to run, it creates a list of counters (based on system ticks) to decrement each tick. Once an item hits zero, it spawns the process, figures out the next run in ticks, and does it again. Low level time management is a really interesting part of computers, and it doesn't surprise me that Windows had to play it low key and cool as well.
Raymond Chen also wrote about this just last year and he still mentioned possible performance impacts. Sounds kind of funny, when you consider how many thousands of events per second sysinternals process monitor spits out.
I like to think of virtual memory as being similar to FM radio (kids still know what radio stations are right?) Think of the processes like cities. In one city, 104.9 is EZ Rock, but if you drive to another city where there are different stations, it's The Zed. Same number, but different contents. Similarly, the same address might point to one DLL in one process, but a completely different DLL in another: same number, but different contents there.
So, Virtual Memory is a bit like the banking system…”money isn’t physical there” 😂 As regarding the lack of seconds, I actually always found that quite alright. I would’ve been constantly distracted by something updating on my screen, when trying to focus onto something else. Is perfectly fine not to see the seconds.
Yeah, this should be easily selectable from a clock right click, just like the dialog that shows a clock with seconds if you have permission to set the time . Even Win3.1 had clock.exe to show seconds elsewhere .
It's very much like the banking system indeed: not only is it not there, it gets taken away when you need it, and when you get it, it's never really yours. 😊
Kind of understandable, but not so much in view of other wasted memory for all the bloatware in Windows. Could have been cut from some other things at the cost of assembly-level optimization. Like improving memory fragmentation handling.
i cannot agree with the blanket assertion that ms windows was a net good. as terry pratchett pointed out in the book "raising steam", when it is time for trains,, you get trains. similarly, the dreadful design of the pc came about because everyone else was building a comparable product and ibm did not spot this disruptive technology coming. even desktop gui designs date back to xerox parc back in the 80s, and only needed the hardware capabilities to catch up. in the mean time, you gradually had a diversity of manufacturers converging on standards for interoperability, which microsoft deliberately killed to force the windows virus platform into existence. there is lots of evidence of lots of harm caused by microsoft in their relentless abuse of monopoly power which undercuts your blanket assertion. note: i am not saying that it did no good, just that the balance is a lot finer than the fan boys think, and that a case can be made for a net harm.
Never missed or need seconds, never dreamed it was such a complex task to do so. Been working through Windows Internals books for Windows 10/11 (only because I'm a giant nerd and fascinated by the inner workings of stuff) and memory management has been a struggle to grasp. Your tutorial really helped, thank you!
Granted, it was an app, but 3.1 had a clock, and it displayed seconds (at least in analog mode). If you put the clock app in your Startup folder, it would automatically run on startup. And with a little bit of adjusting, it can hover wherever you wanted. I used to have the analog clock hover in the main background (outside of the Program window).
A now cheap upgrade for a system maxed out on ram is an SSD. So much faster handling the swap file it wakes up an old system or the few underpowered off the shelf systems that still sell with a spinning disk and very little ram significantly. Never really had anyone ask for seconds as most applications that needed it like editing software had it built in.
When early Windows 95 betas came out i was so happy to not have to use Windows 3.1 I didnt care about the clock seconds. It did run quite nicely on my 486dx2 66 and later 100 mhz systems with 4mb of ram when ram was going for around $50 a megabyte. Later on though by around 2010 i was really wanting seconds for the system clock. I just kind of forgot about it until this video and now i can finally enable it.
Same. Windows 95 was a massive upgrade and the first Windows OS for the general public that was any good (not wanting to offend anyone). I came from an Amiga which already had a pre-emptive multitasking OS with a windowing gui. Win 3.1 seemed positively archaic in comparison.
I'm not properly educated about this stuff but somewhat fluent in Arduino C and such, and microcontrollers are generally very low power single-thread computers which makes any real tasks require careful timing considerations to not hang up the CPU with delays and loops what can be done with short checks against the clock. It took me a while to be able to, say, generate two pulsed outputs at two different frequencies from a single program on a single core. Love hearing the insight you have about this stuff I learn a lot thanks!
I had seconds in Windows clock for years, there was a registry key for it. It definitely worked in W10 but I think I used it also in W7. Only after I upgraded to W11 it stopped working and I had to download an app for it. I have no problem with seconds being off by default but it should be possible to enable them for people who actually use them like me. Having the feature available for years and then removing it is silly.
I'd love to hear about how the windows UI works and how it's separated from the kernel. I also am curious in comparison to say a game that pretty much creates and redraws the entire screen every frame, how is the windowsUI divided and drawn in comparison? How does it's "frame loop" work?
I love it! I just changed my clock to show seconds! There is really no particular reason for me to do this except to do it because I can. Much like upgrading performance in a car I am going to drive on the streets -- it won't change anything. But it's cool to know that mine has more capabilities than my neighbor's.
I would love to see a video why the explorer process contains the whole desktop with the taskbar. I guess there is a reason for this strange design but why call it explorer.exe and not windowsui.exe or something like that.
the taskbar and start menu are now separate processes (somewhat) with windows 11, you can see it in task manager. the desktop needs to be part of windows explorer since it's essentially a whole folder with large icons.
Probably because explorer.exe was already baked into the API and changing it would be difficult and probably break backwards compatibility. This was before Internet Explorer was a thing. Dave may know, but the windows file manager may be the original explorer? @DavesGarage?
@@SmiliesGarage shell32 provides the API for the shell. I think the reason is that early versions of Windows 95 allowed for the program manager to be used instead of explorer. The program manager doesnt use the taskbar. You can switch what application is used as the shell in WIN.INI / registry idk which.
It's crazy to me that proper customization of the clock and date have never been a thing without external tools. On my Win7 with Aero and two lines of tasks thickness I can see the clock, the day of the week, and the date, all three of which are centered. This is basically impossible to achieve in Win10 or Win11 without external tools such as Windhawk (which I love btw).
Windows Vista and 7 introduced such a beautiful desktop that has been completely obliterated in 10/11. It's a shame they just threw all that hard work away.
@@drozcompany4132 After Win7 they started to dumb down stuff and hid relevant options in sub menus, making it more difficult to get to the settings you actually want to instead of this useless junk they put in menus in Win10 with that tiles design.
They always have had that feature. It’s a preference thing. Go to search bar -> type calendar -> click settings -> click calendar settings -> scroll down until you see “Week Numbers” -> from “OFF” change to “First Day Of The Year”. It’s easy.
@@DavesGarage I'm very MS averse, I wish the whole IBM derived architecture and Windows would give way to RISC V and nix, but you add a very human face to how it was. I'd also like to thank you for all you have done for Aspergers and Autistic people such as myself and so many others.
It'd be better if they spent their time restoring vital productivity features like "Never Combine Taskbar Items" rather than pointless things like seconds display.
Microsoft once admitted that w32time cannot reliably maintain sync time to the range of 1 to 2 seconds. This got better over time, and with faster hardware.
I'd heard that, but it didn't make sense to me. Yes, if it's going to paint once a second I could see that might use power, but one presumes they're clever enough to notice the machine in the sleep state and not do the thing that causes the badness. In technical terms, anyway.
The thing is that there was a possibility in Windows 10 to watch the seconds by clicking on the digital clock in the tray. This option was eliminated with Windows 11 and there was absolutely no option to get a live seconds display. This was simply hilarious and I had to install a 3rd party tool.
They say it's in " Settings > Personalization > Taskbar > Taskbar behaviors, and check the “Show seconds in system tray clock” option." If you don't have that option, you need to run an update - Microsoft claims it's there since Stable April 2023.
But my Mac does it so... welcome to the 21st century Windows! In all honesty, Windows is seen in more extreme cases than macOS ever could. Great video as always Dave!
Format the time once per minute as usual along with the seconds separator, then draw an incrementing number every second..? A bit of a hack obviously, but I can't see any reason why the seconds would need special formatting... They're the same everywhere, or am I mistaken?
Keep in mind that modern UIs use vector fonts, not raster fonts, so rendering even a single digit tends to hit a lot of code and temporary buffers to render even one subpixel-accurate glyph. In 1995 you could probably have hardcoded a raster font for the clock and gotten away with it, but it wouldn't have been forward-looking. Dave faced the same issue with the LCD-like readouts he put in the original task manager, where they didn't scale well with display tech.
And it could cache the 61 (leap second) possible values into a bitmap that will still be a fraction of the size used by most users for their background image.
@@Felice_Enellen The LCD readouts were (mainly) replaced because the localisation team couldn't use any other numerals than the "LCD" ones were programmed to and so was a nightmare to localise. Dave explained that some videos ago.
@@Rob_III Right. Ultimately the point is that you have to hardcode the digit display to make it truly performant and that inevitably limits what you can do in the future.
tldr? i want to absorb the information but my brain wont let me concentrate. *EDIT* i found a youtube summary extension in the chrome store. "According to Dave, displaying seconds on the clock would require the clock to update every second, resulting in a small but measurable impact on CPU usage. Back when Windows 95 was released, computers had limited resources, including CPU, memory, and disk space. Therefore, the Windows team prioritized keeping the system fast and responsive, and displaying seconds on the clock was deemed unnecessary and potentially detrimental to overall performance."
As this video started, i looked st my system tray and saw seconds and was confused. When he mentioned this was available in the registry of Win10, I realized I must have set this feature years ago and forgotten about it. Still, nice to know that MS will make it easily available.
Hello Dave, maybe u can explain, why Systray-Icons dont disappear when application are crashing? only when i move the mouse over the systray-icon the icon disappears. This effect is since Windows95.
I was displaying seconds on 16 MHz Macs with 4MB with SuperClock in 1991 and never thought much about it. There was no VM yet. We had just moved to 32-bit addressing. Apple eventually added the menubar clock with seconds off by default. I always turned it on. I'm measuring all sorts of things at a glance and not having seconds displayed feels wrong and constraining. When I managed hundreds of Macs 20 years ago, my boss identified which Macs I touched this way.
It doesn't seem right to turn this setting on, on computers you manage, assuming they are meant to be used by other people. It is after all, if performance can be ignored, a personal preference.
Another thing to recall is that a bunch of the OS in those old Macs was in ROM, never had to be paged in or paged out of anything. I think it got to the point where as much as 1 or 2MB of the OS was soldered (or socketed) onto the motherboard...?
I knew this was going to be fascinating! Having the seconds shown would have avoided at least one instance of being late for something because I hadn't realised the clock display had frozen! The caution about slowdowns made me wonder how much more resource-intensive would be an analog clock with a seconds hand, which were certainly seen in the '80s and '90s...but of course those were generally implemented as a separate program in its own window, so wouldn't be consuming resources by default. It all reminds me of the "Expensive Typewriter" and friends from MIT, and the extravagance of using that hardware for such frivolous things as word processing, gaming, or telling the time! Another potentially tricky thing is how to ensure that the clock only updates on the true second boundaries. When writing my own clock display code in high-level scripting languages, I've wished for some kind of interrupt that could be triggered every realtime "tick". It feels wrong to update once a second knowing that the display will be up to a second wrong, but equally it seems cheesy to update say 10 times a second and still know that you're wrong! (Of course, this issue still occurs with minutes, hours, days, etc. precision.) If using a wake-up timer, I'd be paranoid about the accuracy of such a timer, and how to ensure that it didn't drift too far from real time.
It’s a wonder no one ever just made a clock chip that just keeps time. All the software would have to do is initiate it on startup, print it’s current values, and occasionally check it against internet time for accuracy. This is a situation where hardware and dedicated circuits shine where software requires much more resources to produce the same results.
That’s exactly why real watch companies still exist. Why have time on a toy when a tool for time has existed for hundreds of years, a tool that’s way more beautiful than some poxy computer
I have; on a semi-daily basis; and I've missed it EVERY time I looked at my clock. Previous OS'es had registrykeys etc. to show seconds and Win11 had support for seconds on-and-off. I hope it now finally stays. When working with processes that fire or doing other 'real time' stuff, say, every 5 or 30 seconds having seconds in your notification area is a big deal. What was even more annoying was that even double-clicking the clock "expanding" the clock didn't show seconds in the later iterations of Windows where it did use to show an analog clock with seconds hand or a digital variation that DID include seconds.
@@willpemberton6823what a dumb statement… you call the PC a toy and the watch a tool? I agree a watch is a tool but I’d much rather have the more effective tool the PC than a watch.
@@willpemberton6823 Regardless of whether or not you need to see the time, your computer needs to know the time internally. There would be a clock even if windows had no clock feature.
Years ago we made a distributed database for a logging application that is still in use today. Windows didn't know the time accurately. We had to build our own time corrections into the system so times in the database were comparable. Today we use NTP clients, Windoze still doesn't keep time very accurately, especially on laptops.
With years ago you meant over 2 decades ago? Because Windows has had NTP support sinds Windows 2000. And even then; NTP has been around since '85 or thereabouts and been widely available as 3rd party software, even for Windows 95 (e.g. Dimension 4, Automachron) and 3.11 (Tardis, Netdate). And even without NTP, hardware GPS and "Radio Controlled" clocks have been around since forever.
NTP in Windows 2000 and later is explicitly dedicated to keeping the time error within the 5 minute margin allowed by the domain authentication protocol, anything better is a bonus .
@@Rob_III NTP support doesn't mean they keep time well. Even fairly new gear. If you build a good GPS clock and compare it you see the Windoze clock jerking around, getting worse as often as getting better. Real NTP does smooth time adjustments, most of these other time hacking programs do jerky clock setting. Lots of programs query NTP for the time, it's what they do with it that matters, and most apps don't handle it very well. Especially machines that hibernate or sleep, waking up doesn't seem to be handled well. If your application needs good time you'll probably have to deal with these issues. Take a fleet of laptops and test it. Displaying seconds is meaningless when they are seconds off. Eventually we just made our own GPS NTP servers and use actual NTP clients in every machine, nothing else actually worked consistently and smoothly.
It'd be interesting getting an interview with someone still with Microsoft that could talk about what fixes and workarounds they've had to do with the rapid integration of modern technologies like SSDs, VR/AR hardware, etc.
Can we have BIOS in UTC with an offset in the OS instead of setting the BIOS with the time zone? Means I can use Windows and UNIX systems, such as Linux, on the same system without fiddling with the registry :D
@@IshayuG MS-DOS only used local time and had no concept of time zones or UTC. To maintain compatibility when dual-booting legacy OS it made sense to keep the RTC set to local time. There hasn't been a good time to change this behavior. Actually the switch to UEFI would've been a good time to switch but it's too late for that now.
@@eDoc2020 I mean all you have to do is just make a Windows update that sets this and adds the registry key. Do it for Windows 10 and 11, all iterations/service packs if you will. Now every supported version of Windows behaves the same way and the same way as Linux, which is a more common usecase for dual booting than dual booting multiple Windows versions is anyway.
@@IshayuG I wouldn't want them to do that because suddenly switching on everybody could cause unforeseen problems. I'd rather see it just be the default on new installs.
If they wanted to cut the fat they could remove the telemetry baked in everywhere. Even with switches turned off, you can see all the traffic with an application firewall like netlimiter or an internet proxy.
Here is an interesting fact about the Windows System Clock. It is leap second aware! Starting a few years ago if there is a leap second (hasn't been one since 2016) the clock will actually show 61 seconds for that minute (if seconds are enabled). I worked on that feature. Dave may appreciate that in order to support the combability modes we needed to add a new flag to gflags. But Gflags was already using all the bits available. So Leap Seconds was the first bit for gflags2.
Why put it in gflags and not the clock global memory . Specifically the time of next leap could be a field in the internal timezone data (although loaded from elsewhere when applicable) Then the longlong to system time function can access the data without extra cache misses and the kernel time code can implement a way to keep the longlong time unidirectional during leaps .
It would show 60, right? Because usually after 59 there comes 00, and there's usually just one leap second so it would roll over after 60. At least from my vague "knowledge" of leap seconds.
@@johndododoe1411 I'm glad you asked. By default a process will not get leap seconds and will be in a compatibility mode. (In this mode the you will only get 60 seconds, but the 60th second will run twice as long to absorb the extra time.) This is because apps may crash if it gets more than 60 seconds in a minute. (Imagine an analog clock app trying to render that). But if your app is leap second compatible it can "opt in" by calling SetProcessInformation and setting the PROCESS_LEAP_SECOND_INFO_FLAG_ENABLE_SIXTY_SECOND flag. If you don't want to recompile your app you can use gflags to set that at runtime. The details for the leap seconds are stored somewhere in the memory of the kernel. If a new leap second is announced, it gets updated through Windows Update. You can use win32tm.exe to manually set leap seconds for testing purposes. Interestingly it was recently announced that no new leap seconds will be added after 2035.
Fun fact: Originally in leap years February had two consecutive 24th days instead of adding a 29th day.
@@wampacat6031 > Interestingly it was recently announced that no new leap seconds will be added after 2035.
but.. why? surely we still need leap seconds so the date doesn't drift?
Windows: saves performance by removing seconds in the clock
Also Windows: runs Onedrive and 10 other programs every boot
Also Windows: create Microsoft Edge WebView2
@@kleqx2842 52% cpu usage
@@madams4606 are you, by any chance, using an intel atom CPU?
@@_Tzebra_ no, 8 core amd
@@_Tzebra_I got an i7 13th Gen and webview2 still sometimes goes crazy
It's confusing that in Windows 10 you could just click on the system time to open the calendar view and see the time in seconds there whenever you need to, as a convenience, but in Windows 11 they took that away and now you have to either always see minutes only or always see minutes and seconds, with no convenience to switch from one to the other. It's a problem they created themselves.
The calendar was really useful overall
Yeah the right side of task bar is the one thing changed in Win 11 that is just objectively worse than Win 10. And it infuriates me every time I have to change my audio output device.
@@Megaranator I also think the taskbar being centered by default can be annoying if you try to click "start" quickly
@@d_9696 Oh that too but you can change that, regardless if I want to show the start menu or search for a program I just use the Win key on my keyboard
@@Megaranator True
I’m an automation engineer. You can’t believe how useful for debugging is having a seconds display, so much I’ve been relying on third party solutions for that for a decade now
I never cared much about not having seconds in the task bar. What really annoyed me is Win11's removal of the seconds when you click on the time to expand it and see the calendar. I've often used that expanded view as a low effort stopwatch, and it was taken away in Win11. Great to hear seconds are coming back, but it'll be a couple years before that build will be available on $work computers!
Yeah, give us back the clock on the calendar, windows. You heard him! and me
Companies are constantly making garbage changes that nobody wants and only make everything worse. 😒 Changes like ROUNDED GD MF CORNERS ON EVERYTHING. 😠 - Whenever you find yourself wonder wtf a company changed something that made the app/site/program/game/product/etc. worse, it's usually because some jerk was selfishly inventing work for themselves to justify continuing to have a job indefinitely instead of just doing a job, finishing it, and moving on. YT devs do it every month. 😒
Oh, and it's also infuriating to be unable to see seconds on smartphones as well. The default clock app on Android will show it (if you know where to look), but Apple refuses to show the seconds, you need a third-party app to see it. 🤦
Win11 has taken away so many useful features from users for the sake of styling things up 😒
Yes, I'd like to strangle the bloatware developer who thought pressing pause on a YT video could be "improved" by fading the screen into darkness.
@@I.____.....__...__ uuh, rounded corners on windows are not a modern feature. Windows XP, Vista and 7 all had rounded corners. And it was existent on Mac since MacOS X. If anything, it's sharp corners that are a modern feature (or rather, a feature brought back from the dead), not rounded corners
Wow, Dave, I like your pretty pink fingernails.
Jokes aside, I’m always happy when I see you’ve posted a new video. Not only are they extremely interesting, you often take me back to the days of Windows 3.1 or 95. Those were good times, indeed.
Hey I don't know if you've covered this already, but any idea why Microsoft search sucks so much? Like how is _Everything,_ a small app developed by one dude SO much superior to Windows search when the latter is developed by a billion dollar company and one of the few things that almost everyone still uses their operating system for: (finding the pdf they just downloaded so they can drag it into an email).
Two things come to mind. 1. Have you configured Indexing Options to cover all your common file locations? If Indexing Options are left to their default state, and you don't put all your personal files in your User folders (Photos, Videos, Documents, etc.), then you haven't configured Windows to index your files. It makes a world of difference in terms of search speed once Windows has indexed your files; I can hit Windows key, start typing the name of any file on any of my multiple drives that I want to access, and Windows will present it to me after having typed just a few letters or words; 2. Disabling web search in Windows Search also helps cut away the bloat and improve search time for local files. This can be done in Windows 10 using gpedit or via registry changes; not sure if it's still possible in Windows 11, I haven't upgraded to it, and won't consider it as long as they keep the retarded dumbed down right-click menu (while hiding away the full menu in a second click; this destroys my work flow on so many levels).
@@Cklodar There is an easy way to get the old context menu back by default. Just create a .reg file with this content:
Windows Registry Editor Version 5.00
[HKEY_CURRENT_USER\Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}\InprocServer32]
@=""
and ofcourse run the reg file to import it.
@@gassie123indeed, that registry command is the first thing I run after completing every Win11 installation to get the full right-click menus back. It’s great.
MS is huge and things move incredibly slow now, everything is design by committee. They are so big they can just pump out turds and people will buy it. Look how many people still use Windows even though they hate it. Vote with your wallet.
I remember learning about virtual memory as a kid in the 90s... it was a magical idea. Around that same time, I remember dreaming of a hard drive with a gigabyte... I think I had a HUGE 16mb drive at the time. Now I have 128gb of RAM, terabytes of hard drive space, and that drive space is solid state that runs faster than the system memory used to run.
yeah in these days who cares about virtual memory when you can have more then enough ram and virtual memory kills an ssd these days in months
At Fermilab, in the Feynman Computing Center there is a display on storage tech over time. The large coffee table for the display itself is an 8Mb platter drive from the late 70s worth a few hundred thousand dollars. It progresses through time, showing how drives get larger in volume and smaller in size. It ends at ~2008 (we should update it) with examples of 128GB USB flash drives.
Yesterday, I had a 2TB MicroSD card delivered to my door for just over $100. The future is rad.
I’m dreaming of a hard drive with a petabyte of storage. It will be interesting to see when and how this will be accomplished as well.
i grew up on an Acer Aspire w/ a 1gb internal HDD. i would have to uninstall one game to play another. now when talking to my mother about data storage, i measure everything in "Acers."
I absolutely loved the red cup analogy. Thanks!
Glad it was helpful!
I love your videos! I have nothing of substance to add, but I wanted to leave encouragement and praise.
That's a great story and background on the process behind decisions like this. From the user side they always ask "How hard can it be?" and would be surprised to know the real answer. I run into this pretty often in system support, and have heard several times "Why can't this do X? It's just an IF statement." IF only it were that simple.
As someone who is inclined to say “how hard can it be?” in this case, I’d love to see the complete code for this functionality.
And yet, the clock had seconds if you clicked on the tray, the calendar would show along with the full clock with seconds.
Then windows 11 decided no more clock on the calendar.
I'd rather have that feature back, versus adding the seconds to the taskbar display
@@Aardwolf001 me too.
Maybe you can add in the clock to show in the calendar in some settings?
@@jonorgames6596 You can by setting an additional clock in the settings, but it doesn't show seconds.
Also, compared to the calendar clock which had very big text (it took the whole horizontal size of the calendar for the xx:xx:xx format) it is in a very small font
@@Aardwolf001 Same exact thought. I added seconds to my taskbar because there's nothing better for now.
But that feature was the best.
Every time I had to check if the minute just started or the :59 was about to turn to the whole hour, I tapped on my clock to see the calendar and I would get the full clock
Dave, thank you so much for such a detailed and incredible stories behind the "usual" Windows scenes. It is an honor, a pleasure and a huge luck having you here on RUclips telling all these stories with this passion and fun. Cheers and Kudos from another ex-MS engineer.
Longtime T-Clock user here. (I also remember struggling to keep the clock accurate because at some point it developed massive inaccuracy so I had to sync it to the router clock every 15 minutes or so.) - It is always nice to have options!
And T-Clock AFAIU can also offer more format flexibility than any Windows, so I have seconds and date with weekday but without year in it and it is very compact in H an V.
That seconds are a registry-only option in Windows 10 is another good example of the ugly mentality at Microsoft: Take features away, then advertise their return in the next version, and occasionally hide super-useful stuff, and then maybe also remove it because 'no one uses it' because you hid it. And they expect people to be such robots that they just go along with this mess.
>I had to sync my lock every 15 minutes or so
You do realize at that point you could use an egg timer or sun dial or a hourglass right?
In the same era, 92ish I was using macos 6. Showing seconds was invaluable to determine if the machine had crashed or not. Which was way more common than not. I used that all the way through the first versions of osx before I switched over to windows machines in the early 2000s so I could play games with my friends.
Thanks a lot for a good video Dave :) Also i enjoy you're way of speaking very clear and good flow
I appreciate that!
Video Request: REGEDIT
Can you please make a dedicated Video about the Registry?
When was it introduced? It probably started already with the first Windows Version. Or did MS-Dos.
Furthermore, I have the following question:
- How does a Windows Developer know which Key / Folder he should use to store it
- Did developers @Microsoft or from other vendors sometimes messed up the registry?
- Is there any documentation / Readme about it
- While trying to turn the seconds on my Windows 10 Machine. I googled it. I was wondering how people find out which setting they can add to the registry. Trial and error?
- Optional: Is there any safety switch that prevents an application with admin rights from writing / deleting Windows Keys which are essential
And of course any other secret story about it ;)
I would also like to see this
Also, loving your nails Dave.
Heh. This reminded me that Android development hit a snag when the UI was taking much more resources than it should to update the data transfer arrows on WI Fi and mobile connections
It isn't really necessary to do the whole formatting job every time. Incrementing and displaying a second count is the same regardless of any and all locale options. But the code is easier to write when the entire task including accessing the locale and formatting is repeated every time. We refer to this as "abstracting away performance." Most software is doing thousands of times as much work as it needs to because abstracting away performance is such a normal, universal thing to do. People want a job done, so they import a library that does it, and they ignore the fact that that library also does fifty other things as well none of which they need or which their code will ever benefit from, and the fact that that library depends on six other libraries which it needs only for one or two jobs each regardless of whether those other libraries are doing fifty MORE things each.
I’ve been setting this via the registry for what seems like forever. So useful to have those seconds there.
I always hide my taskbar so it's never been important to me, but what's the registry key?
@@Felice_Enellen From my understanding it's like a routing table for the computer to find files and programms. I really don't know, but if i remove all registry keys for one of my programs It'll still be on the computer, but the computer won't run the programm.
Winaero Tweaker can enable it
@@Felice_Enellen HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced and create a new DWORD called "ShowSecondsInSystemClock" then set the value of it to 1
@@404-UsernameNotFound Thank you! Just had to restart Explorer and it worked like a charm. Cheers!
Meanwhile, there are endless annoying features going on constantly that you have to deep dive into settings to get them all to stop. The seconds issue makes some sense when taken on its own, but is comical when considering all the 'functions' windows seems to obliged to process until you figure out how to shut them off.
It reminds me of New York State, when having budget arguments, will shut down a rest stop on a local highway 'to save money'. This rest stop has nothing but parking spots for cars and trucks and provides no services by the state at all. No buildings. They block it off supposedly to save money. It is clearly just done to annoy and inconvenience the public to make kind of point.
Dave, you are the best PR guy for MS of all time. I used to hate the company because of their buggy OSes, but your explanations of the problems with software development has softened my view toward them a lot. However, having said that, I still think Win7 is the best OS of all time and they need to steer the ship back in that direction. You can quote me on that, but I don't think it will carry much weight. Thank you for your service!
...and not just because he'd be relaying Ralph Wiggum's opinion on things. ;D
*Money.* Microsoft wants your data because that's where the money is at, and they'll manufacture justifications for that data being collected by presenting it obtusely as possible, as various data-dependent features you had never asked for, then say that disabling those features is a bad idea because it'll "[D]egrade your experience" _somehow._
Modern-day Microsoft can keep their network-burdening horsecrap right where it belongs; in the ass that should had dumped it int he toilet. While I enjoy Dave's commentary on Microsoft's decisions from an understanding and closure aspect, and _deeply_ appreciate the commitment of time he puts forth to produce videos like these, I'll stick with Linux systems for years to come - specifically, of the Arch variety, and _of that_ EndeavourOS. Stupid-simple, fun to play with, try it in a VM some time.
Oh Microsoft is still a god-awful company, God bless all the engineers working there doing the best they can though. Same thing in most places - lots of talented people on the ground floor, but it's all for naught when the ship is steers by morons in suits.
You can understand the reasons - more CPU and power used (the latter being important for laptops) - but it is ridiculous that the code for displaying seconds is present in the later versions of Windows, but no UI is present to enable the seconds!
When I ran Windows, I'd use a third party task bar tweaker to enable the display of seconds, which I suspect just modified a registry value to turn on the seconds. What I think Windows should have done - if they didn't want a UI - is display seconds when it is on mains power and not display seconds when on battery power only. Of course, all these years, they should have had a UI to pick whether you want seconds or not.
It should be noted that pretty well every Linux desktop UI for decades has allowed you to config whether you display seconds on the task bar clock or not (and defaults to not showing seconds, which I find an annoying default in this day and age, but at least I can change it!), so Microsoft has no excuse for taking 30 years for letting users have a choice. This is a big issue I have with Windows overall - its dismal lack of UI configurability.
I was fairly keen on the impact of showing seconds back when I was using a Mac with a G3 CPU on OS X (but with a Rage Pro graphics chip, therefore everything was basically being CPU rendered under Quartz), and when seconds were shown in the clock at the top, every second ticked, could be felt in Quartz's performance, like dragging a window, a "stutter" every second with the clock.
I think this is the real answer - not just Microsoft trying to save a kilobyte of ram
Yeah, 1 hz is a bit aggressive for a background GUI, even today. You can probably measurably affect your battery life by doing this. It's just not resolution I usually need and when I do it's fine to open up something else to do it.
I built a clock for an NLE that updated smoothly at 1/10 second resolution back in the OS 8 days with no noticeable lag. The bottleneck was always ATSUI text rendering, but even then it was fairly performant at fractions of a second. I guess I wasn't doing anything fancy, just requesting an update when the timer or other events fired, and then drawing the current time when the OS was ready.
Of course, the draw request responsiveness was coupled with whatever the underlying video hardware was running at (back then it was 30hz) but as long as you could finish the draw within that time interval, it would work smoothly.
@@andrewpotato6002 Dave described this as the principal problem, but always RAM is a concern.
As long as I remember, my Macs showed the seconds way before 2000.
I properly won't bother enabling anyway, but it's a fascinating anecdote. And as one who sold and build Windows 95 systems, I perfectly understand that sacrifices needed to be made.
Remember struggling with win95 and also having fun and thinking who were the people that code all this ?
So happy i see your videos
I really didnt care much about seconds display, but i had a hunch Dave might go on a tangent and was pleased with the virtual memory deacriptions. Learned about the page table today. I knew most of the other bits but tbat brings it all together. 👍
I did all 3, well 4. Liked, Subscribed, Shared on Twitter and Facebook.
Thanks a lot for doing so! Every bit helps!
Your (former?) colleague Raymond Chen has discussed this topic more than once.
On October 10th, 2003 ('Why doesn't the clock in the taskbar display seconds?) he pointed out just how heavy the impact was. It was in fact even harsher than you allude to in this video: the frequent updates were actually snowballing by keeping pages used for text rendering, taskbar window procedure and a lot of Explorer data structures from ever being paged out, all of which amounted to way more than just a few pages related to locale data remaining hot.
On April 11th, 2022 ('Now that computers have more than 4MB of memory, can we get seconds on the taskbar?') he dug into the topic once more. Here he brings up the impact on Terminal servers and the cascading effect on many different connected clients, which also explains things like caret blinking not being a thing in that situation. But even for single user systems, it becomes a barrier because such periodic activity interferes with the ability of the CPU to enter a low power state.
Which also loosely ties into my own frustration: something that constantly blinks or moves interferes with my concentration. It would be less of an issue nowadays (the clock takes up less screen real estate compared to back during the 640x480 days) bu constant changes will make those changes far more noticeable than the occasional surreptitious change.
My interpretation of the entire situation is basically that the clock on the taskbar is like rolling coal: you get some of the attention, but everyone thinks you are an asshole for doing it. Just drive to your destination safely and get your job done, Mr. Taskbar!
The CPU being unable to enter a low power state is what angers me the most about the solution Microsoft has chosen by making the seconds available permanently. personall I take that over no seconds at all. But what I find baffeling is all the way back to windows 98 first edition (don't know about 95) you could show seconds either by double or single click on the clock. And I bet 99% of people who want seconds were just fine with that solution. Why didn't Microsoft bring that version back, that worked just fine for decades? Would run the code only when seconds are actually needed and let's the cpu sleep when they're not.
For me, the April 2022 blogpost just doesn't stick because of the sheer size of telemetry and "suggestions" (i.e. ads) crams into Windows releases since Win8.
That stuff creates way more CPU wakes then once per second clock refresh + desktop repaint ever will.
Fun exercise:
Install a fresh Win10, finish setup/updates/drivers, open procmon from sysinternals and look what the system constantly does, quite noisy.
"periodic activity interferes with the ability of the CPU to enter a low power state", it might be possible to respond to this state and temporarily suspend updates of the time, also when the screen is turned off to save power. Please also see my reply to BlueSquid's popular comment.
In short, the code is wildly inefficient. Just the fact they're paging localization data every time the clock ticks makes me want to tear my hair out. CACHE CACHE CACHE
I failed out of computer science but everything you said sounds fake and reddity. I will agree it would be a harsh ui experience it's absolutely impossible for me to accept your cope. To believe that is to believe the same lies why a text editor is slower to respond on what is basically a modern super computer to a device 40 years ago. Devs don't want to admit they're bad. They have to iterate to keep their jobs. This over time degrades almost every user experience. Programming a clock is a high school intro level project 30 years ago.
I know I personally would be very annoyed to be woken up every minute on the minute to draw a new clock.
I never took into consideration a clock could be so complicated yet simple at the same time. Would really like to see a break down of the builds comparing the old and new clocks. As always, great video bro!
it's part of the charm of listening to an old timer programmer. every small feature they made had been meticulously optimized because they have to at that time. I wouldn't really want to experience the headache of being in that place and time but listening to them does make you feel like they care a whole lot more
@@aronseptianto8142 True tech has changed, and I second the notion how much easier technology has become in some ways. I'm glad the oldest tech is just the ones sitting on the shelf, and not one I would play around on. Lol good old blue emachine.
@@aronseptianto8142And today, it’s anthesis is Electron apps 🤮
May Allah (S.W.T.) guide you and bestow upon you His Blessings; Ameen.
@@aronseptianto8142 Yes, we had to optimize a lot in those days. In 1980, a lot of computers were restricted to 64K, including Data General's Micronova. I wrote a good portion of the OS for that and the Nova 4K in assembly; the rest in C.
Debugging? I would do core dumps to spocked-fed paper -- which came out in octal -- and spread it out on the floor and went up and down hunting down the bugs. Yes, the instruction set was aligned with the octal nicely, so I could do the decompilation on sight.
Nobody would do it that way today. There all sorts of robust debugging tools. But not back in 1980.
I consider most of the programmers today spoiled brats. I had a guy tell me he did Ruby because Java was too hard. I felt like smacking him. LOL
God bless you Dave.
Working with TS/Citrix for decades, I remember one of the early optimizations back when CPU and bandwidth were at a premium was to disable the clock completely to prevent sending out a screen update glyph for the time ticking over to every user every minute.
Does software like that not send out something every X seconds anyway? Just to keep the connection alive, so to speak, or to check if the user hasn't been disconnected for whatever reason? Seems like a pretty basic neccesity for a system that requires long-lived connections.
A few extra bytes to update the clock could easily piggyback onto this keepalive signal. Or the other way around. It also bares the question: why is the clock drawn on the server in the first place, sending it over as bitmaps? Seems like a waste to me if the client can do it as well. I think Citrix at least, can actually do that: by not actually providing a full desktop, but only the application the user needs, just as any regular window on the user's otherwise off-the-shelf OS.
@@thany3 Not by default, no. That actually causes issues with some network gear that assumes no packets = dead session and they would terminate the session trying to be helpful. So there is an option keep-alive for that but it's a tiny packet and we don't usually set it to 60 seconds. With Citrix, especially in the Metaframe 1.0 or winframe days, everything was sent to the client as a screen update as parts of the screen changed. It was done in a very efficient way but still, that's the way it worked. These days it works a little different but screen glyphs are still the majority of the server to client traffic (provided they are not printing or moving a file from the server to their client via the Citrix protocol, which is an option).
We could do published apps but desktops were and still are a popular option as well. It depends on the use case for the users. When we were trying to run sessions on low bandwidth connections (think 12kb/s cellular in the pre-2g days) every bit we could cull counted.
I used to use VNC to remote into my home computer, back in the Windows XP days. Something else XP did (optionally) was to provide a network activity monitor in the task tray. You would have the little icon of two computer monitors, which meant to imply network connectivity. One monitor would light up blue to show send activity, the other receive. When you're on a remote connection that sends screen updates over the network, changing the activity monitor caused a screen update which caused network traffic, which (potentially) caused the activity monitor to reflect that something had been sent (and ACKs received), which caused a screen update, which caused network traffic . . . . .
SO you're saying back in your days computers had runes? That's pretty based. @9Blu That's never gone all the way away. A faint neighbors wifi signal you've over gone your datacaps but you're with sprint before they got yeeted and they give you 10 kbs. 10kbs is an oasis in the desert compared to 0 internets. So you're saying I'll have this movie downloaded in only 3 days? I'd be all excited my gf would just glare at me lol.
Use to turn the second on by adjusting the time display format.
Personally, I dont like seeing seconds on a standard clock. I remember using a hack to put seconds on the clock, thinking that would be nice. It wasnt. My eyes get drawn to anything on the screen that changes - including the clock' seconds. I reversed that hack real fast. If I need to see seconds, I will bring up a stopwatch program... which is very rare.
Opposite for me. I've had seconds on my W7 for a decade by now and I won't go without them.
I can't see how your eyes could get drawn to some numbers changing in the corner of the screen, they're so small and discreet. I use a program called T-Clock to display the time with seconds in 20 point Calibri Bold font, as big as I can make it and still fit in the Windows 10 task bar, and I can't notice the numbers moving unless I'm looking almost directly at the clock.
"will be compared to the previous version and any slowdown will not be accepted"
and then came 11 that was mostly new shell for 10 and yet it was allowed to be so much slower
Holy cow would I like for somebody like Dave to do a series on the weird hybrid architecture of Windows 95. What were VxDs? What did the kernel look like? My understanding is that it was based on an earlier abortive attempt at a 32-bit DOS and it is not in KERNEL32.DLL, it had something to do with VMM32.VXD which also had a bunch of drivers compiled into it during setup or something. It'd be really interesting to get a behind the scenes intro to how freakish the whole thing really was.
Was there a team that worked on that kernel knowing full well that it was going to be thrown away five years later in favor of NT? Because we all knew that was going to happen eventually.
There are bits and pieces in Raymond Chen's blog, but there's no semi-comprehensive talk that I have seen.
Freakish and delicately balanced. Probably why it was so easy to break. Comparing Win95 with NT4 back in the day it was like they were made by different companies in terms of reliability.
@@drozcompany4132 pretty much. I switched to NT 4 as soon as it was released, and while it was a relative pain to set up (no PnP) it was a night and day difference in day to day use. But you do get a sense that some very crafty stuff was done to make Windows 9x work as well as it did and I'd love to see background on all of that.
@@drozcompany4132 ... yes on all points, due to the simple fact that Microsoft wanted Windows 95 to be (a) an ambitious step forward in terms of new features, & (b) backwards compatible.
It amuses me to no end that Microsoft & its fanboys were crowing that Win95 was "a true 32-bit operating system" ... because if that was true, it would've been practically impossible for it to run 16-bit software! Since it absolutely HAD to do just that, Win95 was built from the ground up as a newer version of the MS-DOS code base (DOS 7.0) coupled w/ all the code essentials to run Win3.x applications PLUS a newer version of the Win32s extension to run 32-bit applications PLUS the newer 32-bit graphical desktop shell
...and, if that sounds like the same old Win3.1x + Win32s setup hiding under a new 32-bit GUI coat of paint ... well, it was. By necessity, this was unavoidable in order to not break compatibility w/ yrs of support from the 3rd-party software developers for both DOS & Win3.x, but the result was that Win95 (& by extension Win98) was as every bit of a kludge as every Win3.x + Win32s installation, just buggier.
If you can get a hold of it, the book "Inside Windows 95" is a fascinating read on the software structure & inner workings of Win95 as well as a rare glimpse into the "design philosophy" of the most ambitious Microsoft project at the time (aside from the entire NT program). The whole affair puts me in mind of two lines, one from "The New Hacker's Dictionary" & its definition of "kludge", the other a humorous misquote from "2001: A Space Odyssey" --
"It was terribly tempermental and prone to frequent breakdowns -- but oh, so clever!"
"My God! It's FULL OF BUGS!!!"
YES that was informative and entertaining, although admittedly not useful, information. Downright fascinating.
Seconds are super useful when your system hangs. Unless you had a video or something playing, if your mouse and keyboard stop responding, it could be difficult to tell if it's a problem with input or if you've hanged up entirely. With seconds on the clock, you can immediately tell your system is toast if it stopped updating. Otherwise, you might have to wait a few minutes to be able to compare with an external clock
That's the exact reason I have seconds displayed on my PC
On the flip side, that's also IMHO an unmentioned reason not to include a clock with a seconds display (which should be supportable in some ways that minimize memory and CPU time, such as computing the display of everything but the seconds portion, and the range of times that should represent 0-59 seconds). The "problem" is that if people could see all the times their system backup task stopped running for awhile, they'd realize that their system isn't really as performant as they would like to believe.
From Windows '95 upto Windows 10, it is extremely rare (in my experience) for the mouse to quit being updated on-screen even with low memory / virtual memory killing performance otherwise. Sometimes it doesn't update at 60 frames per second, but almost always more than once per second. Updating the mouse requires redrawing what was uncovered in addition to drawing at the new position. Updating the clock display only requires writing to a fixed (not variable) portion of the screen once.
The seconds display code doesn't need to call any fancy time-formatting (you can do that once per minute). It just needs to increment the seconds portion of the display. And if keeping the font-rendering / GDI stuff in memory is a problem, then just blit a pre-rendered bitmap to the screen (presumably that code is in RAM like the mouse blit software). You only need 10 digits to be pre-rendered. If each digit needs 8x8 pixels (64 total) and the display uses 32-bit color (4 bytes/pixel) then you would need 256 bytes per digit, and thus 2560 bytes for all ten digits. That would easily fit in a single 4K memory block.
I think the software devs weren't trying very hard to code, but put a lot of effort into making excuses!
Whoa …. Just viewed for why no seconds on the clock. And down the rabbit hole of virtual memory and page files. Love it. Thanks Dave.
While this is a valid explanation to why seconds were not turned on by default, I still cannot grasp, why Windows 10 decided to make that feature a registry-only option and there wasn't a GUI way to turn it on. I mean, Windows is great, when it comes down to warning a user about anything in the world. That could be an "Information" button, or a "Yes/No" dialog, when it is turned on. Moreover, Windows 10 is a relatively recent OS, which was released 3 years later, than the NVME SSDs appeared on the market. So there really should have been an option. For example, ClockDummy software was available since the Windows 95 days and the license cost almost $30, so what was the real deal?
Oh, amazing! Just yesterday was looking for info how to return the seconds to windows clock
This man is literally dissecting windows for us, bit by bit.
Well Dave found a gold mine indeed, if he manages to live for a few hundred years more. 😀
It's cool to hear about the hurdles they had to overcome and how they did it, especially since my experience with Windows was always focused on the bugs.
@@Chimel31Haha. He has indeed. He used to be in it for the subs and likes, now he's mostly in it for the subs and likes ;)
Dave you are so cool...half way through your book BTW.
I remember my iMac G3, the plastic blue one, to be able to display seconds in the menu bar back then in 2000. I was very surprised that Windows was not able to do such a basic thing all the time, I mean there is a registry key you could change and Windows was able to show seconds, but there was no setting in the App or system preferences for this. Good think they finally added this option. I use the seconds on my Mac all the time, never experienced any problems with this even back in 2000
Well in 2000 maybe you already got a much faster computer than a 386 with 4mb of ram running windows 95.
And seconds are nice but not something crucial.
Even on a pentium machine with 16mb of ram from mid 90s, showing seconds should be a trivial task.
I've used seconds on the clock for years on both my Android phone and in Windows 10 with a regedit.
You can see that it affects performance on the phone sometimes when you unlock the phone and immediately look at the clock it's sometimes frozen for a few seconds before it starts going
One thing that strikes me is why the clock would need to format and paint the whole time string every second. Every second the whole time string will be exactly the same except for the two digits of the second. What can be done, and I have actually done myself, is to have a secondary routine that only updates the seconds display. The whole displaying of the time once a minute is done as usual but then separately once every second the two digits for the seconds are drawn. I often do such things when using microcontrollers with very limited processing and limited display bandwidth, but in the past, like last century past, I also used it in PC and Amiga applications.
I was wondering the same, but we’re living in an age where such intricate designs are easy and well known. Probably was a nightmare to implement in older systems
Is it perhaps that the numeric characters are not uniform in width or equally spaced so the pixels dedicated to seconds may vary from 01-59?
@@SirKenchalotGraphic designer here... The numeric characters for fonts intended to be used in a UI should not have this issue for that reason and more. In fact, most good fonts intended for body type behave this way. This applies to the classic, default Microsoft fonts I can remember (Calibri, Tahoma, Segoe, MS Sans Serif...). Although how the type is rendered can affect the kerning and other behaviors but that shouldn't be a factor... For example in .NET DrawString() for graphical text is different then DrawText() with the text renderer used in explorer and there are various attributes for this.
I can't say for sure how all languages would be affected but I'm fairly sure the UI would behave the same way. Back when they made it easy for users to change various fonts it could be problematic but obscure fonts had some issues anyway.
The whole time string may not be identical for a whole minute, it may be identical for either 10 seconds or 50 seconds only: The time format can be set to either "ss" (coded on 2 digits) or "s" (coded on 1 digit with no significant leading zero for the first 10 seconds from 0 to 9, then 2 digits from 10 to 59). Since the time string is displayed aligned to the right, every time there's a change in the number of digits displayed (seconds, but also minutes "m" instead of "mm" or hours "h" or "H" instead of "hh" or "HH"), then the time string needs to be drawn in a different position.
Internationalisation (or localisation as they call it in software) and customization make a great worldwide product but are not without compromise...
It is probably easier to punt and just repaint the entire string in order to simplify. That also handles step changes which may occur in the middle of a minute. I agree those cases can be handled by logic to make repainting smart, but that itself takes CPU time and introduces opportunity to make an error
On any OS clock, the only reason I like to see the seconds is so I can tell when the UI has frozen quickly 🙄
While working my Masters, for an OS course, I did a mini thesis on memory paging techniques.
I chose the topic because I knew the one implemented in production code was least recently used.
Still, even knowing the answer it was fascinating seeing the simulated effects in different sizes and numbers if swap pages.
Reminds me of the implementation difficulties with implementing CRON on new platforms. CRON implementations vary pretty widely between different hardware architectures, and as such, how it was able to and how well it was able to maintain that coherence differed. In theory, cron should be able to handle hundreds of thousands of schedules down to the second with only very minimal slew in the range of milliseconds. To do that and keep it performant, rather than check every system clock tick to see if one of these huge numbers of jobs are scheduled to run, it creates a list of counters (based on system ticks) to decrement each tick. Once an item hits zero, it spawns the process, figures out the next run in ticks, and does it again.
Low level time management is a really interesting part of computers, and it doesn't surprise me that Windows had to play it low key and cool as well.
Raymond Chen also wrote about this just last year and he still mentioned possible performance impacts.
Sounds kind of funny, when you consider how many thousands of events per second sysinternals process monitor spits out.
I've often wondered what Android performance would be like if it weren't spewing 8 million lines of debug logging per second.
@@renakunisaki To be honest, I've never heard of it and I don't know enough about it.
But if that is really the case... ignorance is bliss. ;-)
Cool. Thanks for sharing.
I like to think of virtual memory as being similar to FM radio (kids still know what radio stations are right?) Think of the processes like cities. In one city, 104.9 is EZ Rock, but if you drive to another city where there are different stations, it's The Zed. Same number, but different contents. Similarly, the same address might point to one DLL in one process, but a completely different DLL in another: same number, but different contents there.
The rock-solid gaze at the camera is the reason I subscribed.
Glad I did.
So, Virtual Memory is a bit like the banking system…”money isn’t physical there” 😂
As regarding the lack of seconds, I actually always found that quite alright. I would’ve been constantly distracted by something updating on my screen, when trying to focus onto something else. Is perfectly fine not to see the seconds.
We should call it Fiat RAM from now, haha.
Yeah, this should be easily selectable from a clock right click, just like the dialog that shows a clock with seconds if you have permission to set the time .
Even Win3.1 had clock.exe to show seconds elsewhere .
True... if I'm doing something that involves the taskbar, I do sometimes pause for a few seconds to look at the seconds :)
It's very much like the banking system indeed: not only is it not there, it gets taken away when you need it, and when you get it, it's never really yours. 😊
It makes me wonder how a 1970s era Casio watch is powerful enough to display seconds
Because that's all it does and there's no software, all done in flip flops
Kind of understandable, but not so much in view of other wasted memory for all the bloatware in Windows. Could have been cut from some other things at the cost of assembly-level optimization. Like improving memory fragmentation handling.
Exactly! Spying useless services running in background consume A LOT more CPU cycles than stupid clock updating every 1 second.
Well from one point of view, the bloatware makes msft money...
@@sadmoneysouljaFashionable but meaningless remarks.
@@sadmoneysouljaNot really, mine and many thousands of people's careers wouldn't have existed without MS. The whole thing has been a net good.
i cannot agree with the blanket assertion that ms windows was a net good.
as terry pratchett pointed out in the book "raising steam", when it is time for trains,, you get trains. similarly, the dreadful design of the pc came about because everyone else was building a comparable product and ibm did not spot this disruptive technology coming. even desktop gui designs date back to xerox parc back in the 80s, and only needed the hardware capabilities to catch up.
in the mean time, you gradually had a diversity of manufacturers converging on standards for interoperability, which microsoft deliberately killed to force the windows virus platform into existence.
there is lots of evidence of lots of harm caused by microsoft in their relentless abuse of monopoly power which undercuts your blanket assertion.
note: i am not saying that it did no good, just that the balance is a lot finer than the fan boys think, and that a case can be made for a net harm.
Never missed or need seconds, never dreamed it was such a complex task to do so. Been working through Windows Internals books for Windows 10/11 (only because I'm a giant nerd and fascinated by the inner workings of stuff) and memory management has been a struggle to grasp. Your tutorial really helped, thank you!
Granted, it was an app, but 3.1 had a clock, and it displayed seconds (at least in analog mode). If you put the clock app in your Startup folder, it would automatically run on startup. And with a little bit of adjusting, it can hover wherever you wanted.
I used to have the analog clock hover in the main background (outside of the Program window).
Yeah but it was a butt ugly clock. And there were no such things as "apps" in those days. They were programs.
A now cheap upgrade for a system maxed out on ram is an SSD. So much faster handling the swap file it wakes up an old system or the few underpowered off the shelf systems that still sell with a spinning disk and very little ram significantly. Never really had anyone ask for seconds as most applications that needed it like editing software had it built in.
When early Windows 95 betas came out i was so happy to not have to use Windows 3.1 I didnt care about the clock seconds. It did run quite nicely on my 486dx2 66 and later 100 mhz systems with 4mb of ram when ram was going for around $50 a megabyte.
Later on though by around 2010 i was really wanting seconds for the system clock. I just kind of forgot about it until this video and now i can finally enable it.
Same. Windows 95 was a massive upgrade and the first Windows OS for the general public that was any good (not wanting to offend anyone). I came from an Amiga which already had a pre-emptive multitasking OS with a windowing gui. Win 3.1 seemed positively archaic in comparison.
Thank you for this.
I'm not properly educated about this stuff but somewhat fluent in Arduino C and such, and microcontrollers are generally very low power single-thread computers which makes any real tasks require careful timing considerations to not hang up the CPU with delays and loops what can be done with short checks against the clock. It took me a while to be able to, say, generate two pulsed outputs at two different frequencies from a single program on a single core. Love hearing the insight you have about this stuff I learn a lot thanks!
I had seconds in Windows clock for years, there was a registry key for it. It definitely worked in W10 but I think I used it also in W7. Only after I upgraded to W11 it stopped working and I had to download an app for it.
I have no problem with seconds being off by default but it should be possible to enable them for people who actually use them like me. Having the feature available for years and then removing it is silly.
I'd love to hear about how the windows UI works and how it's separated from the kernel. I also am curious in comparison to say a game that pretty much creates and redraws the entire screen every frame, how is the windowsUI divided and drawn in comparison? How does it's "frame loop" work?
I love it! I just changed my clock to show seconds! There is really no particular reason for me to do this except to do it because I can. Much like upgrading performance in a car I am going to drive on the streets -- it won't change anything. But it's cool to know that mine has more capabilities than my neighbor's.
I would love to see a video why the explorer process contains the whole desktop with the taskbar. I guess there is a reason for this strange design but why call it explorer.exe and not windowsui.exe or something like that.
the taskbar and start menu are now separate processes (somewhat) with windows 11, you can see it in task manager.
the desktop needs to be part of windows explorer since it's essentially a whole folder with large icons.
Probably because explorer.exe was already baked into the API and changing it would be difficult and probably break backwards compatibility. This was before Internet Explorer was a thing. Dave may know, but the windows file manager may be the original explorer? @DavesGarage?
@@SmiliesGarage shell32 provides the API for the shell. I think the reason is that early versions of Windows 95 allowed for the program manager to be used instead of explorer. The program manager doesnt use the taskbar. You can switch what application is used as the shell in WIN.INI / registry idk which.
@@DorperSystems Yep, you are spot on with this. Thanks!
@@DorperSystems interestingly, progman remained in windows till XP SP1! It was removed in XP SP2 according to a quick search.
Enlightening. Thanks!
It's crazy to me that proper customization of the clock and date have never been a thing without external tools. On my Win7 with Aero and two lines of tasks thickness I can see the clock, the day of the week, and the date, all three of which are centered. This is basically impossible to achieve in Win10 or Win11 without external tools such as Windhawk (which I love btw).
Windows Vista and 7 introduced such a beautiful desktop that has been completely obliterated in 10/11. It's a shame they just threw all that hard work away.
@@drozcompany4132 After Win7 they started to dumb down stuff and hid relevant options in sub menus, making it more difficult to get to the settings you actually want to instead of this useless junk they put in menus in Win10 with that tiles design.
Another fantastic short video for all to understand. Cheers Dave.
Will the calendar also finally have week numbers?
They always have had that feature. It’s a preference thing.
Go to search bar -> type calendar -> click settings -> click calendar settings -> scroll down until you see “Week Numbers” -> from “OFF” change to “First Day Of The Year”.
It’s easy.
Thanks Dave, I've always wondered
Happy to help
@@DavesGarage I'm very MS averse, I wish the whole IBM derived architecture and Windows would give way to RISC V and nix, but you add a very human face to how it was. I'd also like to thank you for all you have done for Aspergers and Autistic people such as myself and so many others.
It'd be better if they spent their time restoring vital productivity features like "Never Combine Taskbar Items" rather than pointless things like seconds display.
Microsoft once admitted that w32time cannot reliably maintain sync time to the range of 1 to 2 seconds.
This got better over time, and with faster hardware.
Either it is mrs. Dave handling the cups or mr. Dave prefers red nail polish.
I'm curious about power consumption. If i recall correctly, power usage is why animated wallpapers in Vista were removed in Win7.
I'd heard that, but it didn't make sense to me. Yes, if it's going to paint once a second I could see that might use power, but one presumes they're clever enough to notice the machine in the sleep state and not do the thing that causes the badness. In technical terms, anyway.
The thing is that there was a possibility in Windows 10 to watch the seconds by clicking on the digital clock in the tray. This option was eliminated with Windows 11 and there was absolutely no option to get a live seconds display. This was simply hilarious and I had to install a 3rd party tool.
Who willingly uses Windows 11 and needs to know the time down to the second?
They say it's in " Settings > Personalization > Taskbar > Taskbar behaviors, and check the “Show seconds in system tray clock” option." If you don't have that option, you need to run an update - Microsoft claims it's there since Stable April 2023.
@@Lernos1 ah thank you. This must be new. There is also a hint that it needs more energy.
@@encycl07pedia-When you are scheduling college classes and need to beat everyone else to the very second for good professors 😅
This was a lot more interesting than I expected.
But my Mac does it so... welcome to the 21st century Windows!
In all honesty, Windows is seen in more extreme cases than macOS ever could.
Great video as always Dave!
thanks for information.
Format the time once per minute as usual along with the seconds separator, then draw an incrementing number every second..? A bit of a hack obviously, but I can't see any reason why the seconds would need special formatting... They're the same everywhere, or am I mistaken?
Can you display it in Eastern Arabic numerals?
Keep in mind that modern UIs use vector fonts, not raster fonts, so rendering even a single digit tends to hit a lot of code and temporary buffers to render even one subpixel-accurate glyph. In 1995 you could probably have hardcoded a raster font for the clock and gotten away with it, but it wouldn't have been forward-looking. Dave faced the same issue with the LCD-like readouts he put in the original task manager, where they didn't scale well with display tech.
And it could cache the 61 (leap second) possible values into a bitmap that will still be a fraction of the size used by most users for their background image.
@@Felice_Enellen The LCD readouts were (mainly) replaced because the localisation team couldn't use any other numerals than the "LCD" ones were programmed to and so was a nightmare to localise. Dave explained that some videos ago.
@@Rob_III Right. Ultimately the point is that you have to hardcode the digit display to make it truly performant and that inevitably limits what you can do in the future.
Been using that little registry edit for almost 10 years now.. Good to finally have a simple option for people though for sure.
Where I think they crossed the line is when they removed the clock from the clock app
Excusez moi?
Cannot tell what you are trying to say.
tldr? i want to absorb the information but my brain wont let me concentrate. *EDIT* i found a youtube summary extension in the chrome store. "According to Dave, displaying seconds on the clock would require the clock to update every second, resulting in a small but measurable impact on CPU usage. Back when Windows 95 was released, computers had limited resources, including CPU, memory, and disk space. Therefore, the Windows team prioritized keeping the system fast and responsive, and displaying seconds on the clock was deemed unnecessary and potentially detrimental to overall performance."
As this video started, i looked st my system tray and saw seconds and was confused. When he mentioned this was available in the registry of Win10, I realized I must have set this feature years ago and forgotten about it. Still, nice to know that MS will make it easily available.
Liar.
Hello Dave, maybe u can explain, why Systray-Icons dont disappear when application are crashing? only when i move the mouse over the systray-icon the icon disappears. This effect is since Windows95.
I was displaying seconds on 16 MHz Macs with 4MB with SuperClock in 1991 and never thought much about it. There was no VM yet. We had just moved to 32-bit addressing. Apple eventually added the menubar clock with seconds off by default. I always turned it on. I'm measuring all sorts of things at a glance and not having seconds displayed feels wrong and constraining. When I managed hundreds of Macs 20 years ago, my boss identified which Macs I touched this way.
It doesn't seem right to turn this setting on, on computers you manage, assuming they are meant to be used by other people. It is after all, if performance can be ignored, a personal preference.
Another thing to recall is that a bunch of the OS in those old Macs was in ROM, never had to be paged in or paged out of anything. I think it got to the point where as much as 1 or 2MB of the OS was soldered (or socketed) onto the motherboard...?
I knew this was going to be fascinating! Having the seconds shown would have avoided at least one instance of being late for something because I hadn't realised the clock display had frozen!
The caution about slowdowns made me wonder how much more resource-intensive would be an analog clock with a seconds hand, which were certainly seen in the '80s and '90s...but of course those were generally implemented as a separate program in its own window, so wouldn't be consuming resources by default. It all reminds me of the "Expensive Typewriter" and friends from MIT, and the extravagance of using that hardware for such frivolous things as word processing, gaming, or telling the time!
Another potentially tricky thing is how to ensure that the clock only updates on the true second boundaries. When writing my own clock display code in high-level scripting languages, I've wished for some kind of interrupt that could be triggered every realtime "tick". It feels wrong to update once a second knowing that the display will be up to a second wrong, but equally it seems cheesy to update say 10 times a second and still know that you're wrong! (Of course, this issue still occurs with minutes, hours, days, etc. precision.) If using a wake-up timer, I'd be paranoid about the accuracy of such a timer, and how to ensure that it didn't drift too far from real time.
Should i be able to access that Video?
I love your content Dave, thanks so much!
wait how was this in my recommended feed if it's been unlisted since yesterday?
Did you maybe see it on Facebook or Twitter? I posted it in advance there while still unlisted.
@@DavesGarage oh that's it yeah. I for some reason remembered seeing it on my youtube homepage but now that you say that I saw you post on twitter
@@DavesGarage i just randomly went through your playlist ^^
It’s a wonder no one ever just made a clock chip that just keeps time. All the software would have to do is initiate it on startup, print it’s current values, and occasionally check it against internet time for accuracy. This is a situation where hardware and dedicated circuits shine where software requires much more resources to produce the same results.
I can't think of a single time that I've looked at the taskbar on my computers and thought, "I wish it showed the seconds".
That’s exactly why real watch companies still exist. Why have time on a toy when a tool for time has existed for hundreds of years, a tool that’s way more beautiful than some poxy computer
I have; on a semi-daily basis; and I've missed it EVERY time I looked at my clock. Previous OS'es had registrykeys etc. to show seconds and Win11 had support for seconds on-and-off. I hope it now finally stays. When working with processes that fire or doing other 'real time' stuff, say, every 5 or 30 seconds having seconds in your notification area is a big deal. What was even more annoying was that even double-clicking the clock "expanding" the clock didn't show seconds in the later iterations of Windows where it did use to show an analog clock with seconds hand or a digital variation that DID include seconds.
@@willpemberton6823what a dumb statement… you call the PC a toy and the watch a tool? I agree a watch is a tool but I’d much rather have the more effective tool the PC than a watch.
@@willpemberton6823 Regardless of whether or not you need to see the time, your computer needs to know the time internally. There would be a clock even if windows had no clock feature.
Macs have had seconds since forever. It was a really handy way to visually deduce if the system's frozen or the application.
Windows has pretty much always displayed seconds on the analogue clock.
Years ago we made a distributed database for a logging application that is still in use today. Windows didn't know the time accurately. We had to build our own time corrections into the system so times in the database were comparable. Today we use NTP clients, Windoze still doesn't keep time very accurately, especially on laptops.
With years ago you meant over 2 decades ago? Because Windows has had NTP support sinds Windows 2000. And even then; NTP has been around since '85 or thereabouts and been widely available as 3rd party software, even for Windows 95 (e.g. Dimension 4, Automachron) and 3.11 (Tardis, Netdate). And even without NTP, hardware GPS and "Radio Controlled" clocks have been around since forever.
NTP in Windows 2000 and later is explicitly dedicated to keeping the time error within the 5 minute margin allowed by the domain authentication protocol, anything better is a bonus .
@@Rob_III NTP support doesn't mean they keep time well. Even fairly new gear. If you build a good GPS clock and compare it you see the Windoze clock jerking around, getting worse as often as getting better. Real NTP does smooth time adjustments, most of these other time hacking programs do jerky clock setting. Lots of programs query NTP for the time, it's what they do with it that matters, and most apps don't handle it very well. Especially machines that hibernate or sleep, waking up doesn't seem to be handled well. If your application needs good time you'll probably have to deal with these issues. Take a fleet of laptops and test it. Displaying seconds is meaningless when they are seconds off. Eventually we just made our own GPS NTP servers and use actual NTP clients in every machine, nothing else actually worked consistently and smoothly.
It'd be interesting getting an interview with someone still with Microsoft that could talk about what fixes and workarounds they've had to do with the rapid integration of modern technologies like SSDs, VR/AR hardware, etc.
Can we have BIOS in UTC with an offset in the OS instead of setting the BIOS with the time zone? Means I can use Windows and UNIX systems, such as Linux, on the same system without fiddling with the registry :D
And actually, why was THAT choice made?
UTC is fake time. Fake news.
@@IshayuG MS-DOS only used local time and had no concept of time zones or UTC. To maintain compatibility when dual-booting legacy OS it made sense to keep the RTC set to local time. There hasn't been a good time to change this behavior. Actually the switch to UEFI would've been a good time to switch but it's too late for that now.
@@eDoc2020 I mean all you have to do is just make a Windows update that sets this and adds the registry key.
Do it for Windows 10 and 11, all iterations/service packs if you will.
Now every supported version of Windows behaves the same way and the same way as Linux, which is a more common usecase for dual booting than dual booting multiple Windows versions is anyway.
@@IshayuG I wouldn't want them to do that because suddenly switching on everybody could cause unforeseen problems. I'd rather see it just be the default on new installs.
Seconds are now displaying on my desktop for the first time in 30-odd years. Balance has been restored to the force! Thanks, Dave.
If they wanted to cut the fat they could remove the telemetry baked in everywhere. Even with switches turned off, you can see all the traffic with an application firewall like netlimiter or an internet proxy.
All that stuff simply didn't exist back in the days that memory was constrained.
Commonly repeated but not actually true.
@@eadweard. I checked it it myself, so safe to assume you know nothing.
@@linux666 Course you did :)