I lived through the Y2K problem, and was actually on a team to deal with it within my company at the time. I honestly believe that it was not overblown at all, and that the reaction people had to it forced companies and governments to do something about it. It didn't matter if Company A believed it was a big deal, because Company A's customers were demanding it be addressed, and Company A's competitor, Company B, was already on it. Company A knew they'd lose customers to Company B if they didn't address the issue, so Company A dealt with it. And in the US, it was brought up early enough that it impacted politicians, too. People wanted this addressed because of all the hype, so if you weren't in support of spending money to address it, you would lose votes.
I guess you could say that the scare mongering, which may have been overblown, forced people to take preventative measures, and, as a result, nothing major happened.
I'll just agree with you. Thanks
I was a developer for a credit union in the '90s. Though most of our code would have been OK, since it stored dates as the number of seconds since a specific day, there was some that would have failed. I know for sure that there would have been some serious issues, at least financially, if nobody had worked to fix the issue.
Of course humans are smart enough to make adjustments as needed, and things weren't overly automated yet, but it would have caused problems.
this is a paradox that will haunt the current situation with the corona virus too
WHY IS YOUR COMMENT A DAY OLD WTF
Seeing people going back to "Normal" so soon, I have a really bad feeling about it all...
The Y2K bug was fixed globally. There were very few control groups. Few systems were left alone to see what would happen.
The coronavirus does have control groups. The first world has the resources to ride out the effects of lockdown. The third world, which can't pump out stimulus money and doesn't have the hospital resources, is the control group. The vast slums of Africa and South America are going to show us what happens when the coronavirus infects nearly everyone simultaneously and kills 2-5% of the infected.
@@CarFreeSegnitz Total mortality rate isn't going to be nearly that high. Herd immunity kicks in faster than you think, and untested asymptomatic/mild cases seem to make the prevalence at least an order of magnitude higher than we see in official confirmed numbers. I expect the mortality rate to be more like 0.3% of actual total cases in a country with a Western demographic make-up, probably a third of that or lower in a younger country.
@@RBuckminsterFuller 0.3% mortality is still quite bad if it infects everyone. It means you have around a 1 in 3 chance that someone you have a stable social relationship with (friend, colleague, family member) will die, assuming a Dunbar number of around 100.
0.3% is still 3,000 deaths per million people, so around a million deaths in the US alone.
I'm not always in agreement with The Good Stuff videos, but this one: YES. I also remember the Y2K times quite well, and I was working at a call center for Bell South dial-up internet support when we rolled over to January 1, 2000.
It was a relief that nothing big happened and I absolutely believe that it was because of the hard work ahead of time since it was clear that some systems that were NOT upgraded had failures.
I was 4 when the Y2K bug was causing all that fuss; honestly, all I can remember about it was the Simpsons episode that made a joke out of it being the end of the world.
For the longest time I thought it was some myth that made the general public freak out like 2012 but then some programmer friends explained to me why it was a problem and it makes sense how much of a disaster it could've been. I'd say the hundreds of hours of hard work by hundreds of engineers and programmers was definitely worth it and deserves recognition.
While 2000 was 20 years ago, another *two* "millennium" bugs are looming on the horizon: the NTP epoch overflow in 2036, and the Unix timestamp overflow in 2038.
While both have reasonable solutions, there is a scary number of legacy devices that are still vulnerable to these bugs.
Even scarier is how dependent we've become on digital infrastructure over these past 20 years. When the year 2000 happened, the internet had been commercially available for less than a decade, and a few pioneers were trying to make money off of it. But today, tens or hundreds of millions of people depend on this infrastructure for their livelihood.
Yup. I'm hoping most vital systems will be 64-bit by 2038, but some will cause headaches for sure.
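For the curious, both rollover instants mentioned in this thread can be computed directly from the epoch definitions. A quick Python sketch (the resulting dates follow from the definitions themselves, nothing assumed):

```python
from datetime import datetime, timedelta, timezone

# NTP timestamps: unsigned 32-bit seconds since 1900-01-01 UTC,
# so the current era wraps after 2^32 seconds.
ntp_epoch = datetime(1900, 1, 1, tzinfo=timezone.utc)
ntp_rollover = ntp_epoch + timedelta(seconds=2**32)

# Unix time_t on 32-bit systems: signed 32-bit seconds since
# 1970-01-01 UTC, so the largest representable value is 2^31 - 1.
unix_rollover = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)

print(ntp_rollover)   # 2036-02-07 06:28:16+00:00
print(unix_rollover)  # 2038-01-19 03:14:07+00:00
```

Note the NTP wrap actually lands two years *before* the more famous Unix one, because NTP's epoch starts 70 years earlier even though its counter is unsigned.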
Y2K helped spur investment into computers. Lots of places were limping along with 20-30 year-old hardware and software. Why fix it when it's not broken, right? But Y2K broke them all.
It was a lot of money. But it wasn't money just piled up and burned. It went to pay technicians. It went to motivate people to learn about computers. It probably contributed to the Dot-Com bubble, a super-weird time when burning through cash reserves was seen by the stock market as a good thing.
*nod* I've seen the argument that Y2K spending is what really got the dot-com era going: huge amounts of capital going into a tiny industry, and a lot of people in it all of a sudden being flush with cash to spend on experimental projects.
Why do I get the feeling that timing of this video has something to do with current 'troubles'?
When you avert or minimize a disaster, you are branded an alarmist. When a disaster hits, you are branded as an incompetent for ignoring the advice and signals.
duh lol
have you finished watching it?
Having been the head developer in a large company for the Y2K problem, I feel the $600 billion may be overblown. Most (not all) of our work for Y2K consisted of "well, if we're working on that area anyway, fix the year too" and a bunch of testing. Also, a few older PCs in the control systems had to be replaced. They could have been fixed, but they were well past their best-before dates anyway.
When it comes to viruses of any kind overreacting is a good thing.
In 2000 news anchors were on tv screaming about how the world was still here when they were the ones who caused the panic in the first place
I remember the panic. I was also forced to work overtime at the hospital where I worked, in case of a power outage. Nothing happened. A much more real and annoying problem was the failure of the date to roll over at midnight. This happened with the BIOS in an IBM compatible. The way out of this was to burn a genuine IBM BIOS. Those were good days.
Awesome and timely video!
Glad you enjoyed it
This was Peter's job in Office Space
I work in a factory with old equipment. To this day we lie to the tools about the date because if we don't they stop working. So dates matter.
I remember people being really worried about the Y2K problem. I don't remember any news networks talking about a fix that the industry was trying to establish. I watched news channels as a kid, but it could just be my memory. I also remember that when the new year rang in, people celebrated more and were happier than on any other new year I've experienced since.
Just wait the Y2k38 error
"MM/DD/YY" - pfft... no programmer in their right mind uses that format for other than frontend presentation :P "YYMMDD" were the way to go because it was easier to sort dates based on that syntax ;)
You realise that YYMMDD suffered from the same issue right?
You also realize that the UK uses YYDDMM, and how do you solve for hardware that used ICs that counted to 00? Ask me: I actually did the rollout from '97 to '00, and I made so much money.
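The sorting property being argued about here, and exactly where it breaks, is easy to demonstrate with a quick sketch:

```python
# Two-digit YYMMDD strings sort correctly within a single century...
dates_1990s = ["970315", "991231", "980704"]
assert sorted(dates_1990s) == ["970315", "980704", "991231"]

# ...but the property fails at the century boundary: Jan 1, 2000
# ("000101") sorts BEFORE Dec 31, 1999 ("991231"). That is the
# Y2K problem in miniature.
y2k = ["991231", "000101"]
print(sorted(y2k))  # ['000101', '991231'] -- chronologically wrong

# Four-digit years restore lexicographic == chronological order.
fixed = ["19991231", "20000101"]
assert sorted(fixed) == ["19991231", "20000101"]
```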
Aw, no mention of the upcoming UNIX epoch Y2k38 problem?
Yeah, I think that one is going to be an even bigger nightmare to fix, especially as so much programming has moved away from the heavy "big design up front" techniques that made the Y2K stuff comparatively easy to identify.
what's this problem? can you elaborate?
@@lulu4882 Unix-based systems (including Linux) built for 32-bit processors use 32-bit integers to represent time, counting the number of seconds since the start of January 1st, 1970. One bit of the integer is used as the sign bit, leaving 31 bits, and the highest number that can be represented in 31 bits corresponds to a time early in the year 2038. Unix-based systems tend to be used for a lot of control systems and servers. 64-bit Linux doesn't have the issue, but a lot of old systems are probably still running on old 32-bit computers.
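What the overflow actually does to a signed 32-bit counter follows directly from two's-complement arithmetic; here is a minimal sketch of that wraparound:

```python
from datetime import datetime, timezone

def wrap_int32(n):
    """Reduce n to a signed 32-bit two's-complement value."""
    n &= 0xFFFFFFFF
    return n - 2**32 if n >= 2**31 else n

last_good = 2**31 - 1                  # 2038-01-19 03:14:07 UTC
one_second_later = wrap_int32(last_good + 1)
print(one_second_later)                # -2147483648
print(datetime.fromtimestamp(one_second_later, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00 -- the clock jumps back 136 years
```

So instead of merely losing two digits of the year, an unfixed 32-bit system sees time jump to December 1901.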
I predict that there will be a cluster of bugs near February 2100, starting a year earlier, because 2100 is not a leap year. Fortunately, the year 2000 was an exception to an exception and was a leap year; otherwise, February 2000 would probably have caused much, much more damage than the Y2K bug ever did.
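The full Gregorian rule this comment refers to can be sketched in a few lines:

```python
def is_leap(year):
    # Divisible by 4: leap year...
    # ...except century years (divisible by 100) are NOT leap...
    # ...except every 400th year (the "exception to the exception") IS.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

assert is_leap(2000)        # exception to the exception: leap
assert not is_leap(2100)    # plain century year: not leap
assert not is_leap(1900)
assert is_leap(2024)
```

This is why 2000 was relatively forgiving: naive code that only checked `year % 4 == 0` still got 2000 right by accident, while 2100 will punish exactly that shortcut.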
Thanks! Very important message right now
Y2K was crazy back when it was going on.
I remember so many software creators offering their sure-fire "cures" for computers. Some made sense, while most were nothing more than a way to separate you from your money, LOL!
We had one heck of a New Years Eve party that year, just waiting to see what would happen at the stroke of midnight.
The only thing that happened that night was, we all got drunk, and played Monopoly...
It was probably the most disappointing New Year's Eve in my experience. I worked in computer tech support at a major university at the time. Our boss had us at our desks on New Year's Day like it was a regular workday. We sat staring at our phones. To his credit, our boss ordered in a party sub sandwich. Party like it was 1999? Yeah... no... went to bed early and went to work the next day.
@@CarFreeSegnitz I hope you're afforded more free time now.
do
do you get it?
did you get the analogy?
Great analysis
Is it really that big a deal to change the date format in the source before sending out the next version?
Seems to me this would be a one-guy-one-afternoon job for most applications.
Even if it went wrong, set your system clock back a week and bam, it should work again.
Some networked stuff was probably more advanced, but putting an extra line in the updated API a few years before the issue hit...?
There's too much wrong with this comment.
To sum up: Much less was possible due to the way applications were built and rolled out.
Applications are context-dependent. Current-day solutions don't work in old contexts.
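For context on why it wasn't a one-afternoon job: one widely used remediation, where widening storage to four-digit years was impractical, was "date windowing", reinterpreting a two-digit year relative to a pivot. A minimal sketch (the pivot value 70 here is an assumption; real systems chose per-application pivots):

```python
PIVOT = 70  # assumed window: 70-99 -> 1900s, 00-69 -> 2000s

def expand_year(yy):
    """Expand a two-digit year using a fixed pivot window."""
    return 1900 + yy if yy >= PIVOT else 2000 + yy

assert expand_year(99) == 1999
assert expand_year(0) == 2000
assert expand_year(38) == 2038
# Note the fix only defers the problem: with this pivot, years
# before 1970 or after 2069 cannot be represented at all.
```

And windowing had to be applied consistently at every point where a two-digit year was read, compared, or written, across every interacting system, which is exactly why the remediation touched whole codebases rather than one file.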
It is incorrect to describe this as a glitch or bug (though everyone at the time called it the Y2K Bug), because it was the result of deliberate decisions by early programmers.
The decision was deliberate, but the behavior was not anticipated. Bugs are unexpected behavior.
DD/MM/YYYY anyone?
00/00/0000 = ***/**/***** = 99/99/9999 = 11+/11+/1111+ algorithm
Simple
That's like writing a time as seconds/minutes/hours. So no! The standard in computers, and the one recommended by UNESCO, ISO, et al. for written-out dates, is year/month/day, the traditional order used in Japan and in computers.
YYYY-MM-DD of course. ISO sets standards for a reason.
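One concrete reason ISO 8601's YYYY-MM-DD is recommended: lexicographic order of the strings equals chronological order of the dates, which a quick sketch can confirm:

```python
from datetime import date

iso = ["2000-01-01", "1999-12-31", "2024-02-29"]

# Sorting the raw strings...
by_string = sorted(iso)
# ...gives the same order as sorting the parsed dates.
by_date = sorted(iso, key=date.fromisoformat)

assert by_string == by_date == ["1999-12-31", "2000-01-01", "2024-02-29"]
```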
So glad this video exists.
I hate that people think it was overblown... We did so much work to minimize its damage, that's why most of the world suffered very little. It's what always happen when a crisis is well handled...
At least my phone is charging faster now
They had inspectors going around putting stickers on things saying "this device is Y2K compliant": things like lamps and electric pencil sharpeners.
why do things like toasters and pencil sharpeners even need an internal clock?
I noticed the problem in the late 1970s and I'm sure a lot of other people did before that. We just assumed that by 2000 all that software would be long since replaced anyway. Nope! The notion that it wasn't a big deal to start with is definitely wrong, as several commenters with direct knowledge have pointed out, and you're also definitely right that this has parallels in many other areas, for example anti-vaccine activism.
IT has gone through many happenings that were initially thought of as inconsequential, only to take the world by storm later.
Y10K is our next chance to test whether or not Y2K would have happened if we had done nothing.
We have a few more milestones before 10,000, like the ones in 2036 and 2038 I mentioned elsewhere.
Why do I get the feeling that this video is not at all about Y2K
Funnily enough, there was 1 program that we had to set the date to December 31, 1999 for it to work
It was a Typing tutor program xD
More people should see this video
>20 years ago our world almost came crashing down, financial markets were preparing for a meltdown...
Well, they were wrong by just 20 years.
It was a flaw that was purposely programmed in because of the cost of storage.
It's debatable if they could have predicted the potential disaster they were signing up for.
Laughs in 2021.
IBN 5100
008 John Titor 2036
El Psy Kongroo
Mad scientist Hououin Kyouma!
So what conclusion do we draw from this with regard to Covid-19?
No matter what is done, it will be judged wrong afterwards anyway. Either you overreacted or you did too little. Damn humanity, honestly...
Y22k bug
Sooo. . . What are the preventive measures for covid that should be done years before?. . .
Limiting animal-human contact?
Raising testing capacity?
More production and better distribution of preventative materials?
Re-organisation of infrastructure and travel?
Better education?
Many countries are learning now, because they have to.
If we could learn something before we have no choice, we would avoid a lot of grief.
20 years ago, modern civilization almost came to a complete halt . . . because of a computer bug.
Or did it?
Some said the Y2K bug was overblown and wasn't that big of a deal. But countries around the world spent massive amounts of money fixing it. And when the year 2000 hit . . . nothing happened.
Did we fix the problem? Or was it never a problem to begin with?
We might never know!
Enjoy!
I was a programmer during the "Y2K bug" ... storm in a teacup ... a non-event ... as evidenced by you having to scrounge around for things that actually crashed. A ticketing system failing ... really ...
Anything that failed because of this non-existent problem failed not because of anything inherent in the systems, but because of the crap programming that some people did.
can we stop talking about this? the only people who care are computer programmers. and we sure as hell don't care any more. talk about time zones if you want to get us pissed off.