Something that has disappointed me in recent years is the number of 'conspiracy theories' that have grown up around the so-called 'Y2K' bug, promoting the idea that the whole issue was fake and a scam.
As a qualified IT professional who was there at the time, I would like to present an account of the work I undertook to deal with the so-called 'bug'.
So what was the 'Y2K Bug' in a nutshell?
To explain the causes of the 'Y2K bug', we need to go back to the 1950s and 60s and look at how early computer hardware and software processed dates. We need to remember that at that time, computer memory and backup storage (disks, tapes) were extremely limited and very expensive, so every possible opportunity to compress functionality and/or encode data compactly was taken.
One such area of 'compression' was the storage of dates. Today, there are numerous ways to store dates, but historically, they tended to be stored as a textual representation in the form:
dd/mm/yy eg 21/09/68
Note that the year consists of only two digits. It was assumed that, since all dates began with '19', there was no need to store the '19'.
When it came to date calculations, for example determining the age of a person born on 21/09/20 for retirement purposes:
Current Date: 21/09/85
=> Age: 85 - 20 = 65
While all calculations were occurring within the 1900s, everything was sweet. By the early 1990s, however, there was a recognition that we would shortly be moving into the 2000s and the '19' which was assumed on all dates would no longer hold, with the stored year wrapping to '00'. If we then apply the above example again:
Current Date: 21/09/00
=> Age: 00 - 20 = -20
As can be seen, this gives an incorrect result, and that was the crux of the whole Y2K issue. It should be noted that there were several variations on this, where values could be stored as integers in a limited number of bits which ran out of range after 31/12/1999. There were also several variations of 'date range windows' in use.
The example above was found in enormous amounts of source code (which in many cases was very old and predated my time) and could be corrected. However, there were numerous hardware devices at the time which had this kind of logic implemented in electrical hardware; these could not be fixed by a software change and required an actual hardware replacement. The most common example of this was the clock on personal computers which remembered the date and time while a computer was switched off. After 31/12/1999, these devices could not handle 01/01/2000 and defaulted back to 01/01/1970.
What needed to be done?
Having established what the actual Y2K bug was, the task at hand became a massive logistical exercise to:
- Identify all the systems we managed (many organisations didn't even know what systems they had)
- Categorise each of the systems in terms of their risk and criticality to the operation of business
- Some systems were 'black boxes' (eg telephone systems) provided by external companies. For these, we needed to contact vendors to ascertain their Y2K status, as they might be driven by hardware or software with issues
- For the systems we developed, we had to analyse all of the program source code for each system, looking for date processing issues like that described above
- Correct any erroneous code
- Test the new code and correct any further bugs found
- Install the new code
- Monitor the new code from installation, through the transition to 01/01/2000 and for some period afterwards
What I did
At the time of Y2K, I was responsible for numerous systems which had been written in:
- Visual Basic
- C/C++
- DataEase
- DBase
In C/C++, most dates were being stored using the Unix 32-bit representation (a signed count of seconds since 01/01/1970), which of course will have its own 'Y2K moment' in 2038, when 32 bits run out of range. This method has subsequently been changed to 64 bits, so there should be no 2038 issue unless 35-year-old software is still being used - which probably won't run on the operating systems of the time anyway.
My C/C++ code was all using 4-digit years and was therefore unaffected. Database storage was in SQL, which was also already Y2K compliant.
Visual Basic was a bit of an issue. At the time, it was the most widely used software development tool on PCs. It had an 'intelligent' way of interpreting dates, applying a century if one was not provided. From memory, I think you could set a 'window' of 100 years, but that didn't work where you had people born in the previous century and others retiring over 100 years later (ie data covering a range larger than 100 years). The Visual Basic method wasn't reliable, so all VB code had to be upgraded to work using 4-digit years, whose interpretation was reliable.
In the event, I actually marketed a software product called 'SourceCheck 2000'. It would open a VB project and 'inject' itself into all date function calls. You could then run the application and the product would display a popup window listing all date processing activity, highlighting ambiguous calls and/or 2-digit year processing. I sold several hundred copies and used it very successfully with colleagues - it saved a huge amount of time. When done, my software would 'uninject' itself from the VB project, restoring the original code.
DataEase was a DOS-based database application which dated from the early 1990s. Its use had pretty much died off, but my clients still had a few apps written in it. Those using older versions of DataEase had to be upgraded to the latest version, which did support 4-digit years. The newer apps still needed to be tested because it was still possible to use 2-digit years. In many cases, clients took the opportunity to declare their DataEase applications 'life-expired'.
Believe it or not, a couple of my clients still had DBase applications in the late 1990s! If I recall, there were about 4 or 5 of them, mostly asset management and record-keeping systems. They were all simple enough that I was able to replace them with a VB version running off SQL Server within a few days - remember that this was before the days of web-based software!
While the above describes the common issues I had to fix, as any developer knows, when you inherit code from others there are always weird and wonderful bits doing things in strange ways, usually demonstrating that the original programmer didn't understand what they were doing and probably shouldn't have been doing it anyway. Peer reviews weren't really heard of then! Needless to say, I did have some of these to unravel. In most cases, they were an attempt to implement a 'date window system' which sometimes conflicted with the one built into VB. In ALL cases, the solution in VB was to convert ALL dates to 4-digit year dd/mm/yyyy notation and then check the code which was calling it to assess the impact. In most cases this wasn't an issue, because the calling code was simply assuming that the date processing was being done correctly and carrying on accordingly (although I did come across some which made erroneous checks and applied further adjustments).
It was all of this kind of activity which developers had to deal with.
The Scare Campaign
As with every issue, there are people who are not involved 'at the coal face' and consequently do not understand the issue at hand. Unfortunately, numerous scare campaigns arose which were based entirely on ignorance of the problem on the part of those reporting it. There were all kinds of predictions of armageddon, from complete societal breakdown, through world war three starting, to cows going unmilked. Some people even bunkered down with a 'siege' mentality and tried to make themselves self-sufficient in everything, believing that the world was going to end!
Unfortunately, there were also some who attempted to 'jump on the bandwagon' and make money out of the situation by deliberately confusing the matter, encouraging people to waste their money on scams rather than practical solutions. These people affected the credibility of everyone else.
Several scams I saw claimed to fix your PC clock with very expensive software that did nothing and could never deliver, because the hardware needed changing! In fact, Y2K occurred at a time of technology transition (from 486 to Pentium processors) and effectively hastened the large-scale purchase of new hardware which was Y2K compliant.
So was Y2K Genuine or Fake?
To determine this, we have to look at the impact and risk of not making the software changes which I did:
- Any system which my clients used to calculate any form of date period for contract/financial/charging purposes would have been wrong and would have resulted in incorrect charging and major accounting/business viability problems
- Any system which my clients used in which time periods controlled hardware could have failed. A negative time could have been calculated, meaning a machine might not conduct an action at all; or, if the machine treated the negative number (in two's complement form) as a positive integer, it could have conducted an action for far longer than it should, creating all kinds of damage and safety issues
What I did at the time most definitely prevented my clients from having major business continuity problems and avoided serious business risks for them.
Y2K was a very genuine problem and most definitely was not fake. It was the actions of many IT professionals around the world, correcting software code, that prevented a disaster. The fact that the clock ticked over to 01/01/2000 00:00:00 and the biggest non-event ever occurred is testament to their effort. If there had been a disaster, it would have meant that IT professionals had not done their job properly.
Did the IT industry deliberately create the Y2K issue to create work for itself? Most definitely not. The issues which caused the problem had their origins in hardware, software and development practices going back many decades, when these issues weren't known about.
Could the issues have been avoided? From the 1950s to the 80s, possibly not: the price of hardware and storage was still such that mitigating the issues was too expensive. From the 1980s onwards, when memory and storage became much cheaper, the transition probably should have started, but as we all know, people always leave things until the last minute when they have no choice. Indeed, I spent many an hour trying to convince executives that the problem was coming and that we should start looking at our software, but for years they weren't interested, as they couldn't justify the costs! Eventually, they had no choice.
I actually took my own steps and ensured from the early 1990s that all software I wrote used full 4-digit years in date calculations. This meant that when Y2K approached, all of my software was already Y2K compliant and I had only minor testing and validation work to do.