The Y2k Bug
or
*bah, humbug*
Read the headlines - "Y2k Bug Fizzles," "Y2k Bug Not All It's Cracked Up to Be," etc., etc.  OK, it's time to deal with the reality of the situation and straighten out some of the myths and misinformation that have been spreading about "Y2k".  For the past several years, we information systems professionals have been too busy dealing with real Y2k issues to take the time to put the media hype and the hysteria vendors in their proper place.  Now that we've transitioned into the new year with no more than a few bumps and minor bruises, it's time to lay out the facts.

First, spell it right: Y2 with a little 'k', not Y2K as is so often seen.  The "Y" stands for year. Easy enough.  The lowercase "k" is the abbreviation for "kilo", as in kilogram, kilometer, kilohertz, meaning thousand.  So 2k means two thousand; hence Y2k, or year two thousand.

Y2k "Bug"?  I don't think so.  A bug is a quasi-technical term used by software developers to describe a piece of errant code producing unexpected or undesirable results.  Rumor has it (mind you, this took place long before my entrance into the industry; I'm simply relating the 'techno-legend' that has been passed down from programmer to programmer over the years) that the term developed in the very early days of computer science. Initially, logic flow was programmed by means of patch cords in a panel.  Soon the patch panel was replaced with wiring blocks that could be removed and stored intact. Programs were 'hardwired' (another programming term) into these removable circuit boards.  As the story goes, while the boards were in storage, some sort of critter, be it insect or rodent, devoured the insulation on the wires, causing them to short out and creating errors in the program logic.  Hence, the expression 'bug in the program' literally meant that an insect had caused a program error.  We still use the term today for behavior that falls outside the logical intent and design of a program.

The date issues that characterize the description of the Y2k "bug" are not programming mistakes producing erroneous results.  They are design limitations built into the software, and the software is functioning exactly as designed.  Sure, we admit the design was definitely short-sighted.  How did it happen?  Let me tell you...

Once upon a time, long long ago, like, six or eight years ago, memory was very expensive.  I remember my first Commodore 64 computer.  That 64k of memory was considered almost obscene.  Who could even comprehend a program that could utilize the almost infinite resources of that machine?  And what a bargain, at only $750.  A 16k Apple was going for $1400.  And there was the TRS-80 with its 8" floppy drive.  These were heady times on the cutting edge of technology.  Why, only a few years before, 16k of memory required nine square feet of floor space and a rack eight feet tall.  Then I got my first PC - 128k at $2500.  And for several hundred dollars you could upgrade it to 256k!  And it ran at the blinding speed of 4 MHz (that's a four with NO zeros!).  I remember the first hard disk drive I saw.  It was the size of a dishwasher. Seems like it held 320 megabytes.  And it cost $40,000 in 1970s dollars.  Then a friend of mine paid nearly a thousand dollars (almost half the price of my first car!) for a 10 megabyte hard drive for his PC.  What a luxury!

Anyway, you're starting to get the picture.  Memory and data storage were very expensive.  Two extra bytes of information in a database record just to store a redundant 19 were inconceivable.  A transaction database of 10,000 records, each containing the date of service and the posting date, would have to devote 40,000 bytes to storing the number 19.  The cost could be measured in real dollars.  Additionally, as this information was processed, it took extra clock cycles to move the extra data from place to place.  Mathematically, the 19 was totally superfluous.
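The arithmetic above can be checked with a quick back-of-the-envelope sketch.  The record counts and field layout are just the essay's illustrative figures, not a real schema:

```python
# Hypothetical cost of storing the redundant "19" century digits,
# using the figures from the text.

RECORDS = 10_000          # transaction records in the database
DATES_PER_RECORD = 2      # date of service + posting date
CENTURY_BYTES = 2         # the "1" and the "9" as two characters

wasted = RECORDS * DATES_PER_RECORD * CENTURY_BYTES
print(wasted)  # 40000 bytes spent storing a constant
```

Forty thousand bytes spent storing a value that never changed - on hardware where memory was priced per kilobyte, dropping it was the obvious call.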

It just sort of snuck up on us.  In the early days of programming, the half-life of any given piece of code was extremely short.  Face it, the programs were short.  And programming was tedious work.  The less information there was, the more quickly and efficiently it could be processed.  Remember, we're talking about million-dollar mainframe computers running at a fraction of the speed of the average laptop of today.  Every piece of unnecessary information was discarded.

Eventually, our ability to develop larger and more complex programs increased.  Storage, processor memory, and processor speed were still serious financial concerns.  Suddenly, before anyone realized it, we had a legacy of huge, complex programs to manage.  The programs were complex because the most efficient way to process data in a computer is not always the most obvious.  And built into these programs was the logic to process all dates as two digits, on the assumption that the century digits were always going to be 19.  Even by the mid-1980s, nobody dreamed these same programs would still be running at the turn of the century.  By 1996, it began to dawn on a few of us that we were in for trouble.  Then the scramble began.

Let me tell you - looking at 5 million lines of code, knowing that somewhere in there are thousands of date-related logic statements performing calculations on hundreds of different dates stored in tens of millions of data records, is daunting to say the least.  And that was just the software provided to my company by a single vendor.  All told, we had a couple of dozen different applications running on five different operating systems, representing tens of millions of lines of code - none of it Y2k compliant.

It was only through the sometimes heroic efforts of my colleagues, burning the midnight oil day after day, upgrading hardware and software systems, that we are still doing business today.  Many software systems had been allowed to lapse several versions behind current releases for a number of reasons, whether from lack of interest in new features or in favor of implementing new systems.  All of these had to be brought current or abandoned and replaced by new products.  Several hundred thousand dollars and six or eight man-years later, we transitioned into the new year, only to have all our efforts declared a bust.  Now, that's gratitude for you...

I can assure you we would not be doing business today if we had simply ignored the problem, saved our effort and our time, and gone merrily on our way.  Our business is highly date-dependent.  Our software was programmed to reject any year of 00 as invalid, and a two-digit field could not accept a number higher than 99.  Schedules for patient medications, recurring tests, our ability to generate daily charges, or even to accept new patients would all have gone away.  People would have been hurt.  We would have been out of business and I would have been out of a job.  Multiply that by the tens of thousands of businesses that could have been in the same predicament.  Now maybe you can begin to see why the whole thing went so smoothly!
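The failure mode is easy to demonstrate.  Here is a minimal sketch - hypothetical code, not our vendor's actual system - of how two-digit year logic goes wrong at the rollover:

```python
# Hypothetical two-digit date logic: the century "19" is assumed,
# so the year 2000 arrives stored as 00 and time appears to run backwards.

def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Elapsed years, assuming both two-digit years fall in 19xx."""
    return end_yy - start_yy

def is_valid_year(yy: int) -> bool:
    """Validation like the text describes: 00 is simply rejected."""
    return 1 <= yy <= 99

print(years_elapsed(95, 99))  # 4: 1995 to 1999, exactly as designed
print(years_elapsed(99, 0))   # -99: 1999 to "1900" - a medication due
                              #      next year now looks 99 years overdue
print(is_valid_year(0))       # False: the year 2000 cannot even be entered
```

Note that nothing here is a coding mistake: every line does precisely what it was designed to do, which is the essay's whole point about "bug" being the wrong word.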

I'm sorry if the gloom-and-doom naysayers prodded innocent people into unnecessary preparations for the supposed end of the world.  I know many people were frightened into purchasing gallons of bottled water, weeks' worth of powdered food, and electrical generators.  Personally, I knew that every one of my colleagues had as much riding on the success of the whole project as I did.  And by that, I mean my collective colleagues in every IS shop around the country and the world.  So, how did I prepare for the 'big event'?  I bought a flashlight so I could find my way out of my office when some smashed New Year's Eve reveler drove off the road and took out a power pole, leaving me in the dark.  So, I didn't have any more faith in your ability to drink and drive responsibly than you did in my programming ability.  I guess we're even.

We IS professionals take pride in our work.  Everyone knows that a computer is no smarter than its programmer.  Seems like incentive enough to me!  I'm sorry if you were disappointed that systems didn't crash and burn all over the world.  The possibilities were intriguing, I'll admit. Now, with the potential horrors of Y2k out of the way, I can set my sights on a relaxed and sane celebration of the true beginning of the new millennium, January 1, 2001.


