Year 2000 Nightmare

 

Jess C. Malicdem

 

The Year 2000 is just around the corner, and its threatening arrival has the whole Information Technology industry going gaga over what is being heralded as the disaster of the millennium. Every business, agency, institution, or person that uses computers is affected: unless those computers are fixed, they will grind to a halt. What actually happens is that any computer calculation involving a date -- a credit card transaction, an auto loan beginning today, a mortgage calculation -- could yield incorrect answers. Why? Because after December 31, 1999, read my lips, computers won't know what year it is. Imagine: on January 1, 2000, I am 71 years old. Damn, I am ready for retirement. This sounds insane, but it happens to be true. Here's why: we programmed computers to store dates in the format mm/dd/yy -- 2 digits for the month, 2 digits for the day, and 2 digits for the year. Well, that sounds fine to me...

...but wait. Allow me to do this: I was born October 13, 1971, so in mm/dd/yy format we store that as 10/13/71. If I compute how old I am today, that is 97 (this year) minus 71 (the year I was born), which comes out to 26. That's right: taking the result as a positive whole number, I'm 26 years old (actually I'll turn 26 this October).

 

But look: on January 1, 2000 we store the date as 01/01/00. Now the same calculation takes 71 from 00, which -- taking the positive value again -- makes me 71 years old. That would cause a "total meltdown" in every type of interest calculation worldwide and, of course, make me older than my parents and even my grandparents, WOW. Can you see the problem? By storing the year 1997 as 97 and 2000 as 00, we leave the computer thinking that 00 means 1900, which is totally wrong. Computers are idiots. They perform miraculous tasks, but they have no understanding of what they are doing.
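Here is a minimal sketch of that arithmetic in Python (my choice of language for illustration only). It assumes nothing beyond the two-digit storage just described:

    # Age from a two-digit year, as described above.
    birth_yy = 71   # born in 1971, stored only as "71"

    def age_in(current_yy):
        # Take the positive value, as in the example above.
        return abs(current_yy - birth_yy)

    print(age_in(97))  # 1997 stored as 97 -> 26, correct
    print(age_in(0))   # 2000 stored as 00 -> 71, nonsense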

 

Again you may ask: how come we used only 2 digits when we knew we would need 4 of them come the year 2000? Well, the bad news is, we did it deliberately, Beavis.

 

In the late '60s and '70s, computers were very expensive; it cost a fortune to have those machines then. That expense was tied directly to two aspects of computing: how much data a machine could store and how much it could process. Even tiny increases cost a fortune. One way to store data was the punch card, known as the Hollerith card, whoever that guy was. Holes represent information, and each card can hold only 80 characters. If you write down your name, full address, birthdate, and bank account number, chances are you'll have written more than 80 characters.

 

This is exactly the problem programmers ran into back then. Hollerith cards were not big enough to hold all the information they needed to store, so they compromised. They wrote 101371 instead of 10/13/1971, thereby saving themselves 4 precious characters, 2 of which were the crucial '19'. So that's how we did it.
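To make the saving concrete, here is a small Python sketch of an 80-column card record. The field widths and values are purely hypothetical; the point is simply that a 6-character mmddyy date fits where an 8-character mmddyyyy date would not:

    # A hypothetical 80-column card layout: name, address, account, date.
    name    = "MALICDEM JESS C".ljust(30)
    address = "MANILA".ljust(35)
    account = "001234567"              # 9 characters

    card_short = name + address + account + "101371"    # mmddyy   -> 80 columns
    card_long  = name + address + account + "10131971"  # mmddyyyy -> 82 columns, over the limit

    print(len(card_short), len(card_long))  # 80 82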

 

 

 

So what's the story, morning glory? Well, one solution is to add an extra flag bit that handles the problem. If the flag is set to 0, then 71 refers to 1971; if it is set to 1, then 71 refers to 2071. Neat... and we have to do this before December 31, 1998. Why 1998? Because you'll need time to test every application that processes a full fiscal year for your company. Whew! Here you saw, I mean you read, that the scope of the Year 2000 problem is much broader than you might think.
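A minimal Python sketch of that flag idea, with a closely related trick called windowing shown alongside it for comparison (the function names and the pivot value of 50 are my own assumptions, not from any particular product):

    # Flag approach: an extra bit stored alongside the two-digit year.
    def year_from_flag(yy, century_flag):
        # century_flag 0 -> 19xx, 1 -> 20xx, exactly as described above
        return (1900 if century_flag == 0 else 2000) + yy

    # Windowing approach: no extra storage, the century is inferred from a pivot.
    def year_from_window(yy, pivot=50):
        # two-digit years at or above the pivot are 19xx, the rest are 20xx
        return (1900 + yy) if yy >= pivot else (2000 + yy)

    print(year_from_flag(71, 0), year_from_flag(71, 1))  # 1971 2071
    print(year_from_window(71), year_from_window(0))     # 1971 2000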

 

Face it, unless we take action now, 2000 is not going to be a good year for data processing.

Every computer program, big or small, that uses a date or time will go on strike on January 1, 2000. That's the bottom line.

 

The following feedback does not necessarily reflect my opinions unless otherwise indicated.

 

Unlike Unix and IBM-compatible systems, Macintosh systems do not have the year 2000 problem, since Macs have always used 4-digit dates. This being the case, most Mac software also uses 4-digit dates and is exempt from the problem as well. -- Tim

 

The solution that you propose for the Y2K problem is plausible, but it also raises many questions for me. A hypothetical problem (actually a real problem for me):

 

An IMS database with 300 million segments, about half of which contain dates that need expansion.

Historical tapes going back 5 years, kept by month, each containing about 180 million records.

Roughly 300 supporting work files, most or all of which have dates.

 

Questions:

 

1>How would you go about implementing the changes?

 

1>Comments: The choice of a solution for the date expansion was a very minor part of the overall problem. The real problem was HOW to implement something this large with no impact on the rest of the company.

 

1>The scenario represents just 1 system out of the 72 different systems that our company has changed. We opted to include the century in all of our dates rather than use a switch.

 

1>Our solution for this problem:

We identified 5 separate components in this system.

 

1>A module was written for our large IMS database that, when called, expanded dates into the expanded format, or un-expanded them when writing back to the database. (This allowed us to expand all the programs before physically expanding the database -- see the sketch after this list.)

 

1>All files that crossed components or went to/from other systems had a twin "bridge" file.

 

1>The very last thing that was done was to convert/expand the large IMS database and change the called module to simply pass control through to IMS or back to the program.
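An aside from me: here is a minimal Python sketch of the kind of wrapper module described above. The yymmdd layout, the function names, and the windowing pivot used to pick the century are all assumptions for illustration; a real IMS shop would write this against its own segment layouts:

    # Hypothetical wrapper around database I/O: expand dates on the way out,
    # un-expand them on the way back in, so programs can convert before the data does.
    PIVOT = 50  # assumed windowing pivot: yy >= 50 -> 19xx, otherwise 20xx

    def expand_date(yymmdd):
        # '711013' -> '19711013'
        yy = int(yymmdd[:2])
        century = "19" if yy >= PIVOT else "20"
        return century + yymmdd

    def unexpand_date(ccyymmdd):
        # '19711013' -> '711013', for writing to the not-yet-converted database
        return ccyymmdd[2:]

    print(expand_date("711013"))      # 19711013
    print(unexpand_date("20000101"))  # 000101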

 

Conclusion: The real problem with most Y2K projects is not so much the date expansion, or the slight modifications to the programs, but HOW to implement hundreds of programs and convert hundreds of files all at the same time. Today is the first day of the rest of your life.