But there's other anecdotal evidence that at some organizations, it's worse than that. On several occasions, when we've performed IV&V on certified code, we've been given modules that, mistakenly, have not been remediated at all. These are systems that, come January, or possibly sooner, will break down completely. This, to put it mildly, concerns us.
There are other signs that organizations are not as far along as they would like. The New York Times recently reported on another study indicating that most organizations have underutilized their Y2K budgets, suggesting that much work remains to be done. And the Securities and Exchange Commission (SEC) has consistently reported that most companies are not abiding by Y2K disclosure guidelines, which could indicate that they prefer to avoid full Y2K disclosure. In the federal government, Congressional committees are complaining that many agencies have not generated reports showing how they spent Y2K repair budgets.
For disaster recovery managers finding themselves in organizations behind on Y2K, a viable, proven, last-minute Y2K fix is a necessity. And there's the rub. Most Y2K industry analysts say it's too late to remediate applications. This is because conventional Y2K windowing tools, based on glossary search engines and some form of date expansion, are slow, labor-intensive and inaccurate. And if you've worked directly with one, you also know they create a systems management nightmare.
We know of one senior IT manager who was directed to remediate his company's 2,700 legacy applications because implementation of new, Y2K-compliant ERP systems was running behind schedule. He reviewed all of the major Y2K remediation tools and came to a succinct conclusion: "They don't work." Why? They leave far too much work to be done manually. "We ran some code through one tool, and it produced a printout this high," he said, holding his hand at his chest.
Furthermore, he said, they are inaccurate. Their search engines require the user to seed the glossary with the date variable names known to be in the program. There's a flaw in the premise of this search technique, he said. The tool will only identify the date variables the Y2K project workers are already aware of. Problem is, programmers can call date variables anything they wish, without reference to any established variable naming standards. All too often, the variable name falls outside the scope of a glossary seed search. With an almost infinite variety of programmer-defined variable names available, it's no wonder that date variable names frequently elude Y2K tool search engines.
This is why most search engines identify only 80 to 95 percent of date-sensitive code. This means that if 50,000 lines of a million-line application are date-sensitive, between 2,500 and 10,000 lines (5 to 20 percent) requiring fixes will be left unconverted.
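The weakness described above can be seen in a minimal sketch. The code below simulates a seeded-glossary search against two COBOL-style lines; the glossary names and the ad-hoc variable name "XDT7" are invented for illustration, not taken from any real tool.

```python
# Hypothetical glossary of the date variable names the project team
# already knows about (the "seed").
glossary = ["BIRTH-DATE", "HIRE-DATE", "DUE-DATE"]

source = """
    SUBTRACT BIRTH-DATE FROM CURR-YY GIVING AGE.
    SUBTRACT XDT7 FROM CURR-YY GIVING TENURE.
"""

# Seeded-glossary search: a line is flagged only if it mentions a
# known name.
flagged = [line for line in source.splitlines()
           if any(name in line for name in glossary)]

# The line using the programmer's ad-hoc name "XDT7" is never flagged,
# even though it performs exactly the same date subtraction.
```

Both lines do identical date math, but only the one using a glossary name is caught, which is the 5-to-20-percent gap in miniature.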
Said the IT manager, "It's like designing a test car and modifying it after you have an accident caused by faulty design. You won't know you've missed a date variable until after your program has crashed or spit out bad data. Testing won't reveal the dates you've missed because you're only forward-dating the variables you've already found, and those variables have been corrected and will test fine. It's the variables you missed that cause problems."
Likewise, the conventional approach for automated Y2K conversion, called windowing, is also flawed. Windowing handles two-digit years by temporarily prepending a "19" or "20" to them in the code. The technique requires that every date variable be changed in every module where it is used, causing a system-wide "ripple effect" that demands complex systems management to implement. The end result: windowing requires changing 6 to 10 percent of all application code.
Furthermore, with windowing, Y2K testing cannot commence until all of an application's modules have been changed, thus delaying the test phase of conversion projects. This also requires extensive testing of the complicated interaction between modules and between applications. Together, these are prohibitive burdens for organizations remediating millions of lines of code, thousands of modules and hundreds of applications, with only a few months left before the end of the year.
The above-mentioned IT manager found another way to remediate his code. In a matter of eight weeks, he led a project team that rid all 2,700 legacy applications of the Y2K bug. How many workers on the Y2K team? Two.
He did it by finding another remediation technique, the Millennium Solution from Data Integrity. The Millennium Solution takes an approach that simplifies the Y2K problem, eliminates manual intervention in the process, requires conversion of less than half of 1 percent of the code, and remediates more than 100,000 lines of code per day.
The key to this approach is the way it looks at legacy code for Y2K problems. Conventional tools start from the premise that the Y2K problem is a programming problem. In the 1960s, 1970s and 1980s, programmers used only two digits to designate years, thus causing the Y2K problem when two-digit years in a program are on either side of the millennial shift (see example below). So now, according to conventional wisdom, those years need to be expanded to three or four digits to account for the new century.
The Millennium Solution approach, on the other hand, looks at Y2K as a math problem. From this perspective, the premise for identifying Y2K-sensitive code is fundamentally altered. Instead of searching among an infinite variety of programmer-defined variable names, the last-minute technique looks only for computer language commands used to calculate the math that causes Y2K problems. These Y2K-critical math operations are the subtraction or comparison (a.k.a. logical compares) of dates. In essence, this search technique looks for math and logic operations (such as subtract: "-", compare greater than: ">", compare less than: "<", and sorts), which, unlike date variable names, cannot be altered at the whim of a programmer writing code. This is a finite, highly exacting and virtually foolproof basis for Y2K searching.
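An operator-driven scan of this kind can be sketched as follows. The token patterns below are illustrative stand-ins, not Data Integrity's actual rule set; the point is that the scanner keys on a finite set of operations rather than an open-ended set of names.

```python
import re

# Flag lines containing the math/compare operations that can produce
# bad cross-century results: subtraction, and IF comparisons using
# "<" or ">". The pattern is a simplified, hypothetical example.
Y2K_CRITICAL = re.compile(r"\bSUBTRACT\b|\bIF\b.*[<>]", re.IGNORECASE)

source_lines = [
    "SUBTRACT WS-YY FROM CURR-YY GIVING AGE.",
    "MOVE CUST-NAME TO PRINT-LINE.",
    "IF EXP-YY > CURR-YY PERFORM RENEW.",
]

hits = [line for line in source_lines if Y2K_CRITICAL.search(line)]
# Only the subtraction and the comparison are flagged for review;
# the MOVE statement is ignored regardless of what its variables
# are named.
```

Because the search targets language keywords and operators, a programmer's choice of variable name cannot hide a date calculation from it.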
Likewise for the Y2K fix, this technique is simple and straightforward, using a math algorithm that corrects the bad math produced when computer operations involve dates on either side of the millennial shift. In fact, this is the only Y2K solution that allows legacy systems to remain in the two-digit-year world. In so doing, it avoids the major systems management complexities and system testing difficulties that cause Y2K compliance delays.
Understanding this fix technique is easy. Keep in mind that most legacy computer systems express years in two digits. Thus, the computer "sees" the year 1938 as "38," and the year 2000 as "00." Using the right date routine, which is an algorithm that correctly calculates years, the computer can be "trained" to see "00" as the year 2000.
Here's a typical mathematical problem that causes the Y2K bug: the computer is determining the age of people in its database. Assume the current year is 2000, and a person in the system was born in 1938. The computer will perform the following:
"SUBTRACT 38 FROM 00", and arrive at an answer of -38
Negative 38 is, of course, the wrong answer. Here's where the Millennium Solution fix steps in. That incorrect answer is sent to the application's fix date routine. The first thing the date routine does is add 50 to the original incorrect answer. Thus:
"ADD 50 TO -38" and computes an answer of 12
Then, the fix date routine does the same thing again. Thus:
"ADD 50 TO 12" and computes the correct age: 62
One might ask why not simply add 100 to the original incorrect answer of -38? The reason is that the arithmetic is performed in two-digit fields, which cannot hold the three-digit value 100. Adding 50 twice accomplishes the same correction while staying within the two-digit constraint.
For date calculations in which both dates are in the same century (current year and birth-year, for example, are both in the 20th century), this fix technique is benign.
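The worked example above can be condensed into a single routine. This is a minimal sketch, assuming the key property of two-digit fields: results wrap around at 100 (i.e., the arithmetic is modulo 100), which is why adding 50 twice corrects cross-century results while leaving same-century results untouched.

```python
def fix_date_routine(raw_diff):
    """Correct the result of a two-digit year subtraction.

    Adding 50 twice is equivalent to adding 100, and a two-digit
    field reduces the result modulo 100 -- so a wrong cross-century
    answer is corrected, while a correct same-century answer passes
    through unchanged.
    """
    return (raw_diff + 50 + 50) % 100

# Cross-century: 00 - 38 gives -38; the routine yields the true age, 62.
# Same-century:  99 - 38 gives 61; the routine leaves it at 61.
```

This is why the article can call the fix "benign" for same-century calculations: adding 100 in modulo-100 arithmetic changes nothing.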
Once each Y2K-sensitive math operation has been identified, the tool automatically inserts a call to the fix date routine, which processes the affected calculation and produces the correct output. The date variable itself is never changed; the system stays in the two-digit world; systems management complexity is reduced and the Y2K problem is radically simplified.
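The automated insertion step might look something like the sketch below. The routine name "FIXDATE" and the pattern covering only a simple SUBTRACT...GIVING form are hypothetical illustrations, not the product's actual transformation.

```python
import re

# Hypothetical call template; "FIXDATE" is an invented routine name.
FIX_CALL = '    CALL "FIXDATE" USING {target}.'

def insert_fix_calls(lines):
    """After each date subtraction, append a call to the fix routine.

    The original statement and its variables are left untouched;
    only a new line is inserted after it.
    """
    out = []
    for line in lines:
        out.append(line)
        match = re.search(
            r"SUBTRACT\s+\S+\s+FROM\s+\S+\s+GIVING\s+(\S+)\.", line)
        if match:
            out.append(FIX_CALL.format(target=match.group(1)))
    return out

converted = insert_fix_calls(
    ["SUBTRACT WS-YY FROM CURR-YY GIVING WS-AGE."])
# converted now holds the original statement plus an inserted
# CALL "FIXDATE" USING WS-AGE. line.
```

Note that the transformation only adds lines; it never rewrites the date variables themselves, which is what keeps the change footprint below half of 1 percent of the code.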
Because date variables in source code remain unchanged, this technique leaves program logic, data definitions, and current and archived data and databases untouched. Nor does it require bridges or wrappers. Windowing tools require all of these measures, and they bring on extensive systems management and testing problems. The result: significantly fewer lines of code are changed (less than half of 1 percent), conversion is completed in short time frames, and testing, the most expensive and time-consuming aspect of Y2K remediation, is simpler, faster and less expensive.
Finally, this technique is easily integrated into ongoing Y2K projects because it coexists with code already converted by other tools. For organizations mired in a behind-schedule project using a conventional Y2K tool, the introduction of this technique will not cause disruptions, only end the Y2K pain sooner.
Allen G. Burgess is President of Data Integrity, Waltham, Mass. Millennium Solution has been used extensively by NationsBank, Credit Suisse First Boston, the U.S. Department of the Interior and U.S. Healthcare, among other major organizations.