A TPF Testing Approach
by Frank Fannon

The ARROW Reservation System at AMTRAK functions in an SNA-TPF environment rather than an ALC environment. The system is used by Sales Agents and the myriad Travel Agencies spread across the country. It quickly becomes evident that an inadequately tested program modification can have deleterious consequences for the entire network, to the point of actually crashing the system.

We began a measurements program that evaluated the number of programs loaded in relation to the number of abends or program fallbacks. We also began to produce a "dumps" report for the test system as a by-product of user acceptance testing. Historically, the dumps report had been produced only by special request, since it required some human intervention.

At AMTRAK, the members of the user community most directly involved with the change being implemented actually perform system acceptance testing.

Because of our environment, many of the available testing tools were inappropriate for our use. The Quality Assurance group became aware of a software testing tool that had been purchased previously: the Testing/Recording and Playback System, or TRAPS. It was developed by TravTech, a subsidiary of Travelers Insurance. The product has since been acquired by Computer Associates.

TRAPS is a PC-based software tool designed to test host-based applications. The tool will "record" the keystrokes entered by the user; at the same time, it records the responses returned to the screen. A knowledgeable user will organize the entries in such a way that a series of entries can be saved as a "test case". Any number of these test cases can be created and saved. When testing is required, the user can select the test case names and enter them into a "Play Control File". This file is then "played" via the tool. All of the entries interact with the mainframe system as though a human were keying them. At the same time, TRAPS is also recording the responses made by the system. At the end of the test execution, the tester can review a "Mismatch File" created when the new responses are compared with the original responses. It then becomes the tester's job to evaluate each discrepancy and determine whether it represents a real problem. If certain types of fields change frequently, such as dates, a "masking" facility is available so that the number of mismatches is minimized.
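To make the record-and-playback cycle concrete, here is a minimal sketch, in Python, of the comparison logic just described. It is purely illustrative; TRAPS is a PC product, and none of the names, response formats, or mask patterns below come from the tool itself.

    # Illustrative sketch only -- not TRAPS code; all names are hypothetical.
    import re

    # A "test case" pairs each recorded entry with the response captured originally.
    test_case = [
        ("AVAIL WAS NYP 15MAY", "TRAIN 174 DEP 0905 ARR 1230 SEATS AVAILABLE"),
        ("SELL 174Y1",          "SOLD 1 SEAT TRAIN 174 CONF 15MAY"),
    ]

    # "Masks": patterns for volatile fields (dates here) that should not
    # count as mismatches when the playback is compared to the recording.
    masks = [re.compile(r"\d{1,2}(JAN|FEB|MAR|APR|MAY|JUN|JUL|AUG|SEP|OCT|NOV|DEC)")]

    def apply_masks(text):
        # Replace masked fields with a fixed token before comparing.
        for pattern in masks:
            text = pattern.sub("<MASKED>", text)
        return text

    def play(test_case, send_to_host):
        # Replay each recorded entry against the host and collect any
        # responses that differ from the recording after masking.
        mismatch_file = []
        for entry, recorded_response in test_case:
            new_response = send_to_host(entry)  # stands in for the host dialog
            if apply_masks(new_response) != apply_masks(recorded_response):
                mismatch_file.append((entry, recorded_response, new_response))
        return mismatch_file

A tester reviewing whatever play() leaves behind is doing exactly the evaluative work described above for the Mismatch File.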

One member of the Quality Assurance staff was assigned to learn the tool and prepare class materials. The original plan was to teach the use of the tool to the TPF programming staff and the primary user testers, so that much of the tedium of entering repetitive commands could be relieved.

As the learning process continued within the Quality Assurance group, several "practice" training sessions were held. These sessions proved to be very valuable in identifying several aspects of implementing this tool that we had not originally considered.

We decided that in addition to the software manual provided by the vendor, we needed our own "TRAPS User Guide". This document was tailored to the specific needs of the TPF group. Additionally, we concluded that it would be propitious to develop and hold a number of "Pre-Class Meetings". These meetings were held approximately 2 weeks prior to the actual training. The two-fold purpose was to provide a high-level explanation of what TRAPS was all about, and to discuss the concept of Test Cases and Play Control Files, including the development of naming conventions.
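As an illustration of the kind of convention discussed in those meetings (the scheme below is hypothetical, not AMTRAK's actual standard), test case names might encode the application area, the function exercised, and a sequence number, with a Play Control File simply stringing the names together:

    Test cases:
        RES-AVAIL-01   Availability display for a basic city pair
        RES-SELL-01    Sell one seat from the availability display
        RES-CANX-01    Cancel the reservation created by RES-SELL-01

    Play Control File RES-SMOKE-01:
        RES-AVAIL-01
        RES-SELL-01
        RES-CANX-01

Whatever the particulars, agreeing on such a scheme before training begins spares everyone from renaming test cases later.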

In order to use any testing tool most effectively, one must both understand the nature of the data being tested and be aware of what results are desired. It is a time-consuming task to pre-plan test cases and organize them into cogent Play Control Files. If the work is well thought out and executed properly, it will, for the most part, be a one-time job. As time goes by, there will of necessity be modifications to some of the test cases simply because of application changes, but many of those interactions or keystrokes will never have to be entered again.

Bonnie Wheet, from the Quality Assurance group, held a series of Pre-Class Meetings and conducted the actual training. The training sessions went rather well, or so we thought. At this point came the realization that the programming staff would test only up to a certain point, with the bulk of the testing then being turned over to the user testers. We have since provided training to most of the user testers and received positive feedback.

We have now completed a six-month evaluation period designed so that we could consider any adjustments to our implementation strategy. Quality Assurance regularly participated in the weekly user testing so that questions or problems could be addressed quickly. We have not identified any significant issues to be resolved. Keeping a hands-on approach during the six-month period undoubtedly aided in smoothing the way.

Depending upon the nature of the testing, this tool can return many person-hours to either the programming staff or the user department. In one recent instance, it took approximately 2 person-hours to establish the proper environment within the TPF test system in order to test major system modifications. The entries were tedious and required the user's constant attention, and these 2 hours would have been spent every time the test was conducted. TRAPS proved to be perfect for the task: once the test run was initiated, it took only 5-6 minutes, returning nearly 2 person-hours of productive time on every run. It is also possible to divide the test cases among several PCs, thereby reducing the overall span of time by roughly a factor of the number of PCs being used.
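The arithmetic behind that last point is simple. A hypothetical sketch (again in Python, with invented case names and timings) shows how dealing the cases out across PCs divides the elapsed time:

    # Hypothetical illustration: dividing one Play Control File across PCs.
    def split_round_robin(test_cases, n_pcs):
        # Deal the test cases out to n_pcs machines, one at a time.
        batches = [[] for _ in range(n_pcs)]
        for i, case in enumerate(test_cases):
            batches[i % n_pcs].append(case)
        return batches

    # 90 cases at 2 minutes apiece: one PC needs 180 minutes of elapsed
    # time; three PCs each need about 60, a three-fold reduction in span.
    cases = [f"CASE{i:03d}" for i in range(90)]
    for pc, batch in enumerate(split_round_robin(cases, 3), start=1):
        print(f"PC {pc}: {len(batch)} cases, about {len(batch) * 2} minutes")

The total machine time is unchanged, of course; only the elapsed span shrinks.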

The word is getting around about the effectiveness of TRAPS. We have had several requests from the user community at large for training in its use. These requests are coming not from ARROW users, but from MVS Business System users. The user group from a major project is well underway in its creation of test cases and use of TRAPS, and has even written its own permanent testing procedures.

Obviously, this product is not the be-all, end-all of testing tools. But we feel TRAPS has earned its own place in AMTRAK's testing tool-box.

Francis W. Fannon is Manager of Quality Assurance, Technical Services for AMTRAK, and can be contacted at: The National Railroad Passenger Corporation, Information Systems Department, 400 North Capitol Street, Washington, D.C. 20001.