
Mainframe Testing

An Overview of Mainframe

Introduction to Mainframe

Mainframe is an industry term for a large computer. The name comes from the way the machine is built
up: all units (processing, communication, etc.) were hung into a frame. The main computer is thus built
into a frame, hence "Mainframe".
Because of the sheer development costs, mainframes are typically manufactured by large companies
such as IBM, Amdahl, and Hitachi. Banking and insurance businesses process enormous amounts of data,
typically (at least) millions of records, each day; this creates the need for a system that can process data
on that scale. The first mainframes cost at least a few hundred thousand dollars to build. Later types of the
60s and 70s required a few million dollars, and today, depending on the capacity you need, mainframes
range from two to three hundred thousand dollars to several tens of millions.

But what classifies a computer as a mainframe?

♦ A mainframe has 1 to 16 CPUs (modern machines have more)

♦ Memory ranges from 128 MB to over 8 GB of online RAM
♦ Processing power ranges from 80 to over 550 MIPS
♦ It often has separate cabinets for
  ♦ Storage
  ♦ I/O
♦ Separate processes (programs) for
  ♦ task management
  ♦ program management
  ♦ job management
  ♦ serialization
  ♦ catalogs
  ♦ inter-address-space communication

Operating System

MVS (Multiple Virtual Storage) was the most commonly used operating system on the System/370 and
System/390 IBM mainframe computers.
It is unrelated to IBM's other mainframe operating system called VM/CMS.

First released in 1974, MVS was later renamed by IBM, first to MVS/XA (eXtended Architecture), next to
MVS/ESA (Enterprise Systems Architecture), then to OS/390 when UNIX System Services (USS) were
added, and finally to z/OS when 64-bit support was added on the zSeries models.
Its core remains fundamentally the same operating system. By design, programs written for MVS can still
run on z/OS without modification. MVS originally supported 24-bit addressing. As the underlying hardware
progressed it supported 31-bit (XA and ESA) and now (as z/OS) 64-bit addressing.

The main interfaces to MVS are Job Control Language (JCL), the batch processing interface, and TSO
(Time Sharing Option), the interactive time-sharing interface.
ISPF is an interface which allows the user to accomplish the same tasks as TSO, but in a menu- and
form-oriented manner.


JCL (Job Control Language)

JCL is a means of communication between a program (which can be written in COBOL, Assembler, or
PL/I) and the MVS operating system. It is a scripting language used on IBM mainframe operating systems
to instruct the Job Entry Subsystem (that is, JES2 or JES3) on how to run a batch program or start a
subsystem.
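As a sketch, a batch job that runs a COBOL program might look like the following; the job name, the program name (PAYCALC), and the dataset names are hypothetical and would follow the project's naming conventions.

```jcl
//TESTJOB1 JOB (ACCT),'BATCH TEST',CLASS=A,MSGCLASS=X
//* Execute a (hypothetical) COBOL batch program PAYCALC
//STEP01   EXEC PGM=PAYCALC
//STEPLIB  DD DSN=TEST.CHANGED.LOADLIB,DISP=SHR
//INFILE   DD DSN=TEST.INPUT.FILE,DISP=SHR
//OUTFILE  DD DSN=TEST.OUTPUT.FILE,
//            DISP=(NEW,CATLG,DELETE),
//            SPACE=(TRK,(5,5),RLSE),
//            RECFM=FB,LRECL=80
//SYSOUT   DD SYSOUT=*
```

The STEPLIB DD points the job at a specific load library, which matters later when running changed-version and baseline versions of the same program.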

COBOL (Common Business Oriented Language)

COBOL is a high-level programming language first developed by the CODASYL Committee (Conference
on Data Systems Languages) in 1960. Since then, responsibility for developing new COBOL standards
has been assumed by the American National Standards Institute (ANSI).
The word COBOL is an acronym that stands for Common Business Oriented Language.
COBOL is designed for developing business applications, which are typically file-oriented.
It is not designed for writing systems programs; for instance, you would not develop an operating system
or a compiler using COBOL.
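For illustration, this is the overall shape of a minimal COBOL program; the division structure is standard, while the program name and data item are arbitrary:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. SAMPLE01.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * A simple working-storage field (name and value are arbitrary)
       01 WS-GREETING      PIC X(20) VALUE 'HELLO FROM COBOL'.
       PROCEDURE DIVISION.
           DISPLAY WS-GREETING.
           STOP RUN.
```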

VSAM (Virtual Storage Access Method)

Virtual Storage Access Method (VSAM) is an IBM disk file storage scheme used throughout
the Multiple Virtual Storage (MVS) architecture. Files can be read sequentially as well as randomly. The
record-oriented file system of VSAM comprises four access methods.

♦ Key Sequenced Data Set (KSDS),

♦ Relative Record Data Set (RRDS),
♦ Entry Sequenced Data Set (ESDS) and
♦ Linear Data Set (LDS).

VSAM records can be of fixed or variable length. They are organized in fixed-size blocks called Control
Intervals (CIs), and then into larger divisions called Control Areas (CAs). Control Interval sizes are
measured in bytes — for example 4 kilobytes — while Control Area sizes are measured in disk tracks or
cylinders.
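As a sketch, a KSDS is defined with the IDCAMS utility; the dataset name, key length/offset, record sizes, and space values below are hypothetical:

```jcl
//DEFKSDS  JOB (ACCT),'DEFINE KSDS',CLASS=A,MSGCLASS=X
//STEP01   EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DEFINE CLUSTER (NAME(TEST.SAMPLE.KSDS)  -
         INDEXED                          -
         KEYS(10 0)                       -
         RECORDSIZE(80 200)               -
         CYLINDERS(1 1)                   -
         CONTROLINTERVALSIZE(4096))
/*
```

INDEXED yields a KSDS; NONINDEXED, NUMBERED, and LINEAR would give an ESDS, RRDS, and LDS respectively.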
DB2

DB2 is a highly successful relational database management system. It enables its users to create, update,
and control relational databases using Structured Query Language (SQL). SQL is used to access,
manipulate, or control access to relational databases.
Databases like DB2, Oracle, and SQL Server 2000 support SQL.
Designed to meet the needs of small and large businesses alike, DB2 is available on a number of
platforms, including DB2 on MVS.
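For example, data can be read or changed through SQL; the table and column names here are purely illustrative:

```sql
-- Hypothetical table and columns, for illustration only
SELECT POLICY_NO, PREMIUM_AMT
  FROM INS.POLICY
 WHERE POLICY_STATUS = 'A';

-- Raise the premium of all active policies by 5 percent
UPDATE INS.POLICY
   SET PREMIUM_AMT = PREMIUM_AMT * 1.05
 WHERE POLICY_STATUS = 'A';
```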

CICS (Customer Information Control System)

CICS is a transaction processing system; it allows a user to input data online. A transaction can be
thought of as a unit of work; usually it is a single program that performs an update or returns the result of
an inquiry. CICS appears to the user to be a separate environment, but it is actually a job that runs under
the main operating system (OS/390, z/OS, etc.).

CICS has a number of programs that handle resources, such as storage. There are various CICS tables
which define files, programs, and transactions to the CICS system.

When developing programs, you can access files, storage, other programs, and resources using the
application programming interface (API). The API lets you issue CICS commands from within the
program. This is known as Command Level Programming.

A transaction is started in CICS by entering a 4-character transaction ID. Entering the transaction ID
starts the program associated with it.
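A sketch of command-level programming in COBOL; the map, mapset, and data area names are hypothetical:

```cobol
      * Receive input from a (hypothetical) BMS map, then send a reply
           EXEC CICS RECEIVE MAP('ACCTMAP') MAPSET('ACCTSET')
                     INTO(ACCT-INPUT-AREA)
           END-EXEC.
           EXEC CICS SEND MAP('ACCTMAP') MAPSET('ACCTSET')
                     FROM(ACCT-OUTPUT-AREA) ERASE
           END-EXEC.
      * Return control to CICS at the end of the transaction
           EXEC CICS RETURN
           END-EXEC.
```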

Environment setup
Databases: DB2 & VSAM
Online Systems: CICS
Languages: COBOL 390 & JCL.
O/S: MVS/ESA, Z/OS, OS/390
Tools and Utilities: File-AID, Platinum, File-AID for DB2, DB2 Move for Princeton

Team structure
The team structure is hierarchical: an onsite testing team coordinator works with two offshore test teams,
each consisting of several team members.

Work assigning methods

Once the development team finishes coding, they deliver the program to the onsite development
coordinator, who performs the final process of promoting and freezing the package. Once the package is
frozen, the program is ready to be delivered for testing, and the onsite development coordinator sends a
delivery note to the onsite testing coordinator.

Delivery note should contain

♦ Program to be tested
♦ Description of the program
♦ JCL location for the particular program
♦ The package in which the changed version of this program's load module is stored
♦ The package in which the changed version of this program's source is stored
♦ Master spec location
♦ Changed program spec location
♦ Unit Test Plan location

The onsite testing coordinator assigns the program to the offshore testing team leads. Based on the
complexity, the offshore testing team leads assign the program to their team members. Once an offshore
testing team member gets the delivery note, he should follow one of the processes below to get the
functional requirement (FR).
Methods to get the functional requirement

♦ Some projects have their own website to store all the walkthrough documents, signed-off FRs,
and some high-level business documents.
♦ Some projects use a common drive with a shared folder for the particular project, storing all the
walkthrough documents, signed-off FRs, master specs and change specs, and UTPs (unit test
plans) for different programs in different subfolders.

♦ In some projects, they usually send their FR documents through email.

♦ Some projects follow the process of storing the FR in Quality Center.

After getting the FR, there are five phases in testing:

♦ Analysis phase
♦ Test plan preparation
♦ Test data setup
♦ Test case execution
♦ Defect tracking

Analysis phase

This phase requires the FR, the master spec, and the changed program spec of the particular program to
be tested. The spec locations can easily be found in the delivery note.

First, the tester should understand the entire program by reading the master spec. Then he should go
through the FR. For a better understanding of the FR, the changed program spec can also be used, but it
is not advisable to rely on the changed program spec alone.

Test Plan preparation

♦ Create test cases based on the business requirement(s).

♦ Do not write test cases based on the COBOL code.
♦ Include a sufficient number of test cases to verify error/exception handling.
♦ Try to avoid using too much technical language when describing your test cases. As far as
possible, use business terminology.
♦ In the case of shared programs, create test cases for invoking the program through online as well
as batch.
♦ In the case of programs which feed load jobs, create test cases for running the load job using the
feed generated by the program being tested, to verify that it works as expected.
♦ Create a sufficient number of test cases for regression testing. This will ensure that existing logic is
unaffected by the changes.

Test plan should contain

♦ Introduction of the program

♦ Description of the program
♦ Features to be tested
♦ Features not to be tested
♦ Pre processing and Post processing
♦ Test start date
♦ Test end date
♦ Testing done by
♦ Type of testing needed (online or batch)
♦ Test cases
 Changed version test case.
 Regression test case.
♦ Test log (to be filled in after testing is completed)
 JCL location
 Job log for changed version program
 Input files used for changed version program.
 Output files of changed version program.

 Job log for production version program

 Input files used for production version program.
 Output files of production version program.
 Screen shots to be recorded in a word document for online testing.
 Capture the query result in a PS file for database testing.

Test data setup

As far as possible, use the relevant online transactions to create the test data.
Avoid making changes to data externally, as this may create inconsistencies in the database.
If the program being tested updates one or more tables, create an extract of the tables using DB2
Princeton Move. This can be used to reload the data and re-run the test cases if required. The extract
files can also be reused for future testing.

Batch test data setup

In test data setup for a batch program, we must follow the pre-processing steps to generate the input file.
After generating the input file, use File-AID to locate and modify the values of the impacted fields.

Database test data setup

Test data setup for database testing is achieved by modifying the particular field values in the respective
tables.
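For instance, a test condition might be forced by updating a field directly; the table, column, and key value here are hypothetical:

```sql
-- Hypothetical: force an expired-policy condition for one test record
UPDATE INS.POLICY
   SET EXPIRY_DATE = '2009-12-31'
 WHERE POLICY_NO = 'TEST000001';
```

If the table is extracted beforehand (e.g. with DB2 Princeton Move, as described above), such changes can be rolled back by reloading the extract.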

Online data setup

There is no separate data setup for online.

Test Case Execution

Pre processing

Pre-processing is the process of executing the program that precedes the affected program.
In detail: in a stream of programs, we need to generate the input for the affected program by executing
the previous program and using its output as the input for the affected program.

Post processing

Post-processing is the process of executing the program that follows the affected program.
In detail: the output of the affected program is fed as input to the next program in the particular stream,
so as to verify that the stream is unaffected by the changes done in the affected program.
Once the pre-processing is over, we can start testing by executing the test cases through JCL.
There are different types of testing:

♦ Online testing.
♦ Batch testing
♦ Database testing

Batch testing

Batch testing is nothing but executing a JCL for a particular affected program, with an input file
containing our test data, and getting the expected results in the output file.

Database testing

Database testing is nothing but executing a program with an input file, where that file uploads the data to
the tables in the database. Alternatively, it is executing some query in the COBOL program, extracting
the expected output from the tables, and writing it to an output PS file.

Online testing

Online testing is nothing but entering values through the online screens and checking through the
database whether the values given in the online screens are entered in the appropriate tables and the
correct fields.

Batch + online + database testing

This is executing a JCL for a particular affected program with an input PS file containing our test data,
getting the expected results in the output file, and uploading the output to the database. When retrieving
the data through the online screen, the data uploaded through the batch-process input PS file should be
retrieved.

Regression testing

Regression testing will ensure that existing logic is unaffected by the changes done. In detail, we have to
include all the test cases from the previous release of this particular program and execute the same test
cases again.

Best practice for batch regression testing

Create an input file and do the test data setup for the particular enhancement, then run with the changed
version load library. Take a log of that particular output file.

Use the same input file as earlier and run with the baseline load library. Take a log of that particular
output file.

Now compare the two output files with the help of the ISPF 3.13 option or with the help of File-AID.

Comparing through file aid

Open two File-AID sessions in browse mode, for the changed version output file and the baseline version
output file respectively.
Compare each field of the changed version output file with the baseline version output file.
Only the impacted or newly introduced fields should change. All the other fields should remain the same
and show the same values, because we ran with the same input file.

Comparing through 3.13 option

Compare the changed version output file and the baseline version output file with the help of the ISPF
3.13 (SuperCE) option; it should give a result showing the impacted or mismatched fields.
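The same comparison can also be run in batch with SuperC (program ISRSUPC); the job name and dataset names below are hypothetical:

```jcl
//COMPARE  JOB (ACCT),'SUPERC COMPARE',CLASS=A,MSGCLASS=X
//* Compare the changed-version output against the baseline output
//STEP01   EXEC PGM=ISRSUPC,PARM=(DELTAL,LINECMP)
//NEWDD    DD DSN=TEST.OUTPUT.CHANGED,DISP=SHR
//OLDDD    DD DSN=TEST.OUTPUT.BASELINE,DISP=SHR
//OUTDD    DD SYSOUT=*
```

The delta listing written to OUTDD shows only the lines that differ, which can be saved as part of the test log.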

Defect Tracking:

During the testing phase, if any bug arises, the team member should immediately stop testing and raise a
tester's question to the concerned developer through mail. The developer should respond with a
justification for the question raised by the tester. If the tester is not satisfied with the answer given by the
developer, and he finds that there is a mismatch between the FR and the output of the particular
program, then he should raise a defect in QC and inform his respective team lead of the defect ID.

The offshore team lead should send a mail to the onsite team coordinator. Once the defect status
becomes "ready to retest", the tester should start testing the program again from the initial stage.

When raising a defect, the tester should take care of the following:

♦ Priority
♦ Severity
♦ To whom the defect is being assigned
♦ Which persons need to be included in the CC

Before sending the sign-off note, the defects raised should have the status "closed" in QC.
Prepare the sign-off note and mail it to all respective leads, stating the path where the test plan is stored
in QC.

Documenting the Test Results

♦ Record the JCL used for testing.

♦ Record all the input and output dataset names in your test plan for each test case.
♦ Mention any Move Extract datasets created/used during the testing.
♦ Copy the job logs to a separate PDS using XDC and mention the PDS name in the test results.
♦ For DB2 programs which update tables, show the before and after views of the data using SPUFI.
♦ For CICS programs, create a document containing the screens captured during testing.

Sign-off note

Preparing a sign-off note is the final process of testing. After the sign-off note is sent to the onsite
coordinator, the program is moved to production. While creating a sign-off note, we must take care of
the following things.

Sign off Note Contains

♦ Program name (name of the program for which the testing is completed)

♦ Release number
♦ Number of defects found during testing
♦ Whether the defects are fixed and closed (whether all the defects are fixed, the status in QC is
updated as "Closed", and the defect IDs are mentioned)
♦ Any deferred defects (mention the defect IDs which were not fixed for this release)
♦ Number of defects fixed during UAT
♦ Whether the UAT sign-off note is received
♦ Whether the test plan is stored in QC
♦ Whether your time is logged in the time tracker under this project
♦ Comments

General tips for effective testing

♦ We should make a copy of the JCL from the baseline or from the testing team repository to our
own PS or PDS, and only then modify the JCL.
♦ While modifying the JCL, we should take care of the job card parameters, DD parameters, PARM
parameters, and cycle name.

♦ Make sure all printed outputs in your JCL are redirected to SYSOUT. This will avoid large
amounts of printouts being generated at onsite.
♦ There must be a naming convention for each project, and we should follow it when assigning
names for PS or PDS datasets and when assigning cycle names.
♦ Avoid using Expeditor for testing unless absolutely necessary. Expeditor is a debugging tool, not
a testing tool.
♦ Expeditor should only be used for testing error/exception situations that cannot be easily
created.
♦ Separate regions need to be allocated before testing.
♦ Data present in the test region database must be a mirror of production data.
♦ We need to run with the appropriate load library to reflect the particular change; i.e., for a
particular enhancement we need to take the particular load library, so that the new changes done
in that load library will be reflected in the output.