
MPS - MRG Integration
Master Test Plan

Version: 3.0
Date: 11/18/2004

Revision History

Date        Version  Description                                                                  Author
08/02/2004  1.0      Initial draft of MRG system test plan for LDS Quality Assurance department   Y. Karob
11/10/2004  2.0      Updated based upon internal review                                           W. Savage
11/18/2004  3.0      Updated Test Coverage section                                                W. Savage

Confidential

Lydian Data Services, 2013

Page 2


Table of Contents
1. Introduction 5
1.1 Purpose 5
1.2 Terminology and Acronyms 5
1.3 Background 5
1.4 Scope 5
1.5 Intended Audience 6
1.6 Referenced Material 6
2. Outline of Planned Tests 6
2.1 Outline of Test Inclusions 6
2.2 Outline of Other Candidates for Potential Inclusion 6
2.3 Outline of Test Exclusions 7
3. Test Approach 7
3.1 Testing Data and Data Strategy 8
3.2 Testing Techniques and Types 8
3.2.1 Function Testing 8
3.2.2 User Interface Testing 8
3.2.3 Load Testing 9
4. Entry and Exit Criteria 9
4.1 Entry Criteria 9
4.2 Exit Criteria 10
4.3 Suspension and Resumption Criteria 10
5. Deliverables 10
5.1 Reporting on Test Coverage 10
5.2 Reporting on Issues Encountered/Reported 10
5.3 Automated Test Scripts 11
5.4 Detailed Testing Results 11
5.5 Summary Report 11
6. Environmental Needs 11
6.1 Base System Hardware 11
6.2 Base Software Elements in the Test Environment 11
6.3 Productivity and Support Tools 12
7. Responsibilities, Staffing, and Training Needs 12
7.1 Human Resources and Roles 12
8. Risks, Assumptions, and Constraints 13
9. Management Process and Procedures 14
9.1 Problem Reporting, Escalation, and Issue Resolution 14
9.2 Test Plan Approval and Signoff 15
APPENDIX A Testing Type Definitions 15


APPENDIX B Coverage Summary 17


1. Introduction

1.1 Purpose
The purpose of this document is to gather all of the information necessary to coordinate the test effort. It describes the approach to testing the software and is designed to provide a high-level overview of the testing effort. This document supports the following objectives:
- Identifies the scope of the testing effort
- Outlines the testing approach that will be used
- Lists the deliverable elements of the test project
- Identifies the required resources and provides an estimate of the test effort

1.2 Terminology and Acronyms
The following list contains terms and acronyms used within this document. Testing type definitions are contained within Appendix A.
- MPS: Mortgage Processing System
- MRG: Middleburg, Riddle, Gianna
- LDS: Lydian Data Services
- PMO: Project Management Office
- AUT: Application Under Test

1.3 Background
MPS-MRG Integration is envisioned to replace the existing process by which closing documentation is established; the current process involves usage of the Miracle application. Integration shall reduce the double entry of information currently required to produce closing documentation. The introduction of the MRG components shall not affect the existing functionality of MPS.

1.4 Scope
This section addresses the scope of the test phase to be performed by the LDS Quality Assurance department. All testing outlined below pertains to the System Testing phase, including System Level Integration Testing, of the AUT. The following testing types are considered in scope for the testing phase:
- Functional
- User Interface
- Load
The placement in scope of the following testing types has yet to be determined. Items listed below are available testing candidates; however, the addition of these items will require a re-estimate of the test effort:
- Data and Database Integrity
- Business Cycle
- Stress
- Volume
- Performance Profiling
- Failover and Recovery
- Configuration

- Installation
- Security and Access Control (role based)

1.5 Intended Audience
This test plan is intended to provide guidance to the Development Resources, Quality Assurance Management, Project Management, and Executive Sponsors of the application as to the scope and methodology utilized during the testing phases conducted by the LDS Quality Assurance department.

1.6 Referenced Material
The following list contains the name and location of referenced material:
- MPS User Guide for Post Closing (Draft) - StarTeam, Project Path: Development Services\CSFB Wholesale Operations\P26 MRG Integration
- MRG Integration Software Requirements Specification for MPS (Version 1.0) - StarTeam, Project Path: Development Services\CSFB Wholesale Operations\P26 MRG Integration
- MPS Use Case Specification: MRG Integration - StarTeam, Project Path: Development Services\CSFB Wholesale Operations\P26 MRG Integration
- MPS MRG Quality Assurance Master Test Plan - StarTeam, Project Path: Development Services\CSFB Wholesale Operations\P26 MRG Integration

2. Outline of Planned Tests

This section provides a high-level outline of both the tests that will be performed and those that will not. It is important to note that the outlined coverage execution is dependent upon the scheduled testing window being available; changes to the overall testing schedule may have a direct effect upon the execution of this test plan. A coverage summary is located within Appendix B.

2.1 Outline of Test Inclusions
The primary driver of the testing will be requirements-based use cases specific to the new features provided within MPS as a direct result of the MRG integration. In addition, general regression testing will be performed upon the MPS application to ensure that the behavior and operation of MPS were not adversely affected as a result of the integration.
- LDS MPS Use-Case Specification: MRG Integration
- MPS regression testing activities
- MPS-MRG limited-scope load testing
If it is deemed that the above-listed use/test cases do not provide sufficient coverage to address those areas considered in scope, additional use/test cases will be incorporated.

2.2 Outline of Other Candidates for Potential Inclusion
There are several areas in which testing may be desired; however, the benefit has yet to be quantified, or the responsible testing entity may reside outside of the Quality Assurance department. Those areas are outlined below. Items listed are available testing candidates; however, their addition will require a re-estimate of the test effort.
Project Specific Areas:
- MPS Messaging Inspection (Customer Portal): messaging inspection is currently addressed during the user acceptance testing (UAT) phase, which is performed outside of the Quality Assurance department
- MPS-MRG Generated Closing Documentation (sent via email) Review

General Testing Areas:
- Business Cycle
- Stress
- Volume
- Failover and Recovery
- Configuration
- Installation


2.3 Outline of Test Exclusions
Items slated for exclusion from the Quality Assurance department's testing are listed below. The listing does not contain those items that are candidates for potential inclusion.
- Unit Testing/Unit Integration Testing: envisioned to be the responsibility of Development; if desired, the scope will be adjusted to include the Quality Assurance department in this endeavor

3. Test Approach

It is the expectation of the Quality Assurance department that all use cases will be completed prior to the commencement of testing. Use cases are envisioned to address all requirements and functionality of the application that require testing; it is from these use cases that the testing scope and directives originate.

The use cases will be developed into automated test cases using the Mercury Interactive Quick Test Pro (QTP) application. The automated test cases will be exercised upon Quality Assurance client platforms within the LDS Quality Assurance laboratory, and will leverage established best practices and techniques for development. In the event the timeline of automation development exceeds the time available, use cases will be addressed by manual testing of the application; in such an event, manual logs will be kept, detailing testing steps and observations.

In addition to the use cases being exercised, MPS regression testing activities, consisting of both manual and automated test execution, will occur to ensure that MPS functionality was not adversely affected by the introduction of the MRG components.

Load testing shall be performed within a limited section of the application. Specifically targeted is the transaction time related to processing the closing documentation request, as reflected within the MPS user interface. Testing shall be structured to vary the workload with respect to time. More detailed information may be located within Appendix B, Coverage Summary.

Testing exposure is currently limited to system conditions as experienced via the graphical user interface (GUI) of the AUT. Actions that occur outside of the GUI are not currently in scope for the LDS Quality Assurance department for this testing endeavor; testing activity desired to occur outside of the GUI is expected to be requested in the form of a use case.

Testing shall be performed under controlled environmental conditions. Currently the testing is envisioned to occur within the LDS MPS staging environment. Code shall be promoted to the environment, and strict environmental control procedures will be adhered to in order to provide validity to the testing effort. In the event that control practices are not adhered to, testing may be halted (reference the suspension criteria).

Depending upon additional items added to the scope, as listed in Section 2.2, Outline of Other Candidates for Potential Inclusion, the testing approach may expand in response. Depending upon changes to the testing schedule and the allotted Quality Assurance testing window, testing coverage may be modified in response.



3.1 Testing Data and Data Strategy
Data contained within the testing environment (the LDS MPS staging environment) shall be a copy of production data. Unless specific data is required to exercise established use cases, data will be randomly selected and adjusted throughout the course of testing. Randomization of data conditions should provide an increased level of system coverage, as more combinations shall be exercised as compared to a static data path.

3.2 Testing Techniques and Types
The following provides a more detailed view of the testing techniques and types to be utilized based upon the current scope of testing. Areas listed as possibly in scope or out of scope will not be addressed within this section. The testing techniques listed below shall employ the tools outlined in the Productivity and Support Tools section of this document.

3.2.1 Function Testing
Function testing of the AUT should focus on requirements for test that can be traced directly to use cases or business functions and business rules. The goals of these tests are to verify proper data acceptance, processing, and retrieval, and the appropriate implementation of the business rules. This type of testing is based upon black-box techniques; that is, verifying the application and its internal processes by interacting with the application via the User Interface (UI) and analyzing the output or results. The following high-level summary outlines the testing type.
Technique Objective: Exercise AUT functionality, including navigation, data entry, processing, and retrieval, to observe and log target behavior.
Technique: Execute each use-case scenario's individual flows or functions and features, using valid and invalid data, to verify that:
- Expected results occur when valid data is used
- Appropriate error or warning messages are displayed when invalid data is used
- Each business rule is properly applied

Success Criteria: This technique supports the testing of the following:
- Key use-case scenarios
- Key features

Special Considerations: Functional testing is dependent upon the creation of use cases or published requirements documentation.

3.2.2 User Interface Testing
UI testing verifies a user's interaction with the software. The goal of UI testing is to ensure that the user is provided with the appropriate access and navigation through the functions of the AUT. In addition, UI testing ensures that the objects within the UI function as expected and conform to corporate or industry standards.
Technique Objective: Exercise the following to observe and log standards conformance and target behavior:
- Navigation through the AUT reflecting business functions and requirements, including window-to-window, field-to-field, and use of access methods (tab keys, mouse movements, accelerator keys)
- Window objects and characteristics, such as menus, size, position, state, and focus

Technique: Create or modify tests for each window to verify proper navigation and object states for each application window and object.
Success Criteria: The technique supports the testing of each major screen or window that will be used extensively by the end user.


Special Considerations: Testing will refer to published requirements documentation to interpret object properties and behavior; if not published, interaction with Project Management and Development may be required to determine correct application appearance and/or behavior.

3.2.3 Load Testing
Load testing is a performance test that subjects the AUT to varying workloads to measure and evaluate the performance behaviors and abilities of the AUT to continue to function properly under these different workloads. The goal of load testing is to determine and ensure that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates performance characteristics, such as response times, transaction rates, and other time-sensitive issues.
Technique Objective: Exercise designated transactions or business cases under varying workload conditions to observe and log target behavior and system performance data.
Technique: Develop transaction test scripts based upon functional or business transactions, structured to measure a given operation.
Success Criteria: The technique supports the testing of workload emulation: the successful emulation of the workload without any failures due to test implementation problems.
Special Considerations: Testing will require a dedicated and isolated environment in order to ensure validity.
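The workload-variation approach described above, varying the number of concurrent users over time and measuring the transaction time of a targeted operation, can be sketched in outline. This is an illustrative sketch only: the plan's actual load testing is driven by the Mercury Interactive toolset against the MPS staging environment, and the function names, workload figures, and simulated delay below are hypothetical placeholders.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def closing_doc_transaction():
    """Stand-in for the MPS closing-documentation request; a real
    harness would drive the AUT's user interface here."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated processing delay (hypothetical)
    return time.perf_counter() - start

def run_load_phase(concurrent_users, transactions_per_user):
    """Issue the transaction under a fixed workload and collect timings."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [pool.submit(closing_doc_transaction)
                   for _ in range(concurrent_users * transactions_per_user)]
        return [f.result() for f in futures]

# Vary the workload with respect to time: ramp from 1 to 8 virtual users.
for users in (1, 4, 8):
    timings = run_load_phase(users, transactions_per_user=5)
    avg = sum(timings) / len(timings)
    print(f"{users} users: avg transaction time {avg:.4f}s over {len(timings)} runs")
```

Comparing the average transaction time across the ramp phases is what reveals whether the targeted operation degrades as the workload grows.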

4. Entry and Exit Criteria

4.1 Entry Criteria
The following represent the entrance criteria for the testing engagement:
- Project established within TestDirector
- Project established within TrackIt
- Completion of requirements documentation
- Completion of application development
- Completion of development unit testing
  - Documentation of unit tests performed
  - Unit testing sign-off form completed

- Functional walk-through of the AUT provided by Development to Project Management and Quality Assurance
- Documentation of recommended client-side settings (browser configurations, Java applets, etc.)
- Testing environment provided with established configuration control
- Published roll schedule
  - Date and time of release to the Quality Assurance testing environment
  - Duration of the Quality Assurance testing window, when different than estimated/planned
  - Scheduled production release date


- Updated/current data dictionary
- Sign-off/approvals for the LDS Quality Assurance test plan


4.2 Exit Criteria
The completion of the following items, and the availability of corresponding documentation, represent the exit criteria for the testing engagement:
- System testing activities considered in scope completed
- Issues encountered reported within TestDirector
- Issues reported resolved to the satisfaction of the PMO
- Summary report created and submitted to the PMO

4.3 Suspension and Resumption Criteria
The following details the suspension and resumption criteria for the testing engagement. Testing may be suspended due to any one of the following:
- Lack of environmental control: in the event that environmental control is not enforced, testing may be halted to investigate the underlying cause. Testing may resume once management has determined that the environment is acceptable for continued testing.
- Critical defect: in the event a critical defect is located, testing may be halted to allow for code modification and defect resolution. Testing may resume once code modifications have been submitted and incorporated into the controlled environment.
- Other priorities: in the event other priorities are established by the business, testing may be halted. Testing may resume once the business determines the course.
- Infrastructure downtime: in the event that components (network, clients, servers, etc.) required for testing are unavailable, testing may be halted. Testing may resume once the infrastructure is re-established.
- Update of application: in the event updates are required to the application, testing may be halted while such an update occurs. Testing may resume once the update has been completed within a controlled environment. In the event that updates are to core components, re-testing of areas addressed prior to the update may be required to ensure coverage.
- Data needs: in the event that all available data is utilized during the course of testing, a database refresh/restore shall be sought. Testing shall resume once data issues are resolved.

5. Deliverables

The following lists those items to be delivered by the LDS Quality Assurance department during the course of, or at the completion of, the testing effort.

5.1 Reporting on Test Coverage
Test coverage shall be reported on a daily basis; the formula for calculating coverage will be based upon the total number of use cases executed as compared to the total number of available test cases. The report will be structured as text and is envisioned to be a brief statement of current status.

5.2 Reporting on Issues Encountered/Reported
Issues encountered shall be reported on a daily basis. All issues submitted within TestDirector will be contained in a summary report. Included within the report will be the current/updated status of issues and, if applicable, the resolution to each reported issue. In addition, all issues encountered during testing that may reside outside of the application will be contained within this report, including but not limited to environmental control issues, periods of downtime, and miscommunications between departments; in general, any item that affects testing shall be reported upon.

5.3 Automated Test Scripts
All automated testing scripts leveraged during the course of the testing shall be archived upon a specified network location and/or reside within TestDirector.

5.4 Detailed Testing Results
All results generated from usage of automated scripts shall be archived upon a specified network location and/or reside within TestDirector. In the event manual testing was performed, all reports associated with the manual testing shall be contained in the same location.

5.5 Summary Report
At the completion of the testing, a summary report shall be generated containing a listing of all defects/issues encountered and their resolution. In addition, any lessons learned from the testing engagement shall be included within this summary report.

6. Environmental Needs

This section presents the non-human resources required for the testing endeavor.

6.1 Base System Hardware
The test environment, consisting of the application infrastructure, shall be provided for the duration of the Quality Assurance testing cycle; it is the responsibility of the infrastructure coordinator to provide and monitor this environment. The environment shall be structured to resemble the production environment as a method of ensuring valid testing, and shall be under strict configuration control, as stated within the entry criteria established within this document.

As the current application is served to a local client browser, the database, application, and networking layers are envisioned to be established and controlled by parties outside of the Quality Assurance department. A method of access is expected to be provided, consisting of a valid Uniform Resource Locator (URL) with an established login to be utilized for the duration of testing. Where required, in order to facilitate security access testing, additional login IDs may be needed and shall be provided on an as-needed basis; such testing is expected to be addressed within the context of the use cases.

The client testing platforms utilized shall consist of the LDS Quality Assurance laboratory platforms. Specific client settings shall be provided and administered by the Quality Assurance lead testing resource in an effort to resemble the production environment. Established environmental control practices apply to all base system hardware components.

6.2 Base Software Elements in the Test Environment
Infrastructural software elements shall be at the control of the infrastructure coordinator for all server and network elements; such elements shall be structured to resemble a production environment as a method of ensuring valid testing.

Elements that reside upon the Quality Assurance testing platforms shall fall under the control and administration of the Quality Assurance lead testing resource as assigned for this project. These items will include all automated testing software, any browser configuration settings, and any required client software load. Items currently identified as client software load specific to the MPS-MRG application testing include the following:



- Microsoft Internet Explorer: Version 6.0.28.00.1106.xpsp2.030422-1633CO
- Microsoft Windows Operating System: Windows XP Professional, Version 2002, Service Pack 1

All version information refers to the expected client software load as present in the LDS Quality Assurance laboratory. If additional and/or different software loads are required, they will need to be coordinated with the Quality Assurance lead testing resource.

6.3 Productivity and Support Tools
The following lists the productivity and support tools to be leveraged by the Quality Assurance department for this testing endeavor:
- Mercury Interactive TestDirector: utilized for defect tracking and test management
- Mercury Interactive Quick Test Pro: utilized for test automation
- Borland Corporation StarTeam: utilized as the project document repository

7. Responsibilities, Staffing, and Training Needs

This section presents the resources required to address the outlined test effort: the main responsibilities, and the knowledge or skill sets required of those resources.

7.1 Human Resources and Roles
The following shows the staffing related to the Quality Assurance department's efforts for the test engagement.

Role: Quality Assurance Test Manager
Minimum Resources: 1 (Resource Name: Joseph Spinner)
Specific Responsibilities or Comments: Provides management oversight. Responsibilities include:
- Planning and logistics
- Acquire appropriate resources
- Present management reporting
- Advocate the interests of test
- Evaluate effectiveness of the test effort


Role: Quality Assurance Lead Testing Resource
Minimum Resources: 1 (Resource Name: Warren Savage)
Specific Responsibilities or Comments: Provides quality assurance support. Responsibilities include:
- Identify test ideas
- Define test approach
- Define test automation architecture
- Verify test techniques
- Define testability elements
- Structure test implementation
- Implement tests and test suites
- Execute test suites
- Log results
- Analyse and recover from test failures
- Document incidents
- Create summary report
- Ensure QA testing laboratory is properly configured for testing

Role: Database Administrator, Database Manager
Minimum Resources: 1 (Resource Name: TBD)
Specific Responsibilities or Comments: Ensure test data (database) environment and assets are managed and maintained

Role: Business Analyst
Minimum Resources: 1 (Resource Name: Brian McDonald)
Specific Responsibilities or Comments: Creation of use cases based upon published requirements

Role: Infrastructure Coordinator
Minimum Resources: 1 (Resource Name: Devon Walker)
Specific Responsibilities or Comments: Ensure testing environments are maintained to established configuration management standards

Role: Project Manager
Minimum Resources: 1 (Resource Name: Jane Somerville)
Specific Responsibilities or Comments: Provide general project direction; responsible for client interaction and ensuring project success

8. Risks, Assumptions, and Constraints

The following identifies potential risks along with the associated mitigation and contingency strategies.


Risk: Entry criteria have not been satisfied.
Mitigation Strategy: Align resources to address the outstanding items.
Contingency: Table the areas of testing that involve those uncompleted items and proceed with testing, with the expectation that the tasks will be completed prior to the completion of the testing cycle.

Risk: Automated test case development exceeds the available time.
Mitigation Strategy: Dedicate additional resources to the development of automation, or reduce the scope of automation.
Contingency: Utilize manual testing to account for those areas in which automation has yet to be developed.

Risk: Modifications to project scope (scope creep).
Mitigation Strategy: Monitor product changes to atomic detail as facilitated by the configuration change management board and/or project management.
Contingency: Adjust the testing scope to account for changes to the initial plan.

Risk: Established QA testing time reduced.
Mitigation Strategy: Ensure realistic timelines are established within the project schedule.
Contingency: Reduce planned coverage to account for the reduction in the allotted testing window.

The following lists assumptions of the test effort. Assumptions, if incorrect, may affect the overall project.

Assumption: Testing environment will be managed.
Impact if incorrect: Testing may yield false results in the event that environmental controls are not established and followed.
Owners: Infrastructure Coordinator, Database Manager, and Project Manager

Assumption: Testing environment will be available (clients, application servers, database servers, network, etc.).
Impact if incorrect: Testing schedule/activity is dependent upon the availability of testing environments; unavailability may cause delays to the overall schedule.
Owners: Infrastructure Coordinator and Database Manager

Assumption: Human resources will be available and dedicated for the duration of the project life cycle.
Impact if incorrect: Lack of dedicated resources may cause delays to the overall schedule.
Owners: Quality Assurance and Project Management

Assumption: Use cases are correctly designed and provide the desired coverage.
Impact if incorrect: Key areas of application functionality may go untested due to omission from use cases.
Owners: Business Analyst

The following are viewed as constraints of the testing exercise.

Constraint on: Dedicated environment
Impact on test effort: The current process involves the sharing of an environment between MPS and MPS-MRG; this has limited the activities performed to date.
Owners: Infrastructure Coordinator and Database Manager

9. Management Process and Procedures

The following gives a high-level overview of the management processes and procedures relating to this testing endeavor.

9.1 Problem Reporting, Escalation, and Issue Resolution
All issues shall be entered in TestDirector; escalation, if needed, will be done through Quality Assurance and Project Management interaction, and issue resolution shall be documented within TestDirector.


9.2 Test Plan Approval and Signoff
The approval and signoff of the following individuals are required.

Project Role                       Resource Name     Signature
Project Manager:                   Jane Somerville   _________________________________
Business Analyst:                  Brian McDonald    _________________________________
Development Representative:        TBD               _________________________________
Operations Representative:         Steve Pico        _________________________________
Database Representative:           Eric Calvino      _________________________________
Quality Assurance Test Manager:    Joseph Spinner    _________________________________
Infrastructure Coordinator:        Devon Walker      _________________________________

APPENDIX A Testing Type Definitions

The following defines the testing types referenced within this document.

Data and Database Integrity Testing: The databases and the database processes are tested as an independent subsystem. This testing should exercise the subsystems without the UI as the interface to the data.

Functional (Function) Testing: Function testing should focus on any requirements for test that can be traced directly to use cases or business functions and business rules. The goals of these tests are to verify proper data acceptance, processing, and retrieval, and the appropriate implementation of the business rules. This type of testing is based upon black-box techniques; that is, verifying the application and its internal processes by interacting with the application via the UI and analyzing the output or results.

Business Cycle Testing: Business cycle testing should emulate the activities performed on the application over time. A period should be identified, such as one year, and the transactions and activities that would occur during that period should be executed. This includes all daily, weekly, and monthly cycles, and events that are date-sensitive.

User Interface Testing: UI testing verifies a user's interaction with the software. The goal of UI testing is to ensure that the UI provides the user with the appropriate access and navigation through the functions of the AUT. In addition, UI testing ensures that the objects within the UI function as expected and conform to corporate or industry standards.

Performance Profiling: Performance profiling is a performance test in which response times, transaction rates, and other time-sensitive requirements are measured and evaluated. The goal of performance profiling is to verify that performance requirements have been achieved. Performance profiling is implemented and executed to profile and tune application performance behaviors as a function of conditions such as workload or hardware configurations.



Load Testing: Load testing is a performance test that subjects the application to varying workloads to measure and evaluate the performance behaviors and the ability of the application to continue to function properly under these different workloads. The goal of load testing is to determine and ensure that the system functions properly at and beyond the expected maximum workload. Additionally, load testing evaluates performance characteristics such as response times, transaction rates, and other time-sensitive measures.

Stress Testing: Stress testing is a type of performance test implemented and executed to understand how a system fails due to conditions at the boundary of, or outside of, the expected tolerances. This typically involves low resources or competition for resources. Low-resource conditions reveal failure modes in the application that are not apparent under normal conditions. Other defects might result from competition for shared resources, such as database locks or network bandwidth, although some of these tests are usually addressed under functional and load testing.

Volume Testing: Volume testing subjects the AUT to large amounts of data to determine whether limits are reached that cause the software to fail. Volume testing also identifies the continuous maximum load or volume the AUT can handle for a given period.

Security and Access Control Testing: Security and access control testing focuses on two key areas of security:

- Application-level security, including access to the data or business functions
- System-level security, including logging into or remotely accessing the system

Application-level security ensures that actors are restricted to specific functions or use cases, or are limited in the data that is available to them. For example, everyone may be permitted to enter data and create new accounts, but only managers may delete them. If there is security at the data level, testing ensures that one user type can see all customer information, including financial data, while another user type sees only the demographic data for the same client. System-level security ensures that only those users granted access to the system are capable of accessing the applications, and only through the appropriate gateways.

Failover and Recovery Testing: Failover and recovery testing ensures that the AUT can successfully fail over and recover from a variety of hardware, software, or network malfunctions without undue loss of data or data integrity. For those systems that must be kept running, failover testing ensures that, when a failover condition occurs, the alternate or backup systems properly take over for the failed system without any loss of data or transactions. Recovery testing is an antagonistic test process in which the application or system is exposed to extreme conditions, or simulated conditions, to cause a failure, such as device Input/Output (I/O) failures or invalid database pointers and keys. Recovery processes are then invoked, and the application or system is monitored and inspected to verify that proper application, system, and data recovery has been achieved.

Configuration Testing: Configuration testing verifies the operation of the AUT on different software and hardware configurations. In most production environments, the particular hardware specifications for the client workstations, network connections, and database servers vary. Client workstations may have different software loaded (for example, applications, drivers, and so on) and, at any one time, many different combinations may be active using different resources.

Installation Testing: Installation testing has two purposes. The first is to ensure that the software can be installed under different conditions (such as a new installation, an upgrade, and a complete or custom installation) and under both normal and abnormal conditions. Abnormal conditions include insufficient disk space, lack of privilege to create directories, and so on. The second purpose is to verify that, once installed, the software operates correctly. This usually means running a number of the tests that were developed for Function Testing.
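As an illustration of the load-testing approach defined above, the following sketch drives a workload at increasing concurrency and reports response-time statistics. It is a minimal, generic harness, not MPS-specific: the request itself is simulated with a short sleep, and a real harness would replace `submit_request` with an actual application transaction.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def submit_request(payload):
    """Stand-in for one application transaction; timing is simulated
    with a short sleep so the sketch runs anywhere."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server processing time
    return time.perf_counter() - start

def run_load(concurrent_users, requests_per_user):
    """Apply a fixed workload and report response-time statistics."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        timings = list(pool.map(submit_request,
                                range(concurrent_users * requests_per_user)))
    return {"mean": statistics.mean(timings), "max": max(timings)}

# Step the workload up, as in a load-test ramp.
for users in (1, 5, 10):
    stats = run_load(users, requests_per_user=5)
    print(f"{users:2d} users: mean={stats['mean']:.3f}s max={stats['max']:.3f}s")
```

Comparing the mean and maximum response times across the ramp steps is one simple way to evaluate whether the system continues to function properly beyond the expected maximum workload.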

APPENDIX B

Coverage Summary

The following summarizes the testing coverage to be provided. MPS Regression Testing process steps shall provide the following:

1.0 Access System
    1.1 Log in to MPS system
2.0 Pre-Registration Queue
    2.1 Access Pre-Registration Queue
    2.2 Assign Loan to current user
3.0 Registration Queue
    3.1 Access Registration Queue
    3.2 Select Loan for Edit from Registration queue
    3.3 Access 1003 Form
        3.3.1 Enter 1003 Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.3.2 Save 1003 Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.4 Access Appraisal Form
        3.4.1 Enter New Appraisal - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.4.2 Save Appraisal - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
        3.4.3 Edit Appraisal - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.4.4 Save Appraisal - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.5 Access 1008 Form
        3.5.1 Exercise Adobe link for Borrower and Property Information
        3.5.2 Verify data sections that auto-populate
        3.5.3 Edit 1008 form
        3.5.4 Save 1008 form
    3.6 Access Broker Info Form
        3.6.1 Verify data sections that auto-populate
        3.6.2 Save Broker Info Form
    3.7 Access Loan Contact Form
        3.7.1 Enter Loan Contact Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.7.2 Save Loan Contact Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
        3.7.3 Verify Loan Contact Information is correctly displayed in summary section
    3.8 Access Comments Form
        3.8.1 Enter Comment and select Add button
        3.8.2 Verify Comment is displayed
        3.8.3 Save Comment Form
    3.9 Access Credit Report Form
        3.9.1 Select Pull Credit
        3.9.2 Select Pull New Credit - this step is dependent upon the existence of a test loan structured to pull a test credit report
        3.9.3 Verify Credit Report is returned
        3.9.4 Select Raw Text button
        3.9.5 Verify Credit Report is displayed in raw text format
        3.9.6 Select Pull Credit
        3.9.7 Select Reissue Credit - this step is dependent upon the existence of a test loan structured to pull a test credit report
        3.9.8 Verify Credit Report is returned
        3.9.9 Select Raw Text button
        3.9.10 Verify Credit Report is displayed in raw text format
    3.10 Access DU Response Form
        3.10.1 Inspect DU information, if present, to ensure absence of error
    3.11 Access Good Faith Estimate Form
        3.11.1 Ensure field 818 Yield Spread Premium is populated and read-only (MPS TD 719)
        3.11.2 Exercise Adobe link for 1008 Form Information
        3.11.3 Enter GFE Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.11.4 Save GFE Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.12 Access HMDA Form
        3.12.1 Exercise MSA/MD code link
        3.12.2 Enter HMDA Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.12.3 Save HMDA Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.13 Access Rate Lock Info Form
    3.14 Access Registration Info Form
        3.14.1 Verify data sections that auto-populate
    3.15 Access Truth In Lending Form
        3.15.1 Enter Truth In Lending Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.15.2 Exercise the Calculate button - due to adjusting the data values, testing will address invalid data types and formats at time of calculate; error dialog behavior will be tested
        3.15.3 Verify calculated information is displayed correctly
        3.15.4 Save Truth In Lending Information
        3.15.5 Exercise Adobe link for Truth In Lending Information
    3.16 Access Loan Data Form
        3.16.1 Enter Loan Data Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.16.2 Save Loan Data Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.17 Access Apply Conditions Form
        3.17.1 Adjust conditions
        3.17.2 Save Apply Conditions Form
    3.18 Access Product Data Form
        3.18.1 Enter Product Data Main Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.18.2 Save Product Data Main Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
        3.18.3 Exercise ARM Index Value link
        3.18.4 Select Rate Adjustments page in the Product Data Form
        3.18.5 Enter New Rate Adjustment - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.18.6 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
        3.18.7 Edit Rate Adjustment - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.18.8 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.19 Access Closing Agent Form
        3.19.1 Exercise New button
        3.19.2 Enter Closing Agent Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.19.3 Save Closing Agent Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.20 Access Third Party Form
        3.20.1 Exercise all available links upon the page
            3.20.1.1 Flood
            3.20.1.2 Torque
            3.20.1.3 Appintell
            3.20.1.4 Geocode
            3.20.1.5 Terrorist Alert
    3.21 Access Escrow Form
        3.21.1 Enter Escrow Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.21.2 Save Escrow Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.22 Access Insurance Form
        3.22.1 Enter Insurance Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.22.2 Save Insurance Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.23 Access MI Form
        3.23.1 Enter MI Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.23.2 Save MI Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.24 Access Servicer Form
        3.24.1 Select Servicer and verify information is populated
        3.24.2 Save Servicer Information
    3.25 Access Custodian Form
        3.25.1 Select Custodian and verify information is populated
        3.25.2 Save Custodian Information
    3.26 Access Shipments Form
        3.26.1 Enter New Shipment - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.26.2 Save Shipment - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
        3.26.3 Edit Shipment - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        3.26.4 Save Shipment - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    3.27 Access Checklist Form
        3.27.1 Attempt to move loan from registration queue to underwriting queue
        3.27.2 Ensure message is displayed informing that checklist must be completed before queue move
        3.27.3 Select all checklist items
        3.27.4 Save Checklist Form
    3.28 Exercise loan movement from registration queue to underwriting queue
4.0 Underwriting Queue
    4.1 Access Underwriting queue
    4.2 Select Loan for Edit from Underwriting queue
    4.3 Access Clear Conditions form
        4.3.1 Edit conditions
        4.3.2 Save conditions
        4.3.3 Verify that changes occurred to the conditions form
    4.4 Exercise loan movement from underwriting queue to processing queue
5.0 Processing Queue
    5.1 Access Processing queue
    5.2 Select Loan for Edit from Processing queue
    5.3 Exercise loan movement from processing queue to pre-closing coordinator queue
6.0 Pre-Closing Coordinator Queue
    6.1 Access pre-closing coordinator queue
    6.2 Select Loan for Edit from pre-closing coordinator queue
    6.3 Access Doc Prep form
    6.4 Access Pre Wire form
        6.4.1 Ensure form is read-only
    6.5 Access Post Wire form
        6.5.1 Ensure form is read-only
    6.6 Access Collateral form
        6.6.1 Modify information upon form
        6.6.2 Save form
        6.6.3 Verify changes occurred to the collateral form
    6.7 Access Checklist Form
        6.7.1 Attempt to move loan from pre-closing coordinator queue to closing audit queue
        6.7.2 Ensure message is displayed informing that checklist must be completed before queue move
        6.7.3 Select all checklist items
        6.7.4 Save Checklist Form
    6.8 Exercise loan movement from pre-closing coordinator queue to closing audit queue
7.0 Closing Audit Queue
    7.1 Access closing audit queue
    7.2 Select Show Only My Loans radio button
    7.3 Ensure only the user's loans are displayed
    7.4 Select Loan for Edit from closing audit queue
    7.5 Access HUD form
        7.5.1 Enter HUD Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        7.5.2 Save HUD Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    7.6 Access Final Details form
        7.6.1 Verify information displayed
    7.7 Access Checklist Form
        7.7.1 Attempt to move loan from closing audit queue to funding post closing queue
        7.7.2 Ensure message is displayed informing that checklist must be completed before queue move
        7.7.3 Select all checklist items
        7.7.4 Save Checklist Form
    7.8 Exercise loan movement from closing audit queue to funding post closing queue
8.0 Funding Post Closing Queue
    8.1 Access funding post closing queue
    8.2 Select Loan for Edit from funding post closing queue
    8.3 Access Exceptions form
        8.3.1 Adjust selected exceptions, ensuring that exceptions are selected
        8.3.2 Save Exceptions form
        8.3.3 Ensure exceptions are present upon form
    8.4 Access Clear Exceptions form
        8.4.1 Select Cleared box for created exception
        8.4.2 Save Clear Exceptions form
        8.4.3 Verify exception is updated and marked with user id
    8.5 Access Exceptions form
        8.5.1 Ensure cleared exceptions are marked
    8.6 Access SVC Misc form
        8.6.1 Enter SVC Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        8.6.2 Save SVC Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    8.7 Access SVC Review form
        8.7.1 Exercise Excel link
    8.8 Access Checklist Form
        8.8.1 Attempt to move loan from funding post closing queue to servicing transfer queue
        8.8.2 Ensure message is displayed informing that checklist must be completed before queue move
        8.8.3 Select all checklist items
        8.8.4 Save Checklist Form
    8.9 Exercise loan movement from funding post closing queue to servicing transfer queue
9.0 Servicing Transfer Queue
    9.1 Access servicing transfer queue
    9.2 Select Loan for Edit from servicing transfer queue
    9.3 Exercise loan movement from servicing transfer queue to shipping queue
10.0 Shipping Queue
    10.1 Access shipping queue
    10.2 Select Loan for Edit from shipping queue
11.0 Closing Audit (revisited)
    11.1 Access Closing Audit queue
    11.2 Assign a loan
    11.3 Select Loan for Edit from Closing Audit queue
    11.4 Select Withdraw button
    11.5 Select Cancel button on pop-up message
    11.6 Ensure loan has not been withdrawn
    11.7 Select Withdraw button
    11.8 Select OK button on pop-up message
12.0 Withdraw/Decline Queue
    12.1 Access Withdraw/Decline queue
    12.2 Select Loan for Edit from withdraw/decline queue
    12.3 Exercise loan movement from withdraw/decline queue to sales queue
13.0 Sales Queue
    13.1 Access sales queue
    13.2 Select Loan for Edit from sales queue
    13.3 Exercise loan movement from sales queue to registration queue
14.0 Registration Queue (revisited)
    14.1 Access Registration queue
    14.2 Select Show Only My Loans radio button
    14.3 Ensure loan moved from sales queue is present
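The recurring "values shall be adjusted ..." steps in the outline above amount to data-driven negative testing: each form is driven with invalid types and missing fields, and the resulting error handling is checked. The following sketch shows the pattern in miniature; the field names and validation rules here are illustrative assumptions, not the actual MPS form logic.

```python
# Hypothetical validation rules standing in for one MPS form's field checks.
FIELD_RULES = {
    "loan_amount": {"type": float, "required": True},
    "borrower_name": {"type": str, "required": True},
    "note_rate": {"type": float, "required": False},
}

def validate_form(data):
    """Return a list of error messages, mimicking error dialog content."""
    errors = []
    for field, rule in FIELD_RULES.items():
        if field not in data or data[field] is None:
            if rule["required"]:
                errors.append(f"{field}: missing required value")
        elif not isinstance(data[field], rule["type"]):
            errors.append(f"{field}: invalid data type")
    return errors

# Data-driven negative cases: each tuple is (test input, expected error count).
CASES = [
    ({"loan_amount": 250000.0, "borrower_name": "Smith"}, 0),  # valid
    ({"loan_amount": "abc", "borrower_name": "Smith"}, 1),     # wrong type
    ({"borrower_name": "Smith"}, 1),                           # missing field
    ({}, 2),                                                   # both missing
]

for data, expected in CASES:
    errors = validate_form(data)
    assert len(errors) == expected, (data, errors)
print("all negative-data cases behaved as expected")
```

Keeping the invalid-input cases in a table like `CASES` lets each enter/save step in the outline reuse one driver rather than scripting every bad value by hand.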

End of current MPS regression testing activities.

Areas specifically targeted for this release, as defined by the Lydian Data Services Mortgage Process System Use-Case Specification: MRG Integration, are outlined in the following (all actions listed occur in the MPS closing audit queue):

1.0 Access System
    1.1 Log on to MPS system
2.0 Closing Audit Queue
    2.1 Access 1003 Form
        2.1.1 Enter 1003 Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.1.2 Save 1003 Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    2.2 Access Product Data Form
        2.2.1 Enter Product Data Main Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.2.2 Save Product Data Main Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
        2.2.3 Exercise ARM Index Value link
        2.2.4 Select Rate Adjustments page in the Product Data Form
        2.2.5 Enter New Rate Adjustment - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.2.6 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
        2.2.7 Edit Rate Adjustment - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.2.8 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    2.3 Access Product Data Form
        2.3.1 Enter Product Data Main Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.3.2 Save Product Data Main Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
        2.3.3 Exercise ARM Index Value link
        2.3.4 Select Rate Adjustments page in the Product Data Form
        2.3.5 Enter New Rate Adjustment - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.3.6 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
        2.3.7 Edit Rate Adjustment - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.3.8 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    2.4 Access Good Faith Estimate Form
        2.4.1 Ensure field 818 Yield Spread Premium is populated and read-only (MPS TD 719)
        2.4.2 Exercise Adobe link for 1008 Form Information
        2.4.3 Enter 1008 Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.4.4 Save 1008 Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    2.5 Access Escrow Form
        2.5.1 Enter Escrow Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.5.2 Save Escrow Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    2.6 Access MI Form
        2.6.1 Enter MI Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.6.2 Save MI Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    2.7 Access HUD form
        2.7.1 Enter HUD Information - values shall be adjusted to ensure correct system behavior for invalid data types and missing data; when combined with the following step, error dialog behavior will be tested
        2.7.2 Save HUD Information - due to adjusting the data values, testing will address invalid data types and formats at time of save; error dialog behavior will be tested
    2.8 Access Pre Wire form
        2.8.1 Ensure form is read-only
    2.9 Access Doc Prep form
        2.9.1 Access Title Info tab
            2.9.1.1 Exercise fields upon Title Info tab
            2.9.1.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.1.3 Save
        2.9.2 Access Vesting tab
            2.9.2.1 Exercise fields upon Vesting tab
            2.9.2.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.2.3 Save
        2.9.3 Access Land Lease tab
            2.9.3.1 Exercise fields upon Land Lease tab
            2.9.3.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.3.3 Save
        2.9.4 Access Settlement tab
            2.9.4.1 Exercise fields upon Settlement tab
            2.9.4.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.4.3 Save
        2.9.5 Access Package tab
            2.9.5.1 Exercise fields upon Package tab
            2.9.5.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.5.3 Save
        2.9.6 Access Addl Vesting tab
            2.9.6.1 Exercise fields upon Addl Vesting tab
            2.9.6.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.6.3 Save
        2.9.7 Access Seller tab
            2.9.7.1 Exercise fields upon Seller tab
            2.9.7.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.7.3 Save
        2.9.8 Access Addl Seller tab
            2.9.8.1 Exercise fields upon Addl Seller tab
            2.9.8.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.8.3 Save
        2.9.9 Access CEM tab
            2.9.9.1 Exercise fields upon CEM tab
            2.9.9.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.9.3 Save
        2.9.10 Access COOP tab
            2.9.10.1 Exercise fields upon COOP tab
            2.9.10.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.10.3 Save
        2.9.11 Access UW Comments tab
            2.9.11.1 Exercise fields upon UW Comments tab
            2.9.11.2 Verify field limits and formats (dependent upon additional documentation)
            2.9.11.3 Save
        2.9.12 Enter Email To: field
        2.9.13 Select Disclose button
        2.9.14 Verify GUI response
        2.9.15 Verify email received
MPS Load Testing activities specific to this project shall provide the following: measurement of MPS processing time for document generation, as reported in MPS, in relation to user load over time. Timings will be taken between use-case steps 2.9.13 and 2.9.14.

Testing is envisioned to occur as follows:

- 1 user submits a document generation request every 5 minutes for a period of one hour
- 1 user submits a document generation request every 2.5 minutes for a period of one hour
- 1 user submits a document generation request every minute for a period of one hour
- 5 users submit a document generation request every 5 minutes for a period of one hour
- 5 users submit a document generation request every 2.5 minutes for a period of one hour
- 5 users submit a document generation request every minute for a period of one hour
- 10 users submit a document generation request every 5 minutes for a period of one hour
- 10 users submit a document generation request every 2.5 minutes for a period of one hour
- 10 users submit a document generation request every minute for a period of one hour
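The nine one-hour scenarios above form a 3x3 grid of user counts and submission intervals. The expected request volume for each cell follows from simple arithmetic, which can be tabulated as a quick sanity check when sizing the test environment:

```python
# Scenario grid from the plan: user counts crossed with submission intervals.
USER_COUNTS = (1, 5, 10)
INTERVALS_MIN = (5.0, 2.5, 1.0)  # minutes between submissions per user
DURATION_MIN = 60                # each scenario runs for one hour

def requests_for(users, interval_min):
    """Total document-generation requests a scenario produces in one hour."""
    return int(users * DURATION_MIN / interval_min)

for users in USER_COUNTS:
    for interval in INTERVALS_MIN:
        total = requests_for(users, interval)
        print(f"{users:2d} users @ every {interval:>4} min -> {total:3d} requests/hour")
```

The heaviest scenario (10 users, one request per minute) therefore drives 600 document-generation requests in the hour, which bounds the test data and storage the environment must support.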

Example graphical representation of one-user load is depicted below:

[Figure 1.0 - Example One User Load Graph: response time in seconds (0-10) plotted against requests per hour per user (10-60), with one curve each for Build 1, Build 2, and Build 3.]
