
Klaros-Testmanagement User Manual

Version 3.0.2
Publication date July 26, 2010

Copyright © 2008-2010 verit Informationssysteme GmbH


Klaros-Testmanagement User Manual
by Caroline Albuquerque, Selcuk Akgünlü, Heiko Bott, Claudia Könnecke, Klaus Mandola, Christian Petry, Tobias Schmitt, and Torsten Stolpmann
Version 3.0.2


Abstract

This document serves as the reference documentation for the Klaros-Testmanagement application. It gives a
detailed description of the user interface and the provided functionality.

Legal Notice.  Copyright 2009-2010 verit Informationssysteme GmbH, Europaallee 10, 67657 Kaiserslautern, Germany. All rights reserved. This product or document is protected by copyright and distributed under licenses restricting its use, copying, distribution, and decompilation. No part of this product or documentation may be reproduced in any form by any means without prior written authorization of verit Informationssysteme GmbH and its licensors, if any.

Trademarks
Java™ and Solaris™ are trademarks of Sun Microsystems.

Windows® is a registered trademark of Microsoft Corporation in the United States and other countries.

JIRA® is a registered trademark of Atlassian Pty Ltd.

Third party software and licenses


This product contains software covered by the following licenses:

OTN License Agreement (OTN).  This application contains the Oracle JDBC Driver. Please see http://www.oracle.com/technology/index.html for more details on the JDBC Driver license agreement. The JDBC Driver and all associated intellectual property rights are retained by Oracle Corporation and/or its licensors. To use the JDBC Driver included in this application, you must agree to the Oracle Technology Network Development and Distribution License Terms. If you do not agree, you may not use this application.

GNU Lesser General Public License (LGPL 2.1). 

• JasperReports

This product uses software available under the Apache Software License (ASL 2.0).

This product uses icons from the Tango Desktop Project (http://tango.freedesktop.org/), which are released to the Public Domain. We thank the authors for their worthwhile efforts.

This product uses icons from the Fugue Icon Set (http://www.pinvoke.com/), which are available under a Creative Commons Attribution 3.0 license.
Table of Contents
1. Key Features .................................................................................................................... 1
1.1. Klaros-Testmanagement Community Edition Features ................................................. 1
1.2. Klaros-Testmanagement Enterprise Edition Features ................................................... 2
2. Introduction ..................................................................................................................... 3
2.1. Overview ............................................................................................................. 3
2.2. Roles .................................................................................................................... 6
3. Installation ....................................................................................................................... 7
3.1. Prerequisites .......................................................................................................... 7
3.1.1. Client Prerequisites ...................................................................................... 7
3.1.2. Server Prerequisites ...................................................................................... 7
3.2. Installation Process ................................................................................................. 9
3.2.1. Step 1: Welcome .......................................................................................... 9
3.2.2. Step 2: Information ..................................................................................... 10
3.2.3. Step 3: Licensing Agreements ...................................................................... 10
3.2.4. Step 4: Target Path ..................................................................................... 11
3.2.5. Step 5: Select Installation Packages ............................................................... 12
3.2.6. Step 6: User Data ....................................................................................... 13
3.2.7. Step 7: Installation ...................................................................................... 14
3.2.8. Step 8: Perform External Processes ................................................................ 16
3.2.9. Step 9: Setup Shortcuts ............................................................................... 16
3.2.10. Step 10: Installation Finished ...................................................................... 17
3.3. Update Process .................................................................................................... 18
3.3.1. Database Migration .................................................................................... 18
3.4. Running Klaros-Testmanagement ........................................................................... 19
3.5. Changing the Database ......................................................................................... 19
3.6. Configuring External Issue Management Systems ...................................................... 20
3.6.1. Bugzilla Configuration ................................................................................ 20
3.6.2. JIRA Configuration ..................................................................................... 20
3.6.3. Redmine Configuration .............................................................................. 20
3.6.4. Trac Configuration ..................................................................................... 21
4. Functional Overview ....................................................................................................... 22
4.1. Login ................................................................................................................. 22
4.2. Main Functions .................................................................................................... 23
5. Define ........................................................................................................................... 24
5.1. Defining a Project ......................................................................................... 24
5.1.1. Select a Project .................................................................................. 24
5.1.2. Maintaining a Project .......................................................................... 26
5.2. Test Environments ........................................................................................ 30
5.2.1. Maintaining Test Environments ............................................................. 30
5.3. Maintaining Systems Under Test ...................................................................... 33
5.3.1. Maintain Systems Under Test ................................................................ 33
5.4. Test Cases .................................................................................................... 35
5.4.1. Maintain Test Cases ............................................................................ 36
5.5. Test Suites ................................................................................................... 43
5.5.1. Maintain Test Suites ............................................................................ 43
6. Execute ......................................................................................................................... 49
6.1. Start New Run .............................................................................................. 49
6.1.1. Run Single Test Case ........................................................................... 49
6.1.2. Run Test Suite .................................................................................... 54
6.2. Continue Run ...................................................................................................... 57


6.2.1. Continue Test Suite ............................................................................. 57


6.3. Creating an issue .......................................................................................... 59
7. Evaluate ....................................................................................................................... 65
7.1. Reports ....................................................................................................... 65
7.1.1. Dashboard ......................................................................................... 65
7.1.2. Report Templates ............................................................................... 72
7.1.3. Single Test Run Report ........................................................................ 73
7.1.4. Test Run History Report ....................................................................... 76
7.2. Test Results .................................................................................................. 78
7.2.1. Test Case Results ................................................................................ 78
7.2.2. Test Suite Results ................................................................................ 83
7.3. Issues .................................................................................................................. 85
7.3.1. Issues by Test Case ..................................................................................... 86
8. Configure ...................................................................................................................... 88
8.1. System Information ....................................................................................... 88
8.2. Report Templates .......................................................................................... 88
8.2.1. Report Details ............................................................................................ 90
8.3. User Administration ...................................................................................... 90
8.3.1. Create User ........................................................................................ 90
8.3.2. Search User ....................................................................................... 92
8.4. Authentication ............................................................................................. 94
8.4.1. LDAP ................................................................................................ 94
8.5. System Parameters ........................................................................................ 96
8.5.1. E-Mail ............................................................................................... 96
8.5.2. Logging ............................................................................................ 97
8.5.3. General Settings ................................................................................. 98
8.5.4. Issue Management .............................................................................. 99
8.6. Backup/Recovery ......................................................................................... 101
8.6.1. Backup Projects ................................................................................ 101
8.6.2. Restore Projects ................................................................................ 102
9. Reports ........................................................................................................................ 103
9.1. Create A New Report Template ............................................................................ 104
9.2. Applying a Report Template ................................................................................ 107
9.3. Example Report ................................................................................................. 108
9.3.1. Creating the Groovy Script ........................................................................ 108
9.3.2. Creating the Report Template .................................................................... 108
9.3.3. Creating a Chart ....................................................................................... 109
9.3.4. Including Images ...................................................................................... 110
10. Import/Export ............................................................................................................. 111
10.1. Export Table Content to Excel ............................................................................. 111
10.2. Backup/Recovery ....................................................................................... 111
10.3. Integration With Other Frameworks ..................................................................... 111
10.3.1. Hudson Plugin for Klaros-Testmanagement ................................................ 111
10.3.2. QF-Test and JUnit Import ......................................................................... 111
Glossary .......................................................................................................................... 113
Index .............................................................................................................................. 115
A. Role Permission Overview .............................................................................................. 119
B. Model API Reference ..................................................................................................... 120
B.1. Klaros-Core API Reference .................................................................................... 120


B.1.1. Package de.verit.klaros.core.model .............................................................. 120


B.2. Klaros-Scripting API Reference .............................................................................. 193
B.2.1. Package de.verit.klaros.scripting ................................................................. 193
B.2.2. Package de.verit.klaros.scripting.context ...................................................... 196
B.2.3. Package de.verit.klaros.scripting.model ........................................................ 198
C. Dump File Specification ................................................................................................. 201
C.1. <configurationList> ............................................................................................ 204
C.2. <configuration> ................................................................................................. 204
C.3. <classInitialStates> ............................................................................................. 204
C.4. <value> ............................................................................................................ 204
C.5. <entry> ............................................................................................................. 205
C.6. <classStates> ..................................................................................................... 205
C.7. <envs> .............................................................................................................. 205
C.8. <propertyDefs> .................................................................................................. 206
C.9. <suts> .............................................................................................................. 206
C.10. <testCases> ..................................................................................................... 207
C.11. <testRuns> ...................................................................................................... 208
C.12. <testSuites> ..................................................................................................... 208
C.13. <executables> .................................................................................................. 209
C.14. <testCaseStepFragments> .................................................................................. 209
C.15. <testCaseStep> ................................................................................................ 210
C.16. <testCaseResults> ............................................................................................. 210
C.17. <testSuiteResult> .............................................................................................. 211
C.18. <propertiesOwner> ........................................................................................... 211
C.19. <stringProperties> ............................................................................................ 211
C.20. <relatedClass> .................................................................................................. 212
C.21. <namedEntity> ................................................................................................. 212
C.22. <resultReference> ............................................................................................. 212
C.23. <key> ............................................................................................................. 212
C.24. <created> ........................................................................................................ 212
C.25. <enabled> ....................................................................................................... 212
C.26. <lastUpdated> ................................................................................................. 212
C.27. <description> ................................................................................................... 213
C.28. <mappedToClass> ............................................................................................ 213
C.29. <userDefined> ................................................................................................. 213
C.30. <mandatory> ................................................................................................... 213
C.31. <productversion> ............................................................................................. 213
C.32. <revisionId> ..................................................................................................... 213
C.33. <root> ............................................................................................................ 213
C.34. <trunkRoot> .................................................................................................... 213
C.35. <areatopic> ..................................................................................................... 213
C.36. <cut> .............................................................................................................. 214
C.37. <depends> ...................................................................................................... 214
C.38. <docbase> ....................................................................................................... 214
C.39. <equiclass> ...................................................................................................... 214
C.40. <evalutation> ................................................................................................... 214
C.41. <execution> ..................................................................................................... 214
C.42. <expectedResult> ............................................................................................. 214
C.43. <generalState> ................................................................................................. 214
C.44. <inputvalues> .................................................................................................. 214
C.45. <level> ............................................................................................................ 214
C.46. <message> ...................................................................................................... 215
C.47. <method> ....................................................................................................... 215
C.48. <note> ............................................................................................................ 215
C.49. <postcondition> ............................................................................................... 215
C.50. <precondition> ................................................................................................ 215
C.51. <predecessor> ................................................................................................. 215
C.52. <priority> ........................................................................................................ 215


C.53. <shortname> ................................................................................................... 215


C.54. <successor> ..................................................................................................... 215
C.55. <team> ........................................................................................................... 215
C.56. <traceability> ................................................................................................... 216
C.57. <variety> ......................................................................................................... 216
C.58. <testMethod> .................................................................................................. 216
C.59. <env> ............................................................................................................. 216
C.60. <runId> ........................................................................................................... 216
C.61. <sut> .............................................................................................................. 216
C.62. <timestamp> ................................................................................................... 216
C.63. <revisionComment> .......................................................................................... 216
C.64. <summary> ..................................................................................................... 216
C.65. <type> ............................................................................................................ 216
C.66. <running> ....................................................................................................... 217
C.67. <testSuitePassed> ............................................................................................. 217
C.68. <currentTestCase> ............................................................................................ 217
C.69. <executionTime> .............................................................................................. 217
C.70. <action> .......................................................................................................... 217
C.71. <propertyType> ............................................................................................... 217
C.72. <propertyOwnerReference> ............................................................................... 217
C.73. <propertyDefReference> .................................................................................... 217
C.74. <testCasePassed> ............................................................................................. 217
C.75. <propertyName> .............................................................................................. 217
C.76. <propertyValue> .............................................................................................. 218
C.77. <propertyDisplayName> .................................................................................... 218
C.78. <sortorder> ..................................................................................................... 218
C.79. <configurationReference> .................................................................................. 218
C.80. <testCaseReference> ......................................................................................... 218
C.81. <testRunReference> .......................................................................................... 218
C.82. <testResultReference> ....................................................................................... 218
C.83. <testSuiteReference> ........................................................................................ 218
C.84. <testSuiteResultReference> ................................................................................ 218
C.85. <name> .......................................................................................................... 218
C.86. <uuid> ............................................................................................................ 218
D. Reporting Resources ..................................................................................................... 219
D.1. Context Variables ............................................................................................... 219
D.2. KlarosScript Interface .......................................................................................... 219
D.3. Example report template ..................................................................................... 219

List of Figures
2.1. Klaros-Testmanagement Overview Diagram ....................................................................... 4
2.2. Artifact Relation Diagram ................................................................................................ 5
2.3. Test Cases and Test Suites Relation Diagram ..................................................................... 5
2.4. Klaros-Testmanagement Follow Up Diagram ..................................................................... 6
3.1. Setting the JAVA_HOME environment variable ................................................................... 8
3.2. Welcome Screen ........................................................................................................... 9
3.3. Information Screen ...................................................................................................... 10
3.4. Licensing Agreement Screen ......................................................................................... 11
3.5. Target Path Screen ...................................................................................................... 12
3.6. Select Installation Packages Screen ................................................................................ 13
3.7. User Data Screen ......................................................................................................... 14
3.8. Installation in Progress Screen ....................................................................................... 15
3.9. Installation Finished Screen ........................................................................................... 15
3.10. Perform External Processes Screen ............................................................................... 16
3.11. Setup Shortcuts Screen ............................................................................................... 17
3.12. Installation Finished Screen ......................................................................................... 18
3.13. Database Migration .................................................................................................... 19
4.1. Login Screen ............................................................................................................... 22
4.2. Maintain a Project Screen ............................................................................................. 23
5.1. The Project Selection Screen .......................................................................................... 24
5.2. The Maintain Projects Screen ......................................................................................... 26
5.3. The Project Issue Management System Selection Page ....................................................... 28
5.4. The Project User Defined Properties Page ........................................................................ 29
5.5. Editing the values of an enumeration .............................................................................. 30
5.6. The Maintain Test Environments Screen ........................................................................... 30
5.7. The Maintain Test Environments Screen ........................................................................... 31
5.8. The Maintain Systems Under Test Screen ......................................................................... 33
5.9. The Maintain Test Environments Screen ........................................................................... 34
5.10. The Maintain Test Cases Screen .................................................................................... 36
5.11. The Maintain Test Environments Screen ......................................................................... 37
5.12. The Edit Test Cases Screen ........................................................................................... 40
5.13. The Edit Test Steps Screen ........................................................................................... 41
5.14. The Assign Attachments Screen .................................................................................... 42
5.15. The Revisions Screen ................................................................................................... 42
5.16. The Maintain Test Suites Screen .................................................................................... 43
5.17. The Edit Test Suite Screen ............................................................................................ 45
5.18. The Maintain Test Environments Screen ......................................................................... 46
5.19. The Revisions Screen ................................................................................................... 48
6.1. The Run Single Test Case Screen ..................................................................................... 50
6.2. The Detailed Information about the Test Case Screen ........................................................ 51
6.3. The Overview Screen .................................................................................................... 52
6.4. The Step By Step Instructions Screen ............................................................................... 52
6.5. The Error or Failure Detected Screen ............................................................................... 53
6.6. The Test Case Results Screen .......................................................................................... 54
6.7. The Run Test Suite Screen ............................................................................................. 56
6.8. The Detailed Information about the Test Suite Screen ........................................................ 56
6.9. The Test Suite Results Screen ......................................................................................... 57
6.10. The Continue Test Suite Screen .................................................................................... 59
6.11. Jira Issue Page ............................................................................................................ 61
6.12. Trac Issue Page ........................................................................................................... 62
6.13. Redmine Issue Page .................................................................................................... 63
6.14. Bugzilla Issue Page ...................................................................................................... 63
7.1. The Dashboard Screen .................................................................................................. 65
7.2. The Test Environment Overview Report Layout ................................................................. 66
7.3. The SUT Overview Report Layout .................................................................................... 67

7.4. The Test Suite Overview Report Layout ............................................................................ 68


7.5. The Project Overview Report .......................................................................................... 69
7.6. The Project Overview Report .......................................................................................... 70
7.7. The Project Overview Report .......................................................................................... 70
7.8. The Project Health Matrix Report .................................................................................... 70
7.9. The Test Progress Report ............................................................................................... 71
7.10. The Test Progress History Report ................................................................................... 72
7.11. User Defined Reports Screens ....................................................................................... 72
7.12. Generate a parameterized Report .................................................................................. 73
7.13. The Single Test Run Report Screen ................................................................................ 73
7.14. The Single Test Run Report Layout ................................................................................ 76
7.15. The Test Run History Screen ......................................................................................... 77
7.16. The Test Run History Report Layout ............................................................................... 78
7.17. The Test Case Results Screen ........................................................................................ 79
7.18. The Test Case Results Screen ........................................................................................ 81
7.19. The Test Suite Results Screen ....................................................................................... 83
7.20. The Test Suite Results Screen ....................................................................................... 85
7.21. The Test Suite Results Screen - Test Results ..................................................................... 85
7.22. The Test Case Selection Screen ..................................................................................... 86
7.23. The Issues By Test Case Screen ..................................................................................... 86
8.1. The System Information Screen ...................................................................................... 88
8.2. User Defined Reports Screens ......................................................................................... 89
8.3. Generate a parameterized Report ................................................................................... 89
8.4. Report Details .............................................................................................................. 90
8.5. The Create User Screen ................................................................................................. 91
8.6. The Edit User Screen ..................................................................................................... 92
8.7. The Edit User Screen ..................................................................................................... 94
8.8. The LDAP Configuration Screen ...................................................................................... 95
8.9. The E-Mail Configuration Screen ..................................................................................... 97
8.10. The Logging Configuration Screen ................................................................................ 97
8.11. The General Settings Configuration Screen ..................................................................... 99
8.12. The Issue Management Configuration Screen ................................................................ 100
8.13. The Backup Projects Screen ........................................................................................ 101
8.14. The Restore Projects Screen ........................................................................................ 102
9.1. The Report Generation Process ..................................................................................... 103
9.2. The Report Templates Page .......................................................................................... 104
9.3. The New Report Templates Page .................................................................................. 105
9.4. Adding Parameters to the Script ................................................................................... 106
9.5. Specifying Parameters ................................................................................................. 106
9.6. Apply a Report Template ............................................................................................. 107
9.7. Enter Parameter .......................................................................................................... 107
9.8. A Pie Chart Example .................................................................................................... 109
10.1. Export table content to Excel ...................................................................................... 111

List of Tables
3.1. Supported Browsers ........................................................................................ 7
3.2. Supported Databases ...................................................................................................... 7
A.1. Role Permission Overview Table ................................................................................... 119
A.2. Role Permission Overview Table - Enterprise Edition ........................................................ 119
C.1. Element summary ...................................................................................................... 201
C.2. <configurationList> elements ...................................................................................... 204
C.3. <configuration> attributes .......................................................................................... 204
C.4. <configuration> elements ........................................................................................... 204
C.5. <value> attributes ..................................................................................................... 205
C.6. <value> elements ...................................................................................................... 205
C.7. <entry> elements ...................................................................................................... 205
C.8. <classStates> attributes .............................................................................................. 205
C.9. <classStates> elements .............................................................................................. 205
C.10. <envs> attributes .................................................................................................... 205
C.11. <envs> elements ..................................................................................................... 206
C.12. <propertyDefs> attributes ......................................................................................... 206
C.13. <propertyDefs> elements ......................................................................................... 206
C.14. <suts> attributes ..................................................................................................... 206
C.15. <suts> elements ...................................................................................................... 206
C.16. <testCases> attributes .............................................................................................. 207
C.17. <testCases> elements ............................................................................................... 207
C.18. <testRuns> attributes ............................................................................................... 208
C.19. <testRuns> elements ................................................................................................ 208
C.20. <testSuites> attributes ............................................................................................. 208
C.21. <testSuites> elements .............................................................................................. 209
C.22. <executables> attributes .......................................................................................... 209
C.23. <executables> elements ........................................................................................... 209
C.24. <testCaseStepFragments> elements ........................................................................... 209
C.25. <testCaseStep> attributes ......................................................................................... 210
C.26. <testCaseStep> elements .......................................................................................... 210
C.27. <testCaseResults> attributes ...................................................................................... 210
C.28. <testCaseResults> elements ...................................................................................... 210
C.29. <testSuiteResult> attributes ...................................................................................... 211
C.30. <testSuiteResult> elements ....................................................................................... 211
C.31. <propertiesOwner> attributes ................................................................................... 211
C.32. <propertiesOwner> elements .................................................................................... 211
C.33. <stringProperties> attributes ..................................................................................... 212
C.34. <stringProperties> elements ...................................................................................... 212
D.1. Context Variables ....................................................................................................... 219

List of Examples
10.1. QF-Test import URL sample ........................................................................................ 112

Chapter 1. Key Features
1.1.  Klaros-Testmanagement Community Edition Features
The key features of Klaros-Testmanagement Community Edition are:

Management of Test Related Artifacts
    Klaros-Testmanagement allows you to manage test cases, test suites, test environments,
    systems under test, test runs and their results.

Grouping Test Cases into Test Suites
    Klaros-Testmanagement allows you to group test cases into test suites. Test suites are
    the common unit of test execution and may be assigned to individual systems under test.

Version Control of Test Cases and Test Suites
    Klaros-Testmanagement supports version control of test cases and test suites. Different
    versions can be applied to individual systems under test and test environments with full
    traceability of their results.

Support for Binary Attachments
    Klaros-Testmanagement allows you to store binary attachments of any kind (text documents,
    graphics, screenshots etc.) along with test cases and test results.

Statistics
    With Klaros-Testmanagement it is easy to gather reports and statistics about all test
    results, since all test-related data is stored in the database and may be retrieved later.

Report File Export
    Report files can be exported to various file formats such as PDF, HTML, CSV, RTF, TXT
    and XML.

Guided Manual Test Execution
    Klaros-Testmanagement supports the tester in executing manual tests. Using a web based
    client, the tester is guided through the test execution, with the possibility to take
    notes along the way; the test results are automatically recorded by Klaros-Testmanagement.

Resumption of Interrupted Work
    Klaros-Testmanagement allows manual testing to be interrupted at any time; work on a
    test suite execution can be continued later.

Security
    The Klaros-Testmanagement web application includes user management in conjunction with
    a role system that allows limiting the capabilities of individual users.

Interoperability with Continuous Integration Systems (Hudson)
    A Hudson plug-in enables transferring the results of automated tests generated by the
    Hudson Continuous Integration System to Klaros-Testmanagement.

Interoperability with Issue Tracking Systems
    The connection to the JIRA, Trac, Bugzilla or Redmine issue management systems allows
    test artifacts to be linked directly with issues in the issue tracker.

Import Interfaces for QF-Test™ and JUnit Test Results
    Test results generated by other test automation software can easily be imported and
    merged with manual test results.

Backup and Restore
    Complete test projects can be backed up and restored. The restore interface can be used
    to import external data via XML.


1.2.  Klaros-Testmanagement Enterprise Edition Features


The additional features of the Klaros-Testmanagement Enterprise Edition are:

User Defined Custom Fields
    Klaros-Testmanagement Enterprise Edition allows the definition and usage of custom
    fields for test cases, test suites, test environments and systems under test.

User Defined PDF Reports
    Klaros-Testmanagement Enterprise Edition supports the definition and generation of
    arbitrary PDF reports using a simple yet powerful programming interface.

Configurable Dashboard
    Klaros-Testmanagement Enterprise Edition allows each user to individually configure
    the dashboard from various reports.

New Dashboard Reports
    Klaros-Testmanagement Enterprise Edition introduces new dashboard reports which are
    exclusive to the Enterprise Edition.

LDAP Support for User Authentication
    Klaros-Testmanagement Enterprise Edition allows authenticating users against an
    external LDAP directory server which holds the passwords of Klaros-Testmanagement users.

Excel Export of Data Tables
    Klaros-Testmanagement Enterprise Edition allows the direct export of data tables to
    Excel format for further analysis.

Chapter 2. Introduction
Klaros-Testmanagement is an easy to use web based application which helps to organize test
projects. Klaros-Testmanagement manages test cases, test suites, information about tested
systems and the test environments in which a test has been run. When a test case or test suite
has been executed, its result, including the information about the tested system and its test
environment, is stored in a database. This enables full traceability of all test runs. Multiple
reports provide a detailed overview of the progress of the test project at any time.

2.1. Overview
This section gives a short overview of Klaros-Testmanagement. It describes the structure and the relation
of the components and features used in the application ( Figure 2.1 ).

The main artifacts of Klaros-Testmanagement are listed below:

• Project

A project is the main unit that contains all artifacts that are needed for the test project.

• Test Environment

A test environment represents the extrinsic setting that may influence the test result. Examples for
parts of a test environment would be the operating system or an application server.

• Test System, System Under Test (SUT)

A SUT represents a particular version of the component, system, or product that is tested.

• Test Case

The test case is the basic unit of a test process. A test case can be related to many test environments
and SUTs; thus it is possible to execute the same test case in different settings and get different results.

• Test Case Results

After a test case has been executed, it has a result. This result can be Passed, Failed or Error.
Klaros-Testmanagement enriches this information with the combination of system under test and
test environment for which the test has been run, as well as further information gathered from
the execution, such as the execution time of the test.

• Test Suite

A test suite is a list of test cases, allowing test cases to be grouped and executed together. A test
case can be part of more than one test suite.

• Test Suite Results

The test suite result is an aggregation of the results of the test cases that are included in the test
suite. The result of a test suite can be Passed, Failed or Error. Additionally, the test suite result
records information such as the system under test and the test environment in which the suite has
been run, its execution time, the person who executed the test suite and much more. The execution
of a test suite can be suspended at any time and continued later.

There are three roles for users defined in Klaros-Testmanagement: Administrator, Manager and Tester.
These roles are described in more detail in Section 2.2, “Roles”. Figure 2.1 gives an overview of
Klaros-Testmanagement and displays the user roles and the relations between the artifacts.


Figure 2.1. Klaros-Testmanagement Overview Diagram

After a project has been created, test cases can be added. Test cases can be grouped into test
suites. Before running test cases or test suites, at least one system under test and at least one
test environment have to be defined for the project.

When all necessary artifacts are defined, a test case can be executed by performing a single test
case, which will generate a test case result, or a group of tests can be performed by executing a
test suite. The outcome of the test case or test suite is stored in a test case result or a test
suite result, the latter containing the individual test case results of that test run.

Besides the test case results and test suite results, Klaros-Testmanagement generates reports that
can be exported to several document formats (PDF, HTML, CSV, RTF, XML).

Important
Report generation is not available to all roles. To learn more about the role system, see
Section 2.2, “Roles”.

Figure 2.2 shows the structure of a project.


Figure 2.2. Artifact Relation Diagram

Test case/suite runs, test environments and SUTs are directly related to a project, so their existence
is essential. To run a test case or test suite it is mandatory to have a test environment and a SUT
configured.

Figure 2.3 shows the relation between test cases and test suites.

Figure 2.3.  Test Cases and Test Suites Relation Diagram


Klaros-Testmanagement supports the reuse of test cases. A test case can be part of multiple test suites.
The sharing of test cases is only possible inside the same project.

Figure 2.4 shows a workflow for Klaros-Testmanagement and the artifacts created.

Figure 2.4.  Klaros-Testmanagement Follow Up Diagram

As a first step, all objects like test environments and systems under test (SUT) must be defined.
Subsequently, test cases and test suites can be created. The statistics of the project are updated
for each object added in Klaros-Testmanagement. Reports can be generated after a test execution.

2.2. Roles
Klaros-Testmanagement defines three roles that a user can be assigned to: Administrator, Manager
and Tester. An administrator is able to create other users with the role Administrator, Manager or
Tester. A manager is able to create other users with the role Tester. A tester cannot create any
other users.

The Administrator has access to all functionalities available in Klaros-Testmanagement and is the only
user that can edit preferences such as e-mail settings, issue management server locations, status mes-
sage preferences and general parameters, as described in Chapter 8, Configure.

The Manager has access to all functionalities except editing the preferences mentioned above. He is able
to create, edit, delete and search for artifacts, show results, run test cases and test suites and generate
reports. The artifacts are the projects, test environments, systems under test, test cases and test suites.

The Tester has fewer permissions than the manager. He is only able to search for objects, inspect
results and execute test cases and test suites.

For a detailed view of the permissions please consult the permission matrix in Appendix A, Role Permis-
sion Overview.

Chapter 3. Installation
Klaros-Testmanagement is equipped with an installer for Microsoft Windows and Linux based operating
systems. The installer will install all the necessary programs to run Klaros-Testmanagement,
including a file based database (Apache Derby) and the Apache Tomcat application server.

Warning
The installer installs a version of Klaros-Testmanagement that uses the Apache Derby
database for persistence. For production systems, it is strongly recommended to use a
database like MySQL, PostgreSQL or Oracle which will provide a significantly better perfor-
mance. After running the initial installation process, the database can be changed. How
this is done is described in Section 3.5, “Changing the Database”.

This chapter provides all necessary information to install Klaros-Testmanagement.

3.1. Prerequisites
Since Klaros-Testmanagement is a web application the installation prerequisites for the client and the
server side are different. These are explained in detail in the next sections.

3.1.1. Client Prerequisites
To access Klaros-Testmanagement a modern web browser supporting JavaScript is required.

Name Version
Microsoft Internet Explorer Version 7 and above
Mozilla Firefox Version 3.5 and above
Google Chrome Version 4 and above
Safari Version 4 and above

Table 3.1. Supported Browsers

Please note that due to its current state, Konqueror is not a supported browser.

3.1.2. Server Prerequisites
To run the Klaros-Testmanagement server a Microsoft Windows or Linux operating system running Java
5 or later is required.

For optimal performance a separate database installation is recommended. This database is not required
to run on the same physical machine as the Klaros-Testmanagement server.

Name Description
Apache Derby Version 10.5.0.3 and above (already configured)
MySQL Version 5.1 and above
PostgreSQL Version 8.4 and above
Oracle Version 10g, 11g

Table 3.2. Supported Databases

Make sure that the JAVA_HOME/JRE_HOME environment variable is set!

A common caveat is that although Java is installed on your machine, the startup script will
complain that neither the JAVA_HOME nor the JRE_HOME environment variable is defined.


Windows users will find an option to set this in the computer control panel. The location of this
option varies between versions of Windows; an example is shown below. Linux users should set the
variables in their startup scripts.

Figure 3.1. Setting the JAVA_HOME environment variable
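For Linux users, a minimal sketch of setting the variable in a startup script such as ~/.profile.
The JDK path below is an example only and depends on where Java is installed on your system:

```shell
# Example: make JAVA_HOME visible to the Tomcat startup scripts.
# The JDK location below is an assumption; substitute the actual
# installation path of your Java runtime.
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export PATH="$JAVA_HOME/bin:$PATH"
echo "JAVA_HOME is set to $JAVA_HOME"
```

After adding these lines, start a new shell (or source the script) before launching Tomcat so
the variable is picked up.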

3.1.2.1. Microsoft Windows
To run Klaros-Testmanagement on Microsoft Windows operating system the following requirements
have to be met:

• Minimum requirements

1GB RAM, 2GHz CPU

• Recommended requirements

2GB RAM, 2.5GHz Dual-Core CPU

• Operating system

Microsoft Windows XP, Microsoft Windows Vista or Microsoft Windows 7

• Java Runtime Environment

Sun Java 2 Standard Edition 6.0 or better

3.1.2.2. Linux
To run Klaros-Testmanagement on Linux operating system the following requirements have to be met:


• Minimum requirements

1GB RAM, 2GHz CPU

• Recommended requirements

2GB RAM, 2.5GHz Dual-Core CPU

• Operating system

Linux IA32 (Intel 32 bit architecture) distribution containing Version 2.2.1 or better of the GTK+ widget
toolkit and associated libraries

GTK+ is only needed for running the Klaros-Testmanagement installer.

• Java Runtime Environment

Sun Java 2 Standard Edition 6.0 or better

3.2. Installation Process
On Windows the installer is invoked by clicking Klaros-Setup.exe .

On Linux the installer can be started by entering java -jar Klaros-Setup.jar in a user shell.

The following screens show each step to install Klaros-Testmanagement.

3.2.1. Step 1: Welcome
The initial step shows a welcome screen to the user ( Figure 3.2 ).

Figure 3.2.  Welcome Screen

The installation may be aborted by clicking the Quit button. Clicking Next proceeds with the installation.


3.2.2. Step 2: Information
The second step shows information about the product and the revision history, listing the fixed
issues and the newly added features ( Figure 3.3 ).

Figure 3.3.  Information Screen

The installation may be aborted by clicking the Quit button. Clicking Previous goes back to the "Wel-
come" step and clicking Next proceeds with the installation.

3.2.3. Step 3: Licensing Agreements


The third step shows information about the license agreement for Klaros-Testmanagement. The license
must be accepted to continue the installation ( Figure 3.4 ).


Figure 3.4.  Licensing Agreement Screen

The installation may be aborted by clicking the Quit button. Clicking Previous goes back to the "Infor-
mation" step and clicking Next proceeds with the installation.

3.2.4. Step 4: Target Path


The fourth step requests the target path where Klaros-Testmanagement will be installed. The user can
use the Browse button to search for the specific path in the local file system ( Figure 3.5 ).

Important
On Windows Vista the installation path should not be the Program Files folder. To install
Klaros-Testmanagement in the Program Files folder, the User Account Control (UAC) must be
disabled.


Figure 3.5.  Target Path Screen

The installation may be aborted by clicking the Quit button. Clicking Previous goes back to the "Licens-
ing Agreement" dialog and clicking Next proceeds with the installation.

3.2.5. Step 5: Select Installation Packages


The fifth step allows selecting the packages to be installed with Klaros-Testmanagement.


Figure 3.6.  Select Installation Packages Screen

The installation may be aborted by clicking the Quit button. Clicking Previous goes back to the "Target
Path" dialog, clicking Next proceeds with the installation.

3.2.6. Step 6: User Data


In the sixth step the Tomcat server settings can be changed.


Figure 3.7.  User Data Screen

It is possible to set the ports used by Tomcat (Server-Port, HTTP-Port, HTTPS-Port and AJP/1.3-Port),
the minimum and maximum amount of memory available to the Tomcat process, as well as the user name
and password of the Tomcat administrator.

The installation may be aborted by clicking the Quit button. Clicking the Previous button goes back
to the "Select Installation Packages" step and clicking the Next button proceeds with the installation.

3.2.7. Step 7: Installation
The seventh step starts the installation of Klaros-Testmanagement and shows its progress. The
following screenshots show the installation in progress and the installation finished ( Figure 3.8
and Figure 3.9 ).


Figure 3.8.  Installation in Progress Screen

Figure 3.9.  Installation Finished Screen

The installation may be aborted by clicking the Quit button. Clicking Next proceeds with the installation.


3.2.8. Step 8: Perform External Processes


The eighth step performs external processes such as setting the environment variables and starting
the Tomcat service.

Figure 3.10.  Perform External Processes Screen

The installation may be aborted by clicking the Quit button. Clicking Next proceeds with the installation.

3.2.9. Step 9: Setup Shortcuts


The ninth step requests the options for shortcuts. The user can create shortcuts in the Start menu
and on the desktop ( Figure 3.11 ).


Figure 3.11.  Setup Shortcuts Screen

This step looks different on Linux installations but has the same functionality.

The installation may be aborted by clicking the Quit button. Clicking Previous goes back to the "Perform
External Processes" step and clicking Next proceeds with the installation.

3.2.10. Step 10: Installation Finished


The tenth step notifies the user that Klaros-Testmanagement was installed successfully and shows the
path to the uninstaller program ( Figure 3.12 ).


Figure 3.12.  Installation Finished Screen

The Generate an automatic installation script button saves a complete script of the installation
with the given user choices, so that the same installation can be repeated unattended or on other
machines.

The installation can be completed by clicking the Done button.

3.3. Update Process
If Klaros-Testmanagement is already installed it is possible to update it to a newer version. The
installer will update all files automatically. Further information about the installation process
can be found in Section 3.2, “Installation Process”. The Klaros-Testmanagement home folder remains
untouched, so all settings, database connections and the content repository are not affected by
the update process.

Note
Please create a backup of your database before starting the Klaros-Testmanagement up-
date process.

3.3.1. Database Migration
After updating Klaros-Testmanagement to a newer version it is possible that a database migration
is necessary. If so, Klaros-Testmanagement will show the following screen on startup ( Figure 3.13 ).
To start the migration an administrator account is needed.


Figure 3.13.  Database Migration

3.3.1.1. Migration to 3.0.0
In order to update Klaros-Testmanagement to version 3.0.0 it is necessary to have
Klaros-Testmanagement 2.6.2 installed. Older versions need to be updated to version 2.6.2 first,
including the database migration.

3.4.  Running Klaros-Testmanagement


To start Klaros-Testmanagement open the following URL in a web browser:
http://HOST:HTTP-Port/klaros-web/ where HOST is the IP address or domain name of the application
server and HTTP-Port is the port defined in Section 3.2.6, “Step 6: User Data” .
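For example, with the placeholders filled in for a hypothetical server and HTTP port (both
values below are illustrative only, not defaults):

```shell
# Assemble the Klaros URL from the HOST and HTTP-Port placeholders
# described above; both values here are made-up examples.
HOST=testserver.example.com
HTTP_PORT=18080
echo "http://$HOST:$HTTP_PORT/klaros-web/"
```

which prints http://testserver.example.com:18080/klaros-web/ — the address to enter in the
browser.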

The first time Klaros-Testmanagement is executed, it will create a .klaros folder containing
Klaros-Testmanagement and database property files. The .klaros folder is created in the home
directory of the current user. For Microsoft Windows users it is located in C:\Documents and
Settings\username and for Linux users in the user's home directory.

3.5. Changing the Database


By default Klaros-Testmanagement uses the Apache Derby database. In production systems, it is
strongly recommended to use a full-grown database such as MySQL or PostgreSQL.

To change the database, Klaros-Testmanagement must be stopped, and the hibernate properties of
Klaros-Testmanagement located in <userhome>/.klaros/hibernate.properties must be
changed.

The properties depend on the used database.

To use MySQL change the content of the file to:

      
      hibernate.dialect=org.hibernate.dialect.MySQLDialect
      hibernate.connection.driver_class=com.mysql.jdbc.Driver
      hibernate.connection.url=jdbc:mysql://localhost:3306/klaros
      hibernate.connection.username=root
      hibernate.connection.password=root
    

To use PostgreSQL change the content of the file to:

      hibernate.connection.driver_class = org.postgresql.Driver
      hibernate.connection.url = jdbc:postgresql://localhost/klaros
      hibernate.dialect = org.hibernate.dialect.PostgreSQLDialect
      hibernate.connection.username = root
      hibernate.connection.password = root


An exhaustive list of all parameters can be found in the Hibernate Core Manual (see Hibernate Core
Manual ).

Tip
Klaros-Testmanagement will neither automatically create the database instance ( klaros
in the above example) nor the database user (user root with password root) on the
database server. Creating a database instance and adding a user is described in the corre-
sponding database manual and will not be covered here. The database user needs permis-
sions to create, drop and alter tables to properly bootstrap the Klaros-Testmanagement
database instance.
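
As an illustrative sketch only (not official Klaros documentation): on MySQL, the database instance and a
dedicated user could be created with statements like the following. The database name klaros matches the
JDBC URL in the example above; the user name klaros and password secret are assumptions, so if you use a
dedicated user instead of root, enter its credentials in hibernate.properties accordingly.

```sql
-- Illustrative sketch; consult the manual of your database version for exact syntax.
-- Create the database instance referenced in the JDBC URL above.
CREATE DATABASE klaros CHARACTER SET utf8;

-- Create a dedicated database user ('klaros'/'secret' are assumed names).
CREATE USER 'klaros'@'localhost' IDENTIFIED BY 'secret';

-- Grant the permissions mentioned in the tip (create, drop and alter tables),
-- plus the usual data access rights.
GRANT CREATE, DROP, ALTER, INDEX, SELECT, INSERT, UPDATE, DELETE
    ON klaros.* TO 'klaros'@'localhost';
FLUSH PRIVILEGES;
```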

3.6. Configuring External Issue Management Systems


Depending on the type of issue management system, some configuration effort is needed to connect
it to Klaros-Testmanagement. This section describes the necessary installation steps for each system.

3.6.1. Bugzilla Configuration
3.6.1.1. Supported versions
The Klaros-Testmanagement Bugzilla Integration currently supports Bugzilla version 3.0 and later.

3.6.1.2. Additional Bugzilla Configuration


No additional configuration is required for Bugzilla.

3.6.2. JIRA Configuration
3.6.2.1. Supported versions
The Klaros-Testmanagement JIRA Integration currently supports JIRA 3.7 and later.

3.6.2.2. Additional JIRA Configuration


To allow Klaros-Testmanagement to connect to the JIRA server instance, the JIRA RPC Plugin has
to be activated inside your JIRA installation. You find this option as Accept remote API calls in General
Configuration under Global Settings . Then you need to enable the JIRA RPC Plugin in Plugins under
System in the left-hand menu.

3.6.3. Redmine Configuration
3.6.3.1. Supported versions
The Klaros-Testmanagement Redmine Integration currently supports Redmine 0.8.7 (stable) and
0.9.4 (stable).

3.6.3.2. Additional Redmine Configuration


The Klaros-Testmanagement Redmine Integration requires the installation of the Redmine WS-API
Plugin. The download archives are available at http://sourceforge.net/projects/redmin-mylyncon/files/ .
The installation notes can be found at http://sourceforge.net/apps/mediawiki/redmin-mylyncon/index.php?title=Installation .

The following combinations have been successfully tested:


• Redmine 0.8.7 / Redmine WS-API Plugin 2.6.0

• Redmine 0.9.4 / Redmine WS-API Plugin 2.6.4

Later versions are expected to work but are currently untested.

3.6.4. Trac Configuration
3.6.4.1. Supported versions
The Klaros-Testmanagement Trac Integration currently supports Trac 0.10 and later.

3.6.4.2. Additional Trac Server Configuration


The Klaros-Testmanagement Trac Integration requires the installation of the TracXMLRPC Plugin. The
download archives and installation notes are available at http://trac-hacks.org/wiki/XmlRpcPlugin/ . Af-
ter successful installation of the plugin it has to be activated using the Trac web administration interface.

The following combinations have been successfully tested:

• Trac 0.11.5 / TracXMLRPC 1.0.6

Later versions are expected to work but are currently untested.

Chapter 4. Functional Overview
This chapter gives a coarse overview of the usage of the application.

4.1.  Login
Figure 4.1 shows the login screen of Klaros-Testmanagement. To log in, enter the user name in the
User name field and the associated password in the Password field.

Figure 4.1.  Login Screen

The Klaros-Testmanagement database predefines three default user accounts with different roles:

• Administrator

Username: admin / Password: admin

• Manager

Username: manager / Password: manager

• Tester

Username: tester / Password: tester

For a description of the user roles and the permissions associated with each role please refer to
Appendix A, Role Permission Overview .

Confirm the login to Klaros-Testmanagement by clicking the Login button.

Klaros-Testmanagement is multilingual
By default Klaros-Testmanagement uses the language selected by the web browser. If you want
to choose a different language, please select the corresponding flag at the top right corner of the
screen. The language changes immediately.


4.2. Main Functions
After a successful login the main screen of Klaros-Testmanagement is presented to the user ( Figure 4.2 ).

Figure 4.2. Maintain a Project Screen

Klaros-Testmanagement uses a horizontal menu at the top of the screen. Its categories correspond to
activities in the testing project. The function categories presented in the main screen are:

Define The Define section of Klaros-Testmanagement allows the defini-
tion and viewing of all artifacts that are needed in a project. These
artifacts are the project itself, the test environments, the systems
under test, the test cases and the test plans.

Execute The Execute section of Klaros-Testmanagement allows the execu-
tion of manual test cases and test suites. Before a test is executed,
the tester must select the combination of system under test and
test environment for which the test is executed. The tester is then
guided through the manual test step by step, and each executed
step can be enriched with comments.

If the tester is interrupted while executing a long test suite, the
execution of the test suite can be continued later.

Evaluate The Evaluate section provides all functionality for analyzing the
test project. It provides coarse grained overview reports in the
dashboard, as well as fine grained information, so that each exe-
cution of a test can be traced.

Configure The Configure menu is used for administrative tasks, e.g. the user
management or the import and export of testing projects.

The following chapters will describe each category in detail.

Chapter 5. Define

This section describes how to define the main artifacts of Klaros-Testmanagement: the projects, envi-
ronments, systems under test, test cases and test suites. These artifacts are managed in the define sec-
tion of the main menu. For each artifact a menu entry on the left hand side is provided.

5.1.  Defining a Project


The Projects section allows the user to select and manage the projects in Klaros-Testmanagement.
The available management actions are creating, removing, updating and searching projects.

When accessing Klaros-Testmanagement for the first time it is necessary to create a project to work with.
To create a new project select the menu entry Maintain Projects ( Figure 5.2 ) and follow Section 5.1.2.1,
“ Creating a Project ” .

Usually, however, a user simply selects one of the already defined projects, as described in the fol-
lowing section.

5.1.1.  Select a Project


The main screen after a successful login is the Maintain a Project Screen ( Figure 5.1 ). It gives the user
an overview of the projects stored in the Klaros-Testmanagement database. Each project is listed
in a single row.

To continue, the user is required to select a project by clicking on the check box of the desired project.

Choosing a project with the project quick select drop-down field

The header bar of Klaros-Testmanagement contains a drop-down field labeled Project .
With this drop-down field the active project can be switched quickly on any Klaros-Test-
management page. You will be directed to the top page of your current category (Define,
Execute etc.).

Figure 5.1. The Project Selection Screen

After a project is selected, the previously disabled functions become available.

5.1.1.1. Searching and Sorting Projects


It is possible to search for projects and sort the results of the search with the filtering and sorting options.
The filtering and sorting options become visible by opening the Filter / Sort panel. The Filter / Sort panel
contains two tables. The left table contains the filtering parameters, the right table contains the sorting
parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all projects.

5.1.1.1.1. Specifying Search Criteria for Projects


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all projects is filtered for the conjunction of all criteria listed in the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the project selection page there are two fields that can be filtered: ID and Description.

• The Type column denotes the operator that is used for the criterion. In the project selection page the
following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are
shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.
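
The conjunction semantics described above can be illustrated with a small sketch. This model is not
part of Klaros-Testmanagement; it merely imitates the behavior of the filter table, using the operator
names from the list above and hypothetical project records:

```python
# Illustrative model of the Filter panel: a row is kept only if it satisfies
# ALL criteria (conjunction); "Like" is a substring match.
OPERATORS = {
    "Equals": lambda field, value: field == value,
    "Not equals": lambda field, value: field != value,
    "Greater than or equals": lambda field, value: field >= value,
    "Less than or equals": lambda field, value: field <= value,
    "Greater than": lambda field, value: field > value,
    "Less than": lambda field, value: field < value,
    "Like": lambda field, value: value in field,
}

def apply_filter(rows, criteria):
    """Keep only the rows that satisfy every (field, operator, value) criterion."""
    return [
        row for row in rows
        if all(OPERATORS[op](row[field], value) for field, op, value in criteria)
    ]

projects = [
    {"ID": "P00001", "Description": "Web shop tests"},
    {"ID": "P00002", "Description": "Mobile client tests"},
]
# Two criteria combined by conjunction: ID greater than "P00001"
# AND Description containing "tests". Only "P00002" satisfies both.
result = apply_filter(projects, [("ID", "Greater than", "P00001"),
                                 ("Description", "Like", "tests")])
```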

5.1.1.1.2. Specifying Sorting Criteria for Projects


The table that contains the list of projects can be sorted. The criteria for the sorting are specified in the
right table in the Filter / Sort panel. It is possible to specify more than one sorting option. If there is
more than one sorting option, the uppermost sorting criterion has the highest priority and the lowest
row in the sorting criterion table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.


The sorting options table contains three columns:

• The Order By column denotes the field in the results table that the sorting applies to. The fields that
can be sorted for in the select projects screen are ID and Description.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.
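
The priority rule above (the uppermost criterion has the highest priority) corresponds to an ordinary
multi-key sort. The following sketch is illustrative only and not part of the product; the rows and field
names are hypothetical:

```python
def apply_sorting(rows, criteria):
    """Sort rows by (field, direction) criteria; the first criterion has the
    highest priority.

    Python's sort is stable, so sorting by the criteria in reverse order
    yields exactly the priority behavior described above.
    """
    for field, direction in reversed(criteria):
        rows = sorted(rows, key=lambda row: row[field],
                      reverse=(direction == "Descending"))
    return rows

rows = [
    {"ID": "P2", "Description": "alpha"},
    {"ID": "P1", "Description": "beta"},
    {"ID": "P3", "Description": "alpha"},
]
# Highest priority: Description ascending; within equal descriptions: ID descending.
# Result order: P3 and P2 (both "alpha", ID descending), then P1 ("beta").
result = apply_sorting(rows, [("Description", "Ascending"), ("ID", "Descending")])
```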

5.1.2.  Maintaining a Project


The Maintain Projects page allows the user to create, remove, update and search projects ( Figure 5.2 ).
All these functions are available on the same screen.

Figure 5.2. The Maintain Projects Screen

5.1.2.1.  Creating a Project


By clicking on the New button the user is able to create a new project ( Figure 5.2). An empty row is added
at the beginning of the table. A description can be specified. The Project ID is automatically assigned by
Klaros-Testmanagement. Press the Save button to confirm the creation or the Cancel button to discard.

5.1.2.2. Searching and Sorting Projects


It is possible to search for projects and sort the results of the search with the filtering and sorting options.
The filtering and sorting options become visible by opening the Filter / Sort panel. The Filter / Sort panel
contains two tables. The left table contains the filtering parameters, the right table contains the sorting
parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all projects.


5.1.2.2.1. Specifying Search Criteria for Projects


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all projects is filtered for the conjunction of all criteria listed in the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the maintain projects page there are two fields that can be filtered: ID and Description.

• The Type column denotes the operator that is used for the criterion. In the maintain projects page the
following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are
shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

5.1.2.2.2. Specifying Sorting Criteria for Projects in the Maintain Projects Screen


The table that contains the list of projects can be sorted. The criteria for the sorting are specified in the
right table in the Filter / Sort panel. It is possible to specify more than one sorting option. If there is
more than one sorting option, the uppermost sorting criterion has the highest priority and the lowest
row in the sorting criterion table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that the sorting applies to. The fields that
can be sorted for in the maintain projects screen are ID and Description.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

5.1.2.3. Deleting a Project

By clicking on the Delete icon the project will be removed from the list of selectable projects.


Why is the Delete button sometimes disabled?


Projects that already contain test cases can no longer be deleted by users, so their
Delete icon is disabled. Only an administrator can delete such a locked project, which may
be restored later as long as it has not yet been finally purged from the database.

5.1.2.4. Restoring a Project

By clicking on the Restore icon a previously deleted project can be restored to the list of selectable
projects. Only an administrator is able to restore a deleted project.

5.1.2.5. Purging a Project

By clicking on the Purge icon a previously deleted project can be finally purged from the database.
Only an administrator is able to purge a deleted project. This operation cannot be undone.

5.1.2.6. Editing a Project Description


The project description can be used to give a short overview of the project. By clicking on the descrip-
tion field, the user can directly change the current value ( Figure 5.2 ). Press the Save button to
submit the changes that have been made or the Cancel button to discard them.

5.1.2.7. Editing the Issue Management Properties of a Project

The issue management systems that are used in the project can be edited by pressing the Edit( ) icon.

The issue management properties page shows two tables: the upper table contains the configured issue
management systems that are already used in the project, the lower table shows all issue management
systems that are configured but not yet used by the project. To add an issue management system to the
project, press the add button ( ) next to the issue management system. The issue management system
will be removed from the lower table and added to the upper table. A screen shot of the page for editing
the assigned issue management systems is shown in Figure 5.3, “The Project Issue Management System
Selection Page” .

Figure 5.3. The Project Issue Management System Selection Page

Press Save to save the changes or Cancel to discard them.

For the configuration of issue management systems see Section 8.5.4, “ Issue Management ” .


5.1.2.8. Editing the user defined properties of a project

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

The user defined properties can be used to customize the Systems Under Test, Test Environments, Test
Cases and Test Suites by adding additional fields to them.

The New button can be used to add a new user defined property ( Figure 5.4, “The Project User
Defined Properties Page” ). After adding a new property it is possible to choose the entity and the
property type. There are four possible entities:

• System Under Test

• Test Environment

• Test Case

• Test Suite

The three possible types of a user defined property are:

Text The property will be a text field limited to 1024 characters

True/False The property will be a simple check box

Enumeration The property will be a drop down box to select a single entry from a predefined list
of values

Once the new property has been saved, its entity and type can no longer be changed. The name of the
property and the list of enumeration values can always be changed.

Figure 5.4. The Project User Defined Properties Page

Press Save to save the changes or Cancel to discard them. After saving, new user defined prop-
erties can be used immediately.


5.1.2.8.1. Enumerations
The Edit button beside the enumeration name opens the menu to edit the values of this enumeration
( Figure 5.5, “Editing the values of an enumeration” ).

Figure 5.5. Editing the values of an enumeration

5.2.  Test Environments


The Test Environments section allows the user to manage the different test environments of the project.
Test environments represent the extrinsic settings that may influence the test result. Examples of parts
of a test environment are the operating system or an application server.

5.2.1.  Maintaining Test Environments


The Maintain Test Environments page allows the user to create, remove, update and search test
environments ( Figure 5.6 ). All these actions can be performed directly in the table or on the details page.

Figure 5.6. The Maintain Test Environments Screen


The Maintain Test Environments screen shows a table with all the test environments defined for the cur-
rent project. Each row of the table represents a test environment, each column the attributes of the test
environment. By clicking into the table fields, the attributes can be edited directly.

Press the Save button to submit the changes that have been made or the Cancel button to discard.

By clicking on the Delete icon the test environment will be removed from the Klaros-Testmanagement
database.

Why is the Delete button sometimes disabled?

Test Environments that are referenced by test runs cannot be deleted, so their Delete
icon is disabled.

By clicking on the Edit icon the details page of the test environment will be displayed.

5.2.1.1. Creating a Test Environment


By clicking on the New button the user is able to create a new test environment ( Figure 5.6 ). An
empty row is added at the beginning of the table. The operating system and three other custom fields
can be specified here. The test environment ID is automatically assigned by Klaros-Testmanagement.
The project is the currently selected project. Press the Save button to confirm the creation or the Cancel
button to discard the changes.

5.2.1.2. Editing the user defined properties of a Test Environment

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

If user defined properties for test environments are configured, they can be edited on this tab
( Figure 5.7 ).

Note
The tab is disabled if no user defined properties for test environments exist. How to main-
tain the user defined properties is described in Section 5.1.2.8, “Editing the user de-
fined properties of a project” .

Figure 5.7. The Maintain Test Environments Screen


5.2.1.3. Searching and Sorting Test Environments


It is possible to search for test environments and sort the results of the search with the filtering and
sorting options. The filtering and sorting options become visible by opening the Filter / Sort panel. The
Filter / Sort panel contains two tables. The left table contains the filtering parameters, the right table
contains the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all test environments.

5.2.1.3.1. Specifying Search Criteria for Test Environments


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all test environments is filtered for the conjunction of all criteria listed in the filtering
table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the maintain test environments page the following fields can be filtered: ID and Description.

• The Type column denotes the operator that is used for the criterion. In the maintain test environments
page the following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are
shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

5.2.1.3.2. Specifying Sorting Criteria for Test Environments


The table that contains the list of test environments can be sorted. The criteria for the sorting are spec-
ified in the right table in the Filter / Sort panel. It is possible to specify more than one sorting option. If
there is more than one sorting option, the uppermost sorting criterion has the highest priority and the
lowest row in the sorting criterion table has the lowest priority.


When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that the sorting applies to. The fields that can
be sorted for in the maintain test environments screen are ID, Operating System, Custom 1, Custom
2 and Custom 3.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

5.3.  Maintaining Systems Under Test


The Systems Under Test (SUT) section allows the user to maintain the systems that can be tested. A SUT
represents a version of a software product that can be tested.

5.3.1.  Maintain Systems Under Test


The Maintain Systems Under Test page allows the user to create, remove, update and search SUTs
( Figure 5.8 ). This can be done by editing the table shown on that page or on the details page.

Figure 5.8. The Maintain Systems Under Test Screen

The table shows all defined systems under test. Each row in the table represents a system under test.
The version of the system under test can be changed by clicking into the fields and editing it directly.
To submit the changes press the Save button, to discard the changes press Cancel.

By clicking on the Delete icon the SUT will be removed from the Klaros-Testmanagement database.

Delete button disabled


SUTs that are already referenced by test runs cannot be deleted, so their Delete icon
is disabled.

By clicking on the Edit icon the details page of the system under test will be displayed.

5.3.1.1. Creating a System Under Test


By clicking on the New button the user is able to create a new SUT ( Figure 5.8 ). An empty row is added
at the beginning of the table. The SUT version can be specified. The SUT ID is automatically assigned by
Klaros-Testmanagement. The project is the currently selected project. Press the Save button to confirm
the creation or the Cancel button to discard the changes. The user can create and save more than one
SUT at a time.

5.3.1.2. Editing the user defined properties of a System Under Test

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

If user defined properties for systems under test are configured, they can be edited on this tab
( Figure 5.9 ).

Note
The tab is disabled if no user defined properties for systems under test exist. How to main-
tain the user defined properties is described in Section 5.1.2.8, “Editing the user de-
fined properties of a project” .

Figure 5.9. The Maintain Systems Under Test Screen

5.3.1.3. Searching and Sorting Systems Under Test


It is possible to search for Systems Under Test and sort the results of the search with the filtering and
sorting options. The filtering and sorting options become visible by opening the Filter / Sort panel. The
Filter / Sort panel contains two tables. The left table contains the filtering parameters, the right table
contains the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all Systems Under Test.

5.3.1.3.1. Specifying Search Criteria for Systems Under Test


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all SUTs is filtered for the conjunction of all criteria listed in the filtering table.


By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the maintain SUTs page there are two fields that can be filtered: ID and Version.

• The Type column denotes the operator that is used for the criterion. In the maintain SUTs page the
following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are
shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

5.3.1.3.2. Specifying Sorting Criteria for Systems Under Test

The table that contains the list of Systems Under Test can be sorted. The criteria for the sorting are
specified in the right table in the Filter / Sort panel. It is possible to specify more than one sorting option.
If there is more than one sorting option, the uppermost sorting criterion has the highest priority and
the lowest row in the sorting criterion table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that the sorting applies to. The fields that
can be sorted for in the maintain SUTs screen are ID and Version.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

5.4.  Test Cases


A test case is a set of input values, execution preconditions, expected results and execution postcondi-
tions, developed for a particular objective or test condition, such as determining whether an application
or software system meets its specifications.


5.4.1.  Maintain Test Cases


The Maintain Test Cases page allows the user to create, edit, remove, clone and search test cases by
editing the table shown in Figure 5.10 .

Figure 5.10. The Maintain Test Cases Screen

The most important values of a test case ( Name , Traceability , Priority , State and Execution ) can be
edited directly by clicking into the corresponding input field in the table. More attributes of a test case
and its steps can be edited on a details page by clicking on the Edit icon
( Section 5.4.1.4, “Edit Test Case” ). Press the Save button to submit the changes that have been made
or the Cancel button to discard them.

Warning Sign in the ID Column


A test case whose execution is set to manual but which contains no steps is not executable. To indicate
this, a warning sign ( ) is displayed in the ID column.

5.4.1.1. Creating Test Cases

By clicking on the New button the user is able to create a new test case ( Figure 5.10 ). An empty row is
added at the beginning of the table. The fields Name , Traceability , Priority , State and Execution of the
test case can be specified. The test case ID is automatically assigned by Klaros-Testmanagement. The
project is the currently selected project. To set more properties the user may click on the Edit icon
( Section 5.4.1.4, “Edit Test Case” ). To create a copy of the test case press the Clone icon. Press the
Save button to confirm the creation or the Cancel button to discard the changes. The user can create
and save more than one test case at a time.

5.4.1.2. Editing the user defined properties of a Test Case

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

If user defined properties are configured for test cases, they can be edited on this tab ( Figure 5.11 ).


Note
The tab is disabled if no user defined properties for test cases exist. How to maintain the user defined
properties is described in Section 5.1.2.8, “Editing the user defined properties of a project” .

Figure 5.11. The Maintain Test Environments Screen

5.4.1.3. Searching and Sorting Test Cases


It is possible to search for test cases and sort the results of the search with the filtering and sorting
options. The filtering and sorting options become visible by opening the Filter / Sort panel. The Filter /
Sort panel contains two tables. The left table contains the filtering parameters, the right table contains
the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all test cases.

5.4.1.3.1. Specifying Search Criteria for Test Cases


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all test cases is filtered for the conjunction of all criteria listed in the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the maintain test cases page there are eighteen fields that can be filtered: ID, Name, Description,
Traceability, Priority, State, Execution, Team, Type, Area, Level, Variety, Docbase, Dependency, Note,
Evaluation, Precondition and Postcondition.


• The Type column denotes the operator that is used for the criterion. In the maintain test cases page
the following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are
shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.
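The way the criteria combine can be illustrated with a short sketch; the data, field names and helper function below are illustrative assumptions, not the Klaros implementation:

```python
import operator

# Maps the filter types listed above to comparison functions.
OPERATORS = {
    "Equals": operator.eq,
    "Not equals": operator.ne,
    "Greater than or equals": operator.ge,
    "Less than or equals": operator.le,
    "Greater than": operator.gt,
    "Less than": operator.lt,
    "Like": lambda field_value, value: value in field_value,  # substring match
}

def apply_filters(items, criteria):
    """Keep only the items matching ALL criteria (a conjunction)."""
    return [
        item for item in items
        if all(OPERATORS[op](item[field], value) for field, op, value in criteria)
    ]

test_cases = [
    {"Name": "Login succeeds", "Priority": "High"},
    {"Name": "Login fails", "Priority": "Low"},
    {"Name": "Logout", "Priority": "High"},
]
# Two criteria: Name contains "Login" AND Priority equals "High".
result = apply_filters(test_cases, [("Name", "Like", "Login"),
                                    ("Priority", "Equals", "High")])
```

Only the first test case satisfies both criteria, mirroring how each additional filter row narrows the result table further.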

5.4.1.3.2. Specifying Sorting Criteria for Testcases


The table that contains the list of test cases can be sorted. The criteria for the sorting are specified in
the right table in the Filter / Sort panel. It is possible to specify more than one sorting option. If there is
more than one sorting option, the uppermost sorting criterion has the highest priority and the lowest
row in the sorting criteria table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that the sorting applies to. The fields that can
be sorted for in the maintain test cases screen are ID, Name, Description, Traceability, Priority, State,
Execution, Team, Type, Area, Level, Variety, Docbase, Dependency, Note, Evaluation, Precondition
and Postcondition.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.
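This priority rule can be reproduced with a stable sort, as in the following sketch (illustrative data and function names; not the Klaros implementation): because a stable sort preserves the order of equal keys, applying the criteria from lowest to highest priority yields the described ordering.

```python
def apply_sorting(rows, criteria):
    """criteria: (field, direction) pairs, highest priority first.

    Sorting by the lowest-priority key first and the highest-priority key
    last exploits sort stability to rank the rows as described above.
    Note: this sketch compares values as plain strings, so "High" sorts
    before "Low" alphabetically.
    """
    result = list(rows)
    for field, direction in reversed(criteria):
        result.sort(key=lambda row: row[field],
                    reverse=(direction == "descending"))
    return result

rows = [
    {"Priority": "High", "Name": "B"},
    {"Priority": "Low", "Name": "A"},
    {"Priority": "High", "Name": "A"},
]
# Primary criterion: Priority ascending; secondary criterion: Name ascending.
ordered = apply_sorting(rows, [("Priority", "ascending"),
                               ("Name", "ascending")])
```

Rows with equal Priority are ordered by Name, exactly as the uppermost criterion taking precedence demands.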

5.4.1.4. Edit Test Case

By clicking the Edit icon the view will change ( Figure 5.12 ) and offers the possibility to change the
following attributes:

• ID

Assigned automatically

• Revision

Assigned automatically and incremented every time a test case is changed


• Team

The team that executes the test case

• Name

The name allocated to the test case

• Description

A brief description of the test case

• Precondition

The precondition of the test case describes everything that is necessary to execute it, e.g. "It
is necessary to have a database with valid data.".

• Postcondition

The postcondition defines the criteria that must be fulfilled after the test case has been executed, e.g.
the postcondition of a successful login test case is that the user is authenticated and the home page
of the application appears.

• Area

The requirements covered by the test case

• Design Technique

The method of execution: Black-Box or White-Box

• Variety

The expected behavior of the test run: Positive or Negative

• Level

The test stage(s) of the test case, for example unit test or system test

• Priority

The priority of the test case: Low, Medium or High

• Docbase

Reference to the document base of the test case, which contains the requirements of the test case

• Note

An additional note for the test case, containing supplementary information that may be useful
for users to know

• Dependency

The dependencies of the test case (another test case)

• Evaluation

The mode of the result evaluation: Manual or Automatic

• Execution

The mode of execution of the test case: Manual or Automatic


• Traceability

A reference to the corresponding use case or work package

• State

Only test cases with the state Draft are editable. To edit a test case in another state, change its state
back to Draft and save first. A Locked test case is not executable and a Skip test case is not shown in any reports.
Once the test case is approved, set the state to Approved .

Figure 5.12. The Edit Test Cases Screen

Press the New button to get an empty form for a new test case. All the data changed in the previous
form is discarded.

Press the Save button to submit the changes that have been made or the Cancel button to discard the
changes.

How can a new revision of the test case be created?


To create a new revision of the test case the checkbox Create new revision? must be selected.

By clicking on the Delete icon the test case will be removed from the Klaros-Testmanagement database.

Warning
Only test cases that are not referenced by results or test suites can be deleted.

5.4.1.4.1. Edit Test Steps


By clicking on the Steps tab a page will be shown that offers the possibility to change the following
attributes:


• Precondition

Condition that must be fulfilled before the execution of the test step, e.g. "The user is not yet logged
in."

• Action

The test action, e.g. "Enter name and password, and click button Login."

• Postcondition

Condition that must be fulfilled after the execution of the test step, e.g. "The user is authenticated
and has access to the system."

Why is the Steps tab disabled?


The purpose of test steps is to guide a tester through a manual test, so steps are only
sensible for manual tests. Because of this, the "Steps" tab is only enabled if the "Execution"
field is set to "Manual".

Figure 5.13. The Edit Test Steps Screen

To change the order of the test steps use the or icons.

Press the New button to create a new test step, the Back button to return to the test cases view or the
Cancel button to discard the changes.

By clicking on the Delete icon the test step will be removed from the Klaros-Testmanagement database
when the user saves the test case changes by clicking on the Save button in Figure 5.12 .

5.4.1.4.2. Assign Attachments

By clicking on the Attachments tab a page will be shown that offers the possibility to add attachments
to the test case. The Browse button opens a file selection dialog. The Upload button uploads the
selected file to Klaros-Testmanagement.


Figure 5.14. The Assign Attachments Screen

After the attachment is saved to the test case, it is possible to download the file by clicking on the
icon.

By clicking on the icon the attachment will be removed from the Klaros-Testmanagement database
when the user saves the test case changes by clicking on the Save button.

5.4.1.4.3. Revisions
By clicking on the Revisions tab a page will be shown that displays the revision history of the test case.

Warning
The tab is only enabled if the test case has a revision history to show.

Figure 5.15. The Revisions Screen


By clicking on the name link of the test cases, it is possible to browse and compare the revisions.

5.5.  Test Suites


A test suite is a set of test cases which can be executed in conjunction. The results of the test suite
execution are used to verify and ensure that a system meets its design specifications and requirements.

5.5.1.  Maintain Test Suites


The Maintain Test Suites page gives an overview of the test suites and allows the user to create, edit, remove and
search test suites ( Figure 5.16 ). This is accomplished by editing the table of test suites shown on the
Maintain Test Suites page.

Figure 5.16. The Maintain Test Suites Screen

By clicking on a field and editing the value directly, the user is able to change the current test
suite description ( Figure 5.16 ). The column Test Cases shows the number of test cases contained in
the test suite.

Press the Save button to submit the changes that have been made or the Cancel button to discard.

5.5.1.1. Create Test Suites


By clicking on the New button the user is able to create a new test suite ( Figure 5.16). An empty row
is added at the beginning of the table. The test suite description can be specified. The test suite ID is
automatically assigned by Klaros-Testmanagement. The project is the currently selected project. To add
or remove test cases, or to alter the description or the assigned system under test, the user may click on the
Edit icon ( Section 5.5.1.3, “Editing Test Suites”). To clone a test suite click on the Clone icon. Press the Save
button to confirm the creation or the Cancel button to discard the changes. The user can create and
save more than one test suite at a time.

How can a new revision of the test suite be created?


To create a new revision of the test suite the checkbox Create new revision? must be selected.

5.5.1.2. Searching and Sorting Test Suites


It is possible to search for test suites and sort the results of the search with the filtering and sorting
options. The filtering and sorting options become visible by opening the Filter / Sort panel. The Filter /


Sort panel contains two tables. The left table contains the filtering parameters, the right table contains
the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all test suites.

5.5.1.2.1. Specifying Search Criteria for Test Suites


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all test suites is filtered for the conjunction of all criteria listed in the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the maintain test suites page there are two fields that can be filtered: ID and Description.

• The Type column denotes the operator that is used for the criterion. In the maintain test suites page
the following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are
shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

5.5.1.2.2. Specifying Sorting Criteria for Test Suites


The table that contains the list of test suites can be sorted. The criteria for the sorting are specified in
the right table in the Filter / Sort panel. It is possible to specify more than one sorting option. If there is
more than one sorting option, the uppermost sorting criterion has the highest priority and the lowest
row in the sorting criteria table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.


The sorting options table contains three columns:

• The Order By column denotes the field in the results table that the sorting applies to. The fields that
can be sorted for in the maintain test suites screen are ID and Description.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

5.5.1.3. Editing Test Suites

By clicking the Edit icon the Edit Test Suite page will be opened. The page allows editing the test
cases that belong to the test suite, changing their order, changing the test suite description and
changing its system under test. The page is shown in Figure 5.17 . The input box in the upper left corner
can be used to change the description of the test suite; the drop-down to the right of it allows altering
the assigned system under test.

The assigned system under test of a test suite


The assignment of a system under test to a test suite does not affect the functionality of
Klaros-Testmanagement. The test suite can still be run with other systems under test. This
attribute has a purely informational character for the user.

By clicking the Add icon the test case is added to the test suite. By clicking the Remove icon the
test case is removed from the test suite. To change the order of the test cases press the or icons.

Figure 5.17. The Edit Test Suite Screen

Warning Sign in the ID Column


A manual test case that contains no steps is not executable. To indicate this, a warning sign
( ) is displayed in the ID column. If the test suite contains at least one test case that is
not executable, the execution of the whole test suite is inhibited.


Press the Save button to submit the changes that have been made or the Cancel button to discard the
changes.

5.5.1.3.1. Editing the user defined properties of a Test Suite

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

If user defined properties are configured for test suites, they can be edited on this tab ( Figure 5.18 ).

Note
The tab is disabled if no user defined properties for test suites exist. How to maintain the user defined
properties is described in Section 5.1.2.8, “Editing the user defined properties of a project” .

Figure 5.18. The Maintain Test Environments Screen

5.5.1.3.2. Searching and Sorting available Test Cases


For a better overview of the available test cases, it is possible to search for test cases and sort the results
of the search with the filtering and sorting options. The filtering and sorting options become visible
by opening the Filter / Sort panel. The Filter / Sort panel contains two tables. The left table contains the
filtering parameters, the right table contains the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all test cases.

Specifying Search Criteria for available Test Cases

Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all test cases is filtered for the conjunction of all criteria listed in the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:


• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
there are eighteen fields that can be filtered: ID, Name, Description, Traceability, Priority, State,
Execution, Team, Type, Area, Level, Variety, Docbase, Dependency, Note, Evaluation, Precondition
and Postcondition.

• The Type column denotes the operator that is used for the criterion. In the maintain test cases page
the following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are
shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

Specifying Sorting Criteria for available Test Cases

The table that contains the list of test cases can be sorted. The criteria for the sorting are specified in
the right table in the Filter / Sort panel. It is possible to specify more than one sorting option. If there is
more than one sorting option, the uppermost sorting criterion has the highest priority and the lowest
row in the sorting criteria table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that the sorting applies to. The fields that can
be sorted for in the maintain test cases screen are ID, Name, Description, Traceability, Priority, State,
Execution, Team, Type, Area, Level, Variety, Docbase, Dependency, Note, Evaluation, Precondition
and Postcondition.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

5.5.1.3.3. Revisions
By clicking on the Revisions button a page will be shown that displays the revision history of the test suite.

Warning
The button is only enabled if the test suite has a revision history to show.


Figure 5.19. The Revisions Screen

By clicking on the name link of the test suites, it is possible to browse and compare the revisions.

Chapter 6. Execute

This chapter describes how to execute test cases and test suites, how to continue unfinished test suite
executions and finally how to create issues for tests that failed.

6.1.  Start New Run

6.1.1.  Run Single Test Case


Executing a test case consists of running a test on a specific version of the SUT in a selected test envi-
ronment.

The Run a Single Test Case page shows all test cases in a list. To get a better overview of the available
test cases the sorting and searching functionality of the Filter / Sort panel can be used.

6.1.1.1. Searching and Sorting Test Cases


It is possible to search for test cases and sort the results of the search with the filtering and sorting
options. The filtering and sorting options become visible by opening the Filter / Sort panel. The Filter /
Sort panel contains two tables. The left table contains the filtering parameters, the right table contains
the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all test cases.

6.1.1.1.1. Specifying Search Criteria for Test Cases


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all test cases is filtered for the conjunction of all criteria listed in the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the Run A Single Test Case page there are eighteen fields that can be filtered: ID, Name, Description,
Traceability, Priority, State, Execution, Team, Type, Area, Level, Variety, Docbase, Dependency, Note,
Evaluation, Precondition and Postcondition.

• The Type column denotes the operator that is used for the criterion. In the Run A Single Test Case
page the following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.


Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are
shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

6.1.1.1.2. Specifying Sorting Criteria for Test Cases


The table that contains the list of test cases can be sorted. The criteria for the sorting are specified in
the right table in the Filter / Sort panel. It is possible to specify more than one sorting option. If there is
more than one sorting option, the uppermost sorting criterion has the highest priority and the lowest
row in the sorting criteria table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that is sorted for. The fields that can be
sorted for in the Run Single Test Case screen are ID, Name, Description, Traceability, Priority, State,
Execution, Team, Type, Area, Level, Variety, Docbase, Dependency, Note, Evaluation, Precondition
and Postcondition.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

6.1.1.2. Executing a Single Test Case

To execute a test case click on the Action icon of the test case ( Figure 6.1 ).

Figure 6.1. The Run Single Test Case Screen


Warning
Only test cases with the states Draft , Approved and Skip can be executed.

A detailed information screen about the test case will be shown ( Figure 6.2 ). Press the icon on the
right side of the screen for more details about the test case.

Figure 6.2. The Detailed Information about the Test Case Screen

Before a test run can be started, the test environment the test is run in and the SUT itself have to be
defined.

Creating Test Environments and Systems under Test


For information on how to create a test environment or a system under test, please refer
to Chapter 5, Define .

Executing prior revisions of a test


To execute an older revision, choose the revision from the Choose revision drop down box
and press the Switch button.

Press the Execute button to run the test case. Klaros-Testmanagement will show the following screen
( Figure 6.3 ).


Figure 6.3. The Overview Screen

The screen shows the test case overview and the attachments of the test case. The attachments can be
downloaded by clicking on the icon.

To begin the manual test run the Start button must be pressed.

Figure 6.4. The Step By Step Instructions Screen

The screen shows the Precondition , Action and Expected values of the test case. Depending on the
observed behavior the user can press one of the following buttons:


• Passed

The test has finished successfully

• Error

An error occurred during test execution

• Failure

A failure of the system under test was detected during test execution

• Skip

The current step can be skipped without changing the result of the test

• Back

Go back to the last step to repeat it

What is the difference between a failure and an error?


A failure is a discrepancy between a computed, observed, or measured value or condition
and the true, specified, or theoretically correct value or condition.

An error is the inability of the system to perform the test correctly.

The same process has to be applied for each step of the test case.
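To illustrate how per-step verdicts might combine into an overall test case result, here is a sketch assuming a worst-result-wins rule with Error ranked above Failure; both the ranking and the function are illustrative assumptions, not documented Klaros behavior:

```python
# Assumed severity ranking (illustrative only): Error outranks Failure.
SEVERITY = {"Passed": 0, "Failure": 1, "Error": 2}

def overall_result(step_verdicts):
    """Combine step verdicts into one result; 'Skip' leaves the result
    unchanged, matching the Skip button description above."""
    effective = [v for v in step_verdicts if v != "Skip"]
    if not effective:
        return "Passed"
    # Worst-result-wins: the most severe verdict determines the outcome.
    return max(effective, key=SEVERITY.__getitem__)

print(overall_result(["Passed", "Skip", "Failure", "Passed"]))  # Failure
```

Under this assumption a single failing step marks the whole test case as failed, while skipped steps have no influence on the outcome.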

For each error or failure, or at the end of the test execution, Klaros-Testmanagement will show the following
screen ( Figure 6.5 ).

Figure 6.5. The Error or Failure Detected Screen


Tip
The description and summary are common to all the steps.

Press the Next Step button to continue the test even if an error or failure occurred, or press the End Test
button to finish the test execution.

By clicking on the Finish button, Klaros-Testmanagement will show the test run result of the test case
( Figure 6.6 ).

Figure 6.6. The Test Case Results Screen

Note
If there are any issue management systems configured for this project, the Create Issue button
will be enabled. For details about creating an issue with Klaros-Testmanagement see Sec-
tion 6.3, “ Creating an issue ” .

6.1.2.  Run Test Suite


Executing a test suite means running a defined set of tests on a specific version of the SUT in a selected
test environment.

The Run a Test Suite page shows all test suites of a project in a list. To get a better overview of the
available test suites the sorting and searching functionality of the Filter / Sort panel can be used.

6.1.2.1. Searching and Sorting Test Suites


It is possible to search for test suites and sort the results of the search with the filtering and sorting
options. The filtering and sorting options become visible by opening the Filter / Sort panel. The Filter /
Sort panel contains two tables. The left table contains the filtering parameters, the right table contains
the sorting parameters.


Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all test suites.

6.1.2.1.1. Specifying Search Criteria for Test Suites


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all test suites is filtered for the conjunction of all criteria listed in the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the run test suite page there are two fields that can be filtered: ID and Description.

• The Type column denotes the operator that is used for the criterion. In the run test suite page the
following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are
shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

6.1.2.1.2. Specifying Sorting Criteria for Test Suites


The table that contains the list of test suites can be sorted. The criteria for the sorting are specified in
the right table in the Filter / Sort panel. It is possible to specify more than one sorting option. If there is
more than one sorting option, the uppermost sorting criterion has the highest priority and the lowest
row in the sorting criteria table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.

The sorting options table contains three columns:


• The Order By column denotes the field in the results table that is sorted for. The fields that can be
sorted for in the run test suite screen are ID and Description.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

6.1.2.2. Executing a Test Suite

To execute a test suite click on the Action icon of the test suite in Figure 6.7 .

Figure 6.7. The Run Test Suite Screen

Warning
Test cases with the state Locked or without any steps will be skipped when executing the
test suite.

A detailed information screen about the test suite will be shown ( Figure 6.8 ). Press the icon on
the right side of the screen for more details about the test suite.

Figure 6.8. The Detailed Information about the Test Suite Screen


Before a test run is started, a test environment and a SUT must be chosen.

During the execution of the test suite, the same actions described in Section 6.1.1,
“ Run Single Test Case ” are applied to each test case of the test suite.

Tip
To execute an older revision, choose the revision from the Choose revision drop down box
and press the Switch button.

By clicking on the Finish button, Klaros-Testmanagement will show the test result of each test case that
has been executed in the test suite ( Figure 6.9 ).

Figure 6.9. The Test Suite Results Screen

Note
If there are any issue management systems configured for this project, the Create Issue
button will be enabled. For details about creating an issue with Klaros-Testmanagement see
Section 6.3, “ Creating an issue ” .

6.2. Continue Run

6.2.1.  Continue Test Suite


A test suite that was interrupted in its execution can be continued later. The test suite execution
resumes with the first test case that was not fully executed.

The Continue a Test Run page lists all test suite runs of a project that have not been finished. To get a
better overview of the available interrupted test suite runs, the sorting and searching functionality
of the Filter / Sort panel can be used.


6.2.1.1. Searching and Sorting discontinued Test Suites in the Continue Test Run page
It is possible to search for discontinued runs of test suites and sort the results of the search with the
filtering and sorting options. The filtering and sorting options become visible by opening the Filter / Sort
panel. The Filter / Sort panel contains two tables. The left table contains the filtering parameters, the
right table contains the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all discontinued runs of test suites.

6.2.1.1.1. Specifying Search Criteria for Suspended Test Suite Executions


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all discontinued runs of test suites is filtered for the conjunction of all criteria listed
in the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the Continue Test Run page, six fields can be filtered: ID, Description, Last Executor, Operat-
ing System, Version and Timestamp.

• The Type column denotes the operator that is used for the criterion. In the Continue Test Run page the
following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that partially match the value of the criterion are shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.
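The filter semantics described above, where all criteria are combined as a conjunction using the listed operators, can be sketched as follows. This is illustrative Python, not part of Klaros itself; the field names and data are invented:

```python
from operator import eq, ne, ge, le, gt, lt

# Operators as offered in the Type column; "Like" is a partial match.
OPERATORS = {
    "Equals": eq,
    "Not equals": ne,
    "Greater than or equals": ge,
    "Less than or equals": le,
    "Greater than": gt,
    "Less than": lt,
    "Like": lambda field_value, value: str(value) in str(field_value),
}

def apply_filters(rows, criteria):
    """Keep only the rows that match ALL criteria (conjunction)."""
    def matches(row):
        return all(OPERATORS[op](row[field], value)
                   for field, op, value in criteria)
    return [row for row in rows if matches(row)]

runs = [
    {"ID": 1, "Last Executor": "alice", "Version": "1.2"},
    {"ID": 2, "Last Executor": "bob", "Version": "1.2"},
    {"ID": 3, "Last Executor": "alice", "Version": "1.3"},
]

# Last Executor equals "alice" AND Version like "1.2"
matching = apply_filters(runs, [("Last Executor", "Equals", "alice"),
                                ("Version", "Like", "1.2")])
print([run["ID"] for run in matching])  # [1]
```

Adding a row to the filter table corresponds to appending one more tuple to the criteria list; an empty list leaves the result set unfiltered.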

6.2.1.1.2. Specifying Sorting Criteria for Suspended Test Suite Executions


The table that contains the list of discontinued test runs can be sorted. The sorting criteria are
specified in the right table of the Filter / Sort panel. It is possible to specify more than one sorting option.
If several sorting options are given, the uppermost criterion has the highest priority and the lowest
row in the sorting criterion table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has a Delete button ( ) in the Action
column. Pressing this button removes the sorting criterion in that row.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that is sorted by. The fields that can be
sorted by in the Continue Test Run screen are ID, Description, Last Executor, Operating System, Ver-
sion and Timestamp.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.
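The priority rule for multiple sorting criteria corresponds to a stable multi-key sort: stable-sorting from the lowest-priority criterion upwards leaves the highest-priority ordering on top. A minimal sketch (illustrative Python; field names and data are invented, not Klaros code):

```python
def multi_sort(rows, criteria):
    """Sort rows by a list of (field, descending) criteria.

    The first criterion has the highest priority. A stable sort applied
    from the lowest-priority criterion upwards yields this order.
    """
    ordered = list(rows)
    for field, descending in reversed(criteria):
        ordered.sort(key=lambda row: row[field], reverse=descending)
    return ordered

runs = [
    {"ID": 2, "Version": "1.2"},
    {"ID": 1, "Version": "1.3"},
    {"ID": 3, "Version": "1.2"},
]

# Version ascending first, then ID descending within equal versions
ordered = multi_sort(runs, [("Version", False), ("ID", True)])
print([(run["Version"], run["ID"]) for run in ordered])
# [('1.2', 3), ('1.2', 2), ('1.3', 1)]
```

This relies on the sort being stable, so rows that compare equal on a higher-priority field keep the order established by the lower-priority fields.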

6.2.1.2. Continuing a suspended test suite execution


Sometimes a test suite execution has to be suspended due to working hours or outside conditions which
prevent further test executions. A tester may stop executing a test suite at any time without losing the
previously logged test results. Only the results of the currently executed step are lost in this case.

To continue the execution of a stopped/suspended test suite click on the Action icon of the test
suite in Figure 6.10 . To finish a suspended test suite click on the icon of the test suite.

Figure 6.10. The Continue Test Suite Screen

For each test case of the test suite the same actions as described in Section 6.1.1, “ Run Single Test
Case ” can be applied to execute the test suite. The new test results are merged automatically with the
suspended test suite execution results.

6.3.  Creating an issue


After the execution of a test case or test suite has been finished, an issue in an issue management system
can be created. For this, at least one issue management system has to be configured and defined for the
selected project. How an issue management system is configured is described in Section 8.5.4, “ Issue
Management ” ; how an issue management system is defined for a project is explained in Section 5.1.2.7,
“Editing the Issue Management Properties of a Project” .

The execution of a test case is described in Section 6.1.1, “ Run Single Test Case ” and the execution
of a test suite is described in Section 6.1.2, “ Run Test Suite ” .


To create a new issue, click on the Create Issue button on the results page that is displayed after the
execution of a test case or test suite.

By default the system uses the credentials entered at the login screen to authenticate the user against
the issue management system. If the credentials entered at login do not suffice and the issue man-
agement system has not yet been used during the session, Klaros-Testmanagement will request new
authentication information in a dialog window. Otherwise, the user is taken directly to the issue cre-
ation page.

The content of the issue management page is mostly dependent on the used issue management system
and its capabilities, but all pages for the different issue management systems have some elements in
common:

Issue Management System When more than one issue management system is con-
figured for the project, the issue management system in which the
new issue is created can be selected in the issue management sys-
tem drop-down field.

Save The Save button creates or updates an issue in the selected issue
management system.

Cancel The Cancel button discards unsaved changes and goes back to the Execute
a Single Test Case or the Execute a Test Suite page.

New After an issue has been saved, an additional New button is provided
to allow creating more than one issue.

For the JIRA issue tracker, the following additional fields are available:

Summary A short summary of the issue.

Type The type of the issue, e.g. a bug or an improvement.

Priority The priority of the issue.

Components The components of the system that are affected, multiple selections are al-
lowed here.

Affected Versions The version of the system in which the issue appeared, multiple selections
are allowed here.

Fixed Versions The version of the system in which the issue has been fixed. Multiple selec-
tions are allowed here.

Due Date The date when the issue should be fixed.

Description A longer description of the issue.

Test Environment A description of the test environment in which the issue occurred.


Figure 6.11. Jira Issue Page

For the Trac issue tracker the following fields are available:

Summary A short summary of the issue.

Type The type of the issue, e.g. a defect or an enhancement.

Priority The priority of the issue.

Components The component of the system that is affected.

Version The version of the system that is affected.

Milestone The milestone of the system that is affected.

Description A detailed description of the issue.

Keywords Keywords for the issue.


Figure 6.12. Trac Issue Page

For the Redmine issue tracker the following fields are available:

Summary A short summary of the issue.

Issue Type The type of the issue, e.g. a defect or an enhancement.

Priority The priority of the issue.

Assignee The assignee of the issue.

Version The version of the system that is affected.

Category The category of the system that is affected.

Milestone The milestone of the system that is affected.

Description A detailed description of the issue.

Estimated time [hours] The estimated time in hours for the issue.


Figure 6.13. Redmine Issue Page

For the Bugzilla issue tracker the following fields are available:

Summary A short summary of the issue.

Platform The platform of the issue.

Importance The priority and type of the issue.

Components The components of the issue.

Version The version of the system that is affected.

Milestone The milestone of the system that is affected.

Description A detailed description of the issue.

Figure 6.14. Bugzilla Issue Page


Tip
To change the state of the related test case, just activate the Update Test Case State? checkbox
and choose the new state from the drop-down box.

For more detailed information about each field, please consult the documentation of the respective
issue management system.

The created issues can be inspected in the Issues section of the Evaluate area; see Section 7.3, “Issues”
for more details.

Chapter 7. Evaluate

This chapter describes how Klaros-Testmanagement can be used to gather information about the test-
ing project, and evaluate the data about it.

7.1.  Reports
The reports section provides a dashboard, a page for inspecting single test runs and a page for gener-
ating a test run history report.

7.1.1.  Dashboard
The dashboard shows basic statistics about the active project and provides the links to the overview
reports. An example of a dashboard is shown in Figure 7.1 .

Figure 7.1. The Dashboard Screen

7.1.1.1. The Overview Reports


The upper part of the dashboard shows a table with the available overview reports. The following
three types of reports are available:


• Test Environment Overview

The Test Environment Overview report shows the test runs that have been executed in each test envi-
ronment, including the execution date and the number of success, failure and error results. An example of
the report is shown in Figure 7.2 .

Figure 7.2. The Test Environment Overview Report Layout

• SUT Overview

The SUT Overview report shows the test runs that have been executed for a SUT, including the exe-
cution date and the number of success, failure and error results. An example of the report is shown in
Figure 7.3 .


Figure 7.3. The SUT Overview Report Layout

• Test Suite Overview

The Test Suite Overview report shows the test suites with their test cases, descriptions and
the number of times each was executed. An example report is shown in Figure 7.4 .


Figure 7.4. The Test Suite Overview Report Layout

The reports can be generated in five different file formats:

• PDF

• HTML

• CSV

• RTF

• XML

By clicking on an icon of a file type, the report will be generated in the selected format.

7.1.1.2. The Dashboard Reports


The dashboard is a container for multiple reports. Each report has the same structure: it consists of a title
bar and an area in which the report is displayed. The title bar contains the report's name
and an edit button ( ). When the edit icon is clicked, the parameters of the report can be configured
in the following dialog. The changes in the configuration can be applied to the report by pressing the
Apply button or discarded by pressing the Cancel button. Additionally, the name of the report can be
changed by clicking on the name in the title bar of the report.

The configured dashboard can be persisted by clicking the Save button in the upper right corner of
the dashboard. If the Cancel button is pressed, the changes in the configuration of the dashboard are
discarded.


All reports show the data of the currently selected project.

The Klaros-Testmanagement Community Edition comes with three predefined reports:

• The Project Overview Report

• The Latest Success Rate Report

• The Test Activity Report

The reports are described in the following sections.

The Klaros-Testmanagement Enterprise Edition contains three additional reports:

• The Project Health Report

• The Test Progress History Report

• The Test Progress Report

The reports are described in the following sections.

7.1.1.2.1. The Project Overview Report


The Project Overview Report shows the main properties of a project listed in a table. The report shows:

• The number of defined systems under test

• The number of defined test environments

• The number of test cases

• The number of test suites

• The number of test case results

• The number of test suite results

• The average number of test case steps

• The average test case execution time

The only configurable property of the report is its display name.

Figure 7.5. The Project Overview Report

7.1.1.2.2. The Latest Success Rate Report


The Latest Success Rate report shows the ratio of the newest test results for a selected combination
of system under test and test environment.
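Such a success rate is simply the share of passed results among the results of the newest run. A minimal sketch (illustrative Python with invented data, not Klaros code):

```python
def success_rate(results):
    """Fraction of results with status 'passed'; 0.0 for an empty list."""
    if not results:
        return 0.0
    passed = sum(1 for status in results if status == "passed")
    return passed / len(results)

# Invented result list of a newest test run, for illustration only.
latest = ["passed", "passed", "failed", "error", "passed"]
print(f"{success_rate(latest):.0%}")  # 60%
```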

The configurable properties of the report are:

• The display name of the report.

• A boolean value indicating that this report always uses the active project (Klaros-Testmanagement
Enterprise Edition only).

• Alternatively, the fixed project that this report will display.

• The system under test.

• The test environment.


Figure 7.6. The Latest Success Rate Report

7.1.1.2.3. The Test Activity Report


The Test Activity report shows the count and the results of the test runs for a selected combination of system
under test and test environment in a selected period of time as a histogram.

The configurable properties of the report are:

• The name of the report.

• A boolean value indicating that this report always uses the active project (Klaros-Testmanagement
Enterprise Edition only).

• Alternatively, the fixed project that this report will display.

• The system under test.

• The test environment.

• The time period in days.

Figure 7.7. The Test Activity Report

7.1.1.2.4. The Project Health Matrix Report


The Project Health Matrix Report shows the project health as weather icons for each defined test envi-
ronment and system under test in a matrix. The icons are calculated from the execution and success
rate in each category.

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

The configurable properties of the report are:

• The name of the report.

• A boolean value indicating that this report always uses the active project.

• Alternatively, the fixed project that this report will display.

• The execution rates assigned to the different health categories.

• The success rates assigned to the different health categories.
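A rate-to-category mapping of this kind can be sketched as follows; the thresholds and category names below are invented for illustration and are not Klaros defaults:

```python
def health_category(rate, lower_bounds):
    """Map a rate in [0, 1] to a category index.

    `lower_bounds` lists the lower bound of each category, best
    category first; the first bound the rate reaches wins.
    """
    for index, bound in enumerate(lower_bounds):
        if rate >= bound:
            return index
    return len(lower_bounds) - 1

# Invented category names and thresholds, purely for illustration.
ICONS = ["sunny", "cloudy", "rainy", "stormy"]
THRESHOLDS = [0.9, 0.7, 0.5, 0.0]

print(ICONS[health_category(0.75, THRESHOLDS)])  # cloudy
```

In the actual report, the execution rate and the success rate would each be mapped this way using the configured bounds, and the resulting category selects the weather icon shown in the matrix cell.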

Figure 7.8. The Project Health Matrix Report


7.1.1.2.5. The Test Progress Report


The Test Progress Report shows the rate of executed vs. defined tests of a project for a given test envi-
ronment and a system under test.

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

The configurable properties of the report are:

• The name of the report.

• A boolean value indicating that this report always uses the active project.

• Alternatively, the fixed project that this report will display.

• The system under test.

• The test environment.

Figure 7.9. The Test Progress Report

7.1.1.2.6. The Test Progress History Report


The Test Progress History Report shows the rate of executed vs defined tests of a project for a given test
environment, system under test and time period.

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

The configurable properties of the report are:

• The name of the report.

• A boolean value indicating that this report always uses the active project.


• Alternatively, the fixed project that this report will display.

• The system under test.

• The test environment.

• The time period of the report

Figure 7.10. The Test Progress History Report

7.1.2.  Report Templates

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

The Report Templates screen allows generating the reports as PDF. The icon starts the generation
and opens (if necessary) a new window to enter the needed parameters. How to create user-defined
reports is explained in Section 8.2, “ Report Templates ” .

Figure 7.11. User Defined Reports Screens

If the user-defined report is parameterized, all needed parameters must be entered to start the gener-
ation of the report.


Figure 7.12. Generate a parameterized Report

Note
The Generate button becomes visible after entering all needed parameters.

7.1.3.  Single Test Run Report


The Single Test Run Report screen allows generating a report containing the result of the test run se-
lected by clicking on the Action icon ( Figure 7.13 ).

Figure 7.13. The Single Test Run Report Screen

The Single Test Run Report page lists all test runs that have been made in the active project. To get a
better overview of the test runs, the sorting and searching functionality of the Filter / Sort panel can
be used.


7.1.3.1. Searching and Sorting Reports of Single Test Runs


It is possible to search for reports of single test runs and sort the results of the search with the filtering
and sorting options. The filtering and sorting options become visible by opening the Filter / Sort panel.
The Filter / Sort panel contains two tables. The left table contains the filtering parameters, the right table
contains the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all reports of single test runs.

7.1.3.1.1. Specifying Search Criteria for Reports of Single Test Runs in the Single Test
Run Reports page
Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all Reports of Single Test Runs is filtered for the conjunction of all criteria listed in
the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the Single Test Run Reports page, four fields can be filtered: ID, Test Environment, System
Under Test and Timestamp.

• The Type column denotes the operator that is used for the criterion. In the Single Test Run Reports
page the following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that partially match the value of the criterion are shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

7.1.3.1.2. Specifying Sorting Criteria for Reports of Single Test Runs


The table that contains the list of test results can be sorted. The sorting criteria are specified in
the right table of the Filter / Sort panel. It is possible to specify more than one sorting option. If several
sorting options are given, the uppermost criterion has the highest priority and the lowest row in
the sorting criterion table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has a Delete button ( ) in the Action
column. Pressing this button removes the sorting criterion in that row.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that is sorted by. The fields that can be
sorted by in the Single Test Run Reports screen are ID, Test Environment and System under Test.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

7.1.3.2. The Test Run Report

The report is generated in PDF format and its layout is shown in Figure 7.14 . The report details the
creation date of the test run and its result, specifying the number of test cases executed with success, failure
and error. Additionally, a pie chart summarizing the result is shown.


Figure 7.14. The Single Test Run Report Layout

7.1.4.  Test Run History Report


The Test Run History screen allows generating a report containing a summary with two result history
graphs and the test run details of each project, with the run ID, date of test execution, total number of test
cases and number of success, failure and error results ( Figure 7.15 ).


Figure 7.15. The Test Run History Screen

Klaros-Testmanagement can export this test run report in five different formats:

• PDF

• HTML

• CSV

• RTF

• XML

The format can be selected in the Choose File Type field. The start date and end date can be selected using
the calendar component in Figure 7.15 .

The test run history can be generated by clicking the Generate button.

The report is generated and its layout is shown in Figure 7.16 . The report details the date of creation
of each test run and its results, specifying the number of test cases executed with success, failure and
error. Two line charts visualize these results.


Figure 7.16. The Test Run History Report Layout

7.2.  Test Results


Here it is possible to visualize the results of test runs. Klaros-Testmanagement defines two types of re-
sults: Test Case Results and Test Suite Results.

7.2.1.  Test Case Results


In the Test Case Result screen it is possible to visualize the results of the execution of a particular test
case ( Figure 7.17 ).


Figure 7.17. The Test Case Results Screen

The screen displays an overview of the test cases for the project. To narrow the search, the filtering and
sorting functionality of the page can be used.

7.2.1.1. Searching and Sorting Test Cases


It is possible to search for test cases in the test case result page and sort the results of the search with
the filtering and sorting options. The filtering and sorting options become visible by opening the Filter /
Sort panel. The Filter / Sort panel contains two tables. The left table contains the filtering parameters, the
right table contains the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all test cases in the project.

7.2.1.1.1. Specifying Search Criteria for test cases in the test case report page
Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all test cases is filtered for the conjunction of all criteria listed in the filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the test case results page, eighteen fields can be filtered: ID, Name, Description, Traceability,
Priority, State, Execution, Team, Type, Area, Level, Variety, Docbase, Dependency, Note, Evaluation,
Precondition and Postcondition.

• The Type column denotes the operator that is used for the criterion. In the Test Case Results page the
following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.


Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that partially match the value of the criterion are shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

7.2.1.1.2. Specifying Sorting Criteria for Test Cases

The table that contains the list of test results can be sorted. The sorting criteria are specified in
the right table of the Filter / Sort panel. It is possible to specify more than one sorting option. If several
sorting options are given, the uppermost criterion has the highest priority and the lowest row in
the sorting criterion table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has a Delete button ( ) in the Action
column. Pressing this button removes the sorting criterion in that row.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that is sorted by. The fields that can be
sorted by in the Test Case Results screen are ID, Name, Description, Traceability, Priority, State, Exe-
cution, Team, Type, Area, Level, Variety, Docbase, Dependency, Note, Evaluation, Precondition and
Postcondition.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

7.2.1.2. Inspecting the Details of Test Case Results

By clicking on the Action icon, Klaros-Testmanagement offers the user the possibility to view more
details of the test case and its results ( Figure 7.18 ).


Figure 7.18. The Test Case Results Screen

The test case details panel in the page shows all details of the selected test case.

The table shows all results of each individual run of the test case. The result of a test case run is
illustrated by the following icons:

• the test finished successfully

• a failure of the system under test has been detected

• an error occurred during the test execution

• the test result could not be determined

To provide a better overview of the table with the test runs, Klaros-Testmanagement allows filtering and
sorting the content of the test case result table.

7.2.1.2.1. Searching and Sorting the Test Case Results Table


It is possible to search in the table with the test case results and sort the results of the search with the
filtering and sorting options. The filtering and sorting options become visible by opening the Filter / Sort
panel. The Filter / Sort panel contains two tables. The left table contains the filtering parameters, the
right table contains the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all results of a test case.

Specifying Search Criteria for Results of a Test Case

Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all results of the test case is filtered for the conjunction of all criteria listed in
the filtering table.


By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the test case results page, four fields can be filtered: ID, Operating System, Version and
Timestamp.

• The Type column denotes the operator that is used for the criterion. In the test case result details page
the following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that partially match the value of the criterion are shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

Specifying Sorting Criteria for Results of a Test Case

The table that contains the list of test case results can be sorted. The sorting criteria are specified in
the right table of the Filter / Sort panel. It is possible to specify more than one sorting option. If several
sorting options are given, the uppermost criterion has the highest priority and the lowest row in
the sorting criterion table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has a Delete button ( ) in the Action
column. Pressing this button removes the sorting criterion in that row.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that is sorted by. The fields that can be
sorted by in the Test Case Results Details screen are ID, Test Environment, System under Test and
Timestamp.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

7.2.1.2.2. Navigating back to the Overview Page

By clicking on the Back button the screen changes to Figure 7.17 and the user is able to choose another
test case.


7.2.2.  Test Suite Results


In the Test Suite Results screen it is possible to visualize the results of the execution of a test plan and
its test cases ( Figure 7.19 ).

Figure 7.19. The Test Suite Results Screen

The screen gives an overview of the test suites of the project. To get a better overview, the filtering and
sorting functionality of the page can be used.

7.2.2.1. Searching and Sorting Test Suites in the Test Suite Results Page

It is possible to search in the test suites and sort the results of the search with the filtering and sorting
options. The filtering and sorting options become visible by opening the Filter / Sort panel. The Filter /
Sort panel contains two tables. The left table contains the filtering parameters, the right table contains
the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all test suites that have results.

7.2.2.1.1. Specifying Search Criteria for Test Suites in the Test Suite Results Page

Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all test suite results is filtered for the conjunction of all criteria listed in the
filtering table.

By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the main test suite result page are two fields that can be filtered: ID and Description.

• The Type column denotes the operator that is used for the criterion. In the test case result details page
the following operators are available:


Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

7.2.2.1.2. Specifying Sorting Criteria for Test Suites

The table that contains the list of test suites with results can be sorted. The criteria for the sorting are
specified in the right table in the Filter / Sort panel. It is possible to specify more than one sorting option.
If more than one sorting option is specified, the uppermost sorting criterion has the highest priority and
the lowest row in the sorting criterion table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that is sorted for. The fields that can be
sorted for in the Test Suite Results screen are ID and Description.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

7.2.2.2. Inspecting Test Suite Result Details

By clicking on the Action icon, Klaros-Testmanagement shows the test suite result together with the
timestamp, test environment and SUT on which the test suite was executed ( Figure 7.20 ).


Figure 7.20. The Test Suite Results Screen

The detailed test plan results can be searched and sorted for ID, Operating System and Version.

By clicking on the Action icon again, Klaros-Testmanagement offers the user the possibility to visualize
the result of each test case of the selected test suite ( Figure 7.21 ).

Figure 7.21. The Test Suite Results Screen - Test Results

The result of the test suites and test cases is illustrated by the following icons:

• the test finished successfully

• a failure of the system under test has been detected

• an error occurred during the test execution

• the test result could not be determined

The detailed test case results can be searched and sorted for ID, Operating System and Version.

7.3. Issues
The issues section allows the user to inspect the issues that are attached to test case results.


7.3.1. Issues by Test Case


The Issues by Test Case page lists all test cases that have results with an attached issue in a table. The
table has five columns that show ID, name, description and the traceability properties of a test case. By
clicking the button the attached issues of the selected test case can be browsed. The overview table
is shown in Figure 7.22, “The Test Case Selection Screen” .

Figure 7.22. The Test Case Selection Screen

When the button is pressed, the issues that are attached to a test case are listed in a table. The
table has seven columns that show the id of the issue in the issue management system, the name of the
issue management system, the summary of the issue, the date when the issue was created, the creator,
the priority and the status of an issue. The table with the issues for a test case is shown in Figure 7.23,
“The Issues By Test Case Screen” .

Figure 7.23. The Issues By Test Case Screen

By clicking the link on the issue id, the user is directly redirected to the page of the issue in the issue
management system.


Clicking Back navigates back to the Issues by Testcase overview page.

Chapter 8. Configure

The Configure section enables the user to change the application settings, such as creating or editing
users, e-mail settings, JIRA settings, logging settings and backup and recovery of projects.

8.1.  System Information

The first page shown is the System Information page. The System Information page contains informa-
tion about the operating system, Java environment, memory usage and some other information about
the system ( Figure 8.1 ).

Figure 8.1. The System Information Screen

8.2.  Report Templates


Klaros-Testmanagement Enterprise Edition Feature
This feature is only available in the Klaros-Testmanagement Enterprise Edition.

On the Report Templates pages it is possible to create, edit, delete and generate user defined reports.

A click on the icon starts the generation of the related report in PDF format.


Figure 8.2. User Defined Reports Screens

If the user defined report is parameterized, all needed parameters must be entered to start the
generation of the report.

Figure 8.3. Generate a parameterized Report

Note
The Generate button becomes visible after entering all needed parameters.


To create a new report press the New button. A new report is created and the Report Details page is
shown. To edit an existing report press the icon. By pressing the icon the related report is
removed from the content repository.

8.2.1. Report Details
On the Report Details page it is possible to set the name and description of the report. There must also be a
valid script and report template in order to save the user defined report. It is possible to edit the script
and template in the text area or upload them directly from an external file. By pressing the Save button
Klaros-Testmanagement tries to save the report to the content repository.

Note
The report is only saved if the script and template are valid and free of errors.

Figure 8.4. Report Details

Further information about creating user defined reports is available in Section 9.1, “ Create
A New Report Template ” .

8.3.  User Administration


In the User Administration section it is possible to browse, create, edit and search users.

8.3.1.  Create User


By choosing this option the user is able to create a new user ( Figure 8.5 ).


Figure 8.5. The Create User Screen

For creating a new user the following attributes need to be specified:

• Full Name

The full name of the user

• User Name

The login name of the user

• E-Mail Address

The e-mail address of the user

• System account

If this flag is set, the user is not able to log in at the login page and interactively control the application.
Still, the account's credentials may be used for automated tasks like importing test results.

• Password

The password to log in Klaros-Testmanagement

• Retype Password

Confirmation of the password

• User Role

The role of the user, which can be Administrator , Manager or Tester .

Press the Save button to confirm the creation of the user or the Cancel button to discard.


Important
Only users with administrator or manager privileges are able to create new users.

8.3.2.  Search User


The Search User page gives the user an overview of the users registered in Klaros-Testmanagement
( Figure 8.6 ). Each user is listed in a row.

Figure 8.6. The Edit User Screen

The Filter / Sort panel allows the user to search for users and sort them by arbitrary criteria.

8.3.2.1. Searching and Sorting Users


For a better overview of the users it is possible to search for users and sort the results of the search with
the filtering and sorting options. The filtering and sorting options become visible by opening the Filter /
Sort panel. The Filter / Sort panel contains two tables. The left table contains the filtering parameters, the
right table contains the sorting parameters.

Below the two tables you find two buttons:

Apply The Apply button applies the filter and sorting criteria specified in
the two tables above and presents the filtered and sorted result in
the table below.

Reset The Reset button clears the tables with the filter and sorting crite-
ria and displays all users.

8.3.2.1.1. Specifying Search Criteria for available Users


Each line in the filter table represents a criterion that restricts the result set that is displayed in the table
below, i.e. the set of all users is filtered for the conjunction of all criteria listed in the filtering table.


By clicking the button, a new and empty row for a filter criterion is appended to the table. The table
with the filtering criteria has four columns:

• The Field column denotes the field in the table that the criterion is filtering for. In the Filter / Sort panel
of the maintain users page are five fields that can be filtered: Full Name , User Name , E-Mail Address
, Role and the SystemAccount predicate.

• The Type column denotes the operator that is used for the criterion. In the search users page the
following operators are available:

Equals Only the items that exactly match the value of the criterion are
shown.

Not equals The items that do not equal the value of the criterion are shown in
the result table.

Greater than or equals Items that are greater than or equal the value of the criterion are
shown.

Less than or equals Items that are less than or equal the value of the criterion are
shown.

Greater than Items that are greater than the value of the criterion are shown.

Less than Items that are less than the value of the criterion are shown.

Like Items that have a part that matches the value of the criterion are shown.

• The Value column defines the value for the filtering criterion.

• The Action column contains a button ( ) for deleting the criterion in the current line.

8.3.2.1.2. Specifying Sorting Criteria for Users

The table that contains the list of users can be sorted. The criteria for the sorting are specified in the right
table in the Filter / Sort panel. It is possible to specify more than one sorting option. If more than one
sorting option is specified, the uppermost sorting criterion has the highest priority and the lowest row in
the sorting criterion table has the lowest priority.

When the button is clicked, a new empty row for a sorting criterion is appended at the bottom of
the sorting options table. Each row in the sorting options table has in the Action column a Delete button
( ). By pressing this button the row with the sorting criterion can be removed.

The sorting options table contains three columns:

• The Order By column denotes the field in the results table that the sorting applies to. The fields that
can be sorted for are Full Name , User Name , E-Mail Address , Role and the SystemAccount predicate.

• The Type column defines the direction of the sorting sequence, i.e. descending or ascending.

• The Action column contains a button for deleting the sorting criterion.

8.3.2.2. Editing User Properties

By clicking the Edit icon the view will change and offers the possibility to change the user name, e-
mail address, user role and password of the user ( Figure 8.7 ).


Figure 8.7. The Edit User Screen

8.3.2.3. Deleting a User

By clicking on the Delete icon on the Search User Screen ( Figure 8.6 ) the user will be removed from
the Klaros-Testmanagement database. Use this function only if you are really sure you want to discard
that user.

8.4.  Authentication
Klaros-Testmanagement Enterprise Edition Feature
This feature is only available in the Klaros-Testmanagement Enterprise Edition.

The authentication section allows the administrator to configure external authentication systems to be
used with Klaros-Testmanagement.

By using an external authentication system the administrator is no longer required to manually create
user accounts and provide the users with default passwords. Instead an external system like an existing
LDAP dictionary can be used to authenticate users.

If external authentication is activated, a user account inside Klaros-Testmanagement is automatically
created when the user successfully authenticates against the external authentication system for the first
time. The default role for newly created users is the tester role.

Only a user with the administrator role is able to edit the system parameters below.

8.4.1.  LDAP
For accessing an LDAP server a rather large set of configuration parameters is required. Your local
system administrator should be able to provide you with the correct values.


Figure 8.8. The LDAP Configuration Screen

Parameters needed to contact the LDAP server:

• Server Address

The internet address under which the LDAP server resides (e.g. ldap.acme.com ).

• Server Port

The port on which the LDAP server is listening (typically 389 ).

• Bind DN

The distinguished name used for binding to this LDAP server.

• Bind Credentials

The credentials (password) required to be able to bind to this LDAP server.

Parameters needed to locate user accounts:

• User Context DN

The distinguished name under which user accounts are stored.

• User DN Prefix

The distinguished name prefix used to locate user accounts (e.g. uid= ).

• User DN Suffix

The distinguished name suffix used to locate user accounts (e.g. ,ou=Users,dc=acme,dc=com
). When locating user accounts the prefix, the account id and the suffix are concatenated to form the
distinguished name of the user account.
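As a sketch of how these values combine, the DN for an account id such as jdoe would be assembled as shown below (the class and method names here are purely illustrative, not part of Klaros; the example values match the ones above):

```java
public class UserDnExample {

    // Concatenate prefix, account id and suffix to form the distinguished
    // name of a user account, as described in the text above.
    static String buildUserDn(String prefix, String accountId, String suffix) {
        return prefix + accountId + suffix;
    }

    public static void main(String[] args) {
        String dn = buildUserDn("uid=", "jdoe", ",ou=Users,dc=acme,dc=com");
        System.out.println(dn); // prints uid=jdoe,ou=Users,dc=acme,dc=com
    }
}
```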


• User Object Classes

A comma separated list of the LDAP object classes of the user account entries (e.g. person,posix-
Account )

Parameters describing the attributes of a user account:

• User Name Attribute

The LDAP user name attribute which corresponds to the Klaros-Testmanagement account name (e.g.
uid ).

• User Password Attribute

The LDAP password attribute which corresponds to the Klaros-Testmanagement account password
(e.g. userPassword ).

• First Name Attribute

The LDAP user first name attribute which corresponds to the Klaros-Testmanagement account name
(e.g. givenName ).

• Last Name Attribute

The LDAP user last name attribute which will form up the Klaros-Testmanagement account name (e.g.
sn ).

• Enabled Attribute

The LDAP user attribute used for enabling/disabling this user for Klaros-Testmanagement.

When the Use as default checkmark is activated, the login screen will default to LDAP authentication
for all users. Yet it is still possible for existing users to authenticate against the Klaros user database if
selected in the login screen.

Press the link Test LDAP access to test if the fields are filled in properly. If successful the number of users
found in the LDAP directory is shown in the message area.

Press the Save button to submit your changes or the Cancel button to discard them.

8.5.  System Parameters


Only a user with the administrator role is able to edit the system parameters below.

8.5.1.  E-Mail
In the E-Mail configuration page it is possible to change the e-mail settings of the application. The
following image shows all the attributes that can be edited ( Figure 8.9 ).


Figure 8.9. The E-Mail Configuration Screen

Press the link Send Test Email to test if the fields are filled in properly.

Press the Save button to submit your changes or the Cancel button to discard them.

8.5.2.  Logging
In the Logging configuration page it is possible to change the console and database log level of the
application. The log level is the message level of severity that will be displayed or saved in the database
( Figure 8.10 ):

Figure 8.10. The Logging Configuration Screen


The following attributes are shown:

• Application console log level

The log level that the user wants to monitor.

• Audit database log level.

The log level that the database saves.

Both Console Log Level and DB Log Level have the same options:

• DEBUG

The DEBUG level designates informational events that are most useful to debug an application.

• INFO

The INFO level designates informational messages that highlight the progress of the application.

• WARNING

The WARNING level designates potentially harmful situations.

• ERROR

The ERROR level designates error events that might still allow the application to continue running.

• FATAL

The FATAL level designates very severe error events that will presumably lead the application to abort.

Press the Save button to submit the changes that have been made or the Cancel button to discard the
changes.

Tip
The log is shown at the bottom of the page. The user can expand the messages to view
all or hide all.

8.5.3.  General Settings


Here it is possible to change the general settings of the application. Figure 8.11 shows the Application
URL attribute, which must be set if Klaros-Testmanagement runs behind a proxy. The proxy is
responsible for redirecting the user request to a physical URL. More information about rewriting URLs can
be found here .


Figure 8.11. The General Settings Configuration Screen

Press the link Check Application URL to test if the URL was entered correctly.

Tip
Here it is also possible to overwrite the default session timeout of the underlying Tomcat.
A negative value will disable the session timeout.

The Message of the day text field allows configuring a message of the day. This message is displayed on
the login screen of Klaros-Testmanagement. As an alternative to a static message, a random message can
be displayed by checking the Use random quote of the day check box.

The Rows per table page combo box selects the default number of rows a data table will have. This value can
also be overridden on each table for each user individually and will be stored permanently for each user.

When activated, the Authentication required for import option will require authentication credentials
when test results are imported (e.g. by the Hudson plugin). See also Section 10.3.1, “Hudson Plugin for
Klaros-Testmanagement ” .

Press the Save button to submit the changes that have been made or the Cancel button to discard the
changes.

8.5.4.  Issue Management


An issue management system is a software package that manages and maintains lists of issues. Issues
are reports about defects in a software system.

Klaros-Testmanagement is capable of creating issue entries in an issue management system and assign-
ing them to failed test runs. It is possible to configure multiple issue management systems. At the mo-
ment Klaros-Testmanagement supports the issue management systems JIRA (a commercial issue
management system produced by Atlassian Pty Ltd, see http://www.atlassian.com/ for more
information) and Trac (an open source issue management system, see http://trac.edgewall.org/
for more information).

Note
To use the Trac issue management system with Klaros-Testmanagement Trac must have
installed the XML-RPC Plug-in and the users must have the access rights for the plug-in. For

99
Configure

more information about the XML-RPC Plug-in of Trac see http://trac-hacks.org/


wiki/XmlRpcPlugin .

The issue management systems are configured in the Issue Management section of the Configure menu.
The issue management page is shown in Figure 8.12, “The Issue Management Configuration Screen” .

Figure 8.12. The Issue Management Configuration Screen

The issue management systems section shows all the configured issue management systems in one
table. It is possible to change the properties of each entry by editing the fields of the table directly and
pressing the Save button afterwards.

The properties of the issue management system are

ID The internal id of the issue management system. This property is assigned automat-
ically and cannot be altered by the user. If deleting an issue management system is
prohibited, e.g. because test case results are already assigned to issues in the issue
tracker, the ID additionally has a lock icon attached.

System The system column indicates which issue management system is used. Currently
Klaros-Testmanagement supports JIRA, Trac, Redmine, and Bugzilla.

Project If the issue management system organizes issues in projects, it is possible to specify
the project where the new issues should be created in the issue management system.
Only JIRA, Redmine and Bugzilla manage their issues in different projects; for Trac the
URL is used to select different projects.

Description In the description field a description of the issue management system can be entered.

URL In the URL field the link to the issue management system is specified. To check if the
URL is valid, the Validate the URL button on the right of the URL field can be pressed.
If the URL to the issue management system is configured correctly, a confirmation
message will be displayed in the message area.

Action The action column contains a button that can be used to delete an issue management
system from the configuration. If the issue management system is unused, i.e. it is
not used in any project, the configuration of the issue management system can be
deleted.


8.5.4.1. Adding a new Issue Management System


To add a new issue management system press the New button. An empty row will be added to the
list of issue management systems. After the new row is filled with the correct values, pressing the Save
button will persist the settings for the issue management system. Pressing Cancel will discard the
changes.

8.5.4.2. Editing an existing Issue Management System


The configuration of an issue management system can be changed by editing the fields in the table.
Pressing Save persists the changes, pressing Cancel discards them.

8.5.4.3. Deleting an Issue Management System


If the issue management system is not used in any project, it can be deleted by pressing the delete
button ( ) of the entry in the table for configuration of the issue management systems.

8.6.  Backup/Recovery
To move data between different database installations or to selectively import data, Klaros-Testmanage-
ment provides the functionality to import and export database content via XML files. The format of these
files is explained in Appendix C, Dump File Specification .

8.6.1.  Backup Projects


Here it is possible to back up selected project data. The user can select one or more projects, or click
the Select all option to select all the projects that are available ( Figure 8.13 ).

Figure 8.13. The Backup Projects Screen

Press the Export button to export the selected projects to an XML formatted output file or the Cancel
button to discard the selection.


8.6.2.  Restore Projects


Here it is possible to restore project data that has previously been exported. The user can browse
for a backup file and upload it. The screen will show the projects contained in the uploaded file and the
user is able to select one or more projects, or click the Select all option to select all the projects to
restore ( Figure 8.14 ).

Figure 8.14. The Restore Projects Screen

Press the Import button to import the data of the projects selected or the Cancel button to discard the
restore.

Note
The Import action will not overwrite existing projects or other existing artifacts.

Chapter 9. Reports

Klaros-Testmanagement Enterprise Edition Feature


This feature is only available in the Klaros-Testmanagement Enterprise Edition.

With the Enterprise Edition of Klaros-Testmanagement it is possible to define custom Reports. Though
Klaros-Testmanagement already provides several Reports, it might be helpful to design new Reports
that suit your and your customer's needs. Figure  9.1, “The Report Generation Process” gives an
overview of the Report generation process. As the Report definition process is based on a Groovy script
and SeamPDF, basic knowledge of Java programming and XML is helpful for a Report Designer.

Figure 9.1. The Report Generation Process

The reporting process involves two roles, the Report Designer and the Report User. While the Report
Designer defines the look of the Report and prepares the Report data, the Report User applies the
Reports to the data collected in Klaros-Testmanagement. The Report Designer has to provide two scripts
to build a Report. One script prepares the data and is implemented in Groovy, while the second script
describes the layout of the Report and is called the report template. The Groovy script taking care of
data retrieval and preparation is provided by a Java class that implements the KlarosScript interface
(see Section D.2, “KlarosScript Interface”). This interface defines one method, execute, which takes a
de.verit.klaros.scripting.KlarosContext object as input parameter.

To retrieve the data from Klaros-Testmanagement the designer can access the
de.verit.klaros.core.model via HQL. The main task of the class is to provide and prepare
the data for the Report. The data for the report template must then be stored in the
de.verit.klaros.scripting.KlarosContext object passed to the execute method. To make Reports more flex-
ible for Report Users, it is possible to pass parameters to the Groovy script. The parameters are stored in
the context and can be accessed from the Groovy script and the report template.

Note
The de.verit.klaros.scripting.KlarosContext object already contains predefined objects. For
a list please refer to Section D.1, “Context Variables”.

The report template must be implemented using SeamPDF. Section  9.3, “ Example Report ”
provides an example on how to define a custom Report.

9.1.  Create A New Report Template


Before Reports can be applied to the test data they have to be defined. To get to the Report Templates
page click on the Configure icon and select Report Templates from the menu on the left side
of the screen.

Figure 9.2. The Report Templates Page

To create a new Report click on the New button on the Report Templates page. Then provide the basic
data for the Report by entering a name in the Name field and a short description in the Description field.


Figure 9.3. The New Report Templates Page

To enter the Groovy script which retrieves the data, use the Groovy Script text area. To unfold the
Template text area click on the Edit the Template link next to the Template label. The Report's Template
code can then be entered into the unfolding text area. To unfold the Groovy script text area again click on
the Edit the script link.

Note
It might be helpful to use a Java IDE, e.g. Eclipse, to develop the Groovy script and an XML
editor to provide the report template. The created files can then be uploaded into Klaros-
Testmanagement. To avoid errors in the Groovy script just add the Klaros model libraries
to the build path of your Eclipse project.

Instead of manually entering the code for each text area, the code can be imported from a file. Specify
the file to use by clicking on the Browse button and select the file from the file system. Click OK in the
file dialog and afterwards click Upload to import the selected file into the text area.

This page provides three actions to be executed. To test the code click on the Preview button to gen-
erate a Report. The Save button stores the Report in the Klaros repository in the file system. The changes
can be discarded by pressing the Back button.

Note
The generated Report is opened automatically in a new browser window. If this does not
work for you, please check if you have a pop-up blocker active.

To provide a certain degree of freedom to the Report User and to make the Reports more flexible, it is
possible to pass parameters to the Groovy script. This mechanism can for example be used to pass a
timespan to the Groovy script, so that only data for this timespan is retrieved from Klaros-Testmanage-
ment.

To pass arguments to the Report select the The script has parameters checkbox.


Figure 9.4. Adding Parameters to the Script

A new section as shown in Figure 9.4, “Adding Parameters to the Script” is displayed, which enables the
addition of parameters. By clicking on the button a parameter can be added to the parameter list.

Figure 9.5. Specifying Parameters

By clicking the icon the parameter is removed from the list. The type of the parameters can be
specified by the combo box. Supported types are Text, Number, and Date.

The passed parameters can be accessed by the Groovy script by either calling the getParameterValue(String name)
or the getParameter(String name) method. The methods will
return null if no parameter with the specified name can be found.

The parameters can be accessed from the report template by their name, e.g.

  <p:text value="#{reportname}"/>
 


9.2.  Applying a Report Template


To apply a Report to your test data click on the Evaluate icon and select Report Templates from
the menu on the left side. A list of available Report Templates will be provided for selection.

Figure 9.6. Apply a Report Template

By clicking on the icon of a Report Template, a PDF Report is rendered. If the Groovy script was
defined with parameters, a pop-up window will prompt the user to enter the defined parameters before
the Report is generated.

Figure 9.7. Enter Parameter


9.3.  Example Report


This section provides an example Report, which shows how to retrieve test case results and how to
display them depending on their status.

9.3.1.  Creating the Groovy Script


The following code snippet shows the frame for a Groovy script with the required imports. The code
to retrieve the data must be implemented in the execute method. A more detailed description of the
Klaros-Testmanagement API can be found in de.verit.klaros.core.model

  import de.verit.klaros.scripting.*;
  import de.verit.klaros.core.model.*;
  import java.util.*;

  public class TestScript implements KlarosScript {

    public void execute(KlarosContext context) {
      ...
    }
  }
    

The next step in the data retrieval process is to actually get the required data. The following code snippet
shows how to build a query string and how to get the data.

  StringBuffer query = new StringBuffer();
  query.append("select tcr from KlarosTestCaseResult tcr");
  List<?> tcr = context.executeQuery(query.toString());
    

The data is returned in a List object that must be stored in the context so that it can later be accessed
from the report template. The code snippet below shows how to store the list in the context. For more
information on building queries please consult the HQL documentation.

  context.add("results", tcr);
    

The List object is stored in the context with the name "results" and can be accessed from the report
template by this name. If more data is required, execute more queries to retrieve the data or process the
already retrieved data and store the processed data in the context with a different name.

Note
It is possible to store more than one object in the context. Just use a different name for
each object.
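Assembled from the snippets above, the complete example script reads:

```groovy
import de.verit.klaros.scripting.*;
import de.verit.klaros.core.model.*;
import java.util.*;

public class TestScript implements KlarosScript {

    public void execute(KlarosContext context) {
        // Retrieve all test case results via an HQL query...
        StringBuffer query = new StringBuffer();
        query.append("select tcr from KlarosTestCaseResult tcr");
        List<?> tcr = context.executeQuery(query.toString());

        // ...and store them in the context for the report template
        context.add("results", tcr);
    }
}
```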

9.3.2. Creating the Report Template


The code snippets presented in this section show how to build a report template for Klaros-Testmanagement.
For detailed information please refer to the SeamPDF manual. More information on the Klaros-Testmanagement
object model can be found in de.verit.klaros.core.model.

The following snippet shows how to build the frame for the report template. Inside this frame all other
Report details can be described and grouped in chapters and sections.

  <p:document xmlns:ui="http://java.sun.com/jsf/facelets" xmlns:f="http://java.sun.com/jsf/core"
   xmlns:p="http://jboss.com/products/seam/pdf" title="Klaros-Testmanagement Test Plan Report"
   marginMirroring="true" author="#{user.name}" creator="#{user.name}" pageSize="A4">
  ...
  </p:document>
          
          

Note the usage of the de.verit.klaros.core.model.KlarosUser parameter from the context, #{user.name}.


The next code snippet shows how to define headers and footers for all pages in the Report. This snippet
makes use of the date and the de.verit.klaros.core.model.KlarosUser object in the context.

  <f:facet name="header">
    <p:font size="8">
      <p:header borderWidthBottom="0.1" borderColorBottom="black" borderWidthTop="0" alignment="center">
        <p:text value="Example report - generated #{date} by #{user.name}"/>
      </p:header>
      <p:footer borderWidthTop="0.1" borderColorTop="black" borderWidthBottom="0" alignment="center">
        <p:text value="Page "/>
        <p:pageNumber/>
      </p:footer>
    </p:font>
  </f:facet>

Next the front page for the Report should be defined to provide a short summary of the Report. To
keep this example short only a fragment is presented. For the complete script please see Section D.3,
“Example report template”.

  <p:font style="bold" size="16">
    <p:paragraph alignment="center" spacingAfter="5">
      <p:text value="#{user.name} (#{user.email})"/>
    </p:paragraph>
  </p:font>

The snippet shows how to insert the user's email address, which opens the email client when clicked.

9.3.3. Creating a Chart
To provide a graphical overview it is sometimes necessary to add a chart to the Report. The required
data can be prepared by the Groovy script and stored in the context. Then the report template can pass
the data to the charting component of SeamPDF. This section explains how to create a chart as shown
in Figure 9.8, “A Pie Chart Example”.

Figure 9.8. A Pie Chart Example

9.3.3.1. Pie Chart Groovy Script


As mentioned before, the Groovy script is not only used to retrieve the data, but it can also
be used to prepare the data for the Report. The following listing shows a possible way to
prepare the data for a pie chart. For every possible result a List object is created, then the
de.verit.klaros.core.model.KlarosTestCaseResult objects retrieved before are stored in one of the
lists depending on the test result. Next the lists are added to the context.

  List<KlarosTestCaseResult> error = new ArrayList<KlarosTestCaseResult>();
  List<KlarosTestCaseResult> failure = new ArrayList<KlarosTestCaseResult>();
  List<KlarosTestCaseResult> success = new ArrayList<KlarosTestCaseResult>();
  // Iterate over the results and retrieve the status
  Iterator<KlarosTestCaseResult> iter = (Iterator<KlarosTestCaseResult>) tcr.iterator();
  while (iter.hasNext()) {
    KlarosTestCaseResult result = iter.next();
    if (result.isError()) error.add(result);
    else if (result.isFailure()) failure.add(result);
    else if (result.isPassed()) success.add(result);
  }
  context.add("error", error);
  context.add("failure", failure);
  context.add("success", success);

The de.verit.klaros.core.model.KlarosTestCaseResult objects are split into three lists depending on the
test case result. These lists can then be accessed from the context by their corresponding key.

9.3.3.2. Pie Chart Template Script


The following snippet shows how to display the data that was prepared by the Groovy script before.

  <p:paragraph horizontalAlignment="center">
    <p:piechart title="Testresults" direction="anticlockwise" circular="true"
      startAngle="30" labelGap="0.1" labelLinkPaint="black" plotBackgroundPaint="white"
      labelBackgroundPaint="white" is3D="true">
      <p:series key="results">
        <p:data key="Error [#{error.size}]" value="#{error.size}" sectionPaint="#FF0A0A"/>
        <p:data key="Success [#{success.size}]" value="#{success.size}" sectionPaint="#33CC00"/>
        <p:data key="Failure [#{failure.size}]" value="#{failure.size}" sectionPaint="#FFCC00"/>
      </p:series>
    </p:piechart>
  </p:paragraph>

The piechart element builds the frame for the chart by defining its overall appearance. For
detailed information on the different chart types please check the SeamPDF documentation. For a pie
chart a series of data is required. The data is retrieved from the lists stored in the context by the Groovy script.

  <p:data key="Error [#{error.size}]" value="#{error.size}" sectionPaint="#FF0A0A"/>

This code retrieves the list containing the error results from the context and calls its size() method
to determine the number of erroneous test cases. The pie chart is then rendered from the three data
sections, as seen in Figure 9.8, “A Pie Chart Example”.

9.3.4. Including Images
This section shows how to include an image into a Report. Since the Report is rendered by Seam, the
image to be included must be accessible from Seam. This can either be achieved by storing the image
in a .jar file and placing this file in the .klaros/resources folder of the user running
Klaros-Testmanagement, or by providing the image on a web server, from where it can be included by
using HTML code in the report template.

When storing the image in a .jar file it can be easily accessed by the following code snippet.

  <p:image value="images/image.png"/>

The value attribute defines the image filename and the folder of the image location.

When providing an image via a web server it can be accessed by the following code snippet.

  <p:html>
    <img src="http://www.verit.de/images/logo-klaros-160.png"/>
  </p:html>

In this case the src attribute defines the URL of the image location.

Chapter 10. Import/Export
Klaros-Testmanagement has several interfaces to import results from other tools and export its data to
several formats.

10.1. Export Table Content to Excel


Klaros-Testmanagement Enterprise Edition Feature
This feature is only available in the Klaros-Testmanagement Enterprise Edition.

Klaros-Testmanagement can export the content of all tables to an Excel file. The current
filter and sort settings are taken into account. You can choose whether to export all elements of
the list to Excel, or just the elements on the current page of the table.

Figure 10.1. Export table content to Excel

10.2.  Backup/Recovery
To move data between different database installations or to selectively import data, Klaros-Testmanagement
provides the functionality to import and export database content via XML files. The format of these
files is explained in Appendix C, Dump File Specification. Section 8.6, “Backup/Recovery” explains
the import and export functionality.

10.3. Integration With Other Frameworks


Klaros-Testmanagement has several interfaces to import results from other tools.

10.3.1. Hudson Plugin for Klaros-Testmanagement


The Hudson plugin integrates the continuous integration server Hudson with Klaros-Testmanagement by
publishing the test results of a Hudson build to the Klaros-Testmanagement application. The test results
will be stored in the Klaros-Testmanagement database for further evaluation and reporting purposes.
The installation and configuration guide for the plugin can be found in the Hudson Wiki.

10.3.2. QF-Test and JUnit Import


Klaros-Testmanagement is able to import test results via a REST interface into the Klaros-Testmanagement
database. By default the URL of the importer is http://localhost:18080/klaros-web/importer. The content
will be transferred via an HTTP PUT request using the above URL and various URL query parameters.
The following parameters are supported:

config
    The ID of the project to import the results into (e.g. P00001).

env
    The ID of the test environment in which the tests have been run (e.g. ENV00001). Please make
    sure that this test environment already exists in the project before starting the import.

sut
    The ID of the system under test on which the tests have been run (e.g. SUT00001). Please make
    sure that this system under test already exists in the project before starting the import.

type
    The type of the import format. Currently this is either qftest for QF-Test or junit for
    JUnit XML result files.

time
    The time of the import. Please make sure the format of the time is "dd.MM.yyyy_HH:mm".

username (since plugin version 1.1)
    The user name for the import. If Klaros-Testmanagement is configured to use authentication
    for the import a valid user name must be passed to the importer.

password (since plugin version 1.1)
    The password for the import. If Klaros-Testmanagement is configured to use authentication
    for the import a valid password must be passed to the importer.
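For scripted imports, the expected time string can be produced with Java's SimpleDateFormat. The following sketch formats a date in the required pattern; the UTC time zone is fixed here only to make the example deterministic:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class ImportTimestamp {

    // Formats a date in the "dd.MM.yyyy_HH:mm" pattern expected by the importer.
    public static String format(Date date, TimeZone zone) {
        SimpleDateFormat formatter = new SimpleDateFormat("dd.MM.yyyy_HH:mm");
        formatter.setTimeZone(zone);
        return formatter.format(date);
    }

    public static void main(String[] args) {
        // The epoch rendered in UTC
        System.out.println(format(new Date(0L), TimeZone.getTimeZone("UTC")));
    }
}
```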

A complete example for a QF-Test import URL would look like this:
http://localhost:18080/klaros-web/importer?
config=P00001&env=ENV00001&sut=SUT00001&type=qftest&time=01.01.1970_12:00&username=validUser&password=validPassword

Example 10.1. QF-Test import URL sample

with the result file contained in the HTTP request body.
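Assuming the curl command line tool is available, such an import can be scripted as sketched below; host, IDs, and the result file name are placeholders that must match your installation:

```shell
# Assemble the importer URL from its query parameters (placeholder values)
base="http://localhost:18080/klaros-web/importer"
params="config=P00001&env=ENV00001&sut=SUT00001&type=junit&time=01.01.1970_12:00"
url="${base}?${params}"
echo "${url}"

# The result file is transferred in the body of an HTTP PUT request, e.g.:
# curl -X PUT --data-binary @results.xml "${url}"
```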

Glossary
A
Administrator User role that has access to all functionalities in Klaros-Testmanagement.

D
Database A database is a collection of information organized into interrelated tables
of data and specifications of data objects.

E
E-Mail Electronic mail, often abbreviated as e-mail, is any method of creating,
transmitting, or storing primarily text-based communications with digital
communications systems.

I
Issue An issue is a unit of work to accomplish an improvement in a system.
An issue could be a bug, a requested feature, a task, missing documentation,
and so forth.

J
JIRA JIRA is a bug tracking, issue tracking, and project management system by
Atlassian Software.

JUnit JUnit is a unit testing framework for the Java programming language.

M
Manager User role that has access to create, edit, delete and search for objects, run
test cases and test suites, show results and generate reports.

O
Operating System An operating system (commonly abbreviated to either OS or O/S) is an
interface between hardware and applications. It is responsible for the management
and coordination of activities and the sharing of the limited resources of the
computer. Common contemporary operating systems include Microsoft Windows,
Mac OS, Linux, BSD and Solaris.

P
Postcondition Environmental and state conditions that must be fulfilled after the
execution of a test or test procedure.

Precondition Environmental and state conditions that must be fulfilled before the
component or system can be executed with a particular test or test procedure.


Q
QF-Test QF-Test is a professional tool for automated testing of Java and Web
applications with a graphical user interface from Quality First Software.

R
Redmine Redmine is an open source bug tracking, issue tracking, and project
management system.

T
Test Case A test case is a set of input values, execution preconditions, expected
results and execution postconditions, developed for a particular objective or
test condition, such as determining whether an application or software system
meets specifications.

Tester User role that has access to search for objects, run test cases and test suites
and show results.

Test Execution The process of running a test on the component or system under test,
producing actual result(s).

Test Suite A test suite is a set of test cases that can be executed as a group.
A test suite is used to verify and ensure that a product or system
meets its design specifications and other requirements.

Trac Trac is an open source bug tracking and issue tracking system.

U
URL Uniform Resource Locator (URL) specifies where an identified resource is
available and the mechanism for retrieving it.

W
Web browser A web browser is a software application which enables a user to display and
interact with text, images and other information typically located on a Web
page at a Web site on the World Wide Web or a local area network.

Index
getActiveProject, 195, 198
getAllTestRequirements, 126, 163
getArea, 135, 172
A getAssignee, 149, 188
add, 194, 197 getAttachments, 143, 180
addParameter, 194, 197 getBranches, 128, 167, 173, 184, 188
getBranchRoot, 129, 167, 173, 184, 189
B getChildren, 127, 163
Back, 53 getConfiguration, 131, 135, 146, 147, 149,
167, 173, 184, 185, 189
getCoverage, 127, 163
C getCoveringHeadTestCases, 146, 184
Classes
getCoveringTestCases, 146, 184
KlarosAttachment, 153
getCovers, 135, 173
KlarosConfiguration, 153
getCreated, 135, 149, 173, 189
KlarosContext, 193
getCreatedString, 136, 173
KlarosExternalImplementation, 156
getCreator, 131, 136, 144, 147, 150, 167,
KlarosIssue, 157
173, 182, 186, 189
KlarosNamedEntity, 158
getDate, 122, 156
KlarosObjectFactory, 161
getDependency, 136, 173
KlarosProject, 161
getDepth, 127, 163
KlarosQueryFactory, 195
getDescendants, 127, 164
KlarosRequirementGroup, 162
getDescription, 121, 124, 136, 140, 154,
KlarosRevision, 164
157, 173, 177
KlarosStatusMessage, 168
getDesignTechnique, 136, 173
KlarosSUTImplementation, 166
getDetectedIssues, 136, 174
KlarosTag, 169
getDocbase, 136, 174
KlarosTestCase, 170
getDynaClass, 159
KlarosTestCaseResult, 176
getEmail, 152, 192
KlarosTestCaseState, 178
getEnv, 147, 186
KlarosTestCaseStep, 180
getEnvs, 121, 154
KlarosTestEnvironment, 181
getEvaluation, 137, 174
KlarosTestExecutable, 182
getExecution, 137, 174
KlarosTestRequirement, 183
getExecutionTime, 140, 177
KlarosTestRun, 184
getFamily, 127, 164
KlarosTestSuite, 187
getGroup, 146, 184
KlarosTestSuiteResult, 190
getImplementation, 137, 174
KlarosUser, 191
getImplementationOf, 123, 156
ParameterContext, 198
getInstance, 161
ParameterType, 199
getLastEditor, 150, 189
compareTo, 154, 167, 172, 177, 181, 185,
getLastUpdated, 150, 189
191, 192
getLevel, 137, 174
contains, 159
getLocale, 195, 198
getMappedClass, 161
E getMessage, 132, 169
equals, 154, 167, 172, 177, 181, 185, getName, 121, 124, 150, 155, 159, 189
191, 192 getNote, 137, 174
Error, 53 getNumberCoveredRequirements, 127, 164
execute, 196, 196, 196 getNumberErrors, 147, 186
executeParameterizedQuery, 194, 197 getNumberFailures, 147, 186
executeQuery, 194, 197 getNumberPassed, 148, 186
getNumberRequirements, 127, 164
F getNumberStates, 131, 167
Failure, 53 getParameter, 195, 198
getParameters, 199
G getParameterValue, 195, 198
get, 159, 159, 159 getParent, 128, 164
getAction, 143, 180 getPostcondition, 137, 143, 174, 180


getPrecondition, 138, 143, 175, 181 IKlarosExternalImplementation, 122


getPredecessor, 129, 167, 175, 184, 189 IKlarosExternalLink, 123
getPriority, 138, 175 IKlarosIssue, 123
getProductversion, 131, 167 IKlarosNamedEntity, 124
getPropertyAsList, 125, 160 IKlarosProject, 125
getPropertyAsString, 125, 160 IKlarosRequirementGroup, 126
getReference, 123, 157 IKlarosRevision, 128
getResults, 138, 148, 150, 175, 186, 189 IKlarosStatusMessage, 132
getRevisionComment, 129, 165 IKlarosSUTImplementation, 130
getRevisionId, 129, 165 IKlarosTag, 133
getRevisions, 133, 170 IKlarosTestCase, 134
getRole, 152, 192 IKlarosTestCaseResult, 139
getRoot, 129, 168, 175, 184, 189 IKlarosTestCaseState, 141
getRunId, 148, 186 IKlarosTestCaseStep, 142
getShortname, 138, 144, 150, 175, 182, IKlarosTestEnvironment, 143
189 IKlarosTestExecutable, 145
getState, 138, 175 IKlarosTestRequirement, 145
getStateDescription, 142, 179 IKlarosTestRun, 146
getStateName, 138, 142, 175, 179 IKlarosTestSuite, 148
getStates, 124, 139, 157, 175 IKlarosTestSuiteResult, 151
getStatus, 132, 169 IKlarosUser, 152
getSuccessor, 129, 168, 175, 184, 190 Interfaces
getSummary, 140, 177 IKlarosAttachment, 120
getSut, 142, 148, 150, 179, 187, 190 IKlarosConfiguration, 120
getSuts, 121, 155 IKlarosContext, 196
getTagId, 133, 170 IKlarosExternalImplementation, 122
getTags, 129, 166 IKlarosExternalLink, 123
getTeam, 139, 176 IKlarosIssue, 123
getTestCase, 140, 142, 177, 179 IKlarosNamedEntity, 124
getTestCases, 121, 124, 151, 155, 158, IKlarosProject, 125
190 IKlarosRequirementGroup, 126
getTestCaseStates, 131, 131, 168, 168 IKlarosRevision, 128
getTestCaseSteps, 139, 176 IKlarosStatusMessage, 132
getTestRequirements, 121, 128, 155, 164 IKlarosSUTImplementation, 130
getTestRun, 141, 151, 178, 191 IKlarosTag, 133
getTestRuns, 122, 131, 144, 155, 168, 182 IKlarosTestCase, 134
getTestSuite, 152, 191 IKlarosTestCaseResult, 139
getTestSuiteResultCount, 151, 190 IKlarosTestCaseState, 141
getTestSuites, 122, 155 IKlarosTestCaseStep, 142
getTimestamp, 132, 133, 148, 169, 170, IKlarosTestEnvironment, 143
187 IKlarosTestExecutable, 145
getTimestampString, 148, 187 IKlarosTestRequirement, 145
getTraceability, 139, 176 IKlarosTestRun, 146
getTrunkRoot, 130, 168, 176, 184, 190 IKlarosTestSuite, 149
getUser, 133, 169 IKlarosTestSuiteResult, 151
getUsername, 152, 192 IKlarosUser, 152
getUuid, 120, 153 KlarosScript, 196
getVariety, 139, 176 isDefinedProperty, 125, 125, 144, 160
isEnabled, 122, 132, 144, 151, 156, 168,
H 182, 190
isError, 141, 178
hashCode, 155, 168, 176, 178, 182, 187,
isFailure, 141, 178
191, 192
isPassed, 141, 178

I J
IKlarosAttachment, 120
JUnit, 112
IKlarosConfiguration, 120
IKlarosContext, 196


K getCoveringTestCases, 146, 184


getCovers, 135, 173
KlarosAttachment, 153
getCreated, 135, 149, 173, 189
KlarosConfiguration, 153
getCreatedString, 136, 173
KlarosContext, 193, 194, 194
getCreator, 131, 136, 144, 147, 150,
KlarosExternalImplementation, 156
167, 173, 182, 186, 189
KlarosIssue, 157
getDate, 122, 156
KlarosNamedEntity, 158
getDependency, 136, 173
KlarosObjectFactory, 161
getDepth, 127, 163
KlarosProject, 161
getDescendants, 127, 164
KlarosQueryFactory, 195, 196
getDescription, 121, 124, 136, 140, 154,
KlarosRequirementGroup, 162
157, 173, 177
KlarosRevision, 164
getDesignTechnique, 136, 173
KlarosScript, 196
getDetectedIssues, 136, 174
KlarosStatusMessage, 168
getDocbase, 136, 174
KlarosSUTImplementation, 166
getDynaClass, 159
KlarosTag, 169
getEmail, 152, 192
KlarosTestCase, 170
getEnv, 147, 186
KlarosTestCaseResult, 176
getEnvs, 121, 154
KlarosTestCaseState, 178
getEvaluation, 137, 174
KlarosTestCaseStep, 180
getExecution, 137, 174
KlarosTestEnvironment, 181
getExecutionTime, 140, 177
KlarosTestExecutable, 182
getFamily, 127, 164
KlarosTestRequirement, 183
getGroup, 146, 184
KlarosTestRun, 184
getImplementation, 137, 174
KlarosTestSuite, 187
getImplementationOf, 123, 156
KlarosTestSuiteResult, 190
getInstance, 161
KlarosUser, 191
getLastEditor, 150, 189
getLastUpdated, 150, 189
L getLevel, 137, 174
Login, 22 getLocale, 195, 198
getMappedClass, 161
M getMessage, 132, 169
Methods getName, 121, 124, 150, 155, 159, 189
add, 194, 197 getNote, 137, 174
addParameter, 194, 197 getNumberCoveredRequirements, 127, 164
compareTo, 154, 167, 172, 177, 181, getNumberErrors, 147, 186
185, 191, 192 getNumberFailures, 147, 186
contains, 159 getNumberPassed, 148, 186
equals, 154, 167, 172, 177, 181, 185, getNumberRequirements, 127, 164
191, 192 getNumberStates, 131, 167
execute, 196, 196, 196 getParameter, 195, 198
executeParameterizedQuery, 194, 197 getParameters, 199
executeQuery, 194, 197 getParameterValue, 195, 198
get, 159, 159, 159 getParent, 128, 164
getAction, 143, 180 getPostcondition, 137, 143, 174, 180
getActiveProject, 195, 198 getPrecondition, 138, 143, 175, 181
getAllTestRequirements, 126, 163 getPredecessor, 129, 167, 175, 184, 189
getArea, 135, 172 getPriority, 138, 175
getAssignee, 149, 188 getProductversion, 131, 167
getAttachments, 143, 180 getPropertyAsList, 125, 160
getBranches, 128, 167, 173, 184, 188 getPropertyAsString, 125, 160
getBranchRoot, 129, 167, 173, 184, 189 getReference, 123, 157
getChildren, 127, 163 getResults, 138, 148, 150, 175, 186,
getConfiguration, 131, 135, 146, 147, 189
149, 167, 173, 184, 185, 189 getRevisionComment, 129, 165
getCoverage, 127, 163 getRevisionId, 129, 165
getCoveringHeadTestCases, 146, 184 getRevisions, 133, 170

117
Index

getRole, 152, 192 Precondition, 39, 41, 52


getRoot, 129, 168, 175, 184, 189 Projects
getRunId, 148, 186 Create Project, 26
getShortname, 138, 144, 150, 175, 182,
189 Q
getState, 138, 175 QF-Test, 112
getStateDescription, 142, 179
getStateName, 138, 142, 175, 179
getStates, 124, 139, 157, 175
R
remove, 160
getStatus, 132, 169
Role
getSuccessor, 129, 168, 175, 184, 190
Administrator, 6
getSummary, 140, 177
Manager, 6
getSut, 142, 148, 150, 179, 187, 190
Tester, 6
getSuts, 121, 155
getTagId, 133, 170
getTags, 129, 166 S
getTeam, 139, 176 set, 160, 160, 160
getTestCase, 140, 142, 177, 179 setLocale, 195, 198
getTestCases, 121, 124, 151, 155, 158, Skip, 53
190 Systems Under Test, 33, 33
getTestCaseStates, 131, 131, 168, 168
getTestCaseSteps, 139, 176 T
getTestRequirements, 121, 128, 155, 164 Test
getTestRun, 141, 151, 178, 191 Test Case, 50
getTestRuns, 122, 131, 144, 155, 168, Test Suite, 54
182 Test Environments, 30
getTestSuite, 152, 191 Test Execution, 54, 76
getTestSuiteResultCount, 151, 190
getTestSuites, 122, 155 W
getTimestamp, 132, 133, 148, 169, 170, Web Browser, 22
187
getTimestampString, 148, 187
getTraceability, 139, 176
getTrunkRoot, 130, 168, 176, 184, 190
getUser, 133, 169
getUsername, 152, 192
getUuid, 120, 153
getVariety, 139, 176
hashCode, 155, 168, 176, 178, 182,
187, 191, 192
isDefinedProperty, 125, 125, 144, 160
isEnabled, 122, 132, 144, 151, 156,
168, 182, 190
isError, 141, 178
isFailure, 141, 178
isPassed, 141, 178
KlarosContext, 194, 194
KlarosQueryFactory, 196
ParameterContext, 199, 199, 199
remove, 160
set, 160, 160, 160
setLocale, 195, 198

P
ParameterContext, 198, 199, 199, 199
ParameterType, 199
Passed, 53
Postcondition, 39, 41

Appendix A. Role Permission Overview
Actions/Roles Administrator Manager Tester
Create, Edit and Delete users with role 'Admin'
Create, Edit and Delete users with role 'Manager'
Create, Edit and Delete users with role 'Tester'
Search users
Edit own user
Edit preferences: E-Mail, Logging, General and Issue Management
Backup and Recovery
Create, Edit and Delete objects
Search objects
Execute objects
Show results
Generate reports

Table A.1. Role Permission Overview Table

Role Permission Overview Table - Enterprise Edition

Actions/Roles Administrator Manager Tester


Create customized reports
Generate customized reports
Export Excel tables
Configure Dashboard
Define custom fields for Test Cases/Test Suites/SUTs/Test Environments
Configure LDAP Authentication

Table A.2. Role Permission Overview Table - Enterprise Edition

Appendix B. Model API Reference
B.1. Klaros-Core API Reference
B.1.1. Package de.verit.klaros.core.model
B.1.1.1. Interface IKlarosAttachment
This interface provides access to a binary attachment.

B.1.1.1.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosAttachment {
// Public Methods

public java.lang.String getUuid();
}

B.1.1.1.2. getUuid()
public java.lang.String getUuid();

Get the uuid.

Parameters
return The uuid of this attachment.

B.1.1.2. Interface IKlarosConfiguration
This interface provides access to data of a project's configuration.

B.1.1.2.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosConfiguration {
// Public Methods

public java.lang.String getDescription();

public java.util.Set<de.verit.klaros.core.model.KlarosTestEnvironment> getEnvs();

public java.lang.String getName();

public java.util.Set<de.verit.klaros.core.model.KlarosSUTImplementation> getSuts();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getTestRequirements();

public java.util.Set<de.verit.klaros.core.model.KlarosTestRun> getTestRuns();

public java.util.Set<de.verit.klaros.core.model.KlarosTestSuite> getTestSuites();

public java.lang.Boolean isEnabled();
}


B.1.1.2.2. getDescription()
public java.lang.String getDescription();

Returns the project description.

Parameters
return The description of the project.

B.1.1.2.3. getEnvs()
public java.util.Set<de.verit.klaros.core.model.KlarosTestEnvironment> getEnvs();

Returns the project test environments.

Parameters
return Set containing the test environments of the project.

B.1.1.2.4. getName()
public java.lang.String getName();

Returns the project name.

Parameters
return The name of the project.

B.1.1.2.5. getSuts()
public java.util.Set<de.verit.klaros.core.model.KlarosSUTImplementation> getSuts();

Returns the project SUTs (systems under test).

Parameters
return KlarosSet containing the SUT objects of the project.

B.1.1.2.6. getTestCases()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

Returns the project test cases.

Parameters
return Set containing the test case objects of the project.

B.1.1.2.7. getTestRequirements()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getTestRequirements();

Returns the project test requirements.

Parameters
return Set containing the test requirement objects of the project.


B.1.1.2.8. getTestRuns()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRun> getTestRuns();

Returns the project test runs.

Parameters
return Set containing the test run objects of the project.

B.1.1.2.9. getTestSuites()
public java.util.Set<de.verit.klaros.core.model.KlarosTestSuite> getTestSuites();

Returns the project test suites.

Parameters
return Set containing the test suite objects of the project.

B.1.1.2.10. isEnabled()
public java.lang.Boolean isEnabled();

Returns whether this project is enabled or not.

Parameters
return true if this project is enabled and active, false if not.
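As an illustration, a Groovy report script could use these accessors to publish some project figures to a report template. This is only a sketch; it assumes a variable project holding a project configuration that was retrieved by a query beforehand:

```groovy
// Sketch: expose some project figures to the report template.
// "project" is assumed to hold a KlarosConfiguration instance.
context.add("projectName", project.getName());
context.add("testCaseCount", project.getTestCases().size());
context.add("testSuiteCount", project.getTestSuites().size());
```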

B.1.1.3. Interface IKlarosExternalImplementation
This interface provides access to the information about the implementation of a test case.

B.1.1.3.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosExternalImplementation
    extends, de.verit.klaros.core.model.IKlarosExternalLink {
// Public Methods

public java.lang.String getDate();

public de.verit.klaros.core.model.KlarosTestCase getImplementationOf();
}

B.1.1.3.2. getDate()
public java.lang.String getDate();

Get date.

Parameters


return String containing the date of the creation of this implementation.

B.1.1.3.3. getImplementationOf()
public de.verit.klaros.core.model.KlarosTestCase getImplementationOf();

Get related test case.

Parameters
return The test case for which this is an implementation.

B.1.1.4. Interface IKlarosExternalLink
This interface provides access to externally stored information about an object.

B.1.1.4.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosExternalLink {
// Public Methods

public java.lang.String getReference();
}

B.1.1.4.2. getReference()
public java.lang.String getReference();

Get the reference to the externally stored information.

Parameters
return The reference to the information.

B.1.1.5. Interface IKlarosIssue
This interface provides access to a software issue.

B.1.1.5.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosIssue {
// Public Methods

public java.lang.String getDescription();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseState> getStates();

public java.util.List<de.verit.klaros.core.model.KlarosTestCase> getTestCases();
}


B.1.1.5.2. getDescription()
public java.lang.String getDescription();

Get the description.

Parameters
return The description of this issue.

B.1.1.5.3. getStates()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseState> getStates();

Get the related test case states.

Parameters
return Set of test case states which exist because this issue is 'active' (created
but not solved).

B.1.1.5.4. getTestCases()
public java.util.List<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

Get the related test cases.

Parameters
return List of test cases which have detected this issue.

B.1.1.6. Interface IKlarosNamedEntity
This interface provides access to data of a properties owner.

B.1.1.6.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosNamedEntity {
// Public Methods

public java.lang.String getName();

public java.util.List<java.lang.String> getPropertyAsList(java.lang.String name);

public java.lang.String getPropertyAsString(java.lang.String name);

public boolean isDefinedProperty(java.lang.String propertyName);
}

B.1.1.6.2. getName()
public java.lang.String getName();

Get name.

Parameters
return The name of the properties owner.


B.1.1.6.3. getPropertyAsList(String)
public java.util.List<java.lang.String> getPropertyAsList(java.lang.String name);

Get the generic property as a List.

Parameters
name The property's name.
return The property's value in a list.

B.1.1.6.4. getPropertyAsString(String)
public java.lang.String getPropertyAsString(java.lang.String name);

Get the generic property as a String.

Parameters
name The property's name.
return The property's value in a single string.

B.1.1.6.5. isDefinedProperty(String)
public boolean isDefinedProperty(java.lang.String propertyName);

Check if a property identified by given name is a defined property.

Parameters
propertyName The name of the property to check.
return true if the identified property is a defined property, false else.
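As a sketch, isDefinedProperty and getPropertyAsString might be combined in a Groovy script as follows; the property name buildLabel is a made-up example for a custom field, and entity is assumed to hold an IKlarosNamedEntity instance:

```groovy
// Only read the custom field if it is defined for this entity
if (entity.isDefinedProperty("buildLabel")) {
    context.add("buildLabel", entity.getPropertyAsString("buildLabel"));
}
```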

B.1.1.7. Interface IKlarosProject
This interface provides access to data of a test project.

B.1.1.7.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosProject
    extends, de.verit.klaros.core.model.IKlarosNamedEntity {
// Public Methods

public boolean isDefinedProperty(java.lang.String propertyName);
}

B.1.1.7.2. isDefinedProperty(String)
public boolean isDefinedProperty(java.lang.String propertyName);


Specified by: Method isDefinedProperty in interface IKlarosNamedEntity

Check if a property identified by given name is a defined property.

Parameters
propertyName The name of the property to check.
return true if the identified property is a defined property, false else.

B.1.1.8. Interface IKlarosRequirementGroup
The interface to retrieve the information about a requirement group.

B.1.1.8.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosRequirementGroup
    extends, de.verit.klaros.core.model.IKlarosNamedEntity {
// Public Methods

public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getAllTestRequirements();

public java.util.Set<de.verit.klaros.core.model.KlarosRequirementGroup> getChildren();

public double getCoverage();

public java.lang.Integer getDepth();

public java.util.List<de.verit.klaros.core.model.KlarosRequirementGroup> getDescendants();

public java.util.List<de.verit.klaros.core.model.KlarosRequirementGroup> getFamily();

public int getNumberCoveredRequirements();

public int getNumberRequirements();

public de.verit.klaros.core.model.KlarosRequirementGroup getParent();

public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getTestRequirements();

B.1.1.8.2. getAllTestRequirements()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getAllTestRequirements();

Get all test requirements of this group hierarchy.

Parameters
return Set containing own test requirements and all test requirements of
groups which are descendants of this group.


B.1.1.8.3. getChildren()
public java.util.Set<de.verit.klaros.core.model.KlarosRequirementGroup> getChildren();

Get the children of this requirement group.

Parameters
return Set containing the requirement groups that are children of this requirement group.

B.1.1.8.4. getCoverage()
public double getCoverage();

Get test requirement coverage for the group.

Parameters
return The coverage as a fraction between 0.0 and 1.0. If no requirements are found, 1.0 is returned.

B.1.1.8.5. getDepth()
public java.lang.Integer getDepth();

Get depth in tree of this node.

Parameters
return The depth in the tree

B.1.1.8.6. getDescendants()
public java.util.List<de.verit.klaros.core.model.KlarosRequirementGroup> getDescendants();

Get all descending subgroups of this group.

Parameters
return List containing all groups which are descendants of this group.

B.1.1.8.7. getFamily()
public java.util.List<de.verit.klaros.core.model.KlarosRequirementGroup> getFamily();

Get whole family with this group as root.

Parameters
return List containing this group and all of its descendants.

B.1.1.8.8. getNumberCoveredRequirements()
public int getNumberCoveredRequirements();

Get number of test requirements covered by at least one test case.

Parameters
return The number of covered test requirements.

B.1.1.8.9. getNumberRequirements()
public int getNumberRequirements();

Get number of test requirements, including all descendants' test requirements.

Parameters
return The number of overall test requirements.


B.1.1.8.10. getParent()
public de.verit.klaros.core.model.KlarosRequirementGroup getParent();

Get the parent of this requirement group.

Parameters
return The parent requirement group, or null.

B.1.1.8.11. getTestRequirements()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getTestRequirements();

Get the related test requirements of this requirement group.

Parameters
return Set containing the test requirements which belong to this requirement
group.
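The coverage arithmetic described by getNumberRequirements(), getNumberCoveredRequirements() and getCoverage() can be illustrated with a minimal local sketch. RequirementGroup below is a stand-in, not the Klaros class, and the assumption that the covered count also recurses into descendants is inferred from the getCoverage() description:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the documented requirement-group coverage arithmetic.
class RequirementGroup {

    final List<RequirementGroup> children = new ArrayList<>();
    // One flag per requirement: true if covered by at least one test case.
    final List<Boolean> requirementsCovered = new ArrayList<>();

    // Mirrors getNumberRequirements(): own requirements plus all descendants'.
    int getNumberRequirements() {
        int count = requirementsCovered.size();
        for (RequirementGroup child : children) {
            count += child.getNumberRequirements();
        }
        return count;
    }

    // Mirrors getNumberCoveredRequirements() (recursion is an assumption).
    int getNumberCoveredRequirements() {
        int count = 0;
        for (Boolean covered : requirementsCovered) {
            if (covered) count++;
        }
        for (RequirementGroup child : children) {
            count += child.getNumberCoveredRequirements();
        }
        return count;
    }

    // Mirrors getCoverage(): 1.0 when the hierarchy holds no requirements.
    double getCoverage() {
        int total = getNumberRequirements();
        return total == 0 ? 1.0 : (double) getNumberCoveredRequirements() / total;
    }

    public static void main(String[] args) {
        RequirementGroup root = new RequirementGroup();
        RequirementGroup child = new RequirementGroup();
        root.children.add(child);
        root.requirementsCovered.add(true);
        child.requirementsCovered.add(false);
        System.out.println(root.getCoverage()); // 0.5
    }
}
```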

B.1.1.9. Interface IKlarosRevision
This interface provides access to a revisionable Klaros object.

B.1.1.9.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosRevision
    extends de.verit.klaros.core.model.IKlarosNamedEntity {
// Public Methods

public de.verit.klaros.core.model.KlarosRevision<?, ?> getBranchRoot();

public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();

public de.verit.klaros.core.model.KlarosRevision<?, ?> getPredecessor();

public java.lang.String getRevisionComment();

public java.lang.String getRevisionId();

public de.verit.klaros.core.model.KlarosRevision<?, ?> getRoot();

public de.verit.klaros.core.model.KlarosRevision<?, ?> getSuccessor();

public java.util.Set<de.verit.klaros.core.model.KlarosTag> getTags();

public de.verit.klaros.core.model.KlarosRevision<?, ?> getTrunkRoot();

B.1.1.9.2. getBranches()
public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();


Get branches of this revision.

Parameters
return Set containing all revision objects which have been created as the first revision
of a branch (trunk root) and have been created using this revision.

B.1.1.9.3. getBranchRoot()
public de.verit.klaros.core.model.KlarosRevision<?, ?> getBranchRoot();

Get the original revision of a trunk root revision object.

Parameters
return The revision object that has been used to create a new branch.

B.1.1.9.4. getPredecessor()
public de.verit.klaros.core.model.KlarosRevision<?, ?> getPredecessor();

Get the predecessor of the revision.

Parameters
return The revision object that is the predecessor of this revision.

B.1.1.9.5. getRevisionComment()
public java.lang.String getRevisionComment();

Get comment.

Parameters
return The comment of the revision.

B.1.1.9.6. getRevisionId()
public java.lang.String getRevisionId();

Get the revision id.

Parameters
return The revision id.

B.1.1.9.7. getRoot()
public de.verit.klaros.core.model.KlarosRevision<?, ?> getRoot();

Get the root of the revision hierarchy.

Parameters
return The root revision object.

B.1.1.9.8. getSuccessor()
public de.verit.klaros.core.model.KlarosRevision<?, ?> getSuccessor();

Get the successor of the revision.

Parameters
return The revision object that is the successor of this revision.

B.1.1.9.9. getTags()
public java.util.Set<de.verit.klaros.core.model.KlarosTag> getTags();


Get tags of this revision.

Parameters
return Set containing all tags this revision belongs to.

B.1.1.9.10. getTrunkRoot()
public de.verit.klaros.core.model.KlarosRevision<?, ?> getTrunkRoot();

Get the 'root' of a trunk.

Parameters
return The revision which is the first revision of the branch to which this revision
belongs.
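The revision navigation above amounts to pointer chasing along predecessor links. The Revision class below is an illustrative stand-in for de.verit.klaros.core.model.KlarosRevision, showing only how getRoot() can be pictured:

```java
// Illustrative stand-in for the revision chain; not the Klaros implementation.
class Revision {

    final String revisionId;
    final Revision predecessor; // null for the root revision

    Revision(String revisionId, Revision predecessor) {
        this.revisionId = revisionId;
        this.predecessor = predecessor;
    }

    // Mirrors getRoot(): follow predecessor links to the start of the hierarchy.
    Revision getRoot() {
        Revision current = this;
        while (current.predecessor != null) {
            current = current.predecessor;
        }
        return current;
    }

    public static void main(String[] args) {
        Revision r1 = new Revision("1", null);
        Revision r2 = new Revision("2", r1);
        Revision r3 = new Revision("3", r2);
        System.out.println(r3.getRoot().revisionId); // 1
    }
}
```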

B.1.1.10. Interface IKlarosSUTImplementation
This interface provides access to data of a system under test version.

B.1.1.10.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosSUTImplementation
    extends de.verit.klaros.core.model.IKlarosRevision {
// Public Methods

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.lang.String getCreator();

public int getNumberStates(java.lang.String name);

public java.lang.String getProductversion();

public java.util.Set<?> getTestCaseStates();

public java.util.List<de.verit.klaros.core.model.KlarosTestCaseState> getTestCaseStates(java.lang.String name);

public java.util.Set<?> getTestRuns();

public java.lang.Boolean isEnabled();


B.1.1.10.2. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Get configuration.

Parameters
return The related configuration.

B.1.1.10.3. getCreator()
public java.lang.String getCreator();

Parameters
return The name of the creator.

B.1.1.10.4. getNumberStates(String)
public int getNumberStates(java.lang.String name);

Get number of test case states which are related to this revision and have the given name.

Parameters
name The name of the states to look for
return The number of states with the given name related to this SUT object.

B.1.1.10.5. getProductversion()
public java.lang.String getProductversion();

Get product version.

Parameters
return The version id of the system under test.

B.1.1.10.6. getTestCaseStates()
public java.util.Set<?> getTestCaseStates();

Get test case states.

Parameters
return Set of test case states related to the SUT version.

B.1.1.10.7. getTestCaseStates(String)
public java.util.List<de.verit.klaros.core.model.KlarosTestCaseState> getTestCaseStates(java.lang.String name);

Get test case states which are related to this revision and have the given name.

Parameters
name The name of the states to look for
return List of the states with the given name

B.1.1.10.8. getTestRuns()
public java.util.Set<?> getTestRuns();


Get test runs.

Parameters
return Set of test runs performed for this SUT version.

B.1.1.10.9. isEnabled()
public java.lang.Boolean isEnabled();

Check if enabled.

Parameters
return true if enabled

B.1.1.11. Interface IKlarosStatusMessage
This interface provides access to data of a status message.

B.1.1.11.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosStatusMessage {
// Public Methods

public java.lang.String getMessage();

public java.lang.String getStatus();

public java.util.Date getTimestamp();

public de.verit.klaros.core.persistent.User getUser();

B.1.1.11.2. getMessage()
public java.lang.String getMessage();

Get the message of the status message.

Parameters
return String of the message of the status message.

B.1.1.11.3. getStatus()
public java.lang.String getStatus();

Get the status of the status message.

Parameters
return String of the status of the status message.

B.1.1.11.4. getTimestamp()
public java.util.Date getTimestamp();


Get the timestamp of the status message.

Parameters
return timestamp of the status message.

B.1.1.11.5. getUser()
public de.verit.klaros.core.persistent.User getUser();

Get the user of the status message.

Parameters
return User of the status message.

B.1.1.12. Interface IKlarosTag
This interface provides access to the data of a tag.

B.1.1.12.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTag {
// Public Methods

public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getRevisions();

public java.lang.String getTagId();

public java.util.Date getTimestamp();

B.1.1.12.2. getRevisions()
public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getRevisions();

Get related revisions.

Parameters
return Collection of revision objects which are related to this tag.

B.1.1.12.3. getTagId()
public java.lang.String getTagId();

Get tag id.

Parameters
return The id of the tag.

B.1.1.12.4. getTimestamp()
public java.util.Date getTimestamp();


Get timestamp.

Parameters
return The time of the creation of the tag as a Date object.

B.1.1.13. Interface IKlarosTestCase
This interface provides access to data of a test case.

B.1.1.13.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestCase
    extends de.verit.klaros.core.model.IKlarosRevision {
// Public Methods

public de.verit.klaros.core.types.TestAreatopic getArea();

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getCovers();

public java.util.Date getCreated();

public java.lang.String getCreatedString(java.util.Locale locale);

public de.verit.klaros.core.model.KlarosUser getCreator();

public java.lang.String getDependency();

public java.lang.String getDescription();

public de.verit.klaros.core.types.TestDesignTechnique getDesignTechnique();

public java.util.List<de.verit.klaros.core.model.KlarosIssue> getDetectedIssues();

public java.lang.String getDocbase();

public java.lang.String getEvaluation();

public de.verit.klaros.core.types.TestExecutionMethod getExecution();

public de.verit.klaros.core.model.KlarosExternalImplementation getImplementation();

public de.verit.klaros.core.types.TestLevel getLevel();

public java.lang.String getNote();

public java.lang.String getPostcondition();

public java.lang.String getPrecondition();

public de.verit.klaros.core.types.TestPriority getPriority();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseResult> getResults();

public java.lang.String getShortname();

public java.lang.String getState();

public java.lang.String getStateName(de.verit.klaros.core.model.KlarosSUTImplementation sut);

public java.util.Map<de.verit.klaros.core.model.KlarosSUTImplementation, de.verit.klaros.core.model.KlarosTestCaseState> getStates();

public java.lang.String getTeam();

public java.util.List<de.verit.klaros.core.model.KlarosTestCaseStep> getTestCaseSteps();

public java.lang.String getTraceability();

public de.verit.klaros.core.types.TestVariety getVariety();


B.1.1.13.2. getArea()
public de.verit.klaros.core.types.TestAreatopic getArea();

The area of this test case.

Parameters
return The area.

B.1.1.13.3. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Get configuration.

Parameters
return The related configuration.

B.1.1.13.4. getCovers()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getCovers();

Get covered test requirements.

Parameters
return Set of test requirements which are covered by this test case revision.

B.1.1.13.5. getCreated()
public java.util.Date getCreated();

The creation date of this test case.

Parameters
return The creation date.

B.1.1.13.6. getCreatedString(Locale)
public java.lang.String getCreatedString(java.util.Locale locale);

Formats a date with the given locale and returns it as string.

Parameters
locale The locale.
return The formatted date string.

B.1.1.13.7. getCreator()
public de.verit.klaros.core.model.KlarosUser getCreator();

Parameters
return The creator of this test case.

B.1.1.13.8. getDependency()
public java.lang.String getDependency();

The dependency of this test case.

Parameters
return The dependency.

B.1.1.13.9. getDescription()
public java.lang.String getDescription();

The description of this test case.

Parameters
return The description.

B.1.1.13.10. getDesignTechnique()
public de.verit.klaros.core.types.TestDesignTechnique getDesignTechnique();

The design technique of this test case.

Parameters
return The design technique.

B.1.1.13.11. getDetectedIssues()
public java.util.List<de.verit.klaros.core.model.KlarosIssue> getDetectedIssues();

Get detected issues.

Parameters
return List of issues which have been detected by this test case revision.

B.1.1.13.12. getDocbase()
public java.lang.String getDocbase();


The docbase of this test case.

Parameters
return The docbase.

B.1.1.13.13. getEvaluation()
public java.lang.String getEvaluation();

The evaluation of this test case.

Parameters
return The evaluation.

B.1.1.13.14. getExecution()
public de.verit.klaros.core.types.TestExecutionMethod getExecution();

The execution method of this test case.

Parameters
return The execution method.

B.1.1.13.15. getImplementation()
public de.verit.klaros.core.model.KlarosExternalImplementation getImplementation();

Get implementation.

Parameters
return The object describing the location of the implementation of this test case
revision.

B.1.1.13.16. getLevel()
public de.verit.klaros.core.types.TestLevel getLevel();

The level of this test case.

Parameters
return The level.

B.1.1.13.17. getNote()
public java.lang.String getNote();

The note of this test case.

Parameters
return The note.

B.1.1.13.18. getPostcondition()
public java.lang.String getPostcondition();

The postcondition of this test case.

Parameters
return The postcondition.

B.1.1.13.19. getPrecondition()
public java.lang.String getPrecondition();

The precondition of this test case.

Parameters
return The precondition.

B.1.1.13.20. getPriority()
public de.verit.klaros.core.types.TestPriority getPriority();

The priority of this test case.

Parameters
return The priority.

B.1.1.13.21. getResults()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseResult> getResults();

Get test case results.

Parameters
return Set of results of executions of this test case revision.

B.1.1.13.22. getShortname()
public java.lang.String getShortname();

The short name (title) of this test case.

Parameters
return The short name.

B.1.1.13.23. getState()
public java.lang.String getState();

The state of this test case.

Parameters
return The state.

B.1.1.13.24. getStateName(KlarosSUTImplementation)
public java.lang.String getStateName(de.verit.klaros.core.model.KlarosSUTImplementation sut);

Get name of the state related to given SUT.

Parameters
sut The related SUT implementation
return The name of the state


B.1.1.13.25. getStates()
public java.util.Map<de.verit.klaros.core.model.KlarosSUTImplementation, de.verit.klaros.core.model.KlarosTestCaseState> getStates();

Get test case states.

Parameters
return Collection of the states of this test case revision in relation to the different
SUT versions.

B.1.1.13.26. getTeam()
public java.lang.String getTeam();

The team responsible for this test case.

Parameters
return The team.

B.1.1.13.27. getTestCaseSteps()
public java.util.List<de.verit.klaros.core.model.KlarosTestCaseStep> getTestCaseSteps();

Get test case steps.

Parameters
return List of steps of this test case revision.

B.1.1.13.28. getTraceability()
public java.lang.String getTraceability();

The traceability of this test case.

Parameters
return The traceability.

B.1.1.13.29. getVariety()
public de.verit.klaros.core.types.TestVariety getVariety();

The variety of this test case.

Parameters
return The variety.
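The per-SUT state lookup performed by getStates() and getStateName(KlarosSUTImplementation) can be pictured as a map lookup. Sut, State and TestCaseStates below are illustrative stand-ins for the Klaros model classes, not their actual implementation:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the per-SUT state lookup of a test case.
class TestCaseStates {

    // Illustrative stand-ins for KlarosSUTImplementation and KlarosTestCaseState.
    record Sut(String version) {}
    record State(String name) {}

    // Mirrors getStates(): one state per SUT version.
    private final Map<Sut, State> states = new HashMap<>();

    void setState(Sut sut, State state) {
        states.put(sut, state);
    }

    // Mirrors getStateName(KlarosSUTImplementation): the state name for the given SUT.
    String getStateName(Sut sut) {
        State state = states.get(sut);
        return state == null ? null : state.name();
    }

    public static void main(String[] args) {
        TestCaseStates testCase = new TestCaseStates();
        Sut v1 = new Sut("1.0");
        testCase.setState(v1, new State("passed"));
        System.out.println(testCase.getStateName(v1)); // passed
    }
}
```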

B.1.1.14. Interface IKlarosTestCaseResult
This interface provides access to data of a test case result.

B.1.1.14.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestCaseResult
    extends de.verit.klaros.core.model.IKlarosNamedEntity {
// Public Methods

public java.lang.String getDescription();

public long getExecutionTime();

public java.lang.String getSummary();


public de.verit.klaros.core.model.KlarosTestCase getTestCase();

public de.verit.klaros.core.model.KlarosTestRun getTestRun();

public boolean isError();

public boolean isFailure();

public boolean isPassed();

B.1.1.14.2. getDescription()
public java.lang.String getDescription();

Get the test result description. This is usually set for failed/error status results.

Parameters
return The test result description.

B.1.1.14.3. getExecutionTime()
public long getExecutionTime();

Get the test execution time in ms.

Parameters
return The test execution time.

B.1.1.14.4. getSummary()
public java.lang.String getSummary();

Get the test result summary. This is usually set for failed/error status results.

Parameters
return The test result summary.

B.1.1.14.5. getTestCase()
public de.verit.klaros.core.model.KlarosTestCase getTestCase();

Get test case.

Parameters
return The test case that has been executed to get this result.

B.1.1.14.6. getTestRun()
public de.verit.klaros.core.model.KlarosTestRun getTestRun();

Get test run.

Parameters
return The test run that created this result.

B.1.1.14.7. isError()
public boolean isError();

Check if this is an error result.

It is assumed that error results have a property 'type' with the value 'E' or 'error'.

Parameters
return true if this result represents an error.

B.1.1.14.8. isFailure()
public boolean isFailure();

Check if this is a failure result.

It is assumed that failure results have a property 'type' with the value 'F' or 'failure'.

Parameters
return true if this result represents a failure.

B.1.1.14.9. isPassed()
public boolean isPassed();

Check if this is a result of a passed test case.

It is assumed that passed results have a property 'testCasePassed' with value 'true'.

Parameters
return true if this result represents a passed test case.
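The property conventions documented for isError(), isFailure() and isPassed() can be sketched as simple string comparisons on a result's generic properties. The mutable property map below is an illustrative stand-in, not the Klaros result implementation:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the documented result-classification convention.
class ResultClassifier {

    // Stand-in for the generic properties of a test case result.
    private final Map<String, String> properties = new HashMap<>();

    void setProperty(String name, String value) {
        properties.put(name, value);
    }

    // Mirrors isError(): property 'type' is 'E' or 'error'.
    boolean isError() {
        String type = properties.get("type");
        return "E".equals(type) || "error".equals(type);
    }

    // Mirrors isFailure(): property 'type' is 'F' or 'failure'.
    boolean isFailure() {
        String type = properties.get("type");
        return "F".equals(type) || "failure".equals(type);
    }

    // Mirrors isPassed(): property 'testCasePassed' is 'true'.
    boolean isPassed() {
        return "true".equals(properties.get("testCasePassed"));
    }

    public static void main(String[] args) {
        ResultClassifier result = new ResultClassifier();
        result.setProperty("type", "F");
        System.out.println(result.isFailure()); // true
        System.out.println(result.isError());   // false
    }
}
```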

B.1.1.15. Interface IKlarosTestCaseState
This interface provides access to data of a test case state.

B.1.1.15.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestCaseState {
// Public Methods

public java.lang.String getStateDescription();

public java.lang.String getStateName();

public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

public de.verit.klaros.core.model.KlarosTestCase getTestCase();


B.1.1.15.2. getStateDescription()
public java.lang.String getStateDescription();

Get description of state.

Parameters
return The description of this state.

B.1.1.15.3. getStateName()
public java.lang.String getStateName();

Get name of state.

Parameters
return The name of this state.

B.1.1.15.4. getSut()
public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

Get related system under test version.

Parameters
return The related SUT version.

B.1.1.15.5. getTestCase()
public de.verit.klaros.core.model.KlarosTestCase getTestCase();

Get test case.

Parameters
return The test case whose state is defined by this state object.

B.1.1.16. Interface IKlarosTestCaseStep
This interface provides access to data of a test case step.

B.1.1.16.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestCaseStep {
// Public Methods

public java.lang.String getAction();

public java.util.List<de.verit.klaros.core.model.KlarosAttachment> getAttachments();

public java.lang.String getPostcondition();

public java.lang.String getPrecondition();


B.1.1.16.2. getAction()
public java.lang.String getAction();

Get the action.

Parameters
return The action of this test case step.

B.1.1.16.3. getAttachments()
public java.util.List<de.verit.klaros.core.model.KlarosAttachment> getAttachments();

Get the attachments.

Parameters
return The attachments of this test case step.

B.1.1.16.4. getPostcondition()
public java.lang.String getPostcondition();

Get the post condition.

Parameters
return The post condition of this test case step.

B.1.1.16.5. getPrecondition()
public java.lang.String getPrecondition();

Get the pre condition.

Parameters
return The pre condition of this test case step.

B.1.1.17. Interface IKlarosTestEnvironment
This interface provides access to data of a test environment.

B.1.1.17.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestEnvironment
    extends de.verit.klaros.core.model.IKlarosNamedEntity {
// Public Methods

public java.lang.String getCreator();

public java.lang.String getShortname();

public java.util.Set<de.verit.klaros.core.persistent.TestRun> getTestRuns();

public boolean isDefinedProperty(java.lang.String propertyName);


public java.lang.Boolean isEnabled();

B.1.1.17.2. getCreator()
public java.lang.String getCreator();

Parameters
return The name of the creator.

B.1.1.17.3. getShortname()
public java.lang.String getShortname();

Parameters
return The short name.

B.1.1.17.4. getTestRuns()
public java.util.Set<de.verit.klaros.core.persistent.TestRun> getTestRuns();

Get test runs.

Parameters
return Collection of test runs executed in the environment.

B.1.1.17.5. isDefinedProperty(String)
public boolean isDefinedProperty(java.lang.String propertyName);

Specified by: Method isDefinedProperty in interface IKlarosNamedEntity

Check if the property identified by the given name is a defined property.

Parameters
propertyName The name of the property to check.
return true if the identified property is a defined property, false otherwise.

B.1.1.17.6. isEnabled()
public java.lang.Boolean isEnabled();

Check if enabled.

Parameters
return true if enabled.

B.1.1.18. Interface IKlarosTestExecutable
This marker interface identifies test objects that can be executed.

B.1.1.18.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestExecutable {
}

B.1.1.19. Interface IKlarosTestRequirement
This interface provides access to data of a test requirement.

B.1.1.19.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestRequirement
    extends de.verit.klaros.core.model.IKlarosRevision {
// Public Methods

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getCoveringHeadTestCases();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getCoveringTestCases();

public de.verit.klaros.core.model.KlarosRequirementGroup getGroup();


B.1.1.19.2. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Get configuration.

Parameters
return The related configuration.

B.1.1.19.3. getCoveringHeadTestCases()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getCoveringHeadTestCases();

Get head revisions of covering test cases.

Parameters
return Set of head revisions of the covering test cases.

B.1.1.19.4. getCoveringTestCases()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getCoveringTestCases();

Get test cases covering the test requirement.

Parameters
return Set of test cases which cover this test requirement.

B.1.1.19.5. getGroup()
public de.verit.klaros.core.model.KlarosRequirementGroup getGroup();

Get requirement group.

Parameters
return The requirement group the test requirement belongs to.

B.1.1.20. Interface IKlarosTestRun
This interface provides access to data of a test run.

B.1.1.20.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestRun {
// Public Methods

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.lang.String getCreator();

public de.verit.klaros.core.model.KlarosTestEnvironment getEnv();

public int getNumberErrors();

public int getNumberFailures();

public int getNumberPassed();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseResult> getResults();

public java.lang.String getRunId();

public de.verit.klaros.core.model.KlarosSUTImplementation getSut();


public java.util.Date getTimestamp();

public java.lang.String getTimestampString(java.util.Locale locale);

B.1.1.20.2. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Get configuration.

Parameters
return The related configuration.

B.1.1.20.3. getCreator()
public java.lang.String getCreator();

Parameters
return The name of the creator.

B.1.1.20.4. getEnv()
public de.verit.klaros.core.model.KlarosTestEnvironment getEnv();

Get test environment.

Parameters
return The test environment in which the test cases have been executed.

B.1.1.20.5. getNumberErrors()
public int getNumberErrors();

Get number of test cases with errors of this test run. It is assumed that error results have a property
'type' with the value 'E'.

Parameters
return The number of test cases with errors.

B.1.1.20.6. getNumberFailures()
public int getNumberFailures();

Get number of failed test cases of this test run. It is assumed that failed results have a property 'type'
with the value 'F'.

Parameters
return The number of failed test cases.


B.1.1.20.7. getNumberPassed()
public int getNumberPassed();

Get number of passed test cases of this test run. It is assumed that passed results have a property
'testCasePassed' with value 'true'.

Parameters
return The number of passed test cases.

B.1.1.20.8. getResults()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseResult> getResults();

Get results.

Parameters
return Set of results of test case executions.

B.1.1.20.9. getRunId()
public java.lang.String getRunId();

Get id of test run.

Parameters
return The id of the test run.

B.1.1.20.10. getSut()
public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

Get the tested system version.

Parameters
return The SUT version which has been tested.

B.1.1.20.11. getTimestamp()
public java.util.Date getTimestamp();

Get timestamp.

Parameters
return The time the test run has been executed, as a Date object.

B.1.1.20.12. getTimestampString(Locale)
public java.lang.String getTimestampString(java.util.Locale locale);

Formats the timestamp and returns it as string.

Parameters
locale The locale to use for the format.
return The time the test run has been executed, as a formatted String.
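The counting conventions documented for getNumberErrors(), getNumberFailures() and getNumberPassed() amount to filtering the result set by the documented property values. TestResult and the helper methods below are illustrative stand-ins, not the Klaros implementation:

```java
import java.util.List;

// Sketch of the documented per-run counting conventions.
class TestRunCounts {

    // Stand-in carrying the 'type' and 'testCasePassed' properties of a result.
    record TestResult(String type, boolean passed) {}

    // Mirrors getNumberErrors(): results whose 'type' property is 'E'.
    static long numberErrors(List<TestResult> results) {
        return results.stream().filter(r -> "E".equals(r.type())).count();
    }

    // Mirrors getNumberFailures(): results whose 'type' property is 'F'.
    static long numberFailures(List<TestResult> results) {
        return results.stream().filter(r -> "F".equals(r.type())).count();
    }

    // Mirrors getNumberPassed(): results whose 'testCasePassed' property is true.
    static long numberPassed(List<TestResult> results) {
        return results.stream().filter(TestResult::passed).count();
    }

    public static void main(String[] args) {
        List<TestResult> results = List.of(
            new TestResult("E", false),
            new TestResult("F", false),
            new TestResult(null, true));
        System.out.println(numberPassed(results)); // 1
    }
}
```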

B.1.1.21. Interface IKlarosTestSuite


This interface provides access to data of a test suite.

B.1.1.21.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestSuite {
// Public Methods

public de.verit.klaros.core.model.KlarosUser getAssignee();

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.util.Date getCreated();

public de.verit.klaros.core.model.KlarosUser getCreator();

public de.verit.klaros.core.model.KlarosUser getLastEditor();

public java.util.Date getLastUpdated();

public java.lang.String getName();

public java.util.List<de.verit.klaros.core.model.KlarosTestSuiteResult> getResults();

public java.lang.String getShortname();

public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

public java.util.List<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

public java.lang.Integer getTestSuiteResultCount();

public java.lang.Boolean isEnabled();

B.1.1.21.2. getAssignee()
public de.verit.klaros.core.model.KlarosUser getAssignee();

Parameters
return The user to whom this test suite is assigned.

B.1.1.21.3. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Get configuration.

Parameters
return The related configuration.

B.1.1.21.4. getCreated()
public java.util.Date getCreated();

Parameters
return The date when the suite was created.

B.1.1.21.5. getCreator()
public de.verit.klaros.core.model.KlarosUser getCreator();

Parameters
return The user who created the test suite.

B.1.1.21.6. getLastEditor()
public de.verit.klaros.core.model.KlarosUser getLastEditor();

Parameters
return The user who last edited the test suite.

B.1.1.21.7. getLastUpdated()
public java.util.Date getLastUpdated();

Parameters
return The date when the last update was done.

B.1.1.21.8. getName()
public java.lang.String getName();

Get the name of the test suite.

Parameters
return The name of the test suite.

B.1.1.21.9. getResults()
public java.util.List<de.verit.klaros.core.model.KlarosTestSuiteResult> getResults();

Get test suite results.

Parameters
return List of results of executions of this test suite.

B.1.1.21.10. getShortname()
public java.lang.String getShortname();

Get the short name of the test suite.

Parameters
return The short name of the test suite.

B.1.1.21.11. getSut()
public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

Get the system under test version associated with this test suite.

Parameters
return The associated SUT version.

B.1.1.21.12. getTestCases()
public java.util.List<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

Get the executables of this test suite.

Parameters
return List of test cases of this test suite.

B.1.1.21.13. getTestSuiteResultCount()
public java.lang.Integer getTestSuiteResultCount();

Return the number of test suite results in this test suite.

Parameters
return The number of test suite results in this test suite.

B.1.1.21.14. isEnabled()
public java.lang.Boolean isEnabled();

Returns the value of the enabled flag of this test suite.

Parameters
return true if the test suite is enabled, false otherwise.

B.1.1.22. Interface IKlarosTestSuiteResult
This interface provides access to data of a test suite result.

B.1.1.22.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosTestSuiteResult
    extends de.verit.klaros.core.model.IKlarosNamedEntity {
// Public Methods

public de.verit.klaros.core.model.KlarosTestRun getTestRun();

public de.verit.klaros.core.model.KlarosTestSuite getTestSuite();

B.1.1.22.2. getTestRun()
public de.verit.klaros.core.model.KlarosTestRun getTestRun();


Get the test run for this result.

Parameters
return the test run

B.1.1.22.3. getTestSuite()
public de.verit.klaros.core.model.KlarosTestSuite getTestSuite();

Get the test suite.

Parameters
return The test suite that has been executed to produce this result.

B.1.1.23. Interface IKlarosUser
The user object.

B.1.1.23.1. Synopsis
public interface de.verit.klaros.core.model.IKlarosUser {
// Public Methods

public java.lang.String getEmail();

public java.lang.String getRole();

public java.lang.String getUsername();

B.1.1.23.2. getEmail()
public java.lang.String getEmail();

The email address of this user.

Parameters
return the email address

B.1.1.23.3. getRole()
public java.lang.String getRole();

The role name of this user.

Parameters
return the role name

B.1.1.23.4. getUsername()
public java.lang.String getUsername();

The user name of this user as used when logging in.

Parameters
return the user name

B.1.1.24. Class KlarosAttachment
This class provides access to attachments.

B.1.1.24.1. Synopsis
public final class de.verit.klaros.core.model.KlarosAttachment
    extends de.verit.klaros.core.model.KlarosWrapper<de.verit.klaros.core.model.KlarosAttachment, de.verit.klaros.core.persistent.Attachment>
    implements de.verit.klaros.core.model.IKlarosAttachment {
// Public Constructors

public KlarosAttachment();

// Public Methods

public java.lang.String getUuid();

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.24.2. getUuid()
public java.lang.String getUuid();

Specified by: Method getUuid in interface IKlarosAttachment

Get the UUID.

B.1.1.25. Class KlarosConfiguration
This class provides access to the information stored for a project's configuration.

B.1.1.25.1. Synopsis
public class de.verit.klaros.core.model.KlarosConfiguration
    extends de.verit.klaros.core.model.KlarosWrapper<de.verit.klaros.core.model.KlarosConfiguration, de.verit.klaros.core.persistent.Configuration>
    implements de.verit.klaros.core.model.IKlarosConfiguration, java.lang.Comparable<de.verit.klaros.core.model.KlarosConfiguration> {
// Public Methods

public int compareTo(de.verit.klaros.core.model.KlarosConfiguration o);

public boolean equals(java.lang.Object o);

public java.lang.String getDescription();

public java.util.Set<de.verit.klaros.core.model.KlarosTestEnvironment> getEnvs();

public java.lang.String getName();

public java.util.Set<de.verit.klaros.core.model.KlarosSUTImplementation> getSuts();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getTestRequirements();

public java.util.Set<de.verit.klaros.core.model.KlarosTestRun> getTestRuns();

public java.util.Set<de.verit.klaros.core.model.KlarosTestSuite> getTestSuites();

public int hashCode();

public java.lang.Boolean isEnabled();

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.25.2. compareTo(KlarosConfiguration)
public int compareTo(de.verit.klaros.core.model.KlarosConfiguration o);

B.1.1.25.3. equals(Object)
public boolean equals(java.lang.Object o);

B.1.1.25.4. getDescription()
public java.lang.String getDescription();

Specified by: Method getDescription in interface IKlarosConfiguration

Returns the project description.

See Also getDescription()

B.1.1.25.5. getEnvs()
public java.util.Set<de.verit.klaros.core.model.KlarosTestEnvironment> getEnvs();

Specified by: Method getEnvs in interface IKlarosConfiguration

Returns the project test environments.

See Also getEnvs()

B.1.1.25.6. getName()
public java.lang.String getName();

Specified by: Method getName in interface IKlarosConfiguration

Returns the project name.

See Also getName()

B.1.1.25.7. getSuts()
public java.util.Set<de.verit.klaros.core.model.KlarosSUTImplementation> getSuts();

Specified by: Method getSuts in interface IKlarosConfiguration

Returns the project SUTs (systems under test).

See Also getSuts()

B.1.1.25.8. getTestCases()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

Specified by: Method getTestCases in interface IKlarosConfiguration

Returns the project test cases.

See Also getTestCases()

B.1.1.25.9. getTestRequirements()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getTestRequirements();

Specified by: Method getTestRequirements in interface IKlarosConfiguration

Returns the project test requirements.

See Also getTestRequirements()

B.1.1.25.10. getTestRuns()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRun> getTestRuns();

Specified by: Method getTestRuns in interface IKlarosConfiguration

Returns the project test runs.

See Also getTestRuns()

B.1.1.25.11. getTestSuites()
public java.util.Set<de.verit.klaros.core.model.KlarosTestSuite> getTestSuites();

Specified by: Method getTestSuites in interface IKlarosConfiguration

Returns the project test suites.

See Also de.verit.klaros.core.model.IKlarosConfiguration

B.1.1.25.12. hashCode()
public int hashCode();

B.1.1.25.13. isEnabled()
public java.lang.Boolean isEnabled();

Specified by: Method isEnabled in interface IKlarosConfiguration

Returns whether this project is enabled or not.

See Also isEnabled()

B.1.1.26. Class KlarosExternalImplementation
This class encapsulates external implementation data of a test case.

B.1.1.26.1. Synopsis
public final class de.verit.klaros.core.model.KlarosExternalImplementation
    extends de.verit.klaros.core.model.KlarosWrapper<de.verit.klaros.core.model.KlarosExternalImplementation, de.verit.klaros.core.persistent.ExternalImplementation>
    implements de.verit.klaros.core.model.IKlarosExternalImplementation {
// Public Methods

public java.lang.String getDate();

public de.verit.klaros.core.model.KlarosTestCase getImplementationOf();

public java.lang.String getReference();

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.26.2. getDate()
public java.lang.String getDate();

Specified by: Method getDate in interface IKlarosExternalImplementation

Get the date.

See Also getDate()

B.1.1.26.3. getImplementationOf()
public de.verit.klaros.core.model.KlarosTestCase getImplementationOf();

Specified by: Method getImplementationOf in interface IKlarosExternalImplementation

Get related test case.

See Also getImplementationOf()

B.1.1.26.4. getReference()
public java.lang.String getReference();

Get the reference.

See Also getReference()

B.1.1.27. Class KlarosIssue
This class provides access to the information stored for detected issues.

B.1.1.27.1. Synopsis
public final class de.verit.klaros.core.model.KlarosIssue
    extends de.verit.klaros.core.model.KlarosWrapper<de.verit.klaros.core.model.KlarosIssue, de.verit.klaros.core.persistent.Issue>
    implements de.verit.klaros.core.model.IKlarosIssue {
// Public Methods

public java.lang.String getDescription();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseState> getStates();

public java.util.List<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.27.2. getDescription()
public java.lang.String getDescription();

Specified by: Method getDescription in interface IKlarosIssue

Get the description.

See Also getDescription()

B.1.1.27.3. getStates()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseState> getStates();

Specified by: Method getStates in interface IKlarosIssue

Get the related test case states.

See Also getStates()

B.1.1.27.4. getTestCases()
public java.util.List<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

Specified by: Method getTestCases in interface IKlarosIssue

Get the related test cases.

See Also getTestCases()

B.1.1.28. Class KlarosNamedEntity
This class encapsulates the (dynamic) properties of a Klaros object.

B.1.1.28.1. Synopsis
public abstract class de.verit.klaros.core.model.KlarosNamedEntity<T,S extends de.verit.klaros.core.persistent.NamedEntity>
    extends de.verit.klaros.core.model.KlarosWrapper<T, S>
    implements de.verit.klaros.core.model.IKlarosNamedEntity, org.apache.commons.beanutils.DynaBean {
// Public Methods

public boolean contains(java.lang.String name,
                          java.lang.String key);

public java.lang.Object get(java.lang.String name);

public java.lang.Object get(java.lang.String name,
                                  int index);

public java.lang.Object get(java.lang.String name,
                                  java.lang.String key);

public org.apache.commons.beanutils.DynaClass getDynaClass();

public java.lang.String getName();

public java.util.List<java.lang.String> getPropertyAsList(java.lang.String name);

public java.lang.String getPropertyAsString(java.lang.String name);

public boolean isDefinedProperty(java.lang.String propertyName);

public void remove(java.lang.String name,
                     java.lang.String key);

public void set(java.lang.String name,
                  int index,
                  java.lang.Object value);

public void set(java.lang.String name,
                  java.lang.Object value);

public void set(java.lang.String name,
                  java.lang.String key,
                  java.lang.Object value);

Direct known subclasses: de.verit.klaros.core.model.KlarosProject, de.verit.klaros.core.model.KlarosRequirementGroup, de.verit.klaros.core.model.KlarosRevision, de.verit.klaros.core.model.KlarosTestCaseResult, de.verit.klaros.core.model.KlarosTestEnvironment, de.verit.klaros.core.model.KlarosTestRun, de.verit.klaros.core.model.KlarosTestSuiteResult, de.verit.klaros.core.model.KlarosUser

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.28.2. contains(String, String)
public boolean contains(java.lang.String name,
                          java.lang.String key);

Specified by: Method contains in interface DynaBean

See Also org.apache.commons.beanutils.DynaBean.contains

B.1.1.28.3. get(String)
public java.lang.Object get(java.lang.String name);

Specified by: Method get in interface DynaBean

See Also org.apache.commons.beanutils.DynaBean.get

B.1.1.28.4. get(String, int)
public java.lang.Object get(java.lang.String name,
                                  int index);

Specified by: Method get in interface DynaBean

See Also org.apache.commons.beanutils.DynaBean.get

B.1.1.28.5. get(String, String)
public java.lang.Object get(java.lang.String name,
                                  java.lang.String key);

Specified by: Method get in interface DynaBean

See Also org.apache.commons.beanutils.DynaBean.get

B.1.1.28.6. getDynaClass()
public org.apache.commons.beanutils.DynaClass getDynaClass();

Specified by: Method getDynaClass in interface DynaBean

See Also org.apache.commons.beanutils.DynaBean.getDynaClass

B.1.1.28.7. getName()
public java.lang.String getName();

Specified by: Method getName in interface IKlarosNamedEntity

Get name.

See Also getName()

B.1.1.28.8. getPropertyAsList(String)
public java.util.List<java.lang.String> getPropertyAsList(java.lang.String name);

Specified by: Method getPropertyAsList in interface IKlarosNamedEntity

Get the generic property as a List.

See Also getPropertyAsList(java.lang.String)

B.1.1.28.9. getPropertyAsString(String)
public java.lang.String getPropertyAsString(java.lang.String name);

Specified by: Method getPropertyAsString in interface IKlarosNamedEntity

Get the generic property as a String.

See Also getPropertyAsString(java.lang.String)

B.1.1.28.10. isDefinedProperty(String)
public boolean isDefinedProperty(java.lang.String propertyName);

Specified by: Method isDefinedProperty in interface IKlarosNamedEntity

Check if a property identified by given name is a defined property.

See Also isDefinedProperty(java.lang.String)

B.1.1.28.11. remove(String, String)
public void remove(java.lang.String name,
                     java.lang.String key);

Specified by: Method remove in interface DynaBean

See Also org.apache.commons.beanutils.DynaBean.remove

B.1.1.28.12. set(String, int, Object)


public void set(java.lang.String name,
                  int index,
                  java.lang.Object value);

Specified by: Method set in interface DynaBean

See Also org.apache.commons.beanutils.DynaBean.set

B.1.1.28.13. set(String, Object)
public void set(java.lang.String name,
                  java.lang.Object value);

Specified by: Method set in interface DynaBean

See Also org.apache.commons.beanutils.DynaBean.set

B.1.1.28.14. set(String, String, Object)
public void set(java.lang.String name,
                  java.lang.String key,
                  java.lang.Object value);

Specified by: Method set in interface DynaBean

See Also org.apache.commons.beanutils.DynaBean.set
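The DynaBean accessors above give scripts dynamic access to simple and mapped properties. The following self-contained sketch (a plain Map-backed stand-in, not the actual Klaros implementation; indexed properties are omitted for brevity) illustrates the get/set/contains/remove semantics:

```java
import java.util.HashMap;
import java.util.Map;

// Map-backed stand-in for the DynaBean-style accessors of
// KlarosNamedEntity; NOT the actual Klaros implementation.
// Indexed properties (get(name, index), set(name, index, value))
// are omitted for brevity.
public class DynaPropertySketch {

    private final Map<String, Object> simple = new HashMap<>();
    private final Map<String, Map<String, Object>> mapped = new HashMap<>();

    // Set a simple property.
    public void set(String name, Object value) {
        simple.put(name, value);
    }

    // Set an entry of a mapped property.
    public void set(String name, String key, Object value) {
        mapped.computeIfAbsent(name, n -> new HashMap<>()).put(key, value);
    }

    // Get a simple property value, or null if unset.
    public Object get(String name) {
        return simple.get(name);
    }

    // Get an entry of a mapped property, or null if unset.
    public Object get(String name, String key) {
        Map<String, Object> entries = mapped.get(name);
        return entries == null ? null : entries.get(key);
    }

    // Check whether a mapped property contains the given key.
    public boolean contains(String name, String key) {
        Map<String, Object> entries = mapped.get(name);
        return entries != null && entries.containsKey(key);
    }

    // Remove an entry of a mapped property.
    public void remove(String name, String key) {
        Map<String, Object> entries = mapped.get(name);
        if (entries != null) {
            entries.remove(key);
        }
    }
}
```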

B.1.1.29. Class KlarosObjectFactory
Factory for Klaros objects.

B.1.1.29.1. Synopsis
public final class de.verit.klaros.core.model.KlarosObjectFactory {
// Public Static Methods

public static java.lang.Object getInstance(java.lang.Object wrapped);

public static java.lang.Class<?> getMappedClass(java.lang.Class<?> clazz);

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.29.2. getInstance(Object)
public static java.lang.Object getInstance(java.lang.Object wrapped);

Get an instance of a Klaros data class.

Parameters
wrapped The data object wrapped by this class.
return An instance wrapping the given object, or the original data object if no wrapper needs to be created.

B.1.1.29.3. getMappedClass(Class<?>)
public static java.lang.Class<?> getMappedClass(java.lang.Class<?> clazz);

Get the mapped class for the given class.

Parameters
clazz The class to get the mapped class for.
return The mapped class.
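The factory behaviour described above — wrap an object when a model class is mapped to its type, otherwise return it unchanged — can be sketched with a hypothetical registry. Class and method bodies here are illustrative, not the Klaros internals:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Registry-based sketch mirroring the documented behaviour of
// KlarosObjectFactory.getInstance; NOT the actual Klaros code.
public class WrapperFactory {

    private static final Map<Class<?>, Function<Object, Object>> REGISTRY = new HashMap<>();

    // Register a wrapper function for a wrapped type.
    public static void register(Class<?> wrappedType, Function<Object, Object> wrapper) {
        REGISTRY.put(wrappedType, wrapper);
    }

    // Return a wrapper for the given object, or the object itself
    // if no wrapper is registered for its type.
    public static Object getInstance(Object wrapped) {
        Function<Object, Object> wrapper = REGISTRY.get(wrapped.getClass());
        return wrapper == null ? wrapped : wrapper.apply(wrapped);
    }
}
```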

B.1.1.30. Class KlarosProject
This class provides access to the information stored for projects.

B.1.1.30.1. Synopsis
public final class de.verit.klaros.core.model.KlarosProject
    extends de.verit.klaros.core.model.KlarosNamedEntity<de.verit.klaros.core.model.KlarosProject, de.verit.klaros.core.persistent.Project>
    implements de.verit.klaros.core.model.IKlarosProject {
}

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.31. Class KlarosRequirementGroup
This class realizes the grouping of test requirements.

B.1.1.31.1. Synopsis
public final class de.verit.klaros.core.model.KlarosRequirementGroup
    extends de.verit.klaros.core.model.KlarosNamedEntity<de.verit.klaros.core.model.KlarosRequirementGroup, de.verit.klaros.core.persistent.RequirementGroup>
    implements de.verit.klaros.core.model.IKlarosRequirementGroup {
// Public Methods

public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getAllTestRequirements();

public java.util.Set<de.verit.klaros.core.model.KlarosRequirementGroup> getChildren();

public double getCoverage();

public java.lang.Integer getDepth();

public java.util.List<de.verit.klaros.core.model.KlarosRequirementGroup> getDescendants();

public java.util.List<de.verit.klaros.core.model.KlarosRequirementGroup> getFamily();

public int getNumberCoveredRequirements();

public int getNumberRequirements();

public de.verit.klaros.core.model.KlarosRequirementGroup getParent();

public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getTestRequirements();

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.31.2. getAllTestRequirements()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getAllTestRequirements();

Specified by: Method getAllTestRequirements in interface IKlarosRequirementGroup

Get all test requirements of this group hierarchy.

See Also getAllTestRequirements()

B.1.1.31.3. getChildren()
public java.util.Set<de.verit.klaros.core.model.KlarosRequirementGroup> getChildren();

Specified by: Method getChildren in interface IKlarosRequirementGroup

Get the children of this requirement group.

See Also getChildren()

B.1.1.31.4. getCoverage()
public double getCoverage();

Specified by: Method getCoverage in interface IKlarosRequirementGroup

Get test requirement coverage for the group.

See Also getCoverage()

B.1.1.31.5. getDepth()
public java.lang.Integer getDepth();

Specified by: Method getDepth in interface IKlarosRequirementGroup

Get the depth of this node in the tree.

See Also getDepth()

B.1.1.31.6. getDescendants()
public java.util.List<de.verit.klaros.core.model.KlarosRequirementGroup> getDescendants();

Specified by: Method getDescendants in interface IKlarosRequirementGroup

Get all descending subgroups of this group.

See Also getDescendants()

B.1.1.31.7. getFamily()
public java.util.List<de.verit.klaros.core.model.KlarosRequirementGroup> getFamily();

Specified by: Method getFamily in interface IKlarosRequirementGroup

Get whole family with this group as root.

See Also getFamily()

B.1.1.31.8. getNumberCoveredRequirements()
public int getNumberCoveredRequirements();

Specified by: Method getNumberCoveredRequirements in interface IKlarosRequirementGroup

Get number of test requirements covered by at least one test case.

See Also getNumberCoveredRequirements()

B.1.1.31.9. getNumberRequirements()
public int getNumberRequirements();

Specified by: Method getNumberRequirements in interface IKlarosRequirementGroup

Get number of test requirements, including all descendants' test requirements.

See Also getNumberRequirements()

B.1.1.31.10. getParent()
public de.verit.klaros.core.model.KlarosRequirementGroup getParent();

Specified by: Method getParent in interface IKlarosRequirementGroup

Get the parent of this requirement group.

See Also getParent()

B.1.1.31.11. getTestRequirements()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getTestRequirements();

Specified by: Method getTestRequirements in interface IKlarosRequirementGroup

Get the related test requirements of this requirement group.

See Also getTestRequirements()
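The counting and coverage methods above aggregate over the group and all of its descendants. A self-contained stand-in (not the KlarosRequirementGroup implementation) showing that aggregation, assuming coverage is the ratio of covered to total requirements in the hierarchy:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in requirement group illustrating hierarchical coverage
// aggregation; NOT the actual KlarosRequirementGroup implementation.
public class RequirementGroup {

    private final List<RequirementGroup> children = new ArrayList<>();
    private final int requirements;
    private final int coveredRequirements;

    public RequirementGroup(int requirements, int coveredRequirements) {
        this.requirements = requirements;
        this.coveredRequirements = coveredRequirements;
    }

    // Add a subgroup and return this group for chaining.
    public RequirementGroup addChild(RequirementGroup child) {
        children.add(child);
        return this;
    }

    // Total requirements of this group including all descendants.
    public int getNumberRequirements() {
        int total = requirements;
        for (RequirementGroup child : children) {
            total += child.getNumberRequirements();
        }
        return total;
    }

    // Requirements covered by at least one test case, including descendants.
    public int getNumberCoveredRequirements() {
        int covered = coveredRequirements;
        for (RequirementGroup child : children) {
            covered += child.getNumberCoveredRequirements();
        }
        return covered;
    }

    // Coverage ratio in [0, 1]; 0 when the hierarchy holds no requirements.
    public double getCoverage() {
        int total = getNumberRequirements();
        return total == 0 ? 0.0 : (double) getNumberCoveredRequirements() / total;
    }
}
```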

B.1.1.32. Class KlarosRevision
This class encapsulates the revision-related information of a revisionable Klaros object.

B.1.1.32.1. Synopsis
public abstract class de.verit.klaros.core.model.KlarosRevision<T,S extends de.verit.klaros.core.persistent.Revision>
    extends de.verit.klaros.core.model.KlarosNamedEntity<T, S>
    implements de.verit.klaros.core.model.IKlarosRevision {
// Public Methods

public final java.lang.String getRevisionComment();

public final java.lang.String getRevisionId();

public final java.util.Set<de.verit.klaros.core.model.KlarosTag> getTags();

Direct known subclasses: de.verit.klaros.core.model.KlarosSUTImplementation, de.verit.klaros.core.model.KlarosTestCase, de.verit.klaros.core.model.KlarosTestRequirement, de.verit.klaros.core.model.KlarosTestSuite

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.32.2. getRevisionComment()
public final java.lang.String getRevisionComment();

Specified by: Method getRevisionComment in interface IKlarosRevision

Get comment.

See Also getRevisionComment()

B.1.1.32.3. getRevisionId()
public final java.lang.String getRevisionId();

Specified by: Method getRevisionId in interface IKlarosRevision

Get the revision id.

See Also getRevisionId()

B.1.1.32.4. getTags()
public final java.util.Set<de.verit.klaros.core.model.KlarosTag> getTags();

Specified by: Method getTags in interface IKlarosRevision

Get tags of this revision.

See Also getTags()

B.1.1.33. Class KlarosSUTImplementation
This class provides access to the information stored for systems under test (SUT).

B.1.1.33.1. Synopsis
public final class de.verit.klaros.core.model.KlarosSUTImplementation
    extends de.verit.klaros.core.model.KlarosRevision<de.verit.klaros.core.model.KlarosSUTImplementation, de.verit.klaros.core.persistent.SUTImplementation>
    implements de.verit.klaros.core.model.IKlarosSUTImplementation, java.lang.Comparable<de.verit.klaros.core.model.KlarosSUTImplementation> {
// Public Methods

public int compareTo(de.verit.klaros.core.model.KlarosSUTImplementation o);

public boolean equals(java.lang.Object o);

public de.verit.klaros.core.model.KlarosSUTImplementation getBranchRoot();

public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.lang.String getCreator();

public int getNumberStates(java.lang.String name);

public de.verit.klaros.core.model.KlarosSUTImplementation getPredecessor();

public java.lang.String getProductversion();

public de.verit.klaros.core.model.KlarosSUTImplementation getRoot();

public de.verit.klaros.core.model.KlarosSUTImplementation getSuccessor();

public java.util.Set<de.verit.klaros.core.persistent.TestCaseState> getTestCaseStates();

public java.util.List<de.verit.klaros.core.model.KlarosTestCaseState> getTestCaseStates(java.lang.String name);

public java.util.Set<de.verit.klaros.core.persistent.TestRun> getTestRuns();

public de.verit.klaros.core.model.KlarosSUTImplementation getTrunkRoot();

public int hashCode();

public java.lang.Boolean isEnabled();

Methods inherited from de.verit.klaros.core.model.KlarosRevision: getRevisionComment, getRevisionId, getTags

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.33.2. compareTo(KlarosSUTImplementation)
public int compareTo(de.verit.klaros.core.model.KlarosSUTImplementation o);

B.1.1.33.3. equals(Object)
public boolean equals(java.lang.Object o);

B.1.1.33.4. getBranches()
public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();

B.1.1.33.5. getBranchRoot()
public de.verit.klaros.core.model.KlarosSUTImplementation getBranchRoot();

B.1.1.33.6. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Specified by: Method getConfiguration in interface IKlarosSUTImplementation

Get configuration.

B.1.1.33.7. getCreator()
public java.lang.String getCreator();

Specified by: Method getCreator in interface IKlarosSUTImplementation

B.1.1.33.8. getNumberStates(String)
public int getNumberStates(java.lang.String name);

Specified by: Method getNumberStates in interface IKlarosSUTImplementation

Get number of test case states which are related to this revision and have the given name.

B.1.1.33.9. getPredecessor()
public de.verit.klaros.core.model.KlarosSUTImplementation getPredecessor();

B.1.1.33.10. getProductversion()
public java.lang.String getProductversion();

Specified by: Method getProductversion in interface IKlarosSUTImplementation

Get product version.

See Also getProductversion()

B.1.1.33.11. getRoot()
public de.verit.klaros.core.model.KlarosSUTImplementation getRoot();

B.1.1.33.12. getSuccessor()
public de.verit.klaros.core.model.KlarosSUTImplementation getSuccessor();

B.1.1.33.13. getTestCaseStates()
public java.util.Set<de.verit.klaros.core.persistent.TestCaseState> getTestCaseStates();

Specified by: Method getTestCaseStates in interface IKlarosSUTImplementation

Get test case states.

B.1.1.33.14. getTestCaseStates(String)
public java.util.List<de.verit.klaros.core.model.KlarosTestCaseState> getTestCaseStates(java.lang.String name);

Specified by: Method getTestCaseStates in interface IKlarosSUTImplementation

Get test case states which are related to this revision and have the given name.

B.1.1.33.15. getTestRuns()
public java.util.Set<de.verit.klaros.core.persistent.TestRun> getTestRuns();

Specified by: Method getTestRuns in interface IKlarosSUTImplementation

Get test runs.

B.1.1.33.16. getTrunkRoot()
public de.verit.klaros.core.model.KlarosSUTImplementation getTrunkRoot();

B.1.1.33.17. hashCode()
public int hashCode();

B.1.1.33.18. isEnabled()
public java.lang.Boolean isEnabled();

Specified by: Method isEnabled in interface IKlarosSUTImplementation

Check if this SUT implementation is enabled.
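getPredecessor(), getSuccessor() and getRoot() navigate the revision history as a linked chain. A minimal stand-in (illustrative only, not the Klaros implementation) showing one plausible way getRoot() can be derived by walking the predecessor chain:

```java
// Minimal stand-in for a revisioned entity; shows predecessor/successor
// chain navigation as described for KlarosSUTImplementation.
// Illustrative only; NOT the actual Klaros implementation.
public class RevisionNode {

    private final String id;
    private RevisionNode predecessor;
    private RevisionNode successor;

    public RevisionNode(String id) {
        this.id = id;
    }

    public String getId() {
        return id;
    }

    public RevisionNode getPredecessor() {
        return predecessor;
    }

    public RevisionNode getSuccessor() {
        return successor;
    }

    // Link the given revision as the successor of this one
    // and return it for chaining.
    public RevisionNode append(RevisionNode next) {
        this.successor = next;
        next.predecessor = this;
        return next;
    }

    // Walk the predecessor chain back to the first revision.
    public RevisionNode getRoot() {
        RevisionNode current = this;
        while (current.getPredecessor() != null) {
            current = current.getPredecessor();
        }
        return current;
    }
}
```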

B.1.1.34. Class KlarosStatusMessage
This class provides access to the information stored in a logged status message.

B.1.1.34.1. Synopsis
public final class de.verit.klaros.core.model.KlarosStatusMessage
    extends de.verit.klaros.core.model.KlarosWrapper<de.verit.klaros.core.model.KlarosStatusMessage, de.verit.klaros.core.persistent.StatusMessage>
    implements de.verit.klaros.core.model.IKlarosStatusMessage {
// Public Methods

public java.lang.String getMessage();

public java.lang.String getStatus();

public java.util.Date getTimestamp();

public de.verit.klaros.core.persistent.User getUser();

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.34.2. getMessage()
public java.lang.String getMessage();

Specified by: Method getMessage in interface IKlarosStatusMessage

Get the message of the status message.

B.1.1.34.3. getStatus()
public java.lang.String getStatus();

Specified by: Method getStatus in interface IKlarosStatusMessage

Get the status of the status message.

B.1.1.34.4. getTimestamp()
public java.util.Date getTimestamp();

Specified by: Method getTimestamp in interface IKlarosStatusMessage

Get the timestamp of the status message.

B.1.1.34.5. getUser()
public de.verit.klaros.core.persistent.User getUser();

Specified by: Method getUser in interface IKlarosStatusMessage

Get the user of the status message.

B.1.1.35. Class KlarosTag
This class encapsulates the tag information of a taggable Klaros object.

B.1.1.35.1. Synopsis
public class de.verit.klaros.core.model.KlarosTag
    extends de.verit.klaros.core.model.KlarosWrapper<de.verit.klaros.core.model.KlarosTag, de.verit.klaros.core.persistent.Tag>
    implements de.verit.klaros.core.model.IKlarosTag {
// Public Constructors

public KlarosTag();

// Public Methods

public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getRevisions();

public java.lang.String getTagId();

public java.util.Date getTimestamp();

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.35.2. getRevisions()
public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getRevisions();

Specified by: Method getRevisions in interface IKlarosTag

Get related revisions.

B.1.1.35.3. getTagId()
public java.lang.String getTagId();

Specified by: Method getTagId in interface IKlarosTag

Get tag id.

B.1.1.35.4. getTimestamp()
public java.util.Date getTimestamp();

Specified by: Method getTimestamp in interface IKlarosTag

Get timestamp.

B.1.1.36. Class KlarosTestCase
This class provides access to the information stored for test cases.

B.1.1.36.1. Synopsis
public final class de.verit.klaros.core.model.KlarosTestCase
    extends de.verit.klaros.core.model.KlarosRevision<de.verit.klaros.core.model.KlarosTestCase, de.verit.klaros.core.persistent.TestCase>
    implements de.verit.klaros.core.model.IKlarosTestCase, java.lang.Comparable<de.verit.klaros.core.model.KlarosTestCase> {
// Public Methods

public int compareTo(de.verit.klaros.core.model.KlarosTestCase o);

public boolean equals(java.lang.Object o);

public de.verit.klaros.core.types.TestAreatopic getArea();

public de.verit.klaros.core.model.KlarosTestCase getBranchRoot();

public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getCovers();

public java.util.Date getCreated();

public java.lang.String getCreatedString(java.util.Locale locale);

public de.verit.klaros.core.model.KlarosUser getCreator();

public java.lang.String getDependency();

public java.lang.String getDescription();

public de.verit.klaros.core.types.TestDesignTechnique getDesignTechnique();

public java.util.List<de.verit.klaros.core.model.KlarosIssue> getDetectedIssues();

public java.lang.String getDocbase();

public java.lang.String getEvaluation();

public de.verit.klaros.core.types.TestExecutionMethod getExecution();

public de.verit.klaros.core.model.KlarosExternalImplementation getImplementation();

public de.verit.klaros.core.types.TestLevel getLevel();

public java.lang.String getNote();

public java.lang.String getPostcondition();

public java.lang.String getPrecondition();

public de.verit.klaros.core.model.KlarosTestCase getPredecessor();

public de.verit.klaros.core.types.TestPriority getPriority();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseResult> getResults();

public de.verit.klaros.core.model.KlarosTestCase getRoot();

public java.lang.String getShortname();

public java.lang.String getState();

public java.lang.String getStateName(de.verit.klaros.core.model.KlarosSUTImplementation sut);

public java.util.Map<de.verit.klaros.core.model.KlarosSUTImplementation, de.verit.klaros.core.model.KlarosTestCaseState> getStates();

public de.verit.klaros.core.model.KlarosTestCase getSuccessor();

public java.lang.String getTeam();

public java.util.List<de.verit.klaros.core.model.KlarosTestCaseStep> getTestCaseSteps();

public java.lang.String getTraceability();

public de.verit.klaros.core.model.KlarosTestCase getTrunkRoot();

public de.verit.klaros.core.types.TestVariety getVariety();

public int hashCode();

Methods inherited from de.verit.klaros.core.model.KlarosRevision: getRevisionComment, getRevisionId, getTags

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.36.2. compareTo(KlarosTestCase)
public int compareTo(de.verit.klaros.core.model.KlarosTestCase o);

Compares the two objects using the comparison function of the concrete class.

Parameters
o the other object
return the comparison result

B.1.1.36.3. equals(Object)
public boolean equals(java.lang.Object o);

B.1.1.36.4. getArea()
public de.verit.klaros.core.types.TestAreatopic getArea();

Specified by: Method getArea in interface IKlarosTestCase

The area of this test case.

B.1.1.36.5. getBranches()
public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();

B.1.1.36.6. getBranchRoot()
public de.verit.klaros.core.model.KlarosTestCase getBranchRoot();

B.1.1.36.7. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Specified by: Method getConfiguration in interface IKlarosTestCase

Get configuration.

B.1.1.36.8. getCovers()
public java.util.Set<de.verit.klaros.core.model.KlarosTestRequirement> getCovers();

Specified by: Method getCovers in interface IKlarosTestCase

Get covered test requirements.

B.1.1.36.9. getCreated()
public java.util.Date getCreated();

Specified by: Method getCreated in interface IKlarosTestCase

The creation date of this test case.

B.1.1.36.10. getCreatedString(Locale)
public java.lang.String getCreatedString(java.util.Locale locale);

Specified by: Method getCreatedString in interface IKlarosTestCase

Formats the creation date with the given locale and returns it as a string.
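The locale-dependent formatting can be reproduced with the standard java.text API; the class name and pattern below are assumptions for illustration, not the exact format used by Klaros:

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.Locale;

// Illustrative locale-aware date formatting in the spirit of
// getCreatedString(Locale); the pattern is an assumption, not
// the actual format used by Klaros.
public class CreatedDateFormat {

    // Format a creation date for the given locale.
    public static String format(Date created, Locale locale) {
        return new SimpleDateFormat("d MMMM yyyy", locale).format(created);
    }

    // Build a fixed example date (the manual's publication date).
    public static Date exampleDate() {
        Calendar cal = Calendar.getInstance();
        cal.clear();
        cal.set(2010, Calendar.JULY, 26);
        return cal.getTime();
    }
}
```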

B.1.1.36.11. getCreator()
public de.verit.klaros.core.model.KlarosUser getCreator();

Specified by: Method getCreator in interface IKlarosTestCase

B.1.1.36.12. getDependency()
public java.lang.String getDependency();

Specified by: Method getDependency in interface IKlarosTestCase

The dependency of this test case.

B.1.1.36.13. getDescription()
public java.lang.String getDescription();

Specified by: Method getDescription in interface IKlarosTestCase

The description of this test case.

B.1.1.36.14. getDesignTechnique()
public de.verit.klaros.core.types.TestDesignTechnique getDesignTechnique();

Specified by: Method getDesignTechnique in interface IKlarosTestCase

The design technique of this test case.

B.1.1.36.15. getDetectedIssues()
public java.util.List<de.verit.klaros.core.model.KlarosIssue> getDetectedIssues();

Specified by: Method getDetectedIssues in interface IKlarosTestCase

Get detected issues.

See Also getDetectedIssues()

B.1.1.36.16. getDocbase()
public java.lang.String getDocbase();

Specified by: Method getDocbase in interface IKlarosTestCase

The docbase of this test case.

B.1.1.36.17. getEvaluation()
public java.lang.String getEvaluation();

Specified by: Method getEvaluation in interface IKlarosTestCase

The evaluation of this test case.

B.1.1.36.18. getExecution()
public de.verit.klaros.core.types.TestExecutionMethod getExecution();

Specified by: Method getExecution in interface IKlarosTestCase

The execution method of this test case.

B.1.1.36.19. getImplementation()
public de.verit.klaros.core.model.KlarosExternalImplementation getImplementation();

Specified by: Method getImplementation in interface IKlarosTestCase

Get implementation.

B.1.1.36.20. getLevel()
public de.verit.klaros.core.types.TestLevel getLevel();

Specified by: Method getLevel in interface IKlarosTestCase

The level of this test case.

B.1.1.36.21. getNote()
public java.lang.String getNote();

Specified by: Method getNote in interface IKlarosTestCase

The note of this test case.

B.1.1.36.22. getPostcondition()
public java.lang.String getPostcondition();

Specified by: Method getPostcondition in interface IKlarosTestCase

The postcondition of this test case.

B.1.1.36.23. getPrecondition()
public java.lang.String getPrecondition();

Specified by: Method getPrecondition in interface IKlarosTestCase

The precondition of this test case.

B.1.1.36.24. getPredecessor()
public de.verit.klaros.core.model.KlarosTestCase getPredecessor();

B.1.1.36.25. getPriority()
public de.verit.klaros.core.types.TestPriority getPriority();

Specified by: Method getPriority in interface IKlarosTestCase

The priority of this test case.

B.1.1.36.26. getResults()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseResult> getResults();

Specified by: Method getResults in interface IKlarosTestCase

Get test case results.

B.1.1.36.27. getRoot()
public de.verit.klaros.core.model.KlarosTestCase getRoot();

B.1.1.36.28. getShortname()
public java.lang.String getShortname();

Specified by: Method getShortname in interface IKlarosTestCase

The short name (title) of this test case.

B.1.1.36.29. getState()
public java.lang.String getState();

Specified by: Method getState in interface IKlarosTestCase

The state of this test case.

B.1.1.36.30. getStateName(KlarosSUTImplementation)
public java.lang.String getStateName(de.verit.klaros.core.model.KlarosSUTImplementation sut);

Specified by: Method getStateName in interface IKlarosTestCase

Get name of the state related to given SUT.

B.1.1.36.31. getStates()
public java.util.Map<de.verit.klaros.core.model.KlarosSUTImplementation, de.verit.klaros.core.model.KlarosTestCaseState> getStates();

Specified by: Method getStates in interface IKlarosTestCase

Get test case states.

See Also getStates()

B.1.1.36.32. getSuccessor()
public de.verit.klaros.core.model.KlarosTestCase getSuccessor();

B.1.1.36.33. getTeam()
public java.lang.String getTeam();

Specified by: Method getTeam in interface IKlarosTestCase

The team responsible for this test case.

B.1.1.36.34. getTestCaseSteps()
public java.util.List<de.verit.klaros.core.model.KlarosTestCaseStep> getTestCaseSteps();

Specified by: Method getTestCaseSteps in interface IKlarosTestCase

Get test case steps.

B.1.1.36.35. getTraceability()
public java.lang.String getTraceability();

Specified by: Method getTraceability in interface IKlarosTestCase

The traceability of this test case.

B.1.1.36.36. getTrunkRoot()
public de.verit.klaros.core.model.KlarosTestCase getTrunkRoot();

B.1.1.36.37. getVariety()
public de.verit.klaros.core.types.TestVariety getVariety();

Specified by: Method getVariety in interface IKlarosTestCase

The variety of this test case.

B.1.1.36.38. hashCode()
public int hashCode();

B.1.1.37. Class KlarosTestCaseResult
This class provides access to the information stored for test case results.

B.1.1.37.1. Synopsis
public final class de.verit.klaros.core.model.KlarosTestCaseResult
    extends, de.verit.klaros.core.model.KlarosNamedEntity<de.verit.klaros.core.model.KlarosTestCaseResult, de.verit.klaros.core.persistent
    
    implements, de.verit.klaros.core.model.IKlarosTestCaseResult, java.lang.Comparable<de.verit.klaros.core.model.KlarosTestCaseResult> {
// Public Methods

public int compareTo(de.verit.klaros.core.model.KlarosTestCaseResult o);

public boolean equals(java.lang.Object o);

public java.lang.String getDescription();

public long getExecutionTime();

public java.lang.String getSummary();

public de.verit.klaros.core.model.KlarosTestCase getTestCase();

public de.verit.klaros.core.model.KlarosTestRun getTestRun();

public int hashCode();

public boolean isError();

public boolean isFailure();

public boolean isPassed();

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.37.2. compareTo(KlarosTestCaseResult)
public int compareTo(de.verit.klaros.core.model.KlarosTestCaseResult o);

B.1.1.37.3. equals(Object)
public boolean equals(java.lang.Object o);

B.1.1.37.4. getDescription()
public java.lang.String getDescription();

Specified by: Method getDescription in interface IKlarosTestCaseResult

Get the test result description. This is usually set for failed/error status results.

B.1.1.37.5. getExecutionTime()
public long getExecutionTime();

Specified by: Method getExecutionTime in interface IKlarosTestCaseResult

Get the test execution time in ms.

B.1.1.37.6. getSummary()
public java.lang.String getSummary();

Specified by: Method getSummary in interface IKlarosTestCaseResult

Get the test result summary. This is usually set for failed/error status results.

B.1.1.37.7. getTestCase()
public de.verit.klaros.core.model.KlarosTestCase getTestCase();

Specified by: Method getTestCase in interface IKlarosTestCaseResult

Get test case.

See Also getTestCase()

B.1.1.37.8. getTestRun()
public de.verit.klaros.core.model.KlarosTestRun getTestRun();

Specified by: Method getTestRun in interface IKlarosTestCaseResult

Get test run.

See Also getTestRun()

B.1.1.37.9. hashCode()
public int hashCode();

B.1.1.37.10. isError()
public boolean isError();

Specified by: Method isError in interface IKlarosTestCaseResult

Check if this is an error result.

It is assumed that error results have a property 'type' with the value 'E' or 'error'.

See Also isError()

B.1.1.37.11. isFailure()
public boolean isFailure();

Specified by: Method isFailure in interface IKlarosTestCaseResult

Check if this is a failure result.

It is assumed that failure results have a property 'type' with the value 'F' or 'failure'.

See Also isFailure()

B.1.1.37.12. isPassed()
public boolean isPassed();

Specified by: Method isPassed in interface IKlarosTestCaseResult

Check if this is a result of a passed test case.

It is assumed that passed results have a property 'testCasePassed' with the value 'true'.

See Also isPassed()
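
The three status checks above all rely on conventions in the stored result properties. A minimal self-contained sketch of that classification logic follows; the property names 'type' and 'testCasePassed' are taken from the descriptions above, while the helper class itself is hypothetical and not part of the Klaros API:

```java
public class ResultStatus {

    // Mirrors the documented convention: error results carry a 'type'
    // property of 'E' or 'error'.
    static boolean isError(String type) {
        return "E".equals(type) || "error".equals(type);
    }

    // Failure results carry a 'type' property of 'F' or 'failure'.
    static boolean isFailure(String type) {
        return "F".equals(type) || "failure".equals(type);
    }

    // Passed results carry a 'testCasePassed' property of 'true'.
    static boolean isPassed(String testCasePassed) {
        return "true".equals(testCasePassed);
    }

    public static void main(String[] args) {
        System.out.println(isError("error"));   // true
        System.out.println(isFailure("F"));     // true
        System.out.println(isPassed("false"));  // false
    }
}
```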

B.1.1.38. Class KlarosTestCaseState
This class provides access to the information stored for test case states.

B.1.1.38.1. Synopsis
public final class de.verit.klaros.core.model.KlarosTestCaseState
    extends, de.verit.klaros.core.model.KlarosWrapper<de.verit.klaros.core.model.KlarosTestCaseState, de.verit.klaros.core.persistent.Test
    
    implements, de.verit.klaros.core.model.IKlarosTestCaseState {
// Public Methods

public java.lang.String getStateDescription();

public java.lang.String getStateName();

public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

public de.verit.klaros.core.model.KlarosTestCase getTestCase();

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.38.2. getStateDescription()
public java.lang.String getStateDescription();

Specified by: Method getStateDescription in interface IKlarosTestCaseState

Get description of state.

See Also getStateDescription()

B.1.1.38.3. getStateName()
public java.lang.String getStateName();

Specified by: Method getStateName in interface IKlarosTestCaseState

Get name of state.

See Also getStateName()

B.1.1.38.4. getSut()
public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

Specified by: Method getSut in interface IKlarosTestCaseState

Get related system under test version.

See Also getSut()

B.1.1.38.5. getTestCase()
public de.verit.klaros.core.model.KlarosTestCase getTestCase();

Specified by: Method getTestCase in interface IKlarosTestCaseState

Get test case.

See Also getTestCase()

B.1.1.39. Class KlarosTestCaseStep
This class provides access to the information stored for test case steps.

B.1.1.39.1. Synopsis
public final class de.verit.klaros.core.model.KlarosTestCaseStep
    extends, de.verit.klaros.core.model.KlarosWrapper<de.verit.klaros.core.model.KlarosTestCaseStep, de.verit.klaros.core.persistent.TestC
    
    implements, de.verit.klaros.core.model.IKlarosTestCaseStep {
// Public Methods

public java.lang.String getAction();

public java.util.List<de.verit.klaros.core.model.KlarosAttachment> getAttachments();

public java.lang.String getPostcondition();

public java.lang.String getPrecondition();

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.39.2. getAction()
public java.lang.String getAction();

Specified by: Method getAction in interface IKlarosTestCaseStep

Get the action.

B.1.1.39.3. getAttachments()
public java.util.List<de.verit.klaros.core.model.KlarosAttachment> getAttachments();

Specified by: Method getAttachments in interface IKlarosTestCaseStep

Get the attachments.

B.1.1.39.4. getPostcondition()
public java.lang.String getPostcondition();

Specified by: Method getPostcondition in interface IKlarosTestCaseStep

Get the post condition.

B.1.1.39.5. getPrecondition()
public java.lang.String getPrecondition();

Specified by: Method getPrecondition in interface IKlarosTestCaseStep

Get the pre condition.

B.1.1.40. Class KlarosTestEnvironment
This class provides access to the information stored for test environments.

B.1.1.40.1. Synopsis
public final class de.verit.klaros.core.model.KlarosTestEnvironment
    extends, de.verit.klaros.core.model.KlarosNamedEntity<de.verit.klaros.core.model.KlarosTestEnvironment, de.verit.klaros.core.persisten
    
    implements, de.verit.klaros.core.model.IKlarosTestEnvironment, java.lang.Comparable<de.verit.klaros.core.model.KlarosTestEnvironment> 
// Public Methods

public int compareTo(de.verit.klaros.core.model.KlarosTestEnvironment o);

public boolean equals(java.lang.Object o);

public java.lang.String getCreator();

public java.lang.String getShortname();

public java.util.Set<de.verit.klaros.core.persistent.TestRun> getTestRuns();

public int hashCode();

public java.lang.Boolean isEnabled();

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.40.2. compareTo(KlarosTestEnvironment)
public int compareTo(de.verit.klaros.core.model.KlarosTestEnvironment o);

Compares the two objects using the compare function of the child class.

B.1.1.40.3. equals(Object)
public boolean equals(java.lang.Object o);

B.1.1.40.4. getCreator()
public java.lang.String getCreator();

Specified by: Method getCreator in interface IKlarosTestEnvironment

B.1.1.40.5. getShortname()
public java.lang.String getShortname();

Specified by: Method getShortname in interface IKlarosTestEnvironment

B.1.1.40.6. getTestRuns()
public java.util.Set<de.verit.klaros.core.persistent.TestRun> getTestRuns();

Specified by: Method getTestRuns in interface IKlarosTestEnvironment

Get test runs.

See Also getTestRuns()

B.1.1.40.7. hashCode()
public int hashCode();

B.1.1.40.8. isEnabled()
public java.lang.Boolean isEnabled();

Specified by: Method isEnabled in interface IKlarosTestEnvironment

Check if enabled.

B.1.1.41. Class KlarosTestExecutable
This class provides access to the information stored for test executables.

B.1.1.41.1. Synopsis
public class de.verit.klaros.core.model.KlarosTestExecutable
    extends, de.verit.klaros.core.model.KlarosWrapper<de.verit.klaros.core.model.KlarosTestExecutable, de.verit.klaros.core.persistent.Tes
    
    implements, de.verit.klaros.core.model.IKlarosTestExecutable {
}

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.42. Class KlarosTestRequirement
This class provides access to the information stored for test requirements.

B.1.1.42.1. Synopsis
public final class de.verit.klaros.core.model.KlarosTestRequirement
    extends, de.verit.klaros.core.model.KlarosRevision<de.verit.klaros.core.model.KlarosTestRequirement, de.verit.klaros.core.persistent.T
    
    implements, de.verit.klaros.core.model.IKlarosTestRequirement {
// Public Methods

public de.verit.klaros.core.model.KlarosTestRequirement getBranchRoot();

public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getCoveringHeadTestCases();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getCoveringTestCases();

public de.verit.klaros.core.model.KlarosRequirementGroup getGroup();

public de.verit.klaros.core.model.KlarosTestRequirement getPredecessor();

public de.verit.klaros.core.model.KlarosTestRequirement getRoot();

public de.verit.klaros.core.model.KlarosTestRequirement getSuccessor();

public de.verit.klaros.core.model.KlarosTestRequirement getTrunkRoot();

Methods inherited from de.verit.klaros.core.model.KlarosRevision: getRevisionComment, getRevisionId, getTags

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.42.2. getBranches()
public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();

B.1.1.42.3. getBranchRoot()
public de.verit.klaros.core.model.KlarosTestRequirement getBranchRoot();

B.1.1.42.4. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Specified by: Method getConfiguration in interface IKlarosTestRequirement

Get configuration.

B.1.1.42.5. getCoveringHeadTestCases()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getCoveringHeadTestCases();

Specified by: Method getCoveringHeadTestCases in interface IKlarosTestRequirement

Get head revisions of covering test cases.

B.1.1.42.6. getCoveringTestCases()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCase> getCoveringTestCases();

Specified by: Method getCoveringTestCases in interface IKlarosTestRequirement

Get test cases covering the test requirement.

B.1.1.42.7. getGroup()
public de.verit.klaros.core.model.KlarosRequirementGroup getGroup();

Specified by: Method getGroup in interface IKlarosTestRequirement

Get requirement group.

B.1.1.42.8. getPredecessor()
public de.verit.klaros.core.model.KlarosTestRequirement getPredecessor();

B.1.1.42.9. getRoot()
public de.verit.klaros.core.model.KlarosTestRequirement getRoot();

B.1.1.42.10. getSuccessor()
public de.verit.klaros.core.model.KlarosTestRequirement getSuccessor();

B.1.1.42.11. getTrunkRoot()
public de.verit.klaros.core.model.KlarosTestRequirement getTrunkRoot();

B.1.1.43. Class KlarosTestRun
This class provides access to the information stored for test runs.

B.1.1.43.1. Synopsis
public final class de.verit.klaros.core.model.KlarosTestRun
    extends, de.verit.klaros.core.model.KlarosNamedEntity<de.verit.klaros.core.model.KlarosTestRun, de.verit.klaros.core.persistent.TestRu
    
    implements, de.verit.klaros.core.model.IKlarosTestRun, java.lang.Comparable<de.verit.klaros.core.model.KlarosTestRun> {
// Public Methods

public int compareTo(de.verit.klaros.core.model.KlarosTestRun o);

public boolean equals(java.lang.Object o);

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.lang.String getCreator();

public de.verit.klaros.core.model.KlarosTestEnvironment getEnv();

public int getNumberErrors();

public int getNumberFailures();

public int getNumberPassed();

public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseResult> getResults();

public java.lang.String getRunId();

public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

public java.util.Date getTimestamp();

public java.lang.String getTimestampString(java.util.Locale locale);

public int hashCode();

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.43.2. compareTo(KlarosTestRun)
public int compareTo(de.verit.klaros.core.model.KlarosTestRun o);

B.1.1.43.3. equals(Object)
public boolean equals(java.lang.Object o);

B.1.1.43.4. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Specified by: Method getConfiguration in interface IKlarosTestRun

Get configuration.

See Also getConfiguration()

B.1.1.43.5. getCreator()
public java.lang.String getCreator();

Specified by: Method getCreator in interface IKlarosTestRun

B.1.1.43.6. getEnv()
public de.verit.klaros.core.model.KlarosTestEnvironment getEnv();

Specified by: Method getEnv in interface IKlarosTestRun

Get test environment.

See Also getEnv()

B.1.1.43.7. getNumberErrors()
public int getNumberErrors();

Specified by: Method getNumberErrors in interface IKlarosTestRun

Get the number of test cases with errors in this test run. It is assumed that error results have a property 'type' with the value 'E'.

See Also getNumberErrors()

B.1.1.43.8. getNumberFailures()
public int getNumberFailures();

Specified by: Method getNumberFailures in interface IKlarosTestRun

Get the number of failed test cases in this test run. It is assumed that failed results have a property 'type' with the value 'F'.

See Also getNumberFailures()

B.1.1.43.9. getNumberPassed()
public int getNumberPassed();

Specified by: Method getNumberPassed in interface IKlarosTestRun

Get the number of passed test cases in this test run. It is assumed that passed results have a property 'testCasePassed' with the value 'true'.

See Also getNumberPassed()
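
These counters boil down to scanning the run's results and applying the property conventions described above. A minimal stand-alone sketch of the counting step (result objects are simplified here to their 'type' codes; the helper class is hypothetical, not part of the Klaros API):

```java
import java.util.Arrays;
import java.util.List;

public class RunStatistics {

    // Counts the results whose 'type' code matches the given value,
    // mirroring the conventions documented for getNumberErrors ('E')
    // and getNumberFailures ('F').
    static int countByType(List<String> resultTypes, String type) {
        int count = 0;
        for (String t : resultTypes) {
            if (type.equals(t)) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // Two failures and one error; "P" is merely a stand-in code for a
        // passed result (passed results are actually detected via the
        // 'testCasePassed' property, not a 'type' code).
        List<String> results = Arrays.asList("P", "F", "F", "E");
        System.out.println("errors: " + countByType(results, "E"));
        System.out.println("failures: " + countByType(results, "F"));
    }
}
```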

B.1.1.43.10. getResults()
public java.util.Set<de.verit.klaros.core.model.KlarosTestCaseResult> getResults();

Specified by: Method getResults in interface IKlarosTestRun

Get results.

See Also getResults()

B.1.1.43.11. getRunId()
public java.lang.String getRunId();

Specified by: Method getRunId in interface IKlarosTestRun

Get id of test run.

See Also getRunId()

B.1.1.43.12. getSut()
public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

Specified by: Method getSut in interface IKlarosTestRun

Get the tested system version.

See Also getSut()

B.1.1.43.13. getTimestamp()
public java.util.Date getTimestamp();

Specified by: Method getTimestamp in interface IKlarosTestRun

Get timestamp.

See Also getTimestamp()

B.1.1.43.14. getTimestampString(Locale)
public java.lang.String getTimestampString(java.util.Locale locale);

Specified by: Method getTimestampString in interface IKlarosTestRun

Formats the timestamp and returns it as string.

See Also getTimestampString(java.util.Locale)

B.1.1.43.15. hashCode()
public int hashCode();

B.1.1.44. Class KlarosTestSuite
This class provides access to the information stored for test suites.

B.1.1.44.1. Synopsis
public final class de.verit.klaros.core.model.KlarosTestSuite
    extends, de.verit.klaros.core.model.KlarosRevision<de.verit.klaros.core.model.KlarosTestSuite, de.verit.klaros.core.persistent.TestSui
    
    implements, de.verit.klaros.core.model.IKlarosTestSuite {
// Public Methods

public de.verit.klaros.core.model.KlarosUser getAssignee();

public de.verit.klaros.core.model.KlarosTestSuite getBranchRoot();

public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();

public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

public java.util.Date getCreated();

public de.verit.klaros.core.model.KlarosUser getCreator();

public de.verit.klaros.core.model.KlarosUser getLastEditor();

public java.util.Date getLastUpdated();

public java.lang.String getName();

public de.verit.klaros.core.model.KlarosTestSuite getPredecessor();

public java.util.List<de.verit.klaros.core.model.KlarosTestSuiteResult> getResults();

public de.verit.klaros.core.model.KlarosTestSuite getRoot();

public java.lang.String getShortname();

public de.verit.klaros.core.model.KlarosTestSuite getSuccessor();

public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

public java.util.List<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

public java.lang.Integer getTestSuiteResultCount();

public de.verit.klaros.core.model.KlarosTestSuite getTrunkRoot();

public java.lang.Boolean isEnabled();

Methods inherited from de.verit.klaros.core.model.KlarosRevision: getRevisionComment, getRevisionId, getTags

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.44.2. getAssignee()
public de.verit.klaros.core.model.KlarosUser getAssignee();

Specified by: Method getAssignee in interface IKlarosTestSuite

B.1.1.44.3. getBranches()
public java.util.Set<de.verit.klaros.core.model.KlarosRevision<?, ?>> getBranches();

B.1.1.44.4. getBranchRoot()
public de.verit.klaros.core.model.KlarosTestSuite getBranchRoot();

B.1.1.44.5. getConfiguration()
public de.verit.klaros.core.model.KlarosConfiguration getConfiguration();

Specified by: Method getConfiguration in interface IKlarosTestSuite

Get configuration.

See Also getConfiguration()

B.1.1.44.6. getCreated()
public java.util.Date getCreated();

Specified by: Method getCreated in interface IKlarosTestSuite

B.1.1.44.7. getCreator()
public de.verit.klaros.core.model.KlarosUser getCreator();

Specified by: Method getCreator in interface IKlarosTestSuite

B.1.1.44.8. getLastEditor()
public de.verit.klaros.core.model.KlarosUser getLastEditor();

Specified by: Method getLastEditor in interface IKlarosTestSuite

B.1.1.44.9. getLastUpdated()
public java.util.Date getLastUpdated();

Specified by: Method getLastUpdated in interface IKlarosTestSuite

B.1.1.44.10. getName()
public java.lang.String getName();

Specified by: Method getName in interface IKlarosTestSuite

Get the name of the test suite.

B.1.1.44.11. getPredecessor()
public de.verit.klaros.core.model.KlarosTestSuite getPredecessor();

B.1.1.44.12. getResults()
public java.util.List<de.verit.klaros.core.model.KlarosTestSuiteResult> getResults();

Specified by: Method getResults in interface IKlarosTestSuite

Get test suite results.

B.1.1.44.13. getRoot()
public de.verit.klaros.core.model.KlarosTestSuite getRoot();

B.1.1.44.14. getShortname()
public java.lang.String getShortname();

Specified by: Method getShortname in interface IKlarosTestSuite

Get the short name of the test suite.

B.1.1.44.15. getSuccessor()
public de.verit.klaros.core.model.KlarosTestSuite getSuccessor();

B.1.1.44.16. getSut()
public de.verit.klaros.core.model.KlarosSUTImplementation getSut();

Specified by: Method getSut in interface IKlarosTestSuite

Get the system under test (SUT) implementation of this test suite.

B.1.1.44.17. getTestCases()
public java.util.List<de.verit.klaros.core.model.KlarosTestCase> getTestCases();

Specified by: Method getTestCases in interface IKlarosTestSuite

Get the executables of this test suite.

B.1.1.44.18. getTestSuiteResultCount()
public java.lang.Integer getTestSuiteResultCount();

Specified by: Method getTestSuiteResultCount in interface IKlarosTestSuite

Return the number of test suite results in this test suite.

B.1.1.44.19. getTrunkRoot()
public de.verit.klaros.core.model.KlarosTestSuite getTrunkRoot();

B.1.1.44.20. isEnabled()
public java.lang.Boolean isEnabled();

Specified by: Method isEnabled in interface IKlarosTestSuite

Returns the value of the enabled flag of this test suite.

B.1.1.45. Class KlarosTestSuiteResult
This class provides access to the information stored for test suite results.

B.1.1.45.1. Synopsis
public final class de.verit.klaros.core.model.KlarosTestSuiteResult
    extends, de.verit.klaros.core.model.KlarosNamedEntity<de.verit.klaros.core.model.KlarosTestSuiteResult, de.verit.klaros.core.persisten
    
    implements, de.verit.klaros.core.model.IKlarosTestSuiteResult, java.lang.Comparable<de.verit.klaros.core.model.KlarosTestSuiteResult> 
// Public Methods

public int compareTo(de.verit.klaros.core.model.KlarosTestSuiteResult o);

public boolean equals(java.lang.Object o);

public de.verit.klaros.core.model.KlarosTestRun getTestRun();

public de.verit.klaros.core.model.KlarosTestSuite getTestSuite();

public int hashCode();

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.45.2. compareTo(KlarosTestSuiteResult)
public int compareTo(de.verit.klaros.core.model.KlarosTestSuiteResult o);

B.1.1.45.3. equals(Object)
public boolean equals(java.lang.Object o);

B.1.1.45.4. getTestRun()
public de.verit.klaros.core.model.KlarosTestRun getTestRun();

Specified by: Method getTestRun in interface IKlarosTestSuiteResult

Get the test run for this result.

B.1.1.45.5. getTestSuite()
public de.verit.klaros.core.model.KlarosTestSuite getTestSuite();

Specified by: Method getTestSuite in interface IKlarosTestSuiteResult

Get the test suite.

B.1.1.45.6. hashCode()
public int hashCode();

B.1.1.46. Class KlarosUser
The user object.

B.1.1.46.1. Synopsis
public class de.verit.klaros.core.model.KlarosUser
    extends, de.verit.klaros.core.model.KlarosNamedEntity<de.verit.klaros.core.model.KlarosUser, de.verit.klaros.core.persistent.User>
    
    implements, de.verit.klaros.core.model.IKlarosUser, java.lang.Comparable<de.verit.klaros.core.model.KlarosUser> {
// Public Methods

public int compareTo(de.verit.klaros.core.model.KlarosUser o);

public boolean equals(java.lang.Object o);

public java.lang.String getEmail();

public java.lang.String getRole();

public java.lang.String getUsername();

public int hashCode();

Methods inherited from de.verit.klaros.core.model.KlarosNamedEntity: contains, get, getDynaClass, getName, getPropertyAsList, getPropertyAsString, isDefinedProperty, remove, set

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.1.1.46.2. compareTo(KlarosUser)
public int compareTo(de.verit.klaros.core.model.KlarosUser o);

Compares the two objects using the compare function of the child class.

B.1.1.46.3. equals(Object)
public boolean equals(java.lang.Object o);

B.1.1.46.4. getEmail()
public java.lang.String getEmail();

Specified by: Method getEmail in interface IKlarosUser

The email address of this user.

B.1.1.46.5. getRole()
public java.lang.String getRole();

Specified by: Method getRole in interface IKlarosUser

The role name of this user.

B.1.1.46.6. getUsername()
public java.lang.String getUsername();

Specified by: Method getUsername in interface IKlarosUser

The user name of this user as used when logging in.

B.1.1.46.7. hashCode()
public int hashCode();

B.2. Klaros-Scripting API Reference

B.2.1. Package de.verit.klaros.scripting
B.2.1.1. Class KlarosContext
Context providing the methods a user needs to add their own objects to the event context.

B.2.1.1.1. Synopsis
public class de.verit.klaros.scripting.KlarosContext
    implements, de.verit.klaros.scripting.context.IKlarosContext {
// Public Constructors

public KlarosContext(org.jboss.seam.contexts.Context context,
                       de.verit.klaros.core.persistent.Configuration activeProject,
                       de.verit.klaros.scripting.KlarosQueryFactory factory,
                       de.verit.klaros.scripting.model.ParameterContext parameters);

public KlarosContext(org.jboss.seam.contexts.Context context,
                       de.verit.klaros.core.persistent.Configuration activeProject,
                       de.verit.klaros.scripting.KlarosQueryFactory factory,
                       de.verit.klaros.scripting.model.ParameterContext parameters,
                       java.util.Locale locale);

// Public Methods

public void add(java.lang.String name,
                  java.lang.Object value);

public void addParameter(java.lang.String name,
                           java.lang.Object value,
                           de.verit.klaros.scripting.model.ParameterType type);

public java.util.List<?> executeParameterizedQuery(java.lang.String query)
    throws OdaException, HibernateException;

public java.util.List<?> executeQuery(java.lang.String query)
    throws OdaException, HibernateException;

public de.verit.klaros.core.model.KlarosConfiguration getActiveProject();

public java.util.Locale getLocale();

public de.verit.klaros.contentrepository.parameter.Parameter getParameter(java.lang.String name);

public java.lang.Object getParameterValue(java.lang.String name);

public void setLocale(java.util.Locale locale);

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait

B.2.1.1.2. KlarosContext(Context, Configuration, KlarosQueryFactory, ParameterContext)
public KlarosContext(org.jboss.seam.contexts.Context context,
                       de.verit.klaros.core.persistent.Configuration activeProject,
                       de.verit.klaros.scripting.KlarosQueryFactory factory,
                       de.verit.klaros.scripting.model.ParameterContext parameters);

Create a KlarosContext.

Parameters
context The context passed in by the invoking servlet.
activeProject The active project, if available.
factory The factory used to execute queries.
parameters The parameter context holding the script parameters.

B.2.1.1.3. KlarosContext(Context, Configuration, KlarosQueryFactory, ParameterContext, Locale)
public KlarosContext(org.jboss.seam.contexts.Context context,
                       de.verit.klaros.core.persistent.Configuration activeProject,
                       de.verit.klaros.scripting.KlarosQueryFactory factory,
                       de.verit.klaros.scripting.model.ParameterContext parameters,
                       java.util.Locale locale);

Create a KlarosContext.

Parameters
context The context passed in by the invoking servlet.
activeProject The active project, if available.
factory The factory used to execute queries.
parameters The parameter context holding the script parameters.
locale The locale to use for this context.

B.2.1.1.4. add(String, Object)
public void add(java.lang.String name,
                  java.lang.Object value);

Specified by: Method add in interface IKlarosContext

Add a new object with the given key to the event context.

B.2.1.1.5. addParameter(String, Object, ParameterType)


public void addParameter(java.lang.String name,
                           java.lang.Object value,
                           de.verit.klaros.scripting.model.ParameterType type);

Specified by: Method addParameter in interface IKlarosContext

Add a new parameter to the ParameterContext.

B.2.1.1.6. executeParameterizedQuery(String)
public java.util.List<?> executeParameterizedQuery(java.lang.String query)
    throws OdaException, HibernateException;

Specified by: Method executeParameterizedQuery in interface IKlarosContext

Execute the given query with the Parameters from the ParameterContext.

B.2.1.1.7. executeQuery(String)
public java.util.List<?> executeQuery(java.lang.String query)
    throws OdaException, HibernateException;

Specified by: Method executeQuery in interface IKlarosContext

Execute the given query.


B.2.1.1.8. getActiveProject()
public de.verit.klaros.core.model.KlarosConfiguration getActiveProject();

Specified by: Method getActiveProject in interface IKlarosContext

B.2.1.1.9. getLocale()
public java.util.Locale getLocale();

Specified by: Method getLocale in interface IKlarosContext

B.2.1.1.10. getParameter(String)
public de.verit.klaros.contentrepository.parameter.Parameter getParameter(java.lang.String name);

Specified by: Method getParameter in interface IKlarosContext

B.2.1.1.11. getParameterValue(String)
public java.lang.Object getParameterValue(java.lang.String name);

Specified by: Method getParameterValue in interface IKlarosContext

B.2.1.1.12. setLocale(Locale)
public void setLocale(java.util.Locale locale);

Specified by: Method setLocale in interface IKlarosContext

B.2.1.2. Class KlarosQueryFactory
This factory lets the user execute queries.

B.2.1.2.1. Synopsis
public class de.verit.klaros.scripting.KlarosQueryFactory {
// Public Constructors

public KlarosQueryFactory(org.hibernate.Session session);

// Public Methods

public java.util.List<?> execute(java.lang.String query)
    throws OdaException, HibernateException;

public java.util.List<?> execute(java.lang.String query,
                                       de.verit.klaros.scripting.model.ParameterContext params)
    throws OdaException, HibernateException;

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait
}


B.2.1.2.2. KlarosQueryFactory(Session)
public KlarosQueryFactory(org.hibernate.Session session);

Create a KlarosQueryFactory.

Parameters
session The Hibernate session passed in by the invoking servlet.

B.2.1.2.3. execute(String)
public java.util.List<?> execute(java.lang.String query)
    throws OdaException, HibernateException;

Prepares the given query string using KlarosOdaQuery and returns a KlarosList containing the query results.

B.2.1.2.4. execute(String, ParameterContext)
public java.util.List<?> execute(java.lang.String query,
                                       de.verit.klaros.scripting.model.ParameterContext params)
    throws OdaException, HibernateException;

Prepares the given query string using KlarosOdaQuery and returns a KlarosList containing the query results.

B.2.1.3. Interface KlarosScript
Public interface that all seam-pdf template scripts must implement to work properly.

B.2.1.3.1. Synopsis
public interface de.verit.klaros.scripting.KlarosScript {
// Public Methods

public void execute(de.verit.klaros.scripting.KlarosContext context);
}

B.2.1.3.2. execute(KlarosContext)
public void execute(de.verit.klaros.scripting.KlarosContext context);

This function is called by the seam-pdf servlet to execute the script.

Parameters
context The event context providing all required functions, properties, and objects.
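The calling contract above can be illustrated with a minimal, self-contained sketch. The interface and context class below are simplified stand-ins for de.verit.klaros.scripting.KlarosScript and KlarosContext (the real classes require the Klaros runtime); only the call pattern is shown, and all names in the example body are invented.

```java
import java.util.HashMap;
import java.util.Map;

public class ScriptSketch {

    // Simplified stand-in for de.verit.klaros.scripting.KlarosScript.
    interface Script {
        void execute(Context context);
    }

    // Simplified stand-in for the event-context part of KlarosContext.
    static class Context {
        private final Map<String, Object> eventContext = new HashMap<>();

        // Mirrors KlarosContext.add(String, Object): store a value under a key.
        void add(String name, Object value) {
            eventContext.put(name, value);
        }

        Object get(String name) {
            return eventContext.get(name);
        }
    }

    public static void main(String[] args) {
        // A template script computes a value and publishes it under a name,
        // so the report template can later look it up in the event context.
        Script script = context -> context.add("reportTitle", "Nightly Regression");

        Context context = new Context();
        script.execute(context); // in Klaros, the seam-pdf servlet performs this call
        System.out.println(context.get("reportTitle")); // prints: Nightly Regression
    }
}
```

The script itself stays free of servlet plumbing; it only talks to the context it is handed.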

B.2.2. Package de.verit.klaros.scripting.context
B.2.2.1. Interface IKlarosContext
public interface de.verit.klaros.scripting.context.IKlarosContext {
// Public Methods

public void add(java.lang.String name,
                  java.lang.Object value);


public void addParameter(java.lang.String name,
                           java.lang.Object value,
                           de.verit.klaros.scripting.model.ParameterType type);

public java.util.List<?> executeParameterizedQuery(java.lang.String query)
    throws OdaException, HibernateException;

public java.util.List<?> executeQuery(java.lang.String query)
    throws OdaException, HibernateException;

public de.verit.klaros.core.model.KlarosConfiguration getActiveProject();

public java.util.Locale getLocale();

public de.verit.klaros.contentrepository.parameter.Parameter getParameter(java.lang.String name);

public java.lang.Object getParameterValue(java.lang.String name);

public void setLocale(java.util.Locale locale);
}

B.2.2.1.1. add(String, Object)
public void add(java.lang.String name,
                  java.lang.Object value);

Add a new object with the given key to the event context.

B.2.2.1.2. addParameter(String, Object, ParameterType)


public void addParameter(java.lang.String name,
                           java.lang.Object value,
                           de.verit.klaros.scripting.model.ParameterType type);

Add a new parameter to the ParameterContext.

Parameters
name The name of the parameter.
value The value of the parameter.
type The type of the parameter, e.g. String, Integer, or Date.

B.2.2.1.3. executeParameterizedQuery(String)
public java.util.List<?> executeParameterizedQuery(java.lang.String query)
    throws OdaException, HibernateException;

Execute the given query with the Parameters from the ParameterContext.

Parameters
query The HQL query to execute.
return A KlarosList of the selected objects.
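As a sketch of the add-parameter-then-execute pattern, the stand-in below registers a named parameter and substitutes it into an HQL-like string. The class, its bind helper, and the entity name in the query are purely illustrative; the real executeParameterizedQuery binds parameters through Hibernate rather than by text substitution.

```java
import java.util.HashMap;
import java.util.Map;

public class ParameterizedQuerySketch {

    // Simplified stand-in for the ParameterContext used by
    // executeParameterizedQuery: a map of named parameter values.
    static final Map<String, Object> parameters = new HashMap<>();

    // Mirrors the pairing of addParameter(...) and the ":name" placeholders
    // in the HQL string. This substitutes textually only to make the pairing
    // visible; real parameter binding is done by Hibernate.
    static String bind(String hql) {
        String bound = hql;
        for (Map.Entry<String, Object> e : parameters.entrySet()) {
            bound = bound.replace(":" + e.getKey(), "'" + e.getValue() + "'");
        }
        return bound;
    }

    public static void main(String[] args) {
        // Corresponds to addParameter("type", "P", ...); the entity name
        // below is invented for illustration.
        parameters.put("type", "P");
        String hql = "select r from SampleResult r where r.type = :type";
        System.out.println(bind(hql));
        // prints: select r from SampleResult r where r.type = 'P'
    }
}
```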

B.2.2.1.4. executeQuery(String)
public java.util.List<?> executeQuery(java.lang.String query)
    throws OdaException, HibernateException;


Execute the given query.

Parameters
query The HQL query to execute.
return A KlarosList of the selected objects.

B.2.2.1.5. getActiveProject()
public de.verit.klaros.core.model.KlarosConfiguration getActiveProject();

Parameters
return Returns the activeProject.

B.2.2.1.6. getLocale()
public java.util.Locale getLocale();

Parameters
return Returns the locale.

B.2.2.1.7. getParameter(String)
public de.verit.klaros.contentrepository.parameter.Parameter getParameter(java.lang.String name);

Parameters
return The parameter with the given name or null.

B.2.2.1.8. getParameterValue(String)
public java.lang.Object getParameterValue(java.lang.String name);

Parameters
return The value of the parameter with the given name or null.

B.2.2.1.9. setLocale(Locale)
public void setLocale(java.util.Locale locale);

Parameters
locale The locale to set.

B.2.3. Package de.verit.klaros.scripting.model
B.2.3.1. Class ParameterContext
The context for script parameters.

B.2.3.1.1. Synopsis
public class de.verit.klaros.scripting.model.ParameterContext {
// Public Constructors

public ParameterContext();

public ParameterContext(java.util.List<de.verit.klaros.contentrepository.parameter.Parameter> parameters);

public ParameterContext(java.util.Map<java.lang.String, de.verit.klaros.contentrepository.parameter.Parameter> parameters);

// Public Methods


public java.util.Map<java.lang.String, de.verit.klaros.contentrepository.parameter.Parameter> getParameters();

Methods inherited from java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait
}

B.2.3.1.2. ParameterContext()
public ParameterContext();

Create a ParameterContext.

B.2.3.1.3. ParameterContext(List<Parameter>)
public ParameterContext(java.util.List<de.verit.klaros.contentrepository.parameter.Parameter> parameters);

Create a ParameterContext.

B.2.3.1.4. ParameterContext(Map<String, Parameter>)
public ParameterContext(java.util.Map<java.lang.String, de.verit.klaros.contentrepository.parameter.Parameter> parameters);

Create a ParameterContext.

B.2.3.1.5. getParameters()
public java.util.Map<java.lang.String, de.verit.klaros.contentrepository.parameter.Parameter> getParameters();

Parameters
return Returns the map of parameters.

B.2.3.2. Class ParameterType
The supported types of parameters.

B.2.3.2.1. Synopsis
public final class de.verit.klaros.scripting.model.ParameterType
    extends java.lang.Enum<de.verit.klaros.scripting.model.ParameterType> {
// Public Static Fields

// Public Static Methods

public static de.verit.klaros.scripting.model.ParameterType valueOf(java.lang.String name);

public static de.verit.klaros.scripting.model.ParameterType[] values();

Methods inherited from java.lang.Enum: compareTo, equals, getDeclaringClass, hashCode, name, ordinal, toString, valueOf

Methods inherited from java.lang.Object: getClass, notify, notifyAll, wait
}
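Since ParameterType is a plain Java enum, the generated valueOf(String) and values() follow the standard enum contract. The stand-in enum below is illustrative only; the actual ParameterType constants are not listed in this synopsis.

```java
public class EnumSketch {

    // Stand-in enum; the real ParameterType constants are not reproduced here.
    enum SampleType { STRING, INTEGER, DATE }

    public static void main(String[] args) {
        // valueOf resolves a constant by its exact name...
        SampleType t = SampleType.valueOf("STRING");
        // ...and values() returns all constants in declaration order.
        System.out.println(t + " of " + SampleType.values().length);
        // prints: STRING of 3
    }
}
```

valueOf throws an IllegalArgumentException for names that do not match a constant exactly.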

Appendix C. Dump File Specification
The following table gives a short overview of the available elements:

Name Description
<action> The action to be taken in a test case step.
<areatopic> The area of this test case.
<classInitialStates> This element contains the initial class state definitions.
<classStates> The possible states of the entities.
<configuration> A project configuration.
<configurationList> The container element for project configurations.
<configurationReference> A reference to a configuration.
<created> The creation date of this object.
<currentTestCase> The current test case being executed.
<cut> The cut of this test case.
<depends> The dependency of this test case.
<description> The description.
<docbase> The docbase of this test case.
<enabled> Enabled or disabled state of this object.
<entry> An initial class state entry.
<env> The test environment reference.
<envs> The environment section contains a test environment definition.
<equiclass> The equivalence class of this test case.
<evalutation> The evaluation of this test case.
<executables> A wrapper for a list of test executables (either test cases or test series).
<execution> The execution type of this test case.
<executionTime> The execution time in ms.
<expectedResult> The expected result.
<generalState> The general state.
<inputvalues> The input values of this test case.
<key> The initial class state entry key contains the name of the affected class.
<lastUpdated> The date of last update.
<level> The level of this test case.
<mandatory> Indicates that the property is mandatory.
<mappedToClass> Associated classes.
<message> The message of this test case.
<method> The method of this test case.
<name> The name of this element.
<namedEntity> The related named entity.
<note> The note of this test case.
<postcondition> The postcondition / expected result.
<precondition> The precondition.


Name Description
<predecessor> The predecessor of this version.
<priority> The priority of this test case.
<productversion> The version string of the SUT.
<propertiesOwner> Container for generic properties.
<propertyDefReference> A reference to a property definition.
<propertyDefs> The property definitions.
<propertyDisplayName> The displayable name of a property.
<propertyName> The name of a property.
<propertyOwnerReference> A reference to a property owner.
<propertyType> The type of a property.
<propertyValue> The value of a property.
<relatedClass> The related class of this class state.
<resultReference> The related result identifiers.
<revisionComment> The comment added while creating this revision.
<revisionId> The revision id.
<root> The root revision of this object.
<runId> The test run id.
<running> Indicates that execution has not yet finished.
<shortname> The short name / code of this object.
<sortorder> The sort order of this property when displayed.
<stringProperties> Container for string properties.
<successor> The successor of this version.
<summary> The overview.
<sut> A reference to a system under test.
<suts> A SUT (System Under Test) represents a particular version of the product that
is tested.
<team> The team this test case is assigned to.
<testCasePassed> True, if this test case has passed, false otherwise.
<testCaseReference> A reference to a test case.
<testCaseResults> After a test case has been executed, it has a result.
<testCases> The test case is the basic unit of a test process.
<testCaseStep> A test case step.
<testCaseStepFragments> A reference to a test case step fragment.
<testMethod> The test method of this test case.
<testResultReference> A reference to a test series result.
<testRunReference> A reference to a test run.
<testRuns> The test runs.


Name Description
<testSuitePassed> Indicates the successful completion of the series.
<testSuiteReference> A reference to a test series.
<testSuiteResult> The result of the test series.
<testSuiteResultReference> A reference to a test series result.
<testSuites> The respective testSuite.
<timestamp> The time stamp when the test run completed.
<traceability> The traceability of this test case.
<trunkRoot> The trunk root revision of this object.
<type> The result type (P=passed, E=error, F=failed, S=skipped).
<userDefined> User defined.
<uuid> The uuid identifies a persistent object.
<value> The initial class state entry value describes the initial state value.
<variety> The variety of this test case.

Table C.1.  Element summary


C.1. <configurationList>
The container element for project configurations.

Name Multiplicity Description


<configuration> 0..n The contained project configurations.

Table C.2. <configurationList> elements

C.2. <configuration>
A project configuration.

This element may occur inside the following elements: <configurationList> , <value> , <classStates> ,
<envs> , <suts> , <testCases> , <testRuns> , <testSuites> , <testCaseResults> , <testSuiteResult> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.3. <configuration> attributes

Name Multiplicity Description


<classInitialStates> 0..n Classes can be associated with state models. The
initial state is defined here.
<classStates> 0..n The possible states of the entities.
<description> 0..n A detailed description of the configuration.
<enabled> 0..n Enabled or disabled.
<envs> 0..n The environment section contains a test environ-
ment definition.
<name> 0..n The name.
<propertyDefs> 0..n The property definitions.
<suts> 0..n A SUT (System Under Test) represents a particular
version of the product that is tested.
<testCases> 0..n The test case is the basic unit of a test process. A
test case can be related to many environments
and SUTs, thus it is possible to execute the same
test cases and get different results.
<testRuns> 0..n The test runs.
<testSuites> 0..n The respective test series.

Table C.4. <configuration> elements
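A minimal, hand-written fragment may make the structure above more concrete. The kl__id values are invented, element order follows the tables above, and most mandatory children of the nested <testCases> element are omitted for brevity:

```xml
<configurationList>
  <configuration kl__id="cfg-1">
    <name>Sample Project</name>
    <description>An illustrative project configuration.</description>
    <enabled>true</enabled>
    <testCases kl__id="tc-1">
      <name>Login succeeds with valid credentials</name>
      <!-- further mandatory children of <testCases> omitted -->
    </testCases>
  </configuration>
</configurationList>
```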

C.3. <classInitialStates>
This element contains the initial class state definitions.

This element may occur inside the following elements: <configuration> .

C.4. <value>
The initial class state entry value describes the initial state value.


This element may occur inside the following elements: <entry> , <stringProperties> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.5. <value> attributes

Name Multiplicity Description


<configurationReference> 1..1 A reference to the configuration the initial state
value belongs to.
<name> 1..1 The name of the state.
<relatedClass> 1..1 The related class of this initial state.

Table C.6. <value> elements

C.5. <entry>
An initial class state entry.

Name Multiplicity Description


<key> 1..1 The entry key.
<value> 1..1 The entry value.

Table C.7. <entry> elements

C.6. <classStates>
The possible states of the entities.

This element may occur inside the following elements: <configuration> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.8. <classStates> attributes

Name Multiplicity Description


<configurationReference> 1..1 The configuration of the classStates.
<name> 1..1 The name of the classStates.
<relatedClass> 1..1 The related classes of the classStates.

Table C.9. <classStates> elements

C.7. <envs>
The environment section contains a test environment definition.

This element may occur inside the following elements: <configuration> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.10. <envs> attributes


Name Multiplicity Description


<configurationReference> 1..1 The configuration this environment belongs to.
<created> 1..1 The creation date.
<enabled> 1..1 Enabled or disabled.
<lastUpdated> 1..1 The date of the last update.
<name> 1..1 The name of the environment.
<propertiesOwner> 1..1 The container of the related generic properties.
<shortname> 1..1 The short name of the environment.

Table C.11. <envs> elements

C.8. <propertyDefs>
The property definitions.

This element may occur inside the following elements: <configuration> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.12. <propertyDefs> attributes

Name Multiplicity Description


<mandatory> 1..1 A mandatory flag.
<mappedToClass> 1..1 The different classes of assignment.
<propertyDisplayName> 0..1 The externally visible name of the property.
<propertyName> 1..1 The internal name of the property.
<propertyType> 1..1 The type of the property.
<sortorder> 1..1 The sort order of the property when displayed.
<userDefined> 1..1 Indicates that this property is user defined.

Table C.13. <propertyDefs> elements

C.9. <suts>
A SUT (System Under Test) represents a particular version of the product that is tested.

This element may occur inside the following elements: <configuration> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.14. <suts> attributes

Name Multiplicity Description


<configurationReference> 1..1 The configuration this SUT belongs to.
<created> 1..1 The creation date.
<enabled> 1..1 Enabled or disabled.
<lastUpdated> 1..1 The date of the last update.


Name Multiplicity Description


<name> 1..1 The name of the SUTs (System Under Test).
<propertiesOwner> 1..1 The container of the related generic properties.
<productversion> 1..1 The version of the SUT.

Table C.15. <suts> elements

C.10. <testCases>
The test case is the basic unit of a test process. A test case can be related to many environments and
SUTs (System Under Test), thus it is possible to execute the same test cases and get different results.

This element may occur inside the following elements: <configuration> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.16. <testCases> attributes

Name Multiplicity Description


<configurationReference> 1..1 The configuration of the test cases.
<created> 1..1 The creation date.
<enabled> 1..1 Enabled or disabled.
<lastUpdated> 1..1 The date of the last update.
<name> 1..1 The name of the test cases.
<propertiesOwner> 1..1 The container of the related generic properties.
<revisionComment> 1..1 The comment added while creating this revision.
<revisionId> 1..1 The revision id.
<areatopic> 0..1 The area of this test case.
<depends> 0..1 The dependency of this test case.
<description> 1..1 The description of the testCase.
<docbase> 1..1 The docbase.
<evalutation> 1..1 The evaluation of the test case.
<execution> 0..1 The execution.
<expectedResult> 0..1 The expected result of this test case.
<generalState> 1..1 The general state of the testCase.
<level> 0..1 The level of the testCase.
<note> 1..1 Notes about the test case.
<postcondition> 1..1 The postcondition after running the test.
<precondition> 1..1 The precondition before running the test.
<predecessor> 0..1 The predecessor of this test case version.
<priority> 0..1 The priority of the test case.
<testCaseResults> 0..n The test case results.
<root> 0..1 The root revision of the test case.
<shortname> 1..1 The short name of this test.
<successor> 0..1 The successor revision of this test case.


Name Multiplicity Description


<team> 1..1 The team entry of the test case.
<testCaseStepFragments> 0..n References to the test case step fragments of the
test case.
<testMethod> 0..1 The test method entry of the test case.
<testSuiteReference> 0..n A reference to the test series this test case be-
longs to.
<traceability> 1..1 The traceability entry of the test case.
<trunkRoot> 1..1 The trunk root revision of this test case.
<variety> 0..1 The variety entry of the test case.

Table C.17. <testCases> elements

C.11. <testRuns>
The test runs.

This element may occur inside the following elements: <configuration> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.18. <testRuns> attributes

Name Multiplicity Description


<configurationReference> 1..1 The configuration of the test run.
<created> 1..1 The creation date.
<enabled> 1..1 Enabled or disabled.
<lastUpdated> 1..1 The date of the last update.
<name> 1..1 The name of the test run.
<propertiesOwner> 1..1 The container of the related generic properties.
<env> 1..1 The environment.
<testResultReference> 0..n References to the included test results.
<runId> 1..1 The run ID.
<sut> 1..1 A SUT (System Under Test) represents a particular
version of the product that is tested.
<timestamp> 1..1 The time stamp of the test run.

Table C.19. <testRuns> elements

C.12. <testSuites>
The respective testSuite.

This element may occur inside the following elements: <configuration> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.20. <testSuites> attributes


Name Multiplicity Description


<configurationReference> 1..1 The configuration of the testSuite.
<created> 1..1 The creation date.
<enabled> 1..1 Enabled or disabled.
<lastUpdated> 1..1 The date of the last update.
<name> 1..1 The name of the testSuite.
<propertiesOwner> 1..1 The container of the related generic properties.
<revisionComment> 1..1 The comment added while creating this revision.
<revisionId> 1..1 The revision id.
<executables> 0..n The list of executable references for this test se-
ries.
<predecessor> 0..1 The predecessor of this test suite version.
<testSuiteResult> 0..n The test series results.
<root> 1..1 The root revision of this test series.
<shortname> 1..1 The short name of the testSuite.
<successor> 0..1 The successor of this test suite version.
<sut> 0..1 A SUT (System Under Test) represents a particular
version of the product that is being tested.
<trunkRoot> 1..1 The trunk root revision of this test series.

Table C.21. <testSuites> elements

C.13. <executables>
A wrapper for a list of test executables (either test cases or test series).

This element may occur inside the following elements: <testSuites> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.22. <executables> attributes

Name Multiplicity Description


<testCaseReference> 0..n The test cases contained in this list of executable
items.

Table C.23. <executables> elements

C.14. <testCaseStepFragments>
A reference to a test case step fragment.

This element may occur inside the following elements: <testCases> .

Name Multiplicity Description


<testCaseStep> 0..n The test case steps contained in this fragment.

Table C.24. <testCaseStepFragments> elements


C.15. <testCaseStep>
A test case step.

This element may occur inside the following elements: <testCaseStepFragments> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.25. <testCaseStep> attributes

Name Multiplicity Description


<action> 1..1 The action to be taken in this step.
<postcondition> 1..1 The postcondition for this test case step.
<precondition> 1..1 The precondition for this test case step.

Table C.26. <testCaseStep> elements

C.16. <testCaseResults>
After a test case has been executed, it has a result. This result can be Passed, Failed, Error or Skipped.
Klaros-Testmanagement enriches this information with the combination of system under test and
environment for which the test has been run, and with other information that can be gathered from the
execution, such as the execution time of the test.

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.27. <testCaseResults> attributes

Name Multiplicity Description


<configurationReference> 1..1 The configuration this result belongs to.
<created> 1..1 The creation date.
<enabled> 1..1 Enabled or disabled.
<lastUpdated> 1..1 The date of the last update.
<name> 1..1 The name of the result.
<propertiesOwner> 1..1 The container of the related generic properties.
<description> 1..1 The description of the result.
<executionTime> 1..1 The execution time of the test.
<summary> 1..1 The summary of the result.
<testRunReference> 1..1 A reference to the test run this result belongs to.
<type> 1..1 The type of result.
<testCaseReference> 1..1 A reference to the test case of this test result.
<testCasePassed> 1..1 Indicates if this test case has passed or not.
<testSuiteResultReference> 0..1 A reference to the test series result this result be-
longs to.

Table C.28. <testCaseResults> elements
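An illustrative fragment for a single passed result follows. The kl__id values and the content of the reference elements are invented, and the exact serialization of references may differ in real dump files; several mandatory children are omitted for brevity:

```xml
<testCaseResults kl__id="res-1">
  <name>Login succeeds with valid credentials</name>
  <type>P</type>                        <!-- P = passed -->
  <executionTime>1250</executionTime>   <!-- in ms -->
  <testCasePassed>true</testCasePassed>
  <testRunReference>run-1</testRunReference>
  <testCaseReference>tc-1</testCaseReference>
  <!-- further mandatory children omitted -->
</testCaseResults>
```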


C.17. <testSuiteResult>
The result of the test series.

This element may occur inside the following elements: <testCaseResults> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.29. <testSuiteResult> attributes

Name Multiplicity Description


<configurationReference> 1..1 The configuration this result belongs to.
<created> 1..1 The creation date.
<enabled> 1..1 Enabled or disabled.
<lastUpdated> 1..1 The date of the last update.
<name> 1..1 The name of the result.
<propertiesOwner> 1..1 The container of the related generic properties.
<executionTime> 1..1 The execution time of the test.
<testRunReference> 1..1 A reference to the test run this result belongs to.
<type> 1..1 The type of result.
<currentTestCase> 0..1 The currently executing test case.
<testResultReference> 0..n The related test results.
<running> 1..1 Indicates that execution has not yet finished.
<testSuiteReference> 1..1 The related test series.
<testSuitePassed> 1..1 Boolean indicating if this test series has passed.

Table C.30. <testSuiteResult> elements

C.18. <propertiesOwner>
Container for generic properties.

This element may occur inside the following elements: <envs> , <suts> , <testCases> , <testRuns> ,
<testSuites> , <testCaseResults> , <testSuiteResult> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.31. <propertiesOwner> attributes

Name Multiplicity Description


<stringProperties> 0..n The contained string properties.

Table C.32. <propertiesOwner> elements

C.19. <stringProperties>
Container for string properties.


This element may occur inside the following elements: <propertiesOwner> .

Name Restriction Description


kl__id required The unique identifier of this element.

Table C.33. <stringProperties> attributes

Name Multiplicity Description


<propertyName> 1..1 The internal name of the property.
<propertyOwnerReference> 1..1 The owner of the property definition.
<propertyDefReference> 1..1 The definition of the property.
<sortorder> 1..1 The sort order of the property when displayed.
<propertyValue> 1..1 The value of this property.

Table C.34. <stringProperties> elements

C.20. <relatedClass>
The related class of this class state.

This element may occur inside the following elements: <value> , <classStates> .

C.21. <namedEntity>
The related named entity.

C.22. <resultReference>
The related result identifiers.

C.23. <key>
The initial class state entry key contains the name of the affected class.

This element may occur inside the following elements: <entry> .

C.24. <created>
The creation date of this object.

This element may occur inside the following elements: <envs> , <suts> , <testCases> , <testRuns> ,
<testSuites> , <testCaseResults> , <testSuiteResult> .

C.25. <enabled>
Enabled or disabled state of this object.

This element may occur inside the following elements: <configuration> , <envs> , <suts> , <testCases>
, <testRuns> , <testSuites> , <testCaseResults> , <testSuiteResult> .

C.26. <lastUpdated>
The date of last update.


This element may occur inside the following elements: <envs> , <suts> , <testCases> , <testRuns> ,
<testSuites> , <testCaseResults> , <testSuiteResult> .

C.27. <description>
The description.

This element may occur inside the following elements: <configuration> , <testCases> , <testCaseRe-
sults> .

C.28. <mappedToClass>
Associated classes.

This element may occur inside the following elements: <propertyDefs> .

C.29. <userDefined>
User defined.

This element may occur inside the following elements: <propertyDefs> .

C.30. <mandatory>
Indicates that the property is mandatory.

This element may occur inside the following elements: <propertyDefs> .

C.31. <productversion>
The version string of the SUT.

This element may occur inside the following elements: <suts> .

C.32. <revisionId>
The revision id.

This element may occur inside the following elements: <testCases> , <testSuites> .

C.33. <root>
The root revision of this object.

This element may occur inside the following elements: <testCases> , <testSuites> .

C.34. <trunkRoot>
The trunk root revision of this object.

This element may occur inside the following elements: <testCases> , <testSuites> .

C.35. <areatopic>
The area of this test case.


This element may occur inside the following elements: <testCases> .

C.36. <cut>
The cut of this test case.

C.37. <depends>
The dependency of this test case.

This element may occur inside the following elements: <testCases> .

C.38. <docbase>
The docbase of this test case.

This element may occur inside the following elements: <testCases> .

C.39. <equiclass>
The equivalence class of this test case.

C.40. <evalutation>
The evaluation of this test case.

This element may occur inside the following elements: <testCases> .

C.41. <execution>
The execution type of this test case.

This element may occur inside the following elements: <testCases> .

C.42. <expectedResult>
The expected result.

This element may occur inside the following elements: <testCases> .

C.43. <generalState>
The general state.

This element may occur inside the following elements: <testCases> .

C.44. <inputvalues>
The input values of this test case.

C.45. <level>
The level of this test case.

This element may occur inside the following elements: <testCases> .


C.46. <message>
The message of this test case.

C.47. <method>
The method of this test case.

C.48. <note>
The note of this test case.

This element may occur inside the following elements: <testCases> .

C.49. <postcondition>
The postcondition / expected result.

This element may occur inside the following elements: <testCases> , <testCaseStep> .

C.50. <precondition>
The precondition.

This element may occur inside the following elements: <testCases> , <testCaseStep> .

C.51. <predecessor>
The predecessor of this version.

This element may occur inside the following elements: <testCases> , <testSuites> .

C.52. <priority>
The priority of this test case.

This element may occur inside the following elements: <testCases> .

C.53. <shortname>
The short name / code of this object.

This element may occur inside the following elements: <envs> , <testCases> , <testSuites> .

C.54. <successor>
The successor of this version.

This element may occur inside the following elements: <testCases> , <testSuites> .

C.55. <team>
The team this test case is assigned to.

This element may occur inside the following elements: <testCases> .


C.56. <traceability>
The traceability of this test case.

This element may occur inside the following elements: <testCases> .

C.57. <variety>
The variety of this test case.

This element may occur inside the following elements: <testCases> .

C.58. <testMethod>
The test method of this test case.

This element may occur inside the following elements: <testCases> .

C.59. <env>
The test environment reference.

This element may occur inside the following elements: <testRuns> .

C.60. <runId>
The test run id.

This element may occur inside the following elements: <testRuns> .

C.61. <sut>
A reference to a system under test.

This element may occur inside the following elements: <testRuns> , <testSuites> .

C.62. <timestamp>
The time stamp at which this test run completed.

This element may occur inside the following elements: <testRuns> .

C.63. <revisionComment>
The comment added while creating this revision.

This element may occur inside the following elements: <testCases> , <testSuites> .

C.64. <summary>
The overview.

This element may occur inside the following elements: <testCaseResults> .

C.65. <type>
The result type (P=passed, E=error, F=failed, S=skipped).


This element may occur inside the following elements: <testCaseResults> , <testSuiteResult> .

C.66. <running>
Indicates that execution has not yet finished.

This element may occur inside the following elements: <testSuiteResult> .

C.67. <testSuitePassed>
Indicates the successful completion of the series.

This element may occur inside the following elements: <testSuiteResult> .

C.68. <currentTestCase>
The current test case being executed.

This element may occur inside the following elements: <testSuiteResult> .

C.69. <executionTime>
The execution time in ms.

This element may occur inside the following elements: <testCaseResults> , <testSuiteResult> .

C.70. <action>
The action to be taken in a test case step.

This element may occur inside the following elements: <testCaseStep> .

C.71. <propertyType>
The type of a property.

This element may occur inside the following elements: <propertyDefs> .

C.72. <propertyOwnerReference>
A reference to a property owner.

C.73. <propertyDefReference>
A reference to a property definition.

C.74. <testCasePassed>
True, if this test case has passed, false otherwise.

This element may occur inside the following elements: <testCaseResults> .

C.75. <propertyName>
The name of a property.


This element may occur inside the following elements: <propertyDefs> .

C.76. <propertyValue>
The value of a property.

C.77. <propertyDisplayName>
The displayable name of a property.

This element may occur inside the following elements: <propertyDefs> .

C.78. <sortorder>
The sort order of this property when displayed.

This element may occur inside the following elements: <propertyDefs> , <stringProperties> .

C.79. <configurationReference>
A reference to a configuration.

C.80. <testCaseReference>
A reference to a test case.

C.81. <testRunReference>
A reference to a test run.

C.82. <testResultReference>
A reference to a test result.

C.83. <testSuiteReference>
A reference to a test series.

C.84. <testSuiteResultReference>
A reference to a test series result.

C.85. <name>
The name of this element.

This element may occur inside the following elements: <configuration> , <value> , <classStates> , <envs> , <suts> , <testCases> , <testRuns> , <testSuites> , <testCaseResults> , <testSuiteResult> , <stringProperties> .

C.86. <uuid>
The uuid identifies a persistent object.
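As a sketch of how the elements described above fit together, the following hypothetical dump fragment shows a single test case entry. The element names are taken from this specification; the nesting shown and all values (name, short name, UUID, priority) are purely illustrative and do not reproduce an actual Klaros dump file:

```xml
<testCases>
  <!-- illustrative values only -->
  <name>Login with valid credentials</name>
  <shortname>TC-LOGIN-01</shortname>
  <uuid>3f2a9c64-1b7e-4d0a-9c11-5e8f2d7a6b01</uuid>
  <precondition>A registered user account exists.</precondition>
  <postcondition>The user is logged in and the start page is shown.</postcondition>
  <priority>high</priority>
</testCases>
```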

Appendix D. Reporting Resources
D.1. Context Variables
Name            Description
date            The current date.
locale          The current locale set in the web frontend.
activeProject   The currently selected KlarosProject. If no project is selected, this variable contains the null object.
user            A KlarosUser object representing the active user.

Table D.1. Context Variables

The context variables can be accessed in a Seam PDF template, e.g.:

  <p:text value="#{date}" />

Note
#{user.name} and #{user.username} contain different values. The former provides the user's
real name, while the latter provides the name the user is logged in with.
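
For illustration, a template snippet using these context variables might look as follows. The null check guards against no project being selected; note that the assumption that KlarosProject exposes a name property is not confirmed by this appendix:

```xml
<ui:fragment rendered="#{activeProject != null}">
  <!-- assumes KlarosProject has a "name" property (unverified) -->
  <p:text value="Project: #{activeProject.name}" />
</ui:fragment>
<p:text value="Real name: #{user.name}, login: #{user.username}" />
```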

D.2. KlarosScript Interface
 
package de.verit.klaros.scripting;

/**
 * Public interface that all seam-pdf template scripts must implement to work
 * properly.
 */
public interface KlarosScript {

    /**
     * This function is called by the seam-pdf servlet to execute the script.
     *
     * @param context
     *            The event context providing all needed functions, properties
     *            and objects.
     */
    void execute(KlarosContext context);
}
    

D.3. Example report template


<p:document xmlns:ui="http://java.sun.com/jsf/facelets"
  xmlns:f="http://java.sun.com/jsf/core" xmlns:p="http://jboss.com/products/seam/pdf"
  title="Klaros-Testmanagement Test Plan Report" marginMirroring="true"
  author="#{user.name}" creator="#{user.name}" pageSize="A4">
  <f:facet name="header">
    <p:font size="8">
      <p:header borderWidthBottom="0.1" borderColorBottom="black" borderWidthTop="0" alignment="center">
        <p:text value="Example report - generated #{date} by #{user.name}"/>
      </p:header>
      <p:footer borderWidthTop="0.1" borderColorTop="black" borderWidthBottom="0" alignment="center">
        <p:text value="Page " />
        <p:pageNumber />
      </p:footer>
    </p:font>
  </f:facet>
  <!-- print the front page -->
  <p:paragraph alignment="center" spacingAfter="100">
    <p:text value="" />
  </p:paragraph>
  <p:font style="bold" size="32">
    <p:paragraph alignment="center" spacingAfter="75">
      <p:text value="Test Case Report" />
    </p:paragraph>
  </p:font>
  <p:font style="normal" size="12">
    <p:paragraph alignment="center" spacingAfter="5">
      <p:text value="Created by" />
    </p:paragraph>
  </p:font>
  <p:font style="bold" size="16">
    <p:paragraph alignment="center" spacingAfter="5">
      <p:text value="#{user.name} (#{user.email})"/>
    </p:paragraph>
  </p:font>
  <p:font style="normal" size="12">
    <p:paragraph alignment="center" spacingAfter="5">
      <p:text value="at" />
    </p:paragraph>
  </p:font>
  <p:font style="bold" size="16">
    <p:paragraph alignment="center" spacingAfter="75">
      <p:text value="#{date}" />
    </p:paragraph>
  </p:font>
  <p:newPage/>
  <ui:fragment rendered="#{results != null}">
    <p:font style="normal" size="12">
      <p:paragraph alignment="left" spacingAfter="10">
        <p:text value="The test results for " />
      </p:paragraph>
    </p:font>
  </ui:fragment>
  <!-- test result table -->
  <p:table columns="4" widths="1 1 3 3">
    <!-- create the headline with bold characters -->
    <p:font size="10" style="bold">
      <p:cell horizontalAlignment="center" verticalAlignment="top">
        <p:paragraph>
          <p:text value="Name" />
        </p:paragraph>
      </p:cell>
      <p:cell horizontalAlignment="center" verticalAlignment="top">
        <p:paragraph>
          <p:text value="Result" />
        </p:paragraph>
      </p:cell>
      <p:cell horizontalAlignment="center" verticalAlignment="top">
        <p:paragraph>
          <p:text value="Testrun description" />
        </p:paragraph>
      </p:cell>
      <p:cell horizontalAlignment="center" verticalAlignment="top">
        <p:paragraph>
          <p:text value="Summary" />
        </p:paragraph>
      </p:cell>
    </p:font>
    <!-- display the attributes of the test results -->
    <p:font size="8">
      <ui:repeat value="#{results}" var="tcr">
        <p:cell verticalAlignment="top" horizontalAlignment="left">
          <p:paragraph>
            <p:text value="#{tcr.testCase.name}" />
          </p:paragraph>
        </p:cell>
        <!-- decide which color to display, based on the test result -->
        <ui:fragment rendered="#{tcr.error}">
          <p:cell backgroundColor="rgb(255,0,0)" verticalAlignment="top" horizontalAlignment="center">
            <p:paragraph>
              <p:text value="error" />
            </p:paragraph>
          </p:cell>
        </ui:fragment>
        <ui:fragment rendered="#{tcr.failure}">
          <p:cell backgroundColor="rgb(255,215,0)" verticalAlignment="top" horizontalAlignment="center">
            <p:paragraph>
              <p:text value="failure" />
            </p:paragraph>
          </p:cell>
        </ui:fragment>
        <ui:fragment rendered="#{tcr.passed}">
          <p:cell backgroundColor="rgb(0,255,0)" verticalAlignment="top" horizontalAlignment="center">
            <p:paragraph>
              <p:text value="passed" />
            </p:paragraph>
          </p:cell>
        </ui:fragment>
        <p:cell verticalAlignment="top" horizontalAlignment="left">
          <p:paragraph>
            <p:text value="#{tcr.description}" />
          </p:paragraph>
        </p:cell>
        <p:cell verticalAlignment="top" horizontalAlignment="left">
          <p:paragraph>
            <p:text value="#{tcr.summary}" />
          </p:paragraph>
        </p:cell>
        <!-- print the test case description below the result row;
             a wider bottom border separates it from the next row -->
        <p:cell colspan="4" verticalAlignment="top" horizontalAlignment="left" borderWidthBottom="1" paddingBottom="3">
          <p:paragraph>
            <p:font size="6" style="bold">
              <p:text value="Testcase description:" />
            </p:font>
            <p:font size="6">
              <p:text value="#{tcr.testCase.description}" />
            </p:font>
          </p:paragraph>
        </p:cell>
      </ui:repeat>
    </p:font>
  </p:table>
</p:document>

