TPC Hints and Tips

Update for V4.2.2


IBM Corporation
Tivoli Storage SWAT team
December 6, 2011

Tivoli Storage Productivity Center


Hints and Tips
Updates included in this version (V4.2.2, December 2011):
1. Tivoli Common Reporting Overview (2.2)
2. LSI SMI-S Provider (4.6)
3. Enabling Statistics Logging for Clariion (4.10.6)
4. Register SRA with a new TPC server (5.3.6)
5. DB2 tuning (5.5.2)
6. DB2 install notes (5.5.6)
7. Updated LDAP hints/tips (5.8.2)
8. Changing the TPC server hostname or IP address (5.22)
9. LSI SMI-S provider 10.x commands (10.7)
10. Miscellaneous corrections and clarifications
Updates included in the previous version (V4.2.1, March 2011):
1. Addition of information for V4.2
2. TPC Installer Overview (2.3)
3. Installing the TPC database in a custom location (2.4.4)
4. IBM N-Series (4.8)
5. EMC 4.1 Clariion SMI Provider Configuration (4.10.5)
6. DB2 Database Restore, Tuning (5.5.1, 5.5.2)
7. TIP DE cleanup steps (5.6.2.1, 5.6.2.2)
8. Backing up TPC (5.7)
9. LDAP Basics (5.8.2)
10. TPC GUI and Java (5.16)
11. Cleaning Up TPC Directories (5.17)
12. Switches on Private Networks (5.18)
13. DS8k User Accounts (5.19)
14. Patching TPC (5.21)
15. SSPC Upgrades and TIP (6.1.1)
16. CISCO switch commands (10.10)
17. Show subsections in table of contents
18. Miscellaneous corrections and clarifications
Eighth Edition (December 2011)
This edition applies to Version 3, Releases 1, 2, and 3, and Version 4, Releases 1 and 2, of IBM Tivoli Storage
Productivity Center (product numbers 5608-VC0, 5608-VC1, 5608-VC3, 5608-VC4, 5608-VC6).
This document was updated on December 6, 2011.
Copyright International Business Machines Corporation 2011. All rights reserved.
Note to U.S. Government Users Restricted Rights -- Use, duplication or disclosure restricted by GSA
ADP Schedule Contract with IBM Corp.


Table of Contents
1 NOTICES.................................................................................................................................................................................9
1.1 LEGAL NOTICE ......................................................................................................................................................................9
1.2 TRADEMARKS......................................................................................................................................................................10
1.3 ACKNOWLEDGEMENT............................................................................................................................................................11
1.3.1 Comments welcome..................................................................................................................................................11
1.4 UPDATES TO TPC HINTS & TIPS..........................................................................................................................................11
1.5 OTHER TPC PUBLICATIONS.................................................................................................................................................. 11
2 INSTALLATION OF TPC...................................................................................................................................................13
2.1 TPC COMPONENT AND PACKAGE OVERVIEW...........................................................................................................................13
2.2 TIVOLI COMMON REPORTING ................................................................................................................................................14
2.3 TPC INSTALLATION PACKAGES AND FIXPACKS........................................................................................................................ 14
2.3.1 Downloading TPC Software.....................................................................................................................................15
2.4 TPC INSTALLER OVERVIEW..................................................................................................................................................17
2.4.1 Install Log Locations................................................................................................................................................17
2.4.2 TPC Installer............................................................................................................................................................17
2.4.2.1 Supported Options.............................................................................................................................................................. 18
2.4.2.2 Log Locations..................................................................................................................................................................... 18
2.4.2.3 Log file descriptions........................................................................................................................................................... 18
2.4.2.4 Debugging Techniques....................................................................................................................................................... 19
2.4.2.5 Known Issues..................................................................................................................................................................... 19
2.4.2.5.1 Disk Spanning (multiple disks).................................................................................................................................. 19
2.4.2.5.2 Log file cleanup on uninstalls..................................................................................................................................... 20
2.4.2.5.3 Local System user created.......................................................................................................................................... 20
2.4.2.5.4 During upgrade versions of failed components are updated in InstallShield ............................................................... 21
2.4.2.5.5 On Linux the installer will not start............................................................................................................................ 22
2.4.3 DbSchema component..............................................................................................22
2.4.3.1 Log Locations..................................................................................................................................................................... 22
2.4.3.2 Log file descriptions........................................................................................................................................................... 22
2.4.3.3 Known Issues..................................................................................................................................................................... 23
2.4.3.3.1 Not sourcing db2profile.............................................................................................................................................. 23
2.4.3.3.2 Wrong DB2 Edition................................................................................................................................................... 23
2.4.4 Data Server Component...........................................................................................23
2.4.4.1 Log locations...................................................................................................................................................................... 23
2.4.4.2 Log file descriptions........................................................................................................................................................... 24
2.4.4.3 Known Issues..................................................................................................................................................................... 24
2.4.4.3.1 TSRMsrv.zip being accessed...................................................................................................................................... 24
2.4.4.3.2 Copy of JRE fails on AIX 6.1..................................................................................................................................... 24
2.4.5 Device Server Component........................................................................................24
2.4.5.1 Log locations...................................................................................................................................................................... 24
2.4.5.2 Log file descriptions........................................................................................................................................................... 25
2.4.5.3 Known Issues..................................................................................................................................................................... 25
2.4.5.3.1 WAS Profile Creation Fails (Corrupt Download Image) ............................................................................................ 25
2.4.5.3.2 Max Heap Space Size................................................................................................................................................. 25
2.4.5.3.3 Device Server will not stop during upgrade................................................................................................................ 26
2.4.6 TIP Installer..............................................................................................27
2.4.6.1 Log locations...................................................................................................................................................................... 27
2.4.6.2 Log file descriptions........................................................................................................................................................... 27
2.4.7 TPC Replication....................................................................................... 27
2.4.7.1 Log locations...................................................................................................................................................................... 27
2.4.7.2 Log file descriptions........................................................................................................................................................... 28
2.4.8 Data Agent................................................................................................28
2.4.8.1 Log locations...................................................................................................................................................................... 28
2.4.8.2 Log file descriptions........................................................................................................................................................... 28
2.4.8.3 Known Issues..................................................................................................................................................................... 28
2.4.8.3.1 Data Agent fails during Common Agent validation.................................................................................................... 28
2.4.9 Fabric Agent.............................................................................................29
2.4.9.1 Log locations...................................................................................................................................................................... 29
2.4.9.2 Log file descriptions........................................................................................................................................................... 29
2.4.9.3 Known Issues..................................................................................................................................................................... 29
2.4.9.3.1 Fabric agent not shutting down during upgrade.......................................................................................................... 29
2.4.10 Storage Resource Agent......................................................................... 29
2.4.10.1 Log Locations................................................................................................................................................................... 29
2.4.10.2 Log file descriptions......................................................................................................................................................... 29
2.4.10.3 Agent Command Line Usage............................................................................................................................................ 30
2.4.10.4 SRA Install Error Codes................................................................................................................................................... 31
2.4.10.5 SRA Uninstall Error Codes.............................................................................................................................................. 33
2.5 INSTALLING TPC.................................................................................................34
2.5.1 Server Configuration................................................................................................................................................34
2.5.2 Userids and Passwords............................................................................................................................................ 34
2.5.3 Installation Tasks......................................................................................................................................................34
2.5.4 Installing the TPC database in a custom location....................................................................................................37
2.6 UPGRADING TPC ............................................................................................................................................................... 38
2.6.1 Backup Your Existing Environment..........................................................................................................................38
2.6.2 Temporarily Change DB2 Logging To Circular......................................................................................................38
2.6.3 SCHEDULED_UPGRADES File.............................................................................................................................38
2.6.4 Stop TPC Workload..................................................................................................................................................38
3 TPC DATA SOURCE OVERVIEW................................................................................................................................... 40
3.1 DATA SOURCES USED IN TPC...............................................................................................................................................40
3.1.1 SMI-S Providers (CIM Agents)..................................................................................................40
3.1.2 Data Agents..............................................................................................................................................................41
3.1.3 Storage Resource Agents..........................................................................................................................................42
3.1.4 Inband Fabric agents............................................................................................................................................... 42
3.1.5 Out of Band Fabric (OOBF) Agents........................................................................................................................ 43
3.1.6 Cisco SAN environments.......................................................................................................................................... 43
3.1.6.1 Configuring Cisco MDS9000 switches for Out-of-band communication ........................................................................... 43
3.1.7 TPC Server data sources..........................................................................................45
3.1.8 VMware VI data sources..........................................................................................................................................45
3.2 WHAT SOURCES DO I NEED FOR WHAT FUNCTIONS?...................................................................................................................45
3.2.1 Data Manager - Server Management.......................................................................................................................45
3.2.2 Data Manager - Storage Subsystem Asset Management.........................................................................................45
3.2.3 Data Manager for Databases - Database Management..........................................................................................45
3.2.4 Disk Manager - Reports and Provisioning of LUNs................................................................................................45
3.2.5 Disk Manager - Storage Subsystem Performance management.............................................................................. 46
3.2.6 Fabric Manager - Reports and Fabric Zoning........................................................................................................47
3.2.7 Fabric Manager - Switch Performance Monitoring................................................................................................48
3.2.8 Tape Manager ........................................................................................................................................................ 49
3.2.9 Replication Manager................................................................................................................................................49
4 SMI-S PROVIDERS INSTALLATION AND CONFIGURATION................................................................................50
4.1 SOME SMI-S TERMINOLOGY................................................................................................................................................50
4.2 GENERAL CONFIGURATION GUIDELINES...................................................................................................................................51
4.2.1 Namespaces for CIM agents ....................................................................................................................................52
4.3 PERFORMANCE DATA COLLECTION USING SMI-S CIM AGENTS................................................................................................53
4.4 IBM DS OPEN API CIM AGENT.........................................................................................................................................54
4.4.1 DS Open API CIM Agent V5.2/V5.3/V5.4.0.............................................................................................................54
4.4.1.1 Enabling the DS Open API V5.2/V5.3/V5.4.0 CIM Agent on the DS8000 HMC.............................................................. 54
4.4.1.2 Installing the DS Open API V5.2/V5.3/V5.4.0 CIM Agent as a proxy agent ..................................................................... 55
4.4.1.3 Setting up the dscimcli utility............................................................................................................................................. 56
4.4.1.4 Configuring the DS Open API V5.2/V5.3/V5.4.0 CIM agent ............................................................................................. 57
4.4.2 DS Open API CIM Agent V5.4.1..............................................................................58
4.4.3 Configure TPC for the DS Open API CIM Agent.....................................................................................................58
4.4.4 Scalability / Best Practices Guidelines for the DS Open API..................................................................................59
4.5 IBM XIV..........................................................................................................................................................................60
4.5.1 Understanding how authentication works in XIV CIM 10.1....................................................................................60
4.5.2 Steps to configure XIV in TPC..................................................................................................................................60
4.6 LSI SMI-S PROVIDER FOR DS4000 DEVICES........................................................................................................................62
4.6.1 Installing the LSI SMI-S Provider............................................................................................................................62
4.6.2 Modifying the LSI SMI-S Provider CIM Agent configuration..................................................................................63
4.6.2.1 Adding or Removing a device for the CIM agent ............................................................................................................... 63
4.6.2.2 Changing the CIM agent HTTP and HTTPS port............................................................................................................... 64
4.6.2.3 Configuring the CIM Agent on machines with multiple IP addresses................................................................................. 64
4.6.2.4 Enabling authorization for the CIM agent.......................................................................................................................... 64
4.6.3 Configure TPC for the LSI SMI-S provider..............................................................65
4.7 SAN VOLUME CONTROLLER SMI-S CIM AGENT CONFIGURATION ..........................................................................................66
4.7.1 Create a userid for TPC...........................................................................................................................................66
4.7.2 Verify that the SVC CIM agent is running................................................................................................................66
4.7.3 Configure TPC for the SVC CIM Agent...................................................................................................................66
4.7.4 Memory considerations for the SVC 3.1 and 4.1 CIM Agents ................................................................................ 67
4.7.5 SVC embedded CIMOM...........................................................................................................................................67
4.7.6 Disable TPC services when not using TPC on SSPC...............................................................................................68
4.8 IBM N-SERIES...................................................................................................................................................................68
4.9 IBM TAPE LIBRARIES..........................................................................................................................................................73
4.9.1 Tape CIM agents...................................................................................................................................................... 73
4.9.2 Are there any resource issues to be aware of when contemplating a large environment? ......................................73
4.9.3 TPC Supported Tape Libraries................................................................................................................................ 73
4.10 EMC SMI-S PROVIDER CONFIGURATION............................................................................................................................74
4.10.1 Install the EMC SMI-S Provider............................................................................................................................ 74
4.10.2 Configure the EMC SMI-S Provider for Symmetrix devices..................................................................................75
4.10.3 Configure the EMC SMI-S Provider for Clariion devices..................................................................................... 77
4.10.4 Configure TPC for the EMC SMI-S Provider........................................................................................................ 77
4.10.5 EMC 4.1 Clariion SMI Provider Configuration.....................................................................................................78
4.10.6 Enabling Statistics Logging in Clariion.................................................................................................................79
4.11 HDS HICOMMAND SMI-S PROVIDER.................................................................................................................................80
4.11.1 Installing HiCommand Device Manager................................................................................................................80
4.11.2 Configuring HiCommand Device Manager........................................................................................................... 80
4.11.2.1 Add a Device Manager license key................................................................................................................................... 80
4.11.2.2 Change the dispatcher configuration ................................................................................................................................. 81
4.11.2.3 Add Subsystems to Device Manager ................................................................................................................................ 81
4.11.2.4 Enable the SMI-S CIM Agent.......................................................................................................................................... 81
4.11.3 Configure TPC for the HiCommand Device Manager CIM Agent........................................................................81
4.11.4 Adding a HiCommand User Admin for TPC..........................................................................................................82
4.11.5 Problems with HiCommand in TPC.......................................................................................................................88
4.11.6 HDS TagmaStore Universal Storage Platform (USP) Virtualization in TPC........................................................89
4.12 BROCADE SMI-S AGENT...................................................................................................................................................90
4.12.1 Installing Brocade SMI-S Agent.............................................................................................................................90
4.12.2 Configure the Brocade SMI-S Agent......................................................................................................................91
4.12.3 Configure TPC for the Brocade SMI-S Agent........................................................................................................91
4.12.4 Memory and Scalability Considerations for Brocade SMI-S Agent.......................................................................92
4.13 MCDATA OPENCONNECTORS SMI-S INTERFACE..........................................................................................................94
4.13.1 Installing McDATA SMI-S Interface CIM Agent...................................................................................................94
4.13.2 Configuration for Direct Connection method....................................................................................................... 95
4.13.3 Configure TPC for the McDATA SMI-S Provider..................................................................................................96
4.14 CISCO SAN-OS CIM SERVER............................................................................................................................................97
4.14.1 Enable and Configure the Cisco SAN-OS CIM Server.......................................................................................... 97
4.14.2 Configure TPC for the Cisco SAN-OS CIM server................................................................................................97
5 TPC HINTS AND TIPS........................................................................................................................................................98
5.1 TPC GOODIES....................................................................................................................................................................98
5.1.1 Topology Viewer Tip................................................................................................................................................ 98
5.1.1.1 Alt key/mouse button 1 navigation..................................................................................................................................... 98
5.1.1.2 Mouse Wheel navigation.................................................................................................................................................... 98
5.1.1.3 Mouse click on a device entry............................................................................................................................................ 98
5.1.1.4 San Planner: Planning a path for a DS6000........................................................................................................................ 98
5.2 CREATING A MASTER IMAGE TO CLONE TPC AGENT MACHINES..................................................................................99
5.3 COMMON AGENT TIPS........................................................................................................................................................101
5.3.1 Check /etc/hosts file for valid localhost entry........................................................................................................ 101
5.3.2 Stopping and restarting a Common Agent............................................................................................................. 101
5.3.3 Force the Common Agent to uninstall....................................................................................................................102
5.3.4 Cleaning up Common Agent residue......................................................................................................................102
5.3.5 Associate a Common Agent with a new TPC Server..............................................................................................102
5.3.6 Associate a Storage Resource Agent with a new TPC Server................................................................................103
5.4 TPC SUBAGENT TIPS.......................................................................................................................................................... 104
5.4.1 Restarting a stopped or failed Inband Fabric sub agent........................................................................................104
5.4.2 Restarting a stopped or failed Data Agent subagent..............................................................................................105
5.4.3 Stopping and restarting multiple TPC data agents ...............................................................................................105
5.4.4 Check the status of a Fabric subagent................................................................................................................... 109
5.4.5 Forcing a TPC Agent to use a particular IP address ............................................................................................110
5.4.6 How to exclude devices from Fabric agent scans .................................................................................................111
5.5 DB2 MAINTENANCE, TUNING AND CONFIGURATION FOR TPC...................................................................................................112
5.5.1 DB2 Maintenance Steps......................................................................................................................................... 112
5.5.1.1 Tip - Safely Backing Up DB2 On Windows .................................................................... 114

5.5.2 DB2 Performance Tuning.......................................................................................................................................115


5.5.3 Increasing the licensed processor limit for DB2...................................................................................................119
5.5.4 Checking the DB2 Listener port.............................................................................................................................119
5.5.5 Removing DB2 from AIX........................................................................................................................................121
5.5.6 Notes on Installing DB2......................................................................................................................................... 122
5.6 UNINSTALLING TPC.......................................................................................................................................................... 123
5.6.1 Silent TPC Agent Uninstall.................................................................................................................................... 123
5.6.2 Cleaning Up A Bad TPC Install.............................................................................................................................123
5.6.2.1 Cleaning Up TPC Installs On Windows ........................................................................................................................... 123
5.6.2.2 Cleaning Up TPC Installs On Unix.................................................................................................................................. 125

5.7 BACKING UP TPC ............................................................................................................................................................127


5.8 TPC AND WINDOWS DOMAIN ACCOUNTS.............................................................................................................................129
5.8.1 How TPC Login Authentication Works In TPC 4.x................................................................................................129
5.8.2 LDAP Basics...........................................................................................................................................................129
5.8.2.1 A Note about Windows Active Directory......................................................................................................................... 129
5.8.2.2 Understanding LDAP directory entries ............................................................................................................................ 130
5.8.2.3 Basic LDAP Terminology for TPC.................................................................................................................................. 130
5.8.2.4 Configuring TPC for LDAP - Before you start ............................................................... 130
5.8.2.5 LDAP Issues, Tips and Solutions .................................................................................................................................... 131
5.8.2.6 Stopping and starting services after a configuration change ............................................................................................. 132
5.8.2.7 LDAP Configuration Files................................................................................................................................................ 133

5.9 COMMON AGENT SERVICE AND THE WINDOWS LOCALSYSTEM ACCOUNT..................................................................................134


5.9.1 Limitations.............................................................................................................................................................134
5.10 SCHEDULING TPC WORKLOAD.........................................................................................................................................135
5.11 DATA PATH EXPLORER TIPS..............................................................................................................................................136
5.12 TIP TIPS (TIVOLI INTEGRATED PORTAL)............................................................................................................................136
5.13 NETAPP/NAS/NETWARE..................................................................................................................................................137
5.13.1 Netapp/NAS - Standard Configuration Steps.......................................................................................................137
5.13.2 Discovery - Automatic or Manual....................................................................... 137
5.13.3 Common Problems and Solutions 1..................................................................................................................... 137
5.13.3.1 The customer will not expose a root level userid for NAS. ........................................................................................ 137
5.13.3.2 Huge Netapp/NAS devices are discovered. ................................................................................................................... 138
5.13.3.3 The NAS device does not uniquely identify itself. ......................................................................................................... 138

5.13.4 Netware - Standard Configuration Steps.............................................................138


5.13.5 Common Problems and Solutions 2..................................................................................................................... 139
5.13.5.1 Customer has multiple NDS Trees.................................................................................................................................. 139
5.13.5.2 Customer is unwilling to share NDS Administrator ID ................................................................................................... 139
5.13.5.3 Customer uses only one Agent to Scan too many Netware Servers ................................................................................ 139

5.13.6 Scan/Probe Agent Administration........................................................................................................................139


5.13.7 Remote Scanning..................................................................................................................................................139
5.13.7.1 Remote Scanning - Windows......................................................................................... 139
5.13.7.2 Remote Scanning - Unix................................................................................................ 140
5.13.7.3 Remote Scanning - snmputil.......................................................................................... 140

5.14 NOTE ON CONFIGURING TPC FOR BATCH REPORTS ON UNIX OR LINUX................................................................................141


5.15 UNIX OPEN FILE LIMITS IN LARGE TPC ENVIRONMENTS.....................................................................................................141
5.16 THE TPC GUI AND JAVA................................................................................................................................................141
5.16.1 Where to find the Java bundled with TPC............................................................................................................142
5.16.2 Changing the program association for Java Web Start (JNLP).......................................................................... 142

5.17 CLEANING UP TPC DIRECTORIES......................................................................................................................................142
5.18 PLANNING FOR PRIVATE SWITCH NETWORKS.......................................................................................................................143
5.19 DS8K USER ACCOUNTS..................................................................................................................................................144
5.20 AGENT REGISTRATION PROBLEMS (SRV0042E/53E, BTC4045E, TIVOLI GUID)..............................................................144
5.20.1 More on "database error occurred during agent registration" errors.................................145
5.21 APPLYING TPC PATCHES.................................................................................................................................................146
5.22 CHANGING THE TPC SERVER HOSTNAME OR IP ADDRESS.......................................................................................................146
6 SYSTEM STORAGE PRODUCTIVITY CENTER (SSPC)...........................................................................................149
6.1 COMMON SSPC ISSUES......................................................................................................................................................149
6.1.1 Prerequisite step required when upgrading SSPC based on TPC 4.x................................................................... 149
6.1.2 Apply TPC Standard Edition License.....................................................................................................................149
6.1.3 Agent Manager Registration.................................................................................................................................. 150
6.1.4 Remote Agent Installation...................................................................................................................................... 150
6.1.5 CIMOM Installation On SSPC/TPC Server...........................................................................................................150
6.1.6 Running dscimcli.bat..............................................................................................................................................150
6.1.7 Configuring the DS8000 to the HMC CIMOM...................................................................................................... 151
6.1.7.1 Step 1 - Create new user account on HMC using dscli..................................................................................................... 151
6.1.7.2 Step 1 - Create new user account on HMC using DS8000 Storage Manager .................................................................... 151
6.1.7.3 Step 2 - Configure the device to the CIMOM ................................................................................................................... 152

6.1.8 Isolating DS8000 Performance Monitor Problems................................................................................................152


6.1.9 Disable TPC services when not using TPC on SSPC.............................................................................................152
6.2 SSPC REFERENCES...........................................................................................................................................................152
7 TPC REPLICATION MANAGER....................................................................................................................................153
7.1 TPC FOR REPLICATION AND DEVICE IP PORTS.......................................................................153
7.2 SHUTTING DOWN TPC FOR REPLICATION............................................................................. 153
7.3 TPC FOR REPLICATION TIPS/RECOMMENDATIONS:...................................................................154
7.4 TPC FOR REPLICATION REFERENCES....................................................................................155

8 APPENDIX A.......................................................................................................................................................................156
8.1 AGENTRESTART.BAT LISTING...............................................................................................................................................156
9 APPENDIX B - TPC AGENT INSTALLATION TIPS...................................................................................................158
10 APPENDIX C - TPC COMMAND REFERENCE........................................................................................................160
10.1 STARTING TPC .............................................................................................................................................................160
10.2 START/STOP TPC SERVICES............................................................................................................................................160
10.3 AGENT MANAGER AND COMMON AGENT COMMANDS..........................................................................................................161
10.4 DEVICE SERVER - TPCTOOL COMMANDS.......................................................................................................................162
10.5 DEVICE SERVER - SRMCP COMMANDS...........................................................................162
10.6 DB2 COMMANDS...........................................................................................................................................................162
10.7 CIMOM COMMANDS.....................................................................................................................................................163
10.8 REPLICATION MANAGER COMMANDS.................................................................................................................................164
10.9 SUPPORT/SERVICE DATA COLLECTION COMMANDS..............................................................................................................165
10.10 MISCELLANEOUS COMMANDS..........................................................................................................................................165
11 APPENDIX D - DOWN LEVEL ITEMS STILL SUPPORTED IN TPC...................................................167
11.1 IBM TOTALSTORAGE DS OPEN API V5.1......................................................................................................................167
11.1.1 IBM TotalStorage Enterprise Storage Server (ESS) command-line interface (CLI) .................................167
11.1.2 Installing the IBM DS Open API CIM Agent V5.1...............................................................................................168
11.1.3 Configuring the IBM DS Open API CIM Agent V5.1...........................................................................................168
11.1.3.1 Add a unique userid and password................................................................................................................................. 169
11.1.3.2 Add storage devices to be managed................................................................................................................................ 169
11.1.3.3 Restart the CIM Agent V5.1........................................................................................................................................... 170
11.1.3.4 Verify that the CIM Agent can communicate with the storage devices ........................................................................... 170
11.1.3.5 Additional information for IBM ESS devices and the V5.1 CIM agent .......................................................................... 171

12 APPENDIX E - LINKS....................................................................................................173

1 Notices
1.1 Legal Notice
This information was developed for products and services offered in the U.S.A. IBM may not offer the
products, services, or features discussed in this document in other countries. Consult your local IBM
representative for information on the products and services currently available in your area. Any
reference to an IBM product, program, or service is not intended to state or imply that only that IBM
product, program, or service may be used. Any functionally equivalent product, program, or service that
does not infringe any IBM intellectual property right may be used instead. However, it is the user's
responsibility to evaluate and verify the operation of any non-IBM product, program, or service.
IBM may have patents or pending patent applications covering subject matter described in this
document. The furnishing of this document does not give you any license to these patents. You can send
license inquiries, in writing, to:
IBM Director of Licensing, IBM Corporation, North Castle Drive Armonk, NY 10504-1785 U.S.A.
The following paragraph does not apply to the United Kingdom or any other country where such
provisions are inconsistent with local law: INTERNATIONAL BUSINESS MACHINES
CORPORATION PROVIDES THIS PUBLICATION "AS IS" WITHOUT WARRANTY OF ANY
KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A
PARTICULAR PURPOSE. Some states do not allow disclaimer of express or implied warranties in
certain transactions, therefore, this statement may not apply to you.
This information could include technical inaccuracies or typographical errors. Changes are periodically
made to the information herein; these changes will be incorporated in new editions of the publication.
IBM may make improvements and/or changes in the product(s) and/or the program(s) described in this
publication at any time without notice.
Any references in this information to non-IBM Web sites are provided for convenience only and do not
in any manner serve as an endorsement of those Web sites. The materials at those Web sites are not part
of the materials for this IBM product and use of those Web sites is at your own risk.
IBM may use or distribute any of the information you supply in any way it believes appropriate without
incurring any obligation to you.
Information concerning non-IBM products was obtained from the suppliers of those products, their
published announcements or other publicly available sources. IBM has not tested those products and
cannot confirm the accuracy of performance, compatibility or any other claims related to non-IBM
products. Questions on the capabilities of non-IBM products should be addressed to the suppliers of
those products.
This information contains examples of data and reports used in daily business operations. To illustrate
them as completely as possible, the examples include the names of individuals, companies, brands, and
products. All of these names are fictitious and any similarity to the names and addresses used by an
actual business enterprise is entirely coincidental.

COPYRIGHT LICENSE:
This information contains sample application programs in source language, which illustrate
programming techniques on various operating platforms. You may copy, modify, and distribute these
sample programs in any form without payment to IBM, for the purposes of developing, using, marketing
or distributing application programs conforming to the application programming interface for the
operating platform for which the sample programs are written. These examples have not been
thoroughly tested under all conditions. IBM, therefore, cannot guarantee or imply reliability,
serviceability, or function of these programs. You may copy, modify, and distribute these sample
programs in any form without payment to IBM for the purposes of developing, using, marketing, or
distributing application programs conforming to IBM's application programming interfaces.

1.2 Trademarks
The following terms are trademarks or registered trademarks of the International Business Machines
Corporation in the United States or other countries or both:
AIX
iSeries
Tivoli
DB2
Passport Advantage
TotalStorage
DS4000, DS6000, DS8000
pSeries
WebSphere
Enterprise Storage Server
zSeries
Redbooks (logo)
server
Redbooks
The following terms are trademarks or registered trademarks of other companies:
Microsoft, Windows, Windows XP and the Windows logo are trademarks of Microsoft Corporation in
the United States, other countries, or both.
UNIX is a registered trademark of the Open Group in the United States and other countries.
Java, Solaris, and all Java-based trademarks and logos are trademarks of Sun Microsystems, Inc. in the
United States, other countries, or both.
Intel is a registered trademark of the Intel Corporation or its subsidiaries in the United States and other
countries.
Linux is a trademark of Linus Torvalds in the United States, other countries, or both.
CLARiiON and Symmetrix are registered trademarks of the EMC Corporation.
HiCommand is a registered trademark of Hitachi Data Systems Corporation.
Brocade and the Brocade logo are trademarks or registered trademarks of Brocade Communications
Systems, Inc., in the United States and/or in other countries.
McDATA and Intrepid are registered trademarks of McDATA Corporation.
Cisco is a registered trademark of Cisco Systems, Inc. and/or its affiliates in the U.S. and certain other
countries.
Engenio and the Engenio logo are trademarks or registered trademarks of LSI Logic Corporation.

Other company, product, or service names may be trademarks or service marks of others.

1.3 Acknowledgement
The materials in this document have been collected from work in the TPC development lab and
other labs within IBM, from problems discovered in the field at customer locations, and from
contributions by people who have discovered a problem and documented the solution.
Many people have helped with the materials included in this document, too many to acknowledge
properly here. This document is a source of information for advanced configuration and for
resolving some of the more common problems encountered with TPC V3 (3.1, 3.2, 3.3.0, 3.3.1,
3.3.2) and V4 (4.1, 4.1.1, 4.2, 4.2.1, 4.2.2).
The original TPC Hints & Tips document was created and published by IBM Tivoli SWAT Team
member Doug Dunham.

1.3.1 Comments welcome


Your comments are welcome.
1 NOTICES...................................................................................................................................................................9
1.1 LEGAL NOTICE........................................................................................................................................................9
1.2 TRADEMARKS........................................................................................................................................................10
1.3 ACKNOWLEDGEMENT..............................................................................................................................................11
1.3.1 Comments welcome...................................................................................................................................11
1.4 UPDATES TO TPC HINTS & TIPS............................................................................................................................17
1.5 OTHER TPC PUBLICATIONS....................................................................................................................................18
2 INSTALLATION OF TPC.....................................................................................................................................19
2.1 TPC COMPONENT AND PACKAGE OVERVIEW.............................................................................................................19
2.2 TIVOLI COMMON REPORTING..................................................................................................................................20
2.3 TPC INSTALLATION PACKAGES AND FIXPACKS..........................................................................................................20
2.3.1 Downloading TPC Software......................................................................................................................21
2.4 TPC INSTALLER OVERVIEW....................................................................................................................................23
2.4.1 Install Log Locations.................................................................................................................................23
2.4.2 TPC Installer.............................................................................................................................................23
2.4.2.1 Supported Options...............................................................................................................................................24
2.4.2.2 Log Locations......................................................................................................................................................24
2.4.2.3 Log file descriptions............................................................................................................................................24
2.4.2.4 Debugging Techniques........................................................................................................................................25
2.4.2.5 Known Issues......................................................................................................................................................25
2.4.2.5.1 Disk Spanning (multiple disks)...................................................................................................................25
2.4.2.5.2 Log file cleanup on uninstalls......................................................................................................................26
2.4.2.5.3 Local System user created...........................................................................................................................26
2.4.2.5.4 During upgrade versions of failed components are updated in InstallShield................................................27
2.4.2.5.5 On Linux the installer will not start.............................................................................................................28
2.4.3 DbSchema component...............................................................................................................................28
2.4.3.1 Log Locations......................................................................................................................................................28
2.4.3.2 Log file descriptions............................................................................................................................................28
2.4.3.3 Known Issues......................................................................................................................................................29
2.4.3.3.1 Not sourcing db2profile...............................................................................................................................29
2.4.3.3.2 Wrong DB2 Edition....................................................................................................................................29
2.4.4 Data Server Component............................................................................................................................29
2.4.4.1 Log locations.......................................................................................................................................................29
2.4.4.2 Log file descriptions............................................................................................................................................30
2.4.4.3 Known Issues......................................................................................................................................................30
2.4.4.3.1 TSRMsrv.zip being accessed.......................................................................................................................30
2.4.4.3.2 Copy of JRE fails on AIX 6.1......................................................................................................................30
2.4.5 Device Server Component.........................................................................................................................30
2.4.5.1 Log locations.......................................................................................................................................................30
2.4.5.2 Log file descriptions............................................................................................................................................31
2.4.5.3 Known Issues......................................................................................................................................................31
2.4.5.3.1 WAS Profile Creation Fails (Corrupt Download Image).............................................................................31
2.4.5.3.2 Max Heap Space Size..................................................................................................................................31
2.4.5.3.3 Device Server will not stop during upgrade.................................................................................................32
2.4.6 TIP Installer...............................................................................................................................................33
2.4.6.1 Log locations.......................................................................................................................................................33
2.4.6.2 Log file descriptions............................................................................................................................................33
2.4.7 TPC Replication........................................................................................................................................33
2.4.7.1 Log locations.......................................................................................................................................................33
2.4.7.2 Log file descriptions............................................................................................................................................34
2.4.8 Data Agent.................................................................................................................................................34
2.4.8.1 Log locations.......................................................................................................................................................34
2.4.8.2 Log file descriptions............................................................................................................................................34
2.4.8.3 Known Issues......................................................................................................................................................34
2.4.8.3.1 Data Agent fails during Common Agent validation.....................................................................................34
2.4.9 Fabric Agent..............................................................................................................................................35
2.4.9.1 Log locations.......................................................................................................................................................35
2.4.9.2 Log file descriptions............................................................................................................................................35
2.4.9.3 Known Issues......................................................................................................................................................35
2.4.9.3.1 Fabric agent not shutting down during upgrade...........................................................................................35
2.4.10 Storage Resource Agent..........................................................................................................................35
2.4.10.1 Log Locations....................................................................................................................................................35
2.4.10.2 Log file descriptions..........................................................................................................................................35
2.4.10.3 Agent Command Line Usage.............................................................................................................................36
2.4.10.4 SRA Install Error Codes....................................................................................................................................37
2.4.10.5 SRA Uninstall Error Codes...............................................................................................................................39
2.5 INSTALLING TPC...................................................................................................................................................40
2.5.1 Server Configuration.................................................................................................................................40
2.5.2 Userids and Passwords.............................................................................................................................40
2.5.3 Installation Tasks.......................................................................................................................................40
2.5.4 Installing the TPC database in a custom location.....................................................................................43
2.6 UPGRADING TPC..................................................................................................................................................44
2.6.1 Backup Your Existing Environment...........................................................................................................44
2.6.2 Temporarily Change DB2 Logging To Circular.......................................................................................44
2.6.3 SCHEDULED_UPGRADES File..............................................................................................................44
2.6.4 Stop TPC Workload...................................................................................................................................44
3 TPC DATA SOURCE OVERVIEW.....................................................................................................................46
3.1 DATA SOURCES USED IN TPC.................................................................................................................................46
3.1.1 SMI-S Providers (CIM Agents)..................................................................................................................46
3.1.2 Data Agents...............................................................................................................................................47
3.1.3 Storage Resource Agents...........................................................................................................................48
3.1.4 Inband Fabric agents................................................................................................................................48
3.1.5 Out of Band Fabric (OOBF) Agents.........................................................................................................49
3.1.6 Cisco SAN environments...........................................................................................................................49
3.1.6.1 Configuring Cisco MDS9000 switches for Out-of-band communication.............................................................49
3.1.7 TPC Server data sources...........................................................................................................................51
3.1.8 VMware VI data sources...........................................................................................................................51
3.2 WHAT SOURCES DO I NEED FOR WHAT FUNCTIONS?..................................................................................................51
3.2.1 Data Manager - Server Management........................................................................................................51
3.2.2 Data Manager - Storage Subsystem Asset Management...........................................................................51
3.2.3 Data Manager for Databases - Database Management............................................................................51
3.2.4 Disk Manager - Reports and Provisioning of LUNs.................................................................................51
3.2.5 Disk Manager - Storage Subsystem Performance management................................................................52
3.2.6 Fabric Manager - Reports and Fabric Zoning..........................................................................................53
3.2.7 Fabric Manager - Switch Performance Monitoring..................................................................................54
3.2.8 Tape Manager............................................................................................................................................55
3.2.9 Replication Manager.................................................................................................................................55
4 SMI-S PROVIDERS INSTALLATION AND CONFIGURATION..................................................................56
4.1 SOME SMI-S TERMINOLOGY..................................................................................................................................56
4.2 GENERAL CONFIGURATION GUIDELINES....................................................................................................................57
4.2.1 Namespaces for CIM agents......................................................................................................................58
4.3 PERFORMANCE DATA COLLECTION USING SMI-S CIM AGENTS..................................................................................59
4.4 IBM DS OPEN API CIM AGENT...........................................................................................................................60
4.4.1 DS Open API CIM Agent V5.2/V5.3/V5.4.0..............................................................................................60
4.4.1.1 Enabling the DS Open API V5.2/V5.3/V5.4.0 CIM Agent on the DS8000 HMC...............................................60
4.4.1.2 Installing the DS Open API V5.2/V5.3/V5.4.0 CIM Agent as a proxy agent......................................................61
4.4.1.3 Setting up the dscimcli utility..............................................................................................................................62
4.4.1.4 Configuring the DS Open API V5.2/V5.3/V5.4.0 CIM agent.............................................................................63
4.4.2 DS Open API CIM Agent V5.4.1...............................................................................................................64
4.4.3 Configure TPC for the DS Open API CIM Agent......................................................................................64
4.4.4 Scalability / Best Practices Guidelines for the DS Open API....................................................................65
4.5 IBM XIV.............................................................................................................................................................66
4.5.1 Understanding how authentication works in XIV CIM 10.1.....................................................................66
4.5.2 Steps to configure XIV in TPC...................................................................................................................66
4.6 LSI SMI-S PROVIDER FOR DS4000 DEVICES..........................................................................................................68
4.6.1 Installing the LSI SMI-S Provider.............................................................................................................68
4.6.2 Modifying the LSI SMI-S Provider CIM Agent configuration...................................................................69
4.6.2.1 Adding or Removing a device for the CIM agent................................................................................................69
4.6.2.2 Changing the CIM agent HTTP and HTTPS port................................................................................................70
4.6.2.3 Configuring the CIM Agent on machines with multiple IP addresses.................................................................70
4.6.2.4 Enabling authorization for the CIM agent...........................................................................................................70
4.6.3 Configure TPC for the LSI SMI-S provider...............................................................................................71
4.7 SAN VOLUME CONTROLLER SMI-S CIM AGENT CONFIGURATION............................................................................72
4.7.1 Create a userid for TPC............................................................................................................................72
4.7.2 Verify that the SVC CIM agent is running.................................................................................................72
4.7.3 Configure TPC for the SVC CIM Agent....................................................................................................72
4.7.4 Memory considerations for the SVC 3.1 and 4.1 CIM Agents...................................................................73
4.7.5 SVC embedded CIMOM............................................................................................................................73
4.7.6 Disable TPC services when not using TPC on SSPC................................................................................74
4.8 IBM N-SERIES.....................................................................................................................................................74
4.9 IBM TAPE LIBRARIES............................................................................................................................................79
4.9.1 Tape CIM agents.......................................................................................................................................79
4.9.2 Are there any resource issues to be aware of when contemplating a large environment?........................79
4.9.3 TPC Supported Tape Libraries..................................................................................................................79
4.10 EMC SMI-S PROVIDER CONFIGURATION..............................................................................................................80
4.10.1 Install the EMC SMI-S Provider.............................................................................................................80
4.10.2 Configure the EMC SMI-S Provider for Symmetrix devices...................................................................81
4.10.3 Configure the EMC SMI-S Provider for Clariion devices.......................................................................83
4.10.4 Configure TPC for the EMC SMI-S Provider..........................................................................................83
4.10.5 EMC 4.1 Clariion SMI Provider Configuration......................................................................................84
4.10.6 Enabling Statistics Logging in Clariion..................................................................................................85
4.11 HDS HICOMMAND SMI-S PROVIDER...................................................................................................................86
4.11.1 Installing HiCommand Device Manager.................................................................................................86
4.11.2 Configuring HiCommand Device Manager............................................................................................86
4.11.2.1 Add a Device Manager license key....................................................................................................................86
4.11.2.2 Change the dispatcher configuration..................................................................................................................87
4.11.2.3 Add Subsystems to Device Manager.................................................................................................................87
4.11.2.4 Enable the SMI-S CIM Agent...........................................................................................................................87
4.11.3 Configure TPC for the HiCommand Device Manager CIM Agent.........................................................87
4.11.4 Adding a HiCommand User Admin for TPC...........................................................................................88
4.11.5 Problems with HiCommand in TPC........................................................................................................94
4.11.6 HDS TagmaStore Universal Storage Platform (USP) Virtualization in TPC.........................................95
4.12 BROCADE SMI-S AGENT.....................................................................................................................................96
4.12.1 Installing Brocade SMI-S Agent..............................................................................................................96
4.12.2 Configure the Brocade SMI-S Agent.......................................................................................................97
4.12.3 Configure TPC for the Brocade SMI-S Agent.........................................................................................97
4.12.4 Memory and Scalability Considerations for Brocade SMI-S Agent........................................................98
4.13 MCDATA OPENCONNECTORS SMI-S INTERFACE.................................................................................................100
4.13.1 Installing McDATA SMI-S Interface CIM Agent...................................................................................100
4.13.2 Configuration for Direct Connection method........................................................................................101
4.13.3 Configure TPC for the McDATA SMI-S Provider.................................................................................102
4.14 CISCO SAN-OS CIM SERVER............................................................................................................................103
4.14.1 Enable and Configure the Cisco SAN-OS CIM Server..........................................................................103
4.14.2 Configure TPC for the Cisco SAN-OS CIM server...............................................................................103
5 TPC HINTS AND TIPS........................................................................................................................................104
5.1 TPC GOODIES....................................................................................................................................................104
5.1.1 Topology Viewer Tip................................................................................................................................104
5.1.1.1 Alt key/mouse button 1 navigation....................................................................................................................104
5.1.1.2 Mouse Wheel navigation...................................................................................................................................104
5.1.1.3 Mouse click on a device entry...........................................................................................................................104
5.1.1.4 San Planner: Planning a path for a DS6000.......................................................................................................104
5.2 CREATING A MASTER IMAGE TO CLONE TPC AGENT MACHINES................................................................................105
5.3 COMMON AGENT TIPS.........................................................................................................................................107
5.3.1 Check /etc/hosts file for valid localhost entry.........................................................................................107
5.3.2 Stopping and restarting a Common Agent..............................................................................................107
5.3.3 Force the Common Agent to uninstall.....................................................................................................108
5.3.4 Cleaning up Common Agent residue.......................................................................................................108
5.3.5 Associate a Common Agent with a new TPC Server...............................................................................108
5.3.6 Associate a Storage Resource Agent with a new TPC Server.................................................................109
5.4 TPC SUBAGENT TIPS...........................................................................................................................................110
5.4.1 Restarting a stopped or failed Inband Fabric sub agent.........................................................................110
5.4.2 Restarting a stopped or failed Data Agent subagent...............................................................................111
5.4.3 Stopping and restarting multiple TPC data agents.................................................................................111
5.4.4 Check the status of a Fabric subagent....................................................................................................115
5.4.5 Forcing a TPC Agent to use a particular IP address..............................................................................116
5.4.6 How to exclude devices from Fabric agent scans...................................................................................117
5.5 DB2 MAINTENANCE, TUNING AND CONFIGURATION FOR TPC..................................................................................118
5.5.1 DB2 Maintenance Steps..........................................................................................................................118
5.5.1.1 Tip - Safely Backing Up DB2 On Windows......................................................................................................120
5.5.2 DB2 Performance Tuning........................................................................................................................121
5.5.3 Increasing the licensed processor limit for DB2.....................................................................................125
5.5.4 Checking the DB2 Listener port..............................................................................................................125
5.5.5 Removing DB2 from AIX.........................................................................................................................127
5.5.6 Notes on Installing DB2..........................................................................................................................128
5.6 UNINSTALLING TPC............................................................................................................................................129
5.6.1 Silent TPC Agent Uninstall.....................................................................................................................129
5.6.2 Cleaning Up A Bad TPC Install..............................................................................................................129
5.6.2.1 Cleaning Up TPC Installs On Windows............................................................................................................129
5.6.2.2 Cleaning Up TPC Installs On Unix...................................................................................................................131
5.7 BACKING UP TPC...............................................................................................................................................133
5.8 TPC AND WINDOWS DOMAIN ACCOUNTS..............................................................................................................135
5.8.1 How TPC Login Authentication Works In TPC 4.x.................................................................................135
5.8.2 LDAP Basics............................................................................................................................................135
5.8.2.1 A Note about Windows Active Directory..........................................................................................................135
5.8.2.2Understanding LDAP directory entries .............................................................................................................................136
5.8.2.3Basic LDAP Terminology for TPC...................................................................................................................................136
5.8.2.4Configuring TPC for LDAP Before you start ................................................................................................................136
5.8.2.5LDAP Issues, Tips and Solutions .....................................................................................................................................137
5.8.2.6Stopping and starting services after a configuration change..............................................................................................138
5.8.2.7LDAP Configuration Files.................................................................................................................................................139

5.9COMMON AGENT SERVICE AND THE WINDOWS LOCALSYSTEM ACCOUNT....................................................................................140


5.9.1 Limitations..............................................................................................................................................................140
5.10 SCHEDULING TPC WORKLOAD...........................................................................................................................................141
5.11DATA PATH EXPLORER TIPS................................................................................................................................................142
5.12 TIP TIPS (TIVOLI INTEGRATED PORTAL)..............................................................................................................................142
5.13NETAPP/NAS/NETWARE....................................................................................................................................................143
5.13.1Netapp/NAS - Standard Configuration Steps........................................................................................................143
5.13.2Discovery - Automatic or Manual........................................................................................143
5.13.3Common Problems and Solutions 1......................................................................................................................143
5.13.3.1The customer will not expose a root level userid for NAS. .........................................................................................143
5.13.3.2Huge Netapp/NAS devices are discovered. ....................................................................................................................144
5.13.3.3The NAS device does not uniquely identify itself. ..........................................................................................................144

5.13.4Netware - Standard Configuration Steps..............................................................................144


5.13.5Common Problems and Solutions 2......................................................................................................................145
5.13.5.1Customer has multiple NDS Trees...................................................................................................................................145
5.13.5.2Customer is unwilling to share NDS Administrator ID....................................................................................................145
5.13.5.3Customer uses only one Agent to Scan too many Netware Servers.................................................................................145

5.13.6Scan/Probe Agent Administration.........................................................................................................................145


5.13.7Remote Scanning...................................................................................................................................................145
5.13.7.1Remote Scanning - Windows..........................................................................................145
5.13.7.2Remote Scanning - Unix.................................................................................................146
5.13.7.3Remote Scanning - snmputil...........................................................................................146

5.14 NOTE ON CONFIGURING TPC FOR BATCH REPORTS ON UNIX OR LINUX..................................................................................147


5.15UNIX OPEN FILE LIMITS IN LARGE TPC ENVIRONMENTS.......................................................................................................147
5.16THE TPC GUI AND JAVA..................................................................................................................................................147
5.16.1Where to find the Java bundled with TPC.............................................................................................................148
5.16.2Changing the program association for Java Web Start (JNLP)...........................................................................148
5.17CLEANING UP TPC DIRECTORIES........................................................................................................................................148
5.18PLANNING FOR PRIVATE SWITCH NETWORKS.........................................................................................................................149
5.19 DS8K USER ACCOUNTS....................................................................................................................................................150
5.20AGENT REGISTRATION PROBLEMS (SRV0042E/53E, BTC4045E, TIVOLI GUID)................................................................150
5.20.1More on "database error occurred during agent registration" errors..................................................151
5.21APPLYING TPC PATCHES...................................................................................................................................................152
5.22CHANGING THE TPC SERVER HOSTNAME OR IP ADDRESS.........................................................................................................152
6SYSTEM STORAGE PRODUCTIVITY CENTER (SSPC).............................................................................................155
6.1COMMON SSPC ISSUES........................................................................................................................................................155
6.1.1Prerequisite step required when upgrading SSPC based on TPC 4.x....................................................................155
6.1.2Apply TPC Standard Edition License......................................................................................................................155
6.1.3Agent Manager Registration...................................................................................................................................156

6.1.4Remote Agent Installation.......................................................................................................................................156
6.1.5CIMOM Installation On SSPC/TPC Server............................................................................................................156
6.1.6Running dscimcli.bat...............................................................................................................................................156
6.1.7Configuring the DS8000 to the HMC CIMOM.......................................................................................................157
6.1.7.1Step 1 - Create new user account on HMC using dscli......................................................................................................157
6.1.7.2Step 1 - Create new user account on HMC using DS8000 Storage Manager.....................................................................157
6.1.7.3Step 2 - Configure the device to the CIMOM....................................................................................................................158

6.1.8Isolating DS8000 Performance Monitor Problems.................................................................................................158


6.1.9Disable TPC services when not using TPC on SSPC..............................................................................................158
6.2SSPC REFERENCES.............................................................................................................................................................158
7TPC REPLICATION MANAGER......................................................................................................................................159
7.1TPC FOR REPLICATION AND DEVICE IP PORTS.........................................................................................................................159
7.2SHUTTING DOWN TPC FOR REPLICATION...............................................................................................................................159
7.3TPC FOR REPLICATION TIPS/RECOMMENDATIONS:.....................................................................................................................160
7.4TPC FOR REPLICATION REFERENCES......................................................................................................................................161
8APPENDIX A.........................................................................................................................................................................162
8.1AGENTRESTART.BAT LISTING.................................................................................................................................................162
9APPENDIX B - TPC AGENT INSTALLATION TIPS.....................................................................................................164
10APPENDIX C - TPC COMMAND REFERENCE..........................................................................................................166
10.1 STARTING TPC ...............................................................................................................................................................166
10.2 START/STOP TPC SERVICES..............................................................................................................................................166
10.3 AGENT MANAGER AND COMMON AGENT COMMANDS............................................................................................................167
10.4 DEVICE SERVER - TPCTOOL COMMANDS.........................................................................................................................168
10.5 DEVICE SERVER SRMCP COMMANDS.............................................................................................................................168
10.6 DB2 COMMANDS.............................................................................................................................................................168
10.7 CIMOM COMMANDS.......................................................................................................................................................169
10.8 REPLICATION MANAGER COMMANDS...................................................................................................................................170
10.9 SUPPORT/SERVICE DATA COLLECTION COMMANDS................................................................................................................171
10.10 MISCELLANEOUS COMMANDS............................................................................................................................................171
11APPENDIX D - DOWN LEVEL ITEMS STILL SUPPORTED IN TPC.....................................................................173
11.1 IBM TOTALSTORAGE DS OPEN API V5.1........................................................................................................................173
11.1.1IBM TotalStorage Enterprise Storage Server (ESS) command-line interface (CLI) ..................................173
11.1.2Installing the IBM DS Open API CIM Agent V5.1................................................................................................174
11.1.3Configuring the IBM DS Open API CIM Agent V5.1............................................................................................174
11.1.3.1Add a unique userid and password..................................................................................................................................175
11.1.3.2Add storage devices to be managed.................................................................................................................................175
11.1.3.3Restart the CIM Agent V5.1............................................................................................................................................176
11.1.3.4Verify that the CIM Agent can communicate with the storage devices............................................................................176
11.1.3.5Additional information for IBM ESS devices and the V5.1 CIM agent...........................................................................177

12APPENDIX E - LINKS......................................................................................................................................179

Comments are important to us. We want our technical publications to be as helpful as possible. To
provide feedback about this document, you can email your comments to your IBM Technical Support
representative.

1.4 Updates to TPC Hints & Tips


A frequently asked question is: why isn't TPC Hints & Tips updated closer to when a new release
comes out? The answer is that the majority of the information in this document comes from the field
and from support/development experience. This knowledge develops over time as more TPC users install
or upgrade to the latest release.
The current revision cycle is approximately one year after a new release, subject to change.

The official product documentation available at release contains information about new features and
supported devices.

1.5 Other TPC Publications


IBM Tivoli Storage Productivity Center Hints and Tips (this document) is a supplement to the Tivoli
Storage Productivity Center (TPC) publications that are available for V3 and V4. It is offered to provide
necessary information for configuration and workarounds for known problems in TPC. It is intended to
be a supplement to the TPC publications, providing additional information to help implementers of TPC
with configuration questions and to provide guidance in the planning and implementation of TPC. It is
expected that an experienced TPC installer will use this document as a supplement for installation and
configuration, and use the official TPC publications for overall knowledge of the installation process,
configuration, and usage of the TPC components.
This document is not intended to replace the official TPC publications, nor is it a self-standing
guide to installation and configuration. You can find the entire set of TPC publications in the >>>TPC
Infocenter. They are available in both PDF and HTML formats. These documents are essential to a
successful implementation of TPC, and should be used to make sure that you do all the required steps to
install and configure TPC. You should have the official publications available in either softcopy or
printed form, read them and be familiar with their content.
There is also a new >>>TPC 4.2 Release Guide Redbook available.
References to information on the internet begin with >>>, and the corresponding web link appears in
Appendix E - Links. If you are viewing this document online, you can click on these references to go
to the link in the appendix, and you can click on the appendix link to open the link in a web browser.
Items identified with the pointing hand contain information that is especially significant. Even if you
just browse through this document, please take note of this special information.


2 Installation of TPC
IBM Tivoli Storage Productivity Center (TPC) V4.2.2 is the latest version of IBM's storage resource
management software. The new features introduced with this release are described in the 'New for IBM
TPC 4.2.2' section in the >>>TPC Infocenter. In addition to support for new devices, highlights include:

- Tivoli Common Reporting: TPC 4.2.2 introduces Storage Tiering reports in Tivoli Common
Reporting (TCR), which uses the Cognos reporting engine.
- Fully functional Storage Resource Agent (SRA): the SRA is now capable of all functionality of the
legacy Data and Fabric agents (see >>>Deployment considerations for Storage Resource agents).
- A path is provided to migrate legacy Data/Fabric agents to SRA (see >>>Migrating Data agents
and Fabric agents to Storage Resource agents).
- No Agent Manager is needed for a new TPC install with all SRAs, or for an upgraded TPC
installation after all agents have been migrated to SRA.
- If you have downlevel TPC Data or Fabric agents to support monitoring of agent servers running
old operating system platforms, you will need to keep an Agent Manager to support those agents.
- If you are upgrading TPC, your upgrade includes upgrading DB2 to V9.7, and you are
running the Agent Manager, you must upgrade the Agent Manager to V1.4.

The TPC V3 versions that are still supported are V3.1.3, V3.2.1, V3.3.0, V3.3.1, and V3.3.2.

2.1 TPC Component and Package Overview


Each installation package contains the complete TPC product, and all functions are installed. However
the different licenses that are shipped with each package limit the functionality available at the graphical
user interface level.
There are prerequisite components that need to be installed that are used by the TPC Server. These
components can be installed on a different machine from the TPC Server, but experience has shown that
performance is much better when they are all installed on the same machine.

DB2 Universal Database Enterprise Edition V9.1 Fixpack 2, or V9.5 Fixpack 3A, or V9.7*
IBM Tivoli Agent Manager V1.3.2.15, V1.3.2.26, V1.3.2.30, V1.4**

The installable components available in the TPC installer are:


Database Schema*
Data Server*
Device Server*
Graphical User Interface (GUI)
Command Line Interface (CLI)
Data Agent (V4.1.1 and older)
Storage Resource Agent (V4.1 and newer)
Fabric Agent (V4.1.1 and older)
Tivoli Integrated Portal (V4.1 and newer *)
Tivoli Storage Productivity Center for Replication (TPC-R, V4.1 and newer *)

These are the functional components that make TPC work. You will need to install certain components
in order to get the TPC Server to work, as noted by an asterisk (*) in the lists above. Data, Fabric, and
SRA agents will most likely be installed in multiple locations, and not necessarily on the TPC server.
The CLI and GUI are installed wherever a user might wish to control TPC, and can be in a remote
location.
The Tivoli Agent Manager (**) is only required to support legacy Data and Fabric agents, and if you
plan to upgrade DB2 to V9.7 you must upgrade the Agent Manager to V1.4. If you are installing a new TPC
4.2 environment and will only have SRAs, you should not install the Tivoli Agent Manager. If you are
upgrading a TPC environment that included the Agent Manager and Data/Fabric agents, you can
uninstall the Agent Manager if you wish after you have migrated all of your Data/Fabric agents to SRA.
Please refer to the TPC publications for more information on the different TPC components.

2.2 Tivoli Common Reporting


TPC 4.2.2 includes support for Storage Tiering reports through Tivoli Common Reporting (TCR). TCR
is installed separately from TPC.

- TCR has its own installation package. It is available for Windows, AIX, and Linux.
- TCR installs another TIP instance (you cannot use the TPC TIP instance for TCR).
- TCR is not part of TPC single sign-on; however, for consistency we recommend choosing the same
userids and passwords for your TCR install as you use with TIP/TPC.
- TCR can be installed on its own server, separate from the TPC server.
- The disk and memory requirements for TCR must be considered in addition to the disk and
memory requirements for other applications on the server where it is to be installed. For
example, if TPC requires a minimum of 8 GB and TCR requires a minimum of 4 GB, you should
have at least 12 GB of memory on a server where both TPC and TCR are to be installed.
- For large environments, and/or environments with only the minimum physical memory on the
TPC server, we recommend that TCR be installed on its own server.
- The remote implementation requires installation of a DB2 client to facilitate the connection to the
TPCDB database. The DB2 client can be installed from the DB2 install package that is provided
with TPC.

For more details on TCR installation and TPC 4.2.2 Storage Tiering Reports, please see the >>>TPC 4.2
Release Guide Redbook.

2.3 TPC Installation Packages and Fixpacks


The TPC software components are listed below with the Operating System platforms supported by each,
and are the same for all package distributions, with the exception of the TPC Server image, which
contains a different license for each distribution:
DB2 V9.1 Fixpack 5, or DB2 V9.5 Fixpack 3A, or DB2 V9.7 (AIX, Linux Intel, Windows)
This is available on CD and also as a downloadable ZIP/TAR package.
**Note: 64-bit DB2 V9 on AIX and Linux no longer includes the DB2 Control Center application
(db2cc), but Control Center can be installed on a Windows server for remote management.

The license for DB2 V9.7 is now provided in a separate package that you must download
and install. It is no longer bundled with the DB2 V9.7 software.
IBM Tivoli Agent Manager V1.3.2.30, V1.4 if using DB2 V9.7 (AIX, Linux Intel, Windows)
This is available on CD and also a downloadable ZIP/TAR package.

TPC Server Installation (AIX, Linux Intel, Windows)


This contains all the components necessary to install the TPC server, and push agents to remote
machines. The software is delivered as multiple electronic images.
TPC V4.2 - If you download the images, you will find ZIP/TAR/EXE files for the Server
install labeled as disk1-part1 (base TPC), disk1-part2 (TIP and TPC for Replication), disk1-part3
(required TIP PTF), and disk1-part4 (Tivoli Monitoring agent for TPC, required only if you
are using Tivoli Monitoring); disk1-part1/2/3 must be downloaded into the same directory. If
you order physical media, you will receive DVDs that contain the TPC software.
TPC V4.1 - ZIP/TAR/EXE files for the Server install labeled as disk1-part1, disk1-part2
(Windows, AIX, Linux) and disk 2 (Windows and UNIX). If you order physical media, you
will receive DVDs that contain the TPC software.
TPC V3.3 - If you download the images, you will find ZIP/TAR/EXE files for the Server
install labeled as disk1 (Windows, AIX, Linux) and disk 2 (Windows and UNIX). If you
order physical media, you will receive a DVD that contains both images.
TPC fixpacks are maintenance releases to the base release. TPC V4.1.1 is a fixpack release to
TPC V4.1.0 (i.e., 4.1.1 fixpacks are considered a continuation of the 4.1.0 product stream).
TPC Agent Installation
You can order physical media or download images used to install agents locally on servers to be
managed by TPC. There is one for each platform supported, and these are designated as disk2.
AIX - Data and Fabric agents, Storage Resource Agent
Linux Intel - Data and Fabric agents, Storage Resource Agent
Linux iSeries - Data agents only, Storage Resource Agent
Linux zSeries - Data agents only
HP/UX - Data and Fabric agents
Solaris - Data and Fabric agents
Windows - Data and Fabric agents, Storage Resource Agent
There is no longer a separate disk for "TPC Agent Files for Cross Platform Installation". These
files have been incorporated into the two-part disk1 image for TPC V4.1, and the disk1 and disk2
images for TPC V3.3.
TPC National Language Support (AIX, Linux Intel, Windows)
TPC DB Migration Utility (AIX, Linux, Windows)

2.3.1 Downloading TPC Software


If you are a customer or business partner and have purchased TPC, you can download your TPC code
from >>>Passport Advantage or order the physical media on a combination of DVD/CDs.
TPC fixpack releases do not include the product license and are used to upgrade a licensed TPC
environment. They will not run without a valid license. These can be downloaded at:
ftp://ftp.software.ibm.com/software/tivoli/products/TPC/fixes/
If you are an IBM employee and have connectivity to the internal IBM network, and are
demonstrating the product (for example as a proof of concept exercise) on your own machine, you
can download these installation packages as eAssemblies that correspond to the package distributions
listed above. Log into >>>IBM Xtreme Leverage Downloads and search for "Tivoli Storage
Productivity Center V4.2.2". The search will return a list of the TPC eAssemblies. Choose the package
and server platform you desire, and download the install images. Software downloaded from this
site should not be provided to IBM customers.

A typical platform download for the package distribution will consume about 4 GB of disk space.
Uncompressing the images will consume an additional 4-5 GB of space. You can store these
uncompressed installation packages on a file server for easy access, and use a CIFS or NFS mount to
access the code from different servers.
Please be aware of the following known issues when downloading TPC software images:
- Download from a browser (http) doesn't work right once the images hit the 2 GB limit. Use
command line ftp or another application.
- AIX native tar doesn't work when the images hit the 2 GB limit. Install and use GNU tar.
- Windows self-extracting exe doesn't work when the images hit the 2 GB limit.
- TIP installer reports an error when AIX native tar is used to extract images. File names in long
paths are truncated. Install and use GNU tar.
- Compare the file size of the downloaded image with the file size on the ftp server to make sure
your download completed successfully.
- MD5 checksum data is now provided with TPC fixpack downloads. We recommend that you use
this data to verify your TPC downloads. For Windows, Microsoft provides the File Checksum
Integrity Verifier tool (FCIV) that you can use. On Linux, you can use the md5sum command.
On AIX, you can use csum.
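As a sketch of the Linux case, the verification flow looks like the following. The file name and contents here are placeholders standing in for a real downloaded image and the published checksum file:

```shell
# Stand-in for a downloaded image; real images come from the download site.
printf 'example image contents' > tpc_disk1.zip

# The published side: a .md5 file in the standard "checksum  filename" format,
# as the download site would provide it alongside the image.
md5sum tpc_disk1.zip > tpc_disk1.zip.md5

# The verification step after download: -c re-computes the checksum of the
# named file and compares it against the recorded value.
md5sum -c tpc_disk1.zip.md5
```

On success, `md5sum -c` prints `tpc_disk1.zip: OK`; any mismatch (a corrupted or truncated download) is reported as FAILED and the command exits nonzero.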


2.4 TPC Installer Overview


This section covers the location of logs, debugging techniques, and known issues.
There are two types of installations: Typical Install and Custom Install. The Typical Install is for
customers who want to use a common user ID and password for all users (e.g., DB2 Administrator,
WAS Administrator, etc.). In the field, Custom Install is the preferred install path because it gives the
user more flexibility with IDs, passwords, and other install configurations (i.e., database, CAS agents,
and SRA agents).
You must have administrator rights to run the installer, i.e. Administrator or a user account belonging to
the Administrators group on Windows. On non-Windows platforms you must use the root user.

2.4.1 Install Log Locations


When installing TPC components most log files are created in the log subdirectory of the install
location, i.e. C:\Program Files\IBM\TPC\log or /opt/IBM/TPC/log. Each component will have its own
subdirectory under the log directory. The exact log location and log files will be discussed in more detail
under each component section below.
These are the basic locations of the log files. An important thing to note is that not all components are
actually installed using InstallShield. For some of the components InstallShield executes other installers
that also create log files. That is why it is important to know the differences in the log files and all
possible locations.

2.4.2 TPC Installer


This is the main suite installer that installs all components. At certain times as many as three installers
can be running at the same time although it appears only one is running. This is why it is important to
understand what is happening and the key log files to successfully debug an issue.
A good practice for debugging, or for figuring out what is happening before the actual install happens
(i.e., on the pre-install panels), is to use the InstallShield parameter -is:javaconsole:


setup.exe -is:javaconsole or ./setup.sh -is:javaconsole


This will display all the InstallShield messages and debug messages. On Windows it will open a cmd
window and display all messages in that window. On non-Windows platforms the messages will go to
stdout. For silent installs, we recommend using the InstallShield -is:log <fullpath>/log.txt option:

setup.exe -is:log <tmpdir>\log or ./setup.sh -is:log <tmpdir>/log


2.4.2.1 Supported Options
-silent
The silent option allows the customer to install agents on machines that do not have graphics
capabilities. This option must be used with the options file.
-options
The options parameter is used to pass in variables to the installer when used with the -silent option. The
options files can be found in the root install image directory. They are called setup.iss and
setup_agents.iss.
-is:javaconsole
This option is good for debugging pre-Install data entry and validation. On Windows this brings up a
command window and displays most output to the window. On Unix the output is displayed in stdout.
-is:log
This option is used to save installer output messages and debug messages to a file. The best practice is to
use a full path, e.g. /tmp/log.txt or c:\log.txt. Note: this file is similar to the TPC.log that gets created when
the actual install starts, i.e. after the install summary page.
-is:javahome
This option allows the user to point the installer to a different jvm directory. This option is usually not
used unless the jvm is corrupted or deleted.

2.4.2.2 Log Locations


<Install Location>/log/install
<Install Location> refers to the typical default locations of a TPC install on Windows and Unix
platforms (this may vary if you chose to install TPC in a different location):
C:\Program Files\IBM\TPC or /opt/IBM/TPC

2.4.2.3 Log file descriptions


TPC.log - Contains all messages from the install of the TPC components. For components that have
their own installer (i.e. Data components, TIP, TPC-R), the execution of that component's installer appears
in TPC.log.
1) createDataSrvAccStdout.log - Stdout of the command used to create the user that is used with the
Data Server service. Windows only.
2) DataSrvAccExistStdout.log Stdout of the command used to check if the Data Server user exists.
Windows only.
3) guidInstallOutput.log Output of the guid.
4) makeDataSrvAccStdout.log
5) StopDeviceServerStdout.log
6) installedTPC.status Exit code of the Embedded TPC Install.
7) TPC.log Main installer log file for all components.

2.4.2.4 Debugging Techniques


A good practice for debugging or figuring out what is happening before the actual install happens (i.e.,
pre-install panels) is to use the InstallShield option -is:javaconsole

setup.exe -is:javaconsole or ./setup.sh -is:javaconsole


This option will show most of the InstallShield messages and debug messages. On Windows it will open a
command window and display details in that window. On non-Windows platforms all output goes to
stdout. For silent installs use the -is:log option shown below.

setup.exe -is:log <tmpdir>\log.txt or ./setup.sh -is:log <tmpdir>/log.txt


This option writes most install messages to a file. Always use the full path to the log file e.g. /tmp/log.txt
or c:\log.txt

2.4.2.5 Known Issues


2.4.2.5.1 Disk Spanning (multiple disks)
In release 3.3.x disk spanning (multiple disks for the TPC server install) was introduced, which often caused
problems. InstallShield has a naming convention that allows an install with multiple disks to avoid asking
the user where the next disk is located (which can be very useful if you have many disks, e.g. 10 disks).
Each disk is located in a directory at the same level as disk1. In our case the TPC install DVD contains
two directories called disk1 and disk2 at the same level. Not using this naming convention or partially
using this naming convention will result in install errors.
If you are downloading the disk images the best practice is to extract these images to disk1 and disk2 at
the same directory level e.g. c:\tpcimage\disk1 and c:\tpcimage\disk2. If you extract the disk image to a
directory called disk1, Installshield will expect disk2 to be at the same level. Also, in TPC 3.x, when
extracting disk3 (agent install) do not name the directory disk3, because Installshield will expect disk1
and disk2 to be at the same level even though they are not needed. This issue does not exist in release
4.1.0 and greater.
Examples of good naming conventions:
TPC Server:
c:\tpc\disk1
c:\tpc\disk2
c:\tpc\serverImage1
c:\tpc\serverimage2
TPC Agents
c:\tpc\agentInstall
c:\tpc\disk_3

For the serverimage1 and 2 examples the installer will ask the user where the second CD is located.
Examples of bad naming conventions:
TPC Server
c:\tpc\disk1
c:\disk2
c:\tpc\serverimage1
c:\tpc\disk2
TPC Agents
c:\tpc\disk3
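Following the good convention above, a safe extraction layout can be scripted as below. This is a sketch; the base path /tmp/tpcimage and the archive names in the comments are examples only.

```shell
# Create sibling disk1/disk2 directories so InstallShield can locate disk2
# without prompting. The base path is illustrative (c:\tpcimage on Windows).
BASE=/tmp/tpcimage
mkdir -p "$BASE/disk1" "$BASE/disk2"

# Extract each downloaded image into its matching directory, e.g.:
#   unzip TPC_4.2_disk1.zip -d "$BASE/disk1"
#   unzip TPC_4.2_disk2.zip -d "$BASE/disk2"

# Verify disk1 and disk2 are siblings before running setup
ls -d "$BASE"/disk1 "$BASE"/disk2
```

The key point is that both directories sit at the same level; InstallShield derives the location of disk2 from the location of disk1.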

2.4.2.5.2 Log file cleanup on uninstalls


When a fresh install fails, the installer will try to clean up the entire install directory. If you need to collect
log files, you need to do so before exiting the installer. Once you click [Exit] the files will be deleted.
This also applies to uninstalls.

2.4.2.5.3 Local System user created


When installing TPC agents (Data and Fabric) on Windows you have the option to run the Common
agent service as a Local System account. Even though the install succeeds, an actual LocalSystem
account is created. This bug was partially fixed in 3.3.2.81 under defect 40013, and later merged
into 4.1.0.106 under defect 45549. The full fix is in 4.1.0.111 and greater.
To select an existing user or create a user to run the Common agent service click on [Windows service
info] button.

Leave Common agent service name blank and the LocalSystem user will not get created. For ID and
Password enter LocalSystem.

2.4.2.5.4 During upgrade versions of failed components are updated in InstallShield


This is a limitation of InstallShield and stops the user from re-trying an upgrade because the install
version and the installer version are the same. To get around this issue, we created a tool called
updateReg which is located in the tool directory. This tool lists all components and their versions:
C:\Program Files\IBM\TPC\tool>updatereg
1. DBSchema            C:\Program Files\IBM\TPC\dbschema   4.2.0.60
2. Data Server         C:\Program Files\IBM\TPC\data       4.2.0.60
3. Device Server       C:\Program Files\IBM\TPC\device     4.2.0.60
4. SRAAgent            C:\Program Files\IBM\TPC\agent      4.2.0.60
5. GUI                 C:\Program Files\IBM\TPC\gui        4.2.0.60
6. CLI                 C:\Program Files\IBM\TPC\cli        4.2.0.60
7. TIP                 C:\Program Files\IBM\TPC\tip        4.2.0.60
8. Replication Server  C:\Program Files\IBM\TPC\tpcr       4.2.0.60

Enter the number of the component you want to change, or:
h - Help.
l - List components in vpd.script.
s - Save the registry and exit.
e - Exit without saving the registry.

Enter command (h,l,#,s,e):

Enter the # of the failed component. By default the tool will set the version back by one (i.e., if the
upgraded version was 4.2.0.60 it becomes 4.2.0.59), which is enough to rerun the install. If you
would feel better putting in the actual version, you can do that. Once you have set all the failed versions
back, enter 's' to save the file and exit.
Before re-running the upgrade you must resolve the original problem.

2.4.2.5.5 On Linux the installer will not start


The Tivoli Storage Productivity Center installation program is unable to run in graphical mode on Red
Hat Enterprise Linux AS Version 4 and 5 or SUSE Linux Enterprise Server version 9 and 10. You must
install a Linux RPM package on these operating systems.
For more information, see the technote >>>Installer gui will not start. It may be easier to just have your
system administrator re-image or install all graphics libraries on the Linux machine.

2.4.3 DbSchema component


When the install starts it creates a database using DB2. This database contains data used by the TPC
Server and GUI.

2.4.3.1 Log Locations


Install Location: <Install Location>/dbschema
Install Log file Location: <Install Location>/log/dbSchema/install

2.4.3.2 Log file descriptions


1) dbSchemaInstall.log - This is the actual install log file created by the component installer. In most cases
this file will contain the actual problem or exception that can give support a good idea of what is
happening.
2) dbSchemaInstall.err This can contain stderr output from the installer. Most likely you will not see
this file if there was no stderr output. All 0 byte files are deleted.
3) dbSchemaInstall.out This can contain stdout output from the installer.
4) dbSchemaInstallIS.log - This is an InstallShield log that basically shows any validation or file
copying before the actual install is called. In the case of the dbschema we are calling a legacy installer
and in most cases this file is not used for debugging purposes.

2.4.3.3 Known Issues


2.4.3.3.1 Not sourcing db2profile
On non-Windows platforms, i.e. AIX or Linux, not sourcing the db2profile of the instance user (e.g.
db2inst1) will cause the installer to throw an error because it cannot contact DB2. For the example
below we are sourcing the instance user's (db2inst1) db2profile from the command line prompt.
You need to source the db2profile when installing and uninstalling.

. ~db2inst1/sqllib/db2profile
This sets variables in the environment that are needed by the installer to contact DB2 and create or
modify the TPCDB database.
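A defensive way to source the profile before running the installer is sketched below; the instance user name db2inst1 (and therefore the path) is an example and is site-specific.

```shell
# Source the DB2 instance profile if it exists, so the installer can
# contact DB2. Usually this is ~db2inst1/sqllib/db2profile.
DB2PROFILE=/home/db2inst1/sqllib/db2profile

if [ -f "$DB2PROFILE" ]; then
    . "$DB2PROFILE"
    echo "db2profile sourced; DB2INSTANCE=$DB2INSTANCE"
else
    echo "db2profile not found at $DB2PROFILE - fix the path before installing"
fi
```

Running the installer (or uninstaller) from the same shell session afterwards ensures the DB2 environment variables are in effect.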

2.4.3.3.2 Wrong DB2 Edition


Until release 4.1.1, on Linux you were able to use DB2 Workstation Server Edition even though it was
not supported. In 4.1.1 new performance capabilities were added to the DBSchema component. The installer
now checks that the product identifier is db2ese. To check your machine, source the db2profile
of the instance user and type: db2licm -l
[root@tpcserver ~]# db2licm -l
Product name:
"DB2 Enterprise Server Edition"
License type:
"Restricted"
Expiry date:
"Permanent"
Product identifier: "db2ese"
Version information: "9.1"
Annotation:
"-4;(_o)"
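The edition check can be scripted by grepping the db2licm output for the db2ese identifier. In this sketch the license output is a captured sample string so the check itself can be shown; in a real environment you would use LICINFO=$(db2licm -l) after sourcing the db2profile.

```shell
# Sample output stands in for: LICINFO=$(db2licm -l)
LICINFO='Product name: "DB2 Enterprise Server Edition"
Product identifier: "db2ese"'

if printf '%s\n' "$LICINFO" | grep -q 'db2ese'; then
    echo "DB2 Enterprise Server Edition detected - supported"
else
    echo "db2ese not found - this DB2 edition is not supported by TPC"
fi
```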

2.4.4 Data Server Component


This component is the control point for product scheduling functions, configuration, event information,
reporting, and graphical user interface (GUI) support. It coordinates communication with and data
collection from agents that scan file systems and databases to gather storage demographics and populate
the database with results. Automated actions can be defined to perform file system extension, data
deletion, and Tivoli Storage Productivity Center backup or archiving, or event reporting when defined
thresholds are encountered. The Data server is the primary contact point for GUI user interface
functions. It also includes functions that schedule data collection and discovery for the Device server.
The Data Server Component is installed after the DbSchema component.

2.4.4.1 Log locations


Install Location: <Install Location>/data
Install Log file Location: <Install Location>/log/data/install


2.4.4.2 Log file descriptions


1) dataServerInstall.log This is the actual install file created by the component installer. In most
cases this file will contain the actual problem or exception that can give the developers a good idea what
is happening.
2) dataServerInstall.err - This can contain stderr output from the installer. Only exists if the file is
greater than 0 bytes.
3) dataServerInstall.out - This can contain stdout output from the installer. Only exists if the file is
greater than 0 bytes.
4) dataServerInstallIS.log - This is an InstallShield log that basically shows any validation or file
copying before the actual install is called.
5) copyJREStdout.log - This log file contains the stdout of the copy of the Runtime JRE, which is used
by the Data Server, GUI, and CLI.
6) copyJREStderr.log - This log file contains the stderr of the copy of the Runtime JRE, which is used
by the Data Server, GUI, and CLI.
Key log files are listed in bold.

2.4.4.3 Known Issues


2.4.4.3.1 TSRMsrv.zip being accessed
When uninstalling the Data Server, the installer stops with a message saying TSRMsrv.zip is in use and
will not let you uninstall. This can also happen during an upgrade.
To solve this problem, exit the installer, cd to <Install Location>\data\server\lib, rename
TSRMsrv.zip to TSRMsrv_org.zip, and rerun the installer. This problem has been fixed in TPC 4.2.
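The rename workaround can be sketched as follows. For illustration this runs in a scratch directory with a stand-in file; on a real system set LIBDIR to your actual <Install Location>/data/server/lib.

```shell
# Scratch directory stands in for <Install Location>/data/server/lib
LIBDIR=$(mktemp -d)
: > "$LIBDIR/TSRMsrv.zip"   # stand-in for the file the installer says is in use

# Rename the locked file out of the way, then rerun the installer
mv "$LIBDIR/TSRMsrv.zip" "$LIBDIR/TSRMsrv_org.zip"
ls "$LIBDIR"
```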

2.4.4.3.2 Copy of JRE fails on AIX 6.1


The problem was caused by double quotes around the path. It worked in AIX 5.3 but not 6.1. This
causes the Data server install to fail, which in turn causes the data agent and TPC GUI to fail because
the Data Server must be up during the Data agent and TPC GUI install.
To solve this problem, download the latest build and ensure that it includes the fix for defect 45975.

2.4.5 Device Server Component


This component discovers, gathers information from, analyzes the performance of, and controls storage
subsystems and SAN fabrics. It coordinates communication with, and data collection from, agents that
scan SAN fabrics and storage devices.

2.4.5.1 Log locations


Install Location: <Install Location>/device
Install Log file Location: <Install Location>/log/device/install


2.4.5.2 Log file descriptions


1) DeviceServerInstall.log - The Device server is one of the components that is actually installed by
InstallShield, so this is an important log file. It covers installation of Embedded WAS, profile
creation, Device Server application deployment, etc.
2) wasInstallstdout.log - This contains the output of the Embedded WAS install, which is basically an
XCOPY. It is rare to find problems in this log.
3) wasCreateProfilestdout.log - Before deploying the Device server application we need to create a
profile (deviceServer). If this fails the Device server will not get installed. This file contains the output of
the configuration.
4) WasCreateProfilestderr.log - If the profile creation fails, this is the file to examine.
5) CreatePortstdout.log - After the profile is created we still need to update some ports, i.e. 9551, 9556,
and 9557. This is usually not an important file.
6) CreatePortstderr.log - Any errors that occur during the port updates for the Device server.
7) launchITPCFstdout.log - Contains the deployment of the Device Server application to WAS.
8) addWASServiceOut.txt - Found on Windows platforms only. It is the output of the creation of the
Device Server service.
9) StartService.log - This contains the output when the Device Server starts up after a successful
deployment and configuration.

2.4.5.3 Known Issues


2.4.5.3.1 WAS Profile Creation Fails (Corrupt Download Image)
Occasionally when you download the TPC disk1 image the was subdirectory can get corrupted which
results in the profile creation and port configuration to fail. Since the WAS install is basically an xcopy
it always works and failure happens during the profile creation.
Resolution:
Download image again.

2.4.5.3.2 Max Heap Space Size


Some customers have the need to increase the max heap space of the jvm that runs the Device Server.
The default is -Xmx1536m (1.5gb). The maximum size of the java jvm heap cannot be greater than 2gb
(-Xmx2048m) but should not be set higher than -Xmx2000m to avoid Device Server crashes.
genericJvmArguments="-Xmx1536m -Dtsnm.protocol=http:// -Dtsnm.port=9550">

This value can be set in the server.xml file located in:


<TPC>/device/apps/was/profiles/deviceServer/config/cells/DefaultNode/nodes/DefaultNode/servers/server1/server.xml

Resolution:
This is a bug in Java 1.5 and will not be fixed until TPC incorporates a higher version of WAS.
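A way to inspect and adjust the -Xmx value with grep and sed is sketched below. It operates on a scratch copy whose jvmEntries line is a simplified stand-in for the real server.xml; on a real system point F at the path above and back up the file first.

```shell
# Scratch stand-in for server.xml; the real file is under
# <TPC>/device/apps/was/profiles/deviceServer/config/.../server1/server.xml
F=$(mktemp)
cat > "$F" <<'EOF'
<jvmEntries genericJvmArguments="-Xmx1536m -Dtsnm.protocol=http:// -Dtsnm.port=9550">
EOF

grep -o 'Xmx[0-9]*m' "$F"              # show the current max heap
sed -i 's/-Xmx1536m/-Xmx2000m/' "$F"   # raise to the recommended ceiling
grep -o 'Xmx[0-9]*m' "$F"              # confirm the change
```

Restart the Device server after editing so the new heap setting takes effect.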

2.4.5.3.3 Device Server will not stop during upgrade


Occasionally on Windows the Device Server service will not go down during the upgrade. A message
will be displayed saying it could not shutdown the component because it is being accessed.

Resolution:
If you see this message, wait a minute and click the [Next] button again. Even though the command to
shut down the service has completed, there is sometimes a delay between the end of the shutdown
command and the actual shutdown. If that doesn't work you will have to manually shut down the Device
server. To do this, bring up the Services panel and double-click on IBM WebSphere Application Server
Service. This will bring up the service panel. Check whether the server status is stopped; if it is not, click
Stop. If that doesn't work you will have to manually kill the process. To figure out the process id, do the
following:
1. CD to <install location>\device\apps\was\profiles\deviceServer\logs\server
2. Type or cat the contents of server1.pid to get the pid #.
3. Bring up Task Manager and click on the Processes tab.
4. Select the process and click on the [End Process] button.
5. Once the process is killed you can go back to the installer, click [Next], and continue.

If this does not work, try rebooting the machine and then upgrading or uninstalling. This is mostly an issue
on Windows. If your Device Server will not go down on a non-Windows platform, cd to the logs/server
directory to get the pid and type kill -9 <server pid>
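The non-Windows pid-file kill sequence can be sketched like this. It is demonstrated with a dummy background process; on a real system set PIDFILE to server1.pid under the deviceServer logs/server directory.

```shell
# Dummy process and pid file stand in for the hung device server and
# <install>/device/apps/was/profiles/deviceServer/logs/server/server1.pid
sleep 60 &
DUMMY=$!
PIDFILE=$(mktemp)
echo "$DUMMY" > "$PIDFILE"

PID=$(cat "$PIDFILE")            # read the recorded pid
kill -9 "$PID" 2>/dev/null       # force the process down
wait "$PID" 2>/dev/null || true  # reap the killed process
echo "killed pid $PID"
```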

2.4.6 TIP Installer


Tivoli Integrated Portal (TIP) component is required to enable single sign-on for TPC. Single sign-on is
an authentication process that enables you to enter one user ID and password to access multiple
applications.
Install Location: Default C:\Program Files\IBM\tivoli\tip or /opt/IBM/tivoli/tip

2.4.6.1 Log locations


Install Log file Location: <Install Location>\log\tip\install

2.4.6.2 Log file descriptions


1) configTIP.out - Configures and deploys the TIP ear file to WAS.
2) configTPC.out - Stdout of the configuration of LTPA keys with the Device Server.
3) startTIPWAS.log - Starts the application server for TIP.
4) stopTIPWAS.log - Stops the application server for TIP.
5) TIPInstallIS.log - Validates and executes the actual TIP install.
6) TIPReg.out - Stdout from the TIP registration.
The next two log files are the actual TIP installer (InstallAnywhere) log files.
7) %USERPROFILE%\IA-TIPInstall-00.log - This is a good file to check if the TIP install fails, to
get the root cause of the TIP failure. For non-Windows platforms, check the ~root or ~user directory for
these log files.
8) %USERPROFILE%\IA-TIPUninstall-00.log - This is a good file to check if the TIP uninstall
fails, to get the root cause of the TIP failure. For non-Windows platforms, check the ~root or ~user
directory for these log files.

2.4.7 TPC Replication


TPC for Replication is designed to control and monitor copy service operations in storage environments.
It also provides advanced copy service functions for supported storage subsystems on the SAN.
Install Location: Default C:\Program Files\IBM\replication or /opt/IBM/replication

2.4.7.1 Log locations


Install Log file Location: <Install Location>/log/TPCR/install


2.4.7.2 Log file descriptions


1) TPCRInstallIS.log - InstallShield log that validates and executes the actual TPC-R installer; located in
<Install Location>\log\TPCR\install.
2) TPCRMInstall.log - This is the actual log file of the component installer, which is located in
<Install Location>.

2.4.8 Data Agent


Data Agents, Fabric Agents, and Storage Resource agents gather host, application, and SAN fabric
information and send this information to the Data Server or Device Server.
Install Location: Default C:\Program Files\IBM\TPC\ca\subagents\TPC or
/opt/IBM/TPC/ca/subagents/TPC
Component subdirectory: Data

2.4.8.1 Log locations


Install Log file Location: <Install Location>\log\install

2.4.8.2 Log file descriptions


1) dataAgentInstall.log This is the key log file for the remote install. There are two ways to get to this
log file. The first and the easiest is to double click on the Computer name in the Install Panel once all
installs are done. The second is to go to the remote machine to retrieve the log file located in <Install
Location>\log\install.
2) DataAgentInstall.out Stdout from the install.
3) DataAgentInstallIS.log InstallShield log file that show validation and execution of the actual data
agent install.
4) InstallStatus.log Exit code from the Common Agent install.
5) agentInstall.log - This is actually the Common agent install log file located in <Install Location>\ca;
since the Common agent is installed by the Data Agent, it is worth checking this file.
6) msgAgent.log - This is the Common agent runtime log file and will tell you if the Common Agent
has registered with the Agent Manager.

2.4.8.3 Known Issues


2.4.8.3.1 Data Agent fails during Common Agent validation
During a fresh install of a Data Agent, the data agent actually installs the Common Agent then deploys
the agent bundle to the common agent. Part of the data agent install is to validate that the Common
Agent is installed correctly and that it is up (registered with the Agent Manager). If the Common Agent
fails to register with the Agent Manager the Data Agent install will fail.
Solution:
Check to ensure the Agent manager is up and running by running the Healthcheck script. Check that
DNS resolution is working on both the Agent machine and the Agent Manager machine. If everything
checks out, try installing the Common Agent first then running the TPC installer. In most cases a Data
Agent install failure is a result of the Common Agent either failing to install or failing to register with
the Agent Manager.
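A quick name-resolution sanity check to run on both the agent and Agent Manager machines is sketched below; the host list is a placeholder (localhost here) that you would replace with the actual agent and Agent Manager host names.

```shell
# Verify each side can resolve the other. Replace "localhost" with your
# actual agent and Agent Manager host names, e.g. "agenthost amhost".
for H in localhost; do
    if getent hosts "$H" >/dev/null; then
        echo "$H resolves OK"
    else
        echo "$H does NOT resolve - fix DNS or /etc/hosts"
    fi
done
```

If a name fails to resolve, add an /etc/hosts entry or fix DNS before retrying the agent install.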

2.4.9 Fabric Agent


Data Agents, Fabric Agents, and Storage Resource Agents gather host, application, and SAN fabric
information and send this information to the Data Server or Device Server.
Install Location: Default C:\Program Files\IBM\TPC\ca\subagents\TPC or
/opt/IBM/TPC/ca/subagents/TPC
Component subdirectory: Fabric

2.4.9.1 Log locations


Install Log file Location: <Install Location>/log/install/subagents/TPC/Fabric/install

2.4.9.2 Log file descriptions


1) fabricAgentInstallIS.log This is the actual component install log.
2) Install.success Exit code from common agent installer.

2.4.9.3 Known Issues


2.4.9.3.1 Fabric agent not shutting down during upgrade
Sometimes the fabric agent will not go down because it starts a discovery or scan every few seconds.
This usually happens on machines with multiple HBAs.
Solution
Manually shut down or kill the fabric agent process before the upgrade. Most likely you will discover this
problem after the fabric agent install fails. You will need to update the InstallShield registry using the
updateReg script discussed under the Installer section.

2.4.10 Storage Resource Agent


Data Agents, Fabric Agents, and Storage Resource Agents gather host, application, and SAN fabric
information and send this information to the Data Server or Device Server.
Install Location: Default C:\Program Files\IBM\TPC\ or /opt/IBM/TPC
Component subdirectory: agent

2.4.10.1 Log Locations

Install Log file Location: <Install Location>\agent\log, <Install Location>\agent\log\<host name>


<Install Location>\log\sra\install (when using disk1 only)

2.4.10.2 Log file descriptions

<Install Location>\agent\log
1) agent.trace - Install agent trace file; very important for install problems.
2) agent_000001.log - Runtime log file for the Agent binary.
3) discovery_FULL_4764.trace - Initial discovery trace file.
<Install Location>\agent\log\<host name>
1) probe.trace - Initial probe trace file; could be important.

2) probe_000001.log - Runtime probe file.
<TPC Install Location>\log\sra\install (when using disk1 only)
1) agent.trace - Install agent trace file; very important for install problems. Copied from <Install
Location>\agent\log.
2) agent_000001.log - Runtime agent log file. Copied from <Install Location>\agent\log.
3) copyLoglStdout.log - Stdout of the copy command for the logs in the agent directory.
4) sraInstallIS.log - InstallShield log file that shows validation and execution of the actual SRA agent
install.
5) sraInstallStdout.log - Stdout of the SRA agent install.

2.4.10.3 Agent Command Line Usage

Usage:
Agent -VERSION
Agent -HELP

Installation:
Agent -INSTALL
[-FORCE]
-INSTALLLOC pathname
-SERVERIP address
-SERVERPORT portnumber
[-DEBUG MAX]
Additional mandatory parameters depending on the service type:
Daemon:
-COMMTYPE DAEMON
-AGENTPORT portnumber
non-Daemon:
-USERID username
-PASSWORD password
[-CERTFILE file]
[-PASSPHRASE phrase]

Uninstallation:
Agent -UNINSTALL
[-FORCE]
-SERVERNAME servername

Upgrade:
Daemon:
Agent -UPGRADE
-INSTALLLOC pathname
-COMMTYPE DAEMON
non-Daemon:
Agent -UPGRADE
-INSTALLLOC pathname

Parameter explanation:
INSTALLLOC - the agent's installation location (defaults:
Windows: "C:\Program Files\IBM\TPC"
Unix: /opt/IBM/TPC)
SERVERIP - the IP address of the TPC Data server (multiple addresses may be
specified as a comma-separated list)
SERVERPORT - port used for the TPC Data server (default: 9549)
DEBUG
- allows increasing tracing for problem determination purposes
COMMTYPE - agent running in service mode (DAEMON: service mode)
AGENTPORT - the port to be used by the agent (default: 9510)
when installing the agent in daemon mode
USERID - login name used by the server to contact the agent host
PASSWORD - password for USERID
CERTFILE - path to the SSH certificate (located on TPC Data server host)
that will be used for communication between server and agent
PASSPHRASE - passphrase for the SSH certificate
Examples:
Install agent daemon type:
Agent -INSTALL -INSTALLLOC "C:\Program Files\IBM\TPC" -SERVERIP 9.47.97.171 -SERVERPORT 9549
-COMMTYPE DAEMON -AGENTPORT 9510
Agent -INSTALL -FORCE -INSTALLLOC "C:\Program Files\IBM\TPC" -SERVERIP 9.47.97.171 -SERVERPORT
9549 -DEBUG MAX -COMMTYPE DAEMON -AGENTPORT 9510

Install agent non-daemon type:


Agent -INSTALL -INSTALLLOC "C:\Program Files\IBM\TPC" -SERVERIP 9.47.97.171 -SERVERPORT 9549
-USERID userID -PASSWORD password
Agent -INSTALL -FORCE -INSTALLLOC "/opt/IBM/TPC" -SERVERIP 9.52.173.102 -SERVERPORT 9549 -USERID
userID -PASSWORD password -CERTFILE "/opt/IBM/TPC/data/cert"

Upgrade agent daemon type:


Agent -UPGRADE -INSTALLLOC "C:\Program Files\IBM\TPC" -COMMTYPE DAEMON

2.4.10.4 SRA Install Error Codes

Return Code | Problem                                       | Possible Cause                                                | Possible Resolution
45          | Failed to register with server                | Agent host name or IP resolution, agent-to-server communication issue, ports/security, guid | Correct DNS, create /etc/hosts file entries, open ports, resolve guid discrepancy
101         | Could not create agent lock file              | Agent is hung                                                 | Manually shut down or kill the agent
102         | Could not stop probe                          | Agent is hung                                                 | Manually shut down or kill the agent
103         | Could not stop Agent                          | Agent is hung                                                 | Manually shut down or kill the agent
104         | Entry for sra already exists in Agent.config  | Agent uninstall left behind Agent.config, or agent already installed | Manual cleanup, or use the -force option during install
105         | Could not extract agent.zip file during install | Install image is possibly corrupted                         | Re-download the install image
106         | Could not create Agent.config                 | Agent.config file already exists                              | Use the -force option to overwrite Agent.config
107         | Could not stop agent service                  | Agent is hung                                                 | Manually shut down or kill the agent
108         | Agent service already exists                  | Uninstall failed, or machine not rebooted after uninstall     | Manually remove the service in /etc/Tivoli/TSRM/TPCAGENT or remove the agent from the Windows registry
109         | Failed to create agent service                |                                                               | Check Agent.trace to debug the problem
110         | Agent service failed to start                 | Native code corrupted; port could be in use                   | Check Agent.trace to debug the problem
111         | Probe failed                                  | Device issue on agent, agent failed to send probe result, or server failed to receive result | Check Agent.trace and the probe log
112         | Similar to error code 109                     |                                                               | See error code 109
114         | Invalid com type                              |                                                               | Valid -commtype arguments are DAEMON or RXA
115         | Port in use                                   | Probably used by another SRA or CAS agent                     | Use a different port
116         | Not enough space                              |                                                               | Clean up space and retry the install
117         | Another install in progress                   |                                                               | Let the install finish
118         | Cannot retrieve server name                   | Could be a firewall issue, or the data server could be down or having problems | Check that the data server is up and running, then check for firewall or DNS issues
119         | Directory not empty                           | Previous uninstall failed, or agent already installed         | Clean up the directory or use the -force option
120         | Missing required parm                         |                                                               | Run Agent -? to get usage
130         | Failed to send probe results                  |                                                               |

2.4.10.5 SRA Uninstall Error Codes

Return Code | Problem                                         | Possible Cause                                              | Possible Resolution
60          | Install cannot retrieve server name             | Data server is down or not running correctly; could also be a firewall or DNS issue | Check that the data server is up and running; check firewall and DNS
61          | Could not remove server entry from Agent.config |                                                             |
62          | Could not stop Agent job                        | Agent is in the middle of a job                             | Retry the uninstall when the agent has finished the job
63          | Failed to remove registry                       | Agent has more than one server                              | Check Agent.config for more than one Data server, or use the -force option to clean up
64          | Failed to remove service                        | Windows only                                                | Check Agent.trace to debug the issue
65          | Could not stop service                          | Agent is hung                                               | Check if the agent is in the middle of a job; if not, manually kill the agent, then retry the uninstall
66          | Invalid server name                             |                                                             | Check that the server name matches the server name in Agent.config

2.5 Installing TPC


Review the TPC Installation and Configuration Guide before attempting to install TPC. This publication
can be viewed online or downloaded through the >>>TPC Infocenter. It provides the complete
instructions for installing and configuring TPC.

2.5.1 Server Configuration


Before installing TPC, you should:
- Verify that the server operating system and processor architecture are shown as supported by
  TPC according to the TPC Platform Support document.
- Verify that the server is properly configured, including:
  o Date, time, and time zone are set properly. Note that this is vital to check on all
    devices to be used with TPC, including storage devices, switches, and monitored servers, to
    make sure time settings are correct.
  o Network/DNS configuration is correct (/etc/hosts entries, localhost entry, IP configuration)
  o Ports needed by TPC are not in use
  o Adequate disk space for installation
  o Check system logs for any errors or problems that should be corrected

2.5.2 Userids and Passwords


For TPC server and agent installations, it is recommended to use the local Administrator account on
Windows. On Unix platforms, you are required to use the root user account.
TPC installation will include creation of additional user accounts and passwords for the various
components (DB2, Agent Manager, TIP, TPC, TPC-R servers). In creating these accounts and
passwords, you must take into consideration a) your company's security policies, b) the security policy
required/enforced by the operating system platform, and c) requirements/limitations imposed by the
various TPC software components.
Following are the best practice recommendations:
- Use the password worksheet to document all userids and passwords created/used with TPC.
- Avoid unnecessary password complexity where possible. Keep passwords short (8 characters
  recommended), and avoid/limit use of special characters.
- Where security policies allow it, set passwords to never expire, especially for TPC component
  services (for example the DB2 admin password).
- Where password expiration is required, change affected passwords before they expire to
  avoid interruption of TPC operations. In some cases, attempts to connect to services with expired
  passwords can result in the user being locked out of the application or service.

2.5.3 Installation Tasks


The short list of install tasks is shown below:
1) Install DB2 take the defaults whenever possible:
a) When installing, it is recommended to create the database instance on a high performance and
high capacity filesystem or disk drive on SAN attached storage. This will improve performance,
provide ample space for growth, and keep the database separate from the host server's operating
system disk (see Installing the TPC database in a custom location below).
b) The DB2 V9.7 package bundled with TPC 4.2.x now has an additional package that you
must download and install with the DB2 license. It is no longer bundled with the DB2 software.
c) The DB2 administrator and DB2 user ID must conform to the following rules:
i) It cannot begin with "SYS", "sys", "IBM", "ibm", "SQL", or "sql".
ii) It must contain only the characters a through z, A through Z, 0 through 9, and the three
special characters (#, @, $).
iii) It cannot be delimited and contain lower case letters.
iv) It cannot be GUESTS, ADMINS, USERS, or LOCAL.
2) Install Agent Manager before installing any Data or Fabric agents (not required for SRA) notes:
a) You should always choose the option to generate certificates for a production environment.
They are more secure (with respect to Agent Manager access), and the default demo
certificates will expire one year after creation.
b) When installing with generated certificates, make sure that you specify a certificate password,
and that you record the password in a safe place with your other TPC userids/passwords in case
it is needed to open the certificates. If you do not specify the password, the installer will generate
a random password that will be unknown, and if there are problems with certificates the only
recourse will be to create new certificates which requires many steps.
c) The default resource manager userid/password is manager/password. This can be left
unchanged. Note that at the time of this update, it is not possible to change the resource manager
password to something other than "password". Even if a different value is specified during Agent
Manager installation, "password" must be used when installing TPC.
d) A default agent registration password is no longer provided. Be sure to record the password you
choose in a safe place with your other TPC userids/passwords.
3) Install the TPC Server using the TPC Server installation package (disk1). For TPC V3.3, if you use
the DVD, you will not be prompted for disk2. If you use the CD images (uncompressed to file
system directories), you will be prompted for the disk2 location if the directory name is not "disk2".
It is recommended that the disk1 image be extracted into a directory named "disk1" and the disk2
image into a directory named "disk2" (Windows example: c:\disk1 and c:\disk2).
a) The Device Server and Data Server must reside on the same machine.
b) Install all the server components. The "Typical" installation option is recommended.
i) Database Schema
ii) Data Server
iii) Device Server
iv) Graphical User Interface (GUI) (optional)
v) Command Line Interface (CLI) (optional)
vi) Data Agent or SRA (only if this machine is to be managed by TPC)
vii) Fabric Agent (only if the machine is SAN connected, and in a SAN to be managed)
viii) Tivoli Integrated Portal (TIP) (V4.1)
ix) TPC-R (V4.1)
c) Note about the TPC device server host authentication password: when using the "Typical"
installation path, you are not given the opportunity to specify this password. It will default to the
password that you specified for your DB2 admin account.
4) Before installing TPC agents:
a) Make sure that the TPC server is up and running without problems. This should include:
i) ping and telnet tests from agent server to TPC server on ports 9549 (data server), 9550
(device server), and 9511 (Agent Manager).
ii) Run AM 'HealthCheck' to verify AM status and CA registration password.
iii) Open the TPC gui to make sure the server is running properly.
iv) Make sure DB2 is running properly (DB2 processes are running, port 50000 is listening).
b) Check to see if the agent machines have had TPC code on them in the past. Refer to the
uninstall/cleanup steps in section 5.7 to check for things that might need to be removed
manually.
c) If agent machines are cloned/built from a master image (e.g., AIX LPARs):
i) check to see if the agent server image includes the Tivoli guid software, and generate a new
Tivoli guid value if necessary prior to attempting to install agents (see section 10.10).
ii) make sure the new/cloned system has been configured completely (i.e., has its own
hostname, correct /etc/hosts file entries, correct date/time/time zone settings, etc.) before
attempting to install agents.
d) Make sure you are using a user account on the agent that has the necessary administrator
rights/privileges for installing and configuring software. This should be 'root' on Unix platforms,
and a local administrator on Windows (a domain administrator account can also be used - see
section 5.9 for more information).
5) Install TPC agents using one of the following methods. For each, specify "Custom Installation", then
"Remote Data Agent" and/or "Remote Fabric Agent" for remote/push installs, and "Data Agent" or
"Fabric Agent" for local installs.
i) Start the disk1 installer (TPC Server install) on the TPC Server for remote agent push
installs. You can push agents to any supported platform using this method.
ii) Start the disk3 installer for TPC V3.3 (TPC Agent install) on the TPC Server for remote
agent push installs. This is the same as option i) above. The disk3 platform image must match
the TPC Server platform. You can push agents to any supported platform using this method.
iii) Start the appropriate platform installer (disk2 for TPC V4.1, or disk3 for TPC V3.3; the
TPC Agent install) on the server to be managed by TPC. This will perform a local install of
either or both of the data and fabric agents.
iv) If you are deploying Storage Resource Agents (SRA), you should push these out from the
TPC server (does not require running the TPC installer). Start the TPC gui, and go to
Administrative Services -> Configuration, then right-click on Storage Resource Agent
Deployments and select Create Storage Resource Agent Deployments.
a) Remote install of TPC agents must be done from the server where the TPC Server is installed.
When a Data Agent is deployed remotely, the Common Agent is installed if it doesn't
already exist. Local installs of the Data Agent will also install the Common Agent. Remote
deploy of fabric agents can only be done to servers that already have the IBM Tivoli Common
Agent installed. Local fabric agent installs will also install a Common Agent if it doesn't already
exist.
b) When installing agents, the call to get a list of common agents may time out. You can set the
eplist.timeout parameter before running the installer to increase the timeout value beyond the
5-minute default. Here is a Windows example showing a timeout value of 10 minutes:
set JVMPARMS=-Deplist.timeout=600

c) Data or SRA agents must be installed on any servers that are to be managed by TPC.
d) TPC agents can also be installed on AIX VIO servers. **IMPORTANT: it is critical to
check the /etc/hosts file on the VIO server to make sure there is a valid localhost entry. If this is
missing, the TPC agent install will fail, and the only way to recover is to reimage the server.
e) On some VIO servers, the install fails because the TPC software image is missing from
/opt/IBM/tivoli/tpc. If you have this problem, you can download the AIX TPC local agent install

image (disk2 for TPC 4.1, disk3 for TPC 3.3.2) that matches the version of your TPC server,
extract it into this directory, and retry the install.
f) Fabric agents should only be installed on servers that have HBAs and are connected to SANs that
will be managed by TPC. At least two fabric agents should be installed in each fabric to be
managed by TPC to provide a redundant ability to discover each SAN that TPC will manage.
It is not recommended to install Fabric agents on all servers that have HBA connections to
SANs. More inband fabric agents are not better, and can cause a "SAN discovery storm" when
TPC receives a SAN state change notification.
6) Install and configure SMI-S CIMOM agents for storage devices, fabric switches, and tape libraries,
and configure these CIMOM agents in TPC.
7) If you plan to use the TIP single sign-on interface or the Java Web Start interface to TPC, you must
install IBM Java. Java from other vendors will not work and is known to cause failures with
Element Manager launch and other TPC functions. IBM Java downloaded from the web will only
install on IBM hardware; however, IBM Java is bundled with TPC and can be downloaded to the
workstation by pointing your web browser to: http://tpcserver:9550/ITSRM/app/welcome.html, or
you can copy the package for your platform from the TPC server directory at:
<tpc>/device/apps/was/profiles/deviceServer/installedApps/DefaultNode/DeviceServer.ear/DeviceServer.war/app
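The agent-to-server connectivity tests in step 4a can be scripted from a prospective agent machine. A sketch using bash's /dev/tcp; the server name tpcserver.example.com is a placeholder for your TPC server:

```shell
# Test TCP reachability of the TPC data server (9549), device server (9550),
# and Agent Manager (9511) from an agent machine.
TPC_SERVER=tpcserver.example.com   # placeholder - use your TPC server hostname
for port in 9549 9550 9511; do
  if timeout 5 bash -c "exec 3<>/dev/tcp/${TPC_SERVER}/${port}" 2>/dev/null; then
    echo "${TPC_SERVER}:${port} reachable"
  else
    echo "${TPC_SERVER}:${port} NOT reachable"
  fi
done
```

A port reported as NOT reachable usually indicates a firewall rule or a stopped TPC service.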

2.5.4 Installing the TPC database in a custom location


The best-practice recommendation is to install the TPC database on a high-performance, high-capacity
SAN attached disk drive. The default is to install the database in the default DB2 instance that is created
when DB2 is installed, which is typically the C: drive on Windows, and /home/db2inst1 on Unix. These
filesystems typically have limited space, and typically share space with critical operating system files
and data, neither of which are ideal for stability or scalability.
To create the TPC database in a custom location during TPC installation, you should do the following:
1. When installing DB2, specify the instance home directory to be the custom location you wish to
use.
2. Do not create the database prior to launching TPC installer.
3. Use the Custom install option when installing TPC. This will enable you to specify the custom
paths to be used for database creation. Here is a Unix example showing the installer fields when
the database is to be created on a filesystem with an instance home directory of /db2/tpc (the
sizes given are only an example, and should be changed according to your needs):
Schema name:     TPC
Database drive:  /db2/tpc

Tablespace       Container Directory    Size
Normal           /db2/tpc/tpc/TPCDB/    200mb
Key              /db2/tpc/tpc/TPCDB/    200mb
Big              /db2/tpc/tpc/TPCDB/    350mb
Temp             /db2/tpc/tpc/TPCDB/    200mb
Temporary user   /db2/tpc/tpc/TPCDB/    200mb

* System managed (SMS)

Refer also to the technote >>>Installing TPC Using A Remote Database.
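After installation you can verify that the containers actually landed under the custom path. A minimal sketch, assuming the example path from the table above (for the authoritative container list, run db2 list tablespaces show detail as the instance owner):

```shell
# Confirm the tablespace container directory exists and is populated.
DB_PATH=/db2/tpc/tpc/TPCDB   # example path from the table above
if [ -d "$DB_PATH" ] && [ -n "$(ls -A "$DB_PATH" 2>/dev/null)" ]; then
  echo "containers present under $DB_PATH"
else
  echo "no containers found under $DB_PATH"
fi
```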



2.6 Upgrading TPC


2.6.1 Backup Your Existing Environment
The TPC upgrade procedures stress the importance of having a full backup of your existing TPC server
environment, covering both the TPC server software and the TPC database, before upgrading. This step
is vital in case something goes wrong and recovery is necessary. See section 5.7 for more information on
backing up TPC.

2.6.2 Temporarily Change DB2 Logging To Circular


If you have DB2 configured for archive logging, you should temporarily reconfigure DB2 back to
circular logging while you are upgrading TPC, particularly if you have a large database. This will help
prevent running out of disk space due to database changes made during TPC dbschema upgrade. To
make this change:
1. Stop the TPC data and device server services.
2. Change the database configuration:
a. Windows open a DB2 command prompt window and enter the commands:
db2 connect to tpcdb
db2 get db cfg for tpcdb >tpcdb_cfg_saved.log
db2 update db cfg for tpcdb using LOGARCHMETH1 OFF
db2 update db cfg for tpcdb using LOGARCHMETH2 OFF
db2 connect reset

b. Unix su to the DB2 instance owner (typically db2inst1), then run these commands:
db2 connect to tpcdb
db2 get db cfg for tpcdb >tpcdb_cfg_saved.log
db2 update db cfg for tpcdb using LOGARCHMETH1 OFF
db2 update db cfg for tpcdb using LOGARCHMETH2 OFF
db2 connect reset

3. Restart TPC services.


4. After your upgrade is completed, refer to the tpcdb_cfg_saved.log file to reconfigure logging
back to the original settings.
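The original setting can be pulled back out of the saved file. A sketch, assuming the db2 get db cfg output format where each parameter line ends with "= value":

```shell
# Extract the pre-upgrade LOGARCHMETH1 value from the saved configuration,
# then re-apply it with: db2 update db cfg for tpcdb using LOGARCHMETH1 <value>
orig=$(awk -F'= ' '/LOGARCHMETH1/ {print $2}' tpcdb_cfg_saved.log 2>/dev/null || true)
echo "restore with: db2 update db cfg for tpcdb using LOGARCHMETH1 ${orig}"
```

Note that when archive logging is re-enabled, DB2 typically places the database in backup-pending state, so plan for an offline backup before TPC is restarted.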

2.6.3 SCHEDULED_UPGRADES File


Make sure that you have a blank, empty file named SCHEDULED_UPGRADES in the <TPC>/data
directory. Most TPC releases create this file, but if you don't have it you should create it. This will
prevent a newly upgraded TPC server from attempting to automatically send out upgrade jobs to all
deployed agents. Agents should be upgraded in small batches to avoid having the server overwhelmed
with agent upgrade requests, which in extreme cases can prevent you from logging in to TPC.
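On Unix, the marker file can be created with touch. A sketch, assuming /opt/IBM/TPC as the <TPC> install directory (substitute your own path):

```shell
# Create the empty SCHEDULED_UPGRADES marker file if it is not already there.
TPC_HOME=/opt/IBM/TPC   # assumed install directory; substitute your <TPC> path
if [ -d "${TPC_HOME}/data" ] && [ ! -f "${TPC_HOME}/data/SCHEDULED_UPGRADES" ]; then
  touch "${TPC_HOME}/data/SCHEDULED_UPGRADES"
fi
```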

2.6.4 Stop TPC Workload


A common reason for difficulties with TPC upgrades is a TPC server that is busy with active workload
during the upgrade process. Before upgrading the TPC server, you should stop all TPC workload by
following this procedure:

1. Stop all jobs, including performance monitors, subsystem and fabric probes, and TPC for data
scans and probes.
2. If possible, reboot the TPC server. This will terminate any remaining TPC java processes that
may not stop in a timely manner. If it is not possible to reboot the TPC server, stop and restart
the TPC server services. It is important that the TPC Device Server service stop and restart
cleanly. If it does not, a server reboot is indicated.
3. Immediately after the TPC server and TPC services are started, start the TPC installation
program and apply the upgrade. Refer to the detailed upgrade instructions in the >>>TPC
Infocenter and/or the 'readme' for the upgrade you are applying.


3 TPC Data Source Overview


In order to make TPC ready to perform the functions it is designed to do, there are certain configuration
steps that are required to enable data gathering from the devices and servers that will be managed and
monitored. Different configuration steps are required to enable these different functions in TPC, and it's
important to understand what these configuration steps are.

3.1 Data Sources used in TPC


In releases prior to V3.3, TPC used four different types of agents to gather data about the devices and
servers that will be managed and monitored. In V3.3 and V4.1, two additional agents were added, and
the whole section that lists them is now referred to as "Data Sources". The TPC Servers and VMware VI
data sources were the new additions.
TPC 4.2 introduces the ability to use the storage subsystem native interface to collect data instead of a
CIMOM agent. This is true for IBM System Storage DS8000 systems, IBM System Storage SAN
Volume Controller, IBM Storwize V7000, and IBM XIV Storage Systems. These devices are not
discovered, but are configured in TPC using the new >>>Configure Devices Wizard.
Different combinations of data sources are required to effectively enable the functions of Data Manager,
Fabric Manager, Disk Manager, and Tape Manager. In addition to these manager functions, the topology
viewer is greatly affected by the proper discovery of all the managed entities in the management scope
of TPC.

3.1.1 SMI-S Providers (CIM Agents)


These agents are provided by the vendor of the storage device, fabric switch, or tape library. For storage,
they are needed for storage asset information, provisioning, alerting, and performance monitoring. For
fabric switches, they are only used (today) for performance monitoring. For tape libraries, they are used
for asset and inventory information.
Some definitions:
CIMOM - Common Information Model Object Manager
SNIA - Storage Networking Industry Association
SMI-S - Storage Management Initiative Specification
Each vendor of the storage, fabric switch, or tape library supplies unique CIM agent code for their
families of devices. These agents implement an SMI-S provider that conforms to the SNIA SMI-S
specification to provide a communication transport between TPC and the managed devices.

(Figure: a TPC server communicating with vendor storage devices through a vendor SMI-S CIM
proxy agent. TPC talks SMI-S XML to the proxy agent; the proxy agent uses the vendor's
proprietary interface to communicate with the devices.)

The CIM agent is an interpreter between TPC and the device. The TPC to CIMOM layer uses a SNIA
SMI-S interface using an XML transport for data and command interchange. The CIMOM to device
layer uses the proprietary interfaces provided by the device vendor to convert those commands and
responses from the SNIA XML language to a language that the device can understand and respond to.
These CIM agents are usually in a form referred to as "proxies", meaning that a separate code install is
required with the necessary configuration to identify the devices that the agent will communicate with.
A few more advanced devices have the CIM agent embedded in the device itself, as is the case with
Cisco fabric switches. In this case, there is no proxy agent to install, and TPC is configured to point to
the managed device itself. More information on CIMOMs, and a list of certified devices and
applications can be found at the >>>SNIA Conforming Providers site. Select the appropriate vendor link
for the list of devices supported.
The CIM agents can be referred to by a variety of names, including CIM agent, CIMOM, SMI-S
provider, etc. Each vendor has taken the liberty to refer to these SNIA SMI-S interfaces slightly differently.
Once a CIM agent is installed and configured, TPC can be configured to communicate with it. The
devices that are supported by TPC, with links to the supported CIM agents, can be found at the TPC
>>>Supported Products List: select the TPC component you're interested in, then the
Documentation link, and then the link corresponding to your TPC release.
More detailed information about the more common SMI-S agents supported by TPC can be found in
Chapter 4: SMI-S Providers Installation and Configuration on page 56.

3.1.2 Data Agents


These are the traditional TSRM agents. A Data Agent or Storage Resource Agent (see next section) is
installed on ALL computer systems you want TPC to manage. This is commonly referred to as an
"Agents Everywhere" methodology. These agents collect information from the server they are installed
on. Asset information, file and file system attributes, and any other information needed from the
computer system is gathered. Data agents can also gather information on database managers installed on
the server, Novell NDS tree information, and NAS device information. In TPC, you can create pings,
probes, and scans to run against the servers that have Data agents installed.

Data agents can be remotely installed by running the TPC installer from the TPC server machine.
This will install both the common agent and the Data agent. Data agents can also be installed locally on
the servers that will be managed. There is a nice enhancement in TPC V3.1.3 to recognize the correct
NIC that provides communication back to the TPC server.

3.1.3 Storage Resource Agents


TPC 4.1 introduced the Storage Resource Agent, which is a lightweight native agent available only for
Windows, Linux, and AIX platforms. It does not require the Tivoli Agent Manager and does not require
or install a Tivoli Common Agent. It provides traditional Data Agent probe-equivalent functionality and
data collection, but does not support batch reporting, filesystem scanning, or NAS discovery or
topology.
Storage Resource Agents can also be remotely installed by running the TPC installer from the TPC
server machine.

3.1.4 Inband Fabric agents


These are the traditional TSANM agents. They are installed on computer systems that have fiber
connectivity (through HBAs) into the SAN fabrics you want to manage and monitor. Inband Fabric
agents use scanners to collect information. The scanners are written in O/S native code, and
communicate through the HBA to collect fabric topology information, port state information, and zoning
information. They also can identify other SAN attached devices (if they are in the same zone). Using
O/S system calls, they collect information about the machine they are installed on.

Inband Fabric agents are discovered during the agent install process, and aren't discovered
separately through a TPC GUI Discovery task, nor is it possible to do so.
You can remotely deploy Inband Fabric agents from the TPC server. When you run the agent
installer from the TPC server to remotely deploy Inband Fabric agents, the common agent must
already be installed on the target machine, and registered to the Agent Manager that TPC is
associated with. If you install the Inband Fabric agent locally on the server, the common agent will
be installed for you.
Inband Fabric agents are now supported on the same platforms as Data agents. (This includes HPUX.)

Inband Fabric agents should be installed using a "well-placed" agent methodology, where you would
place an agent on at least one machine that has an HBA connection to each fabric that will be managed
by TPC; the best practice is to have two agents connected to each fabric for redundancy. Even
though a single Inband Fabric agent can collect all the topology for an entire SAN, you want to have a
backup agent on a server connected to a different switch in the fabric in case any switch becomes
isolated from the rest of the SAN.
One of the side effects (or features) of Inband Fabric agents is the HBA detail information they collect for
TPC. This HBA information is also used in TPC for the Data Path Explorer function. This might tempt a
customer to install Inband Fabric agents everywhere, which would be a bad practice. The placement of
Inband Fabric agents should be limited to the minimum necessary to collect topology information
through the SAN, allowing for redundancy. This will minimize the possibility of TPC causing critical
fabric path overloads.


3.1.5 Out of Band Fabric (OOBF) Agents


Out Of Band Fabric (OOBF) agents are used to collect topology information from fabric switches
through the IP network using SNMP V1 queries to the switches. These agents discover more or less the
same information as Inband Fabric agents, but from each switch, rather than the whole SAN. Best
practice states that you should have an OOBF agent pointing to each switch in each SAN fabric you are
managing. If the switches are behind private IP networks, as is normally the case with McDATA, you
won't be able to use OOBF agents. OOBF agents are required in order to collect zoning information for
Brocade switches (that's where the admin userid and password are needed), and VSAN information for
each Cisco switch.

TPC uses only SNMP V1 for these queries. If you register OOBF agents to switches that use
SNMP V2 or V3, the agent won't work. You will either have to reconfigure the switch to use
SNMP V1 if possible, or rely on Inband Fabric agents to collect switch information for these
switches.
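If you have the net-snmp command-line tools available, you can confirm from the TPC server that a switch actually answers SNMP V1 before registering it as an OOBF agent. A sketch; the community string "public" and the switch address are placeholders:

```shell
# Query sysDescr.0 (numeric OID, so no MIB files are needed) with SNMP V1.
# A timeout suggests the switch is not configured for V1, or the community
# string is wrong.
snmpget -v1 -c public -t 2 -r 1 switch.example.com 1.3.6.1.2.1.1.1.0 2>/dev/null \
  && echo "switch answers SNMP V1" \
  || echo "no SNMP V1 response - check switch SNMP version and community string"
```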

3.1.6 Cisco SAN environments


Cisco switches require special consideration when configuring for TPC.
You will need to have an Out-of-Band Fabric agent for each Cisco switch to collect VSAN
information. This information is used to correlate the switch to VSAN associations in the SAN
environment. Since Cisco switches use SNMP-V3 by default, each switch will need some
configuration changes described in the following section to configure a SNMPV1 path for TPC.
TPC Inband Fabric agents only have the scope of the VSAN they are connected into. This is because
each VSAN has its own Management Server. If you want to use TPC for zone Management, you
will need to have a TPC Inband Fabric agent on at least one server in each VSAN to be managed by
TPC.
To net this out: for proper management of the Cisco environment, each switch will require an Out-of-Band Fabric agent, and each VSAN will require at least one Inband Fabric agent.

3.1.6.1 Configuring Cisco MDS9000 switches for Out-of-band communication


Configuring a Cisco switch for Out-of-Band Fabric management is necessary to get VSAN definitions.
Note that Cisco switches are configured to use SNMP v3 by default and must be configured to direct
SNMPV1 responses to TPC. VSAN information for the entire physical infrastructure is gathered. The
following commands can be used from the Cisco switch CLI for configuration.
Telnet (or SSH) to the Cisco switch to enable SNMPV1 for the TPC Out-of-Band Fabric agent by
issuing the following commands:
show snmp
Use this command to view the current SNMP settings. The command can be made more specific
to provide details on a particular SNMP setting, such as show snmp community or show snmp
trap.
config terminal
Use the above command to enter into configuration mode. The following commands are issued
at the config prompt.
o snmp-server community TPCRO ro
Set the read-only community string. It has network-administrator access by default.
This sets TPCRO as the read only community name that you can use in TPC.
o snmp-server community TPCRW rw
Set the read-write community string. This is necessary to make sure VSAN information
gathered is not stale. It has network-administrator access by default.


This sets TPCRW as the read-write community name that you can use in TPC.
o snmp-server community TPCRO group network-operator
Issue this command for the community string TPCRO to give it network-operator
authority. By setting TPCRO to network-operator, you are specifying to use the
community string for SNMPv1 communication. If the community string is left with a role
of network-administrator, discovery in IBM TPC for Fabric will not work.
o snmp-server host <address> traps version 1 TPCRO udp-port 162
(Use the TPC Server IP for <address>)
Set the trap destination address so that the Cisco switch sends SNMPv1 traps using
TPCRO as the community and port 162 as the listening port for the host.
o Ctrl-Z to exit config mode
snmp-server enable traps [notification-type] [notification-option]
With no options, the command enables all traps on the switch.

Important: Enabling all traps on the switches is not recommended. This can lead to the TPC server
becoming unresponsive as a result of fabric probes being triggered by each alert. Instead, the SAN
administrator should refer to the appropriate Cisco documentation for the switch model for a list of
specific traps for which they want alerts (i.e., anything that may reflect a change in the fabric topology),
with the understanding that any traps they enable will trigger a fabric probe when the alert is received.
Next, in TPC, define a new Out-of-Band agent for each Cisco switch (where you ran the previous
commands). Specify the hostname or IP in the Host name field, and both the ro and rw community
strings in the SNMP Community field, separated by a colon with a space on each side (for example:
TPCRO : TPCRW). This is an overload of the field, and is one of the "TPC Secrets".

Remember that Out-of-Band Fabric (OOBF) agents only have scope of that switch. That means that an
OOBF agent will only return information pertaining to the switch itself, and cannot collect information
on other switches in the fabric. You will need to define OOBF agents for each switch in your managed
fabrics.


3.1.7 TPC Server data sources


TPC Server data sources are used to enable TPC Rollup Reporting, giving you scalable enterprise-wide
reports that roll up information from multiple IBM TPC installations (all TPC servers must be at the
same version level). This function consolidates asset and health information from multiple remote IBM
TPC instances. For installations that have multiple TPC Servers and want to consolidate information for
reporting, each TPC Server's repository data can be mined by the master TPC Server, and special
rollup reports can be generated.

3.1.8 VMware VI data sources


VMware VI data sources are used to register VMware Virtual Center V2 servers and/or VMware ESX
V3 servers to TPC. ESX Servers (hypervisors), VMware guest machines, and VMFS disks are identified
in Data Manager reports and the Topology viewer.

3.2 What sources do I need for what functions?


Each function in TPC requires at least one type of agent installed and/or configured to TPC for that
function to do its thing. These functions range from the simple (file system reports in TPC for Data) to
the complex (fabric performance reports in TPC for Fabric), based on the variety of agents and the
complexity of configuration required to get each function running.

3.2.1 Data Manager - Server Management


TPC for Data can manage servers with just a data agent installed on each server to be managed. These
agents are automatically recognized by TPC, and are listed under Administrative Services -> Data
Sources -> Data Agents. If you wish to see ESX servers and their guest machines, you will need to define
a VMware VI data source, and install data agents on each guest.

3.2.2 Data Manager - Storage Subsystem Asset Management


Data Manager can show storage subsystem capacity summary information and disk, storage group, and
LUN assets for storage devices. You will need to install a CIMOM agent that can communicate with
each storage device you want to manage, and then configure that CIMOM agent in TPC under
Administrative Services -> Data Sources -> CIMOM Agents.

3.2.3 Data Manager for Databases - Database Management


Data Manager for Databases requires a data agent to be installed on the server where the supported
RDBMS application is installed. In addition, the RDBMS must be configured under Administrative
Services -> Configuration -> License Keys. Double-click the magnifying glass next to TPC for Data -
Databases, and select the RDBMS Logins tab at the top of the frame. Add a new RDBMS login record
for each RDBMS to be managed.

3.2.4 Disk Manager - Reports and Provisioning of LUNs


All functions in Disk Manager require a CIMOM agent be installed and configured to communicate with
the storage devices that will be managed. This is the same requirement as listed in 3.2.2 Data Manager -
Storage Subsystem Asset Management above. In addition, if zoning services will be used during
volume provisioning, fabrics and zones must be discovered and zoning information kept up to date in TPC.
CIMOM agents in TPC can be found under Administrative Services -> Data Sources -> CIMOM Agents.


3.2.5 Disk Manager - Storage Subsystem Performance management


For any of the storage subsystems that you're managing with Disk Manager, you can create a Storage
Subsystem Performance Monitor job. Create it under Disk Manager -> Monitoring -> Subsystem
Performance Monitors. This requires a CIM agent that is not only SNIA SMI-S V1.1 compliant, but
also contains the Block Server Performance (BSP) subprofile in its CIMOM implementation. See
>>>SMI-S and Performance Management/Monitoring for more information on performance
management using SMI-S. Make sure the CIMOM agent for the storage device is defined to TPC, and
that the CIMOM agent shows a green status. Also, make sure you've run a CIMOM Discovery job since
the CIMOM agent was added to TPC.
Create the Subsystem Performance Monitor job:
1. Storage Subsystem tab - add the storage subsystem(s) that you want this job to control into the
right-hand panel (Selected subsystems:).
2. Sampling and Scheduling - select the minimum interval for "Gather data for <xx> minutes" as
the interval length. The Advanced button should show that data is being saved every 60
minutes. For duration, it is recommended that you choose 23 hours on a daily basis, or 167 hours
on a weekly basis:

Figure 1 - Sampling selection panel

3. Give your Subsystem Monitor a name and save it; you will be prompted to specify the name
when you save.
You should check the status of your newly created monitor, and make sure that it's running, and
continues to run. It will take 60 minutes to return the first samples of data, so don't expect any reports to
be ready until then.
To see the reports that are generated from this data, navigate to Disk Manager > Reporting > Storage
Subsystem Performance > Storage Subsystem. Click the Generate Report button. This will generate a
summary report for all the storage subsystems that have performance data collected.
You should pay particular attention to the Product Support Matrix PDF at the bottom of the IBM
TPC for Disk >>>Supported Products List. It contains information about which vendors' CIM agents
support performance data collection using the SMI-S Block Server Performance subprofile.

TPC support for Performance Management is indicated in the PM column of the Product Support Matrix.
In order for TPC to collect performance data using the CIM agent, the agent must be at SMI-S V1.1 and
implement the Block Server Performance subprofile. This subprofile is not required for SMI-S V1.1
certification. SNIA has developed a separate certification for the Block Server Performance subprofile
which vendors can use to demonstrate their ability to correctly collect performance data.

3.2.6 Fabric Manager - Reports and Fabric Zoning


Fabric Manager is probably the most critical and difficult configuration in TPC. Its configuration will
determine just how much you can do with TPC. You can have fabric agents installed on machines in
each fabric SAN that you'll be managing, and/or OOBF agents talking to each switch in the fabric
SANs. Having both is a good practice. Remember, OOBF agents are required for Brocade (zoning info)
and Cisco (VSAN info) switches. The zoning information is provided only by Fabric agents for all
switches except Brocade. So you can begin to see the need for both types, depending on your fabric
infrastructure.
With TPC V3.3, the switch CIMOMs can be used to collect topology information also. These CIMOMs
don't replace the need for inband and outband agents, since each has different information-gathering
capabilities.

The following table identifies the data that each of the fabric-oriented agents gathers for TPC:

Brocade (>>>Brocade SMI Agent Downloads)
- Inband Agent: topology data; RSCN*
- Out-of-Band Agent: topology data; SNMP state change alert; zone management
- CIMOM Agent: fabric WWN and switch info; switch FC ports; switch performance data; no state
change notification, topology, or zone management
- Special Info: zone management requires authentication through the Brocade API

McDATA (>>>McData SMI Agent Downloads)
- Inband Agent: topology data; RSCN; zone management
- Out-of-Band Agent: topology data; SNMP state change alert
- CIMOM Agent: fabric WWN and switch info; switch FC ports; switch performance data; no state
change notification, topology, or zone management
- Special Info: out-of-band might not be an option if switches are in a private network (default config)

Cisco (>>>CISCO SMI Agent Downloads)
- Inband Agent: topology data; RSCN; zone management
- Out-of-Band Agent: topology data; SNMP state change alert; VSAN information
- CIMOM Agent: fabric WWN and switch info; switch FC ports; switch performance data; no state
change notification, topology, or zone management
- Special Info: CIMOM is embedded in each MDS9000 switch

QLogic
- Inband Agent: topology data; RSCN; zone management
- Out-of-Band Agent: topology data; SNMP state change alert
- CIMOM Agent: no CIMOM support

* RSCN = Registered State Change Notification

3.2.7 Fabric Manager - Switch Performance Monitoring


In order to successfully configure and run a Switch Performance Monitor in TPC, you will need to have
a switch CIM agent installed and configured to TPC. This CIM agent must be SMI-S V1.1 compliant.
Refer to the >>>SNIA Conforming Providers site for more information.
Here's a list of the things you need to do to get Switch Performance Monitoring working:
1. Install Inband and Out-of-band Fabric agents as needed to discover the fabrics to be managed by
TPC.
2. Install the switch vendor's SMI-S Provider agent on a server if necessary. Remember, Cisco
MDS 9000 switches don't need a proxy agent installed, since it resides in the switch itself.
3. Register the CIM agent to TPC, save it, and verify that the agent status is green.
4. Run a CIMOM Discovery job so TPC can discover the switches that the CIM agent is configured
for.
5. Run an Out-of-Band Fabric Discovery job to collect fabric information for these agents.
6. Create a Fabric Probe for each fabric listed, and run the probe(s).
7. You will need to run another Out-of-Band Fabric (OOBF) discovery job before you set up a
Fabric Performance Monitor probe job, even if you don't have any OOBF agents. For switch
performance you must run an out-of-band fabric discovery after a CIMOM discovery in order to
collect the rest of the switch data that the switch performance function requires. You will need to
do this before you can successfully run a Fabric Performance monitor. (This special data
collector job will be moved to a Fabric probe in a future release of TPC.)
8. Create a Switch Performance Monitor, using the steps described in 3.2.5 Disk Manager -
Storage Subsystem Performance Management, substituting switches for storage subsystems.
You should check the status of your newly created monitor, and make sure that it's running, and
continues to run. It will take up to 60 minutes to return the first sample of data, depending on how you
configured it, so don't expect any reports to be ready until then.
To see the reports that are generated from this data, navigate to Fabric Manager > Reporting > Switch
Performance > By Port. Click the Generate Report button. This will generate a summary report for
all the switch ports that have performance data collected. You can then select one or more of them and
generate a graphical report for any of the columns that you wish, as shown below.

Figure 2 - History Chart: Total Port Data Rate

3.2.8 Tape Manager


Tape Manager support in TPC has been enhanced to generate reports on Tape Library assets. The SMI-S
Agent for Tape needs to be installed and configured to collect information from tape libraries. Once the
CIMOM discovery job completes, you can create a Tape Library Probe that will collect the asset
information (drives, media changers, I/O ports, cartridges, slots, etc.) for each tape library discovered.
These assets can be viewed by navigating to Tape Manager > Tape Libraries, and reports can be
generated for this same information.

3.2.9 Replication Manager


The Replication Manager (also referred to as TPC for Replication or TPC-R) can be launched either
from TIP or the TPC GUI. TPC-R provides a management interface to FlashCopy and Copy Services for
IBM storage subsystems, supporting high availability and disaster recovery.


4 SMI-S Providers - Installation and Configuration


TPC 4.2 introduces the ability to use the storage subsystem native interface to collect data instead of a
CIMOM agent. This is true for IBM System Storage DS8000 systems, IBM System Storage SAN
Volume Controller, IBM Storwize V7000, or IBM XIV Storage Systems.
CIMOM agents are required with other storage subsystems, and also with the above subsystems for TPC
versions 4.1.x and older.
This chapter documents the more common SMI-S Provider proxy agents provided by storage and
switch vendors. In order to properly configure TPC to use these CIM agents, you must install
them according to the vendor's instructions. What is included here is a condensed version of the
installation instructions for each SMI-S Provider. If more detail is needed, please refer to the appropriate
guide supplied by the vendor.

4.1 Some SMI-S Terminology


The terminology used with SMI-S technology is often confusing and inconsistently applied. It's important
to understand some basic terminology about SMI-S and how TPC relates to it.
SMI-S uses an architecture called the Common Information Model (CIM). Think of CIM in 3 layers, from
the bottom up:
1. The Provider is the device instrumentation. These providers use an embedded or proxy model
when implementing the Provider. They are the software components that are described in this
chapter. We refer to them as SMI-S Providers. They are also referred to as CIM agents.
2. The CIMOM is a middle layer capable of connecting to multiple Providers and responding to
CIM client requests. This layer is a combination of SMI-S Provider, data transport, and parts of
the TPC server. It encompasses the means of requesting data from a device, getting the data back
into TPC, and processing that data.
3. The CIM Client is the application leveraging CIM via the CIMOM. This is the TPC server. It is
the requester of information from managed devices.
Here are some formal definitions used within SMI-S circles that you might find useful.
CIM: This is the core standard SMI-S is built on: the Common Information Model, an object-oriented
description of the entities and relationships in a business's management environment, maintained by the
DMTF (Distributed Management Task Force). CIM is divided into a core model and common models.
The core model addresses high-level concepts (such as systems and devices), as well as fundamental
relationships (such as dependencies). The common models describe specific problem domains such as
computer system, network, user, or device management.
CIMOM: CIM Object Manager: An Object Manager is an entity capable of performing some
processing on information received from multiple managed pieces of equipment. It may also support the
persistence of information. An Object Manager may also be able to offload the processing of some
functions, such as enumeration, from a WBEM Client. Typically, an Object Manager does not
incorporate support for any specific devices, but defines an interface into which multiple Providers may
be plugged.
SLP: Service Location Protocol: SLP is a service discovery protocol that allows computers and other
devices to find services in a local area network without prior configuration. SLP has been designed to
scale from small, unmanaged networks to large enterprise networks.
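As an illustration of what SLP discovery hands back, WBEM CIM agents advertise service URLs of the form service:wbem:<protocol>://<host>:<port>. The snippet below parses such a URL with plain shell parameter expansion; the address shown is an example, not a real agent.

```shell
# Example SLP service URL for a WBEM CIM agent (address is hypothetical)
svc='service:wbem:https://192.168.35.46:5989'

# Strip everything up to and including "//" to isolate host:port
hostport=${svc##*//}
echo "CIM agent endpoint: $hostport"
```

The host:port extracted this way corresponds to the values you would enter when manually adding the CIMOM to TPC.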

SMI-S Client: An application using CIMOM services per what is defined in the SMI-S Profiles. The
TPC Server is an SMI-S Client.
SMI-S Provider: A Provider is an entity that communicates with managed objects (devices and
services) to access data and event notifications from a variety of sources, such as the system registry or
an SNMP device. Providers forward this information to the CIM Object Manager for integration and
interpretation.
WBEM: Web-Based Enterprise Management. Describes how to manage an enterprise using CIM and
other standards to form a complete enterprise management solution: a set of technologies that enables
interoperable management of an enterprise. WBEM consists of CIM, an XML DTD defining the tags
(XML encoding) used to describe the CIM schema and its data, and a set of HTTP operations for exchanging
the XML-based information. WBEM joins the XML data description language and HTTP transport
protocol with the underlying information model, CIM, to create a conceptual view of the enterprise.

4.2 General Configuration Guidelines


SMI-S Providers are needed by TPC to discover the devices that will be managed by TPC, and to collect
information from them. You will need an SMI-S Provider for the following types of devices:
- Storage Subsystems
- Fabric Switches
- Tape Libraries
The common steps that need to be completed to make these devices useable in TPC are:
1. Install the device vendor's SMI-S Provider according to the vendor's instructions, and
configure it to communicate with the device(s) to be managed by TPC. Instructions for
specific vendors' SMI-S Providers are included below. This step is only required if the SMI-S
Provider is a proxy agent.
2. (Optional) Do an SLP discovery. You can do this in the TPC GUI by configuring a CIMOM
Discovery to scan the local subnet, or query manually entered SLP Directory Agents. Click
the Options tab, and select the check-box to scan the local subnet. You can also enter any SLP
Directory Agents that have been configured. Be sure to uncheck the scan local subnet box
after you submit the job, but before you leave this panel. SLP is not something that you want
to perform every time you run a CIMOM discovery.
3. Manually add any SMI-S Provider agents to TPC that you still need. SLP may or may not
discover all the CIMOM agents that you need. Be sure to add any that are out of the scope of
SLP's discovery, and delete any extra ones that you don't want TPC to manage.
4. Add a userid and password to each SMI-S Provider that was automatically discovered. Save
your changes to confirm the authentication credentials. Once the entry is saved with the
proper authentication credentials, the CIMOM entry should show a green Connection
Status. If not, check all the values for this CIMOM entry.
5. Run a CIMOM Discovery again. Once each SMI-S Provider is authenticated, TPC will use
that agent to discover devices, and record those devices in the TPC repository.
6. Create a device probe for each device or device group. These probes will collect asset
information associated with the device, much like a data probe does for computers. The
picture below shows the device types you can create device probes for:

7. You can also create performance monitors for storage subsystems and fabric switches.
Follow the instructions in the previous chapter for each type of monitor.
It is always considered best practice to install each SMI-S Provider proxy agent on a separate
dedicated server. It may be technically feasible to install multiple SMI-S Provider proxy agents on a
single server, changing the ports as necessary, and being aware of any other co-existence considerations.
However, in a production environment, there is a strong possibility that you will run into CPU and
memory resource problems trying to collect asset and performance information from multiple devices
through the multiple SMI-S Provider proxy agents on the same server. It is also possible there will be
conflicts between providers that cannot be resolved through configuration. Be sure to consider this best
practice when you are sizing the machine requirements for a new TPC installation.
It is also not recommended to install any SMI-S Provider proxy agents on the TPC server. Doing
so will likely result in TPC server performance and stability problems.
There are some solutions for placing SMI-S Providers on existing machines that can save you using a
separate server:
- The SVC SMI-S Provider is pre-installed on the SVC Master Console.
- The LSI/Engenio SMI-S Provider can be installed on the same server that has the DS4000 Storage
Manager installed on it.
- Cisco MDS9000 switches have an embedded SMI-S Provider in the switch itself. The switch
firmware must be at V3.03 or later. There is no proxy agent needed for the Cisco switch.
- McDATA Directors (6140, etc.) come with a server running EFCM. The McDATA SMI-S Provider
can be installed on this server.
Below you will find specific instructions for the most common vendor-supplied CIMOM agents for
devices that TPC supports. In describing the installation steps, we cannot show graphics for the installer
panels because of copyright restrictions. Instead, each install panel is identified by title, and any special
selections or entries that need to be made are noted.

4.2.1 Namespaces for CIM agents


If you are using the IBM TPC GUI and want to enter namespaces manually for the CIM agents for the
switches, search the TPC technotes on the support page for "namespace list". This document has the
current, up-to-date list of supported namespace strings to use with TPC. The current list can be found at
the >>>TPC CIMOM Namespaces site.
If these namespaces don't appear to work, you should check your switch vendor's documentation for
the most current namespaces. If you specify the wrong namespace, one of the following errors can
occur:
- The connection test fails when the CIM agent is added.
- The discovery does not discover all information of the system managed by the CIM agent.
- The probe fails.
- The function you want to perform on the system might fail (for example, performance data
collection).

4.3 Performance Data Collection using SMI-S CIM Agents


Storage devices that are SMI-S V1.1 CTP compliant may or may not support performance data
collection. Performance management requires that the CIM agent provider implement the Block Server
Performance (BSP) profile. This is an optional profile for SMI-S V1.1 certification, and is not tested by
the SNIA V1.1 CTP certification tests. SNIA has added a certification test for this profile as of
November 2006. You should check the SNIA SMI-S CTP Certification website for each device that you
want to have TPC monitor for performance, and check that the BSP profile is listed as being tested.
Please refer to the TPC support page and the supported products document for the most current list of
storage devices that support performance management via SMI-S CIM agents.


4.4 IBM DS Open API CIM agent


The DS Open API provides a SMI-S CIM agent to enable the automation of configuration management
for IBM Enterprise storage using management applications like TPC. This complements the use of the
IBM TotalStorage Specialist and DS Storage Manager web-based interfaces and the IBM TotalStorage
ESS and DS CLI interfaces.
The DS Open API supports IBM TotalStorage DS8000, IBM TotalStorage DS6000, and the IBM
TotalStorage Enterprise Storage Server (ESS). The CIM agent is available for the AIX, Linux, and
Windows 2000 (or later) operating system environments.
If you refer to the TPC >>>Supported Products List, you will see that TPC works with the DS Open API
at V5.1, V5.2, V5.3 and V5.4 levels depending on the device, model, and firmware level. The 5.1
version is at the SMI-S V1.0.3 level; 5.2 is SMI-S V1.1 compliant; 5.3 and 5.4 are SMI-S V1.2
compliant.
There are two options for installing and using the DS Open API CIM Agent:
- installed as a proxy agent on a dedicated server (V5.1/V5.2/V5.3/V5.4.0)
- enabled as an embedded agent on the DS8000 HMC; this can be used to manage the DS8000
managed by the HMC
The CIM agents can manage the ESS, DS6000, and DS8000. There are some considerations that can
help you decide which version to use:
1. If a customer has a running CIM agent and is happy with the data that TPC is providing, there is
no immediate need to upgrade to a newer version.
2. For DS6000 and DS8000, any version listed as supported on the TPC >>>Supported Products
list can be used. In most cases, new TPC installations should use the latest supported version.
3. Late-model DS8000s have the CIM agent pre-installed on the HMC, and can manage that
DS8000 without a separate server being required. The CIM agent must be enabled by a CE
before it can be used.
4. Since the embedded CIM agent on the DS8000 HMC can only be used to manage that DS8000,
you must install and configure the DS Open API CIM agent proxy on a separate server for ESS
or DS6000. There is no embedded CIM agent solution for these devices.

4.4.1 DS Open API CIM Agent V5.2/V5.3/V5.4.0


The IBM TotalStorage DS Open API CIM agent V5.2/V5.3/V5.4.0 supports IBM ESS, DS6000, and
DS8000. It is a Pegasus-based application, and does not require the ESSCLI to manage an IBM ESS.
The Pegasus base is the strategic direction for IBM storage devices, and will be the agent that is
enhanced for future capabilities. This agent uses the Block Server Performance (BSP) subprofile to
collect performance data for DS6000 and DS8000 devices.

4.4.1.1 Enabling the DS Open API V5.2/V5.3/V5.4.0 CIM Agent on the DS8000
HMC
Beginning with the DS8000 release 2.4 microcode (bundles 6.2.400.x), the CIM Agent for DS Open API
is embedded in the Hardware Master Console and works with TPC V3.1.3 and above. If you're going to
use performance statistics, start with bundle 6.2.400.66 or later, which contains a fix for a
performance-statistics defect. You can use that CIM agent to manage the DS8000 that the HMC manages.
The CIM agent is disabled by default, and must be enabled manually. This requires a service call, since
an IBM CE will need to log into the WebSM management console to enable the CIM agent application.
In order to enable the CIM Agent on the HMC, a firewall change is required on the HMC. This will also
be done by the IBM CE during the service call. If the firewall change is not done, the CIM agent will
cause the HMC to freeze. Please see the >>>TotalStorage Hardware Freeze Alert for more information.

The default port for the HMC CIM agent is 6989, and the protocol is HTTPS.
You might want to install the DSCIMCLI on a different server to make the CIM agent configuration
easier when it's enabled on the HMC. See section 4.4.1.3 - Setting up the dscimcli utility for
instructions.

4.4.1.2 Installing the DS Open API V5.2/V5.3/V5.4.0 CIM Agent as a proxy agent
The IBM DS Open API CIM Agent install packages can be downloaded at the >>>DS Open API
Downloads site.
Unpack (zip or tar) the install package contents to a temporary location. Part of the install package is the
IBM System Storage DS Open Application Programming Interface Reference Version 1 Release 2
(GC35-0516-01). The guide can be found in the docs directory as installguide.pdf. Use this guide to
install and configure the DS Open API CIM Agent. During installation, you should accept the defaults
where applicable, and provide values when needed.
1. Launch the appropriate setup program (W2003\setup.exe to install on Windows).
2. A Java Virtual Machine startup window will appear. After some time, InstallShield will launch a
graphical window.
3. Welcome Screen - click Next.
4. License Agreement Screen - select "I accept" and click Next.
5. Destination Directory - specify the path for the code to be installed in. Click Next.
6. Product Space Check - watch and wait. If space is available, the next panel will appear
without intervention.
7. Server Communication Configuration - you can choose to have the CIM Agent communicate
with TPC using either HTTP (non-secure) or HTTPS (secure) communications, or both. It is
generally recommended to use HTTPS, which does not add any complexity when configuring TPC.
You can also change the port(s) to be used for the CIM Agent communication. 5988 is the
default port associated with non-secure CIMOM communications (HTTP), and 5989 is the
default port associated with secure CIMOM communications (HTTPS). Change the port only if
needed, and record the new port number and protocol for future reference. Recommended
selections: Communication Protocol=HTTPS and HTTPS port value=5989.
8. Configuration Parameters - this panel lets you specify the userid and password used to
authenticate to the CIM agent from TPC, and also to add the storage devices that will be
managed by the CIM agent. This panel is optional, as the information can be specified and/or
changed after installation using the DSCIMCLI command, as described in the following section.
It is recommended to supply the information now.
a. Username and password - you can specify a new username and password that will be
used to authenticate to the CIM agent.

b. Add/Modify/Remove storage devices - you can add the storage devices you want to
manage with this CIM agent. There are five fields to contend with when entering a
device:
i. Device type - available types are DS, ESS, and ESSCS.
ii. IP Address - the address of the first cluster for an ESS device, or the Master
Console address of the DS device.
iii. Alternate IP - the address of the second cluster for ESS devices.
iv. User Name - the device Administrator user name used to log into Specialist or
the Master Console.
v. Password - the password that corresponds to the user name specified above.
Click Next when done.
9. Install Preview - review your selections and then click Install.
10. Installation Progress - watch and wait.
11. Finish - you should see the messages "IBM System Storage CIM Agent for DS Open API 5.2
has been successfully installed." and "The IBM System Storage CIM Agent for DS Open API
5.2 Pegasus Server service was successfully started." Click Finish to exit the installer.
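Once the agent service is started, a quick reachability check against the host and port chosen during installation can save a configuration round-trip in TPC. This sketch uses bash's /dev/tcp pseudo-device so no extra tools are required; the host and port are placeholders for your environment, and a host with no listener simply reports closed.

```shell
# Probe a CIM agent host/port without extra tools (bash /dev/tcp).
probe() {
  host=$1; port=$2
  if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
    echo "port $port open on $host"
  else
    echo "port $port closed on $host"
  fi
}

# Placeholders: substitute your CIM agent's address and configured port.
probe localhost 5989
```

If the port reports closed from the TPC server but open locally on the agent host, suspect a firewall between the two.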

4.4.1.3 Setting up the dscimcli utility


The dscimcli utility is a tool that is used to configure the DS Open API V5.2/V5.3/V5.4.0 CIM agents.
You must install the version of dscimcli that matches the CIM agent version you are running. With this
tool, you can
1) Manage SSL Certificates used by the CIM agent
2) List, add, and remove devices from the CIM agent
3) List and change the configuration of the CIM agent
4) Manage userids and passwords for the CIM agent
If you installed the proxy agent version of the CIM agent, the dscimcli command was installed with the
package. The default location of this command will be in C:\Program Files\IBM\dsagent\bin.
If you are using the embedded HMC CIM Agent, you will want to install the dscimcli utility onto a
separate server, like your desktop or notebook machine, and configure the CIM agent remotely.
Instructions for setting up the dscimcli utility can be found in the >>>DS Open API 5.x Reference.
Refer to Chapter 5, Installing and configuring the dscimcli utility for complete setup instructions.
There is no installer for this utility. It must be set up manually. Here is the short list of steps:
1. Download the installation package (e.g., DSCIMCLI.zip) from the 'Download' link at
>>>DSCIMCLI Downloads.
2. Extract the contents of the package to a permanent destination directory. There will be a resulting
directory for each of the platforms supported. For .zip files, an alternative is to use a zip utility
like Winzip, and extract just the platform directory you want to use.
3. Once it's extracted, create a DSAGENT_HOME environment variable that points to the utility
location. An example:
set DSAGENT_HOME=C:\work\dscimcli\W2003

For Windows, go to Start > Control Panel > System, then select the Advanced tab. Click the
Environment Variables button, and add a new entry for DSAGENT_HOME.
4. Add the bin directory of the DSAGENT_HOME to your path. For Windows, go to
Start > Control Panel > System, then select the Advanced tab. Click the Environment
Variables button, and modify the PATH statement there.

5. Open a command prompt, and try a dscimcli command using the full syntax of the command to
access the HMC instance of the CIM agent using the server (-s) and userid (-u) parameters:

C:\>dscimcli -s https://192.168.35.46:6989 -u myuser -p mypass lsdev
Type  IP              IP2             Username
===== =============== =============== =========
DS    192.168.35.46                   admin
C:\>
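On Linux/UNIX platforms, steps 3 and 4 above have a direct shell analog. This is a sketch only: the extraction path below is an example, and on Windows you would use the System control panel as described above.

```shell
# Point DSAGENT_HOME at the platform directory extracted from the package
# (example path) and put its bin directory on the PATH.
export DSAGENT_HOME=/opt/dscimcli/linux
export PATH="$DSAGENT_HOME/bin:$PATH"
echo "dscimcli will be searched in: $DSAGENT_HOME/bin"
```

Adding these lines to the shell profile makes the setting permanent, matching the persistent environment-variable entry on Windows.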

4.4.1.4 Configuring the DS Open API V5.2/V5.3/V5.4.0 CIM agent


With the V5.1 CIM agent, there were a couple of commands used to configure users and devices. With
the V5.2/V5.3/V5.4.0 CIM agents, this capability has been enhanced and consolidated into a single
command utility: dscimcli. As mentioned and described in the previous sections, it is either installed
with the proxy CIM agent, or must be manually installed and configured. You should use this command
to change the user authority to the CIM agent, and add, modify, or remove devices from the CIM agent
configuration. All of the commands and parameters for the dscimcli utility are documented in the
>>>DS Open API 5.x Reference in Chapter 6.
First, you will need to change the user access to the CIM agent:
1. Add a unique user to access the CIM agent. In the following example, we will create a new CIM
agent userid, mycimuser, with a password of mycimpw. The -s option specifies the server location of
the CIM agent. The -u option specifies the userid/password of the CIM agent used for authentication. In
this example, we will use the default userid and password of superuser and passw0rd.

dscimcli -s https://myserver.mycompany.com:5989 -u superuser:passw0rd mkuser mycimuser -p mycimpw

2. Change the password for the default CIM agent userid superuser. This will protect the CIM
agent from unauthorized access. In the example below the new password is changedpassword.

dscimcli -s https://myserver.mycompany.com:5989 -u superuser:passw0rd chuser superuser -password passw0rd -newpassword changedpassword

3. List the users defined for the CIM agent. The lsuser command will list the userids that are
defined to the CIM agent.

dscimcli -s https://myserver.mycompany.com:5989 -u superuser:changedpassword lsuser
Username
=========
superuser
mycimuser

Add or remove devices that will be used by the CIM agent:

1. Add new devices to the CIM agent. As each device is added, a verification routine uses the
specified values to verify that the device can be contacted.
The ds_hmc_userid must be a valid logon for the DS8000 Storage Manager on the HMC, and
must not be locked. You should make sure you can use this userid to log into the DS8000
Storage Manager (http://<HMC_IP>:8451/DS8000/Login) before you try to add the HMC's
device(s) to this CIM agent.

dscimcli -s https://myserver.mycompany.com:5989 -u mycimuser:mycimpw mkdev 192.168.35.46 -type ds -user <ds_hmc_userid> -password <ds_hmc_password>
Device successfully added.

2. If you need to remove existing devices from this CIM agent, you should use the rmdev
command.

dscimcli -s https://myserver.mycompany.com:5989 -u mycimuser:mycimpw rmdev 192.168.35.46 -type ds

3. List the devices defined to the CIM agent.

dscimcli -s https://myserver.mycompany.com:5989 -u mycimuser:mycimpw lsdev
Type  IP              IP2             Username
===== =============== =============== =========
DS    192.168.35.46                   admin
ESS   192.168.35.175  192.168.35.176  admin
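After adding a device, a scripted check that it actually appears in the lsdev output can be handy. In the sketch below the output is a captured sample for illustration; in practice you would pipe the real dscimcli lsdev command into the same grep.

```shell
# Captured lsdev output (sample); in practice substitute:
#   dscimcli -s https://myserver.mycompany.com:5989 -u mycimuser:mycimpw lsdev
lsdev_output='Type  IP              IP2             Username
===== =============== =============== =========
DS    192.168.35.46                   admin
ESS   192.168.35.175  192.168.35.176  admin'

dev_ip='192.168.35.46'
if printf '%s\n' "$lsdev_output" | grep -q "$dev_ip"; then
  echo "device $dev_ip is defined to the CIM agent"
else
  echo "device $dev_ip is missing from the CIM agent" >&2
fi
```

A check like this can be looped over a list of expected device IPs before pointing TPC at the CIM agent.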

4.4.2 DS Open API CIM Agent V5.4.1


The DS Open API V5.4.1 is an embedded-only deployment; it auto-enables, auto-configures, and adds
a watchdog feature. It also allows support personnel remote access to logs for troubleshooting via
dscimcli commands. From this version forward, there is no proxy agent support. Other differences:
1. The dscimcli mkdev and rmdev commands are not supported, and are not needed, because the
CIM agent is pre-configured to the device on which it is embedded.
2. You must use the 5.4.1 dscimcli with this CIM agent, and it is only supported on Windows and
SUSE Linux platforms.
The embedded CIM agent must still be enabled on the device, as with earlier versions.

4.4.3 Configure TPC for the DS Open API CIM Agent


Once you have completed the configuration steps above, you are ready to configure TPC to add the CIM
agent. You will need to have the following information available when you add the CIMOM:
- IP Address or FQDN of the CIM Agent server
- Userid for the CIM Agent (use the one you created above)
- Password of the userid for the CIM Agent (created above)
- Port number (the port configured for the CIM agent, usually 5988 or 5989, or 6989 if the CIM agent is
enabled on the HMC)
- Protocol (HTTP or HTTPS; HTTPS for the CIM agent enabled on the HMC)
- Interoperability Namespace (/root/ibm)
- Description of the CIM Agent (example: IBM DS8300 RQVPDS83)


4.4.4 Scalability / Best Practices Guidelines for the DS Open API


TotalStorage development has provided two rules of thumb to follow for DS Open CIM Agents. This
information provided is a conservative estimate based on limited testing; these guidelines will be refined
as further testing is done.
No more than 10 devices per CIMOM. Additional storage devices add communication delays and
increase the risk that slow devices will negatively affect probe times.
No more than 10,000 volumes per CIMOM. There is a direct correlation between the amount of
memory used and the numbers of volumes managed.
Avoid managing devices in different domains/subnets with a single proxy CIMOM. If possible
install and configure a proxy CIMOM in each domain/subnet where you have devices to manage.
This will help prevent CIMOM communication/timeout issues from affecting all managed devices
when a problem develops in one domain/subnet.
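The two rules of thumb above can be captured in a tiny planning check. This is only a sketch; the device and volume counts in the example are made-up values, not measurements:

```python
# Sanity-check a planned CIMOM layout against the DS Open API rules of thumb:
# no more than 10 devices and no more than 10,000 volumes per CIMOM.

def check_cimom_plan(devices_per_cimom, volumes_per_cimom):
    """Return a list of warnings for any rule of thumb exceeded (empty list = OK)."""
    warnings = []
    if devices_per_cimom > 10:
        warnings.append("more than 10 devices per CIMOM")
    if volumes_per_cimom > 10000:
        warnings.append("more than 10,000 volumes per CIMOM")
    return warnings

# Example: 12 devices is over the limit, 8,000 volumes is not.
print(check_cimom_plan(12, 8000))  # ['more than 10 devices per CIMOM']
```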


4.5 IBM XIV


The IBM XIV Storage System is designed to be a scalable enterprise storage system based upon a grid
array of hardware components. TPC 4.1 added support for this storage subsystem.
The IBM XIV uses an embedded CIMOM running on the HMC. The CIMOM has to be enabled and
started by a CE, and should be preconfigured for the device being managed. SMI-S support for XIV
requires code level 10.1 or higher.
At the time of this revision, the XIV CIMOM does not yet support TPC Performance Monitoring.
The IBM XIV console runs on a Windows server, and provides the XCLI configuration tool, which can
be run both as a GUI and as a command-line tool.

4.5.1 Understanding how authentication works in XIV CIM 10.1


The authentication process for TPC-to-XIV communication is a two-step process.
1) The smis user created for and entered by TPC for the CIMOM user is authenticated. If this step
succeeds, authentication continues with step 2.
2) The internal XIV smis_user account is authenticated with its pre-defined password. This password
can only be reset by an IBM XIV support engineer. (Note: the smis_user account cannot be used to
configure the TPC CIMOM login.)

4.5.2 Steps to configure XIV in TPC


1) Start XCLI. On Windows, you can select Start -> All Programs -> XIV -> XCLI, or open a command
prompt window, change to c:\Program Files\XIV\GUI10, and start XCLI from there. You will be
prompted to enter the admin user account and password (default id=admin, pw=adminadmin),
and the IP address of the XIV machine.
xcli -w

2) Create a user account for TPC to use to talk to the XIV CIMOM:
xcli> smis_add_user user=tpcadmin password=mynewpw password_verify=mynewpw
xcli> smis_list_users
Name
tpcadmin
superuser

3) XIV has 3 management modules running SMI-S, each of which needs to be configured as a CIMOM
in TPC. The management modules are XIV modules 4, 5, and 6, and each has a corresponding IP address.
Go through the CIMOM login screen to add each IP address using the user account and password created
in step 2. The namespace should be /root/ibm, and the standard port to use is https port 5989.
4) Verify that Test Connection completes successfully.

5) Run a CIMOM discovery job to discover the storage device.
6) Run a TPC probe job against the XIV storage subsystem to collect detailed device information.
>>>XIV Information can be found in Appendix E.


4.6 LSI SMI-S Provider for DS4000 devices


The IBM DS4000 family of storage requires an LSI SMI-S Provider. This CIM agent is available from
the LSI Logic website at >>>LSI SMI Provider Downloads.
The LSI SMI-S Provider allows partners to easily include IBM DS4000 storage products in their SAN
management solutions. The TPC >>>Supported Products List for your release will show you the
currently supported version that you should install.
DS4000/5000 subsystems at microcode level V7.60.x require LSI SMI-S Provider V10.06.GG.33 for
performance monitoring support. This requires TPC 4.1.1.66 or higher, and uses the interop
namespace /root/PG_interop.
This version of the provider should not be used with microcode levels below V7.50.x, as this can
initiate a reboot of the controller.
The newest versions of the LSI SMI-S Provider (10.10.xx.yy and later) use a new management utility
called ArrayManagementUtility instead of the legacy providerutil (which is still used by older
versions of the provider).

4.6.1 Installing the LSI SMI-S Provider


The LSI SMI-S Provider CIM Agent can be installed on the same server that has the DS4000 Storage
Manager installed.
Each CIM agent can manage up to three DS4x00 devices and 512 volumes. You'll have to
install additional LSI SMI-S Providers on other machines to manage more than three
devices.
o From the LSI readme: The maximum configuration recommended for the SMI provider is
either 3 storage systems or a total of 512 volumes, whichever is reached first. If the
limits are exceeded, the provider may not be able to generate performance statistics. For
best performance of the provider, a minimum of 1 GB heap memory is recommended.
If you are upgrading from a prior version of the LSI SMI-S Provider, you will need to uninstall
the old version of the CIM agent first. See the readme for the CIM agent for more information.
To install the LSI SMI-S Provider, follow these instructions:
1. Download the currently supported version of the SMI-S Provider from the LSI Logic Storage
Download page at >>>LSI SMI Provider Downloads.
2. Important: before running the installation program, check to make sure that ports 5988/5989 are
not in use by another application. If these ports are in use, you should identify and stop the other
application during the installation. You can re-configure the SMI-S provider to use different
ports, but only after installation (see 4.6.2.2).
3. Execute the .exe file from the package you downloaded (xx.xx.xx.xx corresponds to the Provider
version you are installing) to start the CIM agent installer.
4. Engenio SMI-S Provider Welcome page Click Next.
5. License Agreement Click the radio button for I accept , and click Next.
6. Where Would You Like to Install? Change and/or verify the destination folder and click
Next.
7. Please Review the Following Before Continuing Review the information and click Install.
8. Installation Status watch and wait.

9. Enter IPs and/or Hostnames to be discovered at Provider startup - click the Add New
Entry button. These addresses can also be modified or added after installation.
10. Congratulations! - Click Done. Your computer should be rebooted to complete the install
process.

4.6.2 Modifying the LSI SMI-S Provider CIM Agent configuration


There are three configuration items that can be changed for the LSI SMI-S Provider:
1. Devices can be added or removed from the CIM agent
2. Listening Ports can be changed
3. Authorization can be enabled or disabled

4.6.2.1 Adding or Removing a device for the CIM agent


For SMI provider versions 10.10.xx.yy and newer, you use the >>>LSI ArrayManagementUtility
tool to add and remove devices from your CIM agent. The tool is provided as a .zip package which you
can extract to whatever location you prefer. On Windows, the .exe in the package is executed directly to
start the tool. The usage is very similar to the legacy providerutil tool.

For older SMI provider versions, you use the providerutil tool to add and remove devices from your
CIM agent. On Windows, it is located in C:\Program Files\EngenioProvider\SMI_SProvider\bin. On
AIX, it is located in /opt/engenio/SMI_SProvider/bin.
For Windows, execute the providerutil.bat command from a command prompt. For Unix, execute the
providerutil command from a terminal window. Enter the values requested, and specify 1 to add a
device node, and 2 to remove a device node.

C:\Program Files\EngenioProvider\SMI_SProvider\bin>providerutil
Input CIMOM Username:<CIMOM username>
Input CIMOM Password:<CIMOM password>
Input Port[ 5988 ]:
Input Operation
1) addDevice
2) removeDevice
3) Add credentials for an array
Please Input 1, 2, or 3:1
Input device DNS-resolvable hostname or IP address:<DS4000 node IPaddr>
Input Array Password (default is blank):
Attempting extrinsic method call.
The extrinsic call succeeded.
C:\Program Files\EngenioProvider\SMI_SProvider\bin>

4.6.2.2 Changing the CIM agent HTTP and HTTPS port


You can change the ports that the LSI SMI-S Provider listens on by modifying the file C:\Program
Files\EngenioProvider\SMI_SProvider\bin\portInfo.properties. On AIX, it is located in
/opt/engenio/SMI_SProvider/bin.
CIM-XML=5988
HTTPS=5989

Once you've changed these port values and saved the portInfo.properties file, you must stop and restart
the LSI SMI-S Provider. Wait until the netstat command verifies that the new ports are LISTENING
(netstat -an | find "LISTENING" on Windows, or netstat -an | grep -i listening on UNIX). This can take a few minutes.

4.6.2.3 Configuring the CIM Agent on machines with multiple IP addresses


In order for SLP to work correctly on machines with multiple IP addresses, create a file named
"systemIPs.txt". It should be created in the product installation directory ..\SMI_SProvider\bin. This
differs by operating system. Edit this file and list one or more of the known IP addresses of the server,
each one on a separate line, in order of precedence (most preferred address on top). When the CIMOM
is started it tries to use each address in the file in the order they are listed. It stops when it finds one it
can use and will use that address from then on. If the CIMOM is run on a server with multiple IP
addresses and this process is not followed, it will randomly and inconsistently select one of addresses.
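For illustration, a systemIPs.txt for a server with two NICs would contain just the addresses, one per line, most preferred first (the addresses shown here are made-up examples):

```
192.168.10.21
10.0.5.21
```

With this file in place, the CIMOM binds to 192.168.10.21 if it is usable, and otherwise falls back to 10.0.5.21.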

4.6.2.4 Enabling authorization for the CIM agent


The LSI SMI-S Provider is shipped with authentication turned off by default. In order to turn
authentication on, you must stop the Provider service and edit the cimom.properties file in the
wbemservices/cimom/bin folder. Uncomment the entry:
org.wbemservices.wbem.cimom.pswdprov=cimprovider.security.BasicUserPasswordProvider

Next, make sure that all other lines are commented out with a # in column 1. Save the file, and restart
the CIM agent.
When BasicUserPasswordProvider authentication is enabled, domain and local users are allowed to
authenticate on Windows machines. If operating over VPN, domain authentication does not function.
Only local users are allowed to authenticate on Unix machines.


4.6.3 Configure TPC for the LSI SMI-S provider


Once you have completed the installation steps above, you are ready to configure TPC to add the CIM
agent. You will need to have the following information available when you add the CIMOM:
IP Address or FQDN of the CIM Agent server
Username of the CIM Agent
The username can be any value if you haven't enabled authentication. Otherwise, it must be a
valid operating system userid.
Password of the CIM Agent
The password can be any value if you haven't enabled authentication. Otherwise, it must be the
password corresponding to the userid specified above.
Port number (5988)
Protocol (HTTP)
Interoperability Namespace (/interop, /root/PG_interop for V10.06.GG.33 and higher)
Description of the CIM Agent (example: IBM DS4300 RQQVDS43)


4.7 SAN Volume Controller SMI-S CIM Agent Configuration


The SMI-S CIM Agent for SAN Volume Controller is installed and runs as part of the SAN Volume
Controller Console. The SVC CIMOM communicates with the SVC Master Console to control the SVC
clusters that the Master Console controls. While you can use the default userid/password for the SVC
master console, it is best practice to create a separate administrator userid for use with TPC.
Refer to the TPC >>>Supported Products List to find the recommended version of SVC for your TPC
release. For TPC V3.3, the SVC Master Console should be at V4.1.1.543 or later. There is a defect in the
SVC CIMOM code at earlier levels that will cause problems when running probes for large SVC
clusters.

4.7.1 Create a userid for TPC


1. Login to the SAN Volume Controller console (http://your-svc-console-IPaddr:9080/ica) with a
superuser account
2. My Work (left panel) Click Users.
3. Viewing Users Select Add a user in the drop down under the Users panel and click Go.
4. Introduction An introduction screen is opened. Click Next.
5. Define Users Enter the User Name and Password (twice). Click Next.
6. Assign Administrator Roles Select your candidate cluster(s) and move it to the right under
Administrator Clusters. Click Next.
7. Assign Service Roles Click Next.
8. Verify user roles Click Finish after you Verify user roles
9. Viewing Users - the newly created Administrator should appear in the list.
The SVC userid that will be used to communicate with the SVC must have superuser, administrator,
or equivalent read/write authorization on the SVC. This level of authorization is required to manage,
manipulate, and configure the storage device, as well as to gather performance data. Read-only
authorization (Service Roles) is insufficient for all but basic inventory collection tasks.

4.7.2 Verify that the SVC CIM agent is running


Even though the SVC SMI-S CIM agent is installed on the SVC master console, you should check to
make sure that the SVC SMI-S CIM agent is running. You can easily do this with a telnet command.
The SVC Master Console CIM Agent listens on port 5999. Issue the command from any command
prompt or terminal session:
telnet <SVC Master Console IPaddress> 5999
If the CIM Agent is running, you will see a blank screen with the cursor in the upper left corner.
You should be able to CNTL-C to terminate the command.
If you get the message: Connecting To 192.168.1.126...Could not open connection to
the host, on port 5999: Connect failed, the CIM agent is stopped, or the default port
has been changed. Try restarting the CIM agent on the SVC Master Console server.
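If telnet is not available on your workstation, the same check can be sketched in a few lines of Python; the hostname in the example comment is a placeholder for your SVC Master Console address:

```python
# Minimal TCP port check, equivalent to the telnet test described above.
import socket

def cim_agent_listening(host, port=5999, timeout=5):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder address):
# cim_agent_listening("svc-console.example.com")
```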

4.7.3 Configure TPC for the SVC CIM Agent


Once you have completed the verification step above, you are ready to configure TPC to add the CIM
agent. You will need to have the following information available when you add the CIMOM:
IP Address or FQDN of the SVC Master Console

Username of the CIM Agent (SVC administrator)


Password of the CIM Agent (SVC administrator password)
Port number (port number configured for the SVC CIM agent, usually 5999)
Protocol (usually HTTPS)
Interoperability Namespace (/root/ibm)
Description of the CIM Agent (example: IBM SVC RQVPSVC1)

4.7.4 Memory considerations for the SVC 3.1 and 4.1 CIM Agents
By default, the SVC CIM Agent is configured with a maximum heap memory size of 512MB. This may
not be enough memory to accommodate some management configurations.
To determine the amount of RAM required for your environment, use the following calculations:
For each SVC cluster being managed, the amount of memory required in megabytes will be the greater
of the following two formulas:
1. number of virtual disks times 0.02 times average striping density
OR
2. number of virtual disk host mappings times 0.17
Notes:

Striping density is the number of managed disks that each virtual disk is striped across. If
you are unsure of the striping density, plan on 8 to 12.
If you are unsure of the number of virtual disk host mappings, then estimate this number
by taking the number of host SCSI ports attached to an SVC cluster and multiplying by the
average number of virtual disks per host you have or plan to have.
These calculations presume one System Resource Manager making requests to the CIM
Agent at a time.

Once the total amount of memory is determined, if this value is greater than 768MB then modify the
/Program Files/IBM/svcconsole/cimom/cimom.bat file, changing the -Xmx768m flag (the numeric
value might be different depending on the version of SVC console installed) to the appropriate megabyte
value for this CIMOM. If the calculated value is greater than your configured RAM, the CIMOM may run out of
memory while processing some CIM client commands. It will then be necessary to reconfigure the
CIMOM management topology or cluster configuration to correct this problem. This means that
multiple master consoles may need to be deployed in your data center to handle the management needs
for your clusters. In the case where multiple, very large SVC clusters are being managed, you may need
to deploy one master console per SVC cluster.
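The calculation above can be sketched as a small worked example; the input values below are assumptions for illustration, not measurements from a real cluster:

```python
# Estimate the SVC CIM Agent heap requirement (in MB) for one managed
# cluster: the greater of the two formulas described above applies.

def svc_cimom_heap_mb(virtual_disks, striping_density, host_mappings):
    """Return the estimated heap in MB for one managed SVC cluster."""
    by_striping = virtual_disks * 0.02 * striping_density
    by_mappings = host_mappings * 0.17
    return max(by_striping, by_mappings)

# Example: 2000 virtual disks striped across 10 managed disks,
# with 3000 virtual disk host mappings.
print(round(svc_cimom_heap_mb(2000, 10, 3000)))  # prints 510
```

In this example the host-mappings formula dominates (510MB versus 400MB), so the default 512MB heap would only just suffice; any growth in the configuration would call for raising -Xmx.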

4.7.5 SVC embedded CIMOM


If you are migrating from the proxy CIMOM running on the SVC console to the embedded SVC
CIMOM, refer to the migration technote for details.


4.7.6 Disable TPC services when not using TPC on SSPC


If you are only using the SVC console on a SSPC because you already have a TPC server in your
environment, you can (and should!) stop and disable the TPC services to improve performance of the
SVC CIMOM and master console.

4.8 IBM N-Series


1. Install the Data ONTAP SMI-S agent:


2. Start the C WBEM Server:


*This window must remain open for the SMI-S agent to work.


3. Set the JAVA_HOME environment variable:

If for some reason the JAVA_HOME variable is not picked up, you may need to perform a workaround:
ensure the JAVA_HOME variable is properly set, and modify the execution scripts in /Program
Files/ws/bin accordingly. (Screenshots of the scripts before and after the change are not reproduced here.)

4. You are now ready to add the subsystem(s) to the SMI-S agent.


5. Run the list command to ensure the subsystem was added:

6. Test to ensure you are querying the subsystem and getting valid data back:

7. The SMI-S agent can now be added to TPC.


4.9 IBM Tape Libraries


TPC V3.3.x has been enhanced to provide Reporting under Tape Manager. With this function, you can
generate asset reports on the tape libraries managed by TPC, and also export that data in CSV format for
use elsewhere.
The IBM System Storage Tape Library SMI-S Agent 1.3.0 supports both IBM 3584 and IBM 3494 Tape
Libraries. The CIM agent can be installed on the following systems.
Formal Support
SUSE Linux Enterprise Server 9
Redhat Linux Advanced Server 3

4.9.1 Tape CIM agents


Tape CIMOMs generally will support multiple tape subsystems from the same vendor. Limitations
might be placed on the Tape Library type. For example, the IBM System Storage Tape Library SMI-S
Agent 1.3.0 supports the IBM 3584 and 3494, but the IBM 3310 is not supported by this CIM agent, and
has its own embedded CIM agent. As with the disk storage subsystem CIM agents, you must check
each agent's list of supported devices and scalability limitations.
A Tape CIMOM in one location can collect data from a tape subsystem across the network, but this is
not recommended: the CIMOM uses UDP-based SNMP, and in a busy network UDP traffic tends to be
the first to be dropped. The recommended configuration is a CIMOM on the same switch/router as the
tape library. A WAN configuration will work, and our normal testing environment is a WAN with the
CIMOM in Oregon or China connecting to devices in Arizona (but that's on the IBM intranet network);
in other production environments we have seen problems with dropped SNMP packets between the
library and CIMOM.

4.9.2 Are there any resource issues to be aware of when contemplating a large
environment?
Our rule of thumb is 6,000 cartridges per IBM System Storage Tape Library SMI-S Agent 1.3.0. So that
can be one big library or lots of little libraries.
For TS 3310, there is no CIM Agent group inside IBM, because it is an OEM'd Adic i500. The firmware
with the embedded CIM Agent for TS 3310 is developed by ADIC/Quantum.
TPC actually supports only 3584 and 3310. The 3584 is supported by the IBM Tape Library CIM Agent.
The 3310 is written by Adic and included into their firmware.

4.9.3 TPC Supported Tape Libraries


Tape libraries supported by each TPC release are listed in the support matrix on the support website at
the TPC >>>Supported Products List (some libraries support Discovery and launch of the Specialist only).
TPC is designed to work with every SMI-S 1.1 SMI Profile certified CIM Agent.


4.10 EMC SMI-S Provider Configuration


EMC Clariion and Symmetrix devices can be managed and monitored by SMI-S compliant management
applications, such as IBM TPC V3.3. EMC Clariion or Symmetrix devices are discovered and managed
through this EMC SMI-S Provider.
The EMC SMI-S Provider SMIKIT-WINDOWS-SE632_20.exe is the current supported level, and
provides a SNIA SMI-S V1.1 level of functionality. This is the SMI-S Provider version 3.1.2
maintenance release for SMI-S on WINDOWS that was released with Solutions Enabler 6.3.2.20. It can
be downloaded at >>>EMC PowerLink (this page requires a signon - you can register for one if you
don't have one). There are also installers for Linux and Solaris. We'll use Windows as our example
platform. The general concepts extend to UNIX as well.
platform. The general concepts extend to UNIX as well.
With the SE632-20 version, TPC can provide:
Data Manager
o Asset reporting by Storage Subsystem
o Capacity reporting by Storage Subsystem
Disk Manager
o View volumes allocated on the managed device
o Allocate/provision new LUNs to servers. The limitation that exists today is that you
cannot specify a LUN name or name prefix. You must let the LUN name default.
o Storage Subsystem alerts
o Storage Subsystem reports
o Storage Subsystem Performance Reports
You can also download the >>>EMC 4.1 SMI-S Provider
Release Notes. This document has all the instructions for installing and customizing the SMI-S Provider
V3.1.2. You will need this document if you wish to do any advanced configuration of the CIM agent.

4.10.1 Install the EMC SMI-S Provider


The Solutions Enabler server must be a Windows host and must have an HBA with fiber connectivity to
the Clariion with the proper zoning configured from the server to the Clariion device.

The SMI-S Provider V3.1.2 does not require a license key from EMC. It contains the portions of
the EMC Solutions Enabler V6.3.2 code that the CIM agent needs. A separate licensed version of
Solutions Enabler is not required.
Managing a Clariion device using the EMC SMI-S Provider requires the following additional
software and conditions:
The Clariion device requires initial configuration to define the managed host systems.
Managing a Symmetrix device using the EMC SMI-S Provider requires the following additional
software and conditions:
The server that will run the SMI-S Provider software must have a fiber connection to the
same SAN the Symmetrix device resides in and be zoned to that device.
Visibility to the gatekeeper LUNs on the Symmetrix devices being managed from the
server that the SMI-S Provider is installed on.

Install the EMC SMI-S Provider V3.1.2 on the desired server. The following instructions describe the
actions necessary to install the SMI-S Provider on a Windows system.
1. Launch the SMIKIT-WINDOWS-SE632_20.exe program to start the install process.
2. WinZip Self-Extractor SMIKIT-WINDOWS-SE632_20.exe Click Setup. The WinZip
Self-Extractor will unzip the files to a temporary location and launch the SMI-S Provider
installer.
3. EMC Solution Enabler Destination Location accept or change the destination folder for the
SYMAPI component. Click Next.
4. EMC SMI-S Provider Destination Location accept or change the destination folders for the
SYMAPI and SYMCLI components. Click Next.
5. Confirm (info window) Click Yes to install the EMC SMI-S Provider kit with EMC
Solutions Enabler.
6. EMC Solutions Enabler V31.2.1 An installation progress bar is shown. Watch and wait.
7. Installation Follow Up Click Finish.
The EMC SMI-S Provider V3.1.2 is now installed and ready to be configured.

4.10.2 Configure the EMC SMI-S Provider for Symmetrix devices


Perform the following commands from a command line prompt.
1. You will need to have a gatekeeper LUN masked and zoned to the server from the Symmetrix
device(s). Use Windows Disk Manager to check for at least one LUN from each Symmetrix that will be
managed.
You can determine whether the server has visibility to gatekeeper LUNs by issuing the symgate list
command. It can be found in the C:\Program Files\EMC\SYMCLI\bin directory (default location).
2. Use testsmiprovider to discover the Symmetrix device(s).
a. Open a Windows Command Prompt, and CD to the <EMC SMI-S Provider
basedir>\symcli\storbin directory
Example for Windows: C:\Program Files\EMC\SYMCLI\storbin
b. Execute the command testsmiprovider from your command prompt, and provide the
information requested for Host, Connection Type, Port, Username, and Password. Use
the defaults unless you've changed them.
c. When you get to the command prompt, issue the disco command to discover the
Symmetrix devices. The command should run and end with an output of 0 (success). It
won't show any discovered devices at this point.
C:\Documents and Settings\Administrator>testsmiprovider
Host [localhost]:
Connection Type (ssl,no_ssl) [no_ssl]:
Port [5988]:
Username []: Administrator
Password []: password
Connecting to localhost:5988
Using user account 'Administrator' with password 'password'
########################################################################
##
##
##
EMC SMI-S Provider Tester
##
##
This program is unsupported and intended for use by EMC Support ##
##
personnel only. At any time and without warning this program may ##
##
be revised without regard to backwards compatibility or be
##
##
removed entirely from the Solutions Enabler kit.
##
########################################################################

slp    - slp urls
slpv   - slp attributes
cn     - Connect
rc     - RepeatCount
ns     - NameSpace
ec     - EnumerateClasses
dc     - Disconnect
ens    - EnumerateNamespaces
ecn    - EnumerateClassNames
ei     - EnumerateInstances
ein    - EnumerateInstanceNames
a      - Associators
r      - References
an     - AssociatorNames
rn     - ReferenceNames
gi     - GetInstance
gc     - GetClass
refsys - EMC RefreshSystem
remsys - EMC RemoveSystem
addsys - EMC AddSystem
disco  - EMC Discover
tms    - TotalManagedSpace
q      - Quit
h      - Help
########################################################################
repeat count: 1
(localhost:5988) ? disco
1.0.3 Provider Method {y|n} [n]:
++++ EMCDiscoverSystem ++++
Output : 0
Legend:0=Success, 1=Not Supported, 2=Unknown, 3=Timeout, 4=Failed
5=Invalid Parameter
4096=Job Queued, 4097=Size Not Supported
In 10.484375 Seconds
Please press enter key to continue...

d. Press the Enter key, and then q to exit.


3. Use the symcfg list command to list the Symmetrix devices that have been discovered. This
command will display each Symmetrix that the testsmiprovider command discovered.
C:\Documents and Settings\Administrator>symcfg list

                       S Y M M E T R I X

                                     Mcode    Cache     Num Phys  Num
  SymmID       Attachment  Model     Version  Size (MB) Devices   Devices

  000123456789 Local       DMX2000S  5670     32768         1      1877
  000234567890 Local       DMX2000S  5670     32768         1      1381

C:\Documents and Settings\Administrator>

You are now ready to configure TPC for the Symmetrix (DMX) devices discovered by the EMC SMI-S
Provider.


4.10.3 Configure the EMC SMI-S Provider for Clariion devices


The configuration process for EMC Clariion devices is similar to that described in the previous section
for Symmetrix devices, except
Gatekeeper LUNs are not required since the Clariion communicates via TCP/IP from each
Storage Processor (SP).
The testsmiprovider addsys command is used to add the Clariion device(s).
To add a Clariion device to the EMC SMI-S Provider, do the following:
1. Open a Windows Command Prompt, and CD to the <EMC SMI-S Provider
basedir>\symcli\storbin directory
Example for Windows: C:\Program Files\EMC\SYMCLI\storbin
2. Execute the command testsmiprovider from your command prompt, and provide the
information requested for Host, Connection Type, Port, Username, and Password. Use the
defaults unless you've changed them.
3. When you get to the command prompt, issue the addsys command to manually add the
Clariion devices. The command should run and end with an output of 0 (success). It won't show
any discovered devices at this point.

4.10.4 Configure TPC for the EMC SMI-S Provider


You are now ready to register the EMC SMI-S Provider as a CIMOM agent in TPC V3 using the
following values:
Host: Hostname or IP address of EMC SMI-S Provider host
Interoperability Namespace: root/PG_InterOP (case is not important)
don't use /interop. This is not supported with the EMC V3.1.2 SMI-S Provider.
However, the V3.2.0 SMI-S Provider (not yet supported by TPC) changes the namespace to
interop.
Port and Protocol: 5988 (unsecure - http), 5989 (secure - https)
Userid/password: The EMC SMI-S Provider userid/password defined when you configured
the SMI-S Provider (if you did this). If user authentication isn't enabled on the CIM agent,
enter any values you wish.
Description of the CIM Agent: Use this to identify the managed device(s)
When you look at the output of the TPC CIMOM Discovery, make sure that each EMC Symmetrix
and/or Clariion device shows in the job output.

7/19/07 12:25:23 PM BTADS0000I Starting Discover Process collectStorageSubsystemsFromCIMOM , with Device
Server RUN ID 2005 , and Job ID 6412 .

7/19/07 12:25:24 PM HWN021727I TPC discovery starting on CIMOM http://192.168.1.48:5988.

7/19/07 12:27:04 PM HWN021725I TPC discovered/rediscovered a device with name
SYMMETRIX+000123456789/000123456789 on CIMOM http://192.168.1.48:5988.

7/19/07 12:28:43 PM HWN021725I TPC discovered/rediscovered a device with name
SYMMETRIX+000198765432/000198765432 on CIMOM http://192.168.1.48:5988.

7/19/07 12:28:48 PM HWN021728I TPC discovery on CIMOM http://192.168.1.48:5988 is complete.

7/19/07 12:28:48 PM BTADS0001I Discover Process with Device Server RUN ID 2005 and Job ID 6412 is complete
with Status= 1 , Return Code= 0

4.10.5 EMC 4.1 Clariion SMI Provider Configuration


If you have an EMC Powerlink account, you can review the >>>EMC 4.1 SMI-S Provider Release
Notes. If you don't have an account, you can register for one at the >>>EMC PowerLink web site.
Excerpt from the Release Notes (on Powerlink):
If all CLARiiON arrays are being discovered out-of-band, then it is typical to have the oslsprovider.conf
setting OSLSProvider/com.emc.cmp.osls.se.array.StorApi.database.discover set to false. See the
EMC SMI-S Provider Programmer's Guide for further information on discovering CLARiiON storage
arrays out-of-band.
Controlling the SMI-S Provider runtime behavior:
The OSLSProvider.conf file allows you to control the runtime behavior of the EMC SMI-S Provider.
You can find this file in the following directories of the Solutions Enabler:

Windows platforms: C:/Program Files/EMC/ECIM/ECOM/Providers


UNIX platforms: /opt/emc/ECIM/ECOM/Providers

OSLSProvider/com.emc.cmp.osls.se.array.StorApi.database.discover

true | false

Specifies whether to perform a one-time discover upon starting a CIM Server. This is done before
processing the first request received by the CIM Server. Note that when the CIM Server is started, the
EMC SMI-S Provider is not loaded until requested from the CIM Server.
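For example, when all CLARiiON arrays are discovered out-of-band, the discover setting in OSLSProvider.conf would typically be set to false. This is a sketch; verify the exact syntax against the release notes for your provider version:

```
OSLSProvider/com.emc.cmp.osls.se.array.StorApi.database.discover = false
```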
Post installation steps:
Authentication is required to query the EMC CIM Server. An initial setup is required on the EMC CIM
Server to create a CIM user. This can be done as follows:

1. Go to the URL https://<ipaddress>:5989/ecomconfig, and login using the username admin and
the password #1Password.
2. Click Add User and create a user with the role of Administrator. This newly created username
can now be used to obtain access to the Array Provider.
Note: For security reasons, you should change the default password of the admin user.
http://localhost:5988/ECOMconfig
https://localhost:5989/ECOMconfig
5989 = secure
5988 = non-secure
ID = admin
Pwd = #1Password

4.10.6 Enabling Statistics Logging in Clariion


These steps are required to enable statistics logging in Clariion devices:
1. Open Navisphere (Note: Typically this will require a user with Storage Administrator rights and the
URL of the Navisphere software.)
2. Right-click on the APM storage System and select Properties.
3. On the General tab, select Statistics Logging.
4. Click Apply.
5. Click OK.
6. Click File > Save and Exit.


4.11 HDS HiCommand SMI-S Provider


Hitachi HiCommand Device Manager software provides a single platform for centrally managing,
configuring, and monitoring Hitachi storage systems and Sun StorEdge 9900/9990 series and T3
systems. All HiCommand modules leverage an SMI-S and CIM-based architecture.
If you are already using HiCommand to manage your HDS subsystems then you should be able to
modify the existing configuration, and enable the SMI-S Provider. See 4.11.2 Configuring HiCommand
Device Manager for the specific configuration changes necessary.

4.11.1 Installing HiCommand Device Manager


If HiCommand Device Manager is not installed, perform the following steps to install it.
1) Locate the HiCommand Device Manager install package, and make it accessible to the SMI-S
server.
2) Install the Sun Java Runtime Environment (JRE) version 1.4.2_06
3) Make sure port 23015 is not in use (by default, HiCommand will use this port) by issuing the
command netstat -an at a Windows command prompt.
4) Launch the HiCommand Device Manager installer.
a) HiCommand Device Manager - Select English. Click OK.
b) Introduction - Click Next.
c) Backup Recommendation - Click Next.
d) HiCommand Suite Common Component Not Installed - Click Next.
e) License Agreement - Accept the license agreement, and click Next.
f) SNMP Trap Note - Click Next.
g) Choose Install Folder - Change and/or verify the destination folder. Click Next.
h) Choose the Default Database for HiCommand Suite Common Component - Accept the
default database. Click Next.
i) Installation Server Information Settings - Enter the IP address of this server, and the default
port number of 23015. Click Next.
j) Pre-Installation Summary - Click Install.
k) Installation Progress - Watch and wait.
l) Install Complete - Click Done.

4.11.2 Configuring HiCommand Device Manager


Once the HiCommand Device Manager is installed, the configuration must be modified:
Add a Device Manager license key
Change the dispatcher configuration
Add Subsystems to Device Manager
Enable the SMI-S CIM Agent

4.11.2.1 Add a Device Manager license key

1) Launch Internet Explorer and enter the following URL: http://localhost:23015/DeviceManager/
2) HiCommand Device Manager - Login - click the License box.
3) Version Information of Device Manager - click the Browse box next to License File.
4) Choose File - select the HDSLI directory on the license CD provided by Hitachi, and highlight
the license file. The license file will have a name similar to HDS HiCommand 20050419
000959.plk. Click Open.

5) Back on Version Information of Device Manager - verify the license file name. Click Save. The
License Type field should update to Temporary or some value other than Unregistered.
6) Close the HiCommand Device Manager web page by clicking Close.

4.11.2.2 Change the dispatcher configuration

1) Edit the dispatcher.properties file. This file can be located at
<install_location>\HiCommand\DeviceManager\HiCommandServer\config.
a) Change server.dispatcher.daemon.receiveTrap=true to
server.dispatcher.daemon.receiveTrap=false.
b) Save the dispatcher.properties file.
2) Recycle HiCommand Device Manager:
a) Start > Programs > HiCommand Device Manager > Stop HiCommand
b) Start > Programs > HiCommand Device Manager > Start HiCommand
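When the change is complete, the modified line in dispatcher.properties reads:

```
server.dispatcher.daemon.receiveTrap=false
```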

4.11.2.3 Add Subsystems to Device Manager

1) Launch Internet Explorer and enter the following URL: http://localhost:23015/DeviceManager/


2) HiCommand Device Manager - click on the Go to login page button.
3) HiCommand Device Manager - Login - enter system in the user ID field and manager in the
password field. Click Login.
4) HiCommand Device Manager - select Object Tree > Subsystems.
5) Add Subsystem - enter HDS storage subsystem device information as appropriate. Click OK.
6) Answer Yes to the question: Are you sure you want to add a new subsystem?
7) Add Subsystem - Completed - click Close.
8) Repeat steps 4 through 7 to add additional Hitachi storage subsystems as needed.
9) HiCommand Device Manager - device information will be displayed for each subsystem selected.
Click Logout in the menu bar at the top.

4.11.2.4 Enable the SMI-S CIM Agent

1) Edit server.properties. This file is located in the
<install_location>\HiCommand\DeviceManager\HiCommandServer\config\ directory.
2) Change the following values:
a) server.cim.support=false to server.cim.support=true
b) uncomment server.cim.https.port=5989
3) Recycle HiCommand Device Manager:
a) Start > Programs > HiCommand Device Manager > Stop HiCommand
b) Start > Programs > HiCommand Device Manager > Start HiCommand
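When the edits above are complete, the relevant lines in server.properties should read as follows (a sketch assembled from the steps above):

```
server.cim.support=true
server.cim.https.port=5989
```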

4.11.3 Configure TPC for the HiCommand Device Manager CIM Agent
Once you have completed the verification step above, you are ready to configure TPC to add the CIM
agent. You will need to have the following information available when you add the CIMOM:
IP Address or FQDN of the CIM Agent server
Username of the CIM Agent (system)
Password of the CIM Agent (manager)
Port number (port number configured for the CIM agent, usually 5988 or 5989)
Protocol (HTTP or HTTPS)


Interoperability Namespace (/root/Hitachi/dmxx) (xx represents the HiCommand version)


Note: HDS HiCommand 5.0 does not work with the namespace root/hitachi/dm50; please use
root/hitachi/dm42.
Description of the CIM Agent (example: HDS Lightning RQVPHL02)

4.11.4 Adding a HiCommand User Admin for TPC


By default this is not required and you can use the Administrator user id "system".
When you login to the HiCommand Server you will see the Explorer window in the top left hand corner.
Click on the Administration tab and Users and Permissions.

Click on the Users folder and you will see a list of the currently defined HiCommand Users.


In the lower right hand corner of this window you will see the Add User button.

Click on Add User and fill in the required information.


When you are done click on OK. You should again see the list of all defined HiCommand users
including the one you just added.

By default the user you created does not have any permissions. Double click on the user and you will
see the permissions available with none of them selected. At the top of this panel click on Change
Permission.

For probing the array you only need the View permission. Select the check box under View for all
Applications. By default this will also place a check in the View box for HDvM. When you are finished
click OK.

Now if you double click on the user you created you will see the check mark in the View column for
HDvM.

Next we need to add the user to a Resource Group. If you do not do this, your user will be able to login,
but not able to see the storage defined in the HiCommand Server. Under the Administration tab
click on Resource Groups. In the right hand panel click on Resource Group Allocation.

This will bring up a window showing all users defined to the HiCommand Server and the resource
groups they belong to. Select the user you added and then click Change.


By default your user belongs to <No Group>. In the drop down for Resource Group, you will see a list
of all resource groups defined on this HiCommand Server. Unless you have customized the environment,
you will probably only see All Resources. Unless you are instructed otherwise, select this group and
click OK.

The resource group allocation panel should come back and now show your user with All
Resources listed under the Resource Group column. Select Close.


You should now be able to login to the HiCommand server, see the resources being managed by this
HiCommand Server, add the HiCommand Server to TPC, run a Discovery and Probe all the Arrays
being managed by this Server.

4.11.5 Problems with HiCommand in TPC


1. I have a valid HiCommand user ID but I am unable to register the server in TPC.
Check your user ID in the HiCommand GUI and make sure you can login to the HiCommand Server.
If not, have the administrator assign you permission. As described in 4.11.4 above, you should be able to add
and probe with View permissions. If you have already been given permission, make sure you belong to a
Resource Group. The All Resources group will work. See 4.11.4 above to add your user to a
resource group.
2. When running a Probe from TPC I am getting EXT_ERR_UNABLE_TO_CONNECT messages.
7/17/07 2:17:43 PM HWN021537E Could not create connection to CIMOM http://109.21.145.11:5988.Reason:
EXT_ERR_UNABLE_TO_CONNECT

The most likely cause is exhausting the Java memory on the HiCommand Server during the probe. By
default the HiCommand process is set to 256 MB. The maximum you can set this to is 1536 MB. If
you find the Array you are probing has more than 100 Volumes you will definitely want to change this
value. To increase the Java memory size execute the following command:
Installation-folder-for-HiCommand-Suite-Common-Component\bin\hcmdsweb /add
/file installation-folder-for-the-Device-Manager-server\HiCommandServer\webapps\DeviceManager.war
/server HiCommand /javaoption HDvM.serverpath=installation-folder-for-the-Device-Manager-server
/nolog /type DeviceManager /Xms256 /Xmx1408

Example of running the command on a HiCommand Server:
c:\progra~1\HiCommand\Base\bin\hcmdsweb.exe /add
/file c:\progra~1\HiCommand\DeviceManager\HiCommandServer\webapps\DeviceManager.war
/server HiCommand /javaoption HDvM.serverpath=c:\progra~1\HiCommand\DeviceManager
/nolog /type DeviceManager /Xms256 /Xmx1408

You will need to recycle HiCommand Server in order to use the new Java memory setting.

4.11.6 HDS TagmaStore Universal Storage Platform (USP) Virtualization in TPC


With HDS's current implementation of the SMI-S Array Profile for TagmaStore devices, TPC will be
able to generate basic reports for TagmaStore-internal backend storage array(s). TPC cannot generate
reports for the virtual layer of storage. External storage virtualized by a TagmaStore device is not
included in TPC's basic reports of the device. When HDS implements the SMI-S SVP, TPC will be able
to report on that virtualized storage.


4.12 Brocade SMI-S Agent


The Brocade SMI-S Agent is used by TPC to collect Brocade switch asset and performance monitoring
and reporting data. The SMI-S Agent is installed on a supported server with the available SMI-S Agent
installation package. The SMI-S agent can communicate either with Brocade's Fabric Manager or
individual switches.
As you monitor more switches and fabrics with the Brocade SMI-S agent, you will have to increase the
amount of memory available for the CIM agent to use. This will depend on the size of the fabric and the
number of fabrics being managed. You should also increase the memory heap size for the JVM based on
the number of switches and number of switch ports and devices.
This installation package is available at the >>>Brocade SMI Agent Downloads site. Download the
Brocade SMI-S Agent installation package for the latest version that supports your TPC environment
(refer to the TPC >>>Supported Products List), uncompress it, and make it available to the server that
you will install the SMI-S agent onto. Within the compressed package, installers are available for
Windows, Linux, and Solaris.

4.12.1 Installing Brocade SMI-S Agent


Install the Brocade SMI-S Agent using the following instructions. For complete instructions, refer to the
Brocade SMI-S Agent installation guide which is provided in .pdf format.
1) Launch the installer. For Windows, the installer can be found at x:\<base_dir>\Windows\install.exe.
2) License Agreement - Accept the license agreement and click Next.
3) System Configuration - Click Next.
4) Introduction - Click Next.
5) Choose Install Folder - Verify and/or change the destination folder. Click Next.
6) HTTP Port Configuration - Verify and/or change the HTTP (unsecured) port. Click Next.
7) HTTPS Port Configuration - Verify and/or change the HTTPS (secured) port. Click Install.
8) Installing Brocade SMI-S Agent - Installer progress. Watch and wait.
9) FabricManager Server Configuration - This identifies Brocade's Fabric Manager, and is not used
by TPC. We'll identify the switches to be monitored on a subsequent panel. Click Next.
10) Enabling mutual authentication for clients - We won't do mutual authentication. No is the
default. Click Next.
11) Enabling mutual authentication for indications - We won't do mutual authentication. No is the
default. Click Next.
12) Enable Security - When security is enabled for the SMI-S CIM agent, Windows authentication is
used by default for authenticating username and password. If domain authentication is enabled on
Windows, then the corresponding domain is used for authenticating username and password. If
security is enabled during installation, then you must always start the SMI-A with Administrator
privileges; otherwise, all communication to the SMI-A will fail. Make a selection, and click Next.
13) Eventing and ARR TCP port configuration - Leave the fields blank, and click Next.
14) Enabling Console And/Or File Logging - Accept the defaults and click Next.
15) Proxy Connections Configuration - Here's where we enter the information for the Brocade
switches we'll monitor/manage. Press the Add button to add each switch. A Proxy Configuration
panel will be displayed. Click Next when all switches have been added.
a) Proxy configuration - You'll need to know the IP address and switch admin
userid/password for the Brocade switch you want to monitor. For the Brocade M10, enter
both switch clusters separately.
b) Repeat the above step for each Brocade switch you want to monitor for performance.
16) Important information - Configuration file locations. Click Next.
C:\SMIAgent\agent\server\jserver\bin\SMIAgentConfig.xml
C:\SMIAgent\agent\server\jserver\bin\provider.xml
17) Configuring and Starting as a Service - Select the Yes button to have the SMI-S Agent start as a
Windows service. Click Next.
18) Installation Completed - Congratulations. Click Done.

4.12.2 Configure the Brocade SMI-S Agent


The only thing you should have to do for configuration is to verify that the Brocade SMI-S Agent
service is running, and that it's set to start automatically. This is the default setting. If the service is
stopped, be sure to start it before attempting to configure TPC to use it.
There is a configuration tool for the SMI agent provided by Brocade. It is located in the
C:\SMIAgent\agent\server\jserver\bin directory for Windows. Execute the Configurationtool.bat
command. The following tool panel will be launched, allowing you to change just about any
configuration option there is.

4.12.3 Configure TPC for the Brocade SMI-S Agent


Once you have completed the verification step above, you are ready to configure TPC to add the CIM
agent. You will need to have the following information available when you add the CIMOM:
IP Address or FQDN of the CIM Agent server
Username of the CIM Agent (admin)
Password of the CIM Agent (password)
Port number (port number configured for the CIM agent, usually 5988 or 5989)

Protocol (HTTP or HTTPS)


Interoperability Namespace (/interop)
Description of the CIM Agent (example: Brocade M12 RQVPBM12)

4.12.4 Memory and Scalability Considerations for Brocade SMI-S Agent


For the 110.5.0 release, SMI-A has been tested by Brocade in a fabric with more than 3700 switch ports
and more than 2000 devices. The memory usage in such a fabric is less than 350 MB. The SMI-A has also
been tested in a fabric with more than 11,000 simulated switch ports. The memory usage is less than 700
MB.
There are java heap size recommendations for fabrics of different sizes.
Java Heap Size Recommendations:

512 MB for fabrics with < 5,000 ports

1 GB for fabrics between 5,000 and 10,000 ports


From the Brocade SMI Agent Installation Guide for V110.5.0.

The memory required for running the SMI-A depends on the following:
number of switches
number of ports
number of devices in a single fabric
number of fabrics being managed
You should increase the memory as these numbers increase. You should also check the memory usage
of all applications and services running on the host and adjust the memory accordingly. If the agent is
used to manage multiple fabrics, use the total number of switch ports in all fabrics to determine the
memory usage.
You should also increase the memory heap size for the JVM based on the number of switches and
number of switch ports and devices. After you install the SMI Agent, you can increase the memory size
or heap size using the following procedure.
To increase the memory size or heap size
1. If you installed the SMI-A as a service on Windows, open the jserverd.ini file for editing. This
file is in the following location:
C:\Windows\system32\jserverd.ini
If you did not install the SMI-A as a service, open the start_server file for editing:
Linux and Solaris: <SMIAgent>/agent/server/jserver/bin/start_server
Windows: <SMIAgent>\agent\server\jserver\bin\start_server.bat

2. Modify the JVM flag in the file. The default value of the JVM flag is: -Xmx512m. For example,
to increase the memory from 512 MB to 1024 MB, change this value to: -Xmx1024m
For most fabrics, 512 MB is usually sufficient.
3. Restart the SMI Agent, if it is already started.
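For example, to run the agent with a 1 GB heap for a fabric of 5,000 to 10,000 ports, the JVM flag in jserverd.ini (or start_server) would be changed from the default to:

```
-Xmx1024m
```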


4.13 McDATA OPENconnectors SMI-S Interface


The McDATA OPENconnectors SMI-S Interface software provides a CIM agent for McDATA switch
and director products, and performs the functions of a general purpose server, enabling a standard set of
management functions to be performed by TPC.
The McDATA OPENconnectors SMI-S Interface software can be obtained from the McDATA
Filecenter website at >>>McData SMI Agent Downloads. Download the Brocade OPENconnectors
SMI-S interface version 1.5.2 for Windows installation package (Windows_SMI-S_1_5_2.zip).
The title is correct. Brocade now owns McDATA, and is re-branding the software. At some point in
the future, I would expect to see a single SMI-S Provider for both families of switches.
There is also an installation package for Solaris.
Unpack the downloaded file Windows_SMI-S_1_5_2.zip into a temporary directory, and make it
available to the server where you intend to install the McDATA SMI-S Interface software.

4.13.1 Installing McDATA SMI-S Interface CIM Agent


Install the McDATA OPENconnectors SMI-S Interface software using the following instructions. For
complete installation instructions and configuration help, refer to the McDATA OPENconnectors SMI-S Interface User Guide (P/N 620-000210-140).
1) Launch the installer. For Windows, the installer is called install.exe.
2) SMI-S Interface 1.5 - Wait for the installer to load.
3) License Agreement - Accept the license agreement and click Next.
4) Choose Type of Install - Select Install SMI-S Interface 1.5 and click Next.
5) Choose Install Set - Select SMI-S Interface and SLP and click Next.
6) Choose a Folder - Change and/or verify the destination folder and click Next.
7) Pre-Installation Summary - Verify information and click Install.
8) Installing SMI-S_Interface_Provider - Watch and wait.
9) SMI-S Server Interface Configuration - This pop-up window lets you select which management
method the CIM agent will use:
a. Via Management Platform - This option uses an existing EFCM server. The switches
managed by the EFCM instance will be used by the CIM Agent. Specify the following
information:
i. Network Address - The IP address or FQDN of the EFCM server
ii. User ID - The EFCM login userid
iii. Password - The EFCM login password
Use the Test Connection button to test connectivity to the EFCM server.
EFCM version 8.7.1 is not supported. An error message is displayed if the management
platform is EFCM version 8.7.1. Supported EFCM versions are 9.0 and 9.1. TPC does
not support EFCM V9.0, so if you are using Via Management Platform your EFCM
must be at V9.1.
b. Direct Connection - This option lets you specify each McDATA switch that you want TPC
to monitor. If this option is selected, you must specify each switch to be managed. These
switches cannot be managed by EFCM. You can specify the first switch using the installation
panel:
i. Network Address - The IP address or FQDN of the switch
ii. User ID - The switch admin userid
iii. Password - The switch admin password
Use the Test Connection button to test connectivity to the switch. All subsequent
switches must be configured using the configuration steps in 4.13.2 Configuration for Direct
Connection method.
c. SMI-S Interface Message - The message "The direct connection mode has been saved
successfully." is displayed. Click OK.
10) Install Complete - Congratulations. Click Done.

4.13.2 Configuration for Direct Connection method


This configuration setting is needed if you selected the Direct Connection method during installation.
If you selected Via Management Platform, skip these instructions.
Perform these steps to complete the configuration for Direct Connection. Complete instructions can be
found in the McDATA OPENconnectors SMI-S Interface User Guide on page 4-3.
1) Open Windows Explorer, and navigate to
C:\Program Files\SMI-S_Interface_Provider\wbemservices\CIMOM\bin.
2) Verify the contents of the file mcdataProductInterface.properties. There should be a line that reads:
ConnectionMethod=Direct connection. Uncomment or add this line if needed. All other lines
should be commented out.
3) Open (or create) the file switch.properties. It should contain the following information:
# PLEASE MODIFY THE VALUES AS REQUIRED FOR MANAGING THE SWITCHES
cimserver=https://localhost/root/mcdata
cimserverusername=Administrator
cimserverpassword=password
switchip=192.168.1.101
switchtype=10
switchusername=Administrator
switchpasswd=password
Change these values as appropriate for your environment. Save the switch.properties file when
you've completed making your changes.
4) Create a new file set_McData_env.bat, and add the following contents:
@echo off
REM The CLASSPATH statement below must be entered as a single line in the
REM batch file. The path locations in this file assume that the SMI-S Provider
REM was installed in the default location. If the SMI-S provider was installed
REM somewhere else, make the appropriate changes to the directory paths below.
set CLASSPATH=C:\Program Files\SMI-S_Interface_Provider\wbemservices\lib\wbem.jar;C:\Program Files\SMI-S_Interface_Provider\wbemservices\CIMOM\lib\log4j.jar;C:\Program Files\SMI-S_Interface_Provider\wbemservices\CIMOM\lib\server.jar
REM Change to the program's drive and directory
C:
cd C:\Program Files\SMI-S_Interface_Provider\wbemservices\CIMOM\bin

5) Save this file to a directory that's included in your machine's PATH (such as C:\Windows).
6) Open a command prompt, and perform the following steps:
a. Execute the set_McData_env.bat command. This will set up the execution environment to
run the ManageSwitch command.

b. Issue the command ManageSwitch add from the command line. This will add the
switch(es) that you added or modified in the switch.properties file.
c. Stop and restart the SMI-S CIM agent service from the Windows Services panel.

4.13.3 Configure TPC for the McDATA SMI-S Provider


Once you have completed the configuration steps above, you are ready to configure TPC to add the CIM
agent. You will need to have the following information available when you add the CIMOM:
IP Address or FQDN of the CIM Agent server
Username of the CIM Agent (Administrator)
Password of the CIM Agent (password)
Port number (port number configured for the CIM agent, usually 5989)
Protocol (HTTPS)
Interoperability Namespace (/interop)
Description of the CIM Agent (example: McDATA Intrepid 6140 RQVS6140)


4.14 Cisco SAN-OS CIM server


Each switch or director in the Cisco MDS 9000 Family includes an embedded CIM server. This means
that there is no CIM proxy agent to install. The CIM server communicates with any CIM client to
provide SAN management compatible with SMI-S, such as TPC. Cisco SAN-OS Release 3.1.2(a) and
later are compliant with SMI-S V1.1, which is required for Fabric Manager Performance monitoring and
reporting in TPC V3.3. SAN-OS releases before this level are not supported by TPC.

4.14.1 Enable and Configure the Cisco SAN-OS CIM Server


To configure a CIM server to use HTTP protocol in Cisco MDS 9000 Family products, follow these
steps after logging into the switch from a telnet command line:
switch# config terminal
(Enters configuration commands, one per line. End with CNTL/Z.)

switch(config)# cimserver enable
(Enables and starts the CIM server.)

switch(config)# cimserver enablehttp
(Enables the HTTP (non-secure) protocol for the CIM server. This is the default.)

switch(config)# CNTL-Z
(Exits configuration mode.)

switch# show cimserver
(Displays configured CIM settings and parameters.)

4.14.2 Configure TPC for the Cisco SAN-OS CIM server


Once you have completed the configuration steps above, you are ready to configure TPC to add the CIM
server. You will need to have the following information available when you add the CIMOM:
IP Address or FQDN of the Cisco switch with the CIM server enabled
Username of the Cisco switch
Password of the Cisco switch
Port number (5988)
Protocol (HTTP)
Interoperability Namespace (/root/cimv2, /root/PG_InterOp for v3.2.1 or later)
Description of the CIM Agent (example: Cisco MDS9513 RQVS9513)


5 TPC Hints and Tips


This section provides information about pervasive problems and their solutions, and other useful things
about configuring and otherwise setting up TPC. The following sections should help you with the
settings for your magic TPC decoder ring.

Figure 3 Magic TPC Decoder Ring

5.1 TPC Goodies


5.1.1 Topology Viewer Tip
There are some features that will help you navigate the Topology Viewer.

5.1.1.1 Alt key/mouse button 1 navigation


There's an alternative to using the Mini-map to scroll around the Topology Viewer, which I've always
found clumsy and annoying. Try pressing and holding down the Alt key, and then press the left mouse
button while dragging the topology screen with the mouse. It now moves with your mouse.

5.1.1.2 Mouse Wheel navigation


Pressing and holding the mouse wheel (clicking the wheel) will also allow you to drag the Topology
Viewer graphical window around. This is my favorite.

5.1.1.3 Mouse click on a device entry


When viewing the table view below the graphical pane, you can single click on any entity, and it will be
brought to the center position in the graphical pane. Containers will automatically open as needed to
make the entity visible.

5.1.1.4 SAN Planner: Planning a path for a DS6000


In the SAN Planner, when configuring a volume on a DS6K for a Windows system with one HBA port
and the options Volume Planner and Zone Planner are selected, the zoning actions by TPC include
the WWN of the DS6K preferred path only. It is up to the user to manually add the WWN of the
non-preferred path on the DS6K to the zone that TPC created.


5.2 Creating a master image to clone TPC Agent machines


If you use a master operating system image to deploy new servers in your environment, you can include
the TPC agents on that master image, so that the TPC agents will start up and register with the TPC
server automatically upon deployment. Follow these instructions to prepare the master image for TPC
agents.
It is assumed that the cloned images will point to the existing Agent Manager and TPC Server
machine(s). If you intend to deploy machines with TPC agents and have them point to different TPC
servers, you will need to provide additional startup time customization. Those customizations are not
provided in these instructions.
The default <agent_install_dir> is:
For Windows: C:\Program Files\IBM\TPC\ca
For AIX and UNIX: /opt/IBM/TPC/ca

1. Install the TPC Data agent. This will silently install the Tivoli Common Agent.
2. It is recommended that the TPC Fabric agent not be installed as part of a master image. The TPC
Fabric agent requires a SAN connected HBA to be functional in production, and only a few fabric
agents need to be installed in each SAN managed by TPC to provide a robust SAN data collection
process.
3. In the endpoint.properties file, change the following:
agent.ssl.truststore.download=true

The endpoint.properties file is in this directory:


<agent_install_dir>/config/

4. Verify that the agent is able to register with the agent manager. Check the msgAgent.log file for the
messages:
BTC1025I REG: Agent is now attempting to register and obtain security
credentials from Agent Registration Service amserver.yourcompany.com at port
9511.
BTC1022I REG: Registration succeeded after 1 attempts.

The msgAgent.log file is in this directory:


<agent_install_dir>/logs/

5. Stop the common agent using the endpoint.bat or endpoint.sh tool.


For Windows:
cd C:\Program Files\IBM\TPC\ca
endpoint.bat stop

For UNIX:
cd /opt/IBM/TPC/ca
./endpoint.sh stop

6. Set the GUID on the common agent machine to all hexadecimal zeros (x00) or hexadecimal foxes
(xff). This causes the Common Agent to register with the Agent Manager and also to create a new
GUID.
For Windows:
cd C:\Program Files\Tivoli\guid
tivguid Write
-Guid=00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00

For UNIX:
cd /opt/tivoli/guid

./tivguid Write
-Guid=00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00

7. Create an empty "PROBE_ME" file in the TPC Data agent directory (Case is important. No file
extension!)
For Windows:
C:\Program Files\IBM\TPC\ca\subagents\TPC\Data

For UNIX:
/opt/IBM/TPC/ca/subagents/TPC/Data

8. Delete the contents of the <agent_install_dir>/cert directory. This causes the agent to
download new certificates.
9. Delete the contents of the <agent_install_dir>/logs directory. This clears existing messages so
you will be able to view new messages.
10. If any other software is installed after this point, you will need to verify that the GUID still has all
hexadecimal zeroes (or foxes).
11. Make the master image copies from this TPC agent machine.
12. When a new machine is preloaded with this image and started, it should:
Register with Agent Manager.
Create a unique GUID.
Register the TPC Data agent with TPC.
Check that the Data Agent shows up in Administrative Services -> Data Sources -> Data Agents. You will
need to refresh the view.
You should also have a Data Probe set up to use the Default Computer Group. This will automatically
include any new Data agents that get added.
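The pre-imaging reset in steps 5 through 9 can be collected into a small script. This is a minimal sketch, not part of the product: the helper name reset_agent is my own, the paths are the defaults quoted above, and the demo run at the bottom works against a scratch copy of the directory layout rather than a live agent.

```shell
#!/bin/sh
# reset_agent: apply the pre-imaging reset steps to an agent directory.
reset_agent() {
    dir=$1
    # Step 5: stop the common agent (skipped when endpoint.sh is absent,
    # as in the scratch layout below)
    if [ -x "$dir/endpoint.sh" ]; then "$dir/endpoint.sh" stop; fi
    # Step 6: zero the GUID so a fresh one is created on first boot
    if [ -x /opt/tivoli/guid/tivguid ]; then
        /opt/tivoli/guid/tivguid Write \
            -Guid=00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00
    fi
    # Step 7: create the empty PROBE_ME file (case matters, no extension)
    touch "$dir/subagents/TPC/Data/PROBE_ME"
    # Steps 8-9: clear certificates and logs so new ones are generated
    rm -rf "$dir/cert/"* "$dir/logs/"*
}

# Demo against a scratch copy of the default layout
DEMO=$(mktemp -d)
mkdir -p "$DEMO/subagents/TPC/Data" "$DEMO/cert" "$DEMO/logs"
touch "$DEMO/cert/agentKeys.jks" "$DEMO/logs/msgAgent.log"
reset_agent "$DEMO"
```

Step 10 still applies: if any other software is installed after the reset, re-check the GUID before capturing the image.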


5.3 Common Agent Tips


The following hints and tips are helpful to manage the Common Agent component.

5.3.1 Check /etc/hosts file for valid localhost entry


In order for successful agent installation, agent servers must have a valid localhost entry in the /etc/hosts
file. On AIX VIO server agent deployments, this is critical as a failed installation can only be recovered
by reimaging the server.
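That check can be automated before an agent rollout. A sketch (the function name and the sample addresses are illustrative) that verifies a hosts file maps 127.0.0.1 to localhost, ignoring comment lines:

```shell
#!/bin/sh
# check_localhost: return 0 when the given hosts file (default /etc/hosts)
# contains an uncommented 127.0.0.1 entry naming localhost.
check_localhost() {
    hosts_file=${1:-/etc/hosts}
    grep -v '^[[:space:]]*#' "$hosts_file" |
        grep -q '^[[:space:]]*127\.0\.0\.1[[:space:]].*localhost'
}

# Example: validate a sample file before deploying an agent to that host
sample=$(mktemp)
printf '127.0.0.1 localhost\n9.43.237.191 sonja.yourcompany.com\n' > "$sample"
if check_localhost "$sample"; then echo "localhost entry OK"; fi
```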

5.3.2 Stopping and restarting a Common Agent


To stop and restart the Common Agent on a Windows machine, the easiest way is to use the Windows
Services panel.
1. Select the service labeled IBM Tivoli Common Agent -
2. Right-click on the entry, and select Restart.
3. A dialog box will show the stopping and starting progress.


On AIX or UNIX, use the endpoint.sh command to stop, start, or restart the Tivoli Common Agent. The
default location for this command is /opt/IBM/TPC/ca (early 2.x TPC releases used /usr/tivoli/ep
on AIX, and /opt/tivoli/ep). An AIX example of the command and its output is shown below:
# cd /opt/IBM/TPC/ca
# ./endpoint.sh
Usage: ./endpoint.sh { console | start | stop | restart | dump | version}
# ./endpoint.sh restart
Stopping Tivoli Common Agent...
Waiting for Tivoli Common Agent to exit...
Waiting for Tivoli Common Agent to exit...
Waiting for Tivoli Common Agent to exit...

Stopped Tivoli Common Agent.
Starting Tivoli Common Agent...
#

5.3.3 Force the Common Agent to uninstall


There may be times when you want to uninstall the common agent while Data or Fabric agents are still
installed. The normal behavior of the Common Agent uninstaller is to refuse to uninstall while a
subagent is still installed. You can force it to uninstall with a special parameter. To uninstall the
common agent under these circumstances, do the following:
1. Open a command prompt (Windows) or Terminal session (Unix)
2. CD to the Common Agent uninstaller
a. Windows with a TPC V3.1 agent - C:\Program Files\IBM\ca\_uninst
b. Windows with a TPC V2.3 agent - C:\Program Files\Tivoli\ep\_uninst
c. AIX and Unix with a TPC V3.1 agent - /opt/IBM/TPC/ca/_uninst
3. Enter the uninstall command (for GUI mode):
a. Windows - "uninstall.exe -W beanArguments.forceUninstall=true "
b. UNIX - "./uninstall -W beanArguments.forceUninstall=true"
4. The GUI uninstaller will launch.
For non-GUI mode, specify the console option in the command syntax above.
5. Delete the removed Data agent and Inband Fabric agents from the TPC GUI (Administrative
Services -> Agents)

5.3.4 Cleaning up Common Agent residue


Refer to section 5.6 for information on TPC agent cleanup.

5.3.5 Associate a Common Agent with a new TPC Server


There might be occasions when you will want to associate an existing Common Agent with a new TPC
Server. This might happen if you are forced to uninstall the TPC Server and haven't preserved the
TPCDB repository, or if you are splitting agents between two different TPC Servers.
1. Stop the Common Agent
a. Windows: stop the service IBM Tivoli Common Agent - 'C:\Program
Files\IBM\TPC\ca'
b. Unix: execute the command /opt/IBM/TPC/ca/endpoint.sh stop
2. Edit the <TPC_install_location>/ca/config/endpoint.properties file
a. If necessary, change all instances of the TPC Server IP address. If Agent Manager is
installed on the same server (and it almost always is), there will be seven (7) instances of
the TPC server IP address.
b. Copy the Registration.Server.PW line from the TPC Server's
<TPC_install_location>/ca/config/endpoint.properties file.
Example: Registration.Server.PW=ojZGPitgSox4rdy6UKJAHQ\=\=
3. Move ALL of the files in the <TPC_install_location>/ca/cert directory into a backup directory
a. cd to <TPC_install_location>/ca/cert
b. mkdir certbackup
c. move all of the files in the cert directory to <TPC_install_location>/ca/cert/certbackup
4. Edit the configuration files for installed subagents:

a. Data agent: edit the file
<TPC_install_location>/ca/subagents/TPC/Data/config/agent.config and change the
value of serverHost to the new fully qualified TPC server name.
b. Fabric agent: edit the file
<TPC_install_location>/ca/subagents/TPC/Fabric/conf/setup.properties and change the
value of manager.loc to the new fully qualified TPC server name.
5. Create an empty PROBE_ME file in the <TPC_install_location>/ca/subagents/TPC/Data
directory.
6. Restart the Common Agent
a. Windows: start the service IBM Tivoli Common Agent - 'C:\Program
Files\IBM\TPC\ca'
b. Unix: execute the command /opt/IBM/TPC/ca/endpoint.sh start
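The IP substitutions in step 2a lend themselves to sed. A hedged sketch: the helper name is mine, and the two property lines in the example file are only stand-ins for the real contents of endpoint.properties; the point is the global old-to-new replacement with a backup copy kept.

```shell
#!/bin/sh
# repoint_agent: replace every occurrence of the old TPC server IP address
# with the new one in a properties file, keeping a .bak copy.
repoint_agent() {
    props=$1; old_ip=$2; new_ip=$3
    cp "$props" "$props.bak"
    sed "s/$old_ip/$new_ip/g" "$props.bak" > "$props"
}

# Example run against a scratch file (the property names are illustrative)
f=$(mktemp)
printf 'ARS.host=9.43.237.10\nTier1.URL=https://9.43.237.10:9511/\n' > "$f"
repoint_agent "$f" 9.43.237.10 9.43.238.20
cat "$f"
```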

5.3.6 Associate a Storage Resource Agent with a new TPC Server


To reconfigure an existing TPC Storage Resource Agent to register and report to a new TPC server:
1. Edit the <TPC>/agent/config/Agent.config file.
2. Change the Servername and IPAddress parameters to point to the desired TPC server.
3. Restart the SRA:
a) Windows: stop and start the Storage Resource Agent server in the services.msc panel
b) Unix: cd <TPC>/agent/bin; ./agent.sh restart
4. Check the TPC GUI on the new TPC server to verify that the agent is registered.
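Step 2 can be scripted when many SRAs need re-pointing. A sketch under the assumption that Servername and IPAddress appear as simple key=value lines in Agent.config (the Portnumber line in the example is just filler to show untouched keys survive):

```shell
#!/bin/sh
# set_sra_server: rewrite the Servername and IPAddress parameters in an
# SRA Agent.config so the agent reports to a different TPC server.
set_sra_server() {
    cfg=$1; server=$2; ip=$3
    sed -e "s/^Servername=.*/Servername=$server/" \
        -e "s/^IPAddress=.*/IPAddress=$ip/" \
        "$cfg" > "$cfg.tmp" && mv "$cfg.tmp" "$cfg"
}

# Example against a scratch copy of the file
f=$(mktemp)
printf 'Servername=oldtpc.yourcompany.com\nIPAddress=9.43.237.10\nPortnumber=9510\n' > "$f"
set_sra_server "$f" newtpc.yourcompany.com 9.43.238.20
cat "$f"
```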


5.4 TPC subagent tips


The following hints and tips are helpful to manage the TPC Data and Fabric subagents installed under
the Common Agent component.

5.4.1 Restarting a stopped or failed Inband Fabric sub agent


If you find a Fabric agent that shows an inactive (red) status in the TPC GUI, it most likely needs to be
restarted.

When this condition occurs, you might not want to restart the whole Common Agent, since other
subagents might be working. To restart just the fabric subagent on a managed server, use the
TPCFabric command. The command accepts either a start or stop parameter.
On Windows, the TPCFabric.bat command is located in the following directory:
C:\Program Files\IBM\TPC\ca\subagents\TPC\Fabric\bin\w32-ix86
C:\Documents and Settings\Administrator>cd C:\Program Files\IBM\TPC\ca\subagents
\TPC\Fabric\bin\w32-ix86
C:\Program Files\IBM\TPC\ca\subagents\TPC\Fabric\bin\w32-ix86>tpcfabric
Usage: TPCFabric.bat [start] [stop]
Usage: TPCFabric.bat [start] [stop]
C:\Program Files\IBM\TPC\ca\subagents\TPC\Fabric\bin\w32-ix86>tpcfabric stop
Successfully stopped bundle: file:///c:\Program Files\IBM\TPC\ca\subagents\TPC\Fa
bric\TPCFabric_win32_i386.jar
C:\Program Files\IBM\TPC\ca\subagents\TPC\Fabric\bin\w32-ix86>tpcfabric start
BTC3146I Successfully started bundle: file:///c:\Program Files\IBM\TPC\ca\subagen
ts\TPC\Fabric\TPCFabric_win32_i386.jar

On AIX, the TPCFabric.sh command is located in the /opt/IBM/TPC/ca/subagents/TPC/Fabric/bin/aix
directory.
# cd /opt/IBM/TPC/ca/subagents/TPC/Fabric/bin/aix
#
# ./TPCFabric.sh
Usage: ./TPCFabric.sh { start | stop }
Usage: ./TPCFabric.sh { start | stop }
# ./TPCFabric.sh stop
# Successfully stopped bundle:
file:////opt/IBM/TPC/ca/subagents/TPC/Fabric/TPCFabric_aix_power.jar
Successfully stopped bundle:
file:////opt/IBM/TPC/ca/subagents/TPC/Fabric/TPCFabric_aix_power.jar
./TPCFabric.sh start
# BTC3146I Successfully started bundle:
file:////opt/IBM/TPC/ca/subagents/TPC/Fabric/TPCFabric_aix_power.jar
BTC3146I Successfully started bundle:
file:////opt/IBM/TPC/ca/subagents/TPC/Fabric/TPCFabric_aix_power.jar


5.4.2 Restarting a stopped or failed Data Agent subagent


If you find a Data agent that shows a Down (red) status in the TPC GUI, it most likely needs to be
restarted.

When this condition occurs, you might not want to restart the whole Common Agent, since other
subagents might be working. To restart just the Data subagent on a managed server, use the tpcdagt1
command. The command will accept a start or stop parameter.
On Windows, the tpcdagt1.bat command is found in the following directory:
C:\Program Files\IBM\TPC\ca\subagents\TPC\Data
C:\Documents and Settings\Administrator>cd "C:\Program
Files\IBM\TPC\ca\subagents\TPC\Data"
C:\Program Files\IBM\TPC\ca\subagents\TPC\Data>tpcdagt1
Usage: tpcdagt1.bat [ start | [stop [abort | normal]] ]
C:\Program Files\IBM\TPC\ca\subagents\TPC\Data>tpcdagt1.bat stop
4/19/06 3:12:41 PM AGT0040E: Agent Shutting down
C:\Program Files\IBM\TPC\ca\subagents\TPC\Data>tpcdagt1.bat start
BTC3146I Successfully started bundle: file:///c:\Program Files\IBM\TPC\ca\subagen
ts\TPC\Data\agent\lib\TPCData_win32_i386.jar
C:\Program Files\IBM\TPC\ca\subagents\TPC\Data>

On AIX, the tpcdagt1 command is found in the /opt/IBM/TPC/ca/subagents/TPC/Data directory.


# cd /opt/IBM/TPC/ca/subagents/TPC/Data
# ./tpcdagt1
Usage: ./tpcdagt1 [-host hostname] [-port portnum] [ start | [stop [abort |
normal]] ]
# ./tpcdagt1 stop
# 4/19/06 3:37:39 PM AGT0040E: Agent Shutting down
./tpcdagt1 start
# BTC3146I Successfully started bundle:
file:////opt/IBM/TPC/ca/subagents/TPC/Data/agent/lib/TPCData_aix_power.jar

5.4.3 Stopping and restarting multiple TPC data agents


You might encounter a situation where you need to stop all of your data agents immediately. A customer
inadvertently started a manual data scan on all of his 500+ agents, and couldn't tolerate the increased
server or network load at that time of day. There is no simple way to stop the scan once it's started. The
answer is to stop and restart the data agent. How do you do that for all of your installed TPC Data
agents? It's not a process you want to perform on a regular basis, but it's indispensable when needed.


Backup your TPC Server


Since this procedure requires that you temporarily modify the configuration on your TPC server, it is
highly recommended that you back up your server before you begin. TPC backup guidance is provided
in section 5.7 and also in the TPC Advanced Topics Redbook. At the very minimum, you should
make a copy of your TPC installation directory (default: C:\Program Files\IBM\TPC).
Stop the TPC Data Agents
The second order of business is to get the TPC Data agents stopped. (The first was making a backup of
your TPC server, remember?) You will notice that Administrative Services -> Data Sources -> Data
Agents doesn't allow you to shut down more than one agent at a time. There's a fairly simple, although
not very intuitive, process to shut down all of your data agents at once.
1. Run an Agent report at Data Manager -> Reporting -> Asset -> Agent -> By Agent.
2. Highlight all of the agents in the list. Mouse click the first entry in the list, scroll to the bottom of
the list, hold the Shift key, and mouse click on the last entry. (Ctrl-A didn't work for me.)
3. Click the right mouse button while hovering over the list to show the context menu.
4. Select Shutdown -> Abort. This will cause any running processes at the Data
agent level to abort, and stop the Data agent.

5. A warning box will appear. Click the button labeled "YES, Stop the agents". Another warning box
will then appear.


This is the old double check to make sure you REALLY want to do this. Click the button labeled
"YES, Process the requests".
6. The requests will be issued, and after some time, the report column Agent Status will be
updated with Down for each agent.
7. Produce a CSV file of the data in the report at this point. It will be used next to build a script to
restart all of the agents. Select File -> Export Data to create the CSV file.
Create a script to restart the TPC Data agents
In order to restart the TPC Data agents, each of the common agents needs to execute a command to
restart the Data subagent. We can do this from the TPC Server, and can create a script to provide some
simplification and avoid the finger checks of doing the process manually. The overall script will do the
following:
1. Temporarily modify the Common Agent configuration on the TPC Server to enable the restart
process.
2. Restart each TPC Data agent using the Common Agent's agentcli command. The script will
contain two agentcli commands for each agent to be restarted. We'll capture the command output
in a log file to verify the command execution, and have a record of the process.
3. Restore the Common Agent configuration on the TPC Server.
You will need to create a script, AgentRestart.bat on Windows, and store it in the C:\Program
Files\IBM\TPC\ca directory. If you installed TPC into a different directory, use the correct path to save
the AgentRestart.bat script.
The script will need a setup section that gets the path set correctly. If TPC is installed somewhere other
than the default location, you will need to change this section to the correct disk and path:
@echo off
REM **********************************************************************
REM **  AgentRestart.bat                                                **
REM **  Used to restart TPC Data Agents after a massive stop command    **
REM **  issued from the TPC GUI using the Data Manager report           **
REM **  Data Manager->Reporting->Asset->Agents->By Agent.               **
REM **                                                                  **
REM **  TPC Hints and Tips for TPC V3.3 -- 07/30/2007                   **
REM **********************************************************************
setlocal
REM You need to change the following line if you installed TPC in a
REM    non-default directory. Use 8.3 notation since no spaces or quotes
REM    are allowed in basic Windows scripts.
set TPC_Disk=C:
set TPC_CA_Path=C:\Progra~1\IBM\TPC\ca
REM Change to the CA directory
%TPC_Disk%
cd %TPC_CA_Path%

Next, the certificates will need to be changed to issue agentcli commands on different servers. We need
to use the Data Server certificates to get the authority to execute agentcli commands on the different
servers where TPC agents are installed. This next section will backup the TPC\CA\cert directory (by
renaming it to certcopy), and will then copy the TPC\Data\cert directory to the TPC\CA level. (The
original CA certificates will be restored at the end of the script.)
:copy_cert
REM We need to rename the CA cert directory to preserve it, and then copy
REM    the Data Server cert directory into the CA path
@echo.
@echo Backup CA\cert directory
move /Y cert certcopy
xcopy ..\Data\cert cert /s /k /r /h /i

Next is the important stuff. This section must be duplicated for each server that is to have its Data agent
restarted. You should use the data from the Agent Report CSV file to get the list of agent servers to
restart. The CSV file also contains the agent's listening port. The default is 9510, but it could have been
changed when the agent was installed. Check that value also.

What's happening here is that the TPC Server's Common Agent is acting as a conduit to push agentcli
commands down to the specified agent, where they are executed. The first agentcli command starts the
Data agent, and the second agentcli command verifies that the agent has been started successfully; you
should see a status of Active at the beginning of the output from this command.
REM ----------------------------------------------------------------------
set fqdn=sonja.yourcompany.com
@echo.
@echo Restart %fqdn%
call agentcli -host %fqdn% -port 9510 deployer start TPCData
call agentcli -host %fqdn% -port 9510 deployer list bundles state | find "TPCData"

Example output:

Restart sonja.yourcompany.com
BTC3146I Successfully started bundle: file:///C:\Program Files\IBM\TPC\ca\subagents\TPC\Data\agent\lib\TPCData_win32_i386.jar
Active file:///C:\Program Files\IBM\TPC\ca\subagents\TPC\Data\agent\lib\TPCData_win32_i386.jar

After all the Data agents have been restarted, the script needs to restore the TPC\CA\cert directory.
This requires that the Common Agent be stopped to remove any reserves on the files. (The PING
command is used here as a wait command.) The CA service name might be different. Be sure to check
for the correct name in the Services Panel.
:restore_cert
REM We need to restore the CA\cert directory. To do this, we need to stop
REM    the CA, restore the directory, and then restart the CA
@echo.
@echo Restore CA\cert directory
net stop "IBM Tivoli Common Agent - 'C:\Program Files\IBM\TPC\ca'"
PING 1.1.1.1 -n 1 -w 10000 > NUL
del cert /f /s /q
rmdir cert
move /Y certcopy cert
net start "IBM Tivoli Common Agent - 'C:\Program Files\IBM\TPC\ca'"

Run the script to restart TPC Data agents
Once the script has been created, you should run the AgentRestart.bat command from the directory
where it resides. It is a very good practice to direct the output to a log file, so you can examine the
results afterward. Example invocation:
D:\Mytools> AgentRestart.bat > AgentRestart.log

An example AgentRestart.bat file is included in Appendix A. Be sure to use it as a template only. It
contains five example servers on which to restart TPC Data agents, providing an example of the
sequencing necessary.

If things go wrong and the script aborts before restoring the TPC\CA\cert directory, the directory
will have to be restored and the Common Agent restarted manually. Remember, the original certs are
in the directory \TPC\CA\certcopy. Good thing you made a backup of your TPC server, right?
You can issue other agentcli commands to get information from remote Common Agents using this
same methodology. For example, to get the version of a Common Agent, you could build the
command call agentcli -host %fqdn% -port 9510 configurator getConfig agent.version. This will
return the installed version of the CA that you pointed to.
C:\Program Files\IBM\TPC\ca>agentcli -host sonja.yourcompany.com -port 9510 configurator getConfig
agent.version
1.2.3.5

5.4.4 Check the status of a Fabric subagent


You can check the status of a Fabric subagent (but not a Data subagent) by using the Common Agent's
agentcli command.
1. Open a command line prompt, and CD to the Common Agent base directory. The default is
C:\Program Files\IBM\TPC\ca.
2. Issue the command: agentcli TPCFabric ServiceManager get status
3. The output will show the status of each Fabric service. They should all be in the running state.
C:\Program Files\IBM\TPC\ca>agentcli TPCFabric ServiceManager get status
=============================
General Information
=============================
Host:              sonja/9.43.237.191
Operating System:  Windows 2003 5.2 x86
Endpoint Version:  1.2.2.8
JRE:               IBM Corporation 1.4.2
=============================
Services
=============================
-----------------------------
ServiceManager
-----------------------------
Version:     3.1.1.11
Run Status:  Running
Description: ServiceManager
-----------------------------
ConfigService
-----------------------------
Version:     3.1.1.11
Run Status:  Running

Description: Generic services for getting and saving application properties
-----------------------------
SANAgentScanner
-----------------------------
Version:     3.1.1.11
Run Status:  Running
Description: Collects data on agents
-----------------------------
SANAgentInbandChangeAgent
-----------------------------
Version:     3.1.1.11
Run Status:  Running
Description: Listens for inband events
-----------------------------
log
-----------------------------
Version:     3.1.1.11
Run Status:  Running
Description: Logging Toolkit
-----------------------------
SANAgentHostQuery
-----------------------------
Version:     3.1.1.11
Run Status:  Running
Description: Agent heartbeat and registration

5.4.5 Forcing a TPC Agent to use a particular IP address


When installing TPC agents on machines with multiple NIC cards, the order of network selection might
be such that the TPC agent provides the wrong network address for the TPC server to use when
contacting the agent. This will cause the TPC agent to wait indefinitely, and it will never show up as
active in the TPC GUI. There is a workaround for this situation, documented below. The installed agents
must be at version 3.1.1.11 or higher for this to work, so upgrade the agents if they are at a lower level.
1. Shutdown the TPC Server services (data server and device server).
2. Shutdown the common agent on the agent machine.
a. For Windows, use the Services panel
b. For UNIX, use the endpoint.sh command
3. For the fabric agent, edit
/opt/IBM/TPC/ca/subagents/TPC/Fabric/config/user.properties and add:
my.ip=<desired agent ip address>
which is the IP address the server should use to communicate with the agent, and:
my.name=<corresponding fully qualified hostname>
which will resolve to the correct IP address for the agent. Test this by using the nslookup
command on the TPC server machine.
These keys are case sensitive. Make sure that my.ip and my.name are specified in lowercase as
shown.
4. For the data agent, edit /opt/IBM/TPC/ca/subagents/TPC/Data/config/agent.config and
add:
networkName=<fully qualified hostname>
which will resolve to the correct address on the server for the agent. It is also case sensitive, so
make sure networkName is specified exactly as shown.
5. Start the common agent. Make sure the data/fabric subagents are running.
6. Start the data server.
7. Start the device server.

TPC V3.1.3 has some new function that addresses this issue. When you install an agent on a machine
locally or remotely and the machine has more than one NIC card, the TPC agent installer will determine
the NIC card to use for two-way communication between the server and agent. If there are no NIC cards
that can be used for two-way communication, the installer will return an error message.
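On a scripted rollout, the property edits in steps 3 and 4 can be appended rather than typed by hand. A sketch (the helper name is mine; the file paths are passed in so the example can run against scratch copies):

```shell
#!/bin/sh
# pin_agent_address: append the address-override properties described above.
#   $1 = fabric agent user.properties   $2 = data agent agent.config
#   $3 = desired IP address             $4 = fully qualified hostname
pin_agent_address() {
    # Step 3: fabric agent -- keys must be lowercase, as noted above
    printf 'my.ip=%s\nmy.name=%s\n' "$3" "$4" >> "$1"
    # Step 4: data agent
    printf 'networkName=%s\n' "$4" >> "$2"
}

# Example against scratch copies of the two files
up=$(mktemp); ac=$(mktemp)
pin_agent_address "$up" "$ac" 9.43.237.191 sonja.yourcompany.com
cat "$up" "$ac"
```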

5.4.6 How to exclude devices from Fabric agent scans


There is a very important consideration to remember if you want to run the Fabric agent on a system that
accesses removable media devices such as tape. Some removable media devices cannot handle
command queuing, causing a long tape read or write command to fail. If a tape device is performing a
long running I/O operation, and TPC issues a SCSI query to that device, the long running I/O can be
terminated. This can typically happen with TPC fabric agents on TSM servers, and TSM will start
logging I/O errors.
You must identify these tape devices by WWN, and exclude them from Fabric agent attribute scans. To
avoid this problem, you need to add the tape device WWNs to a TPC Fabric configuration file called
ExcludeList, and place it in the Fabric agent's conf directory.
It is a best practice to create one ExcludeList that contains the device WWNs for all removable media
devices, and copy it to all Fabric agent machines that contain removable media devices.
Follow these steps to exclude removable media devices from Fabric attribute scans:
1. Stop the Fabric agent.
C:\Program Files\IBM\TPC\ca\subagents\TPC\Fabric\bin\w32-ix86\TPCFabric.bat stop

2. Create a file named ExcludeList in the following directory: <agent_install_dir>/conf


3. In the ExcludeList file, enter the worldwide name (WWN) of each tape device that you want to
be excluded from Fabric agent scans, one WWN per line. Make sure you use the device WWN,
and not the port WWN.
You can find the device WWNs by looking at IBM TPC -> My Reports -> System
Reports -> Fabric -> Port Connections. You can also look in the fabric switch's management
server name list for this WWN information.
4. Save the file. The file name needs to be exactly the same as the one provided above. Here is an
example of an ExcludeList file to exclude four device WWNs from receiving SCSI commands
from the Attribute scanner:
500500cdc920d02a
500500cdc920ccf9
500500cdc93f51ca
500500cdc926342a

5. Restart the Fabric agent.


C:\Program Files\IBM\TPC\ca\subagents\TPC\Fabric\bin\w32-ix86\TPCFabric.bat start
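A malformed WWN in the ExcludeList will simply fail to match, so the device keeps receiving SCSI queries with no error to tell you why. It can be worth validating the file before copying it out to the Fabric agent machines; a sketch that checks every non-empty line is exactly 16 hexadecimal digits:

```shell
#!/bin/sh
# check_excludelist: succeed only when every non-empty line of the file
# is a 16-hex-digit device WWN.
check_excludelist() {
    ! grep -v '^[[:space:]]*$' "$1" | grep -i -v -q '^[0-9a-f]\{16\}$'
}

# Example: the four WWNs from the sample file above pass the check
f=$(mktemp)
printf '500500cdc920d02a\n500500cdc920ccf9\n500500cdc93f51ca\n500500cdc926342a\n' > "$f"
if check_excludelist "$f"; then echo "ExcludeList OK"; fi
```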


5.5 DB2 maintenance, tuning and configuration for TPC


TPC performance can be improved by regular DB2 maintenance and by tuning some DB2 parameters.

5.5.1 DB2 Maintenance Steps


TPC database maintenance consists of three tasks:
1. updating statistics (runstats) - DB2 uses this information to determine the most efficient way to store
and retrieve data.
2. reorganization (reorg) - this is comparable to defragmentation of a disk, and improves the efficiency
of data retrieval operations.
3. backup - this step is essential to protect your TPC data and provide recovery capability in the event of
a system failure or other problem that could lead to loss of data.
DO I NEED TO STOP TPC SERVICES?
Although it is possible to perform DB2 maintenance while TPC is running (if archive logging is
configured), it is recommended that TPC data and device server services be stopped while performing
database maintenance. This is especially recommended for large TPC environments with a lot of TPC
server and database activity, in order to avoid server performance problems. Instructions to stop TPC
services can be found in the TPC Command Reference in Appendix B, or in the TPC Information Center.
RUNNING DB2 COMMANDS
Database maintenance commands must be run from a user account that has database administrator
authority, and must be run from a DB2 command environment:
Windows: Start -> All Programs -> IBM DB2 -> DB2COPY1 (default) -> Command Line Tools ->
Command Window
Unix: login as the DB2 administrator user (typical default: 'su - db2inst1')
(All steps shown below assume a DB2 command environment.)
Establish a connection to the database prior to running your commands:
db2 connect to TPCDB

STEP 1 - UPDATING STATISTICS (RUNSTATS)


The easiest way to update statistics is to use the DB2 'reorgchk' command with the update statistics
option:
db2 reorgchk update statistics on schema tpc >filename

Redirecting output to a file as shown here is optional, but recommended so that the file can be reviewed
to determine where reorg is needed.
STEP 2 - REORGANIZATION (REORG)

There are two ways to reorganize the TPC database. You can reorganize all tables in the database, or
you can reorganize only the tables and indexes that require it. If you want to reorganize all tables, use
this command to create a list of the table names:
Unix example:
db2 list tables for schema tpc show detail | grep T_ >filename

Windows example:
db2 list tables for schema tpc show detail | find "T_" >filename

You will now create a command file for DB2 by editing the output file of table names, and changing
each line to this format:
REORG TABLE TPC.tablename;

where 'tablename' is the name of the table (don't forget the semi-colon ';' at the end). For example:
REORG TABLE TPC.T_RES_CONFIG_DATA;

To reorganize only the tables and indexes that need it, first complete step #1 to update statistics and
create a log file from the reorgchk command.
The log file lists each table, and at the end of the line there are flags that will indicate reorg is needed
with the presence of a '*' character. If reorg is not needed, you will see only a series of hyphens and no
'*' such as:
-----
You will create a command file for DB2 with a line for each table that needs to be reorganized, based on
the reorgchk report, using the format/syntax shown in the example above.
After the list of tables in the report, the database indexes are listed. Similar to the tables, each index
entry line ends with a set of flags indicating if reorganization is needed. If the flags include one or more
'*' characters, you should add an entry to your command file to reorganize the indexes for that table:
REORG INDEXES ALL FOR TABLE TPC.tablename;

After you have assembled your command input file listing all tables and indexes to be reorganized, you
are ready to run the commands:
db2 -tvf commandfile >outputfilename 2>&1

After completing the reorg, you should perform step #1 again to update statistics so that the statistics
reflect the reorganized database.
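Extracting the flagged tables from the reorgchk output can be automated. The sketch below assumes a simplified, mocked-up report layout with the table name in column two and the REORG flags in the last column; real reorgchk output has more columns and different labels, so the field positions would need adjusting against your own file before use.

```shell
#!/bin/sh
# flagged_reorgs: emit a REORG statement for every table whose flag
# column contains a '*' (meaning reorganization is recommended).
flagged_reorgs() {
    awk '$1 == "Table:" && $NF ~ /\*/ { print "REORG TABLE " $2 ";" }' "$1"
}

# Example with a mocked-up two-table report fragment
r=$(mktemp)
cat > "$r" <<'EOF'
Table: TPC.T_RES_CONFIG_DATA  1289  44  -*-
Table: TPC.T_RES_HOST           52   3  ---
EOF
flagged_reorgs "$r"   # prints: REORG TABLE TPC.T_RES_CONFIG_DATA;
```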
STEP 3 - BACKUP YOUR DATABASE
Here are the commands to run to do a simple database backup.
Windows:
db2 CONNECT TO TPCDB
db2 QUIESCE DATABASE IMMEDIATE FORCE CONNECTIONS
db2 CONNECT RESET
db2 BACKUP DATABASE TPCDB TO c:\backups WITH 2 BUFFERS BUFFER 1024 PARALLELISM 1 WITHOUT
PROMPTING

db2 CONNECT TO TPCDB
db2 UNQUIESCE DATABASE
db2 CONNECT RESET

Unix:
db2 "CONNECT TO TPCDB"
db2 "QUIESCE DATABASE IMMEDIATE FORCE CONNECTIONS"
db2 "CONNECT RESET"
db2 "BACKUP DATABASE TPCDB TO /backups WITH 2 BUFFERS BUFFER 1024 PARALLELISM 1 WITHOUT
PROMPTING"
db2 "CONNECT TO TPCDB"
db2 "UNQUIESCE DATABASE"
db2 "CONNECT RESET"

These examples show c:\backups and /backups as the target directory for the backup, and these should
be changed to a safe/suitable existing location for your environment.
There are other options for backing up your database, including using Tivoli Storage Manager. Please
refer to chapter 3 of the TPC Advanced Topics Redbook for more information.
RESTORING YOUR DATABASE
This is not a routine maintenance step, but since we have included the database backup step, the
database restore step is included here for reference. In the example below, 20100105 is the date stamp
of the database backup you wish to restore, c:\backups or /backups is the source where the backup is
located, and E: and /db2 is the target for the restore where your database is located.
You must stop your TPC services during a database restore.
Windows:
db2 RESTORE DB TPCDB FROM C:\backups TAKEN AT 20100105 TO E: INTO TPCDB WITH 2 BUFFERS
BUFFER 1024

Unix:
db2 RESTORE DB TPCDB FROM /backups TAKEN AT 20100105 TO /db2 INTO TPCDB WITH 2 BUFFERS BUFFER
1024

5.5.1.1 Tip Safely Backing Up DB2 On Windows


On Windows, you can ensure a quiet, safe environment for backing up the TPCDB database by
following these steps (suitable for whichever logging settings you are using):
1. stop TPC jobs (i.e., discoveries, probes, scans, monitors)
2. set services to Manual in services.msc panel
3. reboot the TPC server
4. perform the backup
5. reset services to 'Automatic'
6. reboot or restart services
7. restart TPC jobs
Unfortunately on Unix, it is not easy to prevent services from starting on reboot, so this tip applies only
to the Windows environment.

5.5.2 DB2 Performance Tuning


There are some tuning parameters that can be applied to the DB2 instance that runs your TPC repository.
These parameters will help TPC performance, but will consume more memory. Check that your
TPC/DB2 server has sufficient memory to accommodate these parameters.
Another recommendation for DB2 v9.x is to turn the self-tuning memory manager (STMM) function
OFF. This may seem counterintuitive, but experience has shown that TPC DB2 performance is better if
the database is configured based on recommended settings. It also saves the CPU time that DB2 uses
swapping memory around from one pool to another, allowing for a more stable environment. The
command needed to do this is:
db2 update db cfg using SELF_TUNING_MEM OFF

Important: if you are turning STMM OFF for an environment that has been running with it ON, you
should also set the DB2 memory pools that are managed by STMM to something other than
AUTOMATIC. If you do not do this, these pools will remain at the value previously set by STMM,
which may have been right for the conditions in DB2 at that particular time, but may be inadequate with
different database activity. Here are the pools that should be set, and suggested values:
db2 update db cfg for TPCDB using DATABASE_MEMORY 120000
db2 update db cfg for TPCDB using LOCKLIST 2500
db2 update db cfg for TPCDB using PCKCACHESZ 20000
db2 update db cfg for TPCDB using SHEAPTHRES_SHR 2500
db2 update db cfg for TPCDB using SORTHEAP 20000

The following table shows the parameters that should be changed to improve performance:

Parameter Description                  DB2 Parameter Name  Current Value  TPC Recommendation
Default application heap size (1)      APPLHEAPSZ          10240          20480
Database heap size (1)                 DBHEAP              1000           1500
Log buffer size (1)                    LOGBUFSZ            8              512
Log file size (2)                      LOGFILSIZ           2500           20000 *
Number of primary log files (2)        LOGPRIMARY          8              150 *
Number of secondary log files (2)      LOGSECOND           100            100 *
Max DB files open per application (3)  MAXFILOP            64             1024
Monitor heap size (4)                  MON_HEAP_SZ         132            1024
Statement heap size (1)                STMTHEAP            10240          20480
IBMDEFAULTBP - Buffer pool size (5)    IBMDEFAULTBP        250            4000
TPCBFPDATA - Buffer pool size (5)      TPCBFPDATA          250            25000
TPCBFPKEYS - Buffer pool size (5)      TPCBFPKEYS          250            2500
TPCBFPTEMP - Buffer pool size (5)      TPCBFPTEMP          250            1000
* Note: it is very important to make sure you have adequate disk/filesystem space before making
changes to the LOGFILSIZ, LOGPRIMARY, and LOGSECOND parameters. The numbers given here will
result in approximately 21 GB of log file space. If you have less available space, use smaller numbers.
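As a quick sanity check, the log space requirement can be computed from the parameters themselves: each DB2 log file occupies LOGFILSIZ 4 KB pages, and there are up to LOGPRIMARY + LOGSECOND files. A small sketch (the helper name is made up for illustration):

```shell
# log_space_gb: approximate disk space needed for DB2 transaction logs, in GB.
# $1 = LOGFILSIZ (in 4 KB pages), $2 = LOGPRIMARY, $3 = LOGSECOND
log_space_gb() {
  awk -v f="$1" -v p="$2" -v s="$3" \
    'BEGIN { printf "%.1f\n", f * 4096 * (p + s) / 1e9 }'
}

# With the recommended values from the table above:
log_space_gb 20000 150 100
```

This works out to roughly 20.5 GB, consistent with the approximately 21 GB figure noted above.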
You can use DB2 Control Center to change the DB2 configuration values. Open Control Center, and
navigate to All Systems--><system_name>-->Instances-->DB2. Right-click on DB2, and select
Configure Parameters from the context menu. All parameters in groups 1-4 can be set in Control
Center. Group 5 parameters must be set using DB2 commands issued in a DB2 command window. Here
is where to look for the parameters that should be changed (refer to the number in parentheses next to
the parameter description in the table):
(1) TPCDB Performance group
(2) TPCDB Log group
(3) TPCDB Application group
(4) DB2 (instance) Performance group
(5) Use DB2 commands to change these parameters

Scroll down to the appropriate section, and highlight the parameter you want to change. Select the value
column, and click the edit symbol to enter a new value. Enter the new value in the entry field and
click OK. Do this for each of the values in the list above. Once you've changed all the values, click OK in
the DBM Configuration Panel. You will see a DB2 message saying the parameters were updated
successfully. Dismiss the message window. Exit the DB2 Control Center.
You can also issue DB2 commands to make the changes. This is especially useful when a graphical
environment is not available (e.g., DB2 Control Center is no longer available on Unix platforms starting
with version 9.x). The following commands must be issued from a sourced DB2 environment. This can
be done in Windows by opening a DB2 Command Window, or on AIX by logging in with the DB2 user
account, and/or sourcing the DB2 profile
(. /home/db2inst1/sqllib/db2profile).
Here are commands which can be issued from within a DB2 command window to make the changes. On
Unix the DB2 command should be enclosed in double quotes ("). For example:
db2 "get snapshot for all bufferpools" >tpcdb_bufferpools_old.log
db2 connect to tpcdb
--record current settings:
db2 get db cfg for tpcdb >tpcdb_cfg_old.log
db2 get snapshot for all bufferpools >tpcdb_bufferpools_old.log
--include this command for v9.x environments
db2 update db cfg using SELF_TUNING_MEM OFF
db2 update db cfg for TPCDB using DATABASE_MEMORY 40000
db2 update db cfg for TPCDB using LOCKLIST 2500
db2 update db cfg for TPCDB using PCKCACHESZ 2000
db2 update db cfg for TPCDB using SHEAPTHRES_SHR 2500
db2 update db cfg for TPCDB using SORTHEAP 20000

--tpcdb configuration tuning:
db2 update db cfg for tpcdb using APPLHEAPSZ 20480
db2 update db cfg for tpcdb using DBHEAP 1500
db2 update db cfg for tpcdb using LOGFILSIZ 20000
db2 update db cfg for tpcdb using LOGPRIMARY 150
db2 update db cfg for tpcdb using LOGSECOND 100
db2 update db cfg for tpcdb using LOGBUFSZ 512
db2 update db cfg for tpcdb using MAXFILOP 1024
db2 update db cfg for tpcdb using STMTHEAP 20480
--bufferpool tuning:
db2 alter bufferpool IBMDEFAULTBP immediate size 4000
db2 alter bufferpool TPCBFPDATA immediate size 25000
db2 alter bufferpool TPCBFPKEYS immediate size 2500
db2 alter bufferpool TPCBFPTEMP immediate size 1000
--record new settings:
db2 get db cfg for tpcdb >tpcdb_cfg_new.log
db2 get snapshot for all bufferpools >tpcdb_bufferpools_new.log
db2 connect reset
--database manager tuning (instance on Windows is DB2, on Unix default is db2inst1):
db2 attach to DB2
--record current settings:
db2 get dbm cfg >tpcdb_dbmcfg_old.log
db2 update dbm cfg using MON_HEAP_SZ 1024
--record new settings:
db2 get dbm cfg >tpcdb_dbmcfg_new.log
db2 detach

Once the DB2 UPDATE commands have been successfully issued, any processes that are using DB2
must be stopped so DB2 can be restarted, and then the services can be restarted. An example for
Windows is shown below. Be sure to shut down the GUI if it is open on the TPC server before recycling
the services.
net stop "IBM TotalStorage Productivity Center - Data Server"
net stop "IBM WebSphere Application Server V6.1 - DeviceServer"
net stop "IBM WebSphere Application Server V6.1 - Tivoli Agent Manager"
REM The next two commands are to allow DB2 to quiesce before restarting.
echo Wait about 30 seconds and then
pause
db2stop
db2start
net start "IBM WebSphere Application Server V6.1 - Tivoli Agent Manager"
net start "IBM WebSphere Application Server V6.1 - DeviceServer"
net start "IBM TotalStorage Productivity Center - Data Server"

The pause command is used to let DB2 quiesce before it is stopped.

5.5.3 Increasing the licensed processor limit for DB2


A licensed version of DB2 comes with TPC, and is meant to be used only with TPC. The license as
DB2 is shipped is limited to a single-processor environment. This has caused some customers to be concerned
that they might be out of compliance with DB2's license agreement if they install on a machine that has multiple
processors, which TPC requires. The error that is generated is shown below:
ADM12017E The number of processors on this machine exceeds the defined
entitlement of "1" for the product "DB2 Enterprise Server Edition". The
number of processors on this machine is "x". You should purchase additional
processor based entitlements from your IBM representative or authorized
dealer and update your license using the License Center or the db2licm
command line utility. For more information on updating processor based
licenses, refer to the Quick Beginnings manual for your platform. For more
information on the db2licm utility, refer to the DB2 Command Reference.

In fact, additional processor based entitlements for DB2 do not have to be purchased. Since DB2 is
provided with TPC, and TPC requires a multi-processor machine for performance considerations, and
also recommends that DB2 be installed on the same server as TPC, the customer is automatically
entitled to run it in any processor environment to support TPC. You can use the db2licm command to
modify the DB2 license to add additional processors to the DB2 license. This will suppress the error
message. Issue the command from a command prompt (or terminal session on AIX).
C:\Documents and Settings\tivoli.SONJA>db2licm -n DB2ESE 4
DBI1418I The number of licensed processors on this system has
been updated successfully.

5.5.4 Checking the DB2 Listener port


Occasionally, when installing TPC on AIX, you can get an error message that TPC cannot contact the
DB2 instance that you've specified. One of the common problems is that DB2 is not listening on port
50000 for TPC requests.

There is a parameter in DB2, SVCENAME, that contains the name of the TCP/IP port which a database
server will use to await communications from remote client nodes (TPC). In order to accept connection
requests from a database client using TCP/IP, the database server must be listening on a port designated
to that server.
The DB2 TCPIP service name (SVCENAME) must be defined in the DB2 database manager
configuration. In addition, the database server port (50000 in our case) and its TCP/IP service name need
to be defined in the services file on the database client.
Here's how to check the DB2 configuration for the listening port. The examples are taken from AIX,
since the problem seems to be limited to that platform.
1. Test to see if there is a TCP/IP listener on port 50000. Use the netstat command. If port 50000 is
listening, this might not be your problem. But follow the rest of the steps here to make sure that
port 50000 is in use by DB2.
# netstat -na | grep 50000
tcp4       0      0  *.50000                *.*                    LISTEN

2. Open a terminal session, and source the DB2 instance to initialize the environment.
# cd /home/db2inst1/sqllib
# . ./db2profile

3. Retrieve the TCPIP service name that is defined in the DB2 database manager configuration.
# db2 get dbm cfg | grep -i svcename
 TCP/IP Service name                         (SVCENAME) = db2c_db2inst1

4. If there is no entry for SVCENAME, you will have to create it. This can be done using the DB2
update command.
# db2 update dbm cfg using svcename db2c_db2inst1
DB20000I The UPDATE DATABASE MANAGER CONFIGURATION command completed
successfully.

5. View the associated entry in the /etc/services file. It should contain an entry similar to the
following. Notice the named port matches the name in the DB2 database manager configuration.
# cat /etc/services | grep 50000
db2c_db2inst1    50000/tcp

6. If the name in the /etc/services file doesn't exist, you should create an entry for it. This new entry
can be added at the bottom of the file. You should ensure that there are no other entries for port
50000 before you add the new line. Retry the command in the previous step to make sure, and
make sure the syntax is correct for your environment.
db2c_db2inst1    50000/tcp
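The check in steps 5 and 6 can be sketched as a small helper. The function name and the sample file below are hypothetical, used only so the example does not touch the real /etc/services:

```shell
# check_db2_port: print any services entry bound to the given TCP port.
# $1 = path to a services file, $2 = port number
check_db2_port() {
  grep -E "[[:space:]]$2/tcp" "$1"
}

# Demo against a sample file rather than the real /etc/services:
printf 'db2c_db2inst1 50000/tcp\n' > /tmp/services.sample
check_db2_port /tmp/services.sample 50000
```

Against the real file, run check_db2_port /etc/services 50000; no output means the entry is missing and should be added as described in step 6.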

7. Stop and restart DB2. This will activate the new listener port.
# db2stop
12/06/2006 10:38:10     0   0   SQL1064N  DB2STOP processing was successful.
SQL1064N  DB2STOP processing was successful.
# db2start
12/06/2006 10:38:22     0   0   SQL1063N  DB2START processing was successful.
SQL1063N  DB2START processing was successful.

8. Check for the DB2 listener port again, and make sure it's there.
# netstat -na | grep 50000
tcp4       0      0  *.50000                *.*                    LISTEN
There have also been instances where for some reason the DB2 communication mode gets reset. You
can check for this also.
1. Open a terminal session, and source the DB2 instance to initialize the environment.
# cd /home/db2inst1/sqllib
# . ./db2profile

2. Retrieve the DB2 communication setting using the db2set command.


# db2set -all | grep -i db2comm
[i] DB2COMM=tcpip

3. If the value is blank, or something other than tcpip, you will need to set it to tcpip.
# db2set DB2COMM=tcpip

4. Repeat step 2 to verify that the db2set worked correctly. Stop and start DB2 to make the new
configuration active.
# db2stop
12/06/2006 10:38:10     0   0   SQL1064N  DB2STOP processing was successful.
SQL1064N  DB2STOP processing was successful.
# db2start
12/06/2006 10:38:22     0   0   SQL1063N  DB2START processing was successful.
SQL1063N  DB2START processing was successful.

Once you've verified that DB2 is listening on port 50000, you can restart your TPC installation, and the
installer should recognize your instance of DB2.

5.5.5 Removing DB2 from AIX


NOTE: Remember to uninstall any programs that use this instance of DB2 BEFORE you uninstall
DB2! This includes TPC and Agent Manager.
The cleanup steps are:
1. Change directory to the installed DB2's instance directory:
   cd /usr/opt/db2_08_01/instance
2. List the DB2 instances:
   ./db2ilist
3. Remove the DB2 instances listed by the db2ilist command:
   ./db2idrop db2inst1
   DBI1070I Program db2idrop completed successfully.
4. Remove the DB2 Administrative Server:
   ./dasdrop `daslist`
   SQL4410W The DB2 Administration Server is not active.
   DBI1070I Program dasdrop completed successfully.
5. Change to the DB2 install media or untarred install package:
   cd /mnt/dcrom/db2
6. Remove DB2:
   ./db2_deinstall
   <lots of console output>
You will also have to clean up the userids and user groups that DB2 creates during the install process (if
they still exist):
Users:
  db2inst1
  dasusr1
  db2fenc1
Groups:
  db2grp1
  dasadm1
  db2fgrp1

5.5.6 Notes on Installing DB2


When installing TPC, if you encounter the "No valid local database found" error when it prompts for
DB2 information, here is a list of things to check to resolve the problem:

TPC is checking for a supported version. Make sure you are installing a DB2 package that is
provided specifically for TPC on Passport Advantage.
Make sure you have installed the correct edition. The DB2 installer gives you several choices for
installation. You should choose the Enterprise Server Edition (ESE).
The server should be rebooted after DB2 installation.
Check to make sure that DB2 was installed successfully and is running (check that port 50000 is
listening, services are running, a sample database can be created, etc.).
An issue can occur when DB2 is upgraded, which wipes out the database manager configuration
setting pointing to the location of the DB2 Java JDK. This will cause the installer to report "no
valid local database found" after the DB2 administrator id and password are provided. To
resolve this issue:
1. Check the database manager configuration:
   db2 get dbm cfg
2. Check the value of the variable JDK_PATH. It should point to the copy
   of the DB2 Java JDK. For example, on Windows the value is typically:
   C:\Progra~1\IBM\SQLLIB\java\jdk
3. If you need to set the value use this command run from a DB2 command shell/window:
Windows: db2 update dbm cfg using JDK_PATH C:\Progra~1\IBM\SQLLIB\java\jdk
Unix: db2 "update dbm cfg using JDK_PATH /opt/IBM/SQLLIB/java/jdk"

More information on resolving issues with the DB2 installation can be found in the technote >>>TPC
fails to detect a newly installed DB2 instance.


5.6 Uninstalling TPC


5.6.1 Silent TPC Agent Uninstall
Important note 1: this can be used to remove agents on the TPC server without damaging the TPC
server components.
A silent uninstall of the TPC agent can be performed using the uninstall.iss file. This file is found in
<TPC>/_uninst, and can also be found on the TPC install media disk1 or disk2 images.
Modify the uninstall.iss file for the components to be uninstalled, and then run the uninstall as follows:
cd <TPC>/_uninst
uninstall -silent -options uninstall.iss -is:log uninstall.log

You can use this to uninstall the data agent, fabric agent, GUI, or CLI. It cannot be used to remove TPC
server components. This will also leave behind a few items that must be cleaned up manually. You will
need to remove the common agent endpoint registry C:\Program Files\Tivoli\ep* or /opt/tivoli/ep*.
Important note 2: ONLY if you are performing this on an agent-only server (i.e., NOT the TPC
server) should you also remove the TPC install directory (C:\Program Files\IBM\TPC or /opt/IBM/TPC)
and the Installshield registry <path>/Installshield/Universal/IBM-TPC (path is C:\Program
Files\Common Files on Windows, /usr/lib/objrepos on AIX, and /root on other Unix). NEVER remove
the Installshield registry data unless all TPC software is to be uninstalled from the machine.

5.6.2 Cleaning Up A Bad TPC Install


Situations can occur where the TPC software needs to be reinstalled, but it is not possible to uninstall
successfully using the product uninstaller. This section provides details on how to cleanup the TPC
software.
CAUTION: These instructions MUST NOT be used to remove/clean up a failed agent install on the
TPC server. These instructions will remove ALL TPC software, including the TPC data and device
server. Use these instructions on a TPC server ONLY if you intend to remove ALL TPC software,
including the TPC data and device server.

5.6.2.1 Cleaning Up TPC Installs On Windows


To make sure that the system is clean prior to another installation attempt:
1. first uninstall components as far as possible through 'Add/Remove Programs'
2. make sure c:\Program Files\IBM\TPC directory tree is gone
2a. if you get error that 'files are in use', system must be rebooted
3. make sure c:\Program Files\Common Files\Installshield\Universal\IBM-TPC
directory tree is gone
4. make sure c:\Program Files\Tivoli\ep* (files and directories) are gone
5. make sure any 'itcauser' user accounts are removed
6. drop the TPCDB database if the uninstaller does not drop it (TPC server only)
7. search the Windows registry for any TPC related entries and delete any keys
found (if keys are deleted, a reboot is recommended). Search the registry for:

- Program Files\IBM\TPC
- Program Files\Tivoli\ep
- tpcd
- tsrm
- tpcgui
- tpcagent
- agent1

if AgentManager needs to be removed...


1. first uninstall components as far as possible through 'Add/Remove Programs'
2. make sure c:\Program Files\IBM\AgentManager directory tree is gone
2a. if you get error that 'files are in use', system must be rebooted
3. drop the IBMCDB database if the uninstall does not drop it
4. search the Windows registry for any AgentManager related entries and delete
any keys found (if keys are deleted, a reboot is recommended). Search the
registry for:
- Program Files\IBM\AgentManager
- Tivoli Agent Manager
if DB2 needs to be removed...
1. first uninstall components as far as possible through 'Add/Remove Programs'
2. make sure c:\Program Files\IBM\SQLLIB directory tree is gone
2a. if you get error that 'files are in use', system must be rebooted
3. make sure c:\DB2 directory tree is gone
(default instance container - may be different if default name was not used)
4. make sure any DB2 user accounts or groups are deleted
if TIP needs to be removed...
Follow the steps below if the uninstaller in <TIP>/_uninst will not work:
1. Remove the Deployment Engine -

C:\"Program Files"\IBM\Common\acsi\setenv.cmd
C:\"Program Files"\IBM\Common\acsi\bin\si_inst.bat -r -f
RMDIR /s /q C:\"Program Files"\IBM\Common\acsi
TASKKILL /F /IM jservice.exe /T
IF EXIST "%TEMP%\acu_de.log" (DEL /F "%TEMP%\acu_de.log" )
RMDIR /S /Q "%TEMP%\acsitempLogs_Administrator"
RMDIR /S /Q "%TEMP%\acsiTemp_Administrator"
REG DELETE HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\acsisrv /f

2. Remove TIP. Stop the TIP service either through the Windows Services panel or with the following
   command. If you don't know the username/password, try the TPC Admin
   username/password.
<TIP>\bin\stopServer.bat server1 -username <TIP Admin> -password <TIP Admin Password>
sc delete "IBMWAS61Service - TIPProfile_Port_16310"
sc delete "IBMWAS61Service - TIPProfile_Port_16310"

3. Remove the TIP directory. If this command fails, TIP still has active
processes. Either find and kill the processes or reboot then try the deletion again.
rmdir /s /q <TIP install dir>

if TPC-R needs to be removed...


1) Stop WebSphere Application Server (WAS) process using one of the following
methods:
a) Run ..\replication\eWAS\profiles\CSM\bin\stopserver server1
b) Go to Windows Services panel to stop WAS, or set service to manual and reboot
c) Kill WAS process using Windows Task Manager

i. To find the WAS process id, look at the file
   ..\replication\eWAS\profiles\CSM\logs\server1\service.pid
ii. or, view SystemOut.log and search for the last occurrence of "process id"
2) Go to the WAS installation directory: cd ..\eWAS\bin
3) Execute: wasservice.exe -remove CSM
4) Delete all files under the TPC-R installation root directory (..\replication)
5) Delete RM_Install.jacl from the C:\DOCUME~1\ADMINI~1\LOCALS~1\Temp directory
6) If using DB2 for the database, open a DB2 command window and run:
   drop database TPCRM

5.6.2.2 Cleaning Up TPC Installs On Unix


See Caution at the beginning of this section before following this procedure.
To make sure that the system is clean prior to another installation attempt:
1. locate and try the uninstall.sh script first:
/opt/IBM/TPC/_uninst/uninstall.sh -console -force
2. search for and kill any running processes:
ps -ef | grep -i tpc
ps -ef | grep -i nonstop
3. rm -rf /opt/IBM/TPC /etc/Tivoli/TSRM;
rm -rf /usr/tivoli/ep* /usr/Tivoli/ep* /opt/Tivoli/ep* /opt/tivoli/ep*
4. rm -rf /usr/lib/objrepos/Installshield/Universal/IBM-TPC
5. find / -name vpd.properties -print
if file found:
a. make backup copy to different name
b. edit file with vi or other record-based editor, and delete any line that is
a tpc related entry (determined by file and directory path references). if
entire file is tpc related, delete the file (keep the backup copy).
if AgentManager needs to be removed...
1. first uninstall components as far as possible by running the uninstaller
2. make sure /opt/IBM/AgentManager directory tree is gone
3. refer to step 5 above for vpd.properties for Agent Manager entry cleanup
if DB2 needs to be removed...
Refer to section 5.5.5 in this document.
if TIP needs to be removed...
This is a shell script that can be used to remove TIP if the uninstaller in the
<TIP>/_uninst directory does not work:
#!/bin/sh
# Source the DE environment
if [ -f /var/ibm/common/acsi/setenv.sh ]; then
. /var/ibm/common/acsi/setenv.sh
fi
# Uninstall DE

if [ -f /usr/ibm/common/acsi/bin/si_inst.sh ]; then
/usr/ibm/common/acsi/bin/si_inst.sh -r -f
fi
# Kill DE
kill -9 `ps -aef | grep acsi | grep -v grep | awk '{ print $2 }'` 2> /dev/null
# Kill the TIP server
kill -9 `ps -aef | grep tip | grep -v grep | awk '{ print $2 }'` 2> /dev/null
# Remove the install directories
rm -rf /var/ibm/common/acsi /usr/ibm/common/acsi /opt/IBM/Tivoli/tip
# Remove the DE logs
rm -rf ~/IA-TIPInstall*log
# Clean up the /tmp directory
rm -rf /tmp/acu_de.log /tmp/root
# Clean up /etc/inittab file
cp /etc/inittab /etc/inittab.bak
sed '/#Begin AC Deployment Engine block/d
/#Start the Cloudscape database server/d
/acsisrv.sh -start/d
/#End AC Deployment Engine block/d' < /etc/inittab >/tmp/inittab
mv /tmp/inittab /etc/inittab
# Clean up /etc/services file
cp /etc/services /etc/services.bak
sed '/# IBM ADE Service/d'< /etc/services > /tmp/services
mv /tmp/services /etc/services

if TPC-R needs to be removed...


1) Stop TPC-R server:

/opt/IBM/replication/eWAS/profiles/CSM/bin/stopServer.sh server1

a) If the command fails, go to a command prompt and get the TPC-R process IDs (pid):
   ps -ef | grep CSM
   ps -ef | grep replication
b) To stop the process:
   kill -9 <pid>

2) Delete the /opt/IBM/replication directory tree


3) If you are using DB2 for the database, drop the TPCRM database:
. /home/db2inst1/sqllib/db2profile
db2 connect to tpcrm
db2 drop db tpcrm

4) Edit the /etc/inittab file to remove the following entry:
   /opt/IBM/replication/eWAS/bin/startServer.sh


5.7 Backing Up TPC


It is important to back up your TPC environment to protect it and provide for recovery in the event of a
problem or disaster. Backups should begin as soon as possible after your TPC server is installed.
The frequency of backups depends on your company's data protection policies, TPC workload and size
of TPC environment. Designing an effective plan for backup and recovery requires careful planning,
implementation, and testing.
A comprehensive backup plan for your TPC server should provide coverage for all of the following:
1. Installation detail documentation, including user ids and passwords for DB2/AgentManager/TPC; any
deviations from normal installation defaults in case reinstallation is necessary
2. Full server backup if possible (allowing the entire server to be recovered if necessary)
3. DB2 backup of TPC databases (TPCDB, IBMCDB).
(Refer to the >>>TPC Advanced Topics Redbook Chapter 3 for detailed instructions and examples.)
4. For TPC 4.1+, back up the TIP/TPC single sign-on authentication configuration (see the backupConfig
command in Appendix C section 10.10) for TIP, the TPC device server, and the TPC replication server.
The backup file is named WebSphereConfig_yyyy_mm_dd.zip, and one will be created for each
component.
5. Registry and system files:
a) InstallShield registries
AIX: /usr/lib/objrepos/InstallShield/Universal/IBM-TPC
UNIX: /root/InstallShield/Universal/IBM-TPC
Windows: C:\Program Files\Common Files\InstallShield\Universal\IBM-TPC
b) SRM legacy registry
AIX: subsystem TSRMsrv# (# can be any number)
UNIX: /etc/Tivoli/TSRM
c) Windows registry
d) Common agent registry
AIX/UNIX: /usr/tivoli/ep*, /opt/tivoli/ep*
Windows: C:\Program Files\Tivoli\ep*
e) Hosts file
AIX/UNIX: /etc/hosts
Windows: C:\WINDOWS\system32\drivers\etc\hosts
6. Tivoli GUID setting:
(go to c:\Program Files\Tivoli\guid or /opt/tivoli/guid and do: tivguid -show >tpc_tivguid.txt)
7. Agent Manager files/directories:
<AM_install>/AppServer/agentmanager/config/cells/AgentManagerCell/security.xml
<AM_install>/AppServer/agentmanager/installedApps/AgentManagerCell/AgentManager.ear/
AgentManager.war/WEB-INF/classes/resources/AgentManager.properties
<AM_install>/os.guid
<AM_install>/certs
8. TPC server files/directories:
<TPC_install>/config
<TPC_install>/data/config
<TPC_install>/data/scripts
<TPC_install>/device/conf
9. TPC agent files/directories:
<TPC_install>/config
<TPC_install>/ca/cert
<TPC_install>/ca/config
<TPC_install>/ca/*.sys
<TPC_install>/ca/subagents/TPC/Data/config
<TPC_install>/ca/subagents/TPC/Fabric/conf
10. TPC fixes/patches, especially interim fixes or work-arounds provided by support
11. optional items (these files should be available to download from Passport Advantage):
TPC base (with license) installation software images or CD/DVDs
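The server file/directory items in point 8 can be gathered into a single archive. A minimal sketch, with a hypothetical helper name; the demo runs against a scratch copy of the directory layout rather than a live install, and <TPC_install> defaults differ by platform:

```shell
# backup_tpc_config: archive the TPC server config directories (item 8 above).
# $1 = TPC install directory, $2 = output archive file
backup_tpc_config() {
  tar -czf "$2" -C "$1" config data/config data/scripts device/conf
}

# Demo against a scratch copy of the directory layout:
mkdir -p /tmp/tpc_demo/config /tmp/tpc_demo/data/config \
         /tmp/tpc_demo/data/scripts /tmp/tpc_demo/device/conf
backup_tpc_config /tmp/tpc_demo /tmp/tpc_config_backup.tar.gz
tar -tzf /tmp/tpc_config_backup.tar.gz
```

On a real server you would pass /opt/IBM/TPC (or C:\Program Files\IBM\TPC with a Windows archiving tool) and write the archive to your backup location.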
For further information, please refer to:
>>>STE Presentation - DB2 Administration Basics for TPC
>>>TPC Advanced Topics Redbook (Chapter 3)


5.8 TPC and Windows Domain Accounts


Windows domain admin accounts used with TPC must be a member of the local administrators
group on the server.
At the time of this writing, DB2, Agent Manager, and TPC do not support Windows Global groups
(i.e., nested groups where a domain account is a member of a group that is given membership in the
local administrators group).
When installing the Agent Manager, the two formats for the DB2 admin account that will work are:
'db2admin', or 'domain\db2admin'. The 'db2admin@domain' format will not work.
The TPC install does not accept either format for the DB2 admin account name when a domain
account is used for DB2. You can work around this issue by adding a local Administrator account to
the DB2ADMNS group that is created by the DB2 default install, then use the local Administrator
account and password for all DB2 credentials during the install.
If you want to use a Windows domain account to log in to the TPC GUI, the domain account needs to
be a member of the local group that is defined in the Roles to Groups Mapping in TPC; in this
example it is the local Administrators group.
You can use a Windows domain account to run the Common Agent service. The domain account
must be a member of the local administrators group on the machine where the agent is to be
installed. If you want to specify a domain account for the service when installing the agent, you must
use the 'Custom' installation path, and use the format 'domain\itcauser' when prompted to provide the
Common Agent service account.

5.8.1 How TPC Login Authentication Works In TPC 4.x


In TPC 4.x, if the TPC Device Server is up, the TPC Data Server passes credentials to the Device Server
for authentication. If the Device Server is not up, the Data Server tries to do the authentication through
the server OS. The Device Server tries to look up the TPC admin group in Active Directory, and then
checks the specified account for membership in that group.
Note: when configuring LDAP, the LDAP server entry needs to point to an LDAP server which is
usually a Unix box, or an Active Directory domain controller.
TPC 4.x includes the Tivoli Integrated Portal (TIP), which supports a single signon login to TPC. This
means that the same userid and password used to login to TIP is used to launch TPC.

5.8.2 LDAP Basics

5.8.2.1 A Note about Windows Active Directory


Windows Active Directory server was designed to manage computers and users in a Windows domain.
It was not designed for LDAP, but it can be used as an LDAP server, with limitations. Some of these
limitations have to do with the attributes that can be used to search for directory entries.
To exploit the full flexibility and capabilities of LDAP, products designed for LDAP, such as
IBM Tivoli Directory Server or OpenLDAP, can be used.


5.8.2.2 Understanding LDAP directory entries


An LDAP directory is organized in a tree or hierarchy. Each entry has a unique identifier: its
Distinguished Name (DN). This consists of its Relative Distinguished Name (RDN), followed by the
parent entry's DN. Think of the DN as the full file path and the RDN as its relative filename in its parent
folder (e.g. if C:\foo\bar\myfile.txt were the DN, then myfile.txt would be the RDN). Instead of a
filesystem full file path, in LDAP it is a fully qualified directory path, and the filename is the userid
found under that path.
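The DN/RDN split can be illustrated with plain shell string handling; the DN here is the same example used later in this section:

```shell
# Split a DN into its RDN (the first component) and the parent entry's DN.
DN="cn=tpcadmin,dc=sanjose,dc=ibm,dc=com"
RDN="${DN%%,*}"       # everything before the first comma
PARENT="${DN#*,}"     # everything after the first comma
echo "RDN:    $RDN"
echo "Parent: $PARENT"
```

Here the RDN is cn=tpcadmin (the "filename") and the remainder is the parent entry's DN (the "folder path").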

5.8.2.3 Basic LDAP Terminology for TPC


Base distinguished name - the distinguished name in the directory tree from which to start a downward
search for the userid or RDN (cn=RDN). It is important to select a base distinguished name that is high
enough in the directory tree so that all userids to be used with TPC are contained within its
subdirectories.
It is also possible to set the base distinguished name too high, if the same userid (i.e., cn=JohnSmith)
appears in multiple directory entries under the base. In this case, authentication will fail with an error
when trying to login with that userid. This uniqueness requirement also applies to LDAP/AD group
names.
Relative distinguished name - (see above)
dn= distinguished name (example: "cn=tpcadmin,dc=sanjose,dc=ibm,dc=com"). This corresponds to a
fully qualified entry in the directory tree.
cn= common name (the relative distinguished name of the entry; in the case above it is "tpcadmin")
dc= domain component (example, sanjose.ibm.com is expressed as "dc=sanjose,dc=ibm,dc=com")
Federated repositories - a WebSphere term for a collection of one or more repositories that can be used
to satisfy authentication requests.
Realm - the name given to the authentication configuration for the application. In TPC, the default
'TIPRealm' is used, and should not be changed unless there is a valid technical reason to do so.
Bind distinguished name - the distinguished name entry of a userid with authority to connect the LDAP
or AD client (in our case the WebSphere instance running under TIP or TPC) to the LDAP server for
authentication. You have to get the bind distinguished name and password from your LDAP/AD
administrator.

5.8.2.4 Configuring TPC for LDAP Before you start


1. Identify the person or group that is the LDAP and/or Active Directory administrator at your company.
There will be information needed for configuration that they may need to provide if you do not have it,
such as the LDAP server name or ip address and port, and the bind distinguished name and password of
the user account that LDAP clients can use to connect to the LDAP/AD server.
2. Use an LDAP or Active Directory browser (such as the Softerra LDAP Administrator and Browser)
to view the directory and help validate configuration entries.

3. In Windows Active Directory environments, you can also use dsquery commands to determine LDAP
configuration parameters for use in configuring TIP/TPC for LDAP:
--determine the LDAP hierarchy for a given userid in the current Active Directory domain:
dsquery user -d ADdomain -u userid -p password
--determine the LDAP hierarchy for groups:
dsquery group -d ADdomain -u userid -p password
For more information, see "Changing the user authentication method" in the TPC Infocenter, and the
IBM Redbook "Understanding LDAP".

5.8.2.5 LDAP Issues, Tips and Solutions


TPC 4.1 adds support for LDAP authentication as an alternative to OS authentication. There are some
known issues in this area, and frequently there are problems with the existing Windows Active Directory
configuration that can cause problems for LDAP setup and configuration in TIP/TPC.
Following are recommendations based on early TPC 4.1 experience:
1. Use OS authentication (do not use LDAP) with TPC versions prior to 4.1.1. Using LDAP in 4.1.0
versions can cause subsequent TPC upgrades to fail.
2. Use OS authentication (do not use LDAP) when initially installing or upgrading to TPC 4.1 until
initial installation and configuration is completed and TPC is working properly.
3. Do not change the TIP or TPC configuration to use LDAP without first using the backupConfig
command to create a backup of the existing working configuration for both TIP and the TPC device
server. This will allow you to recover to a working environment with restoreConfig if a problem
occurs (Note that use of the backupConfig command will stop the service for TIP or the TPC device
server unless you specify -nostop, and you will need to restart the services):
--TIP:
C:\Program Files\IBM\tivoli\tip\profiles\TIPProfile\bin>
backupConfig -nostop
/opt/IBM/tivoli/tip/profiles/TIPProfile/bin>
./backupConfig.sh -nostop
--TPC device server:
C:\Program Files\IBM\TPC\device\apps\was\profiles\deviceServer\bin>
backupConfig -nostop
/opt/IBM/TPC/device/apps/was/profiles/deviceServer/bin>
./backupConfig.sh -nostop

4. Use Windows dsquery commands to determine LDAP configuration parameters for use in
configuring TIP/TPC for LDAP:
--determine the LDAP hierarchy for a given userid in the current Active Directory domain:
dsquery user -d ADdomain -u userid -p password
--determine the LDAP hierarchy for groups:
dsquery group -d ADdomain -u userid -p password

5. TPC LDAP configuration screens: to complete Windows AD LDAP configuration, you will need to
obtain information from your AD admin, including a domain admin userid and password to query AD. It
is also helpful to have a tool to query AD (example: Softerra LDAP Browser, or dsquery commands).
(Note: the screen and entry labels/references that follow pertain to the TPC installer screens, and not
the TIP screens when converting from OS authentication to LDAP.)
a) Screen 1: LDAP Server Hostname: specify your Windows domain controller server (example:
mydomain.abcd.com); LDAP Port: 389 (use the default). Since Windows AD does not allow
anonymous bind by default, you need to provide values for these optional fields. To determine the
Bind Distinguished Name, query AD for the domain admin userid entry (example:
CN=dcadmin,CN=Users,DC=abcd,DC=com). The Bind Password should be obtained from the AD
admin.
b) Screen 2: Relative Distinguished Name for usernames: this is basically the starting place in AD
to search for usernames (example: CN=Users,DC=abcd,DC=com). Attribute to use for usernames: by
default this is set to uid. It needs to be changed to cn. Relative Distinguished Name for groups: this
is the starting place in AD to search for groups (example: CN=Users,DC=abcd,DC=com). Attribute to
use for groups: defaults to cn (leave as is).
c) Screen 3: LDAP TPC Administrator userid: this is the userid in Active Directory that you want
to be the TPC administrator (example: tpcadmin). LDAP TPC Administrator password: the password
for the account you specified. LDAP TPC Administrator group: this is the AD group for the userid
that you entered. All TPC administrator users should be a member of this group (example: TPCAdmins).
d) TPC Role to Group Mappings: in the TPC gui under Administrative Services -> Configuration ->
Role to Group Mappings, for the value of Superusers, enter the TPC administrator group you defined
in the TPC installer (example: TPCAdmins).

5.8.2.6 Stopping and starting services after a configuration change


Successful authentication and single sign on in TPC depends on the synchronization of the configuration
between each of the three embedded WebSphere instances in TIP, TPC Device Server, and TPC
Replication Server.
If changes are made to the authentication configuration, services should be stopped and started in the
right order so that changes are propagated from TIP to the other services. TIP must be allowed to start
first and complete initialization so that the other services can request and receive the updated
configuration.
Stopping:
1) TPC Replication Server (CSM service)
2) TPC Device Server
3) TIP
Starting:
1) TIP
2) TPC Device Server
3) TPC Replication Server

On Windows, we recommend that you use the services.msc panel and set the startup property for each
service to Manual, and reboot the server after a change to the authentication configuration. Each
service can be started manually in the correct order, and the startup property changed back to
Automatic.

5.8.2.7 LDAP Configuration Files


There are three WebSphere instances in TPC that must be synchronized with each other in order for
authentication and single sign on to work: 1) the TIP eWAS instance, 2) the TPC Device Server eWAS
instance, and 3) the TPC-R eWAS instance.
Each of these instances has a set of configuration files that contain the settings and properties controlling
authentication. These files can be examined when trying to troubleshoot a problem with authentication
or single sign on in TPC.
For Windows x is typically C:\Program Files, and on Unix it is /opt.
TIP (x/IBM/Tivoli/tip/...):
.../profiles/TIPProfile/config/cells/TIPCell/wim/config/wimconfig.xml
.../profiles/TIPProfile/config/cells/TIPCell/config/security.xml
.../profiles/TIPProfile/properties/soap.client.props
TPC Device Server (x/IBM/TPC/device/...):
.../apps/was/profiles/deviceServer/config/cells/DefaultNode/wim/config/wimconfig.xml
.../apps/was/profiles/deviceServer/config/cells/DefaultNode/security.xml
.../apps/was/profiles/deviceServer/properties/soap.client.props
TPC Replication Server (x/IBM/replication/...):
.../eWAS/profiles/CSM/config/cells/DefaultNode/wim/wimconfig.xml
.../eWAS/profiles/CSM/config/cells/security.xml
.../eWAS/profiles/CSM/properties/soap.client.props
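When authentication breaks after a configuration change, a quick first check is whether the LDAP settings in the three wimconfig.xml files still agree. The sketch below uses the default Unix paths listed above (adjust them for your installation) and simply prints the first few LDAP-related lines of each file for side-by-side comparison:

```shell
# Print LDAP-related lines from each eWAS instance's wimconfig.xml for comparison;
# paths are the default Unix install locations (adjust to match your system)
for f in \
  /opt/IBM/tivoli/tip/profiles/TIPProfile/config/cells/TIPCell/wim/config/wimconfig.xml \
  /opt/IBM/TPC/device/apps/was/profiles/deviceServer/config/cells/DefaultNode/wim/config/wimconfig.xml \
  /opt/IBM/replication/eWAS/profiles/CSM/config/cells/DefaultNode/wim/wimconfig.xml
do
  echo "== $f"
  grep -i 'ldap' "$f" 2>/dev/null | head -5
done
```

If the output differs between instances, or one file is missing its LDAP entries entirely, use the stop/start sequence in section 5.8.2.6 to let TIP propagate the configuration again.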


5.9 Common Agent Service and the Windows LocalSystem Account


You can run the Common Agent service under the Windows 'LocalSystem' account to avoid having to
create a user account for the service. There are two methods you can use to configure this:
1. Change the account to 'LocalSystem' in the services.msc panel on the agent server after the agent
has been installed. Stop the common agent service, change the service account in the properties
panel for the service, then restart the agent. The 'itcauser' account can be deleted from the server
after this change.
2. During agent installation, use the 'Custom' installation option. When you reach the Window with
the option to specify the Common agent service account, enter the account as 'LocalSystem', and
specify a password that meets the Windows complexity security policy requirement. Note that
this password will not be set on LocalSystem on the agent server; this is only done to get past the
installer requirement to provide a value for the password. **Note: in TPC versions prior to 3.3.2
there was a known issue where this method would result in the creation of a new 'LocalSystem'
account on the agent server with the password specified. This is fixed in the latest release. A new
account should not be created.

5.9.1

Limitations

There are limitations and reduced IBM Tivoli Storage Productivity Center functionality as a result of
using LocalSystem, including:
Access to Netware, NAS, and DB2 (other RDBMSs might also be affected) might be unreliable or
might not work at all.
Scans run on Windows file servers to collect file ownership information in a domain where the
common agent is running under "Local System" might run into a problem where the "Local
System" account does not have permission to look up the Security Identifier (SID) in Active
Directory.
There is no workaround, other than to use an account other than "Local System" which can be assigned
the necessary rights/privileges/group membership.


5.10 Scheduling TPC Workload


TPC performance can be adversely impacted when too many resource intensive jobs are scheduled to
run at the same time. Performance can also be impacted if a single job targets a large number of
computers or devices, or if the computers or devices selected in one job are very large file or data
servers. Remember that jobs running against large servers not only require a lot of run time, but also
return a large amount of data that use TPC server and database resources while the data is stored.
To avoid TPC performance problems due to workload, consider the following suggestions:
1) Do not create scan/probe jobs that run against all computers or devices in large environments.
2) Create computer groups for use in scan/probe jobs based on the approximate volume of data:
a) group larger numbers of small computers and workstations together
b) keep large computers/servers such as file and database servers in smaller groups
3) Stagger or spread out your schedule so that jobs are not running or trying to return data at the
same time. Each environment is different, and changes and adjustments will be needed as your
circumstances change. Here is an example to illustrate the concept:
Operation              Frequency         Day of Week          Time of Day
CIMOM Discovery        Every 4-6 hours   All                  12/6:00 AM & PM
Host Probe             Daily             All                  12:00 AM
Host Scan              Weekly            Saturday             2:00 AM
SVC Probe              Daily             All                  12:00 AM
DS8300 Probe           Twice/week        Sunday, Tuesday      10:00 PM
DS4000 Probe           Twice/week        Monday, Wednesday    11:00 PM
Hitachi Probe          Twice/week        Thursday, Saturday   12:00 PM
HP Probe               Twice/week        Friday, Sunday       10:00 PM
Fabric_1 Probe         Daily             All                  12:00 AM
Fabric_2 Probe         Daily             All                  01:00 AM
Outband Fabric Probe   n/a               n/a                  n/a
SVC Perf Mon           23 hours          All                  05:00 PM
DS8300 Perf Mon        23 hours          All                  05:30 PM
DS4000 Perf Mon        23 hours          All                  06:00 PM
Switch Performance     n/a               n/a                  n/a
Custom Batch Reports   n/a               n/a                  n/a
History Aggregator     Daily             All                  03:30 AM

4) When putting together your schedule, you should also try to avoid conflicts or overlaps with
critical non-TPC workload, such as I/O intensive jobs like backups.


5.11 Data Path Explorer Tips


There is a known issue with viewing paths on Microsoft Cluster (MSCS) environments. A workaround
for this issue is to install a TPC fabric agent on one of the nodes in the cluster. This will allow you to
view the paths to the devices correctly.

5.12 TIP Tips (Tivoli Integrated Portal)


1. For existing instances of TIP (installed separately from TPC), the default userid is tipadmin. There
is no default password; a password would have been created by the user who installed the
instance (some instances use 'tipadmin' for both the userid and password).
2. When installed by TPC, the TIP admin user and password is the same as the TPC admin user and
password. For example, a default TPC install on Windows performed from the local Administrator
account would have 'Administrator' for the TIP admin account, and the password would be the
'Administrator' account password.
3. It is recommended to have TPC install the TIP instance.
4. If an existing TIP instance is detected by the TPC installer, it will check to see if it is a version
compatible with TPC. The installer will also check to see if it is using a compatible authentication
configuration, as other products using TIP use file based authentication which TPC does not support.
5. Each instance of TIP requires 1 GB of memory.
6. To start the TIP interface from a web browser, use this url:
https://servername:16310/ibm/console/logon.jsp
7. Once TPC and TIP are installed, configured, and working correctly it is vital to use the
backupConfig utility to backup the TIP and TPC device server configuration (see section 5.8.2).
8. The TPC installer does not accurately reflect TIP install status. Check the IA* log files in the
TPCTIPservice.zip that is part of the TPC service collection.


5.13 Netapp/NAS/Netware

Configuration of these storage devices in TPC has always been difficult:
- Successful configuration involves steps that are not well documented
- Security considerations and policies in the customer environment can add complexity
- An understanding of how scanning of these devices works is important
- Large environments and selection of proxy agents for scanning have performance implications
Proxy agents are TPC Data Agents. Storage Resource Agents do not support Netapp/NAS/Netware.

5.13.1 Netapp/NAS - Standard Configuration Steps


1) Run a discovery job or manually add the NAS/filer
2) Add root user and password for Netapp filers
3) License the discovered NAS servers you are interested in
4) Run second discovery job (for Netapp, this will discover the volumes on the filer)
5) Update Scan Probe Agent Administration (to set proxy scan agent relationships)
6) Run normal TPC scans selecting your NAS device

5.13.2 Discovery - Automatic or Manual


Some customers may not want automatic discovery to be run. If this is the case, use Manual Discovery.
Configure and run Discovery jobs to identify the servers and volumes in your Novell Directory Service
(NDS) trees by going to:
Administrative Services -> Configuration -> Manual NAS/Netware Server Entry.
A NetWare Discovery job:
- Logs in to the NDS trees and enumerates the NetWare Servers in those trees.
- Logs in to the NetWare servers in the NDS trees and gathers volume and disk information.
- Discovers NAS devices

5.13.3 Common Problems and Solutions 1

5.13.3.1 The customer will not expose a root-level userid for NAS.

Without access to the root of the NDS tree, we cannot understand the directory structure correctly.
Typically you will be told to just use SNMP to discover the NAS shares. You will know you have this
problem when you encounter this error in TPC:
Error - Server: mytpcserver   Status: 12
01/01/10 9:00:00 AM STA0248W: Cannot enumerate shares on \\nasserver.domain.com
01/01/10 9:00:00 AM GEN6014E: OS Error 5

This means that TPC cannot understand how the CIFS and NFS shares relate to each other, therefore we
cannot view all of the volumes and directories from top to bottom.

If you encounter this situation, there is a workaround, but it has limitations. You can tell TPC to look at
each share as a discrete entity. The problem with this is that TPC has no way of knowing the share
relationships, and this can result in storage reporting inaccuracy. Consider this configuration:

It is important to get the full correlated view in order to report accurately on the storage.
To implement this work around in TPC, you must modify the TPC data server TPCD.config file to add
the SaveNonRoot=1 parameter in the [server] clause:
[server]
SaveNonRoot=1

You will need to restart the TPC data server service after this change in order for it to take effect.

5.13.3.2 Huge Netapp/NAS devices are discovered.

If TPC is not configured to do parallel scans, a normal scan can take days. Refer to the section on
Scan/Probe Agent Administration for advice on how to set this up to avoid this problem.

5.13.3.3 The NAS device does not uniquely identify itself.

Typically seen with EMC Celerra, this means that we can only support a single NAS device of this type
per TPC server. If you are unable to correct the device configuration, you will have to install a separate
TPC server for each device, and use rollup reporting to consolidate the data.

5.13.4 Netware - Standard Configuration Steps


1) Install a Data Agent on the specified machine with a Netware Client.
This install automatically "Probes" the new machine.
The Probe "discovers" the NDS Trees known to the Netware Client.
2) Add an NDS Userid and Password for the Trees discovered.
3) Run a Netware Discovery Job.
4) License the Netware Servers you are interested in:
Administrative Services -> Configuration -> License Keys
5) Update Scan Probe Agent Administration (sets the Proxy Scan Agent relationship).
6) From here on, treat Netware Servers as you do regular direct-attached storage.

5.13.5 Common Problems and Solutions 2

5.13.5.1 Customer has multiple NDS Trees

The Agent that is used discovers the wrong NDS Tree

5.13.5.2 Customer is unwilling to share NDS Administrator ID

5.13.5.3 Customer uses only one Agent to Scan too many Netware Servers

Solve this problem by configuring multiple proxy agents to balance the workload. Refer to the section
on Scan/Probe Agent Administration.

5.13.6 Scan/Probe Agent Administration


It is important to set up your scan/probe agents efficiently in order to avoid TPC performance problems.

- Understand your network when you pick Proxy Agents
o These scans go over the IP network
o Choose a Proxy Agent on the same LAN (a Proxy Agent scan will be slower than a scan by a local Data Agent)
o For large devices, use multiple Proxy Agents
- Consider how often to scan
o The default scan of once per day is really not required
o Weekly scans are recommended

5.13.7 Remote Scanning


TPC can be set up to remote scan servers outside of the standard NAS devices. The remote devices to be
scanned are identified in the TPC Data server nas.config file. This is used when customers do not want
to install a TPC Data Agent for scanning.
TPC ships with a nas.config file that has entries for some common NAS device vendors.

5.13.7.1 Remote Scanning - Windows

1) Edit the <TPC>/data/config/nas.config file and add the following line


311 Microsoft Corporation
2) Run a Probe job on at least one Windows server in each Domain.
3) Run a Discovery job configured with the correct SNMP communities
4) Administer Remote Server Logins - Filer Logins tab
Specify an account with authority to authenticate to administrative shares
Domain\user will override the default domain
5) Configure Scan/Probe Agent Administration
Specify a proxy to connect to each system


5.13.7.2 Remote Scanning - Unix

1) Edit the <TPC>/data/config/nas.config file and add the appropriate line for the OS you are trying to
remotely scan:
42 Sun Microsystems
11 Hewlett-Packard Corporation
2 International Business Machines
2) Run a Discovery job configured with the correct SNMP communities
3) Configure Remote Server Logins - Filer Logins tab
4) Configure Scan/Probe Agent Administration
Specify a proxy to connect to each system

5.13.7.3 Remote Scanning - snmputil

The SNMP utility is provided to gather information about the various vendors sharing SNMP
information. It must be run from a Windows server, and SNMP must be configured. The utility is part of
the Windows Resource Kit, which you can download for your platform from Microsoft.
Execute the snmputil.bat file with the following syntax:
snmputil.bat <hostname>

When you execute the batch file you will get output like the following:
Variable = system.sysObjectID.0
Value = ObjectID 1.3.6.1.4.1.311.1.1.3.1.1

The number "311" in the value is the vendor code we use to determine the OS type.
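The vendor code is the SNMP enterprise number that follows the standard 1.3.6.1.4.1 prefix in the sysObjectID value. As a small illustration (using the sample value above), it can be extracted in a shell:

```shell
# Pull the vendor enterprise number out of a sysObjectID value;
# the OID prefix 1.3.6.1.4.1 is the standard SNMP "enterprises" subtree
oid="1.3.6.1.4.1.311.1.1.3.1.1"    # sample value from the snmputil output above
vendor=$(echo "$oid" | sed 's/^1\.3\.6\.1\.4\.1\.\([0-9][0-9]*\)\..*/\1/')
echo "vendor code: $vendor"        # prints: vendor code: 311
```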


5.14 Note on Configuring TPC for batch reports on UNIX or Linux


In addition to the procedure for "Configuring Batch Reports on Unix" in the TPC Infocenter, you
may need two additional steps:

- edit the system startup scripts (e.g., /etc/inittab) to start the X server on :0
- you may also need to use the xhost command on your client to allow the X server to connect

5.15 Unix Open File Limits in Large TPC Environments


In large TPC environments running on Unix servers, if the ulimit setting for the number of open files is
not set sufficiently high (recommendation is 10000), Device Server functions can stop working. Refer to
the technote "Device Server functions unexpectedly stop working" for information on how to resolve
this issue.
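You can check the current limit from a shell running as the user that owns the TPC services. This is a simple sketch; how to raise the limit permanently (e.g., /etc/security/limits.conf on Linux, or chuser on AIX) varies by platform:

```shell
# Report the current open-file descriptor limit for this shell/user;
# TPC recommends at least 10000 in large environments
current=$(ulimit -n)
echo "open files limit: $current"
if [ "$current" != "unlimited" ] && [ "$current" -lt 10000 ]; then
  echo "WARNING: limit is below the recommended 10000"
fi
```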

5.16 The TPC GUI and Java


The TPC gui requires the correct version of IBM Java. This also applies to the Element Manager gui and
the TIP gui. If the wrong level of Java is being used, you may experience errors when launching, or
problems with gui window functions (i.e., buttons missing or not responsive, etc.).
In general, the following guidelines must be followed when using the TPC gui, especially for the
Element Manager and the web browser based Java web start gui:

Use IBM Java only. Java from other vendors, either for the Java Runtime Environment (JRE) or
for Java Web Start or browser plug-ins, is not supported.
Use the correct version for your version of TPC. TPC 3.3.x required IBM Java 1.4.2. TPC 4.x
requires Java 1.5, and 1.6 in the latest 4.2 releases.
TPC is bundled with the JRE that it requires. Java does not need to be installed separately for
TPC.
Any workstation that will be using Java web start to launch the TPC gui from a web browser
MUST install and use the correct version of IBM Java. You can download and install the version
of Java that is required (and bundled with TPC) by pointing the web browser on your
workstation to the url:
http://<TPC-server-ipaddress>:9550/ITSRM/app/welcome.html

This will load a page from which you can select the Java package for your workstation platform.
Note: When you install this version, you can choose to not have this version as your system
version in order to prevent conflicts with other versions of java that may be installed on your
machine. It is recommended that you choose NOT to have this version of java installed as your
system version of java.
It may be necessary to uninstall other incompatible Java software from your system, especially
from installed web browser plugins, in order to ensure correct TPC and Element Manager gui
operation. You should also flush your web browser cache and exit/restart the browser after
removing plugins.
If you have other software installed on your workstation that requires a version of Java other than
what is required for TPC, you can either choose to install that software on a different server, or if
possible configure the application to point to its required Java environment on start up. If the
other software is web browser (i.e., Java web start) based, and cannot run with the version of
Java that TPC requires, you will have to choose between the two applications, or run one of the
applications from a different web browser (such as Firefox, or Internet Explorer if you are
already using Firefox).
If it is not possible to satisfactorily resolve Java version conflicts, possible alternatives are
installing the native TPC gui on your workstation, or using a remote desktop (Windows TPC
servers) or X windows client (UNIX TPC servers) to login and run the gui from the TPC server.

5.16.1 Where to find the Java bundled with TPC


You can find the compatible Java installation packages in the following directory of your TPC
installation. This is the same location that is linked to the browser url referenced above:
<tpc>/device/apps/was/profiles/deviceServer/installedApps/DefaultNode/DeviceServer.ear/DeviceServer.war/app

5.16.2 Changing the program association for Java Web Start (JNLP)
1. Start the Control Panel.
2. Navigate to Control Panel -> All Control Panel Items -> Default Programs -> Set Associations.
3. Locate the .jnlp file extension. Highlight that entry and click "Change program..."
4. In the "Open with" window that is displayed, click the "Browse" button.
5. In the file explorer window that is displayed, navigate to the javaws.exe file. If you installed Java
with the defaults, it will be under C:\Program Files (x86)\IBM\Java50\jre\bin or
C:\Program Files\IBM\Java50\jre\bin. Alternatively you can search for the javaws.exe file on your
system. Select this file and click the "Open" button.
6. In the "Open with" window, click the "Ok" button.
7. Java Web Start will now open jnlp files with the correct version of java.

5.17 Cleaning Up TPC Directories


TPC directories can begin to consume a lot of disk space with old/large log and trace files, dumps from
problems that have been resolved, etc. In addition to taking up disk space, it can cause the
TPCServiceFiles.zip package created by the service.bat or service.sh commands to take a long time
to create, and the resulting file can be very large.
Refer to these instructions to clean up your TPC server directory periodically, and especially before
using the 'service.bat' or 'service.sh' commands.
General guidelines:

old files may be deleted (or moved to a compressed folder outside of <tpc>)
if any large files have current date/time, you may want to keep the newest copy
when deleting large log files, do not delete the newest/current file being written

December 2011

148

Tivoli Storage Productivity Center


Hints and Tips

you can use the 'Search' option in Windows explorer, or the unix 'find' command to search for
large files (e.g.: find /opt/IBM/TPC -size +50M -print)
review trace settings and history/log retention settings to make sure they are set appropriately.
Refer to the DCF technote "Managing TPC Log Files to Reduce TPC Server Disk Space
Usage" for information on how to review and modify these settings.

On Windows, <tpc> = C:\Program Files\IBM\TPC (drive letter may be different)
On Unix, <tpc> = /opt/IBM/TPC
1. Delete the following old/large files (if they exist) from:
<tpc>/device/apps/was/profiles/deviceServer/...
and
<tpc>/data/... or /log/...
heapdump*
tpc_heapdump*
javacore*
2. Search for and delete any old/large log files in the directories:
<tpc>/device/apps/was/profiles/deviceServer/logs/server1/...
(typical are SystemErr* and SystemOut* files)
<tpc>/device/log/...
(typical are msgTPCDeviceServerX.log, traceTPCDeviceServerX.log, tracePerfMgrX.log)
<tpc>/data/log/...
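The searches in steps 1 and 2 can be scripted. This sketch assumes the default Unix install path and an illustrative 30-day age cutoff (both adjustable); it only prints candidates so you can review the list before deleting anything:

```shell
# List heapdump/javacore files older than 30 days under the TPC install tree;
# review the output and delete manually -- never remove the file currently being written
TPC_HOME=${TPC_HOME:-/opt/IBM/TPC}   # default Unix path; override for other layouts
find "$TPC_HOME" -type f \
  \( -name 'heapdump*' -o -name 'tpc_heapdump*' -o -name 'javacore*' \) \
  -mtime +30 -print 2>/dev/null
```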

5.18 Planning for Private Switch Networks


Some switch vendors recommend a private IP network for the Fibre Channel switches.
It is important to understand that the IBM Tivoli Storage Productivity Center Device server cannot
communicate with the switches if they are on a private IP network. Note that the out-of-band
Fabric agents require a TCP/IP connection from the Device server to the switch. Also, SNMP
traps from the switches cannot travel directly from the switches to the Device server.
If you are using a private switch IP network you must rely on in-band Fabric agents to gather your SAN
information and to forward SAN events to the Device server.
Another option that is sometimes used with a private switch network is to allow the Tivoli Storage
Productivity Center Device server to communicate on the private switch network using a second
network interface card (NIC).


5.19 DS8k User Accounts


If you are upgrading DS8k firmware, the user accounts and passwords that have been defined for DS8k
gui access may be deleted. This is a known issue when upgrading from R3.x to R4.x code levels. If the
accounts are reset, you will need to use the default userid and password (admin/admin) to log in to the
DS8k gui; you can then recreate the account and password that you were using.
For upgrades from R4.x and higher, the existing account information should remain intact.

5.20 Agent Registration Problems (SRV0042E/53E, BTC4045E, Tivoli GUID)


A very common problem when deploying TPC agents (whether legacy agents or SRAs) is a failure of the
agent to register with TPC. This problem is particularly common in environments where machines
are built from a common or master image, and the master image includes other Tivoli software that
deploys the Tivoli GUID, such as Tivoli Storage Manager. (For more information on how to create
master images with a TPC agent, see section 5.2 Creating a master image to clone TPC Agent
machines.)
The Tivoli GUID is supposed to be a unique identifier for each server where it is installed. The problem
comes about when an image used to build multiple machines includes the Tivoli GUID. The result is
that each machine built from this image will have the same identifier. TPC uses this identifier to register
agents in its database.
Symptoms of this problem:

SRV0042E error message in the TPC data server server_0000xx.log file:


SRV0042E: A database error occurred during agent registration

SRV0053E / DB2 SQL0803N (-803) errors in the TPC data server logs:
10/7/07 2:03:25 AM SRV0053E: SQL error updating t_res_host table.
hostname: tpcserver.abccorp.com, ID: 123245
SQLSTATE: 23505, Vendor error code: -803
DB2 SQL error: SQLCODE: -803, SQLSTATE: 23505, SQLERRMC: 2;TPC.T_RES_HOST

BTC4045E error message in the installer when trying to deploy an agent:


BTC4045E Rejected connection attempt from <ipaddress>

You deploy an agent, and it appears in the list of agents in the TPC gui, but another agent that was
installed mysteriously disappears.

In the simplest case, all that is needed is to:


1. generate a new Tivoli GUID value on the agent server:
cd <Tivoli_GUID_directory>
tivguid -write -new     (on Unix: ./tivguid -write -new)
2. create a blank <TPC>/ca/subagents/TPC/Data/PROBE_ME file.
3. stop and restart the agent (done from Windows services.msc panel, or using the
/opt/IBM/TPC/ca/endpoint.sh stop|start command on Unix). The PROBE_ME file should disappear,
and the agent should appear in the agent list in the TPC gui (and all other agents should remain).
4. before installing any other TPC agents, check for the Tivoli GUID software on the target agent, and
generate a new Tivoli GUID value before deploying the agent (step 1 above).

5.20.1 More on "database error occurred during agent registration" errors


As TPC administrator, you should check for these error messages in the TPC Data Server logs (usually
the server_0000xx.log and/or the TPCD_0000xx.log files). In addition to the simple Tivoli GUID
case cited above, there is another common scenario where these problems arise, particularly in
environments with a large number (50+) of monitored agent servers.
These errors typically happen when:
- A server has its hostname changed (e.g., server123.abc.com gets changed to server456.abc.com)
- A server has its IP address changed (e.g., server123.abc.com changes from 1.2.3.4 to 10.20.30.40)
In large environments you cannot always control when machines are decommissioned, redeployed with
a new name or ip address, etc., but you can monitor the status of your TPC agents through both the TPC
gui and periodic review of the current TPC Data Server logs.
It is very important to understand that a large number of these errors can severely impair TPC server
performance, and in some cases can even prevent logging in to the TPC gui!
A TPC agent whose registration fails will keep sending registration requests to the TPC server every
few seconds, for as long as the agent server is up and the agent service is running, until the
registration succeeds.
Two indicators that you may have this problem are:
1. very large server_0000xx.log file size (particularly when compared to the older log files)
2. agents in the TPC gui agent list that are down or unreachable
You can search the log files for "database error" if you suspect a problem.
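The log scan can be scripted. This is a minimal sketch that counts registration-related error lines in a log file; the helper is demonstrated here on an inline sample fragment, and in practice you would point it at the current server_0000xx.log (the log location under your <TPC> install directory is an assumption to verify on your system):

```shell
#!/bin/sh
# Count agent-registration errors in a TPC Data Server log file.
# In practice, pass a real log such as the current server_0000xx.log.
count_reg_errors() {
  grep -c -E 'SRV0042E|SRV0053E|database error' "$1"
}

# Demonstration against an inline sample log fragment:
sample=$(mktemp)
cat > "$sample" <<'EOF'
10/7/07 2:03:25 AM SRV0053E: SQL error updating t_res_host table.
10/7/07 2:03:31 AM SRV0042E: A database error occurred during agent registration
10/7/07 2:03:40 AM INFO: probe completed successfully
EOF
errors=$(count_reg_errors "$sample")
echo "registration errors found: $errors"
```

A steadily growing count between runs (or a much larger current log than the older ones) is the indicator described above.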
The registration problem is a result of a conflict between the information that the agent is presenting for
registration, and the information that is already stored in the TPC database. There are constraints on the
column entries in the TPC database that stipulate that each agent must have a) a unique hostname, b) a
unique ip address, and c) a unique Tivoli GUID value. If an agent attempts to register using a hostname,
ip address, or Tivoli GUID that is already present in the database associated with some other agent
server, the registration fails and the database error message is logged.
To fix this problem, you need to resolve the conflicting entries in the TPC database.
Note - back up your TPC database before attempting any changes!

You will need the hostname (both short and long name), ip address, and Tivoli GUID value of the agent
with the registration problem (usually identified by name in the Data Server log error messages). The
table you need to examine in the TPC database is T_RES_HOST.
Search the table for the agent hostname (it may be present in either long or short form), the ip address,
and the Tivoli GUID. If it is found and is associated with some other server, this conflict must be
resolved. If the entry in the database is invalid, the row can be deleted from the table. If the entry in the
table is valid, and corresponds to another active server that TPC is monitoring, you may have to confer
with your IT department to determine how to correct the conflicting information.
If the conflicting information is the Tivoli GUID, you can resolve it by generating a new guid value for
the agent (see above), and updating the corresponding column in the T_RES_HOST table for that agent
if necessary.
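To locate the conflicting row, you can also query T_RES_HOST directly instead of browsing it. The statements below are a sketch to run against the TPCDB database (for example, from the DB2 command line after connecting to TPCDB); the hostname, IP address, and GUID values are placeholders, and the exact name of the GUID column can differ by TPC level, so check the table definition first. As noted above, back up the database, and delete a row only if you are certain it does not belong to a valid, working agent:

```sql
-- Find any existing row that conflicts with the agent trying to register
SELECT HOST_NAME, DOMAIN_NAME, IP_ADDRESS, GUID
  FROM TPC.T_RES_HOST
 WHERE HOST_NAME IN ('server123', 'server123.abc.com')
    OR IP_ADDRESS = '1.2.3.4'
    OR GUID = '<tivoli-guid-value>';

-- Only if the row is confirmed invalid (decommissioned machine, stale entry):
-- DELETE FROM TPC.T_RES_HOST WHERE HOST_NAME = 'server123.abc.com';
```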
In the latest 3.3.2/4.1.1/4.2 TPC fixpacks, a server configuration option dupCompInfo was introduced
to handle these errors automatically. For more information on how to use this option, and to see if it is
useful for your environment, refer to the technote >>>SRV0042E: A database error occurred during
agent registration and APAR IC62234.

5.21 Applying TPC Patches


Tips for handling TPC fixes/patches:
1. When you receive a patch, keep the file that is delivered in a safe place outside of your <TPC>
directory structure. For example:
C:\tpc_patches
A text log file in this directory with a brief description to describe the patch and the issue/pmr number
that it was intended to fix may also be helpful documentation for the TPC administrator.
2. TPC patches are typically created with a directory structure that matches the TPC installation. They
are designed to be extracted in a particular directory so that the subdirectories/folders and their contents
end up in the correct location.
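Because a patch extracts directly over the <TPC> tree, it is worth saving copies of the files it will replace before extracting it. The sketch below demonstrates the idea in a throwaway sandbox (all directory and file names here are invented for the demo); against a real install, TPC_DIR would be your actual install directory and PATCH the archive received from support:

```shell
#!/bin/sh
# Sketch: back up the files a patch archive will overwrite, then extract it.
WORK=$(mktemp -d)
TPC_DIR="$WORK/TPC"; BACKUP="$WORK/backup"; PATCH="$WORK/patch.tar"

# Sandbox standing in for a real <TPC> install:
mkdir -p "$TPC_DIR/device/lib" "$BACKUP"
echo "old jar v1" > "$TPC_DIR/device/lib/engine.jar"

# Build a stand-in patch archive with the same directory layout as <TPC>:
mkdir -p "$WORK/stage/device/lib"
echo "patched jar v2" > "$WORK/stage/device/lib/engine.jar"
( cd "$WORK/stage" && tar -cf "$PATCH" device )

# 1. Save a copy of every file the patch will replace
for f in $(tar -tf "$PATCH"); do
  if [ -f "$TPC_DIR/$f" ]; then
    mkdir -p "$BACKUP/$(dirname "$f")"
    cp "$TPC_DIR/$f" "$BACKUP/$f"
  fi
done
# 2. Extract the patch over the install directory
tar -xf "$PATCH" -C "$TPC_DIR"

cat "$TPC_DIR/device/lib/engine.jar"   # patched jar v2
cat "$BACKUP/device/lib/engine.jar"    # old jar v1
```

Keeping the backup alongside the patch in your patch directory makes it easy to restore the pre-patch files if support asks you to back the fix out.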
3. TPC patches are specific to your TPC release, and should not be applied to a different level of TPC
unless instructed by TPC support and development.
4. You should be aware that patches might be overridden by patches for other problems, or by a TPC
upgrade. If you are expecting a patch for a problem, be sure to tell support about any other patches you
may already have.
5. If you upgrade TPC, you should upgrade to a version that includes the APAR fix delivered by the
patch. If a TPC upgrade is required for another issue and the APAR fix is not yet available in the
upgrade, you may need to request a new patch built for the TPC upgrade level.

5.22 Changing the TPC server hostname or IP address

This is a lengthy process requiring manual steps. It is highly recommended that you back up the
TPC server before making these changes.
1. Shut down all TPC related services: TPC DataServer, DeviceServer, TIP Server, Replication
Server and Common Agent
2. Change the hostname / IP address in the OS
3. Change the DB2 configuration
a) Refer to:
http://publib.boulder.ibm.com/infocenter/db2luw/v9/index.jsp?topic=/com.ibm.db2.udb.uprun.doc/doc/t0006350.htm
for the configuration change of the DB2 hostname
b) If DB2 has enabled additional security, refer to:
http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp?topic=/com.ibm.sspc_v131.doc/fqz0_t_sspc_config_host_name_security_on.html
4. Change the Data Server configuration:
a) Check file <TPC>/data/config/ep_manager.config
b) Adjust "AgentConfiguration.Host=", "CertManagement.Host=", "AgentQuery.Host=",
"Registration.Host=", "Registration.Server.Host=", "AgentManagerQuery.Host=" and
"Config.Server.Host=" with new Hostname of AgentManager (please verify to which one it
points before)
c) Check file <TPC>/config/installVariable.properties
d) Adjust "varDevSrvNameForAgent","varSrvName","varDataSrvName" and "varDevSrvName"
with the new correct hostname
5. Change the Device Server configuration
a) Check file <TPC>/device/conf/setup.properties
b) Adjust "agent.deployer=" and "manager.loc=" with new Hostname of TPC Server (please
verify which one it points before)
c) Check file <TPC>/device/conf/tpcr_installed and <TPC>/data/config/tpcr_installed
d) Adjust "tpcr.hostname=" with new Hostname of TPC Server where you have TPC-Replication
installed, usually it is same as TPC Server
e) Check file <TPC>/device/conf/AgentManager/config/ep_manager.config
f) Adjust "AgentQuery.Host=", "Registration.Server.Host=" and "AgentManagerQuery.Host="
with new Hostname of AgentManager (please verify to which one it points before)
6. Change the GUI configuration
a) Check file <TPC>/gui/tpc.bat (or tpc.sh)
b) Adjust rows which include old hostname and change it to the new hostname:
(e.g.: "START javaw -mx1024M -Djava.net.preferIPv4Stack=false com.tivoli.itsrm.gui.GuiMain
tpcserver.xyz.com:9549" )
c) Check file
<TPC>/device/apps/was/profiles/deviceServer/installedApps/DefaultNode/DeviceServer.ear/DeviceServer.war/app/tpcgui.jnlp
d) Adjust argument of "<application-desc main-class="com.tivoli.itsrm.gui.GuiMain">" with
corresponding new hostname
7. Change the TIP configuration
a) Check File C:\Program Files\IBM\Common\acsi\ACUApplication.properties (on Unix, the
directory is /usr/ibm/common/acsi)
b) Adjust hostname in row "acu.hostname="
c) Check Files <TIP>/systemApps/isclite.ear/TpcWebLaunch.war/WEB-INF/portlet.xml,
<TIP>/profiles/TIPProfile/config/cells/TIPCell/applications/isclite.ear/deployments/isclite/TpcWebLaunch.war/WEB-INF/portlet.xml and
<TIP>\profiles\TIPProfile\config\cells\TIPCell\nodes\TIPNode\serverindex.xml
d) Adjust hostname in rows like "tpcserver.xyz.com:9549" or "tpcserver.xyz.com:9550"
8. Change the TPC Database
a) Open DB2 Control Center, connect to TPCDB Database and click on Tables for getting the
view of all tables
b) Check table T_RES_HOST for entries with hostnames that you change. (Double click on the
table)
c) Adjust the column HOST_URL, NETWORK_NAME, DOMAIN_NAME, HOST_NAME,
ORIGINAL_ALIAS and IP_ADDRESS to the new hostname/IP Address and press Commit
d) Check table T_RES_SERVER for entries with hostnames that you change. (Double click on
the table)
e) Adjust the column SERVER_NAME to the new hostname and press Commit
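If you prefer the DB2 command line over the Control Center for this step, the updates can be sketched as follows. The old/new names here are placeholders; copy the exact value formats (for example, whether HOST_URL carries a port suffix) from the existing row, and back up the database first:

```sql
-- Step b/c: point the host row at the new name and address
UPDATE TPC.T_RES_HOST
   SET HOST_NAME      = 'newname',
       DOMAIN_NAME    = 'xyz.com',
       NETWORK_NAME   = 'newname.xyz.com',
       HOST_URL       = 'newname.xyz.com',
       ORIGINAL_ALIAS = 'newname.xyz.com',
       IP_ADDRESS     = '10.20.30.40'
 WHERE HOST_NAME = 'oldname';

-- Step d/e: rename the server entry
UPDATE TPC.T_RES_SERVER
   SET SERVER_NAME = 'newname.xyz.com'
 WHERE SERVER_NAME = 'oldname.xyz.com';
```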
9. IF APPLICABLE - Change the TPC Common Agent (on each server which contains the agent)
a) Check file <TPC>/ca/config/endpoint.properties
b) Adjust "AgentConfiguration.Host=", "CertManagement.Host=", "AgentQuery.Host=",
"Registration.Host=", "Registration.Server.Host=", "AgentManagerQuery.Host=" and
"Config.Server.Host=" with new Hostname of AgentManager (please verify to which one it
points before)
10. IF APPLICABLE - Change the TPC Data Agent (on each server which contains the agent)
a) Check file <TPC>/ca/subagents/TPC/Data/config/agent.config
b) Adjust "serverHost=" with new Hostname of TPCServer (please verify to which one it points
before)
11. IF APPLICABLE - Change the TPC In-Band Agent/Fabric Agent (on each server which contains the
agent)
a) Check file <TPC>/ca/subagents/TPC/Fabric/conf/setup.properties
b) Adjust "manager.loc=" with new Hostname of TPCServer (please verify to which one it points
before)
c) Check file <TPC>/ca/subagents/TPC/Fabric/conf/user.properties
d) Adjust "my.ip=" and "my.name=" with new corresponding ip and hostname.
12. Change the TPC Storage Resource Agent (on each server which contains the agent)
a) Check file <TPC>/agent/config/Agent.config
b) Adjust "Servername=" with new Hostname of TPCServer
c) Adjust "IPAddress=" with new IP address of TPCServer
13. IF APPLICABLE - Change the AgentManager configuration
a) Check file
<AgentManager>/AppServer/agentmanager/installedApps/AgentManagerCell/AgentManager.ear/AgentManager.war/WEB-INF/classes/resources/AgentManager.properties
b) Adjust "ARS.host=" with new Hostname/IP Address of AgentManager
c) After changing the parameter, you need to restart the agent manager server.
14. Start all TPC related services - TPC DataServer, DeviceServer, TIP Server, Replication Server and
Agent Manager and Common Agent (if applicable)
15. Change the TPC-R configuration. Verify "hostName=" is the correct hostname or IP address (if
localhost, leave as is) for the following files:
a) Check file
<TPCR>/eWAS/profiles/CSM/config/cells/DefaultNode/nodes/DefaultNode/serverindex.xml.
b) Check file <TPCR>/eWAS/profiles/CSM/properties/wsadmin.properties
c) Check file <TPCR>/eWAS/profiles/CSM/properties/rmserver.properties
d) Check file <TPCR>/CLI/repcli.properties
e) Restart the TPCR (CSM) service if any changes were made
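After finishing the steps above (and before restarting the services), it can help to sweep the install tree for any file still holding the old hostname, since the file list varies by TPC level. A minimal sketch, demonstrated in a temporary sandbox because it relies only on grep; in practice, point it at your real <TPC> and <TIP> directories:

```shell
#!/bin/sh
# List every file under a directory tree that still references the old
# hostname. OLD_HOST and the sandbox contents are placeholders for the demo.
OLD_HOST=oldtpc.xyz.com

# Sandbox standing in for the real <TPC> directory:
WORK=$(mktemp -d)
mkdir -p "$WORK/device/conf" "$WORK/data/config"
echo "manager.loc=oldtpc.xyz.com:9511" > "$WORK/device/conf/setup.properties"
echo "varSrvName=newtpc.xyz.com"       > "$WORK/data/config/ok.properties"

# Any file printed here still needs to be edited:
leftovers=$(grep -rl "$OLD_HOST" "$WORK")
echo "$leftovers"
```

An empty result means no configuration file under the tree still references the old name.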

6 System Storage Productivity Center (SSPC)


IBM System Storage Productivity Center (SSPC) is the next step in the IBM storage strategy. This
solution comprises hardware and software, combining device configuration capabilities for the IBM
System Storage DS8000 and IBM System Storage SAN Volume Controller (SVC) in an easy-to-use
hardware console. SSPC extends the manageability of these devices with the introduction and
integration of IBM TPC Basic Edition V3.3.1. Consolidating several hardware and software components
on a single tested system helps support consistent interoperability.
TPC Basic Edition provides the TPC for Disk functionality. It does not include TPC for Data functions,
TPC for Fabric functions, or the Tivoli Agent Manager. There is no TPC agent support.
It is possible to upgrade a SSPC server to a fully capable TPC Standard Edition server, which involves
installing the Tivoli Agent Manager, then upgrading the TPC server license and registering the server
with the Agent Manager.

6.1 Common SSPC Issues


6.1.1 Prerequisite step required when upgrading SSPC based on TPC 4.x
When you upgrade a SSPC server that is based on a TPC 4.x version (SSPC 1.4 and higher), there is an
important prerequisite step you must do first to reconfigure TIP. On SSPC as it is delivered, the TIP
software is present, but not configured or used.
If you miss this step, the upgrade will fail and it can become difficult to recover.
For instructions please see >>>SSPC Upgrading Tivoli Storage Productivity Center.

6.1.2 Apply TPC Standard Edition License


1. Start the TPC installer using the DVD or installation image of the TPC version for your upgrade.
2. Select the 'Installation licenses' installer option. Click 'Next' to complete the installation.
3. Exit the TPC gui if it is open, and stop and restart the TPC server services. It may be necessary to
reboot the SSPC server if you get warnings or errors when trying to stop the services.
4. To verify the license was applied:
1. Look for the *.SYS2 license file in C:\Program Files\IBM\TPC. The SSPC license file for
TPC Basic Edition is TPCBE0303.SYS2. If you are upgrading to the TPC Standard Edition
license, the file you should see is BTSSE03_03.SYS2.
2. In the DB2 Control Center, open the TPCDB table T_RES_LICENSE. Before the license
upgrade, TPCSE=0. After the upgrade, you should now see TPCSE=1.
If you are unable to run the TPC installer, you can manually install the license. Copy the following files
from your IBM TPC electronic image or CD (this must be an image or CD for the upgrade
version/edition, NOT the image used to install SSPC) into the TPC install directory:
<CD_root_directory>/license/key (copy the entire ../key directory)

<CD_root_directory>/BTS*.sys

You will need to restart the server services or reboot the server to activate the license.

6.1.3 Agent Manager Registration


1. Install the Tivoli Agent Manager (see section 2.3 item 2 for additional information)
2. Open the TPC gui and go to 'Administrative Services->Configuration->Agent Manager
Registration'
3. In the Agent Manager registration panel, enter the following values:
1. Hostname or IP address: the fully qualified name of the SSPC/TPC server
2. Data and Device server registration: user id=manager, password=password, Common Agent
registration password=changeMe.
3. Click 'OK' to complete registration.
4. To verify registration, check the files:
1. C:\Program Files\IBM\TPC\data\config\ep_manager.config
2. C:\Program Files\IBM\TPC\device\conf\AgentManager\config\endpoint.properties
3. look for 'AgentManager.Registration=YES'

6.1.4 Remote Agent Installation


Agent installations will fail without the following configuration change on the TPC server.
1. Edit the file C:\Program Files\IBM\TPC\config\InstallVariables.properties
2. Change the following two variables to be the fully qualified server name. Example:
varDataSrvName=tpcserver.domain.com
varDevSrvName=tpcserver.domain.com
3. Save the file.
4. Run the TPC installer using the 'Custom' option to deploy agents.

6.1.5 CIMOM Installation On SSPC/TPC Server


Installing additional CIMOMs on the SSPC/TPC server is NOT recommended. This can introduce
performance problems, and there is also a potential conflict with the CIMOM that is already installed.
The server is running the SVC Master Console, which includes the SVC CIMOM using ports
5988/5989. If another CIMOM is to be installed, you will have to change the new CIMOM
configuration to use different ports. See section 4.5.2.2 for an example of how to configure alternative
ports for the Engenio SMI Provider.

6.1.6 Running dscimcli.bat

**Note: dscimcli is pre-installed on SSPC v1.2. On later versions of SSPC, dscimcli must be installed. It
is important that you install the version that is compatible with the DS8000 CIM agent you are running.
Running the 'dscimcli.bat' command may give an error message:
The system cannot find the path specified.
'"C:\Program Files\IBM\svcconsole\cimom\pegasus\bin\dscimcli.bat"' is not recognized
as an internal or external command, operable program or batch file.

To correct this problem:

1. Open a command prompt window.
2. cd C:\Program Files\IBM\DSCIMCLI\W2003
3. Run the command 'cli_path.bat'
4. Re-try the 'dscimcli.bat' command.

6.1.7 Configuring the DS8000 to the HMC CIMOM


There are two steps that must be completed:
1. Create a new user account and password on the DS8000 HMC (using dscli or gui)
2. Define the account created in step 1 to the CIMOM

6.1.7.1 Step 1 - Create new user account on HMC using dscli


1. Login to the dscli.
2. Use the mkuser command to create the new user account:
dscli> mkuser -pw abc123 -group admin tpcadmin

3. Logout of the dscli, then log back in using the new account you created.
4. The dscli will require you to use the chuser command to change the password. You must change
the password, otherwise the configuration will fail because the password is expired:
dscli> chuser -pw tpcadm1n tpcadmin

5. To prevent problems with expiring passwords and security lockouts, you can set the password to
never expire, and also increase the number of failure attempts:
dscli> chpass -expire 0 -fail 10

Be sure to record your user account and password in a safe place with your other TPC accounts and
passwords.

6.1.7.2 Step 1 - Create new user account on HMC using DS8000 Storage Manager
1. Open a web browser.
2. Go to http://hmc_ip:8451/DS8000/Login
3. Login with the default admin/passw0rd account.
4. Create the new tpcadmin account, then logout and log back in with the new account.
You will be prompted to change the password. Be sure to record your new password in a safe
place.

6.1.7.3 Step 2 - Configure the device to the CIMOM


Use the dscimcli utility and your new user account to configure your device to the CIMOM:
cd C:\Program Files\IBM\DSCIMCLI\W2003\bin
dscimcli -s https://hmc_ip:6989 mkdev hmc_ip -type ds -user tpcadmin -password <newpassword>

6.1.8 Isolating DS8000 Performance Monitor Problems


If you are having DS8000 performance monitor problems, you can use the 'cimcli.bat' command to see if
the problem is in TPC or in the CIMOM.
1. Open a command prompt window.
2. Go to c:\Program Files\IBM\DSCIMCLI\W2003\pegasus\bin
3. Enter these commands:
cimcli ni -n root/ibm -l hmc_ip:6989 -s IBMTSDS_VolumeStatistics
cimcli ni -n root/ibm -l hmc_ip:6989 -s IBMTSDS_RankStatistics
cimcli ni -n root/ibm -l hmc_ip:6989 -s IBMTSDS_FCPortStatistics

If the CIMOM is working, these commands will return valid results.

6.1.9 Disable TPC services when not using TPC on SSPC


If you are only using the SVC console on a SSPC because you already have a TPC server in your
environment, you can (and should!) stop and disable the TPC services to improve performance of the
SVC CIMOM and master console.

6.2 SSPC References


See >>>SSPC Reference Information in Appendix E.


7 TPC Replication Manager


The TPC family now includes the Replication Manager, which provides a management interface to
FlashCopy and copy services for IBM storage subsystems. At the time of this writing, it is a stand-alone
application frequently implemented alongside TPC, and does not interact or interface with other TPC
functionality.

7.1 TPC for Replication and device IP ports


This topic describes how the IBM TPC for Replication server uses various ports for communication with
the graphical user interface (GUI), command-line interface (CLI), storage boxes, and high-availability
servers.
Port (all are bi-directional)    Description
22                               SVC cluster ssh
162                              SNMP
443                              SVC cluster https
1750                             HMC port for TPC-R server connection
2433                             TPC-R to ESS/DS
5110                             Remote CSMCLI to TPC-R
5120                             Active to standby server
5989, 5999                       SVC CIMOM, TPC-R v3.4.0 and earlier
6120 (+1 increment per SVC)      SVC to TPC-R v3.4.0 and earlier
9080/3080, 9443/3443             TPC-R GUI http, https

7.2 Shutting Down TPC for Replication


To properly shut down TPC for Replication without adversely affecting its configuration and active copy
services operations:
1. Export copysets for currently running sessions
2. If enabled, disable the heartbeat for metro mirror
3. If managed storage subsystems will be powered down or taken offline, suspend any active sessions
(wait for TPC-R gui to update to 'Suspended' state).

4. Shut down the TPC-R server

7.3 TPC for Replication tips/recommendations:


1. When installing TPC for Replication (TPC-R), consider using the lightweight 'Derby' database
for TPC-R instead of DB2, particularly if TPC-R is being installed on the same server with TPC.
This will help avoid DB2 performance and resource issues.
2. When working with TPC-R sessions, use .csv files to import and export copysets. The data can
then be used to recover/redefine a session or corrupted TPC-R database.
3. Do not change passwords or allow them to expire after installation if possible. Changing
passwords will require running WebSphere commands in order to complete the changes. See
>>>Changing the TPC-R Administrator Password for the detailed procedure.
4. Do not setup the TPC-R database on a copyset volume. A copyset volume freeze will prevent
TPC-R from accessing the database.
5. A Metro Mirror session can enable or disable the heartbeat function (DS6000, DS8000, ESS). Loss
of the IP connection can generate a freeze for all LSS pairs in the boxes if the heartbeat is enabled.
6. Avoid putting copyset volumes belonging to different sessions in the same LSS pair of the
source/target subsystems.
7. A DS8000 space-efficient volume can be used as the target volume in a flashcopy session. This
allows more volumes to be defined than there is physical space available in the subsystem.
8. When connecting TPC-R to the SVC CIM agent, make sure the CIM agent is completely up and
logged into the SVC Master Console before connection.
9. Create Metro or Global Mirror session with practice volume if you want to periodically back up
consistent data at the target subsystem.
10. Make sure there is no SCSI reserve on any target or flashcopy practice volume when running a
session, or flash command to a practice volume. To release the reservation, vary the volume
offline from the host, run SDD lquerypr, or DS6000/DS8000 cmt command.
11. Avoid connecting the same SVC cluster to 2 different CIMOMs that can be accessed by the
TPC-R server.
12. When defining Global Mirror (GM) session, use TPC-R to set the remote copy path, and NOT
other tools (like DSCLI).
13. The Global Mirror consistency group interval time should be set to a non-zero value. Typical
values are 30 or 60 seconds.
14. Avoid remote copying from a faster subsystem to a slower subsystem (example: mirroring from
DS8000 to DS6000/ESS).
15. Setup multiple physical remote copy paths to enhance the data transfer rate.
16. Setup SNMP to monitor session state, configuration, active/standby server changes, and
communication failures.
17. If a standby TPC-R server is not installed, it is best to install the primary/active server at the
recovery site for site disaster recovery.
18. Run the Global Mirror Monitor tool to collect session and LSS data to 2 CSV files periodically.
This tool is not part of TPC-R and has to be run from TSO or a DSCLI command prompt.
19. Use the TPC-R configuration generator. This is an Excel spreadsheet-based tool that takes
session/copyset information input by the user and generates CLI commands for TPC-R,
DS6000/DS8000, and SVC. The command output can be copied to script files to clean up an
existing session in hardware, then set up new TPC-R copysets and a session to allow the TPC-R
server to take over the session. You can download this tool from the TPC-R support page. Click
on 'Download' and then 'Tools and utilities'.
20. Commands to check the progress of a copy session:
lspprc -l -fmt delim 0000-FFFF (for Metro Mirror and Global Copy)
Here is a sample output. Look at column 5 for Out of Sync Tracks (in this case the value is 225):
ID,State,Reason,Type,Out Of Sync Tracks,Tgt Read,Src Cascade,Tgt Cascade,Date Suspended,SourceLSS,Timeout (secs),Critical Mode,First Pass Status,Incremental Resync,Tgt Write,GMIR CG,PPRC CG,AllowTgtSE,DisableAutoResync
1600:6600,Copy Pending,-,Global Copy,225,Enabled,Enabled,Invalid,-,16,60,Disabled,True,Disabled,Disabled,Disabled,Disabled,Disabled,False

lssession -l 16-17 (for Global Mirror)


LSS ID Session Status Volume VolumeStatus PrimaryStatus SecondaryStatus FirstPassComplete AllowCascading
16 02 Normal 1600 Active Primary Copy Pending Secondary Simplex True Enable
17 02 CG In Progress 1700 Active Primary Copy Pending Secondary Simplex True Enable

*Note: FirstPassComplete=False can occur during the initial full copy, and also when a copyset's
out-of-sync (OOS) track count reaches a high percentage (bandwidth cannot keep up with the
changing tracks), so consistency groups cannot be formed.
showgmir -metrics 23 (for Global Mirror)

7.4 TPC for Replication References


See >>>TPC Replication References in Appendix E.


8 Appendix A
8.1 AgentRestart.bat listing
@echo off
REM **********************************************************************
REM ** AgentRestart.bat                                                 **
REM **   Used to restart TPC Data Agents after a massive stop command   **
REM **   issued from the TPC GUI using the Data Manager report          **
REM **   Data Manager->Reporting->Asset->Agents->By Agent.              **
REM **                                                                  **
REM ** TPC Hints and Tips for TPC V3.3 -- 07/30/2007                    **
REM **                                                                  **
REM **********************************************************************
setlocal
REM You need to change the following line if you installed TPC in a
REM   non-default directory. Use 8.3 notation since no spaces or quotes
REM   are allowed in basic Windows scripts.
set TPC_Disk=C:
set TPC_CA_Path=C:\Progra~1\IBM\TPC\ca
REM Change to the CA directory
%TPC_Disk%
cd %TPC_CA_Path%

:copy_cert
REM We need to rename the CA cert directory to preserve it, and then copy
REM   the Data Server cert directory into the CA path
@echo.
@echo Backup CA\cert directory
move /Y cert certcopy
xcopy ..\Data\cert cert /s /k /r /h /i

:restart_agents
REM In this section, you should create repeating sets of the following six
REM   lines, one set for each server to be restarted.
REM ----------------------------------------------------------------------
set fqdn=sonja.yourcompany.com
@echo.
@echo Restart %fqdn%
call agentcli -host %fqdn% -port 9510 deployer start TPCData
call agentcli -host %fqdn% -port 9510 deployer list bundles state | find "TPCData"
REM ----------------------------------------------------------------------
set fqdn=ruby.yourcompany.com
@echo.
@echo Restart %fqdn%
call agentcli -host %fqdn% -port 9510 deployer start TPCData
call agentcli -host %fqdn% -port 9510 deployer list bundles state | find "TPCData"
REM ----------------------------------------------------------------------
set fqdn=allie.yourcompany.com
@echo.
@echo Restart %fqdn%
call agentcli -host %fqdn% -port 9510 deployer start TPCData
call agentcli -host %fqdn% -port 9510 deployer list bundles state | find "TPCData"
REM ----------------------------------------------------------------------
set fqdn=daisy.yourcompany.com
@echo.
@echo Restart %fqdn%
call agentcli -host %fqdn% -port 9510 deployer start TPCData
call agentcli -host %fqdn% -port 9510 deployer list bundles state | find "TPCData"
REM ----------------------------------------------------------------------
set fqdn=swatsvcmc.yourcompany.com
@echo.
@echo Restart %fqdn%
call agentcli -host %fqdn% -port 9510 deployer start TPCData
call agentcli -host %fqdn% -port 9510 deployer list bundles state | find "TPCData"

:restore_cert
REM We need to restore the CA\cert cert directory. To do this, we need to stop
REM   the CA, restore the directory, and then restart the CA
@echo.
@echo Restore CA\cert directory
net stop "IBM Tivoli Common Agent - 'C:\Program Files\IBM\TPC\ca'"
PING 1.1.1.1 -n 1 -w 10000 > NUL
del cert /f /s /q
rmdir cert
move /Y certcopy cert
net start "IBM Tivoli Common Agent - 'C:\Program Files\IBM\TPC\ca'"

:exit
endlocal


9 Appendix B - TPC Agent Installation Tips


Here are some notes to help avoid or troubleshoot problems when deploying TPC agents. This
information applies to remote/push installs from the TPC server, and also local agent installations
performed directly on the agent server.
- running the installer
o avoid embedded spaces in installer directory path, or a directory path that is too long, e.g.:
c:\Documents and Settings\TPC Install\disk1 ... (path/directory names include embedded spaces)
o use short, simple directory/path names, avoid special characters; example of good choices:
c:\tpc41\disk1, c:\tpc41\disk2
- common installation problems
o tivoli guid issue (when machines are built from a gold/master image)
o common agent registration password not provided or incorrect
o tpc server / AM 'localhost' issue (check the ep_manager.config file)
o firewall/port problems at agent
o user account being used for install does not have sufficient authority
o dns/network problems
o old tpc code on machine not cleaned up
o unsupported environment
o inadequate disk space
- verify the tpc server before agent installation
o telnet tests from agent to tpc server ports (9549, 9511, 9512, 9513, 9550, 50000)
o dns tests (/etc/hosts valid localhost and server entries, ping, nslookup)
o check tpc gui for problems, check latest server_nnnnnn.log, TPCD_nnnnnn.log
o AM healthcheck to verify AM and CA pwd, tpctool to verify device server host authentication pwd
o tpc version (make sure server version matches agent version being installed)
o check for other agents installed successfully
- especially on same OS, especially in same network domain/subnet
- if no, may point to problem on tpc server
o check for db2 problems (db2 control center / health center, db2diag.log file, connect to tpcdb/ibmcdb)
- checking the agent machine before installing
o supported server hardware and OS
- consult appropriate platform support doc for your TPC version
- OS, processor architecture, adequate memory and disk space
- unix umask
o check for any prior tpc or tsrm/tsanm software
- use cleanup guides for locations of items to check
o check for tivoli guid (particularly if machine built from common image)
- when tivguid is present, if in doubt, generate new guid before installing
o firewall/ports
- must be allowed outgoing/incoming to/from tpcserver: 9549,9511,9512,9513,9550
- must be allowed incoming to 9510,9514,9515,9559
o telnet tests from agent to tpc server ports
o dns tests (/etc/hosts valid localhost and server entries, ping, nslookup)
o user account being used for installation
- unix: requires 'root', Windows: local 'Administrator' recommended
- common agent registration step failures
o this can be caused by a bad/conflicting entry in the TPC database T_RES_HOST table. Search
the table for entries that have:
1. the agent server name
2. the agent server ip address
3. the agent server tivoli guid value
If you find an entry in the table that is using one of these values, the discrepancy may need to
be resolved before the agent installation will succeed. The resolution may be to delete the
problem row from the table, or the agent server may need to be reconfigured to use the
correct server name, ip address, or need a unique tivoli guid generated (i.e., tivguid issue).
Note: deleting the row from the table should only be done if you are certain that it is not a valid
entry for a working agent.
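The telnet port tests recommended above can also be scripted. The sketch below attempts a TCP connection to each TPC server port an agent must reach; it uses the bash /dev/tcp pseudo-device where available and simply reports "closed" otherwise, and TPC_SERVER is a placeholder for your TPC server's name:

```shell
# Check reachability of the TPC server ports an agent must reach.
# TPC_SERVER is an assumption -- set it to your TPC server hostname.
TPC_SERVER=${TPC_SERVER:-localhost}

check_port() {
  # Succeeds only if a TCP connection to host $1, port $2 can be opened
  ( exec 3<>"/dev/tcp/$1/$2" ) 2>/dev/null && echo open || echo closed
}

for port in 9549 9511 9512 9513 9550; do
  echo "$TPC_SERVER:$port $(check_port "$TPC_SERVER" "$port")"
done
```

Any port reported closed from the agent machine points to a firewall or network problem that will cause the deployment to fail.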


10 Appendix C - TPC Command Reference


This section gives examples of some of the most often used commands for TPC administration.
Examples are given in their most commonly used form with typical options. Refer to official product
documentation, such as the >>>TPC Infocenter, for a complete listing of the full syntax and other
options that may be supported.

10.1 Starting TPC


Java web start:
http://servername:9550/ITSRM/app/welcome.html

Tivoli Integrated Portal (TIP):


https://servername:16310/ibm/console/logon.jsp

Windows:
Start -> All Programs -> IBM Tivoli Storage Productivity Center -> Productivity Center
or
C:\Program Files\IBM\TPC\gui>
tpc.bat

Unix:
/opt/IBM/TPC/gui>
./tpcd.sh

10.2 Start/Stop TPC Services


Windows services can be started/stopped using the services.msc panel or the net start/stop
command, both of which can be invoked from a command prompt window. These are the service names
as they appear in the services.msc panel:
IBM Tivoli Common Agent - 'C:\Program Files\IBM\TPC\ca'
IBM Tivoli Storage Resource Agent - C:\Program Files\IBM\TPC
IBM TotalStorage Productivity Center - Data Server
IBM WebSphere Application Server V6.1 - CSM
IBM WebSphere Application Server V6.1 - Device Server
IBM WebSphere Application Server V6.1 - Tivoli Agent Manager
Tivoli Integrated Portal - TIPProfile_Port_16310
You can stop/start a service from the command prompt with the net command, which requires typing
out the entire service name, enclosed in quotation marks because it contains spaces:
net stop "IBM WebSphere Application Server V6.1 - Device Server"
net start "IBM WebSphere Application Server V6.1 - Device Server"

Unix
--TIP:
/opt/IBM/Tivoli/tip/profiles/TIPProfile/bin>
./startServer.sh server1
./stopServer.sh server1

--TPC data server:
/opt/IBM/TPC/data/server>
./tpcdsrv1 start
./tpcdsrv1 stop
AIX:
stopsrc -s TSRMsrv1
startsrc -s TSRMsrv1
--TPC device server:
cd <TPC>/device/bin/<platform>
./startTPCF.sh
./stopTPCF.sh
--Agent Manager
/opt/IBM/AgentManager/embedded/bin>
./startServer.sh AgentManager
./stopServer.sh AgentManager
--Common agent
/opt/IBM/TPC/ca/endpoint.sh start
/opt/IBM/TPC/ca/endpoint.sh stop
--Storage Resource agent
/opt/IBM/TPC/agent/bin/agent.sh start
/opt/IBM/TPC/agent/bin/agent.sh stop

10.3 Agent Manager and Common Agent Commands


--Start/stop services
/opt/IBM/AgentManager/embedded/bin>
./startServer.sh AgentManager
./stopServer.sh AgentManager
--Get AgentManager version
/opt/IBM/AgentManager/bin>
./GetAMInfo.sh
C:\Program Files\IBM\AgentManager\bin>
GetAMInfo.bat
--Check Agent Manager health/status:
/opt/IBM/AgentManager/toolkit/bin>
./HealthCheck.sh -registrationPW changeMe
C:\Program Files\IBM\AgentManager\toolkit\bin>
HealthCheck.bat -registrationPW changeMe
C:\Program Files\IBM\AgentManager\embedded\bin> (if version < v1.3)
C:\Program Files\IBM\AgentManager\AppServer\bin> (if version >= v1.3)
serverStatus.bat AgentManager
You can also open a web browser to:
https://<agentmanager-server>:9511/AgentMgr/Info
--Collect Agent Manager logs:
/opt/IBM/AgentManager/toolkit/bin>
./LogCollector.sh
C:\Program Files\IBM\AgentManager\toolkit\bin>
LogCollector.bat
--Check state of common agent and subagent bundles:
C:\Program Files\IBM\TPC\ca>
agentcli deployer list bundles state
/opt/IBM/TPC/ca>
./agentcli.sh deployer list bundles state
--The following command changes the host authentication password for a TPC Fabric agent:
C:\Program Files\IBM\TPC\ca>
agentcli TPCFabric ConfigService setauthenticationpw newpassword

--Start/stop agent bundles
C:\Program Files\IBM\TPC\ca>
agentcli.bat deployer start TPCData
agentcli.bat deployer stop TPCFabric
agentcli.bat deployer start file:///C:\Program Files\IBM\TPC\ca\subagents\TPC\Data\agent\lib\TPCData_win32_i386.jar
/opt/IBM/TPC/ca>
./agentcli.sh deployer start TPCData
./agentcli.sh deployer stop TPCFabric
./agentcli.sh deployer start file:///opt/IBM/TPC/ca/subagents/TPC/Data/agent/lib/TPCData_aix_power.jar
--Get list of agents in the IBMCDB Agent Manager database:
C:\Program Files\IBM\AgentManager\toolkit\bin>
RetrieveAgents -dbPassword db2adminpw >agents_list.txt
/opt/IBM/AgentManager/toolkit/bin>
./RetrieveAgents.sh -dbPassword db2adminpw >agents_list.txt

10.4 Device Server - TPCTOOL Commands


--Start tpctool cli:
cd <TPC>/cli
tpctool.bat (or ./tpctool.sh)
--Displaying/setting device server configuration parameters (from tpctool> prompt):
catdscfg -user <tpcadminid> -pwd <hostauthpw> -url localhost:9550
getdscfg -user <tpcadminid> -pwd <hostauthpw> -url localhost:9550 -property CIMClientWrapper.Timeout
setdscfg -user <tpcadminid> -pwd <hostauthpw> -url localhost:9550 -property CIMClientWrapper.Timeout -context DiskManager 1800000

10.5 Device Server SRMCP Commands


--Changing device server host authentication password:
cd <TPC>/device/bin/<platform>
setenv
srmcp -u <tpcadminid> -p <hostauthpw> ConfigService setAuthenticationPW <newhostauthpw>

10.6 DB2 Commands


Windows - DB2 command window:
Start -> All Programs -> IBM DB2 -> DB2COPY1 (default) -> Command Line Tools -> Command Window
These commands should be run when logged in with a userid that has db2admin authority to the instance:
--Source db2 profile on unix:
. /home/db2inst1/sqllib/db2profile

(change /home/db2inst1 if necessary)

--Stop db2 instance:
db2 db2stop
--Start db2 instance:
db2 db2start
--List existing databases:
db2 list database directory
--List database tables:
db2 connect to tpcdb
db2 list tables for schema tpc >tables.log
db2 connect reset
--Check current db configuration:
db2 get db cfg for tpcdb
--Check current db configuration to see what logging method is being used:
db2 get db cfg for tpcdb | find "LOGARCHMETH" (or use 'grep' on unix)
if LOGARCHMETH1 = OFF, circular logging is in use
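The logging-mode check can be wrapped in a small test. The sketch below runs against a canned sample of `db2 get db cfg` output so it is self-contained; the sample lines are illustrative, not captured from a real system.

```shell
# Create a sample of the relevant "db2 get db cfg for tpcdb" output lines.
cat > dbcfg_sample.txt <<'EOF'
 First log archive method                 (LOGARCHMETH1) = OFF
 Second log archive method                (LOGARCHMETH2) = OFF
EOF

# If LOGARCHMETH1 is OFF, the database is using circular logging.
if grep -q 'LOGARCHMETH1) = OFF' dbcfg_sample.txt; then
  echo "circular logging"
else
  echo "archive logging"
fi
```

On a live system, replace the sample file with the real output: `db2 get db cfg for tpcdb | grep LOGARCHMETH`.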
--Increase the locklist space by increasing the LOCKLIST db cfg parameter:

db2 update db cfg for tpcdb using LOCKLIST 20000
--Increase the log space size by increasing the LOGFILSIZ db cfg parameter:
db2 update db cfg for tpcdb using LOGFILSIZ 4000
--Allow db2 to increase resources for more connections:
db2 update db cfg for tpcdb using MAXAPPLS AUTOMATIC
--Check database manager configuration settings:
db2 attach to DB2
db2 get dbm cfg
db2 detach
--Update database manager to increase MAXAGENTS:
db2 attach to DB2
db2 update dbm cfg using MAXAGENTS 800
db2 detach
--Update database manager to increase MON_HEAP_SZ (monitor heap size):
db2 update dbm cfg using MON_HEAP_SZ <newvalue>
--Show db2 status:
windows:
db2stat
unix:
db2_local_ps
--To test db connectivity from the command line:
db2 connect to IBMCDB user db2inst1 using password
--reorgchk:
db2 reorgchk update statistics on schema tpc
--Get help on db2 sqlcode message (example for sqlcode -811):
db2 ? sql0811n
--Turn off self-tuning memory manager
db2 update db cfg for tpcdb using SELF_TUNING_MEM OFF
--Archive db2diag.log so a new file can be started (for when file gets huge):
db2diag -A
--Run a db2 sql script:
db2 -tvf myscript.sql >myscript.log 2>&1

10.7 CIMOM Commands


--Run lsdev command to test communication with storage device:
C:\Program Files\IBM\dsagent\bin>
dscimcli -s https://hmc_ipaddress:6989 lsdev -l
--Run mkdev command to add a device to be managed by the cimom:
C:\Program Files\IBM\DSCIMCLI\W2003\bin>
dscimcli -s https://hmc_ipaddress:6989 mkdev hmc_ip -type ds -user tpcadmin -password tpcadminpassword
--Run commands to test cimom to device interface:
C:\Program Files\IBM\DSCIMCLI\W2003\pegasus\bin>
cimcli ni -n root/ibm -l hmc_ipaddress:6989 -s IBMTSDS_VolumeStatistics
cimcli ni -n root/ibm -l hmc_ipaddress:6989 -s IBMTSDS_RankStatistics
cimcli ni -n root/ibm -l hmc_ipaddress:6989 -s IBMTSDS_FCPortStatistics
--Display version of embedded cimom:
cimcli ei -n root/ibm -l <hmc-ipaddress>:6989 -s -u [CIMuser] -p [CIMpassword] IBMTSDS_ObjectManager
--Open XIV xcli command window:
C:\Program Files\XIV\GUI10>
xcli -w
--XIV configuration commands in xcli:

xcli> smis_add_user user=tpcadmin password=mynewpw password_verify=mynewpw
xcli> smis_list_users
--Collecting logs for support:
/opt/IBM/dsagent/bin>
./collectLogs.sh
C:\Program Files\IBM\dsagent\bin>
collectLogs.bat
C:\Program Files\IBM\svcconsole\bin>
collectLogs.bat
--CISCO switch commands (issue from root login to switch)
The following command enables the CIM server:
# cimserver enable
The following command disables the CIM server (the default):
# no cimserver enable
SWITCH DIAG INFO:
# show cimserver
# show tech-support
--LSI SMI provider 10.x
Recycling the SMI-S agent:
Windows:
- From services.msc, start or stop the Pegasus CIM object manager
- Command window:
> net start cimserver
> net stop cimserver
AIX/Linux/Solaris:
- To start the OpenPegasus CIM server:
> cimserver
- To stop the OpenPegasus CIM server:
> cimserver -s

10.8 Replication Manager Commands


--DSCLI commands:
lspprc -l -fmt delim 0000-FFFF (for Metro Mirror and Global Copy)
lssession -l 16-17 (for Global Mirror)
showgmir -metrics 23 (for Global Mirror)
- Check paths of all source LSS 00-1F to target box:
lspprcpath -dev IBM.2107-xxxxxxx -l 00-FF
How to create a Global Mirror H1 -> H2 -> J2 copyset and add to a GM session using DSCLI commands:
1. Go to SOURCE storage subsystem: Setup and start H1 -> H2 global copy volumes:
> mkpprc -dev SourceDevID -remotedev RemoteDevID -type gcp H1volume:H2volume
* Note: Since the default mode is a full copy, the mkpprc command starts the full-copy operation. After it completes, it continues to copy the changed
tracks from the source volume to the target volume asynchronously.
2. Go to TARGET storage subsystem: Setup H2 -> J2 Flashcopy volumes
> mkflash -dev SourceDevID -tgtinhibit -record -persist -nocp H2volume:J2volume
where: -record = incremental copy, -tgtinhibit = target volume is write-inhibited
3. Go to SOURCE storage subsystem (assuming GM session already exists), add copyset to session (running or not):
> chsession -dev SourceDevID -lss LSSofH1Volume -action add -volume H1Volume SessionID
CSM CLI commands:
* Run the exportcsv command to export the copysets for each session to CSV files:
csmcli> exportcsv -file cpset.csv session_name
* Run the importcsv command to import the copysets back into the sessions:
csmcli> importcsv -quiet -file cpset.csv session_name

Note: If there is more than one session to export, you can put all the CLI exportcsv commands into a script file, such as 'export.txt', then run
the script via the -script option:
C:\Program Files\IBM\replication>csmcli -script export.txt
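Building the script file by hand is error-prone when there are many sessions. The sketch below generates it from a list; the session names are hypothetical.

```shell
# Generate one exportcsv command per session; the resulting file is run later
# with: csmcli -script export.txt
sessions="gm_session_1 mm_session_2"
: > export.txt
for s in $sessions; do
  echo "exportcsv -file ${s}.csv ${s}" >> export.txt
done
cat export.txt
```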

10.9 Support/Service Data Collection Commands


(Refer to MustGather/Collecting Data technical notes on TPC support site for full troubleshooting data collection steps)
--TPC Server
/opt/IBM/TPC/service>
./service.sh
C:\Program Files\IBM\TPC\service>
service.bat
--TPC Agent
/opt/IBM/TPC/ca/subagents/TPC/service>
./service.sh
C:\Program Files\IBM\TPC\ca\subagents\TPC\service>
service.bat
--TPC Agent (SRA) - collect these files and directories:
<TPC>/agent/...
.../log
.../config
.../output
.../core (if core dump happened, there may be a core file present)
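Those directories can be packaged into a single archive for support. The sketch below runs against a mock directory tree so it is self-contained; on a real agent, set TPC to the actual install directory (for example /opt/IBM/TPC) instead.

```shell
# Mock the SRA directory layout (use TPC=/opt/IBM/TPC on a real agent).
TPC=./mockTPC
mkdir -p "$TPC/agent/log" "$TPC/agent/config" "$TPC/agent/output"
echo "sample log line" > "$TPC/agent/log/agent.log"

# Package the log, config, and output directories into one archive.
tar -cf sra_support.tar -C "$TPC/agent" log config output
tar -tf sra_support.tar
```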
--Agent Manager:
/opt/IBM/AgentManager/toolkit/bin>
./LogCollector.sh
C:\Program Files\IBM\AgentManager\toolkit\bin>
LogCollector.bat
--CIMOM
/opt/IBM/dsagent/bin>
./collectLogs.sh
C:\Program Files\IBM\dsagent\bin>
collectLogs.bat
C:\Program Files\IBM\svcconsole\bin>
collectLogs.bat
--TPC Replication Manager
Get TPC-R logs via gui panel:
Advanced Tools panel -> "Generate log package"
creates TPC_RM_Diagnostics_date_time.jar file
The default location is ..\IBM\replication\eWAS\profiles\CSM\diagnostics
Get TPCRMInstall.log file in the TPCRM installation root directory
If GUI is not available, collect the following folder and contents into a .zip or .tar.Z file:
<TPCR>/eWAS/profiles/CSM/logs

10.10 Miscellaneous Commands


--repocopy command:
windows
cd c:\Program Files\IBM\TPC\data\server\tools
repocopy
unix
cd /opt/IBM/TPC/data/server/tools
./repocopy.sh
--Tivoli GUID commands:

windows
- typical location:
c:\Program Files\Tivoli\guid
(or use windows 'explorer' to search for 'tivguid.exe' on C: drive)
- command usage (from command prompt window):
cd <tivguid_dir>
tivguid -show
tivguid -write -new
- to set/restore a specific guid value:
tivguid -write -guid=00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00
unix
- typical location:
/usr/tivoli/guid
/opt/tivoli/guid
(or use 'find / -name tivguid -print' to locate)
- command usage (from console or shell window prompt):
cd <tivguid_dir>
./tivguid -show
./tivguid -write -new
- to set/restore a specific guid value:
./tivguid -write -guid=00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00
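The GUID value is 16 dot-separated two-digit hexadecimal byte values. As a quick sanity check before restoring a saved value, the format can be validated locally; this check is only an illustration, not part of the tivguid tool, and the GUID shown is made up.

```shell
# A valid GUID string is 16 two-digit hex values separated by dots.
guid="00.0A.1B.2C.3D.4E.5F.60.71.82.93.A4.B5.C6.D7.E8"
if echo "$guid" | grep -Eq '^([0-9A-Fa-f]{2}\.){15}[0-9A-Fa-f]{2}$'; then
  echo "format ok"
else
  echo "format invalid"
fi
```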
--TIP authentication configuration backup/restore:
C:\Program Files\IBM\tivoli\tip\profiles\TIPProfile\bin>
backupConfig -nostop
restoreConfig
/opt/IBM/tivoli/tip/profiles/TIPProfile/bin>
./backupConfig.sh -nostop
./restoreConfig.sh
--TPC device server authentication configuration backup/restore:
C:\Program Files\IBM\TPC\device\apps\was\profiles\deviceServer\bin>
backupConfig
restoreConfig
/opt/IBM/TPC/device/apps/was/profiles/deviceServer/bin>
./backupConfig.sh
./restoreConfig.sh
--TPC-R replication server authentication configuration backup/restore:
C:\Program Files\IBM\replication\eWAS\profiles\CSM\bin>
backupConfig
restoreConfig
/opt/IBM/replication/eWAS/profiles/CSM/bin>
./backupConfig.sh
./restoreConfig.sh
--change password tool (refer to TPC Infocenter for important additional information):
C:\Program Files\IBM\TPC\data\server\tools>
changepasswords.bat
changepasswords.bat -0
/opt/IBM/TPC/data/server/tools>
./changepasswords


11 Appendix D - Down-level Items Still Supported in TPC


Items in this section are still supported, but are not the latest offering recommended for use with TPC
and may be dropped from future revisions of this document.

11.1 IBM TotalStorage DS Open API V5.1


The IBM TotalStorage DS Open API CIM agent V5.1 is IBM's legacy CIM agent for IBM ESS,
DS6000, and DS8000. It is a Java-based application, and requires the ESS CLI to manage an IBM ESS.
IBM DS6000 and DS8000 can be managed with just the CIM agent. This version is still supported and
recommended when the CIM agent is already in place to manage a device. In other words, if the devices
are currently managed by a V5.1 CIM agent, it is working, and the customer is happy, don't upgrade
unless directed to by IBM Technical Support.

11.1.1 IBM TotalStorage Enterprise Storage Server (ESS) command-line interface (CLI)
**Note: If you are using the IBM TotalStorage DS Open API version 5.3 or later, the CLI will be
installed for you automatically.
If you are using the DS Open API V5.1 to monitor IBM ESS storage, you will need to install the IBM
ESS CLI first. The IBM ESS CLI V2.4.4.70 (or later) can be downloaded from
ftp://ftp.software.ibm.com/storage/storwatch/esscli/ESS_CLI_2.4.4.70.zip.
Unpack (zip or tar) the install package contents to a temporary location. Part of the install package is the
IBM TotalStorage ESS Command-Line Interfaces User's Guide. The guide can be found in the
cliReadmes directory as f2bcli04.pdf. Use this guide to install and verify the ESS CLI. During
installation, you can generally accept the defaults. The general steps to take during installation are:
1. Launch the appropriate setup program (setup.exe on Windows).
2. Welcome Screen - Click Next.
3. License Agreement Screen - Select "I accept" and click Next.
4. Windows Users Panel (Windows install only) - Select Windows unless you are installing on
a Novell platform. Click Next.
5. Installation Location - Specify the path for the code to be installed in. Click Next.
6. Installation Preview - Click Next.
7. Installation Progress - Watch and wait.
8. Installation Summary - Click Next.
9. Readme panel - Click Finish.
Once the ESS CLI is installed, you can test it by doing the following:
1) Click on the ESSCLI icon on the desktop (Windows only). The same window can be
opened from a Windows command prompt by changing the directory to the installed
location of the CLI (C:\Program Files\ibm\ESScli by default).
2) Issue the command esscli to see the available commands that can be issued. This is useful to
remind you of the available subcommands and the parameters required for each subcommand.
3) Issue the command esscli -u <userid> -p <password> -s <ESS IP address> list server. The
command will return some high-level summary information for the ESS specified.
C:\Program Files\ibm\ESScli>esscli -u admin -p pwd -s essserver1 list server
Thu Jul 12 11:09:58 PDT 2007 IBM ESSCLI 2.4.0
Server      Model  Mfg  WWN               CodeEC    Cache  NVS  Racks
----------  -----  ---  ----------------  --------  -----  ---  -----
2105.54321  800    023  5005076300C01234  2.4.4.96  24GB   2GB  1

Note: The -s parameter can specify either the IP address or the DNS resolvable network name for
one of the ESS clusters.

11.1.2 Installing the IBM DS Open API CIM Agent V5.1


The IBM DS Open API CIM Agent install package can be downloaded at the >>>DS Open API
Downloads site.
Unpack (zip or tar) the install package contents to a temporary location. Part of the install package is the
IBM TotalStorage DS Open Application Programming Interface Reference (GC35-0493-03). The guide
can be found in the docs directory as installguide.pdf. Use this guide to install and configure the DS
Open API CIM Agent V5.1. During installation, you should accept the defaults, unless you have an
explicit need to change a parameter.
1. Launch the appropriate setup program (W2K\setup.exe to install on Windows).
2. A Java Virtual Machine startup window will appear. After some time, InstallShield will launch a
graphical window.
3. Welcome Screen - Click Next.
4. License Agreement Screen - Select "I accept" and click Next.
5. Destination Directory - Specify the path for the code to be installed in. Click Next.
6. Updating CIMOM Port - You have a choice of having the CIM Agent communicate using
either HTTP (non-secure) or HTTPS (secure) communications. It is generally recommended to
use HTTPS, which does not add any complexity when configuring TPC.
You can specify the port to be used for the CIM Agent communication. 5988 is the default port
associated with non-secure CIMOM communications (HTTP), and 5989 is the default port
associated with secure CIMOM communications (HTTPS). Change the port only if needed.
Recommended selections: API port=5989 and Communication Protocol=HTTPS.
7. Installation Confirmation - Click Install.
8. Installation Progress - Watch and wait.
9. Finish - The CIM Object Manager - DS Open API service is started. Click Finish once the
button is enabled. If you didn't clear the "View post installation tasks" box, a text window
will open with post-install tasks that are needed to configure storage devices to the CIM agent. It
is useful reading.

11.1.3 Configuring the IBM DS Open API CIM Agent V5.1


Once the CIM Agent is installed, there are four basic configuration steps to perform on the CIM Agent
itself before you can register it to TPC.
1. Add a unique userid and password
2. Add storage devices to be managed
3. Restart the CIM Agent
4. Verify that the CIM Agent can communicate with the storage devices.


11.1.3.1 Add a unique userid and password

When the CIM Agent was installed, a default userid and password was created. The userid is
"superuser" and the password is "passw0rd". Everybody knows these values, so in order to protect the
storage devices from unintended manipulation, it is a best practice to create a unique userid and
password, and delete the default one. That's what we're going to do now.
The setuser command requires a userid and password. These userids and passwords are unique to the
CIM Agent, and don't correspond to any O/S userids. In other words, there is no association between the
CIM agent userid and any O/S userid. Use the default: superuser/passw0rd. Once you're in the setuser
command environment, you can issue the help command to see the set of supported commands. We'll
use three of these: adduser, lsuser, and rmuser. adduser creates new userid/password entries. lsuser
lists the userids (and encrypted passwords) that can access this CIM agent. rmuser removes userids/passwords
from this CIM Agent.
Open a command window, and CD to the cimagent directory. Then issue the setuser command as shown
in the example below. You'll see that we list the existing userid(s) and create a new one. Then we exit the
setuser command environment.
cd C:\Program Files\IBM\cimagent
C:\Program Files\IBM\cimagent>setuser -u superuser -p passw0rd
Application setuser started in interactive mode
To terminate the application enter: exit
To get a help message enter: help
>>> lsuser
USER      : ENCRYPTED PASSWORD
superuser : y95Se6dVeZuE8k8=
>>> adduser cimuser cimpw
An account for user cimuser successfully created.
>>> exit
C:\Program Files\IBM\cimagent>

If you remember, we still need to delete the default userid, but the setuser command won't let us delete
the userid that we've logged in with. So we exit the setuser command, log back in
with the new cimuser userid, and then remove the superuser userid. See the example below.
C:\Program Files\IBM\cimagent>setuser -u cimuser -p cimpw
Application setuser started in interactive mode
To terminate the application enter: exit
To get a help message enter: help
>>> rmuser superuser
The account for user superuser successfully removed.
>>> exit
C:\Program Files\IBM\cimagent>

11.1.3.2 Add storage devices to be managed

Before we can use the CIM Agent, we need to identify the storage devices that we'll manage through
this interface. The DS Open API provides the setdevice command to register storage devices to the CIM
Agent. The setdevice command has a set of commands to manipulate the registration of these storage
devices. The addess, lsess, and rmess commands are used to add, list, and remove ESS storage
devices. The addessserver, lsessserver, and rmessserver commands are used to add, list, and remove DS
servers (and ESS Copy Services Servers, which we won't worry about here).
See the example below where we add an ESS and a DS8000. For the addess command, specify each
ESS cluster IP address (or DNS resolvable network name), and the Specialist Administrator userid and
password. We do this for each ESS cluster. For the addessserver command, specify the DS8000's
Hardware Master Console (HMC) IP address (or DNS resolvable network name), and the HMC userid
and password.
C:\Program Files\IBM\cimagent>setdevice
Application setdevice started in interactive mode
To terminate the application enter: exit
To get a help message enter: help
>>> addess essc0 essuser esspw
An ess provider entry for IP 192.168.1.155 (essc0) successfully added
>>> addess essc1 essuser esspw
An ess provider entry for IP 192.168.1.156 (essc1) successfully added
>>> addessserver ds8c0 ds8user ds8pw
An essserver entry for IP 192.168.1.73 (ds8c0) successfully added
>>> lsess
IP            : USER    : ENCRYPTED PASSWORD
192.168.1.155 : essuser : qPIW/lfj9w==
192.168.1.156 : essuser : 70blIQJMlQ==
>>> lsessserver
IP            : USER    : ENCRYPTED PASSWORD : IPA
192.168.1.73  : ds8user : dCuDN1Fq3yvGjw==   : NOT SET
>>> exit
C:\Program Files\IBM\cimagent>

11.1.3.3 Restart the CIM Agent V5.1

Now that we've added the storage devices that we'll manage with this CIM agent, we need to recycle
(stop and restart) the CIMOM service so that it recognizes the changes. There are commands
provided to make this easy: stopCIMOM and startCIMOM.
C:\Program Files\IBM\cimagent>stopCIMOM
The CIM Object Manager - DS Open API service is stopping....
The CIM Object Manager - DS Open API service was stopped successfully.
D:\Program Files\IBM\cimagent>startCIMOM
The CIM Object Manager - DS Open API service is starting....
The CIM Object Manager - DS Open API service was started successfully.
C:\Program Files\IBM\cimagent>

Wait a few minutes after restarting the CIM Agent so that all the services have a chance to get running
again. In my experience, SLP takes a minute or two to become active again.

11.1.3.4 Verify that the CIM Agent can communicate with the storage devices

Once the CIM Agent has been restarted, and you've patiently waited the required few minutes, you're
ready to verify that the CIM agent can actually communicate with the storage devices that you've
registered with it.
First and easiest is to issue a netstat command. The example below was issued on Windows. The -ano
parameter requests that all ports be listed numerically, along with the owning PID of each port. The output is piped
to the find command to capture the CIMOM port information only.

C:\Program Files\IBM\cimagent>netstat -ano | find "5989"
  TCP    0.0.0.0:5989    0.0.0.0:0    LISTENING    2892

C:\Program Files\IBM\cimagent>

This is a good indication that the CIMOM service is running.
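On a Unix CIMOM host the equivalent check is `netstat -an | grep 5989`. The sketch below runs the same filter against a canned netstat line so it is self-contained; the sample line is illustrative, not captured from a live system.

```shell
# Sample of the netstat line expected when the CIMOM is listening on 5989.
cat > netstat_sample.txt <<'EOF'
tcp        0      0 0.0.0.0:5989            0.0.0.0:*               LISTEN
EOF

# A match on the port means something is listening there.
if grep -q ':5989' netstat_sample.txt; then
  echo "port 5989 is listening"
fi
```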


Next, we need to issue a verifyconfig command to make sure that the CIM agent can communicate with
the storage device(s).
D:\Program Files\IBM\cimagent>verifyconfig -u cimuser -p cimpw
Fri Apr 21 11:08:09 PDT 2006
Verifying configuration of CIM agent for the IBM TotalStorage DS Open Application
Programming Interface...
Communicating with SLP to find WBEM services...
2 WBEM services found
host=192.168.1.3, port=5989
host=192.168.1.5, port=5988
Connecting to CIM agent, host=192.168.1.3, port=5989
Found 2 IBMTSESS_StorageSystem instances:
2105.12345
2107.AB1234
Internal Server at 192.168.1.120 configured for 2107.AB1234
Verification Successful
D:\Program Files\IBM\cimagent>

If the verifyconfig command fails, you will need to correct your CIM Agent configuration. Make sure
that you've specified the storage device addresses, userids, and passwords correctly. For ESS devices,
you can use a browser to log into the TotalStorage Specialist to verify that the values are correct. For DS
devices, you can use a browser to log into the HMC. Make sure that the CIM Agent service has been
restarted and is running, and that you've waited the required few minutes for the services to
restart. Then try the verifyconfig command again.
Once the verifyconfig command runs successfully, the DS Open API CIM Agent V5.1 is ready to use.

11.1.3.5 Additional information for IBM ESS devices and the V5.1 CIM agent

Several sites have hit problems with performance data collection on ESS devices with TPC 3.1. In
some circumstances, TPC is able to collect volume and subsystem information and configure volumes
and LUN assignments via the DS API CIM agent, but is unable to collect performance data. Here is a list
of the known issues so far, and some hints and tips on diagnosing ESS performance collection problems. All
of the issues are due to the setup of the DS API, the environment, and the configuration of the ESS.
System clocks on the ESS controllers must be in sync, and in sync with the TPC server
If the ESS clusters' (cluster 1 and cluster 2) system clocks are more than 5 minutes apart, then when the
performance data (which has the timestamp in it) is received by TPC, the TPC program will throw away
the performance data, thinking there is something wrong, so nothing is saved in the database. No error is
logged in TPC for this. It is also strongly recommended that the ESS system clocks are set to the same
time and time zone as the DS API (CIMOM) and TPC servers. To check the clocks, the IBM CE needs
to log into the machine to check the system time of each controller. If incorrect, this has to be changed by
the CE.
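The 5-minute rule can be illustrated with simple epoch-second arithmetic; the two timestamp values below are made up.

```shell
# Epoch-second clock readings from the two clusters (illustrative values).
t1=1323180000   # cluster 1
t2=1323180360   # cluster 2, 6 minutes later

# Absolute skew between the two clocks.
if [ "$t1" -gt "$t2" ]; then skew=$((t1 - t2)); else skew=$((t2 - t1)); fi

# More than 300 seconds (5 minutes) apart means TPC will discard the data.
if [ "$skew" -gt 300 ]; then
  echo "clocks out of sync by ${skew}s - performance data will be discarded"
else
  echo "clocks within tolerance"
fi
```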

DS API server contains multiple IP interfaces
When the CIMOM server has multiple IP adapters, the ESS can send the performance data to the wrong
IP interface on the CIMOM server, and it will be lost. The adapter the data is sent to depends on the IP
configuration of the host server running the DS API. The DS API allows configuration of the preferred
IP address to send the performance data to. This should be the IP address of the adapter on the same
network as the ESS. In some revisions of the DS API this parameter is ignored; this is fixed at the
5.1.0.51 level. Note: due to a number of related issues with the DS API, ESS CLI, and TPC, a software
PMR should be opened to address this problem, as updates may be required for all three components.
The file system where the DS API and ESS CLI are installed must have free space
The ESS performance statistics are written by the ESS CLI into the file system on the CIMOM server. If
there is no space in the file system, the log files will be 0 bytes in size and no performance data will be
received by the CIM agent and sent to TPC.
ESS userid used for DS API configuration must have ESS administrator rights
If the id specified in the setdevice command does not have ESS Specialist administrator rights,
the CIMOM will be unable to collect performance data, as it is unable to execute the esscli list
PerfStats command.
The ESS Web Specialist InfoServer (running inside the ESS machine) must be running
If the Specialist InfoServer is not running, performance stats cannot be collected. To check, invoke the ESS
Web Specialist GUI, go to the Storage Allocation panel, and see if volumes and hosts can be
displayed. If no information is returned, the InfoServer needs to be restarted.
Any firewalls between the ESS and the DS API server must be configured to allow list PerfStats
traffic
All IP ports above 1023 on the CIMOM machine side must be opened to receive performance data
from the ESS.
Diagnosing ESS performance collection issues
If you have started a TPC performance collection on an ESS and, after more than 2 collection intervals, no
data is shown in TPC, a number of checks can be performed to localize where the problem might be
with ESS performance collection.
1. Check the providerTrace.log in the /cimagent directory on the server where the DS API is installed.
If the log file contains 'long' entries of performance data, then the CIMOM is correctly collecting
performance data. If the CIMOM log does not show entries containing performance data, the problem is
likely to be with the esscli LIST PERFSTATS command used to collect performance data. If the log
does contain performance data, the problem is with TPC.
2. If the CIMOM is collecting data correctly, the tracePerfMgr.log file in the /TPC/device/logs directory
on the TPC server may contain more information about the failure.
3. Ask the IBM CE to check the system clocks on the ESS controllers.
4. If the CIMOM is not collecting performance data, the esscli list PerfStats command can be
used to determine if the ESS is able to collect performance statistics:
esscli list PerfStats -d "ess=2105.nnnnn" -s <ipaddress of ess cluster> -u username -p password
If timestamped log files containing performance data are written to the local file system, the ESS is in a
good state to collect performance data. Otherwise check for any of the above issues.


12 Appendix E - Links
This section lists all web links in the order that they are referenced in this document.
TPC Infocenter
http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp

TPC 4.2 Release Guide


http://www.redbooks.ibm.com/redpieces/abstracts/sg247894.html?Open

Deployment considerations for Storage Resource agents


http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp?topic=/com.ibm.tpc_V421.doc/fqz0_r_native_agents_deployment_considerations.html

Migrating Data agents and Fabric agents to Storage Resource agents


http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp?topic=/com.ibm.tpc_V421.doc/fqz0_c_migrating_agents.html

Passport Advantage
http://www-01.ibm.com/software/lotus/passportadvantage

Installer GUI will not start


http://www-01.ibm.com/support/docview.wss?uid=swg21303823

IBM Xtreme Leverage Downloads


https://w3.ibm.com/software/xl/download/ticket.do?openform

Installing TPC Using A Remote Database


http://www-01.ibm.com/support/docview.wss?uid=swg21460493

Configure Devices Wizard


http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp?topic=/com.ibm.tpc_V421.doc/fqz0_t_configure_devices_storagesystems.html

SNIA Conforming Providers


http://www.snia.org/ctp/conformingproviders#10provider

Supported Products List


http://www-01.ibm.com/support/docview.wss?uid=swg21386446

SMI-S and Performance Management/Monitoring


http://www.wbemsolutions.com/tutorials/snia/SMI/Technical/block-server-performance-subprofile.html

Brocade SMI Agent Downloads


http://www.brocade.com/support/SMIAGENT.jsp

McData SMI Agent Downloads


http://www.brocade.com/services-support/drivers-downloads/smi-agent/application_matrix.page

CISCO SMI Agent Downloads


http://www.cisco.com/en/US/products/ps6030

TPC CIMOM Namespaces


http://www-01.ibm.com/support/docview.wss?uid=swg21366393

TotalStorage Hardware Freeze Alert


http://www-1.ibm.com/support/docview.wss?uid=ssg1S1003071

DS Open API Downloads


http://www-1.ibm.com/support/search.wss?rs=1118&tc=STC4NKB&dc=D400&dtm

DS Open API 5.x Reference


http://www-01.ibm.com/support/search.wss?rs=1118&tc=STC4NKB&dc=DA420&dtm

DSCIMCLI Downloads
http://www-01.ibm.com/support/search.wss?rs=1118&tc=STC4NKB&dc=D400&dtm

XIV Information
http://publib.boulder.ibm.com/infocenter/ibmxiv/r2/index.jsp

XCLI User Guide and Reference


http://publib.boulder.ibm.com/infocenter/ibmxiv/r2/topic/com.ibm.help.xiv.doc/docs/GA32-0638-02.pdf
http://publib.boulder.ibm.com/infocenter/ibmxiv/r2/topic/com.ibm.help.xiv.doc/docs/GC27-2213-02.pdf

LSI SMI Provider Downloads
http://www.lsi.com/search/Pages/downloads.aspx?k=SMI+Provider

LSI ArrayManagementUtility
http://www.lsi.com/Search/Pages/downloads.aspx?k=ArrayManagementUtility

LSI Install Guide 10.19.GG.xx


http://www.lsi.com/downloads/Public/External RAID/Management Software/SMI Provider - New/50783-00C_LSI_SMIS_Provider1019.pdf

LSI Install Guide 10.10.GG.xx


http://www.lsi.com/downloads/Public/External RAID/Management Software/SMI Provider - New/45924-00 RevB_LSI_SMI-S_Provider1010GG.pdf

SVC Embedded CIMOM Migration


http://www-01.ibm.com/support/docview.wss?uid=swg21396239

TPC Advanced Topics Redbook


http://www.redbooks.ibm.com/abstracts/sg247348.html?Open

TPC fails to detect a newly installed DB2 instance


http://www-01.ibm.com/support/docview.wss?uid=swg21452614

Starting/Stopping TPC Services


http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp?topic=/com.ibm.tpc_V421.doc/fqz0_r_start_stop_tpc_services.html

STE Presentation - DB2 Administration Basics for TPC


http://www-01.ibm.com/support/docview.wss?rs=0&uid=swg27010419

Changing the user authentication method


http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp?topic=/com.ibm.tpc_V421.doc/fqz0_c_change_user_auth_method.html

IBM Redbook Understanding LDAP


http://publib-b.boulder.ibm.com/abstracts/sg244986.html?Open

Configuring Batch Reports on Unix


http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp?topic=/com.ibm.tpc_V332.doc/fqz0_t_configure_batch_reports.html

Managing TPC Log Files to Reduce TPC Server Disk Space Usage
http://www-01.ibm.com/support/docview.wss?uid=swg21297506

Device Server functions unexpectedly stop working


http://www-01.ibm.com/support/docview.wss?uid=swg21424630

SSPC Reference Information


http://www.ibm.com/systems/support/supportsite.wss/supportresources?brandind=5000033&familyind=5356448&taskind=1

SRV0042E: A database error occurred during agent registration


http://www-01.ibm.com/support/docview.wss?uid=swg21393877

SSPC Upgrading Tivoli Storage Productivity Center


http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp?topic=/com.ibm.sspc_v15.doc/fqz0_r_sspc_upgrading_tpc.html

SVC Support
http://www.ibm.com/systems/support/supportsite.wss/supportresources?brandind=5000033&familyind=5329743&taskind=1

DS8300 Support
http://www-947.ibm.com/support/entry/portal/Troubleshooting/Hardware/System_Storage/Disk_systems/Enterprise_Storage_Servers/DS8300/

IBM System Storage Productivity Center Deployment Guide, SG24-7560


http://www.redbooks.ibm.com/redbooks/pdfs/sg247560.pdf

TPC Replication References


Changing the TPC-R Administrator Password
http://publib.boulder.ibm.com/infocenter/tivihelp/v4r1/index.jsp?topic=/com.ibm.rm341.doc/frc_t_if_change_ldap_os_passwords.html

Global Mirror Monitoring Tool:


http://hurgsa.ibm.com/projects/g/gmcoc/web/public/tools.html#Global

EMC PowerLink
https://powerlink.emc.com/nsepn/webapps/btg548664833igtcuup4826/kmlogin/login.jsp?CTAuthMode=BASIC

EMC Solutions Enabler


https://powerlink.emc.com/nsepn/webapps/btg548664833igtcuup4826/km/appmanager/km/secureDesktop?_nfpb=true&_pageLabel=servicesDownloadsTemplatePg&internalId=0b01406680020355&_irrt=true

EMC 4.1 SMI-S Provider Release Notes


https://powerlink.emc.com/nsepn/webapps/btg548664833igtcuup4826/km/live1/en_US/Offering_Technical/Technical_Documentation/300-009-597_a02.pdf?mtcs=ZXZlbnRUeXBlPUttQ2xpY2tDb250ZW50RXZlbnQsZG9jdW1lbnRJZD0wOTAxNDA2NjgwNDhmM2E0LG5hdmVOb2RlPVNvZndhcmVEb3dubG9hZHMtMQ__

EMC SMI-S Provider Release Notes


https://powerlink.emc.com/nsepn/webapps/btg548664833igtcuup4826/km/live1/en_US/Offering_Technical/Technical_Documentation/300-004-606_a01_elccnt_0.pdf?mtcs=ZXZlbnRUeXBlPUttQ2xpY2tDb250ZW50RXZlbnQsZG9jdW1lbnRJZD0wOTAxNDA2NjgwMjFhMmJhLG5hdmVOb2RlPVNvZndhcmVEb3dubG9hZHMtMQ__

Program Number: 5608-VC0, 5608-VC1, 5608-VC3, 5608-VC4, 5608-VC6


Printed in USA
