This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?License, http://www.stlport.org/doc/license.html, http://asm.ow2.org/license.html, http://www.cryptix.org/LICENSE.TXT, http://hsqldb.org/web/hsqlLicense.html, http://httpunit.sourceforge.net/doc/license.html, http://jung.sourceforge.net/license.txt, http://www.gzip.org/zlib/zlib_license.html, http://www.openldap.org/software/release/license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/OpenSourceLicense.html, http://fusesource.com/downloads/licenseagreements/fuse-message-broker-v-5-3-license-agreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/; http://www.bouncycastle.org/licence.html; http://www.jgraph.com/jgraphdownload.html; http://www.jcraft.com/jsch/LICENSE.txt; http://jotm.objectweb.org/bsd_license.html; http://www.w3.org/Consortium/Legal/2002/copyright-software-20021231; http://www.slf4j.org/license.html; http://nanoxml.sourceforge.net/orig/copyright.html; http://www.json.org/license.html; http://forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html, http://www.tcl.tk/software/tcltk/license.html, http://www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html, http://www.slf4j.org/license.html; http://www.iodbc.org/dataspace/iodbc/wiki/iODBC/License; http://www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/license.html; http://www.openmdx.org/#FAQ; http://www.php.net/license/3_01.txt; http://srp.stanford.edu/license.txt; http://www.schneier.com/blowfish.html; http://www.jmock.org/license.html; http://xsom.java.net; http://benalman.com/about/license/; https://github.com/CreateJS/EaselJS/blob/master/src/easeljs/display/Bitmap.js; http://www.h2database.com/html/license.html#summary; http://jsoncpp.sourceforge.net/LICENSE; http://jdbc.postgresql.org/license.html; http://protobuf.googlecode.com/svn/trunk/src/google/protobuf/descriptor.proto; https://github.com/rantav/hector/blob/master/LICENSE; http://web.mit.edu/Kerberos/krb5current/doc/mitK5license.html; http://jibx.sourceforge.net/jibx-license.html; https://github.com/lyokato/libgeohash/blob/master/LICENSE; https://github.com/hjiang/jsonxx/blob/master/LICENSE; https://code.google.com/p/lz4/; https://github.com/jedisct1/libsodium/blob/master/LICENSE; http://one-jar.sourceforge.net/index.php?page=documents&file=license; https://github.com/EsotericSoftware/kryo/blob/master/license.txt; http://www.scala-lang.org/license.html; https://github.com/tinkerpop/blueprints/blob/master/LICENSE.txt; http://gee.cs.oswego.edu/dl/classes/EDU/oswego/cs/dl/util/concurrent/intro.html; https://aws.amazon.com/asl/; https://github.com/twbs/bootstrap/blob/master/LICENSE; https://sourceforge.net/p/xmlunit/code/HEAD/tree/trunk/LICENSE.txt; https://github.com/documentcloud/underscore-contrib/blob/master/LICENSE; and https://github.com/apache/hbase/blob/master/LICENSE.txt.
This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and Distribution License (http://www.opensource.org/licenses/cddl1.php), the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary Code License Agreement Supplemental License Terms, the BSD License (http://www.opensource.org/licenses/bsd-license.php), the new BSD License (http://opensource.org/licenses/BSD-3-Clause), the MIT License (http://www.opensource.org/licenses/mit-license.php), the Artistic License (http://www.opensource.org/licenses/artistic-license-1.0), and the Initial Developer's Public License Version 1.0 (http://www.firebirdsql.org/en/initial-developer-s-public-license-version-1-0/).
This product includes software copyright 2003-2006 Joe Walnes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this
software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab.
For further information please visit http://www.extreme.indiana.edu/.
This product includes software Copyright (c) 2013 Frank Balluffi and Markus Moeller. All rights reserved. Permissions and limitations regarding this software are subject
to terms of the MIT license.
See patents at https://www.informatica.com/legal/patents.html.
DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied
warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The
information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation is
subject to change at any time without notice.
NOTICES
This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software
Corporation ("DataDirect") which are subject to the following terms and conditions:
1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT
INFORMED OF THE POSSIBILITIES OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT
LIMITATION, BREACH OF CONTRACT, BREACH OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.
Part Number: IN-AGS-10000-0001
Table of Contents
Preface
    Informatica Resources
        Informatica My Support Portal
        Informatica Documentation
        Informatica Product Availability Matrixes
        Informatica Web Site
        Informatica How-To Library
        Informatica Knowledge Base
        Informatica Support YouTube Channel
        Informatica Marketplace
        Informatica Velocity
        Informatica Global Customer Support
Preface
The Informatica Administrator Getting Started Guide is written for Informatica administrators and operators
who manage and monitor the domain. It provides a tutorial to help first-time users learn how to use the
Administrator tool.
Informatica Resources
Informatica My Support Portal
As an Informatica customer, the first step in reaching out to Informatica is through the Informatica My Support
Portal at https://mysupport.informatica.com. The My Support Portal is the largest online data integration
collaboration platform with over 100,000 Informatica customers and partners worldwide.
As a member, you can:
- Search the Knowledge Base, find product documentation, access how-to documents, and watch support videos.
- Find your local Informatica User Group Network and collaborate with your peers.
Informatica Documentation
The Informatica Documentation team makes every effort to create accurate, usable documentation. If you
have questions, comments, or ideas about this documentation, contact the Informatica Documentation team
through email at infa_documentation@informatica.com. We will use your feedback to improve our
documentation. Let us know if we can contact you regarding your comments.
The Documentation team updates documentation as needed. To get the latest documentation for your
product, navigate to Product Documentation from https://mysupport.informatica.com.
Informatica Marketplace
The Informatica Marketplace is a forum where developers and partners can share solutions that augment,
extend, or enhance data integration implementations. By leveraging any of the hundreds of solutions
available on the Marketplace, you can improve your productivity and speed up time to implementation on
your projects. You can access Informatica Marketplace at http://www.informaticamarketplace.com.
Informatica Velocity
You can access Informatica Velocity at https://mysupport.informatica.com. Developed from the real-world
experience of hundreds of data management projects, Informatica Velocity represents the collective
knowledge of our consultants who have worked with organizations from around the world to plan, develop,
deploy, and maintain successful data management solutions. If you have questions, comments, or ideas
about Informatica Velocity, contact Informatica Professional Services at ips@informatica.com.
Informatica Global Customer Support
The telephone numbers for Informatica Global Customer Support are available from the Informatica web site
at http://www.informatica.com/us/services-and-training/support-services/global-support-centers/.
CHAPTER 1
Application clients. A group of clients that you use to access underlying Informatica functionality.
Application clients make requests to the Service Manager or application services.
Application services. A group of services that represent server-based functionality. An Informatica domain
can contain a subset of application services. You create and configure the application services that the
application clients require.
Application services include system services that can have a single instance in the domain. When you
create the domain, the system services are created for you. You can configure and enable a system
service to use the functionality that the service provides.
Profile warehouse. A relational database that the Data Integration Service uses to store profile results.
Repositories. A group of relational databases that store metadata about objects and processes required to
handle user requests from application clients.
Service Manager. A service that is built in to the domain to manage all domain operations. The Service
Manager runs the application services and performs domain functions including authentication,
authorization, and logging.
The following table lists the application clients, not including the Administrator tool, and the application services and repositories that each client requires:
Data Analyzer. Application services: Reporting Service. Repositories: Jaspersoft repository.
Informatica Analyst. Application services: Analyst Service, Content Management Service, Data Integration Service, Model Repository Service, Search Service. Repositories: Model repository.
Informatica Developer. Application services: Analyst Service, Content Management Service, Data Integration Service, Model Repository Service. Repositories: Model repository.
Metadata Manager. Repositories: PowerCenter repository.
PowerCenter Client. Repositories: PowerCenter repository.
The following application services are not accessed by an Informatica application client:
PowerExchange Listener Service. Manages the PowerExchange Listener for bulk data movement and
change data capture. The PowerCenter Integration Service connects to the PowerExchange Listener
through the Listener Service.
PowerExchange Logger Service. Manages the PowerExchange Logger for Linux, UNIX, and Windows to
capture change data and write it to the PowerExchange Logger Log files. Change data can originate from
DB2 recovery logs, Oracle redo logs, a Microsoft SQL Server distribution database, or data sources on an
i5/OS or z/OS system.
SAP BW Service. Listens for RFC requests from SAP BI and requests that the PowerCenter Integration
Service run workflows to extract from or load to SAP BI.
Domain administrative tasks. Manage logs, domain objects, user permissions, and domain reports.
Generate and upload node diagnostics. Monitor Data Integration Service jobs and applications. Domain
objects include application services, nodes, grids, folders, database connections, operating system
profiles, and licenses.
CHAPTER 2
Objectives
In this lesson, you complete the following tasks:
Record the domain and the administrator user account information. The domain information provides
address components of the Administrator tool URL, and the user account provides access to the
Administrator tool.
Log in to the Administrator tool. Lessons in this tutorial require that you can log in to the Administrator
tool.
Prerequisites
Before you start this lesson, verify the following prerequisites:
The administrator or user who installed Informatica has provided you with the domain connectivity
information and an administrator user account.
Timing
Set aside 10 to 15 minutes to complete this lesson.
Use the following tables to record the domain information and the administrator user account information:

Domain Name:
Gateway Node Host Name:
Informatica Administrator Port Number:

Administrator Username:
Administrator Password:
Security Domain:
Database Connections
Ask the database administrator to set up databases and user accounts for Informatica repositories. The
database administrator must provide the connection information for the following databases:
Model repository
Profiling warehouse
You will use the database connection information when you create connection objects in another lesson.
Use the following table to record the connection information for each database (Model repository, profiling warehouse, and data object cache):

Database Type (Oracle, IBM DB2, Microsoft SQL Server, or ODBC):
Database Password:
Metadata Access: Connection String:
Environment SQL:
Transaction SQL:
Connection Retry Period:
Domain Name:
Packet Size:
Owner Name:
Schema Name:
Tablespace:
SQL Identifier Character (DOUBLE_QUOTE, SINGLE_QUOTE, BACK_QUOTE, SQUARE_BRACKETS, or QUOTE_EMPTY):
For each service, record the following values:

HTTP Port:
Username:
Password:
Security Domain:
2. In the Address field, enter the following URL for the Administrator tool login page:
http://<fully qualified host name>:<port>/administrator/
The host is the gateway node host name. The port is the Informatica Administrator port number.
3. In the Informatica Administrator login page, enter the user name and password.
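The login URL described above can be assembled from the values that you recorded earlier in this lesson. The following sketch is for illustration only; the host name and port number in the example are hypothetical, not defaults.

```python
# Illustrative sketch: build the Administrator tool login URL from the
# gateway node host name and the Informatica Administrator port number.
# The values used below are hypothetical examples.

def administrator_url(host: str, port: int) -> str:
    """Return the Administrator tool login URL for the given gateway host and port."""
    return f"http://{host}:{port}/administrator/"

print(administrator_url("node01.hypostores.com", 6008))
# http://node01.hypostores.com:6008/administrator/
```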
1. In the Administrator tool header area, click Manage > Change Password.
The Change Password page appears.
2. On the Change Password page, enter the current password in the Current Password box, and the new password in the New Password and Confirm New Password boxes.
3. Click Update.
CHAPTER 3
Model Repository Service. The Model Repository Service manages the Model repository. The Analyst
tool, Data Integration Service, and Administrator tool store metadata in the Model repository.
Data Integration Service. The Data Integration Service is an application service that performs data
integration tasks for the Analyst tool and other external clients.
Analyst Service. The Analyst Service is an application service that runs the Analyst tool. The Analyst
Service manages the connections between service components and the users who access the Analyst
tool.
You associate a Data Integration Service with a profiling warehouse and a data object cache database. When
you create these application services, you select connections to these databases. A database connection is a
domain object that contains connectivity information for a relational database. You create the connection
objects before you create the Data Integration Service and Analyst Service.
Story
An administrator at HypoStores needs to create application services. An analyst needs a Model Repository
Service, Data Integration Service, and Analyst Service to use the Analyst tool.
Objectives
In this lesson, you complete the following tasks:
Create database connections to the profiling warehouse and data object cache database.
Prerequisites
Before you start this lesson, verify the following prerequisites:
You have the database connection information for the Model repository, profiling warehouse, and data
object cache. You gathered this information in the first lesson.
You have a license object in the domain. The Informatica installer creates a license object in the domain.
You need a license object to create application services.
Timing
Set aside 30 to 45 minutes to complete this lesson.
3. On the Navigator Actions menu, click New > Model Repository Service.
The New Model Repository Service - Step 1 of 2 dialog box appears.
4. Enter the following general properties for the Model Repository Service:
Name: Name of the Model Repository Service. The name is not case sensitive and must be unique within the domain. It cannot exceed 128 characters or begin with @. It also cannot contain spaces or the following special characters:
`~%^*+={}\;:'"/?.,<>|!()][
Description: Description of the Model Repository Service. The description cannot exceed 765 characters.
Location: The folder where you want to create the service. If a folder is not specified, the domain name appears in this field.
License: License to assign to the Model Repository Service. Select the license installed with Informatica services.
Node: Node to run the Model Repository Service. Select an existing node in the domain.
5. Click Next.
The New Model Repository Service - Step 2 of 2 dialog box appears.
6. Enter the following database properties for the Model Repository Service:
Database Type: Type of database.
Username:
Password:
Connection String: Use one of the following JDBC connection string formats:
- Microsoft SQL Server: jdbc:informatica:sqlserver://<host_name>:<port_number>;DatabaseName=<database_name>;SnapshotSerializable=true
- Microsoft SQL Server that uses a named instance: jdbc:informatica:sqlserver://<host_name>\<named_instance_name>;DatabaseName=<database_name>;SnapshotSerializable=true
- Oracle: jdbc:informatica:oracle://<host_name>:<port_number>;SID=<database_name>;MaxPooledStatements=20;CatalogOptions=0;BatchPerformanceWorkaround=true
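The connection string formats above can be sketched as a small helper that fills in the placeholders. This is an illustration only, not part of any Informatica API; the host, port, instance, and database names in the example are hypothetical.

```python
# Illustrative sketch: fill in the Model repository connection string
# templates shown above. Host, port, instance, and database names are
# hypothetical examples.

def model_repo_connection_string(db_type, host, database,
                                 port=None, named_instance=None):
    """Return a JDBC connection string in one of the documented formats."""
    if db_type == "sqlserver" and named_instance:
        return (f"jdbc:informatica:sqlserver://{host}\\{named_instance};"
                f"DatabaseName={database};SnapshotSerializable=true")
    if db_type == "sqlserver":
        return (f"jdbc:informatica:sqlserver://{host}:{port};"
                f"DatabaseName={database};SnapshotSerializable=true")
    if db_type == "oracle":
        return (f"jdbc:informatica:oracle://{host}:{port};SID={database};"
                f"MaxPooledStatements=20;CatalogOptions=0;"
                f"BatchPerformanceWorkaround=true")
    raise ValueError(f"unsupported database type: {db_type}")

print(model_repo_connection_string("oracle", "dbhost", "MRS01", port=1521))
```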
8. Select Create New Content to create content for the Model repository in the specified database.
10. Click Finish.
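The naming rules for the service name, no leading @, a 128-character limit, and no spaces or listed special characters, can be expressed as a short check. This is an illustrative sketch, not Informatica code; it covers only the character rules, not uniqueness within the domain.

```python
# Illustrative sketch: validate a service name against the character
# rules listed in the general properties table. Uniqueness within the
# domain is not checked here.

FORBIDDEN_NAME_CHARS = set("`~%^*+={}\\;:'\"/?.,<>|!()][")

def is_valid_service_name(name: str) -> bool:
    if not name or len(name) > 128:
        return False
    if name.startswith("@") or " " in name:
        return False
    return not any(ch in FORBIDDEN_NAME_CHARS for ch in name)

print(is_valid_service_name("MRS_HypoStores"))  # True
print(is_valid_service_name("@BadName"))        # False
```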
5. Select the type of database for the connection:
- DB2
- ODBC
- ORACLE
- SQLSERVER
6. Click OK.
The New Connection - Step 1 of 3 dialog box appears.
7. Enter the following connection properties:
Name: Name of the connection. The name is not case sensitive and must be unique within the domain. The name cannot exceed 128 characters, contain spaces, or contain the following special characters:
~ ` ! $ % ^ & * ( ) - + = { [ } ] | \ : ; " ' < , > . ? /
ID: String that the Data Integration Service uses to identify the connection. The ID is not case sensitive. It must be 255 characters or less and must be unique in the domain. You cannot change this property after you create the connection. Default value is the connection name.
Description: Optional description of the connection. The description cannot exceed 765 characters.
User Name:
Password:
8. Click Next.
The New Connection - Step 2 of 3 dialog box appears.
9. Enter the following properties:
Metadata Access: Connection String: JDBC connection URL used to access metadata from the database. Use one of the following formats:
- IBM DB2: jdbc:informatica:db2://<host name>:<port>;DatabaseName=<database name>
- Oracle: jdbc:informatica:oracle://<host_name>:<port>;SID=<database name>
- Microsoft SQL Server: jdbc:informatica:sqlserver://<host name>:<port>;DatabaseName=<database name>
Data Access: Code Page: Code page used to read from a source database or write to a target database or target file.
10. Click Test Connection to verify that the connectivity information for metadata access is valid.
11. Click Next.
The New Connection - Step 3 of 3 dialog box appears.
12. Enter the following properties:
Environment SQL: SQL commands to set the database environment when you connect to the database. The Data Integration Service runs the connection environment SQL each time it connects to the database.
Transaction SQL: SQL commands to set the database environment when you connect to the database. The Data Integration Service runs the transaction environment SQL at the beginning of each transaction.
Connection Retry Period: The number of seconds that the Data Integration Service tries to reconnect to the database if the connection fails. If the Data Integration Service cannot connect to the database in the retry period, the integration object fails. Default is 0.
Domain Name:
Packet Size: Microsoft SQL Server. The packet size used to transmit data. Used to optimize the native drivers for Microsoft SQL Server.
Owner Name:
Schema Name: Microsoft SQL Server. The name of the schema in the database. You must specify the schema name for the Profiling Warehouse if the schema name is different from the database user name.
Enable Parallel Mode: Oracle. Enables parallel processing when loading data into a table in bulk mode. By default, this option is cleared.
Tablespace:
SQL Identifier Character: The type of quote character used for the Support Mixed Case Identifiers property. Select the quote character based on the database in the connection. The options are: DOUBLE_QUOTE, SINGLE_QUOTE, BACK_QUOTE, SQUARE_BRACKETS, QUOTE_EMPTY.
Support Mixed Case Identifiers: Enables the Developer and Analyst tools to place quotes around table, view, schema, synonym, and column names when they generate and run SQL against these objects in the connection. Use if the objects have mixed-case or lowercase names, or if the object names contain SQL keywords, such as WHERE. By default, this option is disabled.
13. Click Finish.
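The SQL Identifier Character options above can be illustrated with a small quoting helper. The mapping of option names to quote characters is an assumption made for this sketch, with QUOTE_EMPTY taken to mean that identifiers are not quoted; it is not an Informatica API.

```python
# Illustrative sketch: quote an SQL identifier with the selected quote
# character option. The option-to-character mapping is an assumption;
# QUOTE_EMPTY is taken to mean no quoting.

QUOTE_CHARS = {
    "DOUBLE_QUOTE": ('"', '"'),
    "SINGLE_QUOTE": ("'", "'"),
    "BACK_QUOTE": ("`", "`"),
    "SQUARE_BRACKETS": ("[", "]"),
    "QUOTE_EMPTY": ("", ""),
}

def quote_identifier(name: str, option: str) -> str:
    open_q, close_q = QUOTE_CHARS[option]
    return f"{open_q}{name}{close_q}"

print(quote_identifier("Order Details", "SQUARE_BRACKETS"))  # [Order Details]
```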
14. Repeat all the steps to set up connection objects for each of the remaining databases.
You have created database connection objects that you will use when you create application services in the
next lesson.
5. On the New Data Integration Service - Step 1 of 14 page, enter the following properties:
Name: Name of the service. The name is not case sensitive and must be unique within the domain. It cannot exceed 128 characters or begin with @. It also cannot contain spaces or the following special characters:
`~%^*+={}\;:'"/?.,<>|!()][
Description:
Location: Domain and folder where the service is created. Click Browse to choose a different folder. You can move the service after you create it.
License:
Assign: Select Node to configure the service to run on a node. If your license includes grid, you can create a grid and assign the service to run on the grid after you create the service.
Node:
Backup Nodes: If your license includes high availability, nodes on which the service can run if the primary node is unavailable.
Model Repository Service:
Username: User name that the service uses to access the Model Repository Service. Enter the Model repository user that you created. Not available for a domain with Kerberos authentication.
Password: Password for the Model repository user. Not available for a domain with Kerberos authentication.
Security Domain: LDAP security domain for the Model repository user. The field appears when the Informatica domain contains an LDAP security domain. Not available for a domain with Kerberos authentication.
6. Click Next.
The New Data Integration Service - Step 2 of 14 page appears.
7. Enter the HTTP port number to use for the Data Integration Service.
8. Accept the default values for the remaining security properties. You can configure the security properties after you create the Data Integration Service.
11. Click Next.
The New Data Integration Service - Step 3 of 14 page appears.
12. Set the Launch Job Options property to one of the following values:
- In the service process. Configure when you run SQL data service and web service jobs. SQL data service and web service jobs typically achieve better performance when the Data Integration Service runs jobs in the service process.
- In separate local processes. Configure when you run mapping, profile, and workflow jobs. When the Data Integration Service runs jobs in separate local processes, stability increases because an unexpected interruption to one job does not affect all other jobs.
If you configure the Data Integration Service to run on a grid after you create the service, you can configure the service to run jobs in separate remote processes.
13. Accept the default values for the remaining execution options and click Next.
14. If you created the data object cache database for the Data Integration Service, click Select to select the cache connection. Select the data object cache connection that you created for the service to access the database.
15. Accept the default values for the remaining properties on this page and click Next.
The New Data Integration Service - Step 5 of 14 page appears.
16. For optimal performance, enable the Data Integration Service modules that you plan to use.
The following table lists the Data Integration Service modules that you can enable:
Runs workflows.
17. Click Next.
The New Data Integration Service - Step 6 of 14 page appears.
You can configure the HTTP proxy server properties to redirect HTTP requests to the Data Integration
Service. You can configure the HTTP configuration properties to filter the web services client machines
that can send requests to the Data Integration Service. You can configure these properties after you
create the service.
18. Accept the default values for the HTTP proxy server and HTTP configuration properties and click Next.
The New Data Integration Service - Step 7 of 14 page appears.
The Data Integration Service uses the result set cache properties to use cached results for SQL data service queries and web service requests. You can configure the properties after you create the service.
19. Accept the default values for the result set cache properties and click Next.
The New Data Integration Service - Step 8 of 14 page appears.
20. If you created the profiling warehouse database for the Data Integration Service, select the Profiling Service module.
21. If you created the workflow database for the Data Integration Service, select the Workflow Orchestration Service module.
23. Click Next.
The New Data Integration Service - Step 11 of 14 page appears.
24. If you created the profiling warehouse database for the Data Integration Service, click Select to select the database connection. Select the profiling warehouse connection that you created for the service to access the database.
25. If you created a new profiling warehouse database, select No content exists under specified connection string.
26. Click Next.
The New Data Integration Service - Step 12 of 14 page appears.
27. Accept the default values for the advanced profiling properties and click Next.
The New Data Integration Service - Step 14 of 14 page appears.
28. If you created the workflow database for the Data Integration Service, click Select to select the database connection. Select the workflow database connection that you created for the service to access the database.
29. Click Finish.
The domain creates and enables the Data Integration Service.
1. On the Manage tab of the Administrator tool, select the Services and Nodes view.
3. The New Analyst Service - Step 1 of 5 dialog box appears. Configure the properties in the dialog box.
The following table describes the properties:
Name: Name of the service. The name is not case-sensitive and must be unique within the domain. The name cannot exceed 128 characters or begin with @. The name cannot contain character spaces. The characters in the name must be compatible with the code page of the Model repository that you associate with the Analyst Service. The name cannot contain the following special characters:
`~%^*+={}\;:'"/?.,<>|!()][
Description:
Location:
License: License to assign to the Analyst Service. Select the license that you installed with Informatica services.
Node: Node on which the Analyst Service runs. If you change the node, you must recycle the Analyst Service.
4. Click Next.
5. The New Analyst Service - Step 2 of 5 dialog box appears. Configure the properties in the dialog box.
HTTP Port: Port number for the Analyst Service. Use a port number that is different from the HTTP port number for the Data Integration Service.
Enable Secure Communication: Do not configure the HTTPS Port, Keystore File, Keystore Password, or SSL Protocol properties.
Enable Service: Select the option to enable the service.
6. Click Next.
7. The New Analyst Service - Step 3 of 5 dialog box appears. Configure the properties in the dialog box.
The following table describes the properties in step 3:
Model Repository Service: Model Repository Service to associate with the Analyst Service. The Model Repository Service manages the Model repository that the Analyst tool uses. If you update the property to specify a different Model Repository Service, recycle the Analyst Service.
Username:
Password:
Data Integration Service: Optional property. Data Integration Service to associate with the Analyst Service so that you can manage exception record data in the Analyst tool. Select a Data Integration Service that you configured to run workflows. If you update the property to specify a different Data Integration Service, recycle the Analyst Service.
8. Click Next.
9. The New Analyst Service - Step 4 of 5 dialog box appears. Configure the properties in the dialog box.
The following table describes the properties in step 4:
Data Integration Service: Data Integration Service to associate with the Analyst tool so that you can perform data preview, mapping specification, and profile operations in the Analyst tool. If you update the property to specify a different Data Integration Service, recycle the Analyst Service.
Directory of the flat file cache where the Analyst tool stores uploaded flat files. The Data Integration Service must also be able to access this directory. If the Analyst Service and the Data Integration Service run on different nodes, configure the flat file directory to use a shared directory.
Metadata Manager Service: Metadata Manager Service to associate with the Analyst Service so that you can perform data lineage operations on scorecards in the Analyst tool.
10. Click Next.
11. The New Analyst Service - Step 5 of 5 dialog box appears. Configure the property in the dialog box.
The following table describes the property in step 5:
Temporary Export File Directory: Path to the directory to which you export business glossary files from the Analyst tool. Enter a local path on the machine that hosts the Analyst Service. If you configure a directory that does not exist, the Analyst Service creates the directory. Restart the Analyst Service if you change the flat file location.
12. Click Finish.
You created an Analyst Service.
CHAPTER 4
Story
An administrator at HypoStores gets a user account request from an analyst. The analyst needs access to
the Analyst tool to create projects.
Objectives
In this lesson, you complete the following tasks:
Grant user privileges to access the Analyst tool and to create projects in the Analyst tool.
Prerequisites
Before you start this lesson, verify the following prerequisites:
Timing
Set aside 5 to 10 minutes to complete this lesson.
3. Enter the following properties for the user account:
Login Name: Login name for the user account. The login name for a user account must be unique within the security domain to which it belongs. The name is not case sensitive and cannot exceed 128 characters. It cannot include a tab, newline character, or the following special characters:
,+"\<>;/*%?&
The name can include an ASCII space character except for the first and last character. All other space characters are not allowed.
Password: Password for the user account. The password can be from 1 through 80 characters long.
Confirm Password: Enter the password again to confirm. You must retype the password. Do not copy and paste the password.
Full Name: Full name for the user account. The full name cannot include the following special characters:
<>
4. Click OK.
You created a user account that, with the correct privileges, can log in to application clients, such as the Administrator tool or the Analyst tool.
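The login-name character rules above can be checked with a short sketch. This is an illustration only, not Informatica code; it does not check uniqueness within the security domain, and it treats "other space characters" as any Python whitespace other than the ASCII space.

```python
# Illustrative sketch: check a login name against the character rules
# in the table. Uniqueness within the security domain is not checked.

FORBIDDEN_LOGIN_CHARS = set(',+"\\<>;/*%?&')

def is_valid_login_name(name: str) -> bool:
    if not name or len(name) > 128:
        return False
    # Only the ASCII space is allowed, and not as the first or last character.
    if any(ch.isspace() and ch != " " for ch in name):
        return False
    if name[0] == " " or name[-1] == " ":
        return False
    return not any(ch in FORBIDDEN_LOGIN_CHARS for ch in name)

print(is_valid_login_name("analyst one"))  # True
print(is_valid_login_name("a<b"))          # False
```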
2. In the Users section of the Navigator, select the user that you created in Task 1, "Create a User."
4. Click Edit.
The Edit Roles and Privileges dialog box appears.
6. Select the Create Projects privilege.
7. Click OK.
CHAPTER 5
When you view a request for an SQL data service, the contents panel displays information such as the request ID, state, and elapsed time.
such as the request ID, state, and elapsed time.
You can also view event logs for the Data Integration Service that runs the jobs and applications. The event
logs show the service activity, any errors encountered, and the severity of the errors.
Story
An administrator at HypoStores wants to view the status of jobs, a workflow, and SQL data services running
on a Data Integration Service.
Objective
In this lesson, you complete the following tasks:
View jobs that are running or previously ran on a Data Integration Service to check for failures.
View connections to an SQL data service to check for active and timed out connections.
View requests for an SQL data service and a web service to monitor requests.
Monitor a workflow instance that ran on a Data Integration Service to check for failures.
View event logs for a Data Integration Service to check for service errors.
Prerequisites
Before you start this lesson, verify the following prerequisites:
An analyst runs profile and scorecard jobs on the Data Integration Service in the domain.
A developer runs an SQL data service and a workflow on the Data Integration Service in the domain.
Timing
Set aside 10 to 15 minutes to complete this lesson.
3. In the Navigator, expand a Data Integration Service and then select Jobs.
The contents panel displays jobs for the selected Data Integration Service.
4. Select the workflow instance in the contents panel to view more details about the workflow instance.
The details panel shows general properties and recovery properties defined for the workflow instance. It also shows error messages associated with the workflow run.
5. In the contents panel, expand the workflow instance to view tasks and gateways defined for the workflow.
7. If a workflow failed, view the error messages associated with the task or gateway to determine the cause of the failure.
4. From the Service Name list, select the Data Integration Service that runs the jobs and applications that you want to monitor.