
Performance Tuning

Version 8.6

Ben Lee
Informatica Professional Services
Senior Consultant
Mid-Atlantic IUG – October 8, 2009

1
Agenda

• Memory optimization
• Performance tuning methodology
• Tuning source, target, & mapping bottlenecks
• Pipeline partitioning
• Server Grid
• Q&A
• Course evaluation

2
Anatomy of a Session

Integration Service → Data Transformation Manager (DTM)

(Diagram: the READER thread moves source data into the DTM buffer; the TRANSFORMER thread processes it using the transformation caches; the WRITER thread writes the data to the target.)

3
DTM Buffer

• Temporary storage area for data
• Buffer is divided into blocks
• Buffer size and block size are tunable
• Default setting for each is Auto
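A common rule of thumb (illustrative only; the exact formulas and the 90% factor are assumptions, so verify against the documentation for your PowerCenter version) is that a session needs roughly two buffer blocks per source and target, while the usable block count comes from the buffer size divided by the block size:

```python
# Rough DTM buffer sizing sketch. The constants here (2 blocks per
# source/target, ~90% of the buffer usable for data blocks) are
# commonly cited rules of thumb, not official Informatica formulas.

def required_blocks(num_sources: int, num_targets: int) -> int:
    """Minimum buffer blocks a session typically needs."""
    return (num_sources + num_targets) * 2

def available_blocks(buffer_size_bytes: int, block_size_bytes: int) -> int:
    """Approximate number of data blocks the DTM buffer can hold."""
    return int(0.9 * buffer_size_bytes / block_size_bytes)

# Example: 2 sources, 1 target, 12 MB buffer, 64 KB blocks
need = required_blocks(2, 1)                      # 6 blocks
have = available_blocks(12 * 1024**2, 64 * 1024)  # 172 blocks
print(need, have, have >= need)
```

If `have` comes out below `need`, the session will struggle regardless of thread tuning, which is why buffer size and block size are the first settings to check.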

4
Reader Bottleneck

Transformer & writer threads wait for data

(Diagram: a slow READER starves the DTM buffer, so the TRANSFORMER and WRITER threads sit idle, waiting for data.)

5
Transformer Bottleneck

Reader waits for free blocks; writer waits for data

(Diagram: a slow TRANSFORMER leaves the READER waiting for free buffer blocks and the WRITER waiting for data.)

6
Writer Bottleneck

Reader & transformer wait for free blocks

(Diagram: a slow WRITER fills the DTM buffer, so the READER and TRANSFORMER threads wait for free blocks.)

7
Large Commit Interval

(Diagram: the READER and TRANSFORMER wait while the WRITER holds rows in the DTM buffer.)

Target rows remain in the buffers until the DTM reaches the commit point

8
Tuning the DTM Buffer

• Temporary slowdowns in reading, transforming or writing may cause large fluctuations in throughput
• A “slow” thread typically provides data in spurts
• Extra memory blocks can act as a “cushion”, keeping other threads busy in case of a bottleneck

9
Transformation Caches

• Temporary storage area for certain transformations
• Except for Sorter, each is divided into a Data & Index Cache
• The size of each transformation cache is tunable
• If runtime cache requirement > setting, overflow is written to disk
• The default setting for each cache is Auto

10
Max Memory for Transformation Caches

Only applies to transformation caches set to Auto

11
Max Memory for Transformation Caches

• Two settings: fixed number & percentage
• System uses the smaller of the two
• If either setting is 0, the DTM assigns a default size to each transformation cache that’s set to Auto
• Recommendation: use the fixed limit if this is the only session running; otherwise, use the percentage
• Use the percentage if running in a grid or HA environment
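The interaction of the two limits can be sketched as follows. This is a minimal model of the rules stated above (smaller of the two wins; 0 means per-transformation defaults apply), not actual Integration Service code:

```python
# Sketch of the Auto-cache memory ceiling. Assumption: the percentage
# is applied to the memory available to the DTM process.

def auto_cache_limit_mb(fixed_mb: float, pct: float, process_memory_mb: float):
    """Effective ceiling for Auto transformation caches, or None when
    either setting is 0 (the DTM then uses default sizes per cache)."""
    if fixed_mb == 0 or pct == 0:
        return None  # fall back to per-transformation defaults
    return min(fixed_mb, pct * process_memory_mb / 100)

# 512 MB fixed limit vs. 5% of a 4 GB process: the 5% (204.8 MB) wins
print(auto_cache_limit_mb(512, 5, 4096))
```

This is why the percentage form is safer on a grid or HA node, where the memory actually available to a given service process can vary.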

12
Tuning the Transformation Caches

• If a cache setting is too small, the DTM writes overflow to disk
• Determine if transformation caches are overflowing:
  • Watch the cache directory on the file system while the session runs
  • Use the session performance counters
• Options to tune:
  • Increase the maximum memory allowed for Auto transformation cache sizes
  • Set the cache sizes for individual transformations manually

13
Performance Counters

14
Tuning the Transformation Caches

• Non-zero counts for readfromdisk and writetodisk indicate sub-optimal settings for the transformation index or data caches
• This may indicate the need to tune transformation caches manually
• Any manual setting allocates memory outside of the previously set maximum
• Cache Calculators provide guidance in manual tuning of transformation caches

15
Aggregator Caches

• Unsorted Input
  • Must read all input before releasing any output rows
  • Index cache contains group keys
  • Data cache contains non-group-by ports
• Sorted Input
  • Releases output rows as each input group is processed
  • Does not require data or index caches (both = 0)
  • May run much faster than unsorted, BUT you must consider the expense of sorting
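The memory difference between the two modes can be illustrated in plain Python (this is a pedagogical sketch of the streaming idea, not Informatica internals): unsorted aggregation must hold every group until all input is read, while sorted input lets each group be emitted as soon as its key changes.

```python
# Why sorted input lets the Aggregator stream its output.
from itertools import groupby

rows = [("A", 1), ("A", 2), ("B", 5), ("B", 1), ("C", 7)]  # sorted on key

# Unsorted-style aggregation: every group stays in memory (the analogue
# of the index + data caches) until all input has been consumed.
totals = {}
for key, amount in rows:
    totals[key] = totals.get(key, 0) + amount

# Sorted-style aggregation: a group is complete the moment the key
# changes, so it can be released immediately with no group cache.
streamed = [(key, sum(v for _, v in grp))
            for key, grp in groupby(rows, key=lambda r: r[0])]

print(totals)    # {'A': 3, 'B': 6, 'C': 7}
print(streamed)  # [('A', 3), ('B', 6), ('C', 7)]
```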

16
Joiner Caches: Unsorted Input

Staging algorithm:
• All master data is loaded into cache
• Specify the smaller data set as master
• Index cache contains join keys
• Data cache contains non-key connected outputs

(Diagram: the MASTER input is cached in full; DETAIL rows stream past the cache.)

17
Joiner Caches: Sorted Input

Streaming algorithm:
• Both inputs must be sorted on the join keys
• Selected master data is loaded into cache
• Specify the data set with the fewest records under a single key as master
• Index cache contains up to 100 keys
• Data cache contains the non-key connected outputs associated with those 100 keys

(Diagram: sorted MASTER and DETAIL inputs are merged at the joiner.)
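The reason only a bounded window of master keys needs caching is the sort-merge idea: with both inputs ordered on the key, master rows whose keys have already been passed can be discarded. A simplified inner-join sketch (illustrative only, not Informatica's actual implementation):

```python
# Streaming (sort-merge) inner join of two key-sorted inputs.

def merge_join(master, detail):
    """Join key-sorted lists of (key, value) pairs on equal keys."""
    out, i = [], 0
    for d_key, d_val in detail:
        while i < len(master) and master[i][0] < d_key:
            i += 1                     # master keys already passed: discard
        j = i
        while j < len(master) and master[j][0] == d_key:
            out.append((d_key, master[j][1], d_val))  # match within window
            j += 1
    return out

master = [("A", 1), ("B", 2)]
detail = [("A", "x"), ("A", "y"), ("B", "z")]
print(merge_join(master, detail))
# [('A', 1, 'x'), ('A', 1, 'y'), ('B', 2, 'z')]
```

Note that only the master rows sharing the current detail key are revisited, which is why picking the data set with the fewest records under a single key as master keeps the cached window small.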

18
Lookup Caches

• To cache or not to cache?
  • Large number of invocations – cache
  • Large lookup table – don’t cache
  • Flat file lookups are always cached

19
Rank Caches

• Index cache contains group keys
• Data cache contains non-group-by ports
• Cache sizes are related to the number of groups & the number of ranks

20
Sorter Cache

• Sorter Transformation
  • May be faster than a DB sort or a 3rd-party sorter
  • An index read from the RDB = pre-sorted data
  • SQL SELECT DISTINCT may reduce the volume of data across the network versus a Sorter with the “Distinct” property set
• Single cache (no separation of index & data)

21
64-bit vs. 32-bit OS

• Take advantage of large memory support in 64-bit
• Cache-based transformations like Sorter, Lookup, Aggregator, Joiner, and XML Target can address larger blocks of memory

22
Performance Tuning Methodology

• It is an iterative process
• Establish benchmark
• Optimize memory
• Isolate bottleneck
• Tune bottleneck
• Take advantage of under-utilized CPU & memory

23
The Production Environment

• Multi-vendor, multi-system environment with many components:
  • Operating systems, databases, networks and I/O
  • Usually need to monitor performance in several places
  • Usually need to monitor outside Informatica as well

(Diagram: PowerCenter, the OS, the DBMS, and the LAN/WAN, each attached to multiple disks.)

24
Preliminary Steps

• Eliminate transformation errors & data rejects: “first make it work, then make it faster”
• Source row logging requires the reader to hold onto buffers until the data is written to the target, EVEN IF THERE ARE NO ERRORS; this can significantly increase the DTM buffer requirement
• You may want to set stop on errors to 1

25
Preliminary Steps

• Override the tracing level to terse or normal
  • Override at the session level to avoid having to examine each transformation in the mapping
  • Only use verbose tracing during development & only with very small data sets
  • If you expect row errors that you will not need to correct, avoid logging them by overriding the tracing level to terse (not recommended as a long-term error-handling solution)

26
Benchmarking

• Hardware (CPU bandwidth, RAM, disk space, etc.) should be similar to production
• Database configuration should be similar to production
• Data volume should be similar to production
• Challenge: production data is constantly changing
  • Optimal tuning may be data dependent
  • Estimate “average” behavior
  • Estimate “worst case” behavior

27
Identifying Bottlenecks

• The first challenge is to identify the bottleneck:
  • Target
  • Source
  • Transformations
  • Mapping/Session
• Tuning the most severe bottleneck may reveal another one
• This is an iterative process

28
Thread Statistics

• The DTM spawns multiple threads
• Each thread has busy time & idle time
• Goal – maximize the busy time & minimize the idle time

29
Thread Statistics - Terminology

• A pipeline consists of:
  • A source qualifier
  • The sources that feed that source qualifier
  • All transformations and targets that receive data from that source qualifier

30
Thread Statistics - Terminology

A pipeline on the master input of a joiner terminates at the joiner

(Diagram: PIPELINE 1 feeds the MASTER input of the joiner; PIPELINE 2 feeds the DETAIL input.)

31
Using Thread Statistics

• By default, PowerCenter assigns a partition point at each Source Qualifier, Target, Aggregator and Rank.

(Diagram: partition points split the mapping into four stages – Reader Thread (First Stage), Transformation Thread (Second Stage), Transformation Thread (Third Stage), Writer Thread (Fourth Stage).)

32
Target Bottleneck

• The Aggregator transformation stage is waiting for target buffers

Reader Thread (First Stage): Busy% | Transformation Thread (Second Stage): Busy% | Transformation Thread (Third Stage): Busy%=15 | Writer Thread (Fourth Stage): Busy%=95

33
Transformation Bottleneck

• Both the reader & writer are waiting for buffers

Reader Thread (First Stage): Busy%=15 | Transformation Thread (Second Stage): Busy%=60 | Transformation Thread (Third Stage): Busy%=95 | Writer Thread (Fourth Stage): Busy%=10

34
Integration Service Monitor in WFMonitor

35
Other Methods of Bottleneck Isolation

• Write to a flat file:
  if significantly faster than the relational target – Target Bottleneck
• Place a FALSE Filter right after the Source Qualifier:
  if significantly faster – Transformation Bottleneck
• If target & transformation bottlenecks are ruled out – Source Bottleneck

36
Session Statistics in WFMonitor

37
Target Optimization

• Target optimization often involves non-Informatica components
• Drop indexes and constraints
  • Use pre/post SQL to drop and rebuild
  • Use pre/post-load stored procedures
• Use constraint-based loading only when necessary

38
Target Optimization

• Use Bulk Loading
  • Informatica bypasses the database log
  • Target cannot perform rollback
  • Weigh the importance of performance over recovery
• Use an External Loader
  • Similar to the bulk loader, but the DB reads from a flat file

39
Target Optimization

• “Update else insert” session property
  • Works well if you rarely insert
  • An index is required for the update key but slows down inserts
  • PowerCenter must wait for the database to return an error before inserting
• Alternative – a lookup followed by an update strategy

40
Source Bottlenecks

• Source optimization often involves non-Informatica components
  • Generated SQL is available in the session log
  • Execute it directly against the DB
  • Update statistics on the DB
  • Use the tuned SELECT as a SQL override
• Set the Line Sequential Buffer Length session property to correspond with the record size

41
Source Bottlenecks

• Avoid transferring more than once from a remote machine
• Avoid reading the same data more than once
• Filter at the source if possible (reduce the data set)
• Minimize connected outputs from the source qualifier
  • Only connect what you need
  • The DTM only includes connected outputs when it generates the SQL SELECT statement

42
Reduce Data Set

• Remove Unnecessary Ports
  • Not all ports are needed
  • Fewer ports = better performance & lower memory requirements
• Reduce Rows in the Pipeline
  • Place a Filter transformation as far upstream as possible
  • Filter before an aggregator, rank, or sorter if possible
  • Filter in the source qualifier if possible

43
Expression Language Tips

• Functions are more expensive than operators
  • Use || instead of CONCAT()
• Use variable ports to factor out common logic

44
Expression Language Tips

• Simplify nested functions when possible

instead of:
IIF(condition1, result1, IIF(condition2, result2, IIF(…)))

try:
DECODE(TRUE,
  condition1, result1,
  …
  conditionN, resultN)
45
General Guidelines

• High precision (session property) is expensive but only applies to the “decimal” data type
• UNICODE requires 2 bytes per character; ASCII requires 1 byte per character
  • The performance difference depends on the number of string ports only
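The byte-width difference is easy to see with encodings of comparable width (an approximation: PowerCenter's Unicode data-movement mode is two bytes per character in the sense shown here, but the exact internal encoding is not specified on this slide):

```python
# Byte cost of the two data-movement modes, approximated with Python
# encodings: 1 byte per character for ASCII, 2 for a UCS-2-style mode.
s = "performance"
ascii_bytes = len(s.encode("ascii"))        # 11 bytes
unicode_bytes = len(s.encode("utf-16-le"))  # 22 bytes, 2 per character
print(ascii_bytes, unicode_bytes)
```

Doubling every string port's width is why sessions with many wide string ports are the ones that feel the switch to Unicode mode.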

46
Transformation Specific

Reusable Sequence Generator – Number of Cached Values property

• Purpose: enables different sessions to share the same sequence without generating the same numbers
• >0: allocates the specified number of values & updates the current value in the repository at the end of each block (each session gets a different block of numbers)
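The block-allocation idea can be modeled as follows. This is a hypothetical sketch of the behavior described above, not the actual repository protocol:

```python
# Model of "Number of Cached Values": each session reserves a block of
# values and the shared current value jumps past it, so concurrent
# sessions can never hand out the same number.

class SharedSequence:
    def __init__(self, start: int = 1, cached_values: int = 1000):
        self.current = start        # the repository's "current value"
        self.block = cached_values

    def reserve_block(self) -> range:
        """A session calls this once to claim its own range of values."""
        first = self.current
        self.current += self.block  # repository updated for the next caller
        return range(first, first + self.block)

seq = SharedSequence(start=1, cached_values=1000)
session_a = seq.reserve_block()   # values 1..1000
session_b = seq.reserve_block()   # values 1001..2000
print(set(session_a).isdisjoint(session_b))  # True: no duplicate keys
```

The trade-off implied by the slide: a larger block means fewer repository updates, but values left unused at the end of a run are skipped.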

47
Other Transformations

• Normalizer
  • This transformation INCREASES the number of rows
  • Place it as far downstream as possible
• XML Reader / Mid-Stream XML Parser
  • Remove groups that are not projected
  • Memory is not allocated for these groups, but PK/FK relationships still need to be maintained
  • Don’t leave port lengths as infinite; use appropriate lengths

48
Iterative Process

• After tuning your bottlenecks, revisit memory optimization
• Tuning often REDUCES memory requirements (you might even be able to change some settings back to Auto)
• Change one thing at a time & record your results

49
Partitioning

• Apply after optimizing source, target, & transformation bottlenecks
• Apply after optimizing memory usage
• Exploit under-utilized CPU & memory
• To customize partitioning settings, you need the partitioning license

50
Rules for Adding Partition Points

• You cannot add a partition point to a Sequence Generator
• You cannot add a partition point to an unconnected transformation
• You cannot add a partition point on a source definition
• If a pipeline is split and then concatenated, you cannot add a partition point on any transformation between the split and the concatenation
• Adding or removing partition points requires the partitioning license

51
Guidelines for Adding Partition Points

• Make sure you have ample CPU bandwidth
• Make sure you have gone through other optimization techniques
• Add on complex transformations that could benefit from additional threads
• If you have >1 partition, add where data needs to be re-distributed:
  • Aggregator, Rank, or Sorter, where data must be grouped
  • Where data is distributed unevenly
  • On partitioned sources and targets

52
Partition Points & Partitions

• Partitions subdivide the data
• Each partition represents a thread within a stage
• Each partition point distributes the data among the partitions

(Diagram: with 3 partitions, each stage runs 3 threads – 3 reader threads (First Stage), 3 transformation threads (Second Stage), 3 more transformation threads (Third Stage), and 3 writer threads (Fourth Stage).)

53
Rules for Adding Partitions

• The master input of a joiner can only have 1 partition unless you add a partition point at the joiner
• A pipeline with an XML target can only have 1 partition
• If the pipeline has a relational source or target and you define n partitions, each database must support n parallel connections
• A pipeline containing a custom or external procedure transformation can only have 1 partition unless those transformations are configured to allow multiple partitions

54
Rules for Adding Partitions

• The number of partitions is constant on a given pipeline
  • If you have a partition point on a Joiner, the number of partitions on both inputs will be the same
• At each partition point, you can specify how you want the data distributed among the partitions (this is known as the partition type)

55
Cache Partitioning

• The DTM may create separate caches for each partition for each cached transformation; this is called cache partitioning
• The DTM treats cache size settings as per partition. For example, if you configure an aggregator with 2 MB for the index cache and 3 MB for the data cache, & you create 2 partitions, the DTM will allocate up to 4 MB & 6 MB total
• The DTM does not partition lookup or joiner caches unless the lookup or joiner itself is a partition point
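The per-partition arithmetic from the example above, spelled out (straightforward multiplication; the only rule is that configured cache sizes apply to each partition, not to the whole transformation):

```python
# Cache settings are per partition, so totals scale with partition count.
index_cache_mb = 2
data_cache_mb = 3
partitions = 2

total_index_mb = index_cache_mb * partitions  # 4 MB
total_data_mb = data_cache_mb * partitions    # 6 MB
print(total_index_mb, total_data_mb)
```

Worth remembering when raising the partition count: total cache memory grows linearly, which can quietly push a session past the host's physical memory.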

56
Monitoring Partitions

• The Workflow Monitor provides runtime details for each partition
• Per partition, you can determine the following:
  • Number of rows processed
  • Memory usage
  • CPU usage
• If one partition is doing more work than the others, you may want to redistribute the data

57
Dynamic Partitioning

• The Integration Service can automatically set the number of partitions at runtime
• Useful when the data volume increases or the number of CPUs available changes
• The basis for the number of partitions is specified as a session property

58
Concurrent Workflow Execution (8.5)

• Prior to 8.5:
  • Only one instance of a workflow could run
  • Users duplicated workflows – maintenance issues
  • Concurrent sessions required duplicating the session

59
Concurrent Workflow Execution

• Allows workflow instances to be run concurrently
• Override parameters/variables across run instances
• Same scheduler across multiple instances
• Supports independent recovery/failover semantics

60
Workflow on Grid (WonG)

• The Integration Service is deployed on a grid – an IS service process (pmserver) runs on each node in the grid
• Allows tasks of a workflow to be distributed across the grid – no user configuration necessary if all nodes are homogeneous

61
Load Balancer Modes

• Round Robin
  • Honors Max Number of Processes per Node
• Metric-based
  • Evaluates nodes in round-robin order
  • Honors resource provision thresholds
  • Uses stats from the last 3 runs – if no statistics have been collected yet, defaults are used (40 MB memory, 15% CPU)

62
Session on Grid (SonG)

• The session is partitioned and dispatched across multiple nodes
• Allows unlimited scalability
• Sources and targets may be on different nodes
• More suited for large sessions
• Smaller machines in a grid are a lower-cost option than large multi-CPU machines

63
Configuring Session on Grid

• Enable the Session on Grid attribute on the session configuration tab
• Assign the workflow to be executed by an Integration Service that has been assigned to a grid

64
SonG Partitioning Guidelines

• Set # of partitions = # of nodes to get an even distribution
  • Tip: use the dynamic partitioning feature to ease expansion of the grid
• In addition, continue to create partition points to achieve parallelism

65
SonG Partitioning Guidelines

• To minimize data traffic across nodes:
  • Use the pass-through partition type, which will try to keep transformations on the same node
  • Use a resource map to dispatch the source and target transformations to the node where the source or target is located
  • Keep the target files unmerged whenever possible (e.g. if being used for staging)
• Resource requirements should be specified at the lowest granularity, e.g. transformation instead of session (as far as possible)
  • This will ensure better distribution in SonG

66
File Placement Best Practices

• Files that should be placed on a high-bandwidth shared file system (CFS / NAS):
  • Source files
  • Lookup source files [sequential file access]
  • Target files [sequential file access]
  • Persistent cache files for lookup or incremental aggregation [random file access]
• Files that should be placed on a shared file system where the bandwidth requirement is low (NFS):
  • Parameter files
  • Other configuration files
  • Indirect source or target files
  • Log files

67
File Placement Best Practices

• Files that should be put on local storage:
  • Non-persistent cache files (i.e. sorter temporary files)
  • Intermediate target files for sequential merge
  • Other temporary files created during session execution
  • $PmTempFileDir should point to a local file system
• For best performance, ensure sufficient bandwidth for the shared file system and local storage (possibly by using additional disk I/O controllers)

68
Data Integration Certification Path
Level: Informatica Certified Administrator
  Recommended training: PowerCenter QuickStart (eLearning); PowerCenter 8.5+ Administrator (4 days)
  Required exams: Architecture & Administration; Advanced Administration

Level: Informatica Certified Developer
  Recommended training: PowerCenter QuickStart (eLearning); PowerCenter 8.5+ Administrator (4 days); PowerCenter Developer 8.x Level I (4 days); PowerCenter Developer 8 Level II (4 days)
  Required exams: Architecture & Administration; Mapping Design; Advanced Mapping Design

Level: Informatica Certified Consultant
  Recommended training: PowerCenter QuickStart (eLearning); PowerCenter 8.5+ Administrator (4 days); PowerCenter Developer 8.x Level I (4 days); PowerCenter Developer 8 Level II (4 days); PowerCenter 8 Data Migration (4 days); PowerCenter 8 High Availability (1 day)
  Required exams: Architecture & Administration; Advanced Administration; Mapping Design; Advanced Mapping Design; Enablement Technologies

Additional Training: PowerCenter 8.5 New Features; PowerCenter 8.6 New Features; PowerCenter 8 Upgrade; PowerCenter 8 Team-Based Development; PowerCenter 8.5 Unified Security

69
Q&A

Thomas Bennett
Informatica Professional Services
Senior Consultant

70
Appendix
Informatica Services by Solution

71
B2B Data Exchange
Recommended Services
Professional Services
  Strategy Engagements: B2B Data Transformation Architectural Review
  Baseline Engagements: B2B Data Transformation Baseline Architecture
  Implement Engagements: B2B Full Project Lifecycle; Transaction/Customer/Payment Hub

Education Services
  Recommended Courses: Informatica B2B Data Transformation (D); Informatica B2B Data Exchange (D)

Target Audience for Courses: D = Developer, M = Project Manager, A = Administrator
72
Data Governance
Recommended Services

Professional Services
  Strategy Engagements: Informatica Environment Assessment Service; Metadata Strategy and Enablement; Data Quality Audit
  Baseline Engagements: Data Governance Implementation; Metadata Manager Quick Start; Informatica Data Quality Baseline Deployment
  Implement Engagements: Metadata Manager Customization; Data Quality Management Implementation

Education Services
  Recommended Courses: PowerCenter Level I Developer (D); Informatica Data Explorer (D); Informatica Data Quality (D)
  Related Courses: PowerCenter Administrator (A); Metadata Manager (D)
  Certifications: PowerCenter; Data Quality
73
Data Migration
Recommended Services
Professional Services
  Strategy Engagements: Data Migration Readiness Assessment; Informatica Data Quality Audit
  Baseline Engagements: PowerCenter Baseline Deployment; Informatica Data Quality (IDQ) and/or Informatica Data Explorer (IDE) Baseline Deployment
  Implement Engagements: Data Migration Jumpstart; Data Migration End-to-End Implementation

Education Services
  Recommended Courses: Data Migration (M); Informatica Data Explorer (D); Informatica Data Quality (D); PowerCenter Level I Developer (D)
  Related Courses: PowerExchange Basics (D); PowerCenter Administrator (A)
  Certifications: PowerCenter; Data Quality
74
Data Quality
Recommended Services
Professional Services
  Strategy Engagements: Data Quality Management Strategy; Informatica Data Quality Audit
  Baseline Engagements: Informatica Data Quality (IDQ) and/or Informatica Data Explorer (IDE) Baseline Deployment; Informatica Data Quality Web Services Quick Start
  Implement Engagements: Data Quality Management Implementation

Education Services
  Recommended Courses: Informatica Data Explorer (D); Informatica Data Quality (D)
  Related Courses: Informatica Identity Resolution (D); PowerCenter Level I Developer (D)
  Certifications: Data Quality
75
Data Synchronization
Recommended Services
Professional Services
  Strategy Engagements: Project Definition and Assessment
  Baseline Engagements: PowerExchange Baseline Architecture Deployment; PowerCenter Baseline Architecture Deployment
  Implement Engagements: Data Synchronization Implementation

Education Services
  Recommended Courses: PowerCenter Level I Developer (D); PowerCenter Level II Developer (D); PowerCenter Administrator (A)
  Related Courses: PowerExchange Basics Oracle Real-Time CDC (D); PowerExchange SQL RT (D); PowerExchange for MVS DB2 (D)
  Certifications: PowerCenter
76
Enterprise Data Warehousing
Recommended Services
Professional Services
  Strategy Engagements: Enterprise Data Warehousing (EDW) Strategy; Informatica Environment Assessment Service; Metadata Strategy & Enablement
  Baseline Engagements: PowerCenter Baseline Architecture Deployment
  Implement Engagements: EDW Implementation

Education Services
  Recommended Courses: PowerCenter Level I Developer (D); PowerCenter Level II Developer (D); PowerCenter Metadata Manager (D)
  Related Courses: Informatica Data Quality (D); Data Warehouse Development (D)
  Certifications: PowerCenter
77
Integration Competency Centers
Recommended Services
Professional Services
  Strategy Engagements: ICC Assessment
  Baseline Engagements: ICC Master Class Series; ICC Director
  Implement Engagements: ICC Launch; ICC Implementation; Informatica Production Support

Education Services
  Recommended Courses: ICC Overview (M); PowerCenter Level I Developer (D); PowerCenter Administrator (A)
  Related Courses: Metadata Manager (D); Informatica Data Explorer (D); Informatica Data Quality (D)
  Certifications: PowerCenter; Data Quality
78
Master Data Management
Recommended Services
Professional Services
  Strategy Engagements: Master Data Management (MDM) Strategy; Informatica Data Quality Audit
  Baseline Engagements: Informatica Data Explorer (IDE) Baseline Deployment; Informatica Data Quality (IDQ) Baseline Deployment; PowerCenter Baseline Architecture Deployment
  Implementation Engagements: MDM Implementation

Education Services
  Recommended Courses: Informatica Data Explorer (D); Informatica Data Quality (D); PowerCenter Level I Developer (D)
  Related Courses: Metadata Manager (D); Informatica Identity Resolution (D)
  Certifications: PowerCenter; Data Quality
79
Services Oriented Architecture
Recommended Services
Professional Services
  Strategy Engagements: Data Services (SOA) Strategy
  Baseline Engagements: Informatica Web Services Quick Start; Informatica Data Quality Web Services Quick Start
  Implement Engagements: Data Services (SOA) Implementation

Education Services
  Recommended Courses: PowerCenter Level I Developer (D); Informatica Data Quality (D)
  Certifications: PowerCenter; Data Quality
80
Governance, Risk & Compliance (GRC)
Recommended Services

Professional Services
  Strategy Engagements: Informatica Environment Assessment Service; Enterprise Data Warehouse Strategy; Data Quality Audit
  Baseline Engagements: Informatica Data Quality Baseline Deployment; Metadata Manager Quick Start
  Implement Engagements: Risk Management Enablement Kit; Enterprise Data Warehouse Implementation

Education Services
  Recommended Courses: PowerCenter Level I Developer (D); Informatica Data Explorer (D); Informatica Data Quality (D)
  Related Courses: Data Warehouse Development (D); ICC Overview (M); Metadata Manager (D)
  Certifications: PowerCenter; Data Quality
81
Mergers & Acquisitions (M&A)
Recommended Services

Professional Services
  Strategy Engagements: Data Migration Readiness Assessment; Informatica Data Quality Audit
  Baseline Engagements: PowerCenter Baseline Deployment; Informatica Data Quality (IDQ) and/or Informatica Data Explorer (IDE) Baseline Deployment
  Implement Engagements: Data Migration Jumpstart; Data Migration End-to-End Implementation

Education Services
  Recommended Courses: Data Migration (M); PowerCenter Level I Developer (D)
  Related Courses: Informatica Data Explorer (D); Informatica Data Quality (D); PowerExchange Basics (D)
  Certifications: PowerCenter; Data Quality
82
Deliver Your Project Right the First Time with Informatica Professional Services

83
Informatica Global Education Services

Joe Caputo, Director, Pfizer

“We launched an aggressive data migration project that was to be completed in one year. The complexity of the data schema along with the use of Informatica PowerCenter tools proved challenging to our top colleagues.

We believe that Informatica training led us to triple productivity, helping us to complete the project on its original 1-year schedule.”

84
Informatica Contact Information

Informatica Corporation Headquarters
100 Cardinal Way, Redwood City, CA 94063
Tel: 650-385-5000 | Toll-free: 800-653-3871 | Toll-free Sales: 888-635-0899 | Fax: 650-385-5500

Informatica EMEA Headquarters
Informatica Nederland B.V.
Edisonbaan 14a, 3439 MN Nieuwegein
Postbus 116, 3430 AC Nieuwegein
Tel: +31 (0) 30-608-6700
Fax: +31 (0) 30-608-6777

Informatica Asia/Pacific Headquarters
Informatica Australia Pty Ltd
Level 5, 255 George Street, Sydney N.S.W. 2000, Australia
Tel: +612-8907-4400
Fax: +612-8907-4499

Global Customer Support: support@informatica.com
Register at my.informatica.com to open a new service request or to check on the status of an existing SR.

http://www.informatica.com

85
