Ascential
PACK for SAP R/3
Ascential, DataStage, and MetaStage are trademarks of Ascential Software Corporation or its affiliates and may be
registered in the United States or in other jurisdictions.
ABAP, BW, R/3, and SAP are registered trademarks of SAP AG.
Microsoft, Windows, and Windows NT are registered trademarks of Microsoft Corporation in the United States and
other countries.
UNIX is a registered trademark in the United States and other countries, licensed exclusively through X/Open
Company, Ltd.
This product may contain or utilize third party components subject to the DataStage user documentation previously
provided by Ascential Software Corporation or contained herein.
Preface
Organization of This Manual .......................................................................................... vii
Documentation Conventions .......................................................................................... viii
DataStage Documentation ............................................................................................. viii
Chapter 1. Installation
Platforms ........................................................................................................................ 1-1
Software Requirements .................................................................................................. 1-1
Installing the Server Component on Windows .............................................................. 1-3
Installing the Server Component on UNIX ................................................................... 1-4
Installing the Client Component .................................................................................... 1-5
Terminology ................................................................................................................... 1-5
Index
Preface
This manual describes and explains the use of Version 5.0 of the Ascential PACK
for SAP R/3.
If you are new to DataStage, read the DataStage Designer Guide and the DataStage
Manager Guide. These give descriptions of the DataStage Designer and Manager,
and help you get started.
Documentation Conventions
This manual uses the following conventions:
Convention Usage
Bold In syntax, bold indicates commands, function names, and
options. In text, bold indicates keys to press, function names,
and menu selections.
UPPERCASE In syntax, uppercase indicates commands, keywords, and
options; statements and functions; and SQL statements and
keywords.
Italic In syntax, italic indicates information that you supply. In text,
italic also indicates UNIX commands and options, filenames,
and pathnames.
Courier Courier indicates examples of source code and system
output.
Courier Bold In examples, courier bold indicates characters that you type
or keys you press (for example, <Return>).
[ ] Brackets enclose optional items. Do not type the brackets
unless indicated.
{} Braces enclose nonoptional items from which you must
select at least one. Do not type the braces.
itemA | itemB A vertical bar separating items indicates that you can choose
only one item. Do not type the vertical bar.
... Three periods indicate that more of the same type of item can
optionally follow.
➤ A right arrow between menu options indicates you should
choose each option in sequence. For example, “Choose
File ➤ Exit” means you should choose File from the menu
bar, then choose Exit from the File menu.
DataStage Documentation
DataStage core documentation is available online in PDF format. You can read
the manuals with the Adobe Acrobat Reader supplied with DataStage. See Ascential
installation documentation for details on installing the manuals and the Adobe
Acrobat Reader.
Online help is also supplied for DataStage and the Ascential PACK for SAP R/3.
1
Installation
Install server and client components for Version 5 of the Ascential PACK for SAP
R/3 on the DataStage server and client systems respectively. This PACK includes
the following plug-ins and utility:
• ABAP Extract. Lets DataStage extract data from the R/3 Repository using
the ABAP extraction program generated by the plug-in.
• IDoc Extract. Lets DataStage capture IDocs from R/3 source systems to be
used as source data for DataStage job data streams.
• IDoc Load. Generates IDocs to load data into SAP R/3.
• BAPI. Loads data into and extracts data from SAP R/3 Enterprise.
• Administrator for SAP. Manages the configurations of R/3 connection and
IDoc type objects.
Platforms
Because the plug-ins for the Ascential PACK for SAP R/3 are packaged together,
you install all plug-ins (you cannot optionally install an individual plug-in). The
product is distributed in one of the following ways, depending on your platform:
• Windows. One CD-ROM.
• UNIX. Two CD-ROMs. The server components are distributed on a sepa-
rate CD-ROM from the client component. Please ensure that you have also
received the Windows CD-ROM to install the DataStage client components.
Installation 1-1
Software Requirements
For information about configuration requirements for DataStage and the latest
information about DataStage, see the instructions supplied with your DataStage
installation CD and the online readme.txt file for your platform. For other installa-
tion prerequisites, see the respective sections in Chapter 2 for the ABAP Extract
plug-in, Chapter 3 for IDoc Extract, Chapter 4 for IDoc Load, and Chapter 5 for
BAPI.
The following are required for the Ascential PACK for SAP R/3, depending on
your platform:
• Windows. Use one of the following:
Windows NT 4.0 with Service Pack 6A or later
Windows 2000 Professional/Server/Advanced Server with Service Pack 2 or later
Windows XP Professional
• UNIX. Use one of the following:
Sun Solaris 2.7, 2.8
IBM AIX 4.3.3, 5.1
HP HP-UX 11.0, 11i
Red Hat Linux 7.3
HP/Compaq Tru64 (on request)
• SAP R/3 4.0B, 4.5B, 4.6C, 4.7 or later
• SAP RFC library:
DataStage client and server (Windows). SAP RFC client library librfc32.dll
6.10 or later.
DataStage server (UNIX). Thread-safe, shared SAP RFC client library, 6.10 or
later. The library name varies depending on your platform.
If you are installing the client and server components of the plug-ins
on Windows, librfc32.dll must exist in the Windows system directory.
If you install the client on Windows and the server on Unix,
librfc32.dll must still exist on the Windows system. Additionally, you
must ensure that the library (the name depends on the specific UNIX
platform) exists in the <dshome>/lib directory, where <dshome>
corresponds to the DataStage home directory. (We recommend that
the library file reside in your user environment.)
To find and change to your DataStage home directory, enter the
following at the UNIX prompt:
# cd `cat /.dshome`
pwd displays the working directory, for example:
# pwd
/u1/dsadm/Ascential/DataStage/DSEngine
# cd lib
# pwd
/u1/dsadm/Ascential/DataStage/DSEngine/lib
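To confirm that the RFC library is present in this directory, you can
list it. A minimal sketch (the file name pattern librfc* is an
assumption; the actual library name is platform-specific):
# ls -l librfc*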
If the SAPGUI front-end has been installed on your Windows NT
system, this library may already be installed. However, you should
check that it is the correct version.
If the front-end is not installed, see your SAP Administrator to
obtain this library from SAP. See Note 19466 on the SAP Service
Marketplace web site for details about obtaining downloads. See also
“Configuring the SAP/Dispatch/Gateway Service” on page A-12 for
additional configuration requirements.
• DataStage 6.0 or later on the DataStage client and server machines
Install the server component from the CD-ROM as described in the following
sections for Windows NT and UNIX. To install the client component, see
“Installing the Client Component” on page 1-7.
Note: Before installing the Ascential PACK for SAP R/3, you should shut down
all SAP-related software, specifically the SAP Frontend and the SAP
Internet Graphics Server service (if present). Otherwise, errors can
occur during the installation.
Installing the Server Component on UNIX
Installing the server component installs the ABAP Extract for SAP R/3, the IDoc
Extract for SAP R/3, the IDoc Load for SAP R/3, and the BAPI for SAP R/3
plug-ins, described in Parts 1 through 3 of this technical bulletin.
1. Log in to the DataStage server host system as root or dsadm.
2. Mount the CD-ROM containing the server components.
3. Change directories to the mounted CD-ROM.
4. Run the installation script. The format of files on the CD-ROM may
differ depending on your platform. (A sketch of these steps appears
after this list.)
The installation script installs the RFC listener sub-system first. If an entry in
the dsenv file for DSSAPHOME does not exist, you are prompted to add it, for
example:
DSSAPHOME=/u1/dsadm/Ascential/DataStage;export DSSAPHOME
For more information about the dsenv file, see DataStage Install and Upgrade
Guide.
5. Install the plug-ins (they are all automatically installed and all projects
are updated).
See DataStage Manager documentation for information about registering
plug-ins for new projects.
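A minimal sketch of steps 2 through 4 follows. The mount command, mount
point, and script name here are assumptions that vary by platform and
release:
# mount /cdrom
# cd /cdrom
# ./install.sh
If dsenv lacks a DSSAPHOME entry, the script prompts you to add one, as
shown above.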
Depending on your platform, the format of files on the CD may differ, for
example, on AIX, the plug-ins are in /cdrom/packages.
Old DataStage jobs using the plug-ins are upgraded for compatibility with
the new version of the plug-ins.
Note: Do not use dspackinst to perform the initial installation of the plug-ins.
If you must install the ABAP Extract plug-in, you cannot use dspackinst. The plug-
in must be installed by registering it with the DataStage Manager.
To register the ABAP Extract plug-in:
1. From the DataStage client, start the DataStage Manager and log in to the
desired project.
2. Select Tools ➤ Register Plug-In… from the menu bar.
3. In the dialog, browse or enter the path of the plug-in, for example,
dsr3enu.so (or dsr3jpn.so for a Japanese installation).
4. Enter Server\PACKS in the Category field.
5. Select the Parallel stage type required check box.
6. Click OK.
Note: You must use a username and non-blank password for DataStage logon
credentials to use the IDoc plug-ins. This means you must clear Omit on
the Attach to Project dialog in DataStage; otherwise, unexpected results
occur.
Terminology
The following table describes the terms used throughout the Ascential PACK for
SAP R/3 documentation for all plug-ins:
Term Description
ABAP Advanced Business Application Programming. The
language developed by SAP for application development
purposes. All R/3 applications are written in ABAP.
BAPI Business Application Programming Interface. A precisely
defined interface providing access to processes and data
in business application systems. BAPIs are defined as
API methods of SAP objects. These objects and their
methods are stored in the Business Objects Repository.
BOR Business Object Repository, which is the object-oriented
Repository in the SAP BW system. It contains, among
other objects, SAP Business Objects and their methods.
Business Object The representation of a business entity, such as an
employee or a sales order, in the SAP R/3 system.
Control record A special administrative record within an IDoc, one for
each IDoc. The control record contains a standard set of
fields that describe the IDoc as a whole.
ERP Enterprise Resource Planning business management
software.
IDoc Intermediate document. An IDoc is a report, that is, a
hierarchical package of related records, generated by SAP
R/3 in an SAP proprietary format. An IDoc, whose trans-
mission is initiated by the source database, exchanges
data between applications.
IDoc type Named meta data describing the structure of an IDoc that
is shared across databases. It consists of a hierarchy of
segment record types.
MATMAS01 An example of an IDoc type.
PACK Packaged Application Connection Kit. Accesses and
extracts data from and loads data to SAP R/3.
PSA Persistent Staging Area.
R/3 Real-time/three-tier.
RFC Remote Function Call. The SAP implementation of RPC
(Remote Procedure Call) in ABAP. It calls a function
module that runs on a different system from the calling
function. The Remote Function Call can also be called
from within the same system, but usually the caller and
callee are dispersed.
RFM Remote Function Module. A function that belongs to a
BOR object type and has a BAPI method name.
SAP Systems, Applications, and Products in Data Processing.
SAP is a product of SAP AG, Walldorf, Germany.
SCM Supply Chain Management. The solution that tracks
financial, informational, and materials processes and
identifies processing exceptions.
Segment A record within an IDoc that is identified by a segment
number.
Segment type A named record definition for segments within an IDoc
that is one level in the hierarchy of segment types within
an IDoc type.
tRFC port Transactional RFC port.
Variant A collection of predefined criteria, similar to a group of
values used as parameters.
Variants are attached to various processes used by
DataStage, for example, the ABAP program process. The
ABAP program referenced by the ABAP program process
itself has a variant attached to it.
2
The ABAP Plug-In
This chapter describes the ABAP Extract plug-in, which is part of Version 5.0 of the
Ascential PACK for SAP R/3 for DataStage 6.0 or later. Use the ABAP Extract plug-
in to let DataStage extract data from the R/3 Repository using the ABAP extraction
program generated by the plug-in.
It describes the following for the ABAP Extract plug-in:
• Functionality
• Installation prerequisites
• Integrating DataStage with SAP R/3 systems
• Creating a DataStage job
• Defining ABAP Extract stage properties
• Extracting data from SAP R/3
• SAP R/3 table type support
• SAP R/3 data type support
DataStage provides enhanced SAP support with its Packaged Application
Connection Kit (PACK) for SAP R/3.
The ABAP Extract plug-in lets your company maximize its existing investments in
large ERP systems by complementing SAP standard software and consulting
services. It does this by automatically generating the ABAP program to extract
data from the R/3 Repository. The ABAP Extract plug-in lets users of all levels
efficiently build an extraction object, then generate an extraction program written
in the SAP proprietary ABAP programming language. Or, you can use an SQL
query to generate the ABAP program.
See “Terminology” on page 1-7 for a list of the terms used in this chapter.
Functionality
The ABAP Extract plug-in has the following functionality and enhancements:
• Lets you choose and define SAP connections using the GUI. You can select
a DataStage connection to SAP instead of entering the information into the
stage interface.
• Uses the ABAP Program page for all ABAP-related functionality.
• Lets you view the development status of the ABAP program.
• Lets you view referenced and referring tables.
• Lets you use the SQL Query Builder or an extraction object to generate the
ABAP program.
• Lets you overwrite the ABAP program as you save it to R/3.
• Uses the Data Transfer Method page for the functionality of the data
transfer methods.
• Uses the functionality of the former Access page for RFC-connection fail-
ures as you exit the stage editor.
• Lets you synchronize and validate R/3 columns and DataStage columns.
• Supports NLS (National Language Support). For information, see
DataStage Server Job Developer’s Guide.
The following functionality is not supported:
• Data transformations or mappings. Use the Transformer stage to do this.
• The use of subqueries in the WHERE and HAVING clauses for SQL
queries.
The functionality of the ABAP Extract plug-in can be represented in the following
diagram where the ABAP program is automatically generated to extract data from
the R/3 Repository.
(Figure: R/3 Extraction Plug-In Architecture)
Installation Prerequisites
In addition to the software requirements for the Ascential PACK for SAP R/3
described on page 1-2, the following are required for installing the ABAP Extract
plug-in:
• FTP server on the machine where the temporary extraction files from the
SAP R/3 are stored (this must be the same machine as the SAP application
server)
• Database account with read access to the SAP R/3 data dictionary tables
• General. Contains information about connecting to the SAP R/3
server system, and lets you view meta data for the SAP R/3 fields. For
details, see “Defining Output Properties” on page 2-8.
• Columns. Lets you view the meta data of the SAP R/3 field that corre-
sponds to the currently selected DataStage column. You can also
synchronize and validate columns (see “The Output Columns Page” on
page 2-46).
This page works similarly to the General tab of the BW Load Input page for the
BW Load plug-in (for details, see Ascential PACK for SAP BW, 00D-025DS60). You
can use the DataStage server to access the list of connections to SAP, which is stored
in a file.
By default, a newly created stage uses the connection most recently selected by the
current user for other stages of this type. Since the number of R/3 systems accessed
by a given DataStage installation is likely to be small, the default connection is
generally correct.
Enter the following information on the General tab:
• DataStage Connection to SAP. The DataStage connection to the SAP R/3
system that is defined on the DataStage server machine and shared by all
DataStage users connected to that machine. The fields in this area are read-
only and are obtained from the connection that you selected.
Name. The name of the selected connection to the SAP R/3 system that
generates the data to be extracted.
Select… . Click to choose a DataStage connection to the SAP R/3 system.
The selected connection provides all needed connection and default logon
details that are needed to communicate with the corresponding SAP R/3
system. This opens the Select DataStage Connection to SAP dialog (see
“Defining SAP Connection and Logon Details” on page 2-10). You can add
new entries here and modify or delete existing entries from the list of
connections.
Description. Additional information about the selected connection.
Application Server. The name of the host system running R/3.
System Number. The number assigned to the SAP R/3 system used to
connect to R/3.
• SAP Logon Details. User Name, Client Number, and Language default to
the values last entered by the current user for the ABAP Extract stage.
User Name. The user name that is used to connect to SAP.
Password. A password for the specified user name.
Client Number. The number of the client system used to connect to SAP.
Language. The language used to connect to SAP.
• Description. Enter text to describe the purpose of the stage.
• Validate Stage… (formerly Job Validation). Click to open the Validation
Stage dialog. The validation process begins automatically when the dialog
opens.
See “Validating Run-time Jobs” on page 2-49 for details.
Extraction Object, Review Code, and Launch SAP GUI (formerly on the General
page) are now on the ABAP Program page (see “The Output ABAP Program Page”
on page 2-18).
The selected connection provides all needed connection and default logon details
to communicate with the corresponding SAP R/3 system. You can add new entries
here and modify or delete existing entries from the list of connections.
Although SAP logon details for the ABAP Extract plug-in are not stored on the
server, the plug-in can share SAP connections created by the other R/3 plug-ins,
such as the BAPI, IDoc Load, and IDoc Extract plug-ins.
New…, Properties…, and Remove are administrative connection operations.
They let you manage the list of connections that is maintained on the DataStage
server machine from this dialog:
• New… . Click to open the Connection Properties dialog, which lets you
define the properties for the new connection. It is added to the list of
connections. Here you can specify SAP connection details and default
logon details for the new connection. The Connection and Logon Details
page appears by default (see the following section).
• Properties… . Click to open the Connection and Logon Details page of the
Connection Properties dialog, showing the properties of the selected
connection. This is the same dialog that is opened when you click New…,
but in this context the connection name is read-only (see “Defining Connec-
tion Properties” on page 2-10).
• Remove. Click to delete the selected connection after your confirmation.
These administrative operations function similarly to the corresponding buttons
on the DataStage Connections to SAP page of the DataStage Administrator for
SAP utility (see “The Administrator for SAP Utility,” beginning on page 6-1).
fully-qualified node name for your RFC server machine in the SAP router
table.
2. Select Use load balancing to use load balancing when connecting to R/3.
The Application Server and the System Number controls are replaced by
Message Server, System ID, and Group controls so that connection
details specific to load balancing can be entered (see “Load Balancing” on
page 2-13).
3. Specify the CPI-C Connection details. These values are read-only and
match the entries in SAP Connection Details if Use load balancing is
selected. Otherwise, the fields are editable and define a non-load
balancing connection that can be used at run time for CPI-C processing.
Load Balancing
Select Use load balancing on the Connection and Logon Details page of the
Connection Properties dialog (see page 6-5) to balance loads for SQL queries
when connecting to the R/3 system. Load balancing at design time works as
follows:
• R/3 lets you make logon connections through a message server. The
message server uses an algorithm that considers server workload and
availability to choose an appropriate application server to handle the
logon.
• When connections are configured, you can choose a load balancing connec-
tion to a message server rather than a specific R/3 instance to retrieve and
validate IDoc types and meta data, for example.
This page displays the selected Data Transfer Method, which controls how to
process the dataset. Choose one of three data transfer methods:
• CPI-C Logon. Default. Transfers data from the SAP application server
directly to the DataStage server, without using FTP or other utilities.
The Local File option has no effect, that is, no flat files are generated in the
SAP application server. The SAP user name must be the CPI-C user type.
• FTP. Uses the FTP service in the SAP application server to get the dataset
from the SAP application. (The buttons for ABAP program loading in the
former version are now on the ABAP Program page. See page 2-18 for
more information.)
Choosing FTP enables the FTP Logon fields, which are required. Enter the
pathname for the data file on the remote system to use during run time in
Path of Remote File. This pathname is used to reach the data file when you
do not have direct access to it. The data file contains the data extracted
from R/3 and is transferred back to the DataStage server system.
Alias for Remote Path. Optional. Use this field when the data file cannot be
accessed using the pathname in the Path of Remote File field, for example, if
the system administrator has restricted access to a directory in the path. In
this case, use a relative pathname to access the file. If the DataStage server
cannot transfer the data file using the path specified in the Path of Remote
File field, it uses the path in the Alias for Remote Path field to access the file.
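For example (all paths here are illustrative), if the FTP server restricts
logins to a user's home directory, a data file written to an absolute path
may be retrievable only by a path relative to that home directory:
Path of Remote File: /home/ftpuser/extracts/ds_data.dat
Alias for Remote Path: extracts/ds_data.dat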
Alternate Host Name. Specify an alternate host name for FTP to use as an
alias to access files on the application server’s machine (where the ABAP
program runs). This is necessary if, for security reasons, FTP cannot access
files on the application server’s machine using the application server name.
Run the program as a background process. Specifies whether to run ABAP
programs in the background (the default is cleared). For details, see
“Running Programs in the Background” on page 2-15.
• Local File. Use this option if the file containing the dataset cannot be
accessed using FTP. The resultant dataset file can be placed on the
DataStage server machine by the Administrator, and DataStage can then
access it from there using the Local File option.
Choosing this option enables the Local Data File field and Browse… . Type
a name for the data source file, or click Browse… to search for a name. SAP
R/3 needs write access to this file.
Note: When you switch between the CPI-C and FTP data transfer methods
after generating ABAP code, a message tells you to regenerate the code.
• Use SAP Logon Details. If selected for a CPI-C data transfer method, User
Name and Password are read-only and display the corresponding values
from the General page.
The job name is the same as the name of the generated ABAP program. This
helps you easily identify the job.
Your job disappears from the list after the ABAP run time deletes it.
The ABAP Program page covers all ABAP-related functionality. This includes
program generation (formerly done using the Extraction Object dialog on the
Access General page), editing, saving, and validation.
The button is disabled after you generate the program, and all the
previously disabled buttons are enabled, including Edit Program and Editor
Options.
• Load Program to R/3. Appears if you need to load the program to R/3.
• ABAP Workbench. Opens the SAP GUI, replacing Launch SAP GUI on the
General page in the former version.
• Save Program as File… and Clear Program are the other buttons you see,
if appropriate.
• Edit Program… . Lets you edit the program and save the modifications.
• Editor Options… . Opens a dialog that lets you indicate what editor to use
to edit the program.
• Validate Program. Verifies the program syntax. If errors exist, the ABAP
editor for the stage opens and highlights the error, similar to validation
done from the editor.
Note: When you save the ABAP program to R/3, you see a warning if a program
with the same name already exists on the R/3 system (you can overwrite
the program).
You can also click Load Program to R/3, Save Program as File…, Clear Program,
Edit Program…, Editor Options…, or Validate Program, if appropriate.
If you select Build SQL Query as the Generation Method on the Output ABAP
Program page and click Build… , the Build Open SQL Query dialog opens with
the Tables page on top:
The Build Open SQL Query dialog contains the Tables (the default), Select,
Where, Having, Order By, and SQL pages.
The generated SQL is customized to match the specific SQL syntax of the source
database and conforms to the SAP proprietary Open SQL syntax used by ABAP
programs.
The first five pages of the Build Open SQL Query dialog correspond to a clause in
the generated SQL. The SQL page shows the complete SQL statement as
determined by the settings in the preceding pages.
Click > and the selected table appears in the Selected tables list.
Join Type determines the type of join (see the join buttons below) between two or
more specified tables, as follows. The tree representation of join nesting is described later.
• Inner >. Adds the table using inner join.
• Left >. Creates other types of joins.
• Right >. Creates other types of joins.
Click < to remove the selected table or join.
Click << to remove all tables and joins.
Show Related Tables displays tables related to the selected table through foreign
or primary keys. The related tables are listed in two folders that appear in a tree
structure below the selected table. These folders are labelled Referenced Tables
and Referring Tables. When related tables are added to the join tree (Selected
Tables), default join conditions are created automatically based on the key
relationships.
The join condition and join type of the selected join are shown in controls below
the join tree. Through these controls, you can modify the properties of the join.
The fields in Available columns include the Short Text for each field to help you
identify the ones you want.
Click Find to open a dialog that lets you search for a field in Available columns or
Selected columns, depending on where the focus is when you click Find.
Use the Key column checkbox to specify which columns should be used as key
columns for the link.
When you click OK, the entire query is validated. The plug-in validates the syntax
of the Where and Having tabs and verifies the presence of at least one table and at
least one selected column. Errors are reported as warnings, but you can exit the
dialog without fixing the errors.
You can use DataStage job parameters on the Where tab of the SQL Builder by
typing the parameter reference into one of the grid boxes. In this case, the
parameter references are in the standard format that is used elsewhere in
DataStage (#PARAM#, where PARAM is the name of the job parameter).
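For example, a Where grid entry that compares a column to a job parameter
might look like the following (the column name MATNR and the parameter name
MATERIAL are illustrative):
MATNR = #MATERIAL#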
The Open SQL statement is like the statement that appears in the generated ABAP
program, but without the INTO clause.
The SQL page displays the conversion of parameter references to the format
required by ABAP. This conversion is done automatically when the SQL is
generated.
Click OK to accept the SQL statement or Cancel to discard all the input.
About Navigation
When you navigate the SAP R/3 Module Tree or the SAP R/3 Logical Database
Tree, double-click an entry to expand it. Because SAP R/3 is such a large system,
load meta data only when you want to interact with the system.
In the Extraction Object Tree, the root is the highest level and the field is the lowest
level. Since DataStage processes input as a rowset, the second level must only have
one table.
The following navigation guidelines pertain to the Build Extraction Object dialog.
Commonly executed operations have corresponding buttons and shortcut menu
commands.
• You cannot delete the root.
• You can relocate a table by selecting and dragging it to the desired location.
• A table can become a subtree of another table tree.
• You can delete tables by clicking Delete or choosing Delete from the
shortcut menu.
• You can move a column within one table, but you cannot move it outside
that table to another table.
• You cannot move columns to other columns.
• You can delete all or selected columns associated with a table by clicking
Delete, choosing Delete from the shortcut menu, or pressing Alt+e.
Adding a Table
1. Click Add Tables…, then click From Search…, or choose Add Tables from
the Extraction Object Editor shortcut menu. The R/3 Table Searching &
Adding dialog box appears.
2. To find a table, do one of the following:
• Type a full or partial table name using the * or % wildcard in the Table
Name field. For example, you can type t00, t00*, or t00%. You can use * or %
interchangeably in one table name.
• Type a description in the Table Description field. You can type a full or
partial name using a wildcard (* or %). This name is case-sensitive.
• Create a combination of table name and table description values. Use the
AND or OR option buttons to specify the Boolean relationships among the
selection criteria.
Click Search. The Search Results for Table grid is populated, and the dialog
box name is replaced by “Searching result: n entries found.”
3. To view the table definition, click Table Definition, or choose Table
Definition from the Table List Operations shortcut menu. The Table
Definition dialog box appears. Then do any of the following:
• Highlight one or more table columns, then click Add to EO Tree or choose
Add to Extraction Object from the Table Definition Operations shortcut
menu. The columns are added to the Build Extraction Object dialog. Close
the Table Definition dialog.
• Click Print, or choose Print from the Table List Operations shortcut menu
to send the search results to the printer.
• Click Save As, or choose Save As from the Table List Operations shortcut
menu to save the search results to a flat file.
4. To view the table contents, click Table Content, or choose Table Content
from the Table List Operations shortcut menu. The Table Contents
dialog box appears:
The table content list on the left under Select Field for Selection Criteria
displays the associated column details. You can search for a string in the field
name or a description using the Find What field, the Direction list, and Find.
Using Find What is especially helpful for searching large tables.
You must specify selection criteria in order to view the table contents.
a. To add a single value, select the column for which you are specifying the
condition. Select a comparison operator from the Operator list under
Single Value Operation.
Type the comparison value in the Value field. See “Selection Criteria Data
Types” on page 2-47 for information on entering values.
Click Add Single Value. The selection criteria are added to the SQL
Selection Criteria box.
b. You can add a range of values in the same manner. Select the column for
which you are specifying the condition. Select INCLUDE or EXCLUDE
from the Operator list under Range Operation. Type the first value in the
range in the first Range of Values field and the last value in the range in
the second field. See “Selection Criteria Data Types” on page 2-47 for infor-
mation on entering the values. Click Add Range. The criteria is added to
the SQL Selection Criteria box.
Selecting INCLUDE or EXCLUDE produces a condition similar to the
following:
– INCLUDE valueA, valueB produces the condition
field >= min(valueA, valueB)
AND
field <= max(valueA, valueB)
– EXCLUDE valueA, valueB produces the condition
field > max(valueA, valueB)
OR
field < min(valueA, valueB)
You can create a complex condition by using a combination of single and
range values. Use the AND or OR option buttons to specify the Boolean
relationships among the selection criteria.
Note: If you make an error in specifying the condition, you can remove
the condition from the SQL Selection Criteria box by clicking
Clear Condition. This removes the entire selection criteria defini-
tion from the box. You can also manually edit the SQL condition.
c. If you do not specify any fields, this plug-in by default loads every field
back into the Fields for Selection list. If you use RFC to connect to the SAP
application server and the table is large (that is, the record length is large),
nothing may be loaded back. In this case, make sure that the record you
want to view is less than 512 bytes long. After you specify the selection
criteria, click View Contents. A Table Content dialog showing the table
contents appears. The rows are displayed in 500-row lots.
Select one of the following using the buttons or commands on the Table
Content Operations shortcut menu:
• Click Save As… to save the table contents to a flat file using the Save As
dialog box.
• Click Print… to send the table contents to the printer.
Close this dialog and the Table Content dialog to go to the R/3 Table
Searching & Adding dialog box.
5. After you verify that you have the correct table, select the table from the
grid in your R/3 Table Searching & Adding dialog box, and click Add to
Extraction… .
Select one of the following using the buttons or the shortcut menu on the R/3
Table Searching & Adding dialog box:
• Select Save As to save the table columns to a flat file using a standard Save
As dialog box.
• Select Print to send the table columns to the printer.
6. Close the R/3 Table Searching & Adding dialog box. The Build Extrac-
tion Object dialog now shows the added table.
3. The lowest level of any node in a module tree is a table. Right-click a table
to see its table definition or table content, or add it to the Build Extraction
Object dialog. For more information about the table definition and table
content, see “Adding a Table” on page 2-33.
Note: You can select multiple fields or multiple tables at the same time, but you
cannot select both fields and tables at the same time.
3. The lowest level in the logical database tree is a table. Right-click a table
to see its table definition or table content, or add it to the Build Extraction
Object dialog. For more information about the table definition and table
content, see “Adding a Table” on page 2-33.
Note: You can select multiple fields or multiple tables at the same time, but you
cannot select both fields and tables at the same time.
2. Select a column from the Table list. You can search for a string in the field
name as well as a description using the Find What field, the Direction
list, and Find. Using Find What is especially helpful for searching large
tables.
3. Specify a single value for a field, or use the list to specify job parameters.
Click Add Single Value to add the selection criteria to the ABAP SQL
Condition box.
4. Specify a range of values for a field, or use the drop-down list to specify
job parameters. Click Add Range to add the selection criteria to the
ABAP SQL Condition box.
5. Click OK to add the condition to the Build Extraction Object dialog. You
can see the condition statement at the end of the table description in the
tree.
6. If you create a join condition, specify the SQL condition for the child
table. The parent table is displayed in the Join Table list. Double-click the
parent table to list its fields in the Join Field list.
7. Select the column name from the Table list. Select the column on which
you are doing the join from the Join Field list. Click Add Relationship to
add the condition to the ABAP SQL Condition box.
You can specify additional columns for the join condition. Select the columns
in the same manner, and click either AND or OR to specify the Boolean
condition.
8. Click OK to add the join condition to the Build Extraction Object dialog.
The condition is added to the table description in the tree.
Note: If you make an error in specifying the condition, you can remove the
condition from the ABAP SQL Condition box by clicking Clear Condi-
tion. This removes the entire selection criteria definition from the box.
You can also manually edit the SQL condition. You can then review
your SQL statement by clicking View SQL.
9. Highlight any entry, and click Property… on the Build Extraction Object
dialog to look at its properties and values in the Properties window.
2. You can use job parameters in the SQL Condition Builder dialog by
using the syntax: DS_JOB_PARAM@job_parameter. In the following
example, LANG_LOW and LANG_HIGH specify a range of values in the
SQL condition. Likewise, you can use them in a single value or a join
operation.
Click Add Range to add the range condition to the ABAP SQL Condition box.
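The resulting condition might read similar to the following (the field name
SPRAS is illustrative; LANG_LOW and LANG_HIGH are the job parameters from
this example):
SPRAS >= DS_JOB_PARAM@LANG_LOW AND SPRAS <= DS_JOB_PARAM@LANG_HIGH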
3. Generate the ABAP code. Save and compile the job when your job design
is complete. When you run this job, DataStage prompts for values for
each of the job parameters that are defined for this job. At run time, these
values are passed to the ABAP program that performs the data extraction
from SAP R/3.
This dialog customizes the program that is generated by letting you enter lines
of ABAP code that are automatically inserted into the generated program.
This lets you regenerate the program with all your custom insertions after you
modify the SQL query.
This dialog has the following components:
• Additional Program Header Comments
• Additional Commands to Execute just before the Program Starts
• Additional Commands to Execute just before the Program Exits
• Set as Default Options. If selected, the text in the three edit controls
appears as the defaults for these controls when the current user creates new
ABAP Extract stages.
The following list gives each SAP R/3 data type, its description, and how to
enter an example value:
• ACCP. Posting period (YYYYMM). Type the value in single quotation marks,
in the format YYYYMM, for example, ‘199906’.
• CHAR. Character strings. Type the value in single quotation marks, for
example, ‘xyz’.
• CLNT. Client. Type the value in single quotation marks, for example,
‘800’ (3-digit).
• CUKY. Currency key, referenced by CURR fields. Type the value in single
quotation marks, for example, ‘USD’ for US dollars or ‘DEM’ for German marks.
• CURR. Currency field, stored as DEC. Type a numeric value, for example,
500.00.
• DATS. Date field (YYYYMMDD), stored as char(8). Type the value in single
quotation marks, with year, month, and day in the format YYYYMMDD, for
example, ‘19990623’.
• DEC. Counter or amount field with comma and sign. Type a numeric value
without quotation marks, for example, 8.0 or –8.0.
• FLTP. Floating-point number, accurate to 8 bytes. Type a numeric value
without quotation marks, for example, 8.0 or –8.0.
• INT1. 1-byte integer, decimal number <= 254. Type a numeric value without
quotation marks, for example, 1 through 254.
• INT2. 2-byte integer, only used for the length field before VARC or RAW.
Type a numeric value without quotation marks, for example, 1 through 32655.
• INT4. 4-byte integer, decimal number with sign. Type a numeric value
without quotation marks, for example, 1 through 2^32–1.
• LANG. Language key. Type a 1-character language identifier in single
quotation marks, for example, ‘E’ for English or ‘F’ for French.
• LCHR. Long character string, requires a preceding INT2 field. Do not use
this data type for specifying a selection condition.
• LRAW. Long byte string, requires a preceding INT2 field. Do not use this
data type for specifying a selection condition.
• NUMC. Character field with only digits. Type a numeric value without
quotation marks, for example, 8.0 or –8.0.
• QUAN. Quantity field, points to a unit field with format UNIT. Type a
numeric value without quotation marks, for example, 8.0 or –8.0.
• RAW. Uninterpreted sequence of bytes. Do not use this data type for
specifying a selection condition.
• TIMS. Time field (hhmmss), stored as char(6). Type the value in single
quotation marks, with hour, minutes, and seconds in the format hhmmss, for
example, ‘091024’.
• UNIT. Unit key for QUAN fields. Type a value in single quotation marks,
for example, ‘pk’.
Note: DataStage cannot handle an LRAW field that is a cluster in an SAP R/3
database.
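For example, applying the DATS rules above to an INCLUDE range (the field
name ERDAT is illustrative) produces:
INCLUDE ‘19990101’, ‘19991231’
ERDAT >= ‘19990101’ AND ERDAT <= ‘19991231’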
Validation is based on the Data Transfer Method for FTP, CPI-C, or Local File
transfers, which you specify on the Data Transfer Method tab of the Output page.
(The system automatically supplies this data transfer method information for the
Runtime Validations dialog.)
Stage validation also provides a consolidated view of the stage options and
displays the various validation steps. It uses the following graphical status lights
to do this:
• Green - success
• Orange - warning
• Red - failure
In addition, system error and status messages appear beside the affected items in
the form of text tips.
FTP Transfers
For FTP data transfers, the Runtime Validations dialog displays the status of
each validation step.
CPI-C Transfers
You can also perform validations for CPI-C data transfers. For these transfers, the
following operations are considered:
• SAP Connection/Logon. SAP connection and logon are validated using the
details specified on the General tab of the Output page. However, if you
specify CPI-C User Name and Password in the CPI-C logon details section,
these values are used instead. After a successful logon, the system also
determines whether the user type is CPI-C and indicates this with a light.
• Check installed RFC. This component is shipped with the Ascential PACK
for SAP R/3 and must be installed on the SAP R/3 system. The validation
routine checks whether the correct version of RFC (Z_RFC_DS_SERVICE)
is installed and activated on the SAP R/3 system.
• ABAP Options. Based on the method selected to load the ABAP program,
this section requires one of the following validation sequences:
– Design time Load and Manual Load. The validation routine only checks
whether the ABAP program exists in the DataStage Repository.
– Runtime Load. The validation routine also checks whether the ABAP
program is syntactically correct.
3
The IDoc Extract Plug-In
Introduction
The SAP R/3 suite of applications supports ERP (Enterprise Resource Planning),
integrating the supply-chain processes of a company. An IDoc (intermediate
document) is a report, that is, a hierarchical package of related records, generated by
SAP R/3 in an SAP proprietary format. An IDoc, whose transmission is initiated
by the source database, exchanges data between applications. It provides a
standard format for exchanging data with SAP R/3. Individual IDocs contain the
data that make up a business transaction (for example, a sales order) or master data
(for example, material master) and include segment records and a control record.
Part 2 of this technical bulletin describes the DataStage IDoc Extract for SAP R/3
plug-in stage, which lets DataStage capture IDocs from R/3 source systems to be
used as source data for DataStage job data streams. It lets you browse SAP R/3
IDoc meta data, select IDoc types to process, and extract the data from IDocs.
This part of the technical bulletin describes the following for the IDoc Extract for
SAP R/3 plug-in, which works with DataStage 6.0 or later:
• Functionality
• Using DataStage to process SAP IDocs
• Configuration requirements
• Runtime components
• IDoc Extract for SAP R/3 plug-in stage
• Connection to SAP
• Output link definitions
• DataStage Administrator for SAP
• Properties
• File permissions
The IDoc Extract plug-in includes a set of tools and a custom GUI to passively
retrieve and process IDocs generated by SAP R/3. It complements the DataStage
ABAP Extract for SAP R/3, which is described in part 1 of this technical bulletin.
The plug-in uses only standard SAP interfaces to access IDoc meta data and
content. No ABAP code is uploaded to R/3 source systems for IDocs.
NLS (National Language Support) is supported for the IDoc Extract plug-in.
For information about using plug-ins, see the DataStage documentation. See
“Terminology” on page 1-7 for a list of the terms used in this chapter.
Functionality
The IDoc Extract plug-in has the following functionality:
• Retrieval of IDocs generated by SAP R/3 as source data for DataStage job
data streams.
• Simultaneous connections to multiple SAP R/3 instances from DataStage.
• Ability to define a unique directory for each IDoc type and R/3 source
combination.
• IDoc meta data browser capability.
• Automatic and manual job processing modes.
• Automatic re-connections to SAP R/3.
• A separate client utility, the DataStage Administrator for SAP. It manages
and configures R/3 connection and IDoc type properties.
• A persistent staging area (PSA) on the DataStage server for storage of IDocs
retrieved from R/3 source systems.
• Performance features for a high volume of data.
• A mechanism for coordinating the processing of IDocs from the PSA.
These two components provide the IDoc retrieval and processing functions for
the system, as described in the following sections.
Configuration Requirements
DataStage Systems. You need to install the following components on the
DataStage server and client systems:
• DataStage server system:
– Listener sub-system, which installs and registers the listener manager as
a service executable. (See “Listener Sub-System” on page 3-5 for more
information.)
– IDoc Extract plug-in.
• DataStage client system:
– IDoc Extract client GUI.
Note: You must use a username and non-blank password for DataStage logon
credentials to use the IDoc plug-ins. This means you must clear Omit on
the Attach to Project dialog in DataStage; otherwise, unexpected results
occur.
Runtime Components
The IDoc Extract includes the listener sub-system and the DataStage plug-in
components. The following sections describe the listener sub-system.
For information about the plug-in stage and the administrative functions, see
“IDoc Extract for SAP R/3 Plug-in Stage” on page 3-15 and “The Administrator
for SAP Utility” on page 6-1.
Listener Sub-System
The listener sub-system includes the following components:
• Listener manager
• One or more RFC listener servers
This architecture is similar to that of the BW Load PACK (see DataStage Load
PACK for SAP BW Plug-In, 74-0126). It lets multiple R/3 sources deliver IDocs to
DataStage. It runs as a daemon on UNIX and as a service on Windows NT.
Listener Manager
The listener manager detects changes in RFC servers by doing the following:
1. At startup, the manager reads a configuration file that contains parameters
describing R/3 connections, which are defined by the DataStage job designer.
2. For each configured R/3 connection, the listener manager starts an RFC
listener server in a separate background process. The manager ensures
that all its associated listener servers remain active and connected to R/3.
3. If it finds any irregularities, it tries to restart the server in question.
4. Additionally, if connection parameters change while a server is
connected, the manager stops the server and uses the updated connection
parameters to restart it.
5. The listener sub-system writes messages to a log file, one for each
listener. The log file contains error messages encountered by the listener
sub-system as well as any messages reported by R/3. You can view the
contents of the log file manually, or use the DataStage Administrator for
SAP utility. (See “The Administrator for SAP Utility” on page 6-1.)
Listener Server
Listener servers run in the background, listening on R/3 transactional RFC (tRFC)
ports, waiting for IDocs to be delivered by R/3. An R/3 administrator must
configure tRFC ports, identifying DataStage as a target system. When a server
receives IDocs, it saves them as flat files to a PSA on disk and sends receipt
confirmation to R/3.
The stage runtime component parses the IDoc content and forwards the content as
relational rows down the output links for the stage for processing by downstream
stages.
The listener daemon is started at system startup by one of the following
scripts, depending on your platform:
Platform Script
Solaris /etc/rc2.d/S99dsidocd.rc
HP-UX /sbin/rc2.d/dsidocd.rc
AIX /etc/dsidocd.rc
Linux /etc/rc2.d/S999dsidocd.rc
Compaq Tru64 UNIX /sbin/rc2.d/S99dsidocd.rc
Note: A DataStage limitation prevents multiple instances of the same job from
running simultaneously. Therefore, the listener has to queue job requests or
combine batches if the threshold for launching a job is reached before the
current job instance is complete. You may need to adjust the batch count to
an appropriate value to prevent this situation from occurring.
When a job runs, whether you start it manually or a listener server starts it,
the IDocs are read as text files from local storage.
Local storage refers to the file system of the computer, which conceptually serves
as the data source for the IDoc Extract plug-in. This local storage on the file system
is referred to as the Persistent Staging Area (PSA). The PSA comprises a collection
of individual directories that you configure when designing DataStage jobs or
configuring IDoc types with the DataStage Administrator for SAP utility.
The PSA is a configurable property of an IDoc type and the R/3 connection from
which it arrives.
• Type. You can define a separate directory for each IDoc type that the
DataStage external system can receive. That is, you can specify a directory
of your choice for each IDoc type that is represented by some DataStage
job.
• Connection. Since DataStage jobs must also specify an R/3 connection,
IDocs can be segregated by connection as well as type. For example, a
single DataStage instance can run production level jobs against an R/3
production environment while simultaneously serving as a development
or QA environment for the respective R/3 instances. In another scenario,
you can configure DataStage to listen to more than one production R/3
instance for the same or different IDoc types.
Example. Consider four RFC listener servers associated with a single DataStage
server instance. Each RFC listener server listens on one of four R/3 instances:
development, QA, and two production instances. Assume six DataStage jobs are
on this server.
The relationship between DataStage jobs and the IDoc types they process is as
follows:
• J1 processes IDoc type A from the R/3 development instance.
• J2 processes IDoc type B from the R/3 development and QA instance.
• J3 processes IDoc type A from the R/3 QA instance.
• J4 processes IDoc type A from the R/3 production instance P1.
• J5 processes IDoc type B from the R/3 production instance P1.
• J6 processes IDoc type C from the R/3 production instance P2.
Security. In addition to offering flexibility in determining how IDocs from various
R/3 instances are processed, this method lets you have full control over the file
permissions for the directories where the IDocs are stored. Thus, varying levels of
security can be applied based on IDoc type. However, the owner of the listener
server process must have write permissions on this directory, and the owner of the
process that executes the DataStage job must have read and write permissions.
PSA Maintenance
Because of potential limitations in available disk space, the file system may not
have the capacity to store every IDoc that DataStage receives. To minimize the
possibility of file systems becoming full due to an excess of IDocs in the PSA, the
IDoc Extract includes the DataStage Administrator for SAP utility for cleaning up
or archiving IDocs that have been processed by DataStage jobs and are no longer
needed. During installation, a separate executable is scheduled to be run
periodically (by default, weekly on Saturday) by the operating system. Use the
utility to modify the default scheduling of the executable or to run the
cleanup/archive manually. A bookmark file determines when processed IDocs are
deleted from persistent storage.
You can achieve an additional level of control by using manual removal or
archival. Even though this task is performed automatically, you can request a
manual removal of processed IDocs of a particular type at any time (see “About the
IDoc Cleanup and Archiving Page” on page 6-15).
When the process runs, it scans the PSA, identifying all IDocs that have been
successfully processed by all jobs having an interest in the particular IDoc. When
these IDocs are identified, they are deleted from the file system or archived to
another location. If they are archived, it is the responsibility of the user to maintain
the archive.
Inactive jobs. Inactive jobs are those that have not recently run. If a job stops being
run, IDocs that are to be processed by the job accumulate. They are not cleaned up
because the cleanup process detects that the inactive job has not yet processed the
IDocs and may never process them. If the job never runs again, the IDocs will
accumulate indefinitely.
To resolve this issue, a job attains an inactive status with respect to the listener sub-
system when it has not been run for 40 days. Forty days is the default value, which
you or the DataStage Administrator can modify using the DataStage
Administrator for SAP. When a job becomes inactive, its unprocessed IDocs are not
saved or archived by the cleanup process. For details, see “The Administrator for
SAP Utility” on page 6-1.
Configuration Files
Because R/3 connection parameters and IDoc type properties are outside the scope
of individual DataStage jobs, they are not stored in job definition files in the
DataStage Repository. External configuration files store these parameters and
properties. We recommend that you use the IDoc Extract GUI to modify settings
for these configuration files. This section describes the various configuration files
that are used by the server to manage R/3 connection parameters and IDoc type
properties:
• DSSAPConnections.config
• IDocTypes.config
• <IDocType>.config
Note: On UNIX platforms, the dsenv file in the DataStage server directory must
contain a umask setting so that users have read and write permissions on
these configuration files. (Windows NT platforms handle this
automatically.)
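A minimal sketch of such an entry in dsenv (the value 002 is an assumption;
choose a mask appropriate to your site's security policy):
umask 002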
DSSAPConnections.config File
The DSSAPConnections.config file stores R/3 connection parameters and is
located in the DSSAPConnections subdirectory of the DataStage server directory.
It contains all the information specific to the individual physical R/3 connections
that have been configured. R/3 connections are typically configured during job
design (see “Defining the DataStage Connection to SAP” on page 3-20). However,
you can also configure them using the DataStage Administrator for SAP. These
configuration files should never be directly edited (see “The Administrator for
SAP Utility” on page 6-1).
An example of the R/3 connections configuration file is shown below:
DSSAPCONNECTIONS=<BEGIN>
<BEGIN>
DSPASSWORD=py~sZiv.
SAPROUTERSTRING=
DEFAULTLANGUAGE=EN
DATAFILESPATH=
SAPSYSNUM=02
DEFAULTUSERNAME=p45user
DESCRIPTION=SALES logical system on ultra - P45 instance
SAPGROUP=
DSUSERNAME=dsuser1
SAPMESSERVER=
DEFAULTCLIENT=800
ALLOWSAPTORUNJOBS=TRUE
SAPAPPSERVER=R3sys1
LISTENFORIDOCS=TRUE
NAME=P45VERSION3
SAPSYSID=
REFSERVERPROGID=idoctest3
USELOADBALANCING=FALSE
DEFAULTPASSWORD=JSKM~
<END>
<END>
Fields of the DSSAPConnections.config File. Each connection is delimited by
<BEGIN>/<END> pairs. The fields within a connection block in the
DSSAPConnections.config file are defined as follows:
• NAME. A tag identifying the physical connection.
• DSPASSWORD. An encrypted password used by a listener server to
launch DataStage jobs.
• DEFAULTLANGUAGE. One of the SAP mnemonics representing the
native language of an RFC client connection. RFC client connections are
used by the IDoc custom GUI and by the listener when sending status
information back to R/3.
• DEFAULTUSERNAME. The username for connecting to R/3 as an RFC
client.
• DSUSERNAME. The username used by a listener server to launch
DataStage jobs.
• DEFAULTCLIENT. The R/3 client number used to connect to R/3 as an
RFC client.
• SAPROUTERSTRING. The router string used to connect to R/3 as an RFC
client.
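Because the file uses a simple <BEGIN>/<END>-delimited KEY=VALUE layout, the
format is easy to picture with a short sketch. This parser is an illustration inferred
from the sample above, not a specification of the format, and, as noted, the files
themselves should be managed only through the supplied tools:

def parse_connection_blocks(path):
    # Parse <BEGIN>/<END>-delimited KEY=VALUE blocks as shown in the
    # sample DSSAPConnections.config file above (illustrative only).
    blocks, current = [], None
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if "=" in line and line.endswith("<BEGIN>"):
                continue  # outer DSSAPCONNECTIONS=<BEGIN> wrapper
            if line == "<BEGIN>":
                current = {}
            elif line == "<END>":
                if current is not None:
                    blocks.append(current)
                    current = None  # a second <END> closes the wrapper
            elif current is not None and "=" in line:
                key, _, value = line.partition("=")
                current[key] = value
    return blocks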
When the listener manager starts, it starts a listener server for each connection
described in the DSSAPConnections.config file. The sample file shown earlier
contains a single connection, P45VERSION3.
IDocTypes.config File
When an IDoc type is first associated with a connection during job design, the
client GUI creates a directory in the DataStage server directory called
DSSAPConnections/<ConnectionName>/IDocTypes/<IDocTypeName>.
• ConnectionName specifies one of the connections defined by the NAME=
parameter in the DSSAPConnections.config file.
• IDocTypeName is a directory that specifies the name of an IDoc type.
For each configured connection, a corresponding subdirectory
(DSSAPConnections/<ConnectionName>) is created.
Each of these subdirectories contains an IDocTypes directory, which holds a
configuration file named IDocTypes.config and further subdirectories, one for each
IDoc type that is configured to be received through that connection.
The IDocTypes.config file specifies the directory locations for writing each type of
IDoc that has been configured for that connection. It has the same format as the
DSSAPConnections.config file, but it contains different fields. For example:
DSIDOCTYPES=<BEGIN>
<BEGIN>
USE_DEFAULT_PATH=FALSE
IDOC_FILES_PATH=/u1/IDocs/MATMAS01
NAME=MATMAS01
<END>
<BEGIN>
USE_DEFAULT_PATH=TRUE
IDOC_FILES_PATH=
NAME=CREMAS01
<END>
<END>
The fields within an IDoc type block are defined as follows:
• NAME. The name of the IDoc type.
• USE_DEFAULT_PATH. A value of TRUE indicates that DataStage defines
the default directory to serve as the PSA for this IDoc type. If TRUE, IDocs
received by the listener are stored in the corresponding IDocTypeName
directory in the IDocTypes directory.
• IDOC_FILES_PATH. If USE_DEFAULT_PATH is FALSE, specifies the
user-defined directory that serves as the PSA for this type.
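Putting these rules together, the PSA directory for a given IDoc type can be
resolved as in the following sketch (illustrative only; the directory layout is taken
from the description above, and the function name is hypothetical):

import os

def psa_directory(server_dir, connection, idoc_type, use_default, idoc_files_path):
    # Default layout: DSSAPConnections/<ConnectionName>/IDocTypes/<IDocTypeName>
    if use_default:
        return os.path.join(server_dir, "DSSAPConnections", connection,
                            "IDocTypes", idoc_type)
    return idoc_files_path  # the user-defined directory

For the sample file above, MATMAS01 resolves to /u1/IDocs/MATMAS01, while
CREMAS01 resolves to the default IDocTypes/CREMAS01 directory under its
connection.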
<IDocType>.config File
The <IDocType>.config exists in every IDocTypeName directory that is configured as
a destination for a particular IDoc type, either by default or that you explicitly
specify.
IDocType is the type name of the IDoc. This configuration file contains all the
information that is specific to that IDoc directory location. For example:
IDOC_COUNT=100
DSUSERNAME=johnharvey
DSPASSWORD=py~sZiv.
AUTORUN_ENABLED=TRUE
ARCHIVE_IDOC_FILES=FALSE
USE_DEFAULT_LOGON=FALSE
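These fields mirror settings on the IDoc Type Properties dialog described later in
this chapter; for example, AUTORUN_ENABLED and IDOC_COUNT appear to
correspond to the automatic job run settings, and ARCHIVE_IDOC_FILES to the
archiving option. As a sketch of how the automatic run threshold behaves
(hypothetical helper, based on the behavior described in "Defining IDoc Type
Properties"):

def should_run_jobs(autorun_enabled, idocs_received_since_last_run, idoc_count):
    # Jobs that extract this IDoc type are started each time another
    # IDOC_COUNT IDocs of the type arrive (the default threshold is 1).
    return autorun_enabled and idocs_received_since_last_run >= idoc_count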
The IDoc Extract plug-in for release 5 of the Ascential PACK for SAP R/3 uses this
icon:
Each output link from the IDoc Extract stage is mapped to a single IDoc data
segment. When you design a job, create an output link for each IDoc segment
whose data is of interest.
The following screen uses IDoc type MATMAS01 as an example:
If you process only segments E1MARAM and E1MAKTM, for example, the data
contained in other segments is ignored. Segments are records within an IDoc that
can be child or parent segments. They include administrative fields such as the
IDoc number, the segment number, and the number of the parent segment. Child
segments have a many-to-one relationship to parent segments.
The selection of E1MARAM, which has child segments, does not imply that all its
child segments are included. Only the segment fields of E1MARAM are processed.
The segment fields for E1MARAM correlate with the column definitions for the
output link with which E1MARAM is associated. The segment fields for E1MAKTM
become the column definitions for a second output link. Use the custom GUI to
define the column meta data definitions. When a segment is chosen for a link, the
GUI displays the available segment fields that you can select for output. The GUI
then automatically translates the R/3 meta data for the selected segment fields to
equivalent DataStage meta data definitions and populates the column grid.
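The translation can be pictured as a table-driven mapping from R/3 internal types
to DataStage column types. The mapping below is an illustrative assumption, not
the plug-in's actual translation table:

# Hypothetical mapping from R/3 internal types to DataStage SQL types;
# the plug-in's real rules may differ.
R3_TO_DATASTAGE = {
    "CHAR": "VarChar",
    "NUMC": "Decimal",
    "DATS": "Date",
    "TIMS": "Time",
}

def to_column(field):
    # Turn one segment field description into a column definition.
    return {
        "name": field["name"],
        "sql_type": R3_TO_DATASTAGE.get(field["type"], "VarChar"),
        "length": field["length"],
        "description": field.get("text", ""),
    }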
Examples. The following screens illustrate the translation of meta data for IDoc
segment E2MAKTM fields to DataStage column definitions:
Note: DataStage uses the segment definition name, for example, E2MAKTM001,
rather than the segment type name. The distinction between the two is
beyond the scope of this document. For more information, consult your
SAP documentation.
Note: On UNIX platforms, the DataStage server must be started so that users who
want to run a job have write permission on the bookmark file. This also
applies to the saprfc.ini file, which the job maintains in the DataStage project
directory.
The General tab of the Stage page specifies the connection and default logon
details that are needed to communicate with the corresponding SAP R/3 system.
Configure an IDoc type as follows. The subsequent sections describe these steps in
detail.
1. Click Select… to open the Select IDoc Type dialog for selecting an IDoc type.
2. Select a type, and click OK.
3. A prompt may appear about an unconfigured IDoc type. Click Yes to
configure the IDoc type for use with DataStage.
4. Define the IDoc type configuration properties on the IDoc Type Properties
dialog.
5. The selected IDoc type and its component segment types are now visible
on the IDoc Type tab of the Stage page.
The Select IDoc Type dialog displays all the released IDoc types defined on the
SAP R/3 system. It has the following components:
• Connection. The connection name for the R/3 system whose IDoc types
are being shown is indicated at the top of the dialog.
• Description. A description of the R/3 system.
• IDoc Types. A list of all the released IDoc types defined on the R/3 system.
• Find… . Click to open a Find IDoc Type dialog. It lets you search for IDoc
Types that contain user-specified substrings in their name or description.
• Properties… . Click to view and change the DataStage configuration for the
selected IDoc type.
• OK. When you select an IDoc type and click OK, the system checks
whether anyone on the DataStage system configured the IDoc type for the
current connection. If not, you see the following message:
This IDoc type has not been configured for use with
DataStage. Would you like to set these options now?
Click Yes to set the options to default values. The IDoc Type Properties
dialog appears (see the following section).
The IDoc Type Properties dialog contains various parameters set to default
values. The default settings are usually appropriate.
1. Name is the name of the selected IDoc type (read-only) with its description in
the Description field.
2. Directory containing temporary IDoc files for this IDoc type and
connection is the pathname for storing IDoc files for this IDoc type and
connection. If Use default directory is selected, the default pathname is
displayed, and the edit box is read-only. To enter your own directory, clear
the Use default directory box.
3. Click Browse… to browse for an alternate directory in which to store
IDoc files for this IDoc type and connection. This button is enabled only
when Use default directory is cleared.
6. If Run jobs that extract IDocs of this type after receiving n IDocs is
selected, jobs that read IDocs of this type are run automatically each time
another n IDocs of this type are received by the IDoc listener server. (The
default is one.) If IDocs of this type are expected to arrive frequently and
in small numbers, increase the number of IDocs that must arrive before
jobs are automatically run.
Alternatively, you can disable automatic job invocation for this IDoc type by
clearing the check box. In this case, use the DataStage Director to schedule jobs.
7. The DataStage Logon Details for Running the Jobs area specifies the
DataStage logon user name and password, and whether to use the
defaults for the connection.
8. R/3 Version specifies the version of the R/3 system that is set for the IDoc
type in the R/3 system itself. This lets you change segment meta data for
the IDoc types when you upgrade an R/3 system to a later version (see
“Specifying R/3 Versions” on page 3-29).
9. After you select an IDoc type and optionally define IDoc type
configuration properties, the Select IDoc Type dialog closes, and you
return to the IDoc Type tab of the Stage page, with the properties of the
selected IDoc type now visible.
The IDoc Components area shows the control record and all the segments defined
for the IDoc type with their descriptions. (The control record, one for each IDoc, is
an administrative record that contains a standard set of fields describing the IDoc
as a whole.) This area contains the following information:
• Name. Shows the hierarchical relationship among the segments using a
tree structure with their descriptions.
(A segment type can appear only once within an IDoc type. The names for
the segments are the segment definition names, not the segment type
names. You can infer the segment type name from the segment definition
name.)
• Assigned Output Link. After particular segments in the IDoc Type are
assigned to the output links of the stage, the IDoc Components control
shows the names of the links in the Assigned Output Link column of the
control. This gives you an overview of which segments are being extracted
by the stage.
The General tab of the Output page shows the segment type or control record for
each output link as described in subsequent sections.
If you enter a version number that is later than that of the R/3 system specified in
the connection, you see an error message.
If you change the port version of the connection for the stage or the R/3 version of
the IDoc type for the stage, the stage automatically refreshes itself using the
appropriate IDoc meta data. If either of these versions is changed before you
open the stage editor, the act of opening the stage editor produces a warning.
After you choose the output link from the Output name box, the following steps
summarize how output links function.
The subsequent sections describe these steps in detail.
1. Click Select… to open the Select IDoc Component to Extract dialog (see
"Selecting the IDoc Component to Extract" on page 3-32). The IDoc
Component to Extract, Description, and Fields in Component information
is displayed after you select a control record or a segment for the link.
2. The selected segment type and its fields now appear on the General tab
of the Output page.
The administrative fields common to all segment types appear at the
beginning of the list.
3. Columns are automatically generated for each link when the segment
type is selected.
4. You can delete unnecessary columns (administrative fields that are rarely
used are not visible).
5. Columns for missing fields can be added by clicking Add Columns on
the Columns tab of the Output page.
Adding Columns
When you choose a segment or the control record on the General tab on the
Output page, a corresponding list of columns is automatically generated.
You can view and modify these columns using the Columns tab on the Output
page (see a sample screen in the “Examples” section, which begins on page 3-17).
To view and modify the columns:
1. Columns corresponding to segment administration fields are given names
prefixed by ADM_. The default column list does not include columns for the
less frequently used administrative fields (namely, SEGNAM, MANDT,
HLEVEL, and SDATA). You can also delete any unneeded columns.
The Corresponding Extract Field for Column "ADM_DOCNUM" is the
single-row grid near the bottom of the tab showing the extract field that
corresponds to the selected column, in this case, ADM_DOCNUM (the value
changes depending on the selected column). It includes Name, Description,
Internal Type, and Length information.
2. Click Add Columns… to add columns for any fields that are not
currently represented in the columns list. The Add Columns dialog
appears.
If you double-click a field, or select one or more fields and click Add, the
fields disappear from the Add Columns dialog.
Corresponding columns appear in the Columns tab. The new columns are
inserted into positions in the column list that match the sequence of the
corresponding fields in the segment. The columns are shown as selected and
are automatically scrolled into view.
After the columns are added, the Add Columns dialog remains open so you
can optionally add columns for remaining fields.
Click Close to exit the dialog and return to the Columns tab of the Output
page. The dialog also closes automatically if there are no columns left to be
added.
3. Click Validate Columns to check that each column has a matching
extract field and that the properties of the column are consistent with
those of the field. If the properties of the column are inconsistent, the
program offers to correct them.
4. Click Save… to open the Save IDoc Segment Definition dialog to save
meta data definitions for a job to the DataStage Repository. (See the
following section, “Saving IDoc Meta Data Definitions”.)
The Save IDoc Segment Definition dialog contains the following fields:
• Data source type. The type is always IDoc.
4
The IDoc Load Plug-In
Introduction
The SAP R/3 suite of applications supports ERP (Enterprise Resource Planning),
integrating the supply-chain processes of a company. An IDoc (intermediate
document) is a hierarchical package of related records generated by SAP R/3 in
an SAP proprietary format. An IDoc, whose transmission is initiated by the source
database, exchanges data between applications and provides a standard format
for exchanging data with SAP R/3. Individual IDocs contain the data that makes
up a business transaction (for example, a sales order) or master data (for example,
material master) and include segment records.
This part of the technical bulletin describes the DataStage Load for SAP R/3 plug-
in, which is a passive stage that has input links but no output links. Use this plug-
in to generate IDocs from source stages to load data into SAP R/3.
This technical bulletin describes the following for Version 1.0 of the DataStage
Load for SAP R/3 plug-in, which works with DataStage Release 5.1 or later:
• Functionality
• Configuration requirements
• Defining the IDoc Load for SAP R/3 stage
The user interface for the DataStage IDoc Load for SAP R/3 stage is almost
identical to that of the DataStage IDoc Extract for SAP R/3 stage.
NLS (National Language Support) is supported for the DataStage Load for SAP
R/3 plug-in.
See ”Terminology” on page 1-7 for a list of the terms used in this chapter.
For More Information. For information about using plug-ins or SAP, see the
following table:
The IDoc Load for SAP R/3 plug-in has the following functionality:
• Ability to browse and select IDoc types from a meta data browser in the
GUI.
• Selection of IDoc segment types to which non-SAP data is loaded. The data
for each segment is read in on a separate link in parallel processes.
• Ability to allow the DataStage job designer to map relevant segment data
from multiple sources. Each IDoc Load stage in a DataStage job can send
IDocs of one chosen type to SAP R/3.
• A mapping mechanism that allows relational row data to be converted to
SAP’s proprietary IDoc data format. IDoc structural meta data determines
the link processing order. Join keys, which are sort keys, are identified by
the job designer. The processing order and join keys are used by the stage
to assemble IDocs.
• Support for message types, the scheme for electronically transmitted data
used for one specific business transaction.
• Support for primary key (P-key), and foreign key (F-key) handling, which
differs from that in the IDoc Extract for SAP R/3 plug-in. Foreign keys for
segments at root level of the IDoc hierarchy separate data into separate
IDocs, allowing one stage to send several IDocs in one transmission to SAP
R/3. Foreign keys in non-root level segments relate child segment records
to their parent segment record.
• NLS (National Language Support). For information, see DataStage Adminis-
trator Guide.
Automatic job execution is unsupported for the Parallel Canvas.
Configuration Requirements
DataStage Systems. You need to install the following components on the
DataStage server and client systems:
• DataStage server system:
– IDoc Load for SAP R/3 plug-in.
• DataStage client system:
– IDoc Load for SAP R/3 client GUI.
– Administrator for SAP utility.
As with the ABAP Extract for SAP R/3, the installation is facilitated by a prior
installation of the SAP GUI.
R/3 Source Systems. Some configuration on all R/3 source systems is required to
identify DataStage as a target system.
Install the DataStage Load for SAP R/3 plug-in from the DataStage CD-ROMs as
described in “Installation” starting on page 1-1.
After installing the plug-in GUI, start the plug-in editor from the DataStage
Designer by doing one of the following:
• Double-clicking the stage in the Diagram window
• Selecting the stage and choosing Properties from the shortcut menu
• Selecting the stage and choosing Edit ➤ Properties from the DataStage
Designer window
Note: You must use a username and a non-blank password for DataStage logon
credentials to use the IDoc plug-ins. This means you must clear Omit on
the Attach to Project dialog in DataStage; otherwise, unexpected results
occur.
The IDoc Load plug-in for release 5 of the Ascential PACK for SAP R/3 uses this
icon:
Each input link loads records for a particular segment type within the IDoc type
that is selected for the stage. A column list for each link is generated automatically,
based on the fields of the selected segment type. In addition, special columns are
generated to represent primary and foreign key values for each link. These key
values allow the runtime to determine which child segment records flowing into
one link belong to each parent segment record that flows into a different link.
For a root-level segment type, the foreign key value identifies the specific
generated IDoc into which the segment records will be incorporated.
Because the column lists for an IDoc Load for SAP R/3 stage are generated
automatically, Transformer stages map values from source data columns to the
columns generated from the fields of the segment types. In each Transformer stage,
you must also map source data values to values that can be used as primary and
foreign keys for the segments. Values for the key columns are not actually loaded
as data into the IDocs; they are used only to correlate records flowing into separate
links.
You can design jobs in any way that provides effective key values. The foreign key
values provided for a link representing a child segment type must exactly match
the primary key values provided for the link that represents the parent segment
type. For root-level segment types, the foreign key identifies the IDoc for the
segments.
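The correlation rule can be sketched as follows; the record layouts and field names
here are hypothetical, and the stage performs this assembly internally at run time:

def attach_children(parent_rows, child_rows):
    # A child row belongs to the parent row whose primary key value
    # matches the child's foreign key value.
    by_pkey = {p["pkey"]: p for p in parent_rows}
    for child in child_rows:
        parent = by_pkey.get(child["fkey"])
        if parent is not None:
            parent.setdefault("children", []).append(child)
    return parent_rows

def group_roots_into_idocs(root_rows):
    # For root-level rows, the foreign key identifies the generated IDoc,
    # so one stage run can emit several IDocs in a single transmission.
    idocs = {}
    for row in root_rows:
        idocs.setdefault(row["fkey"], []).append(row)
    return idocs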
When you start the plug-in GUI from the DataStage Designer to edit an IDoc Load
for SAP R/3 stage, the General tab of the Stage page appears by default:
All available, released IDoc types, both basic and extended types, that are
defined on the R/3 system are visible, for example:
Connection displays the connection name with its description for the R/3
system whose IDoc types are being shown.
IDoc Types lists all released IDoc types defined on the R/3 system.
2. Click Find… on the Select IDoc Type dialog to open the Find IDoc Type
dialog. It lets you search for IDoc types that contain user-specified
substrings in their name or description.
3. Select a type, and click OK to set it as the IDoc type for the stage.
The segment hierarchy (that is, the selected IDoc type and its component segment
types) is now visible on the IDoc Type tab of the Stage page:
Begin the selection process from the General tab on the Input page:
The Input page has an Input name field, the General and Columns pages, and the
Columns… button. The Input link pages are almost identical to the Output link
pages of the IDoc Extract for SAP R/3 stage. They display the fields to load.
• Input name. The name of the input link. Choose the link you want to edit
from the Input name list box.
• Click the Columns… button to display a brief list of the columns desig-
nated on the input link. As you enter detailed meta data in the Columns
page, you can leave this list displayed.
After you choose the input link from the Input name box, the following steps
summarize how input links function. The subsequent sections describe these steps
in detail.
1. Click Select… beside IDoc Component to Load on the General tab of the
Input page to open the Select IDoc Component to Load dialog (see
“Selecting the IDoc Component to Load” on page 4-15). The IDoc Compo-
nent to Load, Description, and Fields in Component information is
displayed after you select a segment for the link.
2. The selected segment type and its fields to load now appear on the
General tab of the Input page.
3. Columns are automatically generated for each link when the segment
type is selected (see “Modifying Columns” on page 4-16).
4. You can delete unnecessary columns.
5. Columns for missing fields can be added by clicking Add Columns on
the Columns tab of the Input page (see “Modifying Columns” on
page 4-16).
1. Select a segment for the link in the Select IDoc Component to Load
dialog, and click OK.
2. The dialog closes, and you return to the General tab of the Input page
with the IDoc Component to Load, Description, and Fields in
Component information now shown, for example:
Modifying Columns
When you choose a segment from the Select IDoc Component to Load dialog, a
corresponding list of columns for the link including key columns is automatically
generated.
P-keys (primary keys) are named after the segment type. F-keys (foreign keys)
relate records on an input link to records on the parent link. When you change key
definitions, the meta data is synchronized throughout the links: F-keys and P-keys
are automatically updated, giving you flexibility in using data from different
columns in the source data.
You can view and modify these generated columns using the Columns tab on the
Input page as in the following example:
The Columns tab of the Input page has the following components:
• The Corresponding Load Field for Column "xx" is the single-row grid near
the bottom of the tab showing the load field that corresponds to the
selected column. (The value changes depending on the selected column.) It
includes Name, Description, Internal Type, and Length information. In this
screen, for example, no values are displayed since a key column was
selected with no corresponding segment field.
• Click Add Columns… to add columns for any fields that are not currently
represented in the columns list. The Add Columns dialog appears, where
you can select from the available load fields for columns.
You can also delete any unneeded columns.
• Click Validate Columns to check the column list for consistency with the
segment fields.
• Key Columns… . Click to open the Key Columns for Link dialog. This lets
you change the number of columns that represent the primary and foreign
key for the link (see “Synchronizing Columns” on page 20).
• Save… . Click to open the Save IDoc Segment Definition dialog to save
the meta data definitions when you finish defining the columns for the
link.
Use Validate All on the General tab of the Stage page to detect when a link is
assigned to a segment but no links are assigned to its ancestor segments. If you try
to assign a segment type whose parent segment type has not been assigned to some
other link, you are warned, but you can continue.
Root Segments. When a segment type is assigned to a link, the automatically
generated column list includes primary and foreign key columns. When the
selected segment type is a root segment type (that is, one that has no parent
segment type in the IDoc), the foreign key column represents the primary key for
the IDoc as a whole.
Non-Root Segments. If a non-root segment type is selected, the resulting
generated foreign key column represents the value of the primary key of the parent
segment type.
Synchronizing Columns
It is sometimes more convenient to use more than one column for a particular key.
Use Key Columns… to open the Key Columns for Link dialog. This lets you
change the number of columns that are used to represent the primary and foreign
key for the link. If, for example, you change the primary key to two columns and
click OK, the column list is refreshed, generating as many columns for each key
as requested.
Changing the number of columns that are used for a primary key also causes the
column lists for links that include a corresponding foreign key to be updated.
Likewise, changing the number of columns used for a foreign key causes the
corresponding primary key columns to be updated in the link representing the
parent segment type.
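A sketch of this synchronization rule follows. The column-naming scheme is
hypothetical (the text above states only that P-keys are named after the segment
type); what matters is that widening a primary key regenerates the matching
foreign keys to the same width:

def key_column_names(segment_type, kind, count):
    # e.g. key_column_names("E1MARAM", "FKEY", 2) -> ["FKEY_E1MARAM_1", "FKEY_E1MARAM_2"]
    return [f"{kind}_{segment_type}_{i + 1}" for i in range(count)]

def resync_keys(parent_segment, count, child_links):
    # Widen the parent's P-key to `count` columns, then regenerate the
    # corresponding F-key columns on every child link to the same width.
    parent_pkeys = key_column_names(parent_segment, "PKEY", count)
    for link in child_links:
        link["fkey_columns"] = key_column_names(parent_segment, "FKEY", count)
    return parent_pkeys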
5
The BAPI Plug-In
Introduction
This chapter describes the following for the BAPI plug-in, which is part of
version 5 of the Ascential PACK for SAP R/3 and works with DataStage 6.0 or later:
• Functionality
• Installation prerequisites
• Building a job
• Loading data into SAP R/3
• Extracting data from SAP R/3
DataStage provides enhanced SAP support with its Packaged Application
Connection Kit (PACK) for SAP R/3.
SAP created the Business Framework, which is an open, component-based
product architecture. It allows the technical integration and exchange of business
data among R/3 SAP components and between SAP and non-SAP components.
Business Objects and their BAPIs are important components of the Business
Framework, which are used by the plug-in to move data across SAP and non-SAP
components in a standard way.
A Business Object is the representation of a business entity, such as an employee or
a sales order, in the SAP R/3 system.
BAPIs are standard SAP interfaces that are used by the plug-in to integrate with
SAP R/3 systems. They are technically implemented using function modules that
are RFC-enabled (Remote Function Call) inside SAP systems.
The availability of these BAPIs as object-oriented interfaces lets other components
directly access the application layer of an SAP system without implementation
details. BAPIs allow integration with SAP at the business level.
You can use the BAPI plug-in to interface with the mySAP Business Suite, which
is a family of solutions and integrated application platforms, such as CRM, SCM,
and so forth, that are supplied by SAP. Use the BAPI plug-in within DataStage as a
passive stage, for example, to load data into and extract data from SAP R/3 Enter-
prise as the target system. Do this by using the library of Business Application
Programming Interfaces (BAPIs), which is provided and maintained by SAP. The
databases always remain in a consistent state after executing the BAPIs
(methods).
The BAPI plug-in lets you use these BAPIs to design load or extract jobs. You can
do this because the plug-in captures the meta data for each BAPI and dynamically
builds the complex RFM call to execute these BAPIs.
You can use the BAPI plug-in GUI to select a Business Object and its BAPI from
the SAP R/3 system and display the corresponding interface which you can use
to load or extract data. The run-time component for the plug-in loads or extracts
data into or from the SAP application server.
See “Terminology” on page 1-7 for a list of the terms used in this chapter.
Functionality
The BAPI plug-in has the following functionality:
• Lets you choose and define SAP connections using the GUI.
• Lets you explore the SAP BOR, dynamically choose any BAPI, and store its
meta data in the job.
• Lets you view the BAPI interface using a function module (RFC) with its
import, export, and table parameters, and optionally decide which
parameters to use when executing the BAPI.
• Works with the mySAP Business Suite products.
• Loads and extracts data into and from SAP R/3 using BAPIs with
appropriate log information.
• Supports NLS (National Language Support). For information, see
DataStage Server Job Developer’s Guide.
The following functionality is not supported:
• The creation of custom BAPIs using the plug-in.
• Data transformations or mappings. Use the Transformer stage to do this.
• Testing of the BAPI during design time.
Building a Job
The BAPI plug-in stage can be a data source and a data destination. You can add
links into the stage and out from the stage. Multiple links are allowed in both
directions, which lets this stage call multiple BAPIs in the same job to load and
extract data at the same time.
If the job uses multiple input or output links, each link must define a unique
BAPI; that is, a link cannot use a BAPI that is already used by another link in
the same job.
To build a job:
1. Create a DataStage job using the DataStage Designer.
2. Create the BAPI plug-in stage, adding the stages and links you need to load
and extract data. Double-click the BAPI plug-in stage icon to open the stage
editor dialog (GUI).
3. Define the stage properties.
4. Define the SAP R/3 connection details and the SAP logon information.
5. Specify the properties for input links for loading data to SAP R/3.
6. Specify the properties for output links for extracting data from SAP R/3.
7. Compile the job.
Each task is described in more detail in the following sections.
• SAP Logon Details. Provides all the information necessary to log on to
SAP; this information is stored in the job.
User Name. The user name used to log on to SAP.
Password. A password for the specified user name.
Client Number. The number of the client system used to log on to SAP.
Language. The language used to log on to SAP.
• Description. Enter text to describe the purpose of the stage.
For more information about creating or defining new connections and deleting
existing connections, see the following section and “The Administrator for SAP
Utility” on page 6-1.
• Use load balancing. Select to use load balancing when connecting to R/3.
Client connections are made through a message server rather than an
application server.
The Application Server and the System Number controls are replaced by
Message Server, System ID, and Group controls so that connection details
specific to load balancing can be entered.
The Input page has an Input name field, the General, BAPI, Logs, and Columns
pages, and the Columns… button.
• Input name. The name of the input link. Choose the link you want to edit
from the Input name list.
• Click the Columns… button to briefly list the columns designated on the
input link. As you enter detailed meta data in the Columns page, you can
leave this list displayed.
Note: This interim dialog appears only for release 4.6 and later of SAP.
Otherwise, the BAPI Explorer screen opens.
• Application Component ID. You can select a specific component from the
list or use <All Components> to select all components from the list.
After you select a component ID, this dialog contains only those Business
objects that belong to the selected application component. The list of
available application components to choose from is dynamically built,
depending on the accessed system.
• BAPI's to Display. You can show only released BAPIs (the default) or all
BAPIs in the BAPI Explorer screen. If you use unreleased BAPIs, you must
ensure that they are complete and ready to use; otherwise, a warning
appears. (Click Show All BAPIs to see those BAPIs that are unreleased.)
• OK. Click to open the BAPI Explorer dialog.
• Cancel. Click to return to the Input or Output General page.
In this example, the standard Bank Business Object is selected, and a list of BAPIs
is displayed for the expanded node.
Similarly, you can further expand nodes that show BAPIs to list parameters that
are defined for that BAPI.
Click Cancel to return to the calling dialog without selecting a BAPI.
Click OK to select a BAPI. OK is enabled only when the selected node in the tree
is the BAPI type. This indicates that you are selecting a BAPI, not another object in
the tree.
When you select a BAPI, the relevant fields on the Input page are populated with
the meta data.
Example for output links. The following example shows an equivalent screen
for an output link with Bank expanded:
When a BAPI is first selected, only required parameters are active on the BAPI
Import tab of the Input page. For input links, fields that are required for import
parameters appear on the Columns page.
You can tell whether a parameter is active by the following indications:
• Green icons beside the parameter names indicate that the parameters are
active, that is, used to dynamically build BAPI calls at run time. (Red icons
indicate that parameters are inactive, unused when calling BAPIs.)
• Green icons display I or E to indicate whether table parameters are
activated for Import or Export.
To activate or deactivate parameters, do one of the following:
• Double-click a parameter name
• Right-click for a shortcut menu
When you move the cursor over a parameter, the cursor's shape changes,
indicating that you can activate or deactivate the parameter.
Fields that are required for tables parameters appear on the Columns page.
Fields that are required for export parameters appear in a grid on the Input Logs
page, not on the Input Columns page, because they contain return values from
the BAPI call.
You can see the reference structure being used for each parameter with a short
text description of the parameter. This helps you decide whether to activate a
particular parameter.
• Location for Return Parameters and other Log Files. By default, you see a
pathname whose directory corresponds to the selected BAPI. This directory
for temporary log files is created under the DataStage\DSSAPConnections
directory.
In this example, StandardMaterial.SaveData is the selected BAPI for the
stage. Assuming the connection name is C46, the default directory for return
values is:
<DSHOME>\DSSAPConnections\C46\BAPIs\Loads\StandardMaterial.SaveData
All fields in the grid except the first (BAPISeqNo) are derived from reference
structures for the import and tables parameters that are activated on the Input
BAPI page.
BAPISeqNo is automatically generated by the GUI to let the run-time component
distinguish between parent and child rows in an input dataset that represents a
header-detail relationship. For example, if the dataset contains a header record
with multiple child records, this field contains the same value for each of those
rows.
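For example, a header-detail input might look like the following hypothetical
rows, where all rows sharing a BAPISeqNo value are combined into one call to the
selected BAPI:

from itertools import groupby

rows = [
    {"BAPISeqNo": 1, "kind": "header", "MATERIAL": "M-001"},
    {"BAPISeqNo": 1, "kind": "detail", "TEXT": "Long text line 1"},
    {"BAPISeqNo": 1, "kind": "detail", "TEXT": "Long text line 2"},
    {"BAPISeqNo": 2, "kind": "header", "MATERIAL": "M-002"},
]

# One BAPI call per distinct BAPISeqNo value (rows arrive grouped)
for seqno, group in groupby(rows, key=lambda r: r["BAPISeqNo"]):
    call_rows = list(group)
    print(f"call {seqno}: {len(call_rows)} row(s)")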
As for other plug-ins, you can use the following components for the column
definitions:
• Save… . Click to open the Save Table Definition dialog to save column
definitions to the DataStage Repository.
• Load… . Click to open the Table definitions dialog to load the column
definitions from the DataStage Repository into other stages in your job
design.
This resizable screen looks exactly like the Input General page, except that the
selected BAPI extracts data from SAP.
The Output page has an Output name field, the General (the default), BAPI,
Read Logs, and Columns pages, and the Columns… button.
• Output name. The name of the output link. Choose the link you want to
edit from the Output name list.
• Columns… . Click to display a brief list of the columns designated on the
output link. As you enter detailed meta data in the Columns page, you can
leave this list displayed.
When you first select a BAPI, only required parameters are active.
The color of the icon beside a parameter name indicates whether the parameter is
active:
• Green. Indicates active parameters that are used when dynamically
building the BAPI call at run time.
• Red. Indicates inactive parameters that are unused when calling the BAPI.
Double-click a parameter name or right-click to open a shortcut menu to activate
or deactivate parameters. When you move the cursor over a parameter, the
cursor's shape changes, indicating whether you can activate or deactivate the
parameter.
You can see the reference structure being used for each parameter, with a short
text description of the parameter, to help you decide whether to activate a
particular parameter.
For output links, the following parameters appear on the Columns page:
• Parameters activated on the Export tab
• Tables parameters that are activated for extracting values
Fields for import parameters and the table parameters activated for Input Import
appear on the Output Read Logs page. This is because these fields are not
extracted when the job runs but need to be read (loaded) to call a BAPI.
The grid displays those fields that correspond to the parameters selected on the
Import tab on the Output BAPI page and the fields that correspond to parameters
activated on the Tables tab for Input on the Output BAPI page.
The locations for import parameters and other log files are specified on this
screen. By default, a directory name that corresponds to the selected BAPI
appears; this directory is created in the DataStage\DSSAPConnections directory.
For example, if the selected BAPI for the stage is PurchaseOrder.GetDetails and
the connection name is C46, the default directory for import values is:
<DSHOME>\DSSAPConnections\C46\BAPIs\Extracts\PurchaseOrder.GetDetails
Click Browse beside Location for Log Files to open the Browse directories dialog
to select a directory.
You can also click Browse beside Input File for Import Parameters to open a
dialog to select a file.
Run-Time Component
The BAPI plug-in run-time component is a modified sequential file stage that
supports multiple links in both directions. It dynamically builds a call to the
selected BAPI and executes that call.
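Conceptually, this is similar to what the following sketch does with SAP's pyrfc
library. The plug-in itself is not implemented this way; pyrfc is shown only to
illustrate a dynamically built RFC call, and the connection values and BAPI name
are placeholders (the connection values echo the sample configuration earlier in
this bulletin):

from pyrfc import Connection  # SAP's Python RFC connector, for illustration

conn = Connection(ashost="R3sys1", sysnr="02", client="800",
                  user="p45user", passwd="secret")

# Only the parameters activated in the GUI are passed, keyed by name,
# so the call is assembled dynamically from the stored meta data.
active_params = {"MATERIAL": "M-001"}
result = conn.call("BAPI_MATERIAL_GET_DETAIL", **active_params)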
Input links. The return values from the BAPI call are stored in the appropriate
directory that you specify on the Input Logs page.
The name of the log files corresponds to the parameter names that are activated
for the BAPI, for example:
• Export parameters. If an export parameter named PURCHASEORDER is
activated for the BAPI, a log file named PurchaseOrder.txt is created in the
Export directory in the directory specified on the Input Logs page.
• Tables parameters. You can use table parameters to import or export
values. Two log files are created for each parameter:
– The parameter name with the .TXT suffix
– The parameter name with the _RETURN.TXT suffix
For example, for the table parameter named DESCRIPTION, log files named
DESCRIPTION.TXT and DESCRIPTION_RETURN.TXT are created.
Output links. The job expects import values to be available in the Import
directory that is created under the pathname specified on the Output Read Logs
page. The name of the log files corresponds to the parameter names that are
activated for the BAPI, for example:
• Import parameter. If an import parameter named PURCHASEORDER is
activated for the BAPI, a log file named PurchaseOrder.txt is expected to be
in the Import directory in the pathname specified on the Output Read
Logs page.
• Tables parameters. You can use table parameters to import or export
values. Two log files are created for each active parameter:
– The parameter name with the .TXT suffix
– The parameter name with the _RETURN.TXT suffix
For example, if the table parameter named DESCRIPTION is active,
DESCRIPTION.TXT and DESCRIPTION_RETURN.TXT log files are created.
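The naming convention for table parameters amounts to the following sketch (the
suffix spelling follows the DESCRIPTION examples above):

def table_param_log_files(param_name):
    # Each active table parameter produces or expects two log files.
    return [f"{param_name}.TXT", f"{param_name}_RETURN.TXT"]

# table_param_log_files("DESCRIPTION")
# -> ['DESCRIPTION.TXT', 'DESCRIPTION_RETURN.TXT']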
6
The Administrator for SAP
Utility
The following steps let you define a DataStage connection to SAP. The subsequent
sections describe these steps in detail.
1. Run the DataStage Administrator for SAP on a DataStage client machine.
2. Click Add… ➤ New… to define the properties for the new connection.
3. Enter a connection name, a description, and SAP connection and logon
details.
4. Optionally, use load balancing.
5. Enter an IDoc listener program ID (the same ID must be defined for the
tRFC port in the SAP database).
6. Define the properties for running jobs automatically as IDocs arrive on
the DataStage Job Options for IDocs page.
7. Click Add to test the connection and logon properties and add the
connection. An IDoc listener server starts automatically after the new
connection is added.
8. Configure the SAP system to access DataStage by doing the following:
a. Log on to the R/3 system.
b. Create a tRFC port, and assign it the IDoc listener program ID that was set
for the connection.
c. Create a logical system to represent DataStage.
d. Attach the tRFC port to the logical system.
9. Click IDoc Types to see the IDoc types for the selected connection. It
opens the IDoc Types dialog where you can set the properties of IDoc
types.
The DataStage Connections to R/3 page has the following components:
• Add… . Click to open a popup menu with the New… and Import… items:
1. Click New… to open the Connection Properties dialog, which lets you
define the properties for the new connection. Here you can specify
Connection Name, Description, and SAP connection and default
logon details for the new connection. (See "Defining DataStage
Connections to R/3" on page 6-4.)
2. Click Import… to open a standard Open File dialog that lets you
select the export file to be imported. This lets you import a new
connection (see "Importing New Connections" on page 6-11).
• Properties… . Click to open the Connection Properties dialog, showing the
properties of the selected connection. This is the same dialog that is opened
when you click New on the Select DataStage Connection to SAP dialog,
but in this context the connection Name is read-only, and the Add button is
replaced by an OK button.
• Import into… . Click to import IDoc type configurations into the selected
existing connection. A standard Open File dialog appears so you can select
a file to be imported (see "Importing IDoc Type Configurations" on
page 6-11).
• Export… . Click to save the configuration information for the selected
connection and all its associated IDoc types into a file (see “Exporting
Connections” on page 6-13).
• Remove. Click to delete the selected connection after your confirmation.
• IDoc Types. Click to see the IDoc types for the selected connection. It opens
the IDoc Types dialog (see the section “About the IDoc Types Dialog” on
page 6-14). This button does not appear on the Select DataStage Connec-
tion to SAP dialog.
• IDoc Log. Click to open the IDoc Log dialog. This dialog displays log
messages reported by the IDoc listener for the connection (see the section
“About the IDoc Log Dialog” on page 6-15). This button does not appear
on the Select DataStage Connection to SAP dialog.
If Listen for IDocs received through this connection is cleared on the IDoc
Listener Settings page, the Import Into, IDoc Types, and IDoc Log buttons on
this page are disabled when you select that connection. Also, if this option is
cleared, Import and Export operations apply only to the properties for the
connection; they do not involve IDoc type configurations. (This is useful for
connections that are used only for ABAP or BAPI stages, rather than IDoc stages.)
The DataStage Connections to R/3 page lets the RFC Server make a non-load
balancing connection even if load balancing is specified in the connection
configuration and provides an option that prevents the Listener from sending
status updates back to the R/3 system when IDocs are received.
2. Select Use load balancing to use load balancing when connecting to R/3.
The Application Server and the System Number controls are replaced by
Message Server, System ID, and Group controls so that connection
details specific to load balancing can be entered (see “Load Balancing” on
page 6-6).
3. Specify the default SAP logon details:
• User Name. The name of the user for connecting to SAP.
• Password. The password for User Name.
• Client Number. The SAP client number.
• Language. The language used for connecting to SAP.
Load Balancing
Select Use Load balancing on the Connection and Logon Details page of the
Connection Properties dialog (see page 6-5) to balance loads when connecting to
the R/3 system. Load balancing works as follows:
1. R/3 lets you make logon connections through a message server. The message
server uses an algorithm that considers server workload and availability to
choose an appropriate application server to handle the logon.
When connections are configured, you can choose a load balancing connection
to a message server rather than a specific R/3 instance to retrieve and validate
IDoc types and meta data.
2. If load balancing is selected for this connection, the listener server uses
load balancing when returning status updates to R/3.
3. If load balancing is selected, the plug-in runtime component uses load
balancing when connecting to R/3 to validate IDoc type meta data at job
execution time.
The listener server does NOT use this connection to listen for IDocs arriving on an
RFC port since this is an RFC server connection. Load balancing is only a client
connection feature.
This page lets you make an additional load balancing test connection as follows:
1. Select Listen for IDocs received through this connection so that a listener
server runs continuously on the DataStage server machine. This check box is
selected by default. If selected, this option indicates that the connection is
enabled for use with IDoc Extract or IDoc Load.
If Listen for IDocs received through this connection is cleared, the other
controls on the IDoc Listener Settings page and DataStage Job Options for
IDocs page are disabled (including the labels for the controls). Also, if it is
cleared, the Import Into, IDoc Types, and IDoc Log buttons on the DataStage
Connections to R/3 page of the DataStage Administrator for SAP utility are
disabled when you select that connection.
2. Specify IDoc Listener Program ID. The listener registers this program ID
with the R/3 system. Use the same program ID for the tRFC port on the
SAP R/3 system to be invoked when the R/3 system sends an IDoc to
DataStage.
3. Clear Return status update upon successful receipt of IDoc to prevent
the listener from sending status messages to the R/3 system when IDocs
are received. (It is set by default.) Do this to reduce the load incurred by
the R/3 system when it sends IDocs to DataStage.
4. If you select Use load balancing on the Connection and Logon Details
page (see page 6-5), and the load balancing test connection succeeds, an
additional test connection is made using the IDoc Listener SAP
Connection Details that you specify.
If this test connection fails, the following warning appears:
Currently unable to connect to SAP using the IDoc Listener
connection information specified for this connection. Do you
want to save your changes anyway?
Separate connection details must be provided for the listener, since the
listener cannot make a load-balancing connection. If load balancing is
indicated on the Connection and Logon Details page, you can modify the
Application Server, System Number, and Router String controls shown on
the IDoc Listener Settings page.
The Application Server, System Number, and Router String information is
used by the listener when it connects to the R/3 system. If you clear Use load
balancing on the Connection and Logon Details page, the listener uses the
connection details specified on that page. In that case, the Application Server,
System Number, and Router String controls on the IDoc Listener Settings
page display values copied from the Connection and Logon Details page,
but these values are read-only (disabled).
• 3.0/3.1. Stages that read IDocs for this connection use meta data for IDoc
control records and segment administrative fields that corresponds to the
meta data used for R/3 Version 3.1 or earlier systems. This meta data is
used even if the R/3 system specified in the connection actually has a later
version.
3. The Add Connection dialog opens, with property values that default to
those of the connection being imported. If the name of the connection is
the same as that of an existing connection, you are asked if you want to
overwrite the existing connection.
4. The Import IDoc Type Configurations dialog opens, which shows the
name and description of the newly imported connection, and lists the
IDoc types whose configurations are imported with the connection (see
the following section).
Exporting Connections
To export connections by saving configuration information into a file:
1. Click Export… from the DataStage Connections to SAP page to open the
Export Connection dialog. This dialog lets you save the configuration
information for the selected connection and all its associated IDoc types
into a file.
The Connection field specifies the connection whose IDoc types are displayed
in the list, with descriptive text in the Description field.
2. Click Find… to open a Find IDoc Type dialog to search for IDoc types
that contain user-specified substrings in their name or description as in
the Select IDoc Type dialog (see “Selecting IDoc Types” on page 4-10).
3. Click Properties… to examine and change the DataStage configuration
for the selected IDoc type using the IDoc Type Properties dialog (see
“Defining IDoc Type Properties” on page 3-26). (You can also do this by
double-clicking an IDoc type in the list.)
The Connection field specifies the connection whose IDoc log messages are
displayed. Descriptive text is included in the Description field.
2. The IDoc Log Messages field lists log messages about the activities of the
IDoc listener that is associated with the connection. When the dialog first
opens, this list is automatically scrolled to the end so that the most recent
messages are visible.
3. Click Refresh to reload the log messages, including any that were
generated since you first opened the dialog.
4. Click Clear Log to delete the messages currently in the log (after you
provide confirmation) and refresh the display.
Note: Use this administrative utility together with the Archive processed IDoc
files option on the IDoc Type Properties dialog to archive IDoc files (see
"Defining IDoc Type Properties" on page 3-26).
Additionally, you can achieve a level of control by running the cleanup executable
manually from the command line on the DataStage server machine. The dsidoccln
command is in the bin directory of the DataStage home directory (<dshome>).
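For example, on a UNIX DataStage server you would invoke it as shown below.
No options are shown because the command's arguments (which identify the
connection and IDoc type to clean up) are not documented in this section:

<dshome>/bin/dsidoccln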
Open the IDoc Cleanup and Archiving page from the DataStage Administrator
for SAP:
If you select No Timeout, jobs are always assumed to be active, regardless of how
much time has passed since the jobs last ran.
Selecting this check box disables Job Inactivity Timeout.
A
SAP Authorization
Requirements for ABAP
This appendix describes the SAP authorization requirements to run Version 3.0.1r1
of the ABAP Extract for SAP R/3 plug-in. It documents how to manually create a
Z_DS_PROFILE authorization profile and the Z_RFC_DS_SERVICE RFC
function module in SAP R/3.
You already have these two components installed on your R/3 system if you
imported transport requests as described in “Integrating DataStage with SAP R/3
Systems” on page 2-4. However, if this import is unsuccessful, you can manually
create these components in your R/3 systems (see “Creating an Authorization
Profile Manually” and “Installing the RFC Function Code Manually” on page A-2
and page A-5 respectively).
The appendix also describes the configuration of the SAP Dispatch/Gateway
service. It further includes information about packaged extraction jobs that cover
common extraction scenarios, which you can customize for your environment.
Authorization Object   Text                                                     Authorization
S_RFC                  Authorization check for RFC access                       Z:DS_RFC
S_DATASET              Authorization for file access                            Z:DS_DATASET
S_CPIC                 CPIC Calls from ABAP programs                            Z:DS_CPIC
S_TABU_DIS             Table maintenance (using standard tools, such as SM30)   Z:DS_TABU
Note: To upload or delete ABAP code, SAP users (CPIC and Dialog) must be
registered within SAP as developers. To do this, log on to the Online
Support System (OSS), and select the appropriate SAP installation number.
Go to Register ➤ Register Developer, and type the SAP user name. OSS
provides a 20-character access key. Record this access key, and create a
dummy ABAP program. When SAP prompts you for an access key, type
this OSS key, and the system will confirm it.
Type the following values shown on this sample Import page for the Parameter
name, Type, Reference type, Default value, Optional, and Pass value fields:

Parameter name    Type   Reference type   Default value   Optional   Pass value
I_SERVE_TYPE      LIKE   SY-TFILL                                    Yes
I_TABLE_NAME      LIKE   DD02T-TABNAME                    Yes        Yes
I_REPORT_NAME     LIKE   D010SINF-PROG                    Yes        Yes
I_DELIMITER       LIKE   SONV-FLAG        SPACE           Yes        Yes
I_NO_DATA         LIKE   SONV-FLAG        SPACE           Yes        Yes
I_ROWSKIPS        LIKE   SOID-ACCNT       0               Yes        Yes
I_ROWCOUNT        LIKE   SOID-ACCNT       0               Yes        Yes
I_CURR_VARIANT    LIKE   RSVAR-VARIANT                    Yes        Yes
I_VARI_DESC       LIKE   VARID                            Yes        Yes
Type the following values shown on this sample Export page for the Parameter
name, Type spec., Reference type, and Pass val. fields:
Type the values shown here on this sample Tables page for the Parameter
name, Type spec., Reference type, and Optional fields according to the
following table:
Type the following values shown on this Exceptions page for the Exception
field:
Exception
TABLE_NOT_FOUND
WRONG_TYPE
TABLE_WITHOUT_DATA
OPTION_NOT_VALID
FIELD_NOT_VALID
DATA_BUFFER_EXCEEDED
REPORT_NOT_FOUND
10. Click Source code. The source code appears in the ABAP Editor window:
To replace this default code with the source code provided on the CD, that is,
the ZDSRFC.TXT file in the RFC directory:
a. Choose Utilities ➤ More utilities.
b. Choose Upload/Download ➤ Upload from the ABAP Editor menu.
c. Specify the pathname for the ZDSRFC.TXT file in the RFC directory on the
CD. For example, type E:\Datastage\RFC\ZDSRFC.TXT in the File
name field.
11. After the upload is complete, save the RFC function code, and activate it.
The following entries configure the SAP dispatch (sapdp) and gateway (sapgw)
services in the services file, for example:
sapdp25 3225/tcp
sapdp26 3226/tcp
sapdp27 3227/tcp
sapdp28 3228/tcp
sapdp29 3229/tcp
sapdp30 3230/tcp
sapdp31 3231/tcp
sapdp32 3232/tcp
sapdp33 3233/tcp
sapdp34 3234/tcp
sapdp35 3235/tcp
sapdp36 3236/tcp
sapdp37 3237/tcp
sapdp38 3238/tcp
sapdp39 3239/tcp
sapdp40 3240/tcp
sapdp41 3241/tcp
sapdp42 3242/tcp
sapdp43 3243/tcp
sapdp44 3244/tcp
sapdp45 3245/tcp
sapdp46 3246/tcp
sapdp47 3247/tcp
sapdp48 3248/tcp
sapdp49 3249/tcp
sapdp50 3250/tcp
sapdp51 3251/tcp
sapdp52 3252/tcp
sapdp53 3253/tcp
sapdp54 3254/tcp
sapdp55 3255/tcp
sapdp56 3256/tcp
sapdp57 3257/tcp
sapdp58 3258/tcp
sapdp59 3259/tcp
sapdp60 3260/tcp
sapdp61 3261/tcp
sapdp62 3262/tcp
sapdp63 3263/tcp
sapdp64 3264/tcp
sapdp65 3265/tcp
sapdp66 3266/tcp
sapdp67 3267/tcp
sapdp68 3268/tcp
sapdp69 3269/tcp
sapdp70 3270/tcp
sapdp71 3271/tcp
sapdp72 3272/tcp
sapdp73 3273/tcp
sapdp74 3274/tcp
sapdp75 3275/tcp
sapdp76 3276/tcp
sapdp77 3277/tcp
sapdp78 3278/tcp
sapdp79 3279/tcp
sapdp80 3280/tcp
sapdp81 3281/tcp
sapdp82 3282/tcp
sapdp83 3283/tcp
sapdp84 3284/tcp
sapdp85 3285/tcp
sapdp86 3286/tcp
sapdp87 3287/tcp
sapdp88 3288/tcp
sapdp89 3289/tcp
sapdp90 3290/tcp
sapdp91 3291/tcp
sapdp92 3292/tcp
sapdp93 3293/tcp
sapdp94 3294/tcp
sapdp95 3295/tcp
sapdp96 3296/tcp
sapdp97 3297/tcp
sapdp98 3298/tcp
sapdp99 3299/tcp
sapgw00 3300/tcp
sapgw01 3301/tcp
sapgw02 3302/tcp
sapgw03 3303/tcp
sapgw04 3304/tcp
sapgw05 3305/tcp
sapgw06 3306/tcp
sapgw07 3307/tcp
sapgw08 3308/tcp
sapgw09 3309/tcp
sapgw10 3310/tcp
sapgw11 3311/tcp
sapgw12 3312/tcp
sapgw13 3313/tcp
sapgw14 3314/tcp
sapgw15 3315/tcp
sapgw16 3316/tcp
sapgw17 3317/tcp
sapgw18 3318/tcp
sapgw19 3319/tcp
sapgw20 3320/tcp
sapgw21 3321/tcp
sapgw22 3322/tcp
sapgw23 3323/tcp
sapgw24 3324/tcp
sapgw25 3325/tcp
sapgw26 3326/tcp
sapgw27 3327/tcp
sapgw28 3328/tcp
sapgw29 3329/tcp
sapgw30 3330/tcp
sapgw31 3331/tcp
sapgw32 3332/tcp
sapgw33 3333/tcp
sapgw34 3334/tcp
sapgw35 3335/tcp
sapgw36 3336/tcp
sapgw37 3337/tcp
sapgw38 3338/tcp
sapgw39 3339/tcp
sapgw40 3340/tcp
sapgw41 3341/tcp
sapgw42 3342/tcp
sapgw43 3343/tcp
sapgw44 3344/tcp
sapgw45 3345/tcp
sapgw46 3346/tcp
sapgw47 3347/tcp
sapgw48 3348/tcp
sapgw49 3349/tcp
sapgw50 3350/tcp
sapgw51 3351/tcp
sapgw52 3352/tcp
sapgw53 3353/tcp
sapgw54 3354/tcp
sapgw55 3355/tcp
sapgw56 3356/tcp
sapgw57 3357/tcp
sapgw58 3358/tcp
sapgw59 3359/tcp
sapgw60 3360/tcp
sapgw61 3361/tcp
sapgw62 3362/tcp
sapgw63 3363/tcp
sapgw64 3364/tcp
sapgw65 3365/tcp
sapgw66 3366/tcp
sapgw67 3367/tcp
sapgw68 3368/tcp
sapgw69 3369/tcp
sapgw70 3370/tcp
sapgw71 3371/tcp
sapgw72 3372/tcp
sapgw73 3373/tcp
sapgw74 3374/tcp
sapgw75 3375/tcp
sapgw76 3376/tcp
sapgw77 3377/tcp
sapgw78 3378/tcp
sapgw79 3379/tcp
sapgw80 3380/tcp
sapgw81 3381/tcp
sapgw82 3382/tcp
sapgw83 3383/tcp
sapgw84 3384/tcp
sapgw85 3385/tcp
sapgw86 3386/tcp
sapgw87 3387/tcp
sapgw88 3388/tcp
sapgw89 3389/tcp
sapgw90 3390/tcp
sapgw91 3391/tcp
sapgw92 3392/tcp
sapgw93 3393/tcp
sapgw94 3394/tcp
sapgw95 3395/tcp
sapgw96 3396/tcp
sapgw97 3397/tcp
sapgw98 3398/tcp
sapgw99 3399/tcp
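These entries follow a fixed pattern: the SAP dispatcher service sapdpNN maps
to TCP port 32NN, and the SAP gateway service sapgwNN maps to TCP port 33NN,
where NN is the two-digit system number of the R/3 instance. For example, an
instance with system number 25 uses dispatcher port 3225/tcp (sapdp25) and
gateway port 3325/tcp (sapgw25).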
5. Click Source code. The function module header appears at the top of the
code.
The IMPORTING and EXPORTING parameters must be passed by VALUE, not by
REFERENCE (see the interface sketch after these steps). Instead of uploading
the code into the Editor, do the following:
a. Copy the code in the ZDSRFC.TXT file, from TABLES through the last
ENDIF, to the clipboard.
b. Paste the code to the Editor in the function module.
c. Check the syntax of the function module. The check should report No
syntax error found.
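For reference, a pass-by-value parameter appears in the function module
interface wrapped in VALUE( ). The sketch below is illustrative only: apart
from I_SERVE_TYPE, I_TABLE_NAME, T_FIELDS, and T_DATA, which are used later
in this procedure, the details are assumptions, and the authoritative
interface is the one defined in Transaction SE37:

    FUNCTION z_rfc_ds_service.
    *"------------------------------------------------------------------
    *"*"Local interface:
    *"  IMPORTING
    *"     VALUE(I_SERVE_TYPE) TYPE C   " passed by VALUE, as RFC requires
    *"     VALUE(I_TABLE_NAME) TYPE C
    *"  TABLES
    *"      T_FIELDS
    *"      T_DATA
    *"------------------------------------------------------------------
    ...
    ENDFUNCTION.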
6. Activate Z_RFC_DS_SERVICE by choosing Function module ➤ Activate (or
pressing Ctrl+F3).
7. As part of the installation process, verify that the function works in
your environment by checking the version number or by retrieving table
content:
SAP and RFC code version number. Do the following:
a. Start Transaction SE37, and type Z_RFC_DS_SERVICE in the Function
module field.
b. Press F8. The Test Function Module: Initial Screen appears.
c. Click Execute. The Test Function Module: Result Screen appears. The
SAP version number (for example, 46B) and the RFC utility version
number are displayed.
Table content. Do the following:
a. Start Transaction SE37, and type Z_RFC_DS_SERVICE in the Function
module field.
b. Press F8. The Test Function Module: Initial Screen appears.
c. Type 5 in the I_SERVE_TYPE field.
d. Type T000 in the I_TABLE_NAME field.
e. Click Execute. The Test Function Module: Result Screen appears. If
T_FIELDS and T_DATA contain records, the RFC utility works properly.
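T000 is the SAP client table; it exists in every R/3 system and always
contains at least one row, which makes it a convenient table for this
verification.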
B
Packaged Extractions for
ABAP
The DataStage ABAP Extract for SAP R/3 plug-in includes pre-packaged
extraction jobs that help you understand common extraction scenarios.
These pre-packaged extractions are provided as a DataStage export file
(Extracts.dsx). This file is located on the DataStage PACK for SAP R/3 client CD in
the templates directory. Use the DataStage Manager import function to import any
or all of the jobs contained in the file.
You can customize these jobs for your ETL initiatives by providing parameters
specific to your environment and by using the extraction object and corresponding
ABAP that already exist in the job. The ABAP Extract plug-in also includes jobs
that extract long text fields from R/3. The following list describes these jobs and
the SAP tables from which they extract data:
• LongTextCPIC. This job uses the CPIC data transfer method to extract long
text from VarChar fields. It decompresses the text using the READ_TEXT
function module.
Note: The Extraction Object and the corresponding ABAP program cannot
be regenerated, because regenerating them overwrites the code that
calls the function module.
• LongTextFTP. This job uses the FTP data transfer method to extract long
text from VarChar fields. It decompresses the text using the READ_TEXT
function module.
Note: The Extraction Object and the corresponding ABAP program cannot
be regenerated, because regenerating them overwrites the code that
calls the function module. If Path of Remote File in the Output ➤
Runtime tab is changed, you must manually search for the old path in
the ABAP program and replace it with the new one.
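For background, READ_TEXT is the standard SAP function module for reading
long texts. A minimal classic-ABAP sketch of a call follows; the text ID and
text object shown ('GRUN' and 'MATERIAL') are hypothetical values, and the
actual parameters used by these jobs are defined in their generated ABAP:

    DATA: l_name   LIKE thead-tdname,                  " text name
          lt_lines LIKE tline OCCURS 0 WITH HEADER LINE.

    CALL FUNCTION 'READ_TEXT'
      EXPORTING
        id        = 'GRUN'        " text ID (hypothetical)
        language  = sy-langu      " logon language
        name      = l_name        " for example, a material number
        object    = 'MATERIAL'    " text object (hypothetical)
      TABLES
        lines     = lt_lines      " returns the decompressed text lines
      EXCEPTIONS
        not_found = 1
        OTHERS    = 2.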
C
Properties for the IDoc
Extract Plug-In
The IDoc Extract stage contains grid-style stage and output properties that are
visible from the DataStage Designer. These are included for reference only since
you cannot configure the stage properly using the grid editor.
The next table includes the following column heads:
• Prompt is the text that the job designer sees in the stage editor user
interface.
• Description describes the properties.
The IDoc Extract stage supports the properties listed in the following table:
Properties
Prompt Description
USERNAME Stage. The user name used to connect to SAP.
PASSWORD Stage. The password used to connect to SAP.
CLIENT Stage. The client number used to connect to SAP.
DESTINATION Stage. The name of the physical connection defined in the
DSSAPConnections.config file.
LANGUAGE Stage. The language used to connect to SAP.
GWHOST Stage. The host on which the R/3 application message server
resides.
SYSNBR Stage. The system number of the R/3 instance.
ROUTERSTR Stage. Optional. A string used to connect to the remote SAP
server.
LOADBALANCING Stage. A Boolean value that determines whether this is a
load-balancing connection.
USEDDEFAULTSAPCONNECTION Stage. A Boolean value that, if true, specifies
that the default values from the DSSAPConnections.config file are used. All
stage properties up to this point are then ignored.
TESTMODE Stage. A value specifying that the job does not update the
bookmark file when it is run and that the listener does not start the job
automatically when its batch count threshold is exceeded.
SEGTYP Output. The segment type of the IDoc to be processed by this link.
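For illustration only, a hypothetical set of stage property values might
look as follows (every value here is invented; substitute the details of
your own SAP system):

    USERNAME       dsuser
    PASSWORD       ********
    CLIENT         100
    DESTINATION    DevR3
    LANGUAGE       EN
    GWHOST         r3host.example.com
    SYSNBR         00
    ROUTERSTR      (empty)
    LOADBALANCING  false
    TESTMODE       false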
D
File Permissions for IDoc
Extract Plug-In
This section documents permissions on files that are created by the IDoc Extract
Pack. For more information on umask, see the UNIX documentation for your
system.
Files created by the IDoc Extract Pack are subject to two sets of permissions,
depending on how they are created:
• The first set belongs to those files that are created by child processes of the
dsidocmgr executable, such as the dsidocsvr executable.
• The second set belongs to those that are created by DataStage client connec-
tions, such as the IDoc stage editor and the DataStage Administrator for
SAP.
These files inherit the user id and group id of the user logged into the
DataStage client and the umask setting of the DataStage server process
owner. Note that the umask is not that of the user logged in to the client.
When the IDoc Extract plug-in is installed, it creates a directory named
DSSAPConnections in the directory where the DataStage server is installed. This
directory is owned by root, and its permissions are set so that every user
can read and write it, regardless of the umask of the process.
A file called DSSAPConnections.config is created in the DSSAPConnections directory
when a user logs in using the DataStage client and has permissions subject to the
umask with which the DataStage server is started.
Several other files with the .config suffix reside in this directory. (The DataStage
client creates these for internal purposes.)
• The IDoc.SegmentAdminFields.3.config and IDoc.SegmentAdminFields.2.config
files are created the first time a user accesses meta data for an IDoc from a
particular ALE port version.
• The IDocCleanup.config file is created the first time a user accesses the IDoc
Cleanup and Archiving page in the DataStage Administrator for SAP.
• The SAPVersions.config file is created at the time the first connection to SAP
is created.
The user who first logs into DataStage to perform these activities owns these files,
which are subject to the umask setting that the DataStage server was started with.
When a connection to SAP is created, a directory with the same name given to the
connection is created containing a subdirectory called IDocTypes. These directories
are owned by the user who creates the connection, with permissions subject to the
umask setting that the DataStage server was started with.
Note: Users who want to create a connection to SAP either using the stage editor
or the DataStage Administrator for SAP must have write permissions to the
DSSAPConnections.config file and the DSSAPConnections directory.
When the first IDoc type is configured, a file called IDocTypes.config is created in the
IDocTypes directory, so the user who configures this first IDoc type must have write
permissions to the IDocTypes directory. Remember that this directory was
created by the user who defined the connection, who may not be the same
user as the one defining the first IDoc type.
When a user configures subsequent IDoc types, that user must have write
permissions to the IDocTypes.config file and permissions to create the directory
location (PSA) to be configured for that type. This directory can be configured
anywhere on the file system, but is created by default in the IDocTypes directory.
In the configured PSA, another file with the .config suffix is created that contains
parameters for this IDoc type. The user who logs into DataStage to configure this
IDoc type owns these files, which are subject to the umask setting that the
DataStage server was started with.
Any user who later modifies the configuration for an IDoc type must have write
permissions to:
• The IDocTypes.config file
• The directory the IDoc type is to be written to
• The IDoc type config file in that directory
If the modification is to specify a new directory location, the user must have
permission to write to the new directory location.
Jobs from the SAP R/3 PACKs in a project share the project's saprfc.ini
file, so all the users who want to run jobs in this project must be able to
write to this file. For example, if user A runs a Load PACK for SAP BW job,
and user B subsequently runs an IDoc Extract job, user B's job fails unless
user B has permission to remove and recreate this file.
Recommendations
• The umask setting should apply a consistent set of permissions to all users
who log in to DataStage for the purposes of administering the IDoc Extract
functionality. In addition, the umask setting should be consistent for all
users running DataStage jobs for all SAP R/3 products.
For example, if only one user will ever administer DataStage connections to
SAP or run DataStage jobs using SAP R/3 plug-ins, the umask should be 022.
If only users from the same group will perform these functions, the umask
should be 002. If you want users from different groups to administer
DataStage connections to SAP or to run these jobs, the umask should be 000.
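As a quick reference, the umask bits are removed from the default creation
modes (666 for files, 777 for directories):

    umask 022: files 644 (rw-r--r--), directories 755 (rwxr-xr-x)
    umask 002: files 664 (rw-rw-r--), directories 775 (rwxrwxr-x)
    umask 000: files 666 (rw-rw-rw-), directories 777 (rwxrwxrwx)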
Index
bookmark files for IDoc Extract 3-19
PSA definition 1-6
PSA maintenance 3-10
PSA property 3-9
R
R/3 source systems configuration
    IDoc Extract plug-in 3-5
R/3 upgrades
    changing segment meta data for IDoc Extract 3-28
R/3 Versions 3-29
restore bookmark files to original state 3-18
return parameters 5-17
RFC definition 1-6
RFC, importing for ABAP Extract 2-4
root segments
    IDoc Load plug-in 4-19
run methods, ABAP Extract 2-20
running background processes, ABAP Extract 2-15
running DataStage jobs automatically 6-8
S
saprfc.ini file 3-19, D-3
    IDoc Extract plug-in 3-19
    permissions D-3
Save IDoc Segment Definition dialog 3-35
    IDoc Load plug-in 4-18
scheduling jobs
    IDoc Extract 3-28
Segment definition 1-7
segment definition names
    IDoc extraction 3-29
segment meta data for R/3 upgrades
    changing for IDoc Extract 3-28
segment type definition 1-7
Select Application Component ID dialog
    BAPI plug-in 5-10
Select DataStage Connection to SAP dialog
    ABAP Extract plug-in 2-11
    BAPI plug-in 5-7
    IDoc Extract plug-in 3-22
    IDoc Load plug-in, Connection and Logon Details page, Connection
        Properties dialog 4-7
Select IDoc Component to Extract dialog 3-32
Select IDoc Component to Load dialog
    IDoc Load plug-in 4-15
Select IDoc Type dialog
    IDoc Extract plug-in 3-24
    IDoc Load plug-in 4-10, 4-11
selecting
    ABAP Extract data transfer methods 2-13
    control records or segments for IDoc extraction 3-32
selecting a BAPI
    BAPI plug-in 5-11
selecting DataStage connection to SAP
    IDoc Load plug-in 4-5
selecting IDoc components
    IDoc extraction 3-32
selecting IDoc Types
    IDoc Extract plug-in 3-23
selecting IDoc types
    IDoc Load plug-in
        IDoc Load Stage IDoc Type page 4-10
selecting segment definitions
    IDoc Load plug-in 4-15
specifying
    ABAP program properties, ABAP Extract 2-18
    specifying for IDoc Extract 3-29