
How to know everything about BW Extractors

July 25th, 2013 by blogadmin.


An extractor, in simple terms, is used to extract data from various sources into BW.
For this purpose we have SAP pre-defined extractors (LO extraction, etc.) and customized extractors (generic extractors).
Application-specific BW content extractors:
LO Extraction:
Logistics refers to the process of getting a product or service to its desired location upon request, which involves transportation, purchasing, warehousing, etc.
Main areas in logistics are:
Sales and Distribution (SD) : application 11, 13, 08 (in LBWE T-code)
Materials Management (MM) : application 03, 02
Logistics Execution (LE) : application 12
Quality Management : application 05
Plant Maintenance (PM) : application 04, 17
Customer Service (CS) : application 18
Project System (PS) : application 20
SAP Retail : application 40,43,44,45
How does the data extraction happen?
Extraction can be done using either a full update or a delta update.
Full load: In the case of a logistics application, a full load or initialization extracts the data from the setup tables (which contain only historical data).
For a full update the data is taken from the setup tables, so in order to capture changes you would have to refill the setup tables every time, which is a laborious task.
It is therefore always advisable to go for delta loads, which make loading much easier.
Read the note below for details on delta loads:
Initialization: Data is fetched from the application tables into the setup tables (in LO extraction, the extractor does not allow direct communication with the application tables); from there, the data finally reaches the target (InfoCube/ODS). Remember that this is a one-time process.
Pre-requisites: Prior to initialization, make sure the following steps are completed:
1. Maintain Extract Structure
2. Maintain data sources
3. Activate Extract Structure
4. Delete Setup tables
5. Fill setup tables
Delta load: After a successful initialization, we can use the delta update to capture changed/new records.
Once a new transaction happens or an existing record is modified, upon saving it goes to the respective application table.
Pre-requisites: Prior to delta loads, make sure the following steps are completed:
1. Define periodic V3 update jobs
2. Set up the update mode (direct / queued / unserialized V3 update)
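To picture what the queued/V3 mechanism does, here is a small Python-style sketch (illustrative only, not SAP code; the queue names are simplified stand-ins for the extraction queue and the BW delta queue): document postings first land in an extraction queue, and the periodically scheduled collective run moves them into the delta queue in posting order, from where the delta InfoPackage picks them up.

# Conceptual sketch of the queued delta / V3 collective run (not SAP code).
from collections import deque

extraction_queue = deque()   # filled at posting time (cf. the extraction queue, LBWQ)
delta_queue = []             # read by the BW delta InfoPackage (cf. the delta queue, RSA7)

def post_document(doc):
    """Saving a logistics document also writes a delta record to the extraction queue."""
    extraction_queue.append(doc)

def v3_collective_run():
    """Periodic job: moves queued records into the delta queue in posting order."""
    while extraction_queue:
        delta_queue.append(extraction_queue.popleft())

def delta_infopackage():
    """BW side: reads and empties the delta queue."""
    records, delta_queue[:] = list(delta_queue), []
    return records

post_document({"vbeln": "4711", "qty": 10})
post_document({"vbeln": "4712", "qty": 5})
v3_collective_run()
print(delta_infopackage())   # both delta records, in posting order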
LO Delta Mode:
The InfoObject 0RECORDMODE helps in identifying the delta.
Check the delta field in the ROOSOURCE/RODELTAM table.
In the case of LO extraction it is ABR.
ABR: An after image shows the status after the change, a before image shows the status before the change with a negative sign, and the reverse image also shows a negative sign next to the record while flagging it for deletion. This serializes the delta packets. This process supports an update into an ODS object as well as into an InfoCube.
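A minimal Python-style sketch (illustrative only, with simplified field names) of what an ABR delta looks like when an order item quantity is changed from 10 to 12: the extractor delivers a before image with reversed signs and an after image with the new values, so an additive InfoCube nets out the change, while an ODS object can simply overwrite.

# ABR delta for a quantity change 10 -> 12 (illustrative, simplified field names).
# Record mode: 'X' = before image, '' = after image, 'R' = reverse image (deletion).
delta_records = [
    {"doc": "4711", "item": "10", "recordmode": "X", "quantity": -10},  # before image
    {"doc": "4711", "item": "10", "recordmode": "",  "quantity": +12},  # after image
]

# InfoCube (additive update): summing both images yields the net change of +2.
net_change = sum(r["quantity"] for r in delta_records)
print(net_change)  # 2

# ODS/DSO (overwrite update): the after image simply replaces the stored value.
ods = {("4711", "10"): 10}
for r in delta_records:
    if r["recordmode"] == "":          # keep only the after image
        ods[(r["doc"], r["item"])] = r["quantity"]
print(ods)  # {('4711', '10'): 12}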
FI extraction:
The FI module deals with the accounting and financial needs of an organization.
Financial Accounting is broken down into the following sub-modules:
Accounts Receivables
Accounts Payable
Asset Accounting
Bank Accounting
Consolidation
Funds Management
General Ledger
Special Purpose Ledger
Travel Management
Note: Only the key areas (AP/AR/GL/SL) are discussed briefly because of the complexity of the area.
We can extract financial data at totals level or at line-item level.
In general, we use R/3 line-item tables as the data source, to allow drill-down from summarized data to line-item details.
Financial Accounting data can be extracted directly from the tables.
Depending on the business requirement, we can use either FI-SL or the standard BW content extractors (FI-AR, FI-AP, and FI-GL) to fetch FI data.
FI-AR, FI-AP, and FI-GL:
General Ledger: All accounting postings are recorded in the General Ledger. These postings are made in real time to provide up-to-date visibility of the financial accounts.
Accounts Receivable: Accounts Receivable records all account postings generated as a result of customer sales activity. These postings are automatically updated in the General Ledger.
Accounts Payable: Accounts Payable records all account postings generated as a result of vendor purchasing activity. Automatic postings are generated in the General Ledger as well.
Standard FI DataSources:
0FI_GL_4 (G/L accounts, line items): takes the data from the FI document tables (BKPF/BSEG) that are relevant to General Ledger accounting (compare table BSIS).
0FI_AP_4 (AP line items) and 0FI_AR_4 (AR line items): selections are made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable).
How does the data extraction happen?
In FI extraction, 0FI_AR_4 and 0FI_AP_4 are linked with 0FI_GL_4 in order to maintain consistent data transfer from the OLTP system (this is called coupled data extraction, see OSS note 428571).
Note: Uncoupled extraction is possible with Plug-In PI 2002.2, see OSS note 551044.
0FI_GL_4 writes its entries into the time stamp table BWOM2_TIMEST in the SAP R/3 system with a new upper limit for the time stamp selection.
0FI_AP_4 and 0FI_AR_4 then copy this new upper limit for the time stamp selection during their next data extraction in the SAP R/3 system. This ensures the proper synchronization of Accounts Payable and Accounts Receivable accounting with respect to G/L accounting.
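The coupling can be pictured with a rough Python sketch (illustrative only, not the actual extractor code; the list below is a simplified stand-in for table BWOM2_TIMEST): 0FI_GL_4 fixes the new upper time stamp limit, and 0FI_AP_4 / 0FI_AR_4 never select beyond that limit.

# Rough sketch of coupled FI extraction (not the actual SAP extractor logic).
import datetime as dt

bwom2_timest = []   # simplified stand-in for time stamp table BWOM2_TIMEST

def extract_gl(now):
    """0FI_GL_4 sets a new upper limit for the time stamp selection."""
    ts_low = bwom2_timest[-1]["ts_high"] if bwom2_timest else dt.datetime.min
    ts_high = now
    bwom2_timest.append({"source": "0FI_GL_4", "ts_low": ts_low, "ts_high": ts_high})
    return ts_low, ts_high

def extract_ap_or_ar(source):
    """0FI_AP_4 / 0FI_AR_4 reuse the upper limit written by the last GL extraction."""
    gl_entries = [e for e in bwom2_timest if e["source"] == "0FI_GL_4"]
    ts_high = gl_entries[-1]["ts_high"]          # copy the GL upper limit
    bwom2_timest.append({"source": source, "ts_high": ts_high})
    return ts_high

extract_gl(dt.datetime(2013, 5, 16, 21, 5))
print(extract_ap_or_ar("0FI_AP_4"))   # same upper limit as the GL run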


Full load: Not a valid choice because of the large volumes of detailed R/3 transaction data.
Delta load:
Note: The delta identification process works differently for new financial records and for changed financial records.
New Financial Accounting line items posted in the SAP R/3 system are identified by the extractor using the time stamp in the document header (table BKPF, field CPUDT).
By scheduling an initialization InfoPackage, all the historical data can be loaded into BW from the application tables. This also sets the X indicator in field LAST_TS (Flag: X = last time stamp interval of the delta extraction); the flagged entry marks the most recent time stamp interval used by the delta extraction.
OLTPSOURCE   AEDAT/AETIM           UPD     DATE_LOW      DATE_HIGH     LAST_TS
0FI_GL_4     16 May 2013 / 21:05   Init    01 Jan 1990   15 May 2013
0FI_GL_4     24 May 2013 / 17:30   Delta   16 May 2013   23 May 2013
0FI_GL_4     21 Jun 2013 / 18:12   Delta   15 Jun 2013   20 Jun 2013   X
0FI_AP_4     18 May 2013 / 20:14   Init    01 Jan 1990   15 May 2013
After this, daily delta loads can be carried out, based on the time stamp, by scheduling delta InfoPackages.
During the delta load, the SAP R/3 system logs two time stamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST (fields TS_LOW and TS_HIGH).
In the case of changed FI documents, selections are based on the tables BWFI_AEDAT and (time stamp table) BWOM2_TIMEST (see OSS note 401646 for more details).
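The two cases can be sketched in a few lines of illustrative Python (the record layouts are simplified stand-ins for BKPF and BWFI_AEDAT): new documents are picked up by their entry date (CPUDT) inside the current selection interval, changed documents by their entry in the change log table.

# Illustrative selection logic for the FI delta (simplified stand-ins for BKPF / BWFI_AEDAT).
import datetime as dt

ts_low, ts_high = dt.date(2013, 6, 16), dt.date(2013, 6, 20)   # current selection interval

bkpf = [  # document headers: entry date CPUDT
    {"belnr": "100001", "cpudt": dt.date(2013, 6, 17)},   # new posting -> in interval
    {"belnr": "090009", "cpudt": dt.date(2013, 1, 10)},   # old posting -> skipped here
]
bwfi_aedat = [  # change log: documents changed after they were posted
    {"belnr": "090009", "aedat": dt.date(2013, 6, 18)},
]

new_docs = [d["belnr"] for d in bkpf if ts_low <= d["cpudt"] <= ts_high]
changed_docs = [c["belnr"] for c in bwfi_aedat if ts_low <= c["aedat"] <= ts_high]

print(new_docs, changed_docs)   # ['100001'] ['090009']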
Delta extraction using the delta queue method is also possible, in case we want:
Serialization of the records
To distribute delta records to multiple BW systems
FI Delta Mode:
A time stamp on the line items serves to identify the status of the delta. Time stamp intervals that have already been read are then stored in a time stamp table (BWOM2_TIMEST).
(The InfoObject 0RECORDMODE plays a vital role in deciding the delta. Check the delta field in the ROOSOURCE/RODELTAM table to identify the image.)
The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method).
AIE: This delta method is not suitable for filling InfoCubes directly in the BW system. The line items must therefore first be loaded in the BW system into an ODS object that identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.
It uses delta type E (pull), which means the delta data records are determined during the delta update by the DataSource extractor, updated to the delta queue, and passed on to BI directly from there.
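Why the after-image method cannot feed an InfoCube directly can be seen in a short illustrative Python sketch (simplified values, not SAP code): adding after images double-counts, whereas overwriting in an ODS object and forwarding the resulting difference works.

# After-image delta (AIE): an open item of 100 is later changed to 80.
after_images = [
    {"doc": "4711", "amount": 100},   # first extraction (initial posting)
    {"doc": "4711", "amount": 80},    # later delta: most recent status only
]

# Adding the images into an InfoCube would yield 180 - wrong.
print(sum(r["amount"] for r in after_images))          # 180

# ODS object with overwrite: keeps the latest status; the difference to the
# previous status can then be used to update InfoCubes additively.
ods, change_log = {}, []
for r in after_images:
    old = ods.get(r["doc"], 0)
    change_log.append(r["amount"] - old)               # delta to push onward
    ods[r["doc"]] = r["amount"]                        # overwrite with after image
print(ods["4711"], sum(change_log))                    # 80 80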
Check the helpful links below:
General ledger
Accounts payable/receivable
CRM extraction:
Customer relationship management (CRM) is broadly about managing relationships with customers, and is useful for analyzing customer, vendor, partner, and internal process information.
How does the data extraction happen?
We can do both full loads and delta loads, depending on the behavior of the CRM extractor.
Initialization:

During the initialization, all data that can be extracted using a DataSource is transferred from SAP CRM into SAP BW.
Execute the initialization of the delta process in SAP BW by creating and scheduling an InfoPackage.
SAP BW calls up the BW Adapter using the Service API.
The BW Adapter reads the data from the respective database.
The selected BDoc data is converted into the extract structure by a mapping module that is also entered in the BW Adapter metadata.
The type of Business Add-In (BAdI) that is called up by the BW Adapter depends on the BDoc type.
The requested data package is transferred to SAP BW using the Service API.
Any new postings or updates to old postings on the source system (CRM) side are communicated via Middleware in the form of a BDoc.
The flow controller for Middleware calls up the BW Adapter.
The BW Adapter first checks whether the change communicated via the BDoc is relevant for SAP BW. A change is relevant if a DataSource for the BDoc is active.
If the change is not relevant, it is not transferred to SAP BW and the process is complete.
If it is relevant, the BW Adapter calls up the corresponding mapping module and BAdI (the type of BAdI that needs to be called up in turn depends on the type of BDoc).
These finally convert the BDoc data into the extract structure.
Note: The mapping module and the BAdIs that are called up during the delta upload are the same as those called up during the initialization of the delta process.
The change is transferred to SAP BW using the Service API.
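A simplified Python sketch of the BW Adapter decision described above (the BDoc type, DataSource name and mapping function used here are placeholders, not guaranteed SAP names): the adapter forwards a BDoc only if an active DataSource exists for its BDoc type, and then applies the mapping module and BAdI that belong to that type.

# Simplified sketch of the BW Adapter delta handling (names are placeholders).
active_datasources = {"BUS_TRANS_MSG": "0CRM_SALES_ACT_1"}   # BDoc type -> active DataSource

def map_sales_bdoc(bdoc):
    """Placeholder mapping module: convert BDoc data into the extract structure."""
    return {"guid": bdoc["guid"], "status": bdoc["data"]["status"]}

mapping_modules = {"BUS_TRANS_MSG": map_sales_bdoc}

def bw_adapter(bdoc):
    ds = active_datasources.get(bdoc["type"])
    if ds is None:
        return None                      # not relevant for BW -> process complete
    record = mapping_modules[bdoc["type"]](bdoc)   # mapping module + BAdI per BDoc type
    return ds, record                    # handed to SAP BW via the Service API

print(bw_adapter({"type": "BUS_TRANS_MSG", "guid": "ABC", "data": {"status": "OPEN"}}))
print(bw_adapter({"type": "SOME_OTHER_BDOC", "guid": "XYZ", "data": {}}))   # None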
CRM Delta Mode:
The delta is identified and communicated via Middleware in the form of a BDoc to the BW Adapter.
CRM standard DataSources support AIMD (after images with deletion indicator, via the delta queue).
HR extraction:
The HR module enables customers to effectively manage information about the people in their organization, and to integrate that information with other SAP modules and external systems.
HR broadly has the following modules:
PA (Personnel Administration) and Organization Management
Personnel Development
Payroll Accounting
Time Management
Compensation
Benefits
Training and Events
The Personnel Administration (PA) sub-module helps employers track employee master data, work schedules, salary and benefits information. Personnel Development (PD) functionality focuses on employees' skills, qualifications and career plans. Finally, the Time Evaluation and Payroll sub-modules process attendance and absences, gross salary and tax calculations, and payments to employees and third-party vendors.
HR delivers a rich set of business content objects that covers all HR sub-functional areas.
How does the data extraction happen?
Before getting into how the data gets populated into the HR InfoCubes, let's understand the term infotype.
An infotype is a collection of logical and/or business-related characteristics of an object or person.
Here the data is extracted from infotypes (PA, PD, Time Management, etc.), and for a few other applications from cluster tables (Payroll, Compensation, etc.).
HR is basically master data centric, because it always relates to people-related InfoObjects such as Employee and Person. In most cases HR master data is defined as time dependent to enable historical evaluation. The HR R/3 system records a specific validity period for each infotype record.
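Time dependency can be illustrated with a short Python sketch (a generic example not tied to a specific infotype; the BEGDA/ENDDA field names mirror the usual validity fields): each master data record carries a validity interval, and a key date selects exactly one of them for historical evaluation.

# Time-dependent HR master data: validity intervals per employee (illustrative).
import datetime as dt

org_assignment = [   # simplified infotype-like records with BEGDA / ENDDA validity
    {"pernr": "00001234", "begda": dt.date(2010, 1, 1), "endda": dt.date(2012, 12, 31), "orgunit": "Sales"},
    {"pernr": "00001234", "begda": dt.date(2013, 1, 1), "endda": dt.date(9999, 12, 31), "orgunit": "Marketing"},
]

def read_on_key_date(records, pernr, key_date):
    """Return the record valid on the given key date (historical evaluation)."""
    for r in records:
        if r["pernr"] == pernr and r["begda"] <= key_date <= r["endda"]:
            return r

print(read_on_key_date(org_assignment, "00001234", dt.date(2012, 6, 1))["orgunit"])  # Sales
print(read_on_key_date(org_assignment, "00001234", dt.date(2013, 6, 1))["orgunit"])  # Marketing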
Procedure to extract the HR data:
Activate the DataSources in the source system (R/3).
Replicate the DataSources in the BW system.
Activate the business content in BW.
Populate the HR cubes with data by scheduling InfoPackages.
Note: Master data should be loaded first.
Except for Payroll and Time Management, all other sub-functional areas support only full loads.
In the case of full loads, the old data needs to be deleted to avoid duplicate records in the target.
Application-specific, customer-generated extractors:
Controlling:
Controlling is broken down into the following sub-modules:
Cost Element Accounting
Cost Center Accounting
Internal Orders
Activity-Based Costing (ABC)
Product Cost Controlling
Profitability Analysis
Profit Center Accounting
Note: Only CO-PA is discussed briefly because of the complexity of the area.
CO-PA:
Profitability Analysis allows management to review information with respect to the company's profit or contribution margin by business segment.
It can be obtained by the following methods:
Account-Based Analysis
Costing-Based Analysis
Note: The details are discussed after understanding the CO-PA data flow.
How does the data extraction happen?
When data is requested from SAP BW, the extractor determines which data source the data is to be read from. This depends on:
the update mode (full, initialization of the delta method, or delta update),
the definition of the DataSource (line-item characteristics selected apart from field REC_WAERS, or calculated key figures), and
the available summarization levels.
The extractor always tries to select the most appropriate data source, that is, the one with the smallest data volume.
Once an InfoPackage is executed, the SAP BW Staging Engine calls the CO-PA transaction data interface. The CO-PA extraction program for SAP BW uses the same replication method as the update program for CO-PA summarization levels.
On the BW side, only data that is at least 30 minutes old is received. This is to secure data integrity, because the time stamps from different application servers can differ slightly.
This retention period of 30 minutes is often described as a security delta or safety delta: the system only extracts data that is at least 30 minutes old.
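The safety delta amounts to nothing more than capping the upper selection limit, as in this rough Python sketch (illustrative only, not the actual CO-PA extractor):

# Rough sketch of the CO-PA safety delta: never select the last 30 minutes.
import datetime as dt

SAFETY_DELTA = dt.timedelta(minutes=30)

def selection_interval(last_upper_limit, now):
    """Next delta reads from the previous upper limit up to (now - 30 minutes)."""
    ts_low = last_upper_limit
    ts_high = now - SAFETY_DELTA    # records younger than this wait for the next run
    return ts_low, ts_high

low, high = selection_interval(dt.datetime(2013, 7, 25, 8, 0), dt.datetime(2013, 7, 25, 12, 0))
print(low, high)   # 2013-07-25 08:00:00  2013-07-25 11:30:00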
Account-Based Analysis:
For account-based CO-PA extraction, only full update from summarization levels is supported for releases up to and including Release PI2001.1.
In this case we can carry out a delta using the pseudo-delta technique: we do a selective full load based on selection conditions (e.g. fiscal period), then selectively drop the requests for the last period and reload the data that has changed.
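The pseudo-delta can be pictured as follows (a Python sketch only; in BW this is of course done via InfoPackage selections and request deletion, not dictionaries): the requests for the most recent fiscal period are dropped and that period is loaded again in full.

# Pseudo-delta sketch: drop the last fiscal period and reload it selectively.
target = {            # requests currently in the target, keyed by fiscal period
    "2013001": ["REQU_1"],
    "2013002": ["REQU_2"],
    "2013003": ["REQU_3"],               # last period: may still receive changes
}
source = {"2013003": ["REQU_3_RELOAD"]}  # selective full load for the last period

last_period = max(target)                # e.g. '2013003'
target.pop(last_period)                  # selectively drop the requests for that period
target[last_period] = source[last_period]   # reload the period with current data

print(target)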
From Release PI2001.2, the delta method can also be used.
Initialization: The initialization must be performed from a summarization level.
Delta update: The delta is read from line items.

During the delta load, the controlling area and fiscal period fields should be mandatory selections.
Note: If the data needs to be read from a summarization level, then the level must also contain all the characteristics that are to be extracted using the DataSource (entry * in maintenance transaction KEDV). Furthermore, the summarization level must have status ACTIVE.
Account-based CO-PA is part of the CO module. This means the data posted in account-based CO-PA is always synchronized with the CO module (CCA, OPA, PA, PS, etc.).
The CO tables are COEP and COBK (for the line items), and COSS and COSP (for the totals).
Costing-Based Analysis:
In the case of costing-based CO-PA, data can only be read from a summarization level if no characteristics of the line item are selected apart from the record currency (REC_WAERS) field, which is always selected.
An extraction from the segment level, that is, from the combination of the tables CE3XXXX / CE4XXXX (where XXXX stands for the operating concern), is only performed for full updates if no line-item characteristics are selected (as with summarization levels).
Initialization: There are two possible sources for the initialization of the delta method: summarization levels (if no characteristics of the line item are selected) or line-item level.
In the case of a summarization level, the system also records the time when the data was last updated/built.
If it is not possible to read data from a summarization level, data is read from line items instead.
Delta update: Data is always read from line items.
Costing-based CO-PA data is statistical data. This means that what is updated in CO-PA is not always equal to what is stored in the CO modules or in Finance. The cost element is also not always updated, and there are additional key figures used to store information about the type of costs or revenues.
To understand the various tables (CE1/CE2/CE3/CE4) involved in CO-PA extraction, please read BW data extraction.
CO-PA Delta Mode:
Extraction is based on a time stamp.
When data is extracted from CO-PA, a safety delta of half an hour is used for the initialization and the delta upload. This ensures that only records that are already half an hour old at the start of the upload are loaded into SAP BW. Half an hour was chosen as the safety delta to overcome any time differences between the clocks on the different application servers.
Please check the links below for more information:
Profitability analysis
FI-SL:
There are two types of ledgers in the FI-SL system:
Standard ledgers: delivered by SAP, e.g. General Ledger Accounting (FI-GL)
Special purpose ledgers: designed as per business needs (user defined, e.g. FI-SL)
The FI-SL DataSources can supply data both at totals record level and at line-item level.
How does the data extraction happen?
Prerequisite: Since FI-SL is a generating application, the DataSource, the transfer structure and the assignment of the DataSource to the InfoSource must be created manually.
FI-SL line items:
The line-item DataSource provides actual data at line-item level.
Full and delta mode: FI-SL line items can be extracted both in full and in delta upload mode. The time stamp (TIMSTAMP field in the extract structure) is used to identify the delta load; it is supplied from the CPUDT and CPUTM fields in the line-items table. It uses a safety delta set to one hour, which means that posted line items can be loaded into BW after an hour.
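A short Python sketch of how such a time stamp filter behaves (illustrative; the line-item structure is a simplified stand-in): the posting date and time are combined into one time stamp, and only items older than one hour are selected.

# FI-SL line-item delta sketch: time stamp built from CPUDT + CPUTM, one-hour safety delta.
import datetime as dt

line_items = [   # simplified stand-ins for FI-SL line items
    {"docnr": "1", "cpudt": dt.date(2013, 7, 25), "cputm": dt.time(9, 15)},
    {"docnr": "2", "cpudt": dt.date(2013, 7, 25), "cputm": dt.time(11, 50)},
]

now = dt.datetime(2013, 7, 25, 12, 0)
cutoff = now - dt.timedelta(hours=1)     # safety delta of one hour

selected = [
    item["docnr"]
    for item in line_items
    if dt.datetime.combine(item["cpudt"], item["cputm"]) <= cutoff   # TIMSTAMP <= cutoff
]
print(selected)   # ['1'] - item 2 is younger than an hour and waits for the next load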
Constraint:

The extract structure does not contain the BALANCE field. Refer to note 577644 to find alternative ways to populate this field.
FI-SL totals records:
This DataSource can provide both actual and plan data at totals record level.
Full update: The full update DataSource can be used to determine the balance carry-forward, since the line-items DataSource does not supply this.
Usually, plan data is transferred using the totals DataSource in full update mode.
Delta update: The delta method can only be used for actual data with the selection (0VTYPE = 010). The delta method is based on delta queue technology; that means that after the initialization, the relevant data is posted to the delta queue during updating.
Before running the delta, please check the restrictions in the link below:
Delta - Special Ledger
Part 3: Cross-application generic extractors
When none of the SAP pre-defined extractors meets the business demand, the choice is to go for generic extraction.
We go for generic extraction when:
1. Business content does not include a DataSource for your application.
2. Business content requires additional enhancements that need data that is not supplied by SAP BW.
3. The application does not feature its own generic data extraction method.
4. The requirement demands using your own programs to fill your tables in the SAP system.
Check the links below for more information:
Generic extraction
Generic delta
Generic Extraction via Function Module
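For generic DataSources, the delta is typically identified by a delta-relevant field (a time stamp, calendar day, or ascending numeric pointer) together with a safety interval. A rough Python sketch of the idea (illustrative only; the field name and interval value are assumptions for the example, not fixed SAP defaults):

# Generic delta sketch: select by a delta-relevant time stamp field with a safety interval.
import datetime as dt

SAFETY_UPPER = dt.timedelta(minutes=5)     # assumed safety interval for this example

def generic_delta(rows, last_pointer, now, field="changed_at"):
    """Return rows whose delta-relevant field lies in (last_pointer, now - safety]."""
    upper = now - SAFETY_UPPER
    selected = [r for r in rows if last_pointer < r[field] <= upper]
    return selected, upper                 # 'upper' becomes the new delta pointer

rows = [
    {"id": 1, "changed_at": dt.datetime(2013, 7, 25, 10, 0)},
    {"id": 2, "changed_at": dt.datetime(2013, 7, 25, 11, 58)},
]
selected, new_pointer = generic_delta(rows, dt.datetime(2013, 7, 25, 9, 0),
                                      dt.datetime(2013, 7, 25, 12, 0))
print([r["id"] for r in selected], new_pointer)   # [1] 2013-07-25 11:55:00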
Data recovery:
Scenario 1: The last delta run failed (not applicable to ALE-based DataSources).
Solution:
Set the QM status to red and delete the request from all targets, so that the delta can be requested again.
Scenario 2: The delta was running fine every day, but you suddenly find that the delta is missing for a certain period (the reason may be anything).
Solution:
1. Reload the data from the PSA.
2. Reload the data from an ODS object or an InfoCube (in a layered architecture, EDW approach).
Applicable to logistics:
Please refer to 'One stage stop to know all about BW Extractors - Part 1' to get an idea of logistics extraction.
If options 1 and 2 are not applicable, the only choice is to extract the data from the source system.
Check this OSS note: 691721 (Restoring lost data from a delta request).
Here again we have one more constraint: as explained in the above OSS note, because of the huge data volume we cannot bear the downtime caused by a re-initialization. We have a workaround here:
1. In BW, transfer the existing target contents to an external source using open hub services.
2. Then selectively fill the setup tables with the missing data for the respective duration.
3. Run the initialization and schedule V3 jobs to enable delta postings.
Source: http://scn.sap.com
