
Informatica Configuration details:

Domain Name: - DOM_BIINF_PROD


Gateway Host: - inf-prod-biinf-a1.vmware.com
Gateway Port: - 6005

Step 1: - Open the Informatica Workflow Monitor.

Step 2: - Once it is open, add the repository as shown below.


Step 3: - Once that is done, the repository RS_BIINF_PROD will appear under Repositories. Double-click on the
repository and it will prompt for the user name and password.
Step 4: - It will then show the Integration Services. For EDW_HANA we have one Integration Service,
INT_SRVC_EDW. Double-click on the EDW_HANA Integration Service; it will open all the folders and the
scheduled and running jobs under the Integration Service.
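
The same jobs can also be checked outside the GUI through the PowerCenter repository (MX) views. A minimal sketch, assuming access to the repository schema and the standard REP_WFLOW_RUN view; the Running status code (6) should be verified against your PowerCenter version:

Ex: - Currently running workflows (assumes REP_WFLOW_RUN MX view)

select workflow_name, start_time, run_status_code
from REP_WFLOW_RUN
where run_status_code = 6 -- 6 = Running in standard MX views (verify for your version)
order by start_time desc;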

Metadata DB Connection Details: -

User Name: - EDW_HANA_METADATA
Password: - ed3hana
Host: - ORA-PROD-BIINF-D1.VMWARE.COM
Port: - 1521
Connection String: - INFBIPRD
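
Once connected, a quick sanity check can be run against one of the metadata tables documented below (a minimal sketch; select access on this schema is assumed):

Ex: - Connection sanity check

select * from EDW_HANA_FILE_METADATA
where rownum <= 10; -- Oracle-style row limit, just to confirm access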
Monitoring: -

All the jobs are scheduled under the folder WF_SCHEDULE_LOAD.
If any job is not on schedule, an alert triggers every 4 hours and sends an email to the support
team DL. A manual cross-check is sketched below.
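
Between alert runs, the repository (MX) views can be queried for workflows whose latest run is older than the alert window. A sketch, assuming the standard REP_WFLOW_RUN view on the Oracle repository:

Ex: - Workflows with no run in the last 4 hours (assumes REP_WFLOW_RUN MX view)

select workflow_name, max(start_time) as last_run
from REP_WFLOW_RUN
group by workflow_name
having max(start_time) < sysdate - 4/24 -- 4-hour window, matching the alert interval
order by last_run;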

All the main jobs are scheduled in WF_SCHEDULE_LOAD, and the name identifies which subject area each
one relates to. These main jobs trigger the subject-area-specific workflows.

These jobs are categorized into multiple types.

1) Table to table loads: -


All the table to table loads run under the specific subject area folders, and most
of them are incremental loads. These jobs are called PSA loads; a sketch of the typical incremental filter is shown below.
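
The incremental logic typically filters the source on a last-updated timestamp held in an Informatica mapping parameter. A minimal sketch; the table, column, and parameter names here are hypothetical, only the $$ parameter syntax is standard Informatica notation:

Ex: - Typical incremental (PSA) source filter (hypothetical names)

select *
from SRC_TABLE -- hypothetical source table
where LAST_UPDATE_DATE > to_date('$$LAST_RUN_DATE', 'YYYY-MM-DD HH24:MI:SS'); -- $$LAST_RUN_DATE is a hypothetical mapping parameter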

2) Flat file loads: -


All the flat file loads run under the folder SRC_LOAD_VFLATFILE, and the main job is
wf_SCHEDULE_FLATFILES under the WF_SCHEDULE_LOAD folder. It is scheduled to run
every 5 minutes. Files are picked up for load based on the user modification date and timestamp;
this logic is handled in the Unix shell script (edw_hana_file_sniffer.sh).
All the flat file related information (file name, business folder, Infa folder and Infa job details)
is stored in the metadata table EDW_HANA_FILE_METADATA.
All the flat file load stats/run history are stored in the EDW_HANA_FILE_LOG table.
These loads are called user flat file loads; see the example query below.
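
To check the recent run history for a given file, the two tables can be joined. A sketch; the join key and column names (FILE_NAME, LOAD_STATUS, LOAD_DATE) are assumptions and should be confirmed against the actual table definitions:

Ex: - Recent run history for one file (hypothetical column names)

select m.FILE_NAME, l.LOAD_STATUS, l.LOAD_DATE
from EDW_HANA_FILE_METADATA m
join EDW_HANA_FILE_LOG l
on l.FILE_NAME = m.FILE_NAME -- assumed join key
where m.FILE_NAME like '%<file name>%'
order by l.LOAD_DATE desc;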

3) HANA Stored Procedures loads: -

All these SP loads run under the folder VSRC_PSA_TO_CORE. They run
based on the batch name passed by the main jobs.
All the SP related information is stored in the table EDW_HANA_JOB_STATUS, and the
dependencies are stored in the EDW_HANA_JOB_DEPENDENCY table.
If any job fails, check the Infa command logs and the HANA audit table
"VCORE"."EDW_DML_AUDIT_LOG" based on the SP/routine_name.
These loads are called CORE loads.

Ex: -

Infa Metadata SQL

select * from EDW_HANA_JOB_STATUS
where batch_name = 'WEEKLY_MARKETING'
and is_active = 'Y';

HANA SP Audit SQL

select * from "VCORE"."EDW_DML_AUDIT_LOG"
where routine_name like '%VSP_FCT_DSS_GSS_PULSE_AMER%'
order by date_inserted desc;
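
Dependency SQL (a sketch; only the table name comes from this page, the batch_name column is assumed to mirror EDW_HANA_JOB_STATUS)

select * from EDW_HANA_JOB_DEPENDENCY
where batch_name = 'WEEKLY_MARKETING'; -- assumed column, mirrors EDW_HANA_JOB_STATUS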
All these loads can be monitored through the BOBJ report (HANA Data Load Audit).
There are separate audit tables for each load type (PSA, Core, user flat file loads, etc.).