
2. Different options to export artifacts across landscapes (developer mode, delivery unit, and LCM)
Schema mapping
Authoring schema -> physical schema
(The schema mapping defined in the system also applies to the default schema property
of analytic and calculation views; this is particularly useful in script-based calculation views.)
The administrator should normally maintain the schema mapping in the target system before
importing the packages that contain the information models.
Once the schema has been created in the target system, first import all the relevant tables;
after that, and before importing the information models, define the schema mapping.
It is also possible to define a package-specific authoring schema.
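Conceptually, schema mapping is a lookup from authoring schema to physical schema that falls back to the authoring schema itself when no mapping is defined. A minimal sketch of that resolution logic (the schema names are illustrative; this models the behaviour only, not any SAP API):

```python
# Illustrative schema-mapping resolution: authoring -> physical.
# In SAP HANA the mapping is maintained in the target system before import;
# this sketch only models the lookup behaviour, not an SAP API.
schema_mapping = {
    "DEV_SCHEMA": "PROD_SCHEMA",      # illustrative entries
    "DEV_STAGING": "PROD_STAGING",
}

def resolve_schema(authoring_schema: str) -> str:
    """Return the physical schema; unmapped schemas resolve to themselves."""
    return schema_mapping.get(authoring_schema, authoring_schema)

print(resolve_schema("DEV_SCHEMA"))  # mapped to the physical schema
print(resolve_schema("OTHER"))       # unmapped: falls back to itself
```

The fallback mirrors the default behaviour: a view whose authoring schema has no mapping simply keeps that schema in the target system.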

Information models are stored in packages, and in order to export or import packages they
must be included in a delivery unit.

Objects can be exported using:

(1) Delivery unit
(2) Developer mode: a function to export individual objects to a directory on your local client computer

Objects can be imported using:

(1) Delivery unit: a function to import objects (grouped in a delivery unit) from a server or
client location, available in the form of a .tgz file
(2) Developer mode: a function to import objects from a client location

SAP HANA Application Lifecycle Management:

SAP HANA Application Lifecycle Management (HALM) supports you in all phases of an
SAP HANA application's lifecycle, from modeling your product structure through
application development, transport, assembly, and installation.
It is part of SAP HANA Extended Application Services (XS).

HALM is particularly well suited to SAP HANA transport landscapes with no ABAP
development (without CTS+) and with no requirement to synchronize transports with
other non-HANA content.
With HALM we can create, for example, products and delivery units, manage the
changes we make to our code, and transport our content from development to
production.

3. Hierarchies (level and parent-child): when to use each

Level-based hierarchy: requires each level in a separate column.

A level-based hierarchy defines named levels only and is used for level-based measures.
Example: a time dimension (Year > Quarter > Month > Week > Day).
The address hierarchy is another typical example of a level hierarchy.
Parent-child hierarchy:
Requires separate columns for the parent and child nodes.

A parent-child hierarchy is a hierarchy of members that all have the same type. It consists
of values that define the hierarchy through a parent-child relationship and does not contain named levels.
This contrasts with level-based hierarchies, where members of the same type occur only
at a single level of the hierarchy.

A parent-child hierarchy is based on a single logical table.

· Each row in the table contains two identifying keys: one to identify the member itself, the other to identify the
"parent" of the member.

Example:
An employee hierarchy might have no fixed levels; instead it holds the names of employees who are
managed by other employees.
Employees can have titles, such as Vice President. Vice Presidents might report to other
Vice Presidents, and different Vice Presidents can be at different depths in the hierarchy.

· A parent-child hierarchy is a value-based hierarchy.
· A level-based hierarchy is a structure-based hierarchy.
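The difference can be illustrated in code: a parent-child table holds (member, parent) pairs, so members of the same type can sit at arbitrary depths, whereas a level hierarchy fixes one column per level. A small sketch (the employee names are made up) that derives each member's depth and path from parent-child rows:

```python
# Parent-child rows: (member, parent); parent None marks the root.
rows = [
    ("CEO", None),
    ("VP Sales", "CEO"),
    ("VP Ops", "CEO"),
    ("Senior VP", "VP Sales"),  # a VP reporting to another VP:
    ("Rep", "Senior VP"),       # same member type at different depths
]

parent = {member: p for member, p in rows}

def path(member: str) -> list[str]:
    """Walk up the parent pointers to build the root-to-member path."""
    chain = []
    while member is not None:
        chain.append(member)
        member = parent[member]
    return list(reversed(chain))

for member, _ in rows:
    p = path(member)
    print(len(p) - 1, " > ".join(p))  # depth varies per member
```

Note that "VP Ops" sits at depth 1 while "Senior VP" sits at depth 2, which a fixed-level column layout could not express without empty levels; this is why ragged structures such as employee reporting lines call for a parent-child hierarchy.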

https://blogs.sap.com/2015/03/13/implementing-and-displaying-standard-hierarchy-with-sap-hana/

STANDARD HIERARCHY IN SAP

A standard hierarchy is a tree hierarchy used to organize the business processes of a controlling area.
The highest node of a standard hierarchy is normally the first business process group; the groups
created thereafter make up the remaining nodes of the standard hierarchy.

There are different standard hierarchies available in SAP ECC, such as cost center, profit center, and
cost element hierarchies.
There are three tables in SAP for the groups in a standard hierarchy:

 SETHEADER
 SETNODE
 SETLEAF

4. SDI (table replication task and flow graph)

Table replication task and flow graph

A table replication task loads data into custom tables in HANA.
When you execute a table replication task, it creates the custom target table and a virtual table;
if the task is defined with an initial load, the data is loaded, otherwise no data is loaded.
To execute a table replication task, a procedure is created that exposes a START_REPLICATION() method.
A flow graph is used instead when data cleansing needs to be performed before table replication.

The vision of SAP HANA smart data integration and SAP HANA smart data quality is to build the
data provisioning tier natively within SAP HANA, simplifying the SAP HANA landscape by removing an
entire tier previously built with heterogeneous, non-native tools.

The native integration capability of SAP HANA smart data integration is architected for on-premise,
cloud, or hybrid deployments, and supports all styles of data delivery, including:

 Federated
 Batch
 Real time (not all data sources)

Hana Smart Data Integration – Overview
Hana Smart Data Integration – Adapters
Hana Adapter SDK – Setup
Hana Adapter SDK – The first adapter
Hana Adapter SDK – The Open Source Adapters from Github
Hana Adapter SDK – Interaction via SQL
Hana Adapter SDK – The Adapter code
Hana Adapter SDK – Adapter capabilities
Hana Smart Data Integration – Batch Dataflows
Hana Smart Data Integration – Realtime Table Replication

6. Schedule a job in XS Admin

The .xsjob file enables you to run a service (for example, an XS JavaScript function or a SQLScript procedure)
at a scheduled interval.
Required roles: sap.hana.xs.admin.roles::JobAdministrator and
sap.hana.xs.admin.roles::HTTPDestAdministrator must be assigned to your user.
a. name.xsjob // job schedule definition
b. name.xshttpdest // HTTP destination details
c. name.xsjs // script to run on schedule
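As a sketch, a minimal .xsjob definition might look like the following; the package path, function name, and schedule values are illustrative, and the xscron string uses the seven-field form (year month day dayofweek hour minute second):

```json
{
    "description": "Nightly cleanup job (illustrative)",
    "action": "my.package:cleanup.xsjs::doCleanup",
    "schedules": [
        {
            "description": "Run every day at 01:00:00",
            "xscron": "* * * * 1 0 0"
        }
    ]
}
```

The action points at the function in the .xsjs file that the scheduler invokes on each run.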

Configuration to allow job scheduling:

1. In the SAP HANA studio, open the Administration perspective.
2. On the Configuration tab, expand xsengine.ini, add the scheduler section, and set the
parameter enabled = true.
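In the xsengine.ini configuration file this corresponds to the following fragment (section and parameter names as maintained in the Configuration tab):

```ini
[scheduler]
enabled = true
```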
7. Performance analysis using PlanViz

Performance Analysis

SQL Plan Cache
Contains two kinds of information:
(1) Parsed statements
(2) Call and runtime statistics for these statements
It can answer:
(1) How much time was spent actually executing a statement?
(2) Can my statement be reused, or does it need to be optimized?
Explain Plan
SQL explain plans are used to generate a detailed explanation of SQL statements.
They are used to evaluate the execution plan that the SAP HANA database follows to execute
a SQL statement.
The results of Explain Plan are stored in EXPLAIN_PLAN_TABLE for evaluation. To use
Explain Plan, the SQL query passed must be a data manipulation language (DML) statement.

It gives information about, for example, which engine was used.
Plan Visualization
PlanViz is a built-in SAP HANA performance tool that provides information about
the runtime of specific queries on data models.
PlanViz offers insight into the data transfer between the join, calculation, OLAP,
and SQL engines, which is a crucial element of performance.
With this information, developers and administrators can pinpoint possible
bottlenecks in the system and identify how to optimize data models to address these
issues.
Provides estimated and actual runtime statistics for memory, CPU time, parallelism, and
total runtime.
A plan can be saved as an XML file and viewed later.
