
Abhinav Pande, PMP
Ab Initio Specialist
Location: Pasadena, CA
Phone: 716-816-5678

Profile

Eight years of professional experience in Information Technology, specializing in
data warehousing techniques and RDBMS. Extensive hands-on experience with ETL
tools such as Ab Initio and Informatica, with Teradata, Oracle, SQL Server, and DB2
as back-end databases. Working knowledge of UNIX shell scripting, COBOL copybooks,
and mainframe environments.

- Worked with batch graphs, Continuous>Flows, Web Service graphs, and
  Conduct>It (previously Plan>It).
- Helped install, configure, support, and upgrade complete Ab Initio
  environments for Production, QA, and Development.
- Trained production support teams to handle and support new Production
  environments, helped with their continued improvement, and provided Tier 2
  support for deployed applications.
- Experienced in SQL and UNIX shell scripting.
- Expertise in data migration, data transformation, and data loading with Ab
  Initio into Oracle, DB2, Teradata, and other databases and files.
- Exceptional ability to work independently and in a team environment; willing
  and able to quickly grasp and learn new ideas and concepts.
- Excellent written and oral communication skills and outstanding interpersonal
  and organizational skills.

Certifications

Project Management Professional (PMP)

Education
Bachelor's in Computer Engineering from the State University of New York at Buffalo
(Dean's List)

Software Expertise

Data Warehousing    Ab Initio (GDE 1.14.37, Co>Operating System 2.14.3/2.13.8/2.7.3)
Data Modeling       Star-Schema Modeling, Snowflake Modeling, Erwin 4.0, Visio
RDBMS               Teradata, Oracle 10g/9i/8.0/7.x, MS SQL Server, DB2
Programming         UNIX Shell Scripting, Korn Shell, SQL, SQL*Plus, PL/SQL
Operating Systems   Windows 2000/NT, UNIX

Work Experience

Client: Kaiser Permanente
Project 1: Dec 2007 - Present
Ab Initio Lead Developer

Kaiser Permanente is an integrated managed care organization based in Oakland,
California, founded in 1945. It is a consortium of three distinct groups of
entities: the Kaiser Foundation Health Plan, Inc. and its regional operating
organizations, Kaiser Foundation Hospitals, and the Permanente Medical Groups. As
of 2006, Kaiser Permanente had 8.7 million health plan members, 156,000 employees,
13,729 physicians, 37 medical centers, 400 medical offices, $34.4 billion in
annual operating revenues, and $1.3 billion in net income.

Roles and Responsibilities:

- Participated in logical and physical data modeling. Identified the entity
  types, their attributes, and the relationships between the entities in the
  organization's business process.
- Worked extensively with Tivoli Maestro for job scheduling and for organizing
  ETL scripts for different job functions.
- Handled extraction from heterogeneous source systems such as Oracle and
  internal and external flat files, built the transformations, and loaded
  formatted data into multifiles and serial files during the intermediate and
  final stages of the ETL process using Ab Initio.
- Used Ab Initio GDE to build graphs that generate the loan-level file without
  summaries and that summarize loan-level data records.
- Developed a number of Ab Initio graphs for the ETL processes based on
  business requirements, using components such as Partition by Key,
  Partition by Round-robin, Reformat, Rollup, Join, Scan, Gather,
  Replicate, and Merge.
- Extensively used Ab Initio's component, data, and pipeline parallelism.
- Used the Teradata utilities FastLoad, MultiLoad, and TPump for data
  loading (see the FastLoad sketch after this list).
- Configured the source and target database connections using .dbc files.
- Generated DB configuration files (.dml, .cfg) for source and target tables
  using db_config and modified them according to the requirements.
- Used sandbox parameters to check graphs in and out of the repository.
- Developed various Ab Initio graphs for data cleansing using functions such as
  is_valid, is_defined, is_error, string_substring, string_concat, and other
  string_* functions.
- Created subgraphs to impose application/business restrictions.
- Developed UNIX shell scripts to automate file manipulation and data loading.
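
As an illustration of the Teradata bulk loading noted above, the following is a
minimal Korn shell sketch of a FastLoad invocation; the database, table, file, and
column names are hypothetical and the logon details are placeholders.

#!/bin/ksh
# Minimal sketch: bulk-load a pipe-delimited flat file into an empty
# Teradata staging table with FastLoad. All names are placeholders.
fastload <<'EOF'
LOGON tdprod/etl_user,etl_password;
DATABASE stage_db;
BEGIN LOADING stage_db.member_stg
      ERRORFILES stage_db.member_err1, stage_db.member_err2;
SET RECORD VARTEXT "|";
DEFINE member_id   (VARCHAR(18)),
       member_name (VARCHAR(60)),
       region_cd   (VARCHAR(4))
FILE = /data/landing/member_delta.dat;
INSERT INTO stage_db.member_stg
VALUES (:member_id, :member_name, :region_cd);
END LOADING;
LOGOFF;
EOF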

Environment: UNIX, Oracle 8i/9i, Ab Initio (GDE 1.12.5.2, Co>Operating System
2.12.3), Shell Scripting, SQL.

Client: JP Morgan Chase, Columbus, OH
Project 2: Aug 2006 - Dec 2007
Sr. Ab Initio Developer

The Enterprise Data Warehouse (EDW) provides integrated information and services
that enable our Lines of Business partners to attract new customers, retain existing
customers, make informed financial decisions, and improve business practices.

The EDW is composed of a series of integrated dimensional data marts. These data
marts enable end users to directly query historical data relating to JP Morgan Chase
accounts, transactions, households, customers, products, and prospective customers.

- Worked for the Enterprise Data Warehouse team of JP Morgan Chase in Columbus
  as a Developer/Analyst.
- Gathered business requirements from business users for the financial
  interfaces coming from various source systems of JPMorgan Chase and Bank One.
- Designed and developed the ETL processes to process and load data coming from
  various source systems into the Enterprise Data Warehouse (EDW).
- Developed numerous graphs in Ab Initio for the Deposit and Customer team of
  the bank.
- Worked on production support during evenings and weekends.
- Used various components such as Sort, Reformat, Join, Lookup, Filter, and
  Dedup.
- Applied various performance-tuning techniques to improve graph performance.
- Responsible for the automation of Ab Initio graphs using the Maestro
  scheduling tool (see the wrapper sketch after this list).
- Coordinated with the offshore team.
- Documented technical manuals.
- Provided 24x7 on-call production support.
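
Automating graphs under Maestro typically reduces to the scheduler invoking a Korn
shell wrapper around the deployed graph script. The sketch below is a hypothetical
example; the sandbox path, graph name, and log locations are placeholders.

#!/bin/ksh
# Hypothetical wrapper a Maestro job would call: run a deployed Ab Initio
# graph (the .ksh produced by the GDE) and log the outcome.
AI_SANDBOX=/apps/edw/deposit/sandbox          # placeholder sandbox root
GRAPH=$AI_SANDBOX/run/load_deposit_fact.ksh
LOGDIR=/apps/edw/deposit/logs
RUN_DT=$(date +%Y%m%d_%H%M%S)
LOG=$LOGDIR/load_deposit_fact.$RUN_DT.log

print "$(date) starting $GRAPH" >> $LOG
$GRAPH >> $LOG 2>&1
rc=$?

if [[ $rc -ne 0 ]]; then
    print "$(date) graph failed with return code $rc" >> $LOG
    exit $rc            # a non-zero exit lets Maestro mark the job as failed
fi
print "$(date) graph completed successfully" >> $LOG
exit 0
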
Environment: UNIX, Oracle 9i, Ab Initio (GDE 1.13.5.2, Co>Operating System
2.12.3), Shell Scripting, SQL, Maestro Scheduling, PVCS Version Control,
Hyperion Brio, Toad, DB2.

Client: Bank One/JP Morgan Chase, DE
Project 3: Apr 2005 - Jul 2006
Sr. Ab Initio Developer

The data warehouse was built to handle the immense amount of data produced in the
bank's operations and to allow that data to be analyzed for correlations that
provide commercial advantages. This data warehouse, on an Oracle 8.1.7 database, was
built by integrating data from the bank's major source systems, such as DB2 and flat
files, and from external data sources available on different platforms.

Roles and Responsibilities:

- Analyzed the bank's business processes and interacted with the end users to
  gather informational requirements.
- Involved in the design and implementation of the data model for the Data
  Warehouse using a Star Schema.
- Developed and supported the extraction, transformation, and load (ETL)
  process for the Data Warehouse from heterogeneous source systems using Ab
  Initio.
- Used most of the commonly used Ab Initio components, such as Reformat, Join,
  Rollup, Normalize, Dedup, Sort, Input Table, and Output Table, and worked
  with parallelism using various partition methods such as Partition by Key
  and Partition by Expression.
- Developed source watchers that look for incoming flat files (deltas) from
  other servers and, once the required flat file is found, create indicator
  files that signal its availability to downstream processes (see the watcher
  sketch after this list).
- Implemented a 12-way multifile system to partition the data and ran various
  operations in parallel on the partitioned data.
- Configured the source and target database connections using .dbc files.
- Created .xfr and .dml files for the various transformations and for
  specifying record formats.
- Incorporated data parallelism into graphs using Partition by Key and
  Partition by Round-robin; used Partition by Round-robin to avoid data skew.
- Applied phasing to complex graphs to avoid table-lock situations while
  loading and updating tables.
- Involved in performance tuning of Ab Initio graphs using various performance
  techniques and best practices, such as using lookups instead of joins and
  using in-memory sort wherever possible.
- Deployed, tested, and ran the graphs as executable Korn shell scripts on the
  application system.
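
As a rough illustration of the source watchers described above, the sketch below
polls a landing directory for a daily delta file and drops an indicator (trigger)
file for downstream jobs; the directories, file name, and polling interval are
hypothetical.

#!/bin/ksh
# Hypothetical source watcher: poll for the daily delta file and create an
# indicator file that downstream jobs wait on. Names are placeholders.
LANDING=/data/landing/deltas
TRIGGER=/data/triggers
DELTA=customer_delta_$(date +%Y%m%d).dat
MAX_POLLS=120       # number of polls before giving up
SLEEP_SECS=300      # five minutes between polls

i=0
while [[ $i -lt $MAX_POLLS ]]; do
    if [[ -s $LANDING/$DELTA ]]; then
        touch $TRIGGER/${DELTA}.ready    # indicator for the downstream jobs
        print "$(date) $DELTA arrived, indicator created"
        exit 0
    fi
    sleep $SLEEP_SECS
    i=$((i + 1))
done

print "$(date) $DELTA did not arrive after $MAX_POLLS polls" >&2
exit 1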

Environment: Ab Initio GDE 1.12.3, Ab Initio Co>Operating System 2.12.3, Oracle 9i,
AIX 5.0, Solaris 5.8, Harvest.

Client: UBS Investment Bank, CT
Project 4: Jun 2003 - Mar 2005
ETL Developer

This project deals with banking transactions at the regional and country level. The
bank follows two systems, Advantage and E-Advantage. The Advantage application
captures data from flat files and Excel sheets, whereas the E-Advantage application
pumps a huge volume of data into the production database, which makes the data
impossible to maintain beyond 90 days. At the end of each day, every location has to
transfer its data to central zones, and each location generates daily reports for
different purposes, such as the loan portfolio and the non-sufficient-funds
portfolio. Extensive data warehousing was needed to maintain historical data at a
central location for integration and to analyze the business information from
different locations according to profit areas, serving the purpose of a DSS for
management.

Roles and Responsibilities:

- Developed and implemented the extraction, transformation, and loading of data
  from the legacy systems using Ab Initio.
- Extensively used transform components: Aggregator, Match Sorted, Join,
  Denormalize Sorted, Reformat, Rollup, and Scan.
- Performed metadata mapping from legacy source-system fields to target
  database fields; created Ab Initio DMLs and wrote complex XFRs to implement
  the business-logic transformations.
- Wrote UNIX Korn shell wrapper scripts that accept parameters and scheduled
  the processes using crontab and the job scheduler; handled the database load
  interface and denormalization using Ab Initio (see the wrapper and crontab
  sketch after this list).
- Involved in EME (Ab Initio metadata repository) setup and maintenance,
  version control, and debugging.
- Developed Ab Initio scripts for data conditioning, transformation,
  validation, and loading.
- Assisted in developing various reports using PL/SQL stored procedures.
- Wrote wrapper scripts to run the graphs, load the Data Warehouse, and verify
  record counts during loading.
- Developed shell scripts to automate file manipulation and data loading.
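
Below is a sketch of the parameterized wrappers and crontab scheduling mentioned
above; the script name, sandbox paths, and graph name are hypothetical, and the run
date is passed as an environment variable the deployed graph is assumed to read.

#!/bin/ksh
# Hypothetical parameterized wrapper: run_graph.ksh <graph_name> <run_date>
# Validates its arguments, then runs the deployed Ab Initio graph script.
if [[ $# -ne 2 ]]; then
    print "Usage: $0 <graph_name> <run_date YYYYMMDD>" >&2
    exit 2
fi
GRAPH_NAME=$1
RUN_DATE=$2
export RUN_DATE                     # assumed to be picked up by the graph
SANDBOX=/apps/eadvantage/sandbox
LOG=/apps/eadvantage/logs/${GRAPH_NAME}.${RUN_DATE}.log

$SANDBOX/run/${GRAPH_NAME}.ksh > $LOG 2>&1
exit $?

# Example crontab entry: run the daily load at 02:30 with today's date
# 30 2 * * * /apps/eadvantage/bin/run_graph.ksh load_advantage_daily $(date +\%Y\%m\%d)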

Environment: Ab Initio Co>Operating System 2.11.8, GDE 1.12.5.2, Oracle 9i, Toad
7.6.0.11, HP-UX 11.x, IBM PCOM 4.2, TSO/ISPF, Korn Shell Scripting.

Client: Walgreens, IL
Project 5: Jun 2001 - May 2003
ETL Developer

Walgreens is one of the leading pharmacy chains in the country. It provides various
services, including medications, pharmaceutical products, ExtraCare, and insurance.
Walgreens developed a data mart called the Risk Mart to perform risk analysis for
various lines of business such as pharmacy, insurance, and patient care. The data
for the Risk Mart was extracted weekly from the corporate data warehouse,
conditioned using Ab Initio ETL and UNIX shell programs, and loaded into an Oracle
database. Up to six years of history was maintained in the Risk Mart.

Roles and Responsibilities:

- Developed and implemented the extraction, transformation, and loading of data
  from the legacy systems using Ab Initio.
- Extensively used transform components: Aggregator, Match Sorted, Join,
  Denormalize Sorted, Reformat, Rollup, and Scan.
- Performed metadata mapping from legacy source-system fields to target
  database fields; created Ab Initio DMLs and wrote complex XFRs to implement
  the business-logic transformations.
- Wrote UNIX Korn shell wrapper scripts that accept parameters and scheduled
  the processes using crontab and the job scheduler; handled the database load
  interface and denormalization using Ab Initio.
- Involved in EME (Ab Initio metadata repository) setup and maintenance,
  version control, and debugging.
- Developed Ab Initio scripts for data conditioning, transformation,
  validation, and loading.
- Assisted in developing various reports using PL/SQL stored procedures.
- Wrote wrapper scripts to run the graphs, load the Data Warehouse, and verify
  record counts during loading (see the count-check sketch after this list).
- Developed shell scripts to automate file manipulation and data loading.
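
Below is a minimal sketch of the load-count verification mentioned above, comparing
the source file's record count with the row count loaded into Oracle; the file,
table, and connection details are placeholders.

#!/bin/ksh
# Hypothetical count check: compare records in the weekly extract file with
# rows loaded into a Risk Mart staging table. All names are placeholders.
SRC_FILE=/data/riskmart/weekly_claims.dat
ORA_CONN=riskmart_user/riskmart_pwd@RISKDB      # placeholder credentials

file_cnt=$(wc -l < $SRC_FILE)

table_cnt=$(sqlplus -s $ORA_CONN <<EOF | tr -d ' \t'
set heading off feedback off pagesize 0
select count(*) from riskmart.claims_stg;
exit;
EOF
)

if [[ $file_cnt -eq $table_cnt ]]; then
    print "Counts match: $file_cnt records"
    exit 0
else
    print "Count mismatch: file=$file_cnt table=$table_cnt" >&2
    exit 1
fi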

Environment: Ab Initio Co>Operating System 2.11.8, GDE 1.12.5.2, Oracle 9i, Toad
7.6.0.11, HP-UX 11.x, IBM PCOM 4.2, TSO/ISPF, Korn Shell Scripting.
