PROFESSIONAL SUMMARY
Eight-plus (8+) years of IT experience in the analysis, design, development, testing, and implementation of business
application systems across the healthcare, financial, telecom, energy, oil, and services sectors.
Experienced in working with business users to analyze business process requirements, document them, and design and
roll out the deliverables.
Experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data
mapping, build, unit testing, systems integration and UAT.
Strong experience in Extraction, Transformation, Loading (ETL) data from various sources into Data Warehouses and
Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor).
Hands-on experience building mappings with varied transformation logic, including Unconnected and Connected Lookup,
Router, Aggregator, Joiner, Update Strategy, Expression, Sorter, Rank, Sequence Generator, Normalizer, and reusable
transformations.
Extensively worked on Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation
Developer, Mapplet Designer, and Mapping Designer.
Strong experience with Workflow Manager tools - Task Developer and Workflow & Worklet Designer.
Expertise in working with relational databases such as Oracle 11g/10g, SQL Server 2008, MySQL, DB2 8.0/7.0, MS
Access, PostgreSQL, and Greenplum.
Strong experience in coding SQL and PL/SQL procedures, functions, triggers, and packages.
Hands-on experience tuning mappings and identifying and resolving performance bottlenecks at various levels,
including sources, targets, mappings, and sessions.
Solid experience in the installation, administration, and monitoring of the Informatica server, domains, repositories,
and Informatica monitoring tools.
Highly proficient in processing tasks, scheduling sessions, importing/exporting repositories, managing users and
groups, and handling deployment activities in Dev, QA, and Prod environments.
Experience in developing repositories (RPD) using the Admin Tool, i.e., the Physical, BMM, and Presentation layers.
Well versed in UNIX shell scripting.
Experience using automated scheduling tools such as Control-M, CA Workload Automation (CAWA), and the Informatica scheduler.
Practical understanding of database schemas such as the Star Schema and Snowflake Schema used in relational and
dimensional modeling.
Exceptional analytical and problem-solving skills. Team Player with the ability to communicate effectively at all levels
of the development process.
TECHNICAL EXPERTISE
ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, Oracle Data Integrator 11g/12c
Databases: Oracle 11g/9i/8.x, SQL Server, DB2, MySQL, MS Access, Greenplum
Operating Systems: Unix/Linux, Windows (7,8,10)
Packages: SQL*Plus, Toad 7.x, SQL Developer
Data Modeling: Erwin
Scheduling Tools: Control-M, CAWA.
PROFESSIONAL EXPERIENCE
Tektronix Communications March 2017 to Present
Client: Vodacom, SA
Informatica/ETL Developer
The NAD solution is a standard ETL environment that takes Voice, SMS, and Bearer & GN IMSI FACT data from touchpoint. It
then transforms that data using Informatica, enriching it with dimensional data also sourced from touchpoint. Finally, the
transformed data is loaded into a target warehouse data mart.
In addition, it performs aggregation and aging, and maintains a few slowly changing dimensions that provide a rich
historical view of its datasets. Reporting and data access are via a Cognos framework.
Extract: 15-minute data (the level of granularity) is extracted from the touchpoint database using customized database views.
This data is loaded into tables in the staging schema (Staging).
Transform: The 15-minute data is aggregated to Hourly, Daily, Weekly, and Monthly periods. In addition, the data is
structured around specific dimensions such as IMSI, Cell, and Device.
Load: The data is loaded into the Data Warehouse from Staging. The primary mechanism for moving data from Staging to the
Data Warehouse is Oracle Partition Exchange. The benefit of this mechanism is that once the data is processed it is
incorporated into the Data Warehouse in a single step, thereby ensuring the integrity of the data in the Data Warehouse.
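The partition-exchange load described above can be sketched in Oracle SQL; the table, schema, and partition names below are illustrative placeholders, not the project's actual objects:

```sql
-- Illustrative sketch only: once the staging table is fully loaded and
-- indexed, a single dictionary operation swaps it into the target
-- partition with no data movement, so the warehouse sees the new data
-- atomically in one step.
ALTER TABLE dw.fact_usage_daily
  EXCHANGE PARTITION p_20170301
  WITH TABLE stg.usage_daily_load
  INCLUDING INDEXES
  WITHOUT VALIDATION;
```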
Environment: Informatica Power Center 9.5.1, UNIX, Oracle 11g, SQL Developer
Environment: Oracle Data Integrator 11g/12c, Oracle Communication Data Model 11.2.5, BRM Adapter, Oracle 11g, Toad,
Unix
Responsibilities:
• Developed the CIM ETL mapping using Informatica PowerCenter to load data from an XML source into 48
staging tables in the Greenplum database.
• Created 48 individual mappings to load data from Staging to ODS per business requirements. This ODS
data is used for the EAS upstream requirement.
• Modified the existing XSD by adding three more classes (Series Reactor, Meter Bank, Meter Device) for the new CIM
model (XML source).
• Created two new Informatica environments (Unit_test, Sus_unit_test @INFATST) for the databases, for testing
the code before promoting it to QA and production.
• Changed the database functions in Dev that were using old PostGIS functions to use new PostGIS functions so
they can execute on GP 4.3.5, and performed unit testing. Created source-to-target (XML to staging,
then staging to ODS) mapping documents in Excel.
• Revised and modified the existing documentation in the BC Hydro SharePoint environment to incorporate the
new business rules for smart metering integration (SMI).
• Used Enterprise Architect (EA) to manage the IEC Common Information Model (CIM), CIM profiles, and
CIM-based artifacts.
• Used pgAdmin as the administration and development tool for PostgreSQL.
Environment: ETL, Informatica Power Center 9.1, Oracle DBMS 10g, 11g, flat file, TIBCO, Greenplum DBMS, Informatica Power
Exchange for JMS, Pivotal Hadoop, Informatica Data Replication 9.1.
Groundswell Group, Vancouver, BC June 2014 to September 2014
Client: Provincial Health Services Authority
Sr. Integration Consultant/Informatica Administrator
The Provincial Health Services Authority's (PHSA) primary role is to ensure that BC residents have access to a coordinated
network of high-quality specialized health care services.
Project: Data Migration: The Panorama Data Conversion Project handles the entire set of Extract, Transform, and Load
(ETL) interfaces required to migrate data from the legacy iPHIS system and the Sexually Transmitted Infections
Information System (STIIS) into the pan-Canadian health surveillance and management system, Panorama.
Responsibilities:
• Designed and developed Informatica mappings to load data for STI (Sexually Transmitted Infection System)
from an XML source to the target.
• Extensively used Lookup, Sorter, Aggregator, Filter, Expression, Normalizer, Mapping Variables and
Parameters, Sequence Generator, and Mapplets.
• Used JAMA for requirements capture, and used SQL Developer extensively for running SQL queries, exporting
data to desired formats, debugging, and testing.
• Converted text files to PDF using a Java-based solution and a Perl application as part of the project requirements.
Used an external loader to attach the files.
Environment: Informatica Power Center 9.5 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Oracle 11g,
SQL Developer for Oracle, UNIX, JAMA for requirements gathering.
Responsibilities:
• Implemented new business requirements by integrating all the fleet code into Spira using Informatica best practices.
The redesigned work also includes error reporting per the new logic. There are about 37 business rules (BR01-BR37)
for this integration process.
• Performed performance tuning that improved session performance by 80%, and configured email
notifications at the workflow and session levels.
• Configured all Informatica sessions for email notification to different business users.
• Created complex mappings using Unconnected Lookup, Sorter, Aggregator, newly changed Dynamic Lookup, and
Router transformations to populate target tables in an efficient manner.
• Prepared mapping specification documents and a source-to-target field matrix.
Groundswell Group, Calgary, AB February 2013 to March 2014
Client: Shaw
Integration Consultant
Shaw is a leading communications provider in Canada, providing internet, phone, and TV services to its
customers. Worked on three separate projects:
Responsibilities:
• Conducted various meetings with different teams to gain knowledge of the business logic.
• Identified bugs in the logic and reported them to the business with possible solutions and implementations.
• Prepared a source-to-target mapping document with detailed transformation logic for each field and the related
join tables.
• Developed complex mappings for various targets, then tested and promoted them to QA without a single defect.
• Extracted data from different sources such as Oracle, Oracle E-Business Suite (EBS), flat files, and Mainframe.
• Extensively used the Debugger to debug mappings and determine whether data discrepancies came from the
source or from business rules being applied incorrectly.
• Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update
Strategy, Rank, Expression, and Lookup (connected and unconnected), to implement the business logic using
PowerCenter.
• Created mapping specification documents following Informatica best practices.
• Created and executed MOPs in Development and promoted the code to Testing, Lab, and Prod.
Environment: Informatica 9.5.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager),
Oracle 11g, Toad for Oracle, UNIX.
Truven Health Analytics (formerly Thomson Reuters Healthcare), IL August 2012 to January 2013
Senior ETL Developer
Responsibilities:
• Developed PL/SQL procedures, cursors, functions, and packages for cleansing addresses, matching persons,
updating person hoh_id, inserting new records, applying NCOA updates, and setting sensitive flags on
transactions to find matching criteria.
• Involved in Performance tuning of the Informatica mappings using various components like Parameter files,
Variables and Cache.
• Created UNIX shell scripts for scheduling various data-cleansing scripts and load processes. Maintained
the batch processes using UNIX shell scripts.
• Performed small enhancements (data cleansing/data quality).
• Involved in defining source-to-target data mappings, business rules, and business and data definitions.
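The address-cleansing and person-matching logic described above can be illustrated with a minimal Python sketch; the abbreviation table, function names, and matching rule are assumptions for illustration, not the project's actual PL/SQL:

```python
import re

# Hypothetical abbreviation table; the real cleansing rules were
# implemented in PL/SQL and were more extensive.
ABBREVIATIONS = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD", "APARTMENT": "APT"}

def clean_address(raw: str) -> str:
    """Uppercase, strip punctuation, collapse whitespace, abbreviate tokens."""
    s = re.sub(r"[^\w\s]", " ", raw.upper())
    tokens = [ABBREVIATIONS.get(t, t) for t in s.split()]
    return " ".join(tokens)

def same_household(addr_a: str, addr_b: str) -> bool:
    """Two persons match to one household if their cleansed addresses agree."""
    return clean_address(addr_a) == clean_address(addr_b)
```

Cleansing both sides before comparing is what makes "123 Main Street" and "123 MAIN ST." land in the same household.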
Environment: Informatica 9.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Oracle 10g, UNIX,
Toad 9.7
Lehigh Valley Health Networks, PA March 2011 to July 2012
Senior ETL / Informatica Developer
Project:
Lehigh Valley Health Network is one of America's best hospitals providing healthcare services. It provides a comprehensive
range of inpatient, clinical, and diagnostic services in more than 40 areas of medical specialties and subspecialties. During
this project I worked closely with the data warehouse development team, customers, business analysts, and other colleagues
in the ITS department to analyze operational data sources, determine data availability, define the data warehouse schema, and
develop ETL processes for the creation, maintenance, administration, and overall support of the data warehouse.
Responsibilities:
• Designed and developed Informatica mappings to load data from different source systems to the Data
Warehouse.
• Extensively used Mapping Variables, Mapping Parameters, and Parameter Files to capture delta loads, and
populated Slowly Changing Dimensions using Informatica.
• Created transformations such as Lookup, Joiner, Rank, and Source Qualifier in Informatica
Designer.
• Created complex mappings using Unconnected Lookup, Sorter, Aggregator, newly changed Dynamic Lookup,
and Router transformations to populate target tables in an efficient manner.
• Created Mapplets and used them in different mappings.
• Provided Knowledge Transfer to the end users and created extensive documentation on the design,
development, implementation, daily loads and process flow of the mappings.
• Worked with session logs, Informatica Debugger, and Performance logs for error handling.
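The slowly-changing-dimension population mentioned above can be illustrated with a minimal Type 2 sketch in Python; the in-memory "dimension table", column names, and function are illustrative assumptions, since the actual work was done in Informatica mappings:

```python
from datetime import date

def apply_scd2(dimension: list, key: str, new_attrs: dict, load_date: date) -> None:
    """Type 2 SCD: expire the current version of a changed row, insert the new one.

    Each row is a dict with 'key', 'attrs', 'start_date', and 'end_date'
    (None while the row is the current version).
    """
    for row in dimension:
        if row["key"] == key and row["end_date"] is None:
            if row["attrs"] == new_attrs:       # unchanged: keep current version
                return
            row["end_date"] = load_date         # expire the current version
            break
    dimension.append({"key": key, "attrs": new_attrs,
                      "start_date": load_date, "end_date": None})
```

Keeping the expired rows with their date ranges is what preserves the full history of each dimension member.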
Environment: Informatica Power Center 9.1 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer,
Transformation developer, Mapplet Designer, Mapping Designer, Repository manager), PL/SQL, SQL Server, Unix.
Responsibilities:
• Worked with business analysts, developers, and management professionals from various locations to gather
requirements.
• Translated business requirements into technical specifications to build the Data Warehouse.
• Developed mappings using Informatica and Oracle Data Integrator that handled the extraction, transformation,
and loading of data from various source systems to target systems.
• Used Informatica to handle complex mappings, extensively using transformations such as Source
Qualifier, Aggregator, Lookup, Filter, Update Strategy, Expression, Sequence Generator, and Sorter.
• Created numerous interfaces to load data from flat files, CSV files, and Oracle tables into staging tables and then into
the respective fact, dimension, and lookup tables in the Data Warehouse.
• Loaded data from different source systems to the target warehouse using interfaces with Knowledge Modules such as LKM,
IKM, and CKM for data quality checks.
• Used the ODI built-in scheduler to schedule Scenarios/Load Plans in production.
• Participated in all phases of project development, testing, deployment, and support.
• Developed, implemented, and enforced ETL best-practice standards.
Environment: Informatica PowerCenter 8.6, Oracle Data Integrator 10g, Oracle 10g, SQL Server, Toad, SQL Developer, UNIX
EDUCATION
Master of Science in Electrical Engineering (Electronics) – 04/1999
Bachelor of Science in Electronics & Telecommunication Engineering – 06/1990