Mahesh - Informatica Developer

The document provides a professional summary for a candidate with over 6 years of experience in data warehousing and ETL. They have extensive experience with Informatica PowerCenter and designing mappings to load data from various sources into data warehouses and data marts. They also have experience building dashboards with Tableau and writing SQL queries against databases like Oracle, SQL Server, and Teradata.

Professional Summary:

 Around 6+ years of professional experience as a software developer in Information Technology, including analysis, design, development, testing, data warehousing and ETL processes using Informatica PowerCenter 9.5.1/9.6.1/10.2 and Oracle 11g.
 Over 5 years of experience in ETL (Extract, Transform, Load) projects using data warehousing tools like Informatica and databases like Oracle, MySQL, SQL Server and Teradata.
 Experience working with Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
 Built interactive dashboards using techniques like guided analytics and visual best practices provided by Tableau.
 Knowledge of data visualization using Tableau to create graphs, charts and dashboards.
 Designed and documented ETL frameworks covering best practices, performance tuning techniques, naming conventions and node configurations.
 Experience integrating various data sources with multiple relational databases like Oracle and SQL Server, and integrating data from fixed-width and delimited flat files.
 Expert-level data integration skills using Informatica PowerCenter to design, develop, implement and optimize ETL mappings, transformations and workflows that move data from multiple sources, including flat files, RDBMS tables and XML files, into Operational Data Stores (ODS), Data Warehouses and Data Marts.
 Worked with T-SQL to develop complex stored procedures, triggers, functions, views, indexes, cursors, SQL joins and dynamic SQL queries.
 Experience with Star and Snowflake schemas, fact and dimension tables, and Slowly Changing Dimensions; interacted with clients and users to understand their requirements.
 Extensive experience with the ETL tool Informatica in designing and developing complex Mappings, Mapplets, Transformations, Workflows and Worklets, and in scheduling workflows and sessions.
 Experience using the Debugger to validate mappings and obtain troubleshooting information about data and error conditions.
 Developed several complex Mappings, Mapplets and Reusable Transformations to facilitate one time, Daily, Monthly
and Yearly Loading of Data.
 Worked extensively with complex mappings using different transformations like Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Aggregator and Normalizer.
 Implemented Slowly changing dimensions and change data capture using Informatica.
 Extensively developed Complex mappings using various transformations such as Unconnected / Connected lookups,
Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Union and more.
 Experience using Informatica command-line utilities like pmcmd to execute workflows in non-Windows environments.
 Experience in developing UNIX Shell scripts that are used by ETL processes.
 Good understanding with all phases of SDLC (System Development Life Cycle) including Planning, Analysis, Design,
Implementation and Maintenance.
 Strong analytical, problem-solving, communication, learning and team skills.
 Experience in using automation scheduling tools like AutoSys and Control-M.
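The Type 2 slowly-changing-dimension work listed above follows a standard versioning rule (expire the current row, insert a new dated version). A minimal Python sketch of that rule, purely illustrative (the actual implementation was Informatica mappings; all names here are hypothetical):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=date(2019, 1, 1)):
    """Apply a Type 2 SCD update to an in-memory dimension table.

    dimension: list of dicts with keys 'key', 'attrs', 'start', 'end', 'current'
    incoming:  dict with keys 'key' and 'attrs' (latest source snapshot)
    """
    for row in dimension:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attrs"] == incoming["attrs"]:
                return dimension           # no change: load is a no-op
            row["end"] = today             # expire the current version
            row["current"] = False
            break
    # insert the new (or changed) record as the current version
    dimension.append({"key": incoming["key"], "attrs": incoming["attrs"],
                      "start": today, "end": None, "current": True})
    return dimension
```

In an Informatica mapping the same decision is typically made with a Lookup against the dimension plus an Update Strategy transformation routing rows to update (expire) or insert paths.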

Technical Skills:
ETL Technology: Informatica PowerCenter 9.5.1/9.6.1/10.2, Control-M, AutoSys, SharePoint, Erwin
Data Warehouse: ODS, normalized data marts, dimensional Star and Snowflake modeling
Databases: Teradata, Oracle 11g/10g, MS SQL Server 2008 R2/2012/2014
Programming: SQL, T-SQL, PL/SQL, UNIX shell scripting; tools: SQL Developer, Toad 9.2/8.6, SQL*Plus, Microsoft Office
Operating Systems: Linux, Windows 7/10
Reporting Tools: Tableau (basics) and SSRS

Professional Experience

Union Bank, Los Angeles, CA August 2018 – Present


Role: INFORMATICA ETL DEVELOPER

Responsibilities:
 Developed database objects such as user-defined functions and user-defined procedures using T-SQL scripts.
 Worked on exporting data to flat files using Teradata Fast Export.
 Developed Teradata Macros and Stored Procedures to load data into incremental/staging tables, then move data from Staging to Journal and from Journal into Base tables.
 Developed high-performance T-SQL queries, complex joins and advanced indexing techniques to optimize database operations.
 Designed and developed analytics, dashboards and visualizations using Tableau and other popular BI tools.
 Designed and developed ETL strategy to populate the Data Warehouse from various source systems such as
Oracle, Flat files, XML, SQL server. 
 Used Informatica Designer to create complex mappings using different transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Filter and Router to pipeline data to Data Warehouses/Data Marts.
 Created and monitored Sessions/Batches using Informatica Server Manager/Workflow Monitor to load data into
target Oracle database. 
 Generated Tableau Dashboard with quick/context/global filters, parameters and calculated fields on Tableau
(9.x) reports.
 Worked on different tasks in Workflows like Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment and Timer, and on scheduling of the workflow.
 Developed key documents such as high-level design documents, low-level design documents, accelerated change documents, workflow diagrams, use cases and data flow diagrams.
 Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
 Designed and developed Informatica PowerCenter mappings, sessions and workflows as per requirements and migrated them to the production environment.
 Used the Source Analyzer and Target Designer to import source and target database schemas, and the Mapping Designer to develop mappings, applying transformations to the source and finally mapping to target tables.
 Developed Mappings to extract data from ODS to Data Mart, and monitored Daily, Weekly and Monthly Loads. 
 Implemented and created slowly changing dimensions of Type1 and Type2 for storing historic data into Data
warehouse using Informatica. 
 Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.
 Involved in automation of batch processing to run Informatica Workflows using Autosys. 

Environment: Informatica PowerCenter 10.2, Teradata, Tableau, Oracle 11g, SQL, UNIX, PL/SQL, Control-M, AutoSys.
Questar Assessment, MN Jan 2017 – July 2018
Role: INFORMATICA ETL DEVELOPER

Responsibilities:
 Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
 Worked in the performance tuning of Teradata SQL, ETL and other processes to optimize session performance.
 Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport and TPT.
 Parsed high-level design specification to simple ETL coding and mapping standards.
 Developed Tableau data visualizations using Pareto charts, combo charts, heat maps, box-and-whisker plots, scatter plots, geographic maps, crosstabs and histograms.
 Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
 Used Informatica Designer to design mappings and Mapplets to extract data from various sources like Oracle and flat files.
 Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level.
 Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookups, Filters,
Stored Procedures, Update Strategy and Sequence Generator.
 Actively coordinated with testing team in the testing phase and helped the team to understand the dependency
chain of the whole project.
 Migrated Informatica mappings, sessions and workflows from Dev and QA to Prod environments.
 Implemented mapping for slowly changing dimensions (SCD) to maintain current data as well as historical data.
 Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business
process, dimensions and measured facts.
 Excellent understanding of business process modeling, process improvements, and developing process flows.
 Analyzed, designed, developed, implemented and maintained moderate to complex initial load and incremental
load mappings to provide data for enterprise data warehouse.
 Responsible for creating complete test cases, test plans, test Data, and reporting status ensuring accurate coverage
of requirements and business processes.
 Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica
workflows in different environments.
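The parameter files mentioned above drive workflow behavior per environment; their format is INI-like, with `[Folder.WF:workflow]` sections and `NAME=value` lines. A hypothetical Python sketch of a parser for that shape (an illustration of the file layout, not a tool used on the project):

```python
def parse_param_file(text):
    """Parse an Informatica-style parameter file into {section: {name: value}}.

    Sections look like [Global] or [HR.WF:wf_load_emp]; entries are
    NAME=value lines (e.g. $$LOAD_DATE=2018-01-31).
    """
    params, section = {}, None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith(("#", ";")):
            continue                      # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]          # start a new section
            params[section] = {}
        elif "=" in line and section is not None:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params
```

A shell wrapper would typically pass such a file to `pmcmd startworkflow` with the appropriate environment-specific copy.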

Environment: Informatica PowerCenter 9.6.1, Tableau 10.0, TOAD, UNIX, Oracle, SQL*Plus, MS SQL Server 2008, T-SQL, Windows XP.

Humana, Louisville, KY Feb 2015 – Dec 2016


ETL Informatica Developer

Responsibilities:
 Understood the business requirements in functional specifications to design the ETL methodology in technical specifications.
 Created mappings and sessions to implement technical enhancements for data warehouse by extracting data from
sources like Oracle and Delimited Flat files.
 Developed ETL using Informatica 9.6.1.
 Prepared various mappings to load the data into different stages like Landing, Staging and Target tables.
 Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy while designing and optimizing mappings.
 Developed Workflows using the Task Developer, Worklet Designer and Workflow Designer in Workflow Manager and monitored the results using Workflow Monitor.
 Created various tasks like Session, Command, Timer and Event wait.
 Modified several of the existing mappings based on the user requirements and maintained existing mappings,
sessions and workflows.
 Tuned the performance of mappings by following Informatica best practices and applied several methods to reduce workflow run times.
 Prepared SQL Queries to validate the data in both source and target databases.
 Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
 Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
 Created Test cases for the mappings developed and then created integration Testing Document.
 Prepared the error handling document to maintain the error handling process.
 Automated the Informatica jobs using UNIX shell scripting.
 Closely worked with the reporting team to ensure that correct data is presented in the reports.
 Interaction with the offshore team on a daily basis on the development activities.
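The source/target validation queries mentioned above usually start with a row-count reconciliation. A simplified Python sketch of that check, using in-memory SQLite as a stand-in for the actual Oracle source and target (illustrative only; table and function names are hypothetical):

```python
import sqlite3

def rowcount_matches(src_conn, tgt_conn, table):
    """Compare row counts for `table` between source and target
    connections -- the simplest post-load validation check."""
    query = f"SELECT COUNT(*) FROM {table}"
    src = src_conn.execute(query).fetchone()[0]
    tgt = tgt_conn.execute(query).fetchone()[0]
    return src == tgt, src, tgt
```

In practice this is extended with checksums or column-level aggregates (SUM, MIN/MAX per key) to catch loads that match in count but differ in content.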

Environment: Informatica PowerCenter 9.6.1, Oracle 11g, Informatica Data Quality (IDQ) 9.x, delimited files, UNIX shell scripts, Windows 7, Toad for Oracle 11g, Teradata, SSIS, T-SQL, SQL Server 2008.

Blue Shield of California, CA Mar 2014 – Jan 2015


ETL Developer

Responsibilities:
 Extensively used Informatica to load data from Oracle 9i and flat files into the target Oracle 10g database.
 Used various transformations like Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy and Stored
Procedures.
 Used Mapplets and Reusable Transformations to prevent redundant transformation usage and improve maintainability.
 Created and scheduled workflows using Workflow Manager to load the data into the Target Database.
 Involved in performance tuning of Targets, Sources, and Mappings. Improved performance by identifying
performance bottlenecks.
 Worked on different tasks in Workflows like Session, Event Raise, Event Wait, Email and Timer for scheduling of the workflow.
 Involved in meetings to gather information and requirements from the clients.
 Involved in designing the ETL process to extract, translate and load data from flat files into the warehouse database.
 Used Debugger to validate mappings and also to obtain troubleshooting information about data by inserting
Breakpoints.
 Documented the number of source / target rows and analyzed the rejected rows and worked on re-loading the
rejected rows.
 Created UNIX shell scripts and automated scheduling processes.
 Wrote SQL Queries, PL/SQL Procedures, Functions, and Triggers for implementing business logic and for
validating the data loaded into the target tables using query tool TOAD.
Environment: Informatica PowerCenter 9.6.1, Oracle 10g, Teradata, MS SQL Server 2000, T-SQL, SQL, SSIS, PL/SQL, SQL*Loader, UNIX Shell Script.

Lincoln Financial Group, Greensboro, NC Feb 2013 – Jan 2014


ETL Informatica Developer

Responsibilities:
 Worked closely with business analysts and gathered functional requirements. Designed technical design documents
for ETL process.
 Developed ETL mappings and transformations using Informatica PowerCenter 9.0.1/8.6.1.
 Implemented Change Data Capture (CDC) process to load into the staging area.
 Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet
Designer, Transformation Developer, and Workflow Manager.
 Worked on importing DB2 tables into Informatica PowerCenter.
 Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, & Excel.
 Developed reusable Mapplets, Transformations and user defined functions.
 Extensively used Mapping Debugger to handle the data errors in the mapping designer.
 Experience using transformations such as Normalizer, Unconnected/Connected Lookups, Router, Aggregator, Joiner,
Update Strategy, Union, Sorter, and reusable transformations.
 Created Event Wait, Event Raise, Email and Command tasks in the Workflow Manager.
 Responsible for tuning ETL procedures to optimize load and query Performance.
 Good Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and
FACT and Dimensions tables.
 Extensively worked with incremental loading using Parameter Files, Mapping Variables and Mapping Parameters.
 Used Informatica PowerExchange to load and retrieve data from mainframe systems.
 Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.
 Involved in writing shell scripts for file transfers, file renaming and concatenating files.
 Created debugging sessions for error identification by creating break points and monitoring the debug data values in
the mapping designer.
 Developed Unit test cases and Unit test plans to verify the data loading process.
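The incremental loading with mapping variables and parameter files noted above reduces to watermark tracking: remember the highest timestamp loaded, extract only newer rows, then advance the watermark. A simplified Python sketch (Informatica persists the variable, e.g. a hypothetical $$LAST_RUN_TS, in the repository; names here are illustrative):

```python
def incremental_extract(rows, last_watermark):
    """Return the rows newer than the stored watermark, plus the new
    watermark value to persist for the next run."""
    new_rows = [r for r in rows if r["updated"] > last_watermark]
    # advance the watermark only if something was extracted
    new_watermark = max((r["updated"] for r in new_rows),
                        default=last_watermark)
    return new_rows, new_watermark
```

Running the same extract twice with an unchanged source yields an empty second batch, which is the property unit tests for incremental loads usually assert.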

Environment: Informatica PowerCenter 9.5.1, Oracle 11g/10g, UNIX Shell Script, Windows XP, Toad for Oracle, SQL Server 2008.

L&T Infotech, India Aug 2012 – Dec 2012


ETL Informatica Developer

Responsibilities:
 Assisted in preparing designs/specifications for data extraction, transformation and loading.
 Developed Informatica mappings, enabling the extract, transport and loading of the data into target tables.
 Created Workflows, Worklets and Tasks to schedule the loads at the required frequency using Workflow Manager.
 Prepared reusable transformations to load data from operational data source to Data Warehouse.
 Wrote complex SQL Queries involving multiple tables with joins.
 Scheduled and ran Workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
 Used debugger, session logs and workflow logs to test the mapping and fixed the bugs.
 Analyzed the dependencies between jobs and scheduled them accordingly using the work scheduler.
 Improved the performance of the mappings, sessions using various optimization techniques.
Environment: Informatica PowerCenter 9.5.1 (Informatica Designer, Workflow Manager, Workflow Monitor), Oracle 10g, Flat files, UNIX, Shell Scripts, Toad 7.5.
