Data Analyst 5

The document outlines the professional profile of a Senior Data Analyst with 9 years of experience in Information Technology, specializing in data analysis, visualization, and business intelligence using tools like Python, SQL, and Tableau. It details their technical skills, educational background, and extensive experience in project management, data governance, and software development methodologies. The document also highlights their roles in various organizations, focusing on data migration, ETL processes, and collaboration with stakeholders to meet business requirements.


 9 years of experience in Information Technology as a Data Analyst, covering Data Analysis, Data Manipulation, Data Mining, Data Visualization, and Business Intelligence.
 Data Analytics and Software Development background with experience in
Python, R, SQL & NoSQL Databases and Machine Learning.
 Good understanding of data ingestion, Airflow Operators for data orchestration, and related Python libraries (a minimal DAG sketch follows this summary).
 Ability to identify and learn applicable new technologies independently as needed.
 Proficient in designing and creating various data visualization dashboards, worksheets, and analytical reports that help users identify critical KPIs and facilitate strategic planning in the organization, using Tableau visualizations tailored to end-user requirements.
 Expertise in building dashboards using Tableau, Power BI, and QuickSight.
 Created various types of charts like Heat Maps, Geocoding, Symbol Maps, Pie Charts, Bar Charts, Tree Maps, Gantt Charts, Circle Views, Line Charts, Area Charts, Scatter Plots, Bullet Graphs, and Histograms in Tableau Desktop, Power BI, and Excel to provide better data visualization.
 Very good exposure to the entire Software Development Life Cycle (SDLC) and
creating technical documentation.
 Experience in Project Management best practices, processes, and methodologies
(Agile and Waterfall).
 Highly experienced at translating business objectives from multiple stakeholders
into detailed implementation and project plans.
 Active involvement in requirement gathering sessions and helping the developers
to understand the user’s requirements.
 Ability to tailor communication and presentations to peers in both business and
technical areas, to deliver optimal business solutions in line with corporate
priorities.
 Expert in creating PL/SQL schema objects like Packages, Procedures, Functions, Subprograms, Triggers, Views, Materialized Views, Indexes, Constraints, Sequences, Exception Handling, Dynamic SQL/Cursors, Native Compilation, and Collection Types (Record Type, Object Type) using SQL Developer.
 Knowledge and experience working in Waterfall as well as Agile environments
including the Scrum process and using Project Management tools like Project Libre,
Jira/Confluence, and version control tools such as GitHub/Git.
 Quick learner with strong business domain knowledge who can communicate business data insights clearly to technical and nontechnical clients.
 Ability to understand and articulate the “big picture” and simplify complex ideas.
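
As a concrete illustration of the Airflow orchestration noted above, below is a minimal sketch of a daily ingestion DAG. The dag_id, task names, command, and callable are hypothetical placeholders, not details from any project listed here.

    # Minimal Airflow DAG sketch: extract to staging, then load to the warehouse.
    # All names here (dag_id, tasks, paths) are illustrative assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    def load_to_warehouse(**context):
        # Placeholder: read the staged file for this run date and load it.
        print("loading staged data for", context["ds"])

    with DAG(
        dag_id="daily_ingest_example",
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(
            task_id="extract",
            bash_command="echo 'pull source files to staging'",
        )
        load = PythonOperator(
            task_id="load",
            python_callable=load_to_warehouse,
        )
        extract >> load  # extract must finish before load runs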

EDUCATION

North South University Dhaka, Bangladesh

Bachelor's in Electronics and Telecommunication Engineering

TECHNICAL SKILLS/TOOLS
Programming Languages: SQL, PL/SQL, HTML5, XML, VBA, Python (Scikit-learn, NumPy, pandas, matplotlib), Shell scripting, R, Java.
Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.
Cloud Platforms: AWS (EC2, S3, Redshift) & MS Azure
OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9
Operating System: Windows, Unix, Sun Solaris
ETL/Data warehouse Tools: Informatica 10.1, SAP Business Objects XIR3.1/XIR2 and Talend.
Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model
ERP: SAP, Archer
Machine Learning: Data Analysis, Feature Engineering, Clustering, Regression, Classification, Decision Tree Learning
Software: Anaconda Suite, PyCharm IDE, Microsoft Suite (Excel, Word, PowerPoint), Git, Maven, Tortoise SVN, JIRA.
Data Modeling Tools: Erwin 9.7/9.6, ER Studio v17, and Power Designer.
Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2, IBM Netezza, MySQL, MongoDB, Neo4j.
Other tools: JIRA, MS Visio, SPSS, MS Excel, TOAD, VDI

PROFESSIONAL EXPERIENCE
TRUIST Financial Corporation
Atlanta, GA
March 2017 - Present
Senior Data Analyst

 Lead a team of 5+ members to initiate and finalize data requirements for the Archer application, coordinate development of Archer functionality, and engage in Unit testing and UAT deliverables for initial requirements.
 Skilled in understanding commercial, credit card, and mortgage financial data.
 Have performed data mapping from various commercial sources (AFS – automated
financial services, DFP – dealer floor plan) to a unified layer.
 Have experience working in the ETL track of data migration, with research skills in Informatica to identify data workflows and transformations within that layer.
 Responsible for Continuous Integration and Continuous Delivery process implementation
using Jenkins along with Python and Shell scripts to automate routine jobs.
 Developed rule sets for data cleansing and actively participated in data cleansing and
anomaly resolution of the legacy application.
 Create the DDL scripts using ER Studio and source to target mappings (S2T- for ETL) to
bring the data from multiple sources to the warehouse.
 Cleaned and formatted complex incoming data using Python.
 Developed Power BI data visualizations using Cross Tabs, Heat Maps, Box and Whisker Charts, Scatter Plots, Geographic Maps, Pie Charts, Bar Charts, and Density Charts.
 Created and ran jobs on the AWS cloud to extract, transform, and load data into AWS Redshift using AWS Glue, with S3 for data storage and AWS Lambda to trigger the jobs (a Lambda/Glue sketch appears at the end of this section).
 Wrote Python scripts to parse XML documents and load the data into the database (a parsing sketch appears at the end of this section).
 Extracted data from SFDC (Salesforce) to generate KPI reports per client requirements, with the help of Tableau.
 Have worked extensively with Jira as a Jira coordinator and a Jira user.
 Experience in importing and exporting RDBMS relational data sets to Hadoop using Sqoop.
 Created data masking mappings to mask the sensitive data between production and test
environment.
 Coordinated sprint meetings, sprint backlogs, and release backlogs with the SCRUM team and product owner using Jira.
 Coordinated enterprise release projects with different stakeholders from an end-to-end
standpoint.
 Proficient at coordinating daily standups and project status meetings with top
management.
 Expert in creating project timelines and communicating release schedules with IT Ops.
 Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing, and
analysis of big data.
 Worked on Metadata exchange among various proprietary systems using XML.
 Adept and experienced at coordinating requirement gathering sessions such as Joint Application Design (JAD) and Rapid Application Development (RAD) sessions, brainstorming sessions, requirement reviews, system walkthroughs, scrum calls, workshops, and one-on-one interviews with Business Users, SMEs, and Stakeholders (both Onsite and Offshore).
 Experience in prioritizing Product Backlogs and using estimation techniques (Planning Poker, T-shirt sizing, Relative Mass Valuation, etc.) to help break down large Epics into Stories.
 Proficient in Designing Business Process Model (UML design, Activity diagrams, Sequence
diagrams, Data Flow Diagram, Wireframe) using MS Visio.
 Perform Data Analysis on the Analytic data present in Teradata, AWS using SQL, Teradata
SQL Assistant, and Python.
 Connected to AWS Redshift through Tableau to analyze the data for ad-hoc requirements
as well as canned reports.
 Performed data analysis and data profiling using complex SQL on various source systems including Oracle.
 Experienced in eliciting, defining, and analyzing requirement specifications with business
stakeholders, Subject Matter Experts (SMEs), development and QA team.
 Authored GAP analysis documents supported by Feasibility Studies and impact analysis to structure the project goals.
 Good understanding of AGILE, SDLC and Waterfall methodologies.
 Extract and analyze data from various sources; data wrangling and cleanup using Python pandas.
 Work with stakeholders and Business Analysts to gather requirements and map them into source-to-destination mapping documents.
 Imported training weight files using Python (NumPy) and TensorFlow's assign operation, and created a function to output the detection boxes in Python.
 Involved in generating reports using Tableau.
 Cleaned data source in SQL Server and prepared data source in Tableau metadata using
complex joins, Custom SQL, split and data blending.
 Design ER diagrams and logical models (relationships, cardinality, attributes, and candidate keys) and convert them to physical data models, including capacity planning, object creation and aggregation strategies, partition strategies, and purging strategies according to business requirements.
 Created programs for Data Analysis using Python.
 Using Tableau Desktop, created multiple rich dashboards that visually tell the story of the business status, strengths & weaknesses, and potential for the client at a glance, while allowing interaction with the data as necessary.
 Perform Data Profiling and implement Data Quality checks using Informatica Developer and Python.
 Responsible for technical Data governance, enterprise-wide Data modeling and Database
design.
 Perform Data mapping, logical data modeling, created class diagrams and ER diagrams
and used SQL queries to filter data within the Oracle database.
 Convert existing data archives to SAS databases to improve data quality and availability.
 Load data from pandas DataFrames into the team's user-defined space in the Redshift database using the COPY command from an AWS S3 bucket (a staging sketch appears at the end of this section).
 Creating SQL queries in Toad to perform data analysis, data validation and data
manipulation operations.
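
Three short sketches of the patterns referenced in the bullets above follow. First, the Lambda-triggered Glue flow: an S3 object-created event starts a Glue ETL job that would transform the file and load it into Redshift. The job name and argument key below are hypothetical assumptions, not the project's actual configuration.

    # Sketch: AWS Lambda handler that starts a Glue job when a file lands in S3.
    # "load_to_redshift_example" and "--source_path" are hypothetical names.
    import boto3

    glue = boto3.client("glue")

    def lambda_handler(event, context):
        # S3 put-event records carry the bucket and key of the new object.
        record = event["Records"][0]["s3"]
        bucket = record["bucket"]["name"]
        key = record["object"]["key"]

        # Start the Glue ETL job, pointing it at the newly arrived object.
        response = glue.start_job_run(
            JobName="load_to_redshift_example",
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        return {"JobRunId": response["JobRunId"]}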
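
Second, the XML-parsing work: a minimal sketch that reads records from an XML file and bulk-inserts them into a table. The file name, element layout, and table are illustrative, with sqlite3 standing in for the production database driver.

    # Sketch: parse an XML document and bulk-insert its records into a table.
    import sqlite3  # stand-in for the production database driver
    import xml.etree.ElementTree as ET

    tree = ET.parse("customers.xml")  # hypothetical input file
    rows = [
        (rec.findtext("id"), rec.findtext("name"), rec.findtext("city"))
        for rec in tree.getroot().iter("customer")
    ]

    conn = sqlite3.connect("example.db")
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id TEXT, name TEXT, city TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()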
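
Third, the pandas-to-Redshift load: stage the DataFrame as a CSV in S3, then issue a COPY into the target schema. The bucket, cluster endpoint, table, and IAM role below are placeholders.

    # Sketch: stage a pandas DataFrame in S3 and COPY it into Redshift.
    import boto3
    import pandas as pd
    import psycopg2

    df = pd.DataFrame({"id": [1, 2], "amount": [10.5, 20.0]})  # illustrative data

    # 1. Write the frame to CSV and stage it in S3.
    df.to_csv("/tmp/batch.csv", index=False, header=False)
    boto3.client("s3").upload_file("/tmp/batch.csv", "example-bucket", "staging/batch.csv")

    # 2. COPY from S3 into the team's schema in Redshift.
    conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                            port=5439, dbname="analytics",
                            user="analyst", password="...")
    with conn, conn.cursor() as cur:
        cur.execute("""
            COPY team_space.transactions
            FROM 's3://example-bucket/staging/batch.csv'
            IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-example'
            CSV;
        """)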

IBM
Dhaka, Bangladesh
November 2014 – December 2016
Data Analyst
 Worked as Data Analyst for requirements gathering, business analysis and project
coordination.
 Analyzed large data sets, applied machine learning techniques, and developed and enhanced predictive and statistical models by leveraging best-in-class modeling techniques.
 Worked with other Data Analysis teams to gather data profiling information.
 Responsible for the analysis of business requirements and the design and implementation of the business solution.
 Performed Data Analysis and Data validation by writing SQL queries using SQL assistant.
 Translated business concepts into XML vocabularies by designing XML Schemas with UML.
 Worked on Data Mining and data validation to ensure the accuracy of the data between the warehouse and source systems (a validation sketch appears at the end of this section).
 Mentored teams on large-scale data and analytics using advanced statistical and machine learning models.
 Designed a Request Analysis model using Natural Language Processing (NLP).
 Compiled data from various sources, including public and private databases, to perform complex analysis and data manipulation for actionable results.
 Analyzed business process workflows and developed ETL procedures for moving data from
source to target systems.
 Automated data loading, extraction, and report generation using UNIX shell scripting, and loaded data into custom tables using SQL*Loader.
 Generated reports using MS Excel & Google Spreadsheets.
 Analyzed and validated data in the Hadoop data lake by querying Hive tables.
 Developed SQL queries to fetch complex data from different tables in databases using joins and database links.
 Involved in designing and developing SQL server objects such as Tables, Views, Indexes
(Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
 Participated in JAD sessions, gathered information from Business Analysts, end users and
other stakeholders to determine the requirements.
 Performed data analysis of existing databases to understand the data flow and business rules applied to different databases using SQL.
 Performed data analysis and data profiling using complex SQL on various source systems and answered complex business questions by providing data to business users.
 Performed detailed data analysis and identified the key facts and dimensions necessary to support the business requirements.
 Generated Data dictionary reports for publishing on the internal site and giving access to
different users.
 Used MS Visio and Rational Rose to represent system under development in a graphical
form by defining use case diagrams, activity, and workflow diagrams.
 Wrote complex SQL, PL/SQL procedures, functions, and packages to validate data and support the testing process.
 Worked in generating and documenting Metadata while designing OLTP and OLAP systems
environment.
 Worked in data management performing data analysis, gap analysis, and data mapping.
 Established a business analysis methodology around the RUP (Rational Unified Process).
 Developed stored procedures in SQL Server to standardize DML transactions such as
insert, update, and delete from the database.
 Created SSIS package to load data from Flat files, Excel and Access to SQL server using
connection manager.
 Developed all the required stored procedures, user-defined functions, and triggers using T-SQL and SQL.
 Produced various types of reports using SQL Server Reporting Services (SSRS).
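
As a concrete illustration of the warehouse-vs-source validation described above, here is a minimal sketch that compares row counts per load date and flags mismatches. Connection details and table names are illustrative, with sqlite3 standing in for the actual database drivers.

    # Sketch: validate warehouse data against the source by comparing
    # per-date row counts; any mismatch is flagged for investigation.
    import sqlite3  # stand-in for the source/warehouse database drivers

    QUERY = "SELECT load_date, COUNT(*) FROM {table} GROUP BY load_date"

    def counts_by_date(conn, table):
        return dict(conn.execute(QUERY.format(table=table)).fetchall())

    source = sqlite3.connect("source.db")
    warehouse = sqlite3.connect("warehouse.db")

    src = counts_by_date(source, "orders")
    wh = counts_by_date(warehouse, "stg_orders")

    for load_date in sorted(set(src) | set(wh)):
        if src.get(load_date) != wh.get(load_date):
            print(f"MISMATCH {load_date}: source={src.get(load_date)} "
                  f"warehouse={wh.get(load_date)}")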

WIPRO
Dhaka, Bangladesh
September 2013 – October 2014
Data Analyst
 Responsible for defining the scope and implementing business rules of the project,
gathering business requirements and documentation.
 Responsible for writing Functional Requirement Specifications (FRS) and User Requirement Specifications (URS).
 Analyzed Business Requirements and segregated them into high level and low-level Use
Cases, Activity Diagrams / State Chart Diagrams using Rational Rose according to UML
methodology thus defining the Data Process Models.
 Understood the As-Is system, developed the To-Be system concept, and prepared the System Process Maps.
 Successfully conducted JAD sessions, which helped synchronize the different stakeholders
on their objectives and helped the developers to have a clear-cut picture of the project.
 Conducted presentations of the Q/A test results with analysis to the stakeholders and
users and documented modifications and requirements.
 Responsible for integrating with Facets and designing test scripts for testing of Claims in the Development, Integration, and Production environments.
 Apply data governance policies to business strategies leveraging critical data elements.
 Used Oracle Enterprise Manager and TOAD for developing and managing PL/SQL scripts.
 Coordinated and developed QA activities.
 Expertise in the Claims, Subscriber/Member, Plan/Product, Provider, Commissions, and Billing Modules of Facets.
 Was part of the Architecture and Modeling team and used SOA (Service-Oriented Architecture).
 Participated in all phases of the Facets Extended Enterprise (TM) administrative system
implementation including the planning, designing/building/validation (DBV), testing, and
Go-live support phases.
 Assigned tasks among the development team and monitored and tracked project progress following Agile methodology.
 Created Process Flow diagrams, Use Case Diagrams, Class Diagrams, and Interaction
Diagrams using Microsoft Visio and Rational Rose.
 Wrote Test Cases and performed User Acceptance Testing (UAT), documenting defects in detail using the Defect Tracking report.
 Created Use Cases, activity reports, logical components, and deployment views to extract business process flows and workflows involved in the project. Carried out defect tracking using ClearQuest.
 Maintained proper communication with the developers, ensuring that modifications and requirements were addressed, and monitored these revisions.
 Involved in compatibility testing with other software programs, hardware, Operating
systems, and network environments.
