
TECHNICAL REPORT ON

STUDENT INDUSTRIAL WORK EXPERIENCE SCHEME (SIWES)

BY

ELIJAH SEGUN

200404113

FACULTY OF SCIENCE.

ADEKUNLE AJASIN UNIVERSITY, AKUNGBA-AKOKO.

ROTBUN INCORPORATION, AKUNGBA-AKOKO, ONDO STATE.

IN PARTIAL FULFILMENT FOR THE AWARD OF

BACHELOR OF SCIENCE (B.Sc.) IN COMPUTER SCIENCE

ADEKUNLE AJASIN UNIVERSITY, AKUNGBA-AKOKO.

(APRIL – OCTOBER, 2024.)

CERTIFICATION
I hereby certify that this report of the Student Industrial Work Experience Scheme (SIWES)
was prepared and compiled by ELIJAH SEGUN (Matric Number: 200404113) of the
Department of Computer Science, Faculty of Science, Adekunle Ajasin University, Akungba-
Akoko, on the successful completion of my six (6) month Industrial Training undertaken at
ROTBUN INCORPORATION, AKUNGBA-AKOKO, ONDO STATE.

STUDENT TRAINEE: ELIJAH SEGUN (Matric Number: 200404113)


SIGNATURE AND DATE: _____________________________

DEPARTMENTAL SIWES COORDINATOR: Mr. ABIOLA S.O.


SIGNATURE AND DATE: _____________________________

DEDICATION

To God Almighty, I give all glory and honor to Your name. He alone has made it possible for me
to see this day, and without Him I would not have come this far. I therefore dedicate this report
to Him. Special dedication also goes to my parents, MR and MRS ELIJAH, for their support and
words of encouragement throughout the period of my internship.

ACKNOWLEDGEMENT
I would like to appreciate Almighty God for making this program a success and for His favor,
which I enjoyed all through my industrial training period. I would like to appreciate my training
supervisors, Mr. Rotimi Bunmi, Mr. Omole Ariyo and Mr. Ewegbemi Israel O., for their
endless guidance during my Industrial Training. I would also like to appreciate my family, my
parents especially, for being the best parents I could ever have; I pray for God's guidance and
protection over their lives, amen. Finally, I appreciate my fellow colleagues during this
internship.

TABLE OF CONTENTS

TITLE
CERTIFICATION
DEDICATION
ACKNOWLEDGEMENT
TABLE OF CONTENTS
LIST OF FIGURES

CHAPTER ONE
Introduction
Background of study
History of SIWES
Vision statement
Mission statement
Aim of SIWES
Objectives of SIWES
Importance of SIWES

CHAPTER TWO
General brief on the organization
History of the company
Vision statement
Mission statement
Management structure
Core services and solutions
Industry impact

CHAPTER THREE
Description of work done
Skills developed and techniques learnt
Practical challenges
Theoretical principles
Contribution
Future of data analysis

CHAPTER FOUR
Relevance of Data Analysis to Computer Science
Glossary
Recommendation
Conclusion

LIST OF FIGURES

Figure 1: The management structure
Figure 2: Dashboard
Figure 3: My picture

EXECUTIVE SUMMARY

This report documents my Student Industrial Work Experience Scheme (SIWES) training as a
Data Analyst intern at Rotbun Incorporation, Akungba-Akoko, Ondo State, between April and
October 2024. It outlines the history, aim, and objectives of SIWES, gives a brief profile of
Rotbun Incorporation, and describes the work I carried out, the skills and techniques I learnt, the
practical challenges encountered, and the relevance of data analysis to Computer Science. The
report closes with a glossary of data analytics terms, a recommendation, and a conclusion.

CHAPTER ONE
1.0 INTRODUCTION

1.1 BACKGROUND OF STUDY

The Student Industrial Work Experience Scheme (SIWES) aims to give students practical
knowledge of what they have been taught in school and to serve as a landmark for what they
would love to do after their undergraduate studies.

This report describes what I did during my 21 weeks of industrial training.

1.2 HISTORY OF SIWES

SIWES was founded in 1973 by the Industrial Training Fund (ITF) to address the problem
of tertiary institution graduates lacking appropriate skills for employment in Nigerian industries.
The Students' Industrial Work Experience Scheme (SIWES) was founded as a skills training
program to help expose and prepare students of universities, polytechnics and colleges of
education for the industrial work situation they will meet after graduation.

This scheme facilitates the transition from the classroom to the workplace and aids in the
application of knowledge. The program allows students to become acquainted with and exposed
to the experience required in handling and operating equipment and machinery that are typically
not available at their schools.

Prior to the establishment of this scheme, there was a rising concern and trend among
industrialists that graduates from higher education institutions lacked appropriate practical
experience for employment. Students who entered Nigerian universities to study science and
technology were not previously trained in the practical aspects of their chosen fields. As a result
of their lack of work experience, they had difficulty finding work.

As a result, employers believed that theoretical education in higher institutions was unresponsive
to the needs of labor employers. Thousands of Nigerians faced this difficulty until 1973. It was
against this background that the Fund established and designed the scheme in 1973/74.
The ITF (Industrial Training Fund) organization decided to aid all interested Nigerian students
and created the SIWES program. The federal government officially approved and presented it in
1974. During its early years, the scheme was entirely supported by the ITF, but as the financial
commitment became too much for the fund, it withdrew in 1978. The National Universities
Commission (NUC) and the National Board for Technical Education (NBTE) were given control
of the scheme by the federal government in 1979. The federal government handed over
supervision and implementation of the scheme to ITF in November 1984. It was taken over by
the Industrial Training Fund (ITF) in July 1985, with the federal government bearing entire
responsibility for funding.

1.2.1 VISION STATEMENT

To be the leading skills training organization in Nigeria and one of the best in the world.
To be the leading human capital development organization in Nigeria and one of the best in the
world.

1.2.2 MISSION STATEMENT

To set and regulate training standards and provide need-based human capital development
interventions using a corps of highly competent professionals, in line with global best practices.

1.3 AIM OF SIWES

The aim of the Student Industrial Work Experience Scheme (SIWES) is to provide
students with the opportunity to gain practical work experience in their field of study, to bridge
the gap between academic knowledge and practical work experience, and to provide students
with a realistic view of the work environment they will be entering after graduation.

1.4 OBJECTIVES OF SIWES

According to the Industrial Training Fund (ITF), the specific objectives of the Student Industrial
Work Experience Scheme (SIWES) are:

• To give Nigerian university students a way to gain experience and skills in the workforce while pursuing their studies.
• To guarantee that students have valuable job experience prior to graduation.
• To provide businesses, organizations, and institutions an opportunity to have an influence on students.
• To provide a method through which students can utilize and operate machinery that might not be accessible in their various institutions of higher learning.
• To teach students about obligations and professional ethics.
• To provide students the chance to apply their knowledge in actual workplace settings.
• To get students ready for graduate school.
• To give students a way to learn about the precise subject matter of their chosen field of study.
• To increase employer participation in the entire educational process of educating students at universities and other postsecondary institutions for careers in industry.

1.5 IMPORTANCE OF SIWES TO MY COURSE OF STUDY

The purpose of the Industrial Work Experience Scheme in a Computer Science program is
to expose students to the work experiences they are likely to encounter, to give them hands-on
exposure to equipment and machinery, and to help them learn new skills that may not be
available in their educational institutions. It should also be noted that the scheme aids students in
integrating theory with practical operations.

My involvement in the six-month SIWES industrial training program enabled me to make
judgments and take an active part in carrying out tasks. It was also critical for consolidating the
theoretical concepts I learned in school. By enabling me to acquire new abilities in Computer
Science and other subjects, it broadened my perspective beyond Computer Science alone.
Additionally, it helped me address several difficulties I encountered throughout my study of
Computer Science.

CHAPTER TWO
2.0 General brief on the organization
Rotbun Incorporation empowers businesses and individuals through cutting-edge Data Analysis
and Data Science services, with a mission to help people attain their full potential.

Rotbun Incorporation offers a comprehensive range of services to meet the unique needs of its
clients. We specialize in transforming raw data into actionable insights, enabling businesses to
make informed decisions, optimize operations and drive growth.

In addition to its professional services, Rotbun Incorporation is deeply committed to education
and skills development. The company offers extensive Data Analysis and Data Science training
programs, designed to equip students and professionals with the skills needed to thrive in today's
data-driven world. Through the training programs, participants gain hands-on experience with
the latest tools and technologies, preparing them to confidently tackle real-world challenges.

2.1 HISTORY OF ROTBUN


Rotbun started in April 2018 at Mini Mart, Federal Polytechnic, Ado Ekiti, Ekiti State, with a
deep passion to revolutionize ICT in Nigeria and make it compete with international standards.
It was registered with the CAC in 2019 and with SMEDAN in 2021 as a technology research
organization which specializes in training and rendering services in Website Design,
Programming Languages, Hardware, IT Support Services, Computers and Accessories, Data
Analysis, Research Development and Statistical Consultancy Services. We seek to equip people
with the needed knowledge and skills in ICT, which in turn becomes a source of income for the
entrepreneur as it enables self-employment.

2.2 VISION STATEMENT


To build Africa’s most successful IT company, where knowledge, curiosity and innovation are
inspired.

2.3 MISSION STATEMENT


We exist to help people actualize their full potential.

2.4 MANAGEMENT STRUCTURE

Figure 1: The management structure

Figure 1 shows the key players on a data analytics team. While team structure depends on an
organization's size and how it leverages data, most data teams consist of three primary roles: data
scientists, data engineers, and data analysts. Other advanced positions, such as management, may
also be involved.

In Nigeria's rapidly evolving digital landscape, Rotbun Incorporation stands out as a premier
data analysis company helping businesses leverage data for informed decision-making.
Established with the vision to empower Nigerian enterprises through actionable insights, Rotbun
Incorporation serves a broad range of sectors, including finance, telecommunications, healthcare,
and retail.

Core Services and Solutions

Data Analytics and Visualization: Rotbun Incorporation transforms raw data into meaningful
visualizations that drive decision-making. Using advanced tools like Power BI, Tableau, and
Python, they deliver dashboards and reports tailored to meet clients' unique business objectives.

Business Intelligence (BI) Solutions: Through customized BI solutions, the company enables
businesses to monitor and analyze real-time data, thereby enhancing operational efficiency. Their
BI solutions integrate seamlessly with clients’ existing data infrastructure, providing a holistic
view of organizational performance.

Predictive and Prescriptive Analytics: To help businesses stay ahead of trends, Rotbun
Incorporation leverages machine learning and statistical modeling to predict future outcomes and
recommend actions. This service is particularly valuable for sectors like finance and retail, where
forecasting demand and managing risks are critical.

Data Warehousing and Management: For organizations grappling with large volumes of data,
the company offers end-to-end data warehousing solutions. This includes the design,
implementation, and maintenance of secure and scalable data storage systems, ensuring efficient
data retrieval and analysis.

Data-Driven Strategy Consulting: Rotbun Incorporation collaborates with clients to develop
data-driven strategies that align with their business goals. Their consulting services help clients
implement best practices in data governance, quality control, and data security, ensuring reliable
and ethical data use.

Industry Impact and Future Outlook

Rotbun Incorporation has become a trusted partner for companies across Nigeria, helping them
harness the power of data to drive growth. As data analytics becomes more central to business
operations, the company is poised to expand its service offerings to include artificial intelligence
(AI) and cloud-based analytics, further supporting Nigerian businesses in their digital
transformation journey.

CHAPTER THREE
3.1 Description of Work Done

As a Data Analyst, I played a critical role in transforming raw data into actionable insights that
inform business decisions. The work involved collecting, processing, and analyzing large
datasets to identify trends, patterns, and anomalies. I used statistical techniques and data
visualization tools to interpret complex data, presenting my findings through reports,
dashboards, and visualizations that are accessible to non-technical stakeholders.

Key tasks included cleaning and organizing data to ensure accuracy, developing metrics to track
business performance, and conducting exploratory data analysis (EDA) to uncover insights. I
worked closely with cross-functional teams, providing insights to improve processes, optimize
strategies, and solve business problems. This work is foundational for data-driven decision-
making and is essential across industries, from finance and healthcare to retail and technology.

3.2 SKILLS DEVELOPED AND TECHNIQUES LEARNT

The role requires a combination of technical skills, analytical abilities, and soft skills. Here is a
list of the essential skills and techniques I developed:

Technical Skills

Statistical Analysis: Understanding statistical concepts and methods is crucial for interpreting
data and making inferences.

Data Manipulation and Cleaning: Proficiency in cleaning and preparing data using tools like
Excel, Python (with libraries such as Pandas and NumPy), or R.
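
As a brief illustration (not the exact datasets used during the training), a typical pandas cleaning step might look like this; the records and column names are hypothetical:

import pandas as pd

# Hypothetical customer records containing common quality problems.
df = pd.DataFrame({
    "name": ["Ada", "Ada", "Bola", None],
    "age": ["25", "25", "31", "40"],
})

df = df.drop_duplicates()             # remove repeated rows
df = df.dropna(subset=["name"])       # drop rows missing a key field
df["age"] = pd.to_numeric(df["age"])  # convert a column stored as text into numbers
print(df)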

Data Visualization: Skills in creating visual representations of data using tools like Tableau and
Power BI to effectively communicate insights.
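
Tableau and Power BI are point-and-click tools, but the same idea can be sketched in code. A small matplotlib example, using made-up monthly figures:

import matplotlib.pyplot as plt

months = ["Apr", "May", "Jun"]     # hypothetical months
revenue = [2050, 2420, 2780]       # hypothetical revenue figures

plt.bar(months, revenue)
plt.title("Monthly revenue")
plt.ylabel("Revenue")
plt.savefig("monthly_revenue.png")  # or plt.show() in an interactive session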

Programming Languages: Familiarity with programming languages such as Python or R for data
analysis.

Excel Proficiency: Advanced skills in Microsoft Excel, including the use of pivot tables,
VLOOKUP, and complex formulas for data analysis.
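
For comparison, the pandas pivot_table function gives roughly the same result as an Excel pivot table; the order data below is hypothetical:

import pandas as pd

orders = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "product": ["A", "B", "A", "B"],
    "sales":   [100, 150, 80, 120],
})

# Total sales by region and product, like an Excel pivot table.
pivot = pd.pivot_table(orders, values="sales", index="region",
                       columns="product", aggfunc="sum")
print(pivot)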

Analytical Skills

Critical Thinking: Ability to evaluate data critically, identify patterns, and draw meaningful
conclusions from complex datasets.

Problem Solving: Skills in identifying problems and developing data-driven solutions to address
them.

Quantitative Skills: Strong numerical ability to analyze and interpret data accurately.

Soft Skills

Communication Skills: The ability to convey technical findings to non-technical stakeholders
clearly and effectively.

Collaboration and Teamwork: Experience working with cross-functional teams to align data
analysis with business objectives.

Attention to Detail: Meticulousness in data handling to avoid errors and ensure data integrity.

Adaptability: Willingness to learn new tools and techniques as the data landscape evolves.

Techniques

Exploratory Data Analysis (EDA): Using statistical graphics and visualization to understand the
data’s structure and identify patterns.
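
A minimal first pass at EDA in pandas might look like the sketch below; the file name survey.csv is a placeholder for whatever dataset is being explored:

import pandas as pd

df = pd.read_csv("survey.csv")        # hypothetical dataset

print(df.shape)                        # number of rows and columns
print(df.dtypes)                       # column data types
print(df.describe(include="all"))      # summary statistics for every column
print(df.isna().sum())                 # missing values per column
print(df.corr(numeric_only=True))      # correlations between numeric columns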

Regression Analysis: Employing regression techniques to understand relationships between
variables and make predictions.
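
A small illustration with scikit-learn, fitting a straight line to made-up advertising and sales figures:

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: advertising spend vs. resulting sales.
spend = np.array([[10], [20], [30], [40], [50]])
sales = np.array([25, 48, 70, 95, 118])

model = LinearRegression().fit(spend, sales)
print(model.coef_[0], model.intercept_)   # estimated slope and intercept
print(model.predict([[60]]))              # predicted sales at a new spend level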

A/B Testing: Designing and analyzing experiments to determine the effectiveness of different
strategies or changes.
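
A simple way to compare two groups from such an experiment is a two-sample t-test, for example with SciPy; the measurements below are invented:

from scipy import stats

# Hypothetical metric (e.g., time on page) for two versions of a web page.
group_a = [2.1, 2.4, 1.9, 2.6, 2.3, 2.2]
group_b = [2.8, 2.9, 2.5, 3.1, 2.7, 3.0]

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(p_value)  # a small p-value suggests the difference is unlikely to be due to chance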

Machine Learning Basics: Familiarity with basic machine learning concepts can be beneficial for
predictive analytics.

Data Governance and Quality Control: Understanding best practices for ensuring data quality,
security, and compliance.

By mastering these skills and techniques, a Data Analyst can effectively contribute to data-driven
decision-making and support organizational goals.

3.3 Practical Challenges

Collecting meaningful data: With the high volume of data available for businesses, collecting
meaningful data is a big challenge. Often, employees spend much time sifting through the data
to gain insights, which can be overwhelming. Besides, it is impossible to sort and analyze all the
data manually in real time, so reports may fail to be accurate and relevant.

You can easily overcome this problem using an appropriate data analytics tool. The tool can help
you collect, analyze, and provide real-time reports for better decision-making. On the same note,
it reduces the time employees spend collecting and analyzing data, thereby boosting productivity.

The use of data analytics tools should also go hand in hand with employee training on effective
data utilization, either through online training programs or coaching workshops.

Selecting the right analytics tool: Without the perfect tool for your business data analytics
needs, you may be unable to conduct the data analysis efficiently and accurately.

Different analytics tools (Power BI, Tableau, RapidMiner, etc.) are available and offer varying
capabilities. Besides finding software that fits your budget, you should consider other factors
such as your business objectives and the solution’s scalability, integration capabilities, ability to
analyze data from multiple sources, etc.

If you have a data analyst, they should be well-versed in how to select the right tool. But since
the analytics landscape is changing quickly, those not conversant with modern data analytics
could enroll in a refresher course such as the Tableau Data Analytics Certificate to hone their
skills. Alternatively, you could consult an expert to guide you on the best tool based on your
business needs.

Data visualization: Data needs to be presented in a format that fosters understanding. Usually,
this is in the form of graphs, charts, infographics, and other visuals. Unfortunately, doing this
manually, especially with extensive data, is tedious and impractical. For instance, analysts must
first sift through the data to collect meaningful insights, then plug the data into formulas and
represent it in charts and graphs.

Figure 2: Dashboard

The process can be time-consuming, not forgetting that the data collected might not be all-
inclusive or real-time. But with appropriate visualization tools, this becomes much easier, more
accurate, and relevant for prompt decision-making.

Data from multiple sources: Usually, data comes from multiple sources. For instance, your
website, social media, email, etc., all collect data you need to consolidate when doing the
analysis. However, doing this manually can be time-consuming. You might not be able to get
comprehensive insights if the data size is too large to be analyzed accurately.

Software built to collect data from multiple sources is pretty reliable. It gathers all the relevant
data for analysis, providing complete reports with minimal risk of errors.
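
In code, consolidating two hypothetical sources that share a customer_id key can be as simple as a pandas merge:

import pandas as pd

website = pd.DataFrame({"customer_id": [1, 2, 3], "page_views": [14, 3, 22]})
email   = pd.DataFrame({"customer_id": [1, 3], "emails_opened": [5, 2]})

# Combine both sources into one table for analysis.
combined = website.merge(email, on="customer_id", how="left")
print(combined)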

Low-Quality Data: Inaccurate data is a major challenge in data analysis. Generally, manual data
entry is prone to errors, which distort reports and lead to bad decisions. Also, manual system
updates can introduce errors, e.g., if you update one system and forget to make corresponding
changes in the other.

Fortunately, having the tools to automate the data collection process eliminates the risk of errors,
guaranteeing data integrity. Moreover, software that supports integrations with different
solutions helps enhance data quality by removing asymmetric data.

Data Analysis skills challenges: Another major challenge facing businesses is a shortage of
professionals with the necessary analytical skills. Without in-depth knowledge of interpreting
different data sets, you may be limited in the number of insights you can derive from your data.

In addition to hiring talent with data analysis skills, you should consider acquiring easy-to-use
and understand software. Alternatively, you could conduct training programs to equip employees
with the most up-to-date data analysis skills, especially those handling data.

Scaling challenges: With the rapidly increasing data volume, businesses face the challenge of
scaling data analysis. Analyzing and creating meaningful reports becomes increasingly difficult
as the data piles up.

This can be challenging even with analytics software, especially if the solution is not scalable.
That’s why it’s important to consult before acquiring a tool to ensure it’s scalable and supports
efficient data analysis as your business grows.

Data security: Data security is another challenge that increases as the volume of data stored
increases. This calls for businesses to step up their security measures to minimize the risks of
potential attacks as much as possible.

There are several ways of mitigating the risks, including; controlling access rights, encrypting
data with secured login credentials, and conducting training on big data. Alternatively, you could
hire cybersecurity professionals to help you monitor your systems.

Budget limitations: Data analysis is a cost-intensive process. It can be a costly investment, from
acquiring the right tools to hiring skilled professionals and training the employees on the basics
of data analysis. Again, with the high volatility of data, managers must be proactive in securing
the system and addressing any security threats while scaling the system to accommodate the
growing volume of data.

Often, risk management is treated as a small business function, and getting budget approvals to
implement these strategies can be a challenge. Nonetheless, acquiring the necessary tools and
expertise is essential to leverage data analysis, so managers must be strategic about the solutions
they acquire and provide detailed return on investment (ROI) calculations to support the budget.

Lack of a data culture: The success of data analysis in a business depends on the culture. In a
research paper on business intelligence, 60% of companies claimed that company culture was
their biggest obstacle. However, most companies are not data-driven; they have not yet equipped
their employees with the necessary knowledge of data analysis.

To overcome this challenge, it’s crucial to equip your employees to support data culture by
providing the necessary training.

Data inaccessibility: Data collected can only benefit the business if accessible to the right
people. From the analysts to the decision-makers, businesses need to make sure every key person
has the right to access the data in real-time and be fully empowered with knowledge on how to
analyze different data sets and use the insights.

Mainly, businesses restrict system access for security reasons. But with appropriate security
safeguards, you can enable safer and less restricted data access for analysis and decision-making
purposes.

With the vast amount of data created daily, businesses face the huge challenge of sifting through
all the various data sets to draw valuable insights and inform business decisions. Besides the
possibility of messy data due to the high volume, they also face other challenges such as
collecting meaningful data, selecting the right analytics tool, data visualization, multiple-source
data, low-quality data, lack of skills, scaling challenges, data security, budget limitations, lack of
a data culture, and inaccessibility. Fortunately, they can overcome these challenges by investing
in a suitable data analytics tool, training employees on data analysis, and stepping up their
cybersecurity safeguards, among other suggested solutions.

3.4 Theoretical Principles

At this point, I think a theory of data analysis would look more like music than it would like
physics or mathematics. Rather than produce general truths about the natural world, a theory of
data analysis would provide useful summaries of what has worked and what hasn’t. A
“compression of the past”, so to speak. Along those lines, I think a theory of data analysis should
reflect the following principles:

• The theory of data analysis is not a theory that instructs us what to do or tells us universal truths. Rather, it is a descriptive or constructive theory that guides analysts without tying their hands.
• A theory should speed up the training of an analyst by summarizing what is commonly done and what has been successful. It should reduce the amount of learning that must be done experientially.
• The theory should serve the practice of data analysis and make data analysis better in order to justify its existence. This is in contrast to traditional statistical theory, whose existence could be justified by virtue of the fact that it allows us to discover truth about the natural world, not unlike with mathematics or physics. In other words, a theory of data analysis would have relatively little intrinsic value.
• The theory should not be dependent on specific technologies or types of data. It should be reasonably applicable to a wide range of data analyses in many different subject areas. A biologist talking to an economist should be able to understand each other when discussing the theory of data analysis.
• The theory should go far beyond the instruction of different methods of analysis, including aspects of data analysis that don’t strictly involve the data.

3.5 Contribution

• Designing and maintaining data systems and databases; this includes fixing coding errors and other data-related problems.
• Mining data from primary and secondary sources, then reorganizing the data in a format that can be easily read by either human or machine.
• Using statistical tools to interpret datasets, paying particular attention to trends and patterns that could be valuable for diagnostic and predictive analytics efforts.
• Demonstrating the significance of their work in the context of local, national, and global trends that impact both their organization and industry.
• Preparing reports for executive leadership that effectively communicate trends, patterns, and predictions using relevant data.
• Collaborating with programmers, engineers, and organizational leaders to identify opportunities for process improvements, recommend system modifications, and develop policies for data governance.
• Creating appropriate documentation that allows stakeholders to understand the steps of the data analysis process and duplicate or replicate the analysis if necessary.

3.6 Future of Data Analysis in Nigeria

Data analytics has become an increasingly important field in Nigeria, as businesses and
organizations seek to leverage data to drive decision-making and gain a competitive edge. With
the growing volume of data being generated across various sectors, the need for skilled
professionals who can extract insights and transform data into actionable information has never
been greater. This section explores the world of data analytics in Nigeria, covering key topics
such as the current state of the industry, the benefits of data analytics in Nigeria, the challenges
faced by practitioners, and the future outlook for this rapidly evolving field.

The Current State of Data Analytics in Nigeria

Nigeria has witnessed a significant surge in the adoption of data analytics in recent years. Across
industries, from banking and telecommunications to healthcare and agriculture, organizations are
recognizing the value of data-driven decision-making. According to a report by the Nigerian
Communications Commission (NCC), Nigeria had over 141 million active internet subscribers as
of January 2021, indicating a vast pool of data waiting to be harnessed.

Benefits of Data Analytics in Nigeria

• Improved Decision-Making: Data analytics enables organizations to make informed decisions based on factual insights rather than intuition or gut feelings. This leads to more effective strategies and better resource allocation.
• Enhanced Customer Experience: By analyzing customer data, businesses can gain a deeper understanding of their target audience, enabling them to tailor products, services, and marketing efforts to better meet their needs.
• Increased Efficiency: Data analytics can help identify areas of inefficiency within an organization, allowing for process optimization and cost savings.
• Competitive Advantage: Organizations that effectively leverage data analytics can gain a significant advantage over their competitors by identifying new opportunities, anticipating market trends, and making data-driven strategic decisions.

Challenges Faced by Data Analytics Professionals in Nigeria

1. Data Quality and Availability: One of the primary challenges faced by data analytics
professionals in Nigeria is the lack of high-quality, reliable data. Many organizations
struggle with data silos, inconsistent data formats, and incomplete datasets.
2. Talent Shortage: Nigeria faces a shortage of skilled data analytics professionals, as the
demand for these skills often outpaces the supply. This can make it challenging for
organizations to find and retain top talent.
3. Technological Limitations: While Nigeria has made significant strides in technology
adoption, some organizations still face limitations in terms of hardware, software, and
infrastructure. This can hinder their ability to effectively collect, store, and analyze data.

4. Regulatory Compliance: As data privacy and security become increasingly important
concerns, data analytics professionals in Nigeria must navigate a complex regulatory
landscape to ensure compliance with relevant laws and standards.

The Role of Education in Advancing Data Analytics in Nigeria

Educational institutions in Nigeria are playing a crucial role in advancing data analytics by
offering specialized programs and courses. Universities such as the University of Lagos,
Covenant University, and the University of Ibadan have introduced data science and analytics
degrees, equipping students with the skills needed to thrive in this rapidly evolving field.
Additionally, various online platforms and boot camps have emerged, providing affordable and
accessible training opportunities for professionals looking to upskill or transition into data
analytics careers.

In conclusion, as Nigeria continues to embrace digital transformation, the future of data analytics
looks bright. With the increasing adoption of technologies such as artificial intelligence (AI),
machine learning (ML), and the Internet of Things (IoT), the volume and complexity of data will
continue to grow. To capitalize on these opportunities, organizations must invest in data
infrastructure, foster a data-driven culture, and develop strategies to attract and retain top data
analytics talent. By doing so, they can unlock the full potential of data analytics and drive
sustainable growth in the years to come.

Data analytics has become a game-changer in Nigeria, enabling organizations to make more
informed decisions, enhance customer experiences, and gain a competitive edge in an
increasingly digital landscape. While challenges exist, such as data quality, talent shortages, and
technological limitations, the future of data analytics in Nigeria is promising.

By investing in education, fostering a data-driven culture, and embracing emerging technologies,
Nigeria can position itself as a leader in data analytics, driving innovation and economic growth
across various sectors. As we move forward, organizations, policymakers, and educational
institutions must work together to create an enabling environment for data analytics to thrive in
Nigeria.

CHAPTER FOUR

4.1 Relevance of Data Analysis to Computer Science

In today's technology-driven world, data is being generated at an exponential rate. Companies
and organizations across all industries are collecting vast amounts of data, and they need skilled
professionals who can make sense of it all. This is where data analysis comes into play. Here are
some reasons why data analysis is considered a key component for success in Computer Science
programs:

1. Identifying Patterns and Trends: By analyzing data, computer scientists can identify
patterns and trends that may not be apparent at first glance. This information can help
organizations make better decisions, develop more efficient algorithms, and improve the overall
performance of their systems.

2. Making Informed Decisions: Data analysis enables computer scientists to make informed
decisions based on solid evidence and insights obtained from the data. Whether it's developing
new software, implementing security measures, or innovating in the field of Artificial
Intelligence, data-driven decisions often lead to better outcomes.

3. Enhancing Performance and Efficiency: Data analysis allows computer scientists to identify
bottlenecks, optimize algorithms, and enhance system performance. By understanding how data
affects the behavior of computer systems, professionals can design more efficient software
solutions that meet the demands of modern technology.

4. Detecting Anomalies and Security Threats: With the rise of cybercrime and data breaches,
security has become a top concern for organizations. Data analysis helps computer scientists
detect anomalies, identify potential security threats, and develop robust security measures to
protect sensitive data.

Skills Required for Data Analysis in Computer Science

To excel in data analysis within the field of Computer Science, students must possess a solid set
of skills. Here are some of the essential skills that can make a significant difference:

1. Programming: A strong foundation in programming is crucial for data analysis. Proficiency
in languages such as Python, Java, or R enables computer scientists to access, manipulate, and
analyze large datasets efficiently.

2. Statistics and Mathematics: Understanding statistical concepts and mathematical principles
is essential for drawing meaningful insights from data. Proficient knowledge in areas such as
probability, regression analysis, and hypothesis testing is highly beneficial.

3. Data Visualization: Being able to present data in a visually appealing and comprehensible
manner is a valuable skill. Proficiency in tools like Tableau, D3.js, or matplotlib allows computer
scientists to create informative visualizations that effectively convey insights to stakeholders.

4. Data Mining and Machine Learning: Knowledge of data mining techniques, as well as
machine learning algorithms, is essential for data analysis. Understanding how to apply these
methods helps in uncovering patterns, making predictions, and solving complex problems.

Data analysis is truly a key component for success in Computer Science programs. To
summarize, here are the key takeaways:

• Data analysis is a crucial skill that enables computer scientists to make informed decisions, identify patterns, enhance performance, and detect security threats.
• Proficiency in programming, statistics, data visualization, and machine learning is essential for effective data analysis in Computer Science.
• Having data analysis skills can pave the way to rewarding career opportunities in AI, machine learning, big data, software development, and information security.

With the ever-increasing reliance on technology and data, computer scientists who possess
strong data analysis skills have a competitive edge in the job market. Aspiring computer science
students should focus not only on theoretical knowledge but also on practical experience and
problem-solving abilities through data analysis.

• Real-world applicability: Data analysis is used in various domains, such as finance, marketing, healthcare, and social media. By incorporating data analysis into computer science curricula, students gain practical skills that can be directly applied in their future careers.
• Improved problem-solving abilities: Analyzing data helps students develop a systematic approach to problem-solving. They learn to identify patterns, make predictions, and draw conclusions based on evidence, preparing them for the challenges they will face in their professional lives.
• Enhanced decision-making: In today's data-driven world, making informed decisions is crucial. By analyzing data, computer science students learn how to utilize evidence-based reasoning to make sound decisions and gain a competitive advantage in their chosen field.
• Increased employability: With the rapid growth of the data industry, employers are seeking individuals with strong data analysis skills. By incorporating data analysis into computer science curricula, educational institutions can better prepare students for the job market and increase their employability.

Practical Applications of Data Analysis in Computer Science

Data analysis finds extensive applications in various areas of computer science. Let's explore
some practical examples:

1. Machine Learning and Artificial Intelligence

Data analysis is at the core of machine learning and artificial intelligence. By analyzing large
datasets, algorithms can learn patterns and make predictions. Some key applications include:

• Image and speech recognition
• Recommendation systems
• Natural language processing

By understanding data analysis, computer science students can contribute to the development of
cutting-edge technologies that are revolutionizing industries worldwide.

2. Cybersecurity

Data analysis plays a significant role in cybersecurity. By analyzing network traffic, logs, and
user behavior, computer scientists can detect and prevent cyber threats. Important applications
include:

• Anomaly detection
• Intrusion detection
• Vulnerability assessment

By incorporating data analysis techniques into computer science curricula, students can
contribute to securing our digital world and protecting sensitive information.

3. Data Visualization

Data visualization is the graphical representation of data to facilitate understanding and decision-
making. By analyzing and visualizing data effectively, computer scientists can communicate
complex information in a comprehensible manner. Some key applications include:

• Interactive dashboards for business analytics
• Scientific visualizations
• Geospatial analysis and mapping

By developing strong data visualization skills, computer science students can present data in a
visually appealing and informative way, aiding decision-makers in various industries.

Data analysis is an essential skill in today's computer science curricula. By incorporating data
analysis, students can:

• Apply their skills across various real-world domains
• Improve their problem-solving abilities
• Enhance their decision-making capabilities
• Increase their employability

In conclusion, as the digital world becomes increasingly data-driven, computer science students
must develop proficiency in data analysis. By doing so, they can unlock opportunities to solve
complex problems, contribute to technological advancements, and make a meaningful impact in
their careers.


Enhancing Problem-Solving Skills Through Data Analysis in Computer Science

The Power of Data Analysis in Computer Science

Data analysis involves collecting, organizing, and interpreting large volumes of data. Computer
scientists employ techniques such as statistical modeling, machine learning, and visualization to
extract meaningful insights from raw data. These insights can then be used to solve complex
problems, optimize processes, and make informed decisions. Let's delve into the specific ways
data analysis can enhance problem-solving skills in computer science:

Identifying Patterns and Trends

• Data analysis allows computer scientists to identify patterns and trends within a given dataset. This is particularly useful when dealing with large amounts of data, as it helps recognize recurring elements or relationships between variables.
• Identifying patterns enables the development of efficient algorithms and models that can solve problems more effectively. It improves computational efficiency and reduces time and resource consumption.
• By analyzing data, computer scientists can uncover hidden insights that may not be immediately apparent. These insights can provide a deeper understanding of problem domains and help devise innovative solutions.

Optimizing Decision-Making Processes

• Data analysis plays a vital role in decision-making processes. Computer scientists can leverage data-driven insights to make informed decisions, weighing different options based on quantitative evidence.
• Through data analysis, computer scientists can assess the potential outcomes of different strategies or solutions. This helps in identifying the most optimal approach, minimizing risks, and making predictions about future scenarios.
• By employing data analysis techniques, computer scientists can reduce the chances of errors and biases in decision-making. It enables them to justify decisions with facts and evidence.

The Advantages of Utilizing Data Analysis for Problem-Solving

Incorporating data analysis techniques into problem-solving in computer science brings several
advantages:

• Efficiency: Data analysis allows computer scientists to optimize problem-solving processes by identifying patterns, trends, and relationships within large datasets. This leads to faster and more accurate solutions.
• Precision: By relying on data-driven insights, problem-solving approaches become more precise and focused. Data analysis minimizes the likelihood of errors and biases, leading to more reliable and effective solutions.
• Innovation: Analyzing data often uncovers hidden insights and alternative perspectives. This fosters innovation by helping computer scientists think beyond traditional problem-solving approaches and discover new solutions.
• Scalability: Data analysis techniques can handle vast amounts of data, allowing computer scientists to solve complex problems that would be otherwise unmanageable. This scalability enables them to tackle real-world challenges more effectively.

Data analysis is a powerful tool in enhancing problem-solving skills in computer science. By
analyzing data, computer scientists can identify patterns, trends, and relationships, leading to
more efficient problem-solving processes. Utilizing data-driven insights allows for more precise
decision-making and the ability to innovate and think outside the box. The advantages of
incorporating data analysis techniques include increased efficiency, improved precision,
fostering innovation, and scalability to tackle complex challenges.

Aspiring computer scientists should embrace data analysis as a core skill, as it is increasingly
sought after in the industry. By honing problem-solving abilities through data analysis, aspiring
professionals can position themselves as valuable assets in the ever-evolving field of computer
science.

The Benefits of Data Analysis in Computer Science Education


Data analysis is the process of inspecting, cleansing, transforming, and modeling data to discover
useful information, draw conclusions, and support decision-making. It involves various
techniques and tools to extract meaning from data sets, enabling individuals to make data-driven
decisions. In the field of computer science education, incorporating data analysis into the
curriculum offers several benefits, both for the students and the industry as a whole.

4.2 Glossary

Artificial intelligence (AI) is the simulation of human intelligence processes by machines. It
combines computer science with robust datasets to enable problem solving using the rapid
learning capabilities of machines.

Attribute: When working in a spreadsheet or database, an attribute is a common descriptor used
to label a column. Labeling columns clearly and precisely can enable you to keep your data
organized and ready for analysis.

Augmented intelligence is a design pattern for a human-centered partnership model of people
and artificial intelligence used to enhance cognitive performance, including learning, decision
making, and new experiences. The combination of human intuition and artificial intelligence is
powerful and can help mitigate perceived risk with purely machine-driven AI.

Big data refers to large and complex datasets containing structured and unstructured data,
arriving in increasing volumes and velocity. Big data is relative — what was big ten years ago is
no longer considered big today, and the same will be true ten years from now. The point here is
that the data is big enough to require special attention with regards to storing, moving, updating,
querying, and aggregating it.

Business glossary is a repository of information that contains concepts and definitions of
business terms frequently used in day-to-day activities within an organization — across all
business functions — and is meant to be a single authoritative source for commonly used terms
for all business users. Business glossaries are used to build consensus in organizations and are
great for getting new team members up to speed on your organization’s jargon and lexicon of
acronyms.

Business intelligence (BI) leverages software and services that help business users make more
informed decisions by delivering reports and dashboards that help them analyze data and obtain
actionable information.

Changelog: A changelog is a list documenting all of the steps you took when working with your
data. This can be helpful in the event that you need to return to your original data or recall how
you prepared your data for analysis.

Clean data: Clean data is data that is accurate, complete, and ready for analysis. Data cleaning,
an important step in the data analysis process, involves checking your data for inaccuracies,
inconsistencies, irregularities, and biases.

Cloud computing is a service provided via the internet where an organization can access on-
demand computing resources from another organization under a shared service model. Cloud
computing allows organizations to avoid large upfront costs and ongoing maintenance associated
with procuring, hosting, and managing their own data centers. Users can effectively rent
compute, network, and storage resources for a period and only pay for the services as long as
they are using them. This allows for maximum flexibility to scale up and scale down resources
quickly and on-demand.

CSV (comma-separated values) file: A CSV file is a text file that separates pieces of data with
commas. This is a common file type when downloading data files for analysis, as it tends to be
compatible with common spreadsheet and database software.
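
Loading a CSV file for analysis is typically a one-liner in pandas; transactions.csv below is a placeholder file name:

import pandas as pd

df = pd.read_csv("transactions.csv")  # hypothetical CSV file
print(df.head())                       # preview the first few rows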

Dashboard: A dashboard is a tool used to monitor and display live data. Dashboards are
typically connected to databases and feature visualizations that automatically update to reflect
the most current data in the database.

Data analytics: Data analytics is the collection, transformation, and organization of data in order
to draw conclusions, make predictions, and drive informed decision-making. Data analytics
encompasses data analysis (the process of deriving information from data), data science (using
data to theorize and forecast) and data engineering (building data systems). Data analysts, data
scientists, and data engineers are all data analytics professionals.

Data architecture is the plan and design for the entire data lifecycle for an organization, starting
when data is captured, going all the way to when value is generated from data through analytics.

Data catalog is the pathway — or a bridge — between a business glossary and a data dictionary.
It is an organized inventory of an organization’s data assets that informs users — both business
and technical — on available datasets about a topic and helps them to locate it quickly.

Data cleaning: Data cleaning, cleansing, or scrubbing is the process of preparing raw data for
analysis. When cleaning your data, you verify that your data is accurate, complete, consistent,
and unbiased. It’s important to make sure you have clean data prior to analysis because unclean
or dirty data can lead to inaccurate conclusions and misguided business decisions.

Data democratization is the process of providing all business users within an organization —
technical and non-technical — with access to data and enabling them to use it, when they need it,
to gain insights and expedite decision-making.

Data dictionary is the technical and thorough documentation of data and its metadata within a
database or repository. It consists of the names of fields and entities, their location within the
database or repository, detailed definitions, examples of content, descriptions for business
interpretation, technical information like type, width, constraints, and indexes, and business rules
and logic applied to derived or semantic assets.

Data engineering is the process and practices needed to transform raw data into meaningful and
actionable information. Common data engineering tasks involve data collection, extraction,
curation, ingestion, storage, movement, transformation, and integration.

Data enrichment: Data enrichment is the process of adding data to your existing dataset. You’d
typically enrich your data during the data transformation process, as you are getting ready to
begin your analysis, if you realize you need additional data in order to answer your business
question.

Data governance: Data governance is the formal plan for the way an organization manages
company data. Data governance encompasses rules for the way data is accessed and used, and
can include accountability and compliance rules.

Data ingestion is the process by which data is loaded from various sources to a storage medium
— such as a data warehouse or a data lake — where it can be accessed, used, and analyzed.

Data integration is the process of connecting disparate data together for analysis or operational
uses.

Data integrity: Data integrity encompasses the accuracy, reliability, and consistency of data
over time. It involves maintaining the quality and reliability of data by implementing safeguards
against unauthorized modifications, errors, or data loss.

Data lake is a central data repository that accepts relational, structured, semi-structured, and
non-structured data types in a low-to-no modeling framework, used for tasks such as reporting,
visualization, advanced analytics, and machine learning. A data lake can be established on
premises (within an organization’s data centers) or in the cloud.

Data management is the way you carry out your data strategy. It is the plans, policies,
procedures, and actions taken on data assets in an organization throughout the data lifecycle to
create valuable information repeatable and at scale.

Data mining is the practice of systematically analyzing large datasets to generate insightful
information, uncover hidden correlations, and identify patterns.

Data model is a communication and consensus building tool, where a person or an organization
creates a visual representation of how things relate together and how processes behave in the real
world. They are used to translate business requirements into technical requirements, especially in
database and system design.

Data quality is a measure of the condition of data based on factors such as accuracy,
completeness, consistency, and reliability. Generally, data is of sufficient quality when it is fit for
its intended uses in operations and decision making.

Data replication is the process and activities necessary to make a copy of data stored in one
location and move it to a different location. This replication activity improves accessibility to
data and protects an organization from a single point of failure that can cause a data loss event.

Data science: Data science is the scientific study of data. Data scientists ask questions and find
ways to answer those questions with data. They may work on capturing data, transforming raw
data into a usable form, analyzing data, and creating predictive models.

Data source: A data source refers to the origin of a specific set of information. As businesses
increasingly generate data year over year, data analysts rely on different data sources to measure
business success and offer strategic recommendations.

Data strategy is a defined plan that outlines the people, processes, and technology your
organization needs to accomplish your data and analytics goals. A data strategy is designed to
answer exactly what you need in order to more effectively use data; what processes are required
to ensure the data is high quality and accessible; what technology will enable the storage,
sharing, and analysis of data; and the data required, where it’s sourced from, and whether it’s of
good quality.

Data visualization: Data visualization is the representation of information and data using charts,
graphs, maps, and other visual tools. With strong data visualizations, you can foster storytelling,
make your data accessible to a wider audience, identify patterns and relationships, and explore
your data further.

Data warehouse is a highly governed centralized repository of modeled data sourced from all
kinds of different places. Data is stored in the language of the business, providing reliable,
consistent, and quality-rich information.

Data wrangling: Data wrangling, also called data munging or data remediation, is the process of
converting raw data into a usable form. There are four stages of the munging process: discovery,
data transformation, data validation, and publishing. The data transformation stage can be broken
down further into tasks like data structuring, data normalization or denormalization, data
cleaning, and data enrichment.

Database: A database is an organized collection of information that can be searched, sorted, and
updated. This data is often stored electronically in a computer system called a database
management system (DBMS). Oftentimes, you’ll need to use a programming language, such as
structured query language (SQL), to interact with your database.

Descriptive analytics tells you what happened in the past by looking at historical data and
finding patterns. Most organizations with some level of maturity on their analytics journey are
already doing some degree of descriptive analytics.

Diagnostic analytics helps organizations understand the “why” behind the “what” of descriptive
analytics. It enables better decision making, as well as generates better predictive use cases
through that understanding.

Embedded analytics places analytics at the point of need — inside of a workflow or application
— and makes it possible for users to take immediate action without needing to leave the
application to get more information to make a decision. Embedded analytics can work the
opposite way as well, where organizations put an operational workflow inside of an analytics
application to streamline processes. An example of this is an analytics application that identifies
product out-of-stock trends and provides a mechanism for a planner to adjust the reorder
threshold for that product within the analytics dashboard.

Geospatial analytics gathers, manipulates, and displays geographic information system data and
imagery including GPS and satellite photographs to create geographic models and data
visualizations for more accurate modeling and prediction of trends. Putting data in context of its
geography can lead to insights that are not immediately visible when analyzed with traditional
data visualization methods.

Machine learning is a practical application of AI, where a system uses data and information to
learn and improve over time by identifying trends, patterns, relationships, and optimizations.

Metadata: Metadata is data about data. It describes various characteristics of your data, such as
how it was collected, where it’s stored, its file type, or creation date. Metadata can be particularly
useful for verification and tracking purposes.

Open data: Open data, also called public data, is data that is available for anyone to use.
Exploring and analyzing open datasets is one way to practice data analysis skills.

Predictive analytics is a form of advanced analytics that determines what is likely to happen
based on historical data using statistical techniques, data mining, or machine learning.

Prescriptive analytics pertains to true guided analytics where your analytics is prescribing or
guiding you toward a specific action to take. It is effectively the merging of descriptive and
predictive analytics to drive decision making.

Qualitative data: Qualitative data is data that describes qualities or characteristics. It’s generally
non-numeric data and can be subjective, for example eye color or emotions.

Quantitative data: Quantitative data is objective data with a specific numeric value. It’s
generally something you can count or measure, such as height or speed.

Query: A query is a request for information. It’s essentially the question you ask a database in
order to return the data you want to retrieve. In data analytics, you’ll formulate your database
queries using a query language, such as Structured Query Language (SQL).

Relational database: A relational database is a database that contains several tables with related
information. Even though data is stored in separate tables, you can access related data across
several tables with a single query. For example, a relational database may have one table for
inventory and another table for customer orders. When you look up a specific product in your
relational database, you can retrieve both inventory and customer order information at the same
time.
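
A small sketch of this idea using Python's built-in sqlite3 module; the tables and values are hypothetical:

import sqlite3

conn = sqlite3.connect(":memory:")   # throwaway in-memory database
cur = conn.cursor()
cur.execute("CREATE TABLE inventory (product_id INTEGER, stock INTEGER)")
cur.execute("CREATE TABLE orders (order_id INTEGER, product_id INTEGER, qty INTEGER)")
cur.execute("INSERT INTO inventory VALUES (1, 50)")
cur.execute("INSERT INTO orders VALUES (100, 1, 3)")

# A single query joins the two related tables.
cur.execute("""
    SELECT orders.order_id, orders.qty, inventory.stock
    FROM orders
    JOIN inventory ON inventory.product_id = orders.product_id
""")
print(cur.fetchall())
conn.close()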

Structured Data: Structured data is formatted data, for example data that is organized into rows
and columns. Structured data is more readily analyzed than unstructured data because of its tidy
formatting.

Structured Query Language (SQL): Structured Query Language, or SQL (pronounced
“sequel”), is a computer programming language used to manage relational databases. It’s among
the most common languages for database management.

Supervised learning uses labeled data — data that comes with a target or identifier such as a
name, type, or number — and guided learning to train models to classify data or to make
accurate predictions. It is the simpler of the two types of machine learning, the most used, and
the most accurate because the learning is guided using known historical targets that you can plug
in to get the outcome.

Unstructured data: is data that is not organized in any apparent way. In order to analyze
unstructured data, you’ll typically need to implement some type of organization.

Unsupervised learning uses unlabeled data — data that does not come with a target or identifier
— to make predictions. It uses artificial intelligence algorithms to identify patterns in datasets
and doesn’t have any defined target variable. Unsupervised learning can perform more complex
tasks than supervised learning but has the potential to be less accurate in its predictions, and
possibly have reduced interpretability with additional analysis.

Recommendation

For Data Analysts looking to expand their impact, gaining foundational skills in machine
learning is highly beneficial. While traditional data analysis focuses on historical data patterns,
machine learning enables predictive insights and prescriptive analysis, allowing analysts to
forecast trends and suggest optimal actions. We recommend starting with supervised learning
techniques like linear regression, logistic regression, and decision trees, as these are valuable for
predictive tasks and can enhance a data analyst’s ability to uncover deeper insights. Familiarity
with unsupervised techniques, such as clustering and dimensionality reduction, is also
recommended for exploring data relationships. Mastery of these techniques will position analysts
to tackle complex, predictive, and prescriptive analytics challenges, creating greater value for
their organizations.
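
As a starting point, the sketch below shows one supervised technique (a decision tree) and one unsupervised technique (k-means clustering) from scikit-learn, using its bundled iris dataset rather than any real company data:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

# Supervised learning: a decision tree learns from labeled examples.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("classification accuracy:", tree.score(X_test, y_test))

# Unsupervised learning: k-means groups the same records without using the labels.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", [int((clusters == k).sum()) for k in range(3)])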

Conclusion

In today’s data-driven world, the role of a Data Analyst is pivotal to helping businesses make
informed decisions and stay competitive. By developing a strong foundation in statistical
analysis, technical skills, and visualization techniques, Data Analysts can interpret data
accurately and present insights effectively. As industries continue to rely on data for strategic
decision-making, those analysts who also incorporate machine learning techniques into their
toolkit will not only enhance their analytical capabilities but also open the door to advanced
problem-solving and predictive analytics. The right combination of technical and business skills
allows data analysts to provide comprehensive insights that drive growth and innovation,
ensuring they remain valuable assets across diverse industries.
