
Impact of Classroom Environments and Teaching Tools on Student Performance in the Software Metrics Course at IIT

Submitted to

Shah Mostafa Khaled, Ph.D.


Associate Professor, IIT, DU

Dr Emon Kumar Dey


Associate Professor, IIT, DU

Submitted by

Sazidur Rahman Mahin


BSSE-1414
Hasnain Ferdous
BSSE-1427
Pronob Karmorker
BSSE-1431



Chapter 1: Introduction
Overview:

In software engineering education, student performance is influenced not only by curriculum and teaching quality but also by the classroom environment and the tools used. Measuring the impact of such factors can lead to more effective learning outcomes.
Motivation:
The GQM approach was selected for its ability to structure the evaluation around specific
goals and derive meaningful, actionable insights. Understanding how classroom factors
affect learning can help improve instructional strategies in future semesters.
Scope:

This study focuses on environmental (lighting, seating) and instructional (tool usage,
interaction method) factors in the Software Metrics course at IIT. It seeks to identify which
factors most strongly correlate with improved student performance and engagement.

Chapter 2: Goal Specification


2.1 GQM Framework Explanation
The Goal-Question-Metric (GQM) model provides a structured framework for software
measurement. It starts by defining a specific goal, then derives questions that assess
progress toward that goal, and finally identifies appropriate metrics (quantitative or
qualitative) to answer those questions.
This study uses the GQM approach to evaluate how classroom conditions and teaching
tools affect student understanding, focus, and performance in the Software Metrics
course (SE 611).
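
For illustration, the goal–question–metric hierarchy defined in Section 2.3 can be written down as a small data structure. The sketch below is a minimal Python encoding of one subgoal; the dictionary layout and names are our own choice, not part of any standard GQM tooling.

```python
# Minimal sketch of the GQM hierarchy used in this study.
# The structure (goal -> subgoals -> questions -> metrics) mirrors Section 2.3;
# only one subgoal is shown for brevity.
gqm = {
    "goal": "Assess and improve student performance and engagement "
            "in the Software Metrics course",
    "subgoals": [
        {
            "id": "GA1",
            "description": "Understand student background",
            "questions": [
                {
                    "id": "QA1",
                    "text": "What is the academic level of students participating?",
                    "metric": "Semester identification (6th, 7th, etc.) from survey",
                },
                {
                    "id": "QA2",
                    "text": "What was their initial interest in Software Metrics?",
                    "metric": "Interest rating (Very Interested to Not Interested at All)",
                },
            ],
        },
    ],
}

# Walk the hierarchy and print each question with its metric.
for subgoal in gqm["subgoals"]:
    print(subgoal["id"], "-", subgoal["description"])
    for q in subgoal["questions"]:
        print(f"  {q['id']}: {q['text']}  ->  {q['metric']}")
```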

2.2 Main Goal

To assess and improve student performance and engagement in the Software Metrics course by analyzing classroom environmental factors and teaching tool effectiveness.



2.3 Subgoals, Questions, and Metrics

GA1: Understand student background

QA1: What is the academic level of students participating?
MA1: Semester identification (6th, 7th, etc.) from survey

QA2: What was their initial interest in Software Metrics before the course?
MA2: Rating scale of interest level (Very Interested to Not Interested at All)

GA2: Evaluate classroom conditions

QA1: How do students rate the physical classroom environment?
MA1: Comfort ratings (Excellent to Very Poor)

QA2: Was the classroom environment helpful for maintaining focus?
MA2: Agreement level on focus retention (Strongly Agree to Strongly Disagree)

QA3: How was the lighting condition in the classroom?
MA3: Lighting ratings (Very Good to Very Poor)

QA4: How often were students distracted due to noise, heat, etc.?
MA4: Frequency levels (Very Often to Never)

GA3: Assess instructor delivery

QA1: Was the instructor audible during lectures?
MA1: Audibility rating (Always Clear to Never Clear)

QA2: Did the instructor's teaching style help students understand the content better?
MA2: Agreement scale (Strongly Agree to Strongly Disagree)

GA4: Evaluate teaching tool usage and effectiveness

QA1: Which tools were used in the Software Metrics class?
MA1: List of selected tools (Slides, Whiteboard, Online Platforms, etc.)

QA2: Which tool helped students learn best?
MA2: Preferred learning tool selected in survey

QA3: How effective were these tools in explaining difficult concepts?
MA3: Effectiveness rating (Very Effective to Very Ineffective)

QA4: How did students rate the use of digital tools (e.g., LMS, Google Meet)?
MA4: Rating scale (Excellent to Very Poor)

QA5: Do students want more digital/interactive tools in the future?
MA5: Agreement scale (Strongly Agree to Strongly Disagree)

GA5: Measure learning outcomes and perceived difficulty

QA1: How confident are students about their understanding after the course?
MA1: Confidence level rating (Very Confident to Not Confident at All)

QA2: How difficult did students find the course?
MA2: Difficulty rating (Very Easy to Very Difficult)

QA3: Do students feel their performance improved during the course?
MA3: Self-reported performance change (Improved to Decreased)

QA4: Were assessments aligned with what was taught in class?
MA4: Alignment rating (Always to Never)

GA6: Evaluate career relevance and satisfaction

QA1: How likely are students to apply knowledge from this course in their careers?
MA1: Likelihood rating scale from survey

GA7: Collect student feedback for improvement

QA1: What part of the course did students enjoy the most, and why?
MA1: Open-ended responses categorized by theme (e.g., Group Work, Memes, Games)

QA2: What changes do students suggest to improve the course?
MA2: Thematic classification (e.g., More Interaction, Practical Work, Tool Integration)

Notes:
Metrics include Likert-scale ratings (1–5), frequency responses, multiple-selection counts, and free-text comments. This mapping ensures that every survey question is embedded in the GQM strategy, and the collected data (15 responses) supports each subgoal.
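
Because most metrics are Likert-scale ratings, the text labels must be coded as numbers (1–5) before percentages or averages can be computed. The following is a minimal sketch of that coding step; the label set and helper function are illustrative, based on the agreement scale named above.

```python
# Sketch: convert Likert text labels from the survey into 1-5 scores.
# The label list is an assumption based on the scales named in Section 2.3.
AGREEMENT = {
    "Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
    "Agree": 4, "Strongly Agree": 5,
}

def code_responses(raw_answers, scale):
    """Return numeric codes for a list of Likert answers; None for unknown labels."""
    return [scale.get(answer) for answer in raw_answers]

answers = ["Agree", "Neutral", "Strongly Disagree", "Agree"]  # illustrative only
print(code_responses(answers, AGREEMENT))   # -> [4, 3, 1, 4]
```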

Chapter 3: Data Collection and Visualization

3.1 Data Collection Methods


To investigate how the classroom environment and teaching tools affect student
performance in the Software Metrics course, a quantitative survey-based approach was
used. The methodology is outlined below:

Instrument Used: Google Forms

Questionnaire: 22 structured questions divided into 5 categories:

Student Background

Classroom Environment

Teaching Tools

Learning & Performance

Open-ended Feedback

Type of Questions: Multiple-choice, Likert scale, and short-answer

Target Group: Undergraduate students enrolled in the Software Metrics course



Response Size: 15 valid responses collected

Collection Period: June 24 to June 28, 2025

Environment: Responses were collected anonymously online during class hours or after
lectures.

Proof of Data Collection:

[Screenshots of the Google Forms questionnaire and the collected responses]
3.2 Data Representation

To simplify the interpretation of the collected data, both tabular summaries and graphical
visualizations were used.

Summary Table:

Question                                               Most Selected Answer       Percentage
Which semester are you in?                             6th Semester               86.7%
How would you rate your interest before the course?    Not Interested / Neutral   66.6%
Classroom environment rating                           Average                    53.3%
Most useful teaching tool                              Slides                     60%
Confidence after completing course                     Neutral                    66.7%
Would you like more digital tools in class?            Agree                      53.3%
Will use this course in your career?                   Neutral                    46.7%
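
As an illustration of how such a summary could be derived from the Google Forms export, the sketch below loads a CSV of responses and reports the most-selected answer and its share for each question. The file name is a placeholder, and we assume (as in a typical Google Forms export) that each column header is the question text.

```python
# Sketch: derive a "most selected answer + percentage" summary from the
# Google Forms CSV export. File name and column contents are placeholders.
import pandas as pd

responses = pd.read_csv("software_metrics_survey.csv")  # hypothetical export

summary_rows = []
for question in responses.columns:
    # Share of each answer option, ignoring blank responses.
    shares = responses[question].value_counts(normalize=True, dropna=True)
    if shares.empty:
        continue
    summary_rows.append({
        "Question": question,
        "Most Selected Answer": shares.index[0],
        "Percentage": f"{shares.iloc[0] * 100:.1f}%",
    })

summary = pd.DataFrame(summary_rows)
print(summary.to_string(index=False))
```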

3.3 Tools Used


Tool                     Purpose
Google Forms             Designing and collecting survey responses
Google Sheets            Viewing response data and preparing summaries
Excel                    Formatting response tables and plotting graphs
Google Charts / Forms    Generating auto-updated pie and bar charts
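
The kind of bar chart produced by Google Charts or Excel could also be reproduced in code. The sketch below uses matplotlib with an invented answer distribution; the question, labels, and counts are placeholders, not the actual survey data.

```python
# Sketch: bar chart of answer counts for one survey question.
# Labels and counts are illustrative placeholders, not the real responses.
import matplotlib.pyplot as plt

labels = ["Excellent", "Good", "Average", "Poor", "Very Poor"]
counts = [1, 3, 8, 2, 1]   # hypothetical distribution over 15 responses

plt.bar(labels, counts, color="steelblue")
plt.title("How would you rate the classroom environment?")
plt.ylabel("Number of responses")
plt.tight_layout()
plt.savefig("classroom_environment.png")   # or plt.show()
```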

Chapter 4: Results and Metric Analysis



4.1 Analysis of Results
The responses from 15 students provided valuable insights into how the classroom
environment and teaching tools influenced their learning experience in the Software
Metrics course.

Key Observations:

Student Background
A significant majority (86.7%) of respondents were from the 6th semester, indicating a focused and relevant participant base.

Initial Interest
Interest in Software Metrics was generally low to moderate:
33.3% were not interested at all
33.3% were neutral
This suggests the course had to overcome initial disinterest.

Classroom Environment
53.3% rated the physical environment as average
Only 6.7% found it excellent
A high percentage (46.7%) disagreed that the environment helped them stay focused.

Lighting and Distractions
53.3% rated lighting as average
46.7% said distractions occurred often
These indicate a non-optimal learning environment for many students.

Audibility & Instructor Clarity
57.1% found the instructor mostly clear
21.4% reported the instructor was sometimes unclear, which may affect content delivery.



4.2 Discussion of Patterns and Insights

Pattern 1: Tool Usage vs Learning Impact

Tools like slides and projectors were used frequently (86.7%).

60% of students said slides helped them learn best, whereas tools like videos were underused (13.3%).

Despite tool availability, only 40% found them effective for difficult topics, and 0% found them very effective.

Insight: Tools were present, but perhaps not used interactively or contextually enough to enhance understanding.

Pattern 2: Confidence & Performance


Only 6.7% felt very confident about their understanding.

66.7% felt neutral, and 13.3% were slightly unconfident.

Only 20% reported real improvement in performance, while 46.7% showed slight
improvement.

Insight: Teaching methods may not have significantly boosted learning confidence or
performance for most students.

Pattern 3: Teaching Style and Engagement


73.3% were neutral about whether the instructor's style helped understanding.

Open feedback showed that students wanted more interaction and practical tools, with some even suggesting that the course be removed.

Insight: The course lacked engaging pedagogy and practical application, as reflected in multiple feedback entries.

4.3 Metric Effectiveness Evaluation


The selected metrics were designed to cover:

Area of Evaluation    Metric/Question Used                     Effectiveness
Classroom Quality     Environment, lighting, distractions      Effective
Tool Usage            Frequency and usefulness of tools        Effective
Teaching Delivery     Instructor clarity, style, engagement    Effective
Student Outcome       Confidence, improvement, application     Effective
Student Feedback      Suggestions and likes/dislikes           Effective

All chosen metrics helped gather measurable, relevant data. However, to deepen the analysis, future metrics could include:

Objective performance measures (quiz/assignment scores), which could be correlated with the survey responses (see the sketch below).
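
A minimal sketch of how such a correlation could be computed, assuming quiz scores and numerically coded confidence ratings were available for the same 15 students; all values in the example are invented for illustration.

```python
# Sketch: correlate self-reported confidence (Likert, coded 1-5) with quiz
# scores. All numbers below are illustrative, not real survey or exam data.
from scipy.stats import spearmanr

confidence = [3, 4, 2, 3, 5, 3, 2, 4, 3, 3, 1, 4, 3, 2, 3]      # hypothetical
quiz_score = [62, 78, 55, 60, 85, 64, 50, 74, 66, 61, 45, 80, 63, 52, 67]

rho, p_value = spearmanr(confidence, quiz_score)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```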

4.4 Summary of Findings


Focus Area        Key Takeaway
Environment       Mostly average and distracting
Tools             Slides dominant; others underused
Teaching Style    Neutral to mildly negative perception
Confidence        Mostly neutral with minor gains
Practicality      Lacking, based on feedback

Chapter 5: Conclusion

5.1 Summary of Overall Findings


Through this study, we set out to evaluate how the classroom environment and teaching
tools influence student performance in the Software Metrics course. Using a structured
survey of 15 participants, we identified that:

Most students came from the 6th semester and had neutral or low interest in the course from the beginning.



The physical classroom environment was generally rated as average, and frequent
distractions such as noise or heat negatively affected focus.

Tools such as PowerPoint slides and projectors were widely used, but students found
them moderately helpful. More interactive or visual aids (like simulations or videos)
were desired.

Students felt neutral about their improvement and confidence levels. The teaching
style, while clear to some, did not significantly impact most learners.

The findings suggest a clear gap between the teaching setup and student engagement. Even
though traditional tools were in place, the delivery method and classroom conditions limited
the courseʼs impact.

5.2 Learning Through the GQM Approach


The GQM (Goal-Question-Metric) approach provided a clear framework for this investigation:

Goal: To assess how classroom settings and teaching tools affect student learning outcomes in Software Metrics.

Questions: We asked specific, measurable questions about environment, tool usage, teaching methods, and learning results.

Metrics: Each question generated useful data that we could interpret in percentage terms, helping us uncover real patterns.

By using GQM, we were able to connect abstract goals (like improving student
understanding) with concrete evidence (like tool effectiveness, confidence levels, and
engagement rates). This approach made the entire evaluation process objective, focused, and
actionable.

5.3 Future Improvements and Suggestions


Based on the survey responses and analysis, several key improvements are recommended
for future offerings of this course:

Improve Classroom Environment:


Address lighting, noise control, and seating comfort to create a more focused learning
space.



Enhance Teaching Style:

Adopt more interactive teaching techniques such as case studies, simulations, flipped
classrooms, and real-time feedback.

Integrate Practical Tools:

Use actual software metric tools in assignments and projects, so students can apply
what they learn in a real-world context.

Increase Engagement:

Add more 1-to-1 interactions, open discussions, and peer activities to maintain interest
and understanding.

Diversify Digital Content:

Include educational videos, quizzes, and visualizations to better explain difficult topics.

Collect More Data:

Include academic performance data (e.g., quiz results) to correlate with survey
responses and get deeper insights.

5.4 Final Reflection


This project taught us the value of structured evaluation using metrics. It also highlighted the importance of student feedback in shaping educational practices. Even a well-planned course can fail to engage learners if it lacks practical relevance and environmental support. The GQM method helped convert assumptions into facts and gave us a clear roadmap for improving learning experiences in any technical subject.
