Assignment 2 guide
General guidelines
The average length for Assignment 2 is 30 pages.
Font: Calibri, size 12, line spacing 1.3. Use of headings, paragraphs, subsections, and
illustrations as appropriate is required.
Complete all required fields on the front page, including a scanned digital signature.
o Table of contents
o List of Figures
o List of Tables
Introduction
o Introduce the project and explain what it is about
o Explain the problems that need to be addressed and propose solutions to them
o Outline the important parts of the assignment you will address
Body of the report
o P2, P3, P4, P5, M2, M3, M4, M5, D2, D3
Conclusion
o A summary of the entire assignment that brings the report to a satisfying and logical end.
Evaluation
o You must provide your opinion or verdict on whether an argument, or set of research
findings, is accurate.
o You can include a SWOT analysis of the course.
References
o In-text citations (sources of information) are required
o You must use Harvard referencing style for this report.
Assignment criteria
P2 - Develop the database system with evidence of user interface, output and data validations, and
querying across multiple tables.
Develop a user interface (UI) for users to interact with the database (e.g., forms, menus).
Design outputs (reports, visualizations) to present data to users in a meaningful way.
Implement data validation rules to ensure data accuracy and consistency during input.
Develop queries that can retrieve and manipulate data across multiple tables.
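A minimal T-SQL sketch of the validation and multi-table querying points above. The Customers, Orders, and Products tables and their columns are illustrative assumptions, not part of the brief:

    -- Data validation: a CHECK constraint rejects non-positive quantities at input.
    ALTER TABLE Orders
        ADD CONSTRAINT CK_Orders_Quantity CHECK (Quantity > 0);

    -- Querying across multiple tables: total spend per customer for an output report.
    SELECT c.CustomerName,
           SUM(o.Quantity * p.UnitPrice) AS TotalSpend
    FROM Customers AS c
    JOIN Orders    AS o ON o.CustomerID = c.CustomerID
    JOIN Products  AS p ON p.ProductID  = o.ProductID
    GROUP BY c.CustomerName
    ORDER BY TotalSpend DESC;

The same JOIN pattern can feed the UI's report screens; in your evidence, pair each query with a screenshot of the form or report it drives.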
P3 - Implement a query language into the relational database system
Assess the suitability of the T-SQL query language for sales software applications.
Implement data connection and execute queries to fulfill software functionalities.
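One way to evidence the data-connection point is to wrap each query in a parameterized stored procedure that the application calls over its open connection. The procedure, table, and column names below are hypothetical:

    CREATE PROCEDURE dbo.usp_GetOrdersByCustomer
        @CustomerID INT
    AS
    BEGIN
        SET NOCOUNT ON;
        -- The application passes @CustomerID as a parameter over its connection,
        -- avoiding string concatenation and the SQL injection risk it carries.
        SELECT o.OrderID, o.OrderDate, o.Quantity
        FROM Orders AS o
        WHERE o.CustomerID = @CustomerID
        ORDER BY o.OrderDate DESC;
    END;

    -- The application layer then executes, for example:
    -- EXEC dbo.usp_GetOrdersByCustomer @CustomerID = 42;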
M2 - Implement a fully functional database system, which includes system security and database
maintenance.
The deployed system requires each user to log in before use.
The system implements secure password encryption during login and maintains clear
authorization levels based on user roles (a T-SQL sketch follows this list).
Develop a comprehensive backup and recovery plan to ensure data integrity during system
failures.
Schedule regular maintenance tasks to optimize performance and proactively address
potential issues.
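A sketch of the login-security and backup points, assuming a hypothetical Users table. HASHBYTES and BACKUP DATABASE are standard T-SQL; the salting scheme shown is one option among several (many teams hash in the application layer with bcrypt instead):

    -- Store a salted SHA-256 hash, never the plain-text password.
    DECLARE @Salt UNIQUEIDENTIFIER = NEWID();
    INSERT INTO Users (Username, Salt, PasswordHash, Role)
    VALUES ('alice', @Salt,
            HASHBYTES('SHA2_256', CONCAT(CAST(@Salt AS NVARCHAR(36)), N'S3cret!')),
            'Manager');

    -- Login check: recompute the hash with the stored salt and compare.
    -- (In the real system the password arrives as a parameter, not a literal.)
    SELECT Role
    FROM Users
    WHERE Username = 'alice'
      AND PasswordHash = HASHBYTES('SHA2_256',
            CONCAT(CAST(Salt AS NVARCHAR(36)), N'S3cret!'));

    -- One element of the backup plan: a full backup, typically scheduled nightly.
    BACKUP DATABASE SalesDB
    TO DISK = 'D:\Backups\SalesDB_full.bak'
    WITH INIT, CHECKSUM;

The Role column returned at login is what drives the authorization levels mentioned above.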
M3 - Assess whether meaningful data has been extracted through the use of query tools to produce
appropriate management information.
Use the software's functions to collect data, then create a comparison table between the
current data and the data that the requesting user needs collected.
Assess the level of compatibility between the current data and the data the requesting user
needs; the query sketch below shows the kind of management information to extract for this
comparison.
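As an illustration of extracting management information with a query tool, using the same hypothetical sales tables as earlier:

    -- Monthly revenue per product: typical management information
    -- pulled from the operational tables for the comparison exercise.
    SELECT p.ProductName,
           YEAR(o.OrderDate)  AS OrderYear,
           MONTH(o.OrderDate) AS OrderMonth,
           SUM(o.Quantity * p.UnitPrice) AS Revenue
    FROM Orders   AS o
    JOIN Products AS p ON p.ProductID = o.ProductID
    GROUP BY p.ProductName, YEAR(o.OrderDate), MONTH(o.OrderDate)
    ORDER BY OrderYear, OrderMonth, Revenue DESC;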
P4 - Test the system against user and system requirements.
Identify elements of the system that need to be tested. Consider data that should be used to
fully test the system.
Match tests against user and system requirements.
Test procedures to be used: test plans, test models (e.g., structural testing, functional
testing), and testing documentation; a scripted functional test is sketched below.
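A functional test can be scripted directly in T-SQL. This sketch checks that the quantity validation rule from the earlier P2 sketch actually rejects bad input (table and column names again assumed):

    -- Functional test: inserting an invalid quantity must fail.
    BEGIN TRY
        INSERT INTO Orders (CustomerID, ProductID, Quantity, OrderDate)
        VALUES (1, 1, -5, GETDATE());
        PRINT 'FAIL: invalid quantity was accepted';
    END TRY
    BEGIN CATCH
        -- Error 547 is a constraint violation, the expected outcome here.
        PRINT CASE WHEN ERROR_NUMBER() = 547
                   THEN 'PASS: CHECK constraint rejected the row'
                   ELSE CONCAT('FAIL: unexpected error ', ERROR_NUMBER()) END;
    END CATCH;

Record each such script, its expected outcome, and its actual outcome in your test documentation.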
M4 - Assess the effectiveness of the testing, including an explanation of the choice of test data used.
Analyse test coverage to identify areas where user interactions are not adequately tested.
Example: You have a function for calculating shipping costs based on weight and location. Test
coverage shows that only basic weight values are tested, but not edge cases (e.g., very heavy
items) or specific locations.
Assess the effectiveness of the chosen test data in uncovering potential issues within the
system. Example: Login tests only use valid usernames and passwords. There should be
additional tests for invalid inputs (e.g., empty username, incorrect password format) to
simulate real-world user behavior.
Recommend improvements to the testing strategy for future iterations, focusing on areas
with low coverage or insufficient data variation. Example: For the shipping cost calculation,
create test cases with very high weight values and invalid location formats to ensure proper
handling of edge cases.
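To make the shipping-cost example concrete, edge-case test data can be tabulated and run in one pass, assuming the function exists as a hypothetical dbo.fn_ShippingCost:

    -- Edge-case test data for a hypothetical shipping-cost function.
    DECLARE @Tests TABLE (WeightKg DECIMAL(8,2), Location NVARCHAR(50));
    INSERT INTO @Tests VALUES
        (0.01,   N'Hanoi'),     -- minimum weight boundary
        (999.99, N'Hanoi'),     -- very heavy item
        (10.00,  N''),          -- empty location
        (10.00,  N'Atlantis');  -- location missing from the rate table

    SELECT WeightKg, Location,
           dbo.fn_ShippingCost(WeightKg, Location) AS Cost
    FROM @Tests;
    -- Expected: boundary weights priced correctly; unknown or empty locations
    -- should return NULL or a handled error, never a plausible-looking price.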
D2 - Evaluate the effectiveness of the database solution in relation to user and system
requirements and suggest improvements.
Compare the final system to initial user and system requirements (Acceptance Testing):
This is a formal evaluation stage where stakeholders (users, developers, and managers)
come together to assess if the final system fulfills the original goals outlined in the user
and system requirements documents.
Ex: Let's say a user requirement specified the ability to generate reports with various data
filters. During acceptance testing, users would verify if the system offers report
generation functionality with the promised filter options.
Assess the overall effectiveness of the database solution (User Satisfaction Survey):
This involves gathering user feedback on their experience with the completed system.
This can be done through surveys, interviews, or focus groups. The goal is to understand
how well the system addresses user needs and how satisfied users are with its
functionalities.
Ex: A user satisfaction survey could ask questions about the ease of use, efficiency of data
retrieval, and overall usefulness of the system for completing tasks. Analyzing the
feedback helps identify areas where the system might need adjustments to better serve
user needs.
Evaluate the system's performance, security, and maintainability (Performance Testing &
Security Audit):
This stage involves a comprehensive evaluation of the system across multiple aspects.
o Similar to the process described earlier, you'd measure factors like data retrieval
speed, response time, and system stability under various load conditions (e.g., many
users accessing simultaneously).
o A security audit involves a systematic review of the system's security measures (e.g.,
user authentication, data encryption) to identify potential vulnerabilities and ensure
adequate protection against unauthorized access or data breaches.
o Maintainability evaluation focuses on how easily the system can be modified, updated,
and debugged in the future. Factors like code clarity, documentation quality, and
modular design all contribute to good maintainability.
Ex: Performance testing might reveal a bottleneck in data processing during peak system
usage. A security audit could identify a weak password policy that needs strengthening.
Evaluating maintainability might highlight poorly documented code sections that could
hinder future updates.
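For the performance part, SQL Server's built-in dynamic management views give a quick picture of where time is going. sys.dm_exec_query_stats is standard; the TOP (5) cut-off is an arbitrary choice:

    -- Five slowest cached queries by average elapsed time (microseconds).
    SELECT TOP (5)
           qs.execution_count,
           qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
           SUBSTRING(st.text, 1, 100) AS query_start
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY avg_elapsed_us DESC;

Queries that surface here repeatedly are natural candidates for the bottleneck discussion in the example above.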
Identify areas for improvement based on the evaluation results:
After analyzing the results from acceptance testing, user satisfaction surveys, and
performance/security/maintainability evaluations, you'd identify areas where the system
falls short of expectations. These could be limitations in functionality, usability issues,
performance bottlenecks, or security vulnerabilities.
Recommend specific enhancements for future development (System Improvement
Roadmap):
Based on the identified areas for improvement, you'd create a roadmap for future system
development. This roadmap would prioritize enhancement suggestions and outline a plan
for implementing them in future versions of the database system.
Ex: Improvement recommendations could involve adding new reporting features,
improving search functionalities based on user feedback, or implementing performance
optimizations identified during testing.
P5, M5 - Produce technical and user documentation for a fully-functional system, including data flow
diagrams and flowcharts, describing how the system works.
Produce technical documentation for developers:
System architecture diagrams
Database schema details (a schema-extraction sketch follows this section)
Query documentation
Programming code documentation (if applicable)
Produce user documentation for end-users:
User manuals with clear instructions on how to use the system
Tutorials and guides for specific functionalities
FAQs (Frequently Asked Questions) to address common user issues
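Database schema details can be extracted straight from the catalog for the technical documentation. INFORMATION_SCHEMA.COLUMNS is standard; how you format the output is up to you:

    -- Column-level schema details for the technical documentation.
    SELECT TABLE_NAME,
           COLUMN_NAME,
           DATA_TYPE,
           CHARACTER_MAXIMUM_LENGTH,
           IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    ORDER BY TABLE_NAME, ORDINAL_POSITION;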
D3 - Evaluate the database in terms of improvements needed to ensure the continued effectiveness
of the system.
Monitor system usage and user feedback to identify potential issues.
Analyze database performance metrics (e.g., query execution times, storage utilization); a
query sketch follows this list.
Evaluate the effectiveness of data security measures.
Assess the overall maintainability of the system based on documentation and code clarity.
Identify areas for improvement based on the evaluation results.
Recommend changes to the database system or documentation to ensure its continued
effectiveness.
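A sketch for the performance-metrics bullet, using standard catalog views; the thresholds that count as "needs improvement" are yours to justify:

    -- Storage utilization per table: row counts and reserved space.
    SELECT t.name AS table_name,
           SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS row_count,
           SUM(ps.reserved_page_count) * 8 AS reserved_kb
    FROM sys.dm_db_partition_stats AS ps
    JOIN sys.tables AS t ON t.object_id = ps.object_id
    GROUP BY t.name
    ORDER BY reserved_kb DESC;

Tables growing faster than expected are early candidates for archiving or index review in your improvement recommendations.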