SDLC Documentation: User Guide & Evaluation
The final stages of the Software Development Life Cycle (SDLC) involve delivering the product to the user and assessing its success. A well-written User Guide ensures the software is usable, while a critical Evaluation demonstrates your analytical skills to examiners.
Learning Objectives
- 12.2.2.1 Develop a comprehensive user guide for a software project
- 12.2.2.2 Evaluate the final project against the initial User Requirements Specification (URS)
- 12.2.2.3 Identify system limitations and propose future improvements
Part 1: The User Guide
The User Guide (or User Manual) is written for the end-user. Unlike your technical design documentation, it does not cover algorithms, database structures, or code; the end-user only wants to know how to install and use the software.
Essential Components
| Section | What to Include |
|---|---|
| Hardware & Software Requirements | Minimum OS version, required RAM, disk space, and any required third-party software (e.g., Python 3.10+, SQLite). |
| Installation Guide | Step-by-step instructions on how to set up the software. Include steps for database initialization if applicable. |
| Operation / How to Use | Clear instructions on performing core tasks (e.g., "How to add a new user", "How to generate a report"). Screenshots are mandatory here. |
| Troubleshooting & Error Messages | A table explaining common error messages the user might encounter and how they can fix them. |
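The database-initialization step mentioned in the Installation Guide row can often be reduced to shipping a small setup script that the user runs once. A minimal sketch in Python with SQLite (the `school.db` filename and `students` schema are hypothetical, not from any specific project):

```python
import sqlite3

def init_database(path="school.db"):
    """Create the database file and its tables on first run (hypothetical schema)."""
    conn = sqlite3.connect(path)  # creates the file if it does not exist
    conn.execute(
        """CREATE TABLE IF NOT EXISTS students (
               student_id TEXT PRIMARY KEY,
               name       TEXT NOT NULL
           )"""
    )
    conn.commit()
    conn.close()

init_database()
```

Because of `CREATE TABLE IF NOT EXISTS`, the script is safe to run repeatedly, which keeps the installation instructions simple for the end-user.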
Screenshot Rules
Do not just paste full-screen images. Crop screenshots to focus on the specific button or menu you are describing. Add red circles or arrows to direct the user's attention.
Part 2: Project Evaluation
The Evaluation is the final chapter of your project report. It proves to the examiner that you can objectively analyze your own work. A common mistake is simply writing "I liked this project and it works well." You must be critical.
Evaluating Against URS
You must compare your finished product against the User Requirements Specification (URS) you defined at the very beginning of the project.
Example: URS Evaluation Table
| Req ID | Original Requirement | Met? | Evidence / Justification |
|---|---|---|---|
| REQ-01 | System must allow admin to search students by ID. | Fully | Implemented via binary search on the Student form (See User Guide page 4). |
| REQ-02 | System must generate automated weekly email reports. | Partially | Reports can be generated manually, but the automated email trigger was not implemented due to time constraints. |
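Evidence claims like the one for REQ-01 should point at something concrete and checkable. As an illustration only (not the actual project code), a binary search over a sorted list of student IDs could look like this:

```python
def find_student(sorted_ids, target):
    """Binary search: return the index of target in sorted_ids, or -1 if absent.

    Assumes sorted_ids is already sorted in ascending order.
    """
    lo, hi = 0, len(sorted_ids) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_ids[mid] == target:
            return mid
        elif sorted_ids[mid] < target:
            lo = mid + 1   # target lies in the upper half
        else:
            hi = mid - 1   # target lies in the lower half
    return -1

print(find_student(["S001", "S002", "S005", "S009"], "S005"))  # → 2
```

Citing the relevant User Guide page alongside code like this, as REQ-01 does, lets the examiner verify the claim in seconds.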
Limitations & Future Improvements
Examiners award high marks to students who recognize the flaws in their systems. No software is perfect. You must explicitly state what your program cannot do, and how you would address those gaps if you had another six months.
- Limitations: "The current hashing algorithm is weak", "The GUI is not responsive on mobile screens", "Data is stored locally, preventing multi-user access".
- Future Improvements: "Migrate the SQLite database to a cloud-hosted PostgreSQL server", "Implement 2FA (Two-Factor Authentication) for admin logins".
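A proposed improvement such as "replace the weak hashing algorithm" carries more weight if you can sketch what the fix would look like. A hedged sketch using Python's standard-library PBKDF2 (iteration count and salt size here are illustrative, not a security recommendation):

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative; real deployments should follow current guidance

def hash_password(password):
    """Return (salt, key) using PBKDF2-HMAC-SHA256 with a random 16-byte salt."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, key

def verify_password(password, salt, key):
    """Re-derive the key and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, key)
```

Even a short sketch like this shows the examiner you understand *how* the improvement would be implemented, not just that it is needed.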
Common Pitfalls
Including Code in the User Guide
End-users do not understand Python or SQL. Never put code snippets in the User Guide. Keep all technical details in the Design and Development sections of your report.
Fake Perfection in Evaluation
Claiming your program is "perfect and has no limitations" will cost you marks. Examiners know student projects have flaws. Highlighting them shows maturity as a developer.
Practical Tasks
Activity 1: Write an Operation Guide. Pick an app on your phone (e.g., WhatsApp). Write a 4-step manual on "How to block a contact". Take cropped screenshots for each step and insert them into a Word document.
Activity 2: URS Matrix. Review your current CS project. Create an Evaluation Table with at least 3 requirements. Honestly assess whether they were Fully, Partially, or Not Met, and provide justification.
Self-Check Quiz
Q1: Who is the target audience for the User Guide, and how does this affect the writing style?
Q2: Why is it important to include a Troubleshooting section?
Q3: Why do examiners award high marks for listing system limitations?