21CSC309J SOFTWARE ARCHITECTURE AND DESIGN
UNIT-3
Unit-3 - Software Architecture Analysis
Evaluating a Software Architecture; Evaluate Architecture; What Qualities Can We
Evaluate in an Architecture?; Outputs of an Architecture Evaluation;
Evaluating the Architecture – ATAM; Participants and Outputs of ATAM;
Phases of ATAM; Case Study for ATAM; Evaluating the Architecture – CBAM;
Decision-Making Context; Basis for the CBAM; Evaluating Software
Architecture – SAAM; SAAM Evaluation Process; Evaluating Software
Architecture – ARID; ARID Evaluation Process
Why Evaluate Software Architecture?
• Ensures alignment with business, technical, and operational needs
• Identifies risks, inefficiencies, and areas for improvement
• Prevents costly issues before implementation or scaling
1. Define Evaluation Goals
• Does it meet business needs?
• Is it scalable, secure, and maintainable?
• Does it support performance requirements?
• Is it cost-effective?
2. Use Established Evaluation Methods
• ATAM – Identifies trade-offs and risks
• SAAM – Evaluates modifiability and flexibility
• CBAM – Assesses cost vs. benefits
• DRAS – Focuses on reliability, availability, and security
3. Evaluate Key Architectural Attributes
• Performance & Scalability
• Maintainability & Modularity
• Security & Compliance
• Reliability & Availability
• Technology Stack & Compatibility
• Cost Efficiency & Resource Utilization
Key Attribute: Performance & Scalability
• Can the system handle increased loads?
• What are the response times under stress?
• Are there bottlenecks in data processing or network communication?
Key Attribute: Maintainability & Modularity
• Is the architecture well-structured for future changes?
• Are components loosely coupled for easier maintenance?
• Is there proper documentation?
Key Attribute: Security & Compliance
• Does it follow security best practices? (e.g., OWASP, encryption)
• Are data access controls and authentication in place?
• Does it comply with GDPR, HIPAA, etc.?
Key Attribute: Reliability & Availability
• Are redundancy and failover mechanisms in place?
• How does the system handle failures?
• Is there monitoring and logging for early issue detection?
Key Attribute: Cost Efficiency & Resource Utilization
• Is the system over-provisioned, leading to unnecessary costs?
• Can cloud resources be optimized?
• What are the licensing and maintenance costs?
4. Conduct Reviews & Testing
• Architecture Review Boards
• Prototyping
• Load Testing & Stress Testing
• Security Audits & Penetration Testing
5. Gather Feedback & Iterate
• Collect stakeholder feedback
• Adjust architecture based on real-world insights
• Document lessons learned for future reference
3.2 Evaluate Architecture
Why Evaluate Software Architecture?
• Ensures quality, risk mitigation, and effectiveness
• Aligns with business and technical requirements
• Identifies potential improvements before deployment
1. Define Evaluation Criteria
• Performance (response time, scalability)
• Maintainability (modularity, ease of updates)
• Security (authentication, data protection)
• Reliability (fault tolerance, failover mechanisms)
• Cost Efficiency (resource utilization, operational costs)
2. Choose an Evaluation Methodology
• ATAM – Identifies trade-offs and risks
• SAAM – Assesses modifiability and change impact
• CBAM – Evaluates cost-effectiveness
• Security Audits – Examines vulnerabilities and compliance
3. Assess Key Architectural Aspects
• Structural Analysis – Examine component interactions
• Performance & Scalability – Load testing, caching strategies
• Security Evaluation – Penetration testing, encryption
• Maintainability & Extensibility – Code complexity, documentation
• Reliability & Availability – Fault tolerance, error handling
Key Aspect: Structural Analysis
• Examine how components interact (microservices, monolithic,
layered)
• Ensure modularity and separation of concerns
Key Aspect: Performance & Scalability
• Conduct load and stress testing
• Evaluate database performance and caching strategies
Key Aspect: Security Evaluation
• Perform penetration testing
• Ensure secure API design and data encryption
Key Aspect: Maintainability & Extensibility
• Review codebase complexity and technical debt
• Check documentation and adherence to coding standards
Key Aspect: Reliability & Availability
• Assess fault tolerance and disaster recovery plans
• Test failover mechanisms and error-handling capabilities
4. Conduct Architecture Reviews
• Stakeholder Interviews – Feedback from developers, architects, and
users
• Prototyping & Proof-of-Concepts – Validate design choices
• Peer Reviews & Walkthroughs – Identify gaps and improvements
5. Document Findings & Provide Recommendations
• Highlight strengths and weaknesses
• Suggest improvements for scalability, security, and maintainability
• Develop an action plan for optimizations and risk mitigation
3.3 What Qualities Can We Evaluate in an Architecture?
1. Performance & Scalability
• Can the system handle increased load and user demand?
• Response times under various conditions
• Supports horizontal or vertical scaling
• Tools: JMeter, Apache Benchmark
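A tool like JMeter produces latency samples; judging the architecture then comes down to checking a percentile against the target. A minimal sketch (the samples and the 500 ms target here are illustrative, not from a real benchmark):

```python
import math

def percentile(samples_ms, pct):
    """Nearest-rank percentile of latency samples, in the same unit (ms)."""
    ordered = sorted(samples_ms)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical response times collected under load (milliseconds).
samples = [180, 210, 195, 230, 520, 205, 190, 1400, 220, 200]
p95 = percentile(samples, 95)
print(f"p95 latency: {p95} ms ({'PASS' if p95 <= 500 else 'FAIL'} vs. 500 ms target)")
```

Note that the median here looks healthy while the tail fails the check, which is exactly why percentile targets, not averages, are used for performance evaluation.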
2. Maintainability & Modularity
• Structured for easy modifications
• Loose coupling & high cohesion
• Tools: SonarQube, CodeClimate
3. Security
• Authentication & Authorization in place
• Protection against vulnerabilities (SQL Injection, XSS)
• Tools: OWASP ZAP, Burp Suite
4. Reliability & Availability
• Resilient to failures
• Failover & redundancy mechanisms
• Tools: Chaos Monkey, ELK Stack
5. Usability & Accessibility
• User-friendly & intuitive design
• Compliance with WCAG standards
• Tools: User Testing, Google Lighthouse
6. Interoperability & Integration
• Easy integration with APIs
• Standard communication protocols (REST, GraphQL)
• Tools: API Testing, Integration Testing
7. Cost Efficiency
• Optimized operational costs
• Cloud resource efficiency
• Tools: AWS Cost Explorer, Docker, Kubernetes
Real-World Example: E-Commerce Platform
Evaluating Performance
• Scenario: Handling high traffic during Black Friday
• Load Testing: 100,000 concurrent users
• Scalability Check: AWS Auto Scaling
• Findings: Improve CDN caching & database sharding
Evaluating Maintainability
• Scenario: Frequent product & payment updates
• Microservices for modular design
• CI/CD automation with Jenkins
• Findings: Migrate from monolithic architecture
Evaluating Security
• Scenario: Protecting payment details
• Security Audits using OWASP ZAP
• Encryption with AES-256
• Findings: Enforce RBAC
Evaluating Reliability
• Scenario: System must not crash on server failure
• Failover Testing & Database Replication
• Monitoring with ELK Stack
• Findings: Optimize load balancers & hot standby databases
Evaluating Usability
• Scenario: Easy navigation on all devices
• User Testing & Mobile Responsiveness
• WCAG Compliance checks
• Findings: Optimize UI & reduce load time
Evaluating Cost Efficiency
• Scenario: High cloud bills due to unused resources
• Cloud Cost Analysis with AWS Cost Explorer
• Migration to serverless architecture
• Findings: Optimize auto-scaling & use containerization
Final Takeaways
• If the system passes all evaluations → Well-Architected Solution
• If issues arise → Optimize with targeted improvements
3.4 Outputs of an Architecture Evaluation
Architecture Quality Assessment Report
• Summary of the architecture evaluation process.
• Overall assessment of system strengths and weaknesses.
• Recommendations for improvements.
• Example: The architecture is well-structured for scalability but
lacks caching, leading to performance issues during peak traffic.
Identified Risks & Trade-offs
• List of high-risk architectural decisions and their impact.
• Trade-offs between quality attributes (e.g., security vs.
performance).
• Example: NoSQL database improves scalability but reduces
complex query capabilities, affecting reporting.
Performance Benchmarking & Test Results
• Load and stress test results (response time, throughput, latency).
• Database query performance analysis.
• API response time evaluations.
• Example: Under 10,000 concurrent users, response time degrades
from 200ms to 1.5s, requiring optimization.
Security Audit Findings
• Detected security vulnerabilities.
• Compliance gaps (e.g., GDPR, HIPAA, PCI-DSS).
• Recommendations for fixing security issues.
• Example: API endpoints allow unauthenticated access. Implement
OAuth 2.0 for security.
Maintainability & Code Quality Metrics
• Code complexity scores (e.g., cyclomatic complexity).
• Technical debt analysis.
• Documentation review.
• Example: High coupling between modules makes updates difficult.
Refactoring recommended.
Cost Analysis & Optimization Suggestions
• Cloud resource utilization report.
• Infrastructure cost breakdown and optimization strategies.
• Example: Reducing unused cloud instances can save 20% on costs.
Architecture Decision Log (ADL)
• Key decisions made during the evaluation.
• Justifications and alternatives considered.
• Impact analysis of each decision.
• Example: Chose microservices for flexibility despite added
complexity in service communication.
Actionable Recommendations & Roadmap
• Prioritized improvements (short-term, mid-term, long-term).
• Suggested refactoring or redesign strategies.
• Timeline and resource estimates for implementation.
• Example: Short-term: Implement caching (Redis). Mid-term:
Optimize indexing. Long-term: Migrate to service-oriented
architecture.
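The short-term caching recommendation follows the cache-aside pattern. A minimal in-process sketch of that idea (a real deployment would use a shared store such as Redis; `get_product` and the loader callback are illustrative names):

```python
import time

class TTLCache:
    """Tiny in-process cache with per-entry expiry; a stand-in for the
    idea behind the Redis recommendation, not a production cache."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.monotonic() >= expires:   # lazily evict stale entries
            del self._store[key]
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

def get_product(cache, product_id, load_from_db):
    """Cache-aside read: try the cache first, hit the database on a miss."""
    product = cache.get(product_id)
    if product is None:
        product = load_from_db(product_id)  # expensive query only on a miss
        cache.set(product_id, product)
    return product
```

Repeated reads within the TTL are served from memory, which is the effect the caching recommendation is after.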
3.5 Architecture Tradeoff Analysis Method (ATAM)
Introduction to ATAM
• ATAM evaluates software architecture quality attributes (e.g.,
performance, availability, security).
• It mitigates risks early in the software development life cycle (SDLC).
Participants in ATAM
• Evaluation Team: 3-5 external members assessing architecture.
• Project Decision-Makers: Authority to mandate changes.
• Architecture Stakeholders: Users, maintainers, developers, testers,
etc.
Process of ATAM
• Identifies business drivers like system goals and constraints.
• Develops quality attributes and business scenarios.
• Analyzes trade-offs, sensitivity points, and risks.
• Gradually refines architecture through repeated cycles.
Steps of ATAM
1. Present ATAM – Explain the process to stakeholders.
2. Present Business Drivers – Identify system goals.
3. Present Architecture – Overview of the system.
4. Identify Architectural Approaches – Discuss the proposed architecture.
5. Generate Quality Attribute Utility Tree – Map requirements to properties.
6. Analyze Architectural Approaches – Prioritize scenarios.
7. Brainstorm Scenarios – Gather stakeholder input.
8. Analyze Architectural Approaches (again) – Refine with new insights.
9. Present Results – Review risks, trade-offs, and document findings.
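Steps 5 and 6 can be sketched in code: a utility tree maps quality attributes to scenarios, each rated for importance and difficulty (H/M/L), and analysis starts with the (H, H) leaves. The tree contents below are illustrative, not a prescribed ATAM dataset:

```python
# Rank H/M/L ratings so scenarios can be sorted for analysis.
RANK = {"H": 2, "M": 1, "L": 0}

# Quality attribute -> [(scenario, importance, difficulty)]
utility_tree = {
    "Performance":   [("Process 10,000 requests/sec at peak", "H", "H")],
    "Availability":  [("Recover from node failure within 30 s", "H", "M")],
    "Modifiability": [("Add a new payment provider in one sprint", "M", "M")],
    "Security":      [("Lock account after repeated failed logins", "M", "L")],
}

def prioritized(tree):
    leaves = [
        (attr, scenario, imp, diff)
        for attr, scenarios in tree.items()
        for scenario, imp, diff in scenarios
    ]
    # Analyze (H, H) scenarios first: important AND architecturally risky.
    return sorted(leaves, key=lambda s: (RANK[s[2]], RANK[s[3]]), reverse=True)

for attr, scenario, imp, diff in prioritized(utility_tree):
    print(f"({imp},{diff}) {attr}: {scenario}")
```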
Phases of ATAM
• Phase 0: Preparation – Planning, recruitment, and team formation.
• Phase 1: Steps 1-6 (1 day, followed by a 2-3 week break).
• Phase 2: Steps 7-9 (2 days, involving stakeholders).
• Phase 3: Follow-up – Report generation and process improvement (1
week).
Outputs of ATAM
• Concise architecture presentation (1-hour summary).
• Clear articulation of business goals.
• Prioritized quality attribute requirements as scenarios.
• Identification of risks (potential negative impacts).
• Identification of non-risks (decisions unlikely to cause issues).
• Risk themes – Systematic weaknesses in architecture, process, and
team.
• Mapping architectural decisions to quality requirements.
3.6 Participants and Outputs of ATAM, Phases of ATAM
Introduction to ATAM
• ATAM is a structured method for evaluating software architecture.
• Analyzes quality attributes, risks, trade-offs, and improvement
areas.
Participants in ATAM
• Project Sponsor: Defines business goals and key quality attributes.
• System Architect: Presents system architecture and design choices.
• Developers: Provide technical insights into implementation.
• Test Engineers: Share performance testing and security assessment results.
• Operations Team: Evaluates deployment, monitoring, and maintenance
concerns.
• End Users / Clients: Give feedback on usability, performance, and expected
behavior.
• ATAM Evaluation Team: Facilitates the process, identifies risks, and
analyzes trade-offs.
Phase 1: Presentation (Understanding the
System & Goals)
• Identify business goals and quality attributes (e.g., performance,
security, scalability).
• The architect presents the system architecture.
• Example: A banking app is evaluated for security and performance to
handle millions of transactions daily.
Phase 2: Investigation & Analysis (Identifying
Key Scenarios & Risks)
• Identify quality attribute scenarios (performance, security,
scalability).
• Prioritize high-impact scenarios that affect business objectives.
• Example Scenarios:
Performance: 'Can the system process 100,000 transactions per
second?'
Security: 'Does the application block unauthorized login attempts?'
Scalability: 'Does the system auto-scale during peak hours?'
Phase 3: Testing & Trade-Off Analysis
(Evaluating Risks & Trade-offs)
• Evaluate architecture against the identified scenarios.
• Identify risks, sensitivity points, and trade-offs.
Example Trade-offs:
Encryption (Security) vs. Faster Processing (Performance): Stronger
encryption may slow transactions.
Microservices (Scalability) vs. Complexity (Maintainability):
Microservices improve scalability but make debugging harder.
Phase 4: Reporting & Decision Making
(Providing Recommendations & Roadmap)
• Document findings, risks, trade-offs, and solutions.
• Present a final report with a roadmap for improvements.
Example Roadmap:
• Short-Term: Implement caching to improve transaction speed.
• Mid-Term: Optimize API security with OAuth 2.0.
• Long-Term: Migrate to serverless architecture for better scalability.
Outputs of ATAM Evaluation
• Architecture Quality Report: Summary of strengths, weaknesses, and
risks.
• Identified Risks & Trade-offs: List of high-risk areas and decisions
affecting system performance.
• Prioritized Scenarios: Key quality attributes ranked by importance.
• Mitigation Strategies: Suggested solutions to address risks.
• Final Report & Roadmap: Step-by-step plan for improving the
architecture.
3.7 CASE Study for ATAM
ATAM Evaluation of an E-Commerce Platform
Case Study Analysis
Business Goals & Quality Attributes
• ✅ High Performance – Ensure fast response times during peak load.
• ✅ Scalability – Handle a 5x increase in traffic during sales.
• ✅ Security – Prevent unauthorized access and cyberattacks.
• ✅ Availability – Maintain 99.99% uptime even during server failures.
Quality Attributes:
• Performance: Page load time ≤ 500ms
• Scalability: Handle 1 million concurrent users
• Security: Prevent SQL injection, DDoS
• Availability: Ensure failover mechanism
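The 99.99% uptime goal implies a concrete downtime budget, which is worth computing before evaluating the failover design. A quick check (non-leap year assumed):

```python
def downtime_budget_minutes(availability, minutes_per_year=365 * 24 * 60):
    """Minutes of allowed downtime per year implied by an uptime target."""
    return minutes_per_year * (1 - availability)

for target in (0.999, 0.9999, 0.99999):
    print(f"{target:.5f} uptime -> {downtime_budget_minutes(target):.1f} min/year")
```

At 99.99%, the platform may be down for only about 52.6 minutes in a whole year, so failover must be automatic rather than manual.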
Current System Architecture
• Frontend: [Link] (SPA)
• Backend: Microservices ([Link] & Spring Boot)
• Database: PostgreSQL (Primary), Redis (Cache)
• Authentication: OAuth 2.0 & JWT
• Infrastructure: AWS (Auto-Scaling EC2 + Load Balancer)
Key Scenarios & Prioritization
1. Performance – Crucial for user experience & conversion rates
2. Security – Protects sensitive customer data
3. Scalability – Ensures platform stability during high traffic
4. Availability – Prevents revenue loss due to downtime
Identified Risks & Trade-offs
• Slow Checkout Process – Optimize DB queries & use caching
• API Security Vulnerability – Implement rate limiting & WAF
• Slow Auto-Scaling – Switch to serverless (AWS Lambda)
• Trade-off: Data Consistency vs. Speed – Use event-driven updates
Final Recommendations & Roadmap
• Short-Term (0-3 months): Optimize DB, Implement WAF
• Mid-Term (3-6 months): Improve auto-scaling, Real-time inventory
updates
• Long-Term (6-12 months): Migrate to serverless, Introduce Zero
Trust Security Model
ATAM Impact & Summary
• ✔ Early risk detection: Identified performance, security, scalability
issues
• ✔ Better decision-making: Stakeholders understood trade-offs
• ✔ Improved software quality: System became more resilient, faster,
and secure
3.8 Evaluating the Architecture – CBAM (Cost Benefit Analysis
Method)
CBAM (Cost-Benefit Analysis Method)
An architectural evaluation method for cost-effective decision-making.
Why Use CBAM?
• Helps in prioritizing architecture improvements based on ROI
• Ensures cost-effective decision-making
• Analyzes trade-offs between cost, risk, and benefit
• Helps in long-term planning for architectural changes
CBAM Process: Phases & Steps
• CBAM consists of six phases combining qualitative and quantitative
evaluations:
1. Scenario Identification
2. Prioritization of Scenarios
3. Option Identification
4. Cost & Benefit Analysis
5. Decision Making
6. Implementation & Re-evaluation
Phase 1: Scenario Identification
• Stakeholders define architectural scenarios based on quality
attributes.
• Scenarios include performance, scalability, security, maintainability,
etc.
• Example:
• 'How can we reduce page load time from 1s to 500ms?'
• 'Can we improve security to handle 5 million transactions per day?'
Phase 2: Prioritization of Scenarios
• Business and technical teams rank scenarios based on their
importance to business goals.
• Each scenario is assigned a score based on business impact.
Example Prioritization:
1. Improve checkout performance - High Priority
2. Enhance API security - High Priority
3. Optimize cloud cost efficiency - Medium Priority
Phase 3: Option Identification
• Identify different architectural options for addressing the top
scenarios.
Example for 'Reduce page load time':
1. Implement CDN caching (Fast, low cost)
2. Upgrade database infrastructure (Expensive but scalable)
3. Optimize frontend assets (Cheaper but moderate impact)
Phase 4: Cost & Benefit Analysis
• Each option is analyzed for cost, benefits, risks, and feasibility.
Example Cost-Benefit Analysis:
1. CDN caching: $50K, High benefit, Low risk
2. Upgrade database: $200K, Medium benefit, Medium risk
3. Optimize frontend assets: $30K, Moderate benefit, Low risk
• CDN caching is selected due to high benefit at low cost.
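The ranking step can be sketched in code. CBAM proper elicits utility scores from stakeholders; in this hedged sketch the benefit scores (0-10) and risk discount factors are illustrative numbers chosen to mirror the example above:

```python
# Discount an option's benefit by how risky its delivery is (assumed factors).
RISK_DISCOUNT = {"Low": 1.0, "Medium": 0.7, "High": 0.4}

options = [
    # (name, cost in $K, benefit score 0-10, risk level)
    ("CDN caching",              50, 9, "Low"),
    ("Upgrade database",        200, 6, "Medium"),
    ("Optimize frontend assets", 30, 5, "Low"),
]

def desirability(cost_k, benefit, risk):
    """Benefit per unit cost, discounted by risk (a CBAM-style ratio)."""
    return benefit * RISK_DISCOUNT[risk] / cost_k

ranked = sorted(options, key=lambda o: desirability(*o[1:]), reverse=True)
for name, cost, benefit, risk in ranked:
    print(f"{name}: score {desirability(cost, benefit, risk):.3f}")
```

With these inputs CDN caching ranks first, matching the selection in the example: high benefit at low cost and low risk dominates the ratio.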
Phase 5: Decision Making
• Prioritize solutions that maximize benefit while minimizing cost and
risk.
• Create a roadmap for implementation.
• Example Decision:
Short-term: Implement CDN caching.
Mid-term: Optimize frontend assets.
Long-term: Upgrade database for future scalability.
Phase 6: Implementation & Re-evaluation
• Deploy solutions and measure performance improvements.
• Evaluate ROI to determine if further changes are needed.
Example Results:
✅ 40% reduction in page load time.
✅ 15% increase in conversion rates.
✅ $100K per year saved in cloud costs.
CBAM Case Study: E-commerce Platform
Problem: Slow page load times and high server costs.
CBAM Process Applied:
1. Scenario Identification: 'Reduce page load time to 500ms while lowering cloud
costs.'
2. Prioritization: High priority due to revenue impact.
3. Options Identified: CDN caching, database optimization, frontend compression.
4. Cost-Benefit Analysis: CDN caching chosen for high impact at low cost.
5. Decision: Implement CDN caching first.
6. Implementation & Review: Improved checkout speeds and reduced costs.
✅ 15% increase in sales.
✅ 30% reduction in cloud hosting costs.
3.9 Decision-Making Context in CBAM (Cost
Benefit Analysis Method)
CBAM Decision-Making Context
Understanding how CBAM supports informed architectural decisions.
What is the Decision-Making Context in CBAM?
• CBAM helps organizations evaluate cost, benefits, and risks.
• The decision-making context refers to the business and technical
environment affecting choices.
Factors Influencing Decision-Making
1. Business Goals & Market Demands
- Does the architecture support business expansion?
2. Quality Attributes Trade-offs
- Performance vs. Cost, Security vs. Usability
3. Cost Constraints
- What is the budget for improvements?
4. Technical Debt & Legacy Systems
- Should we refactor old code or migrate?
5. Risk Tolerance
- How much downtime or data loss is acceptable?
CBAM Decision-Making Steps
• Step 1: Define Key Scenarios
• Step 2: Prioritize Scenarios
• Step 3: Identify Solution Options
• Step 4: Cost-Benefit Analysis
• Step 5: Make a Decision & Implement
Step 1: Define Key Scenarios
• Identify critical architectural challenges based on business needs.
• Example: 'We need to handle 1 million users during peak hours.'
• Example: 'How can we reduce cloud costs without affecting performance?'
Step 2: Prioritize Scenarios
• Example Priority Ranking:
• 1. Improve checkout performance (High impact on sales & UX)
• 2. Enhance API security (Protects customer data)
• 3. Optimize server costs (Reduces expenses)
Step 3: Identify Solution Options
• Example Solutions for Checkout Performance:
• 1. CDN Caching – Improves speed with low cost.
• 2. Database Optimization – High performance but expensive.
• 3. Serverless Microservices – Scalable but increases complexity.
Step 4: Cost-Benefit Analysis
Example Cost-Benefit Comparison:
1. CDN Caching - $50K, High Benefit, Low Risk
2. Database Upgrade - $200K, Medium Benefit, Medium Risk
3. Microservices Migration - $300K, Moderate Benefit, High Risk
• CDN Caching is the best choice due to high benefit and low cost.
Step 5: Make a Decision & Implement
- Choose the most cost-effective and low-risk solution.
- Example Decision:
• Short-term: Implement CDN caching for immediate performance
boost.
• Mid-term: Optimize database queries.
• Long-term: Gradually migrate to serverless microservices.
Decision-Making Output in CBAM
1. Prioritized Scenarios - Ranked by business value.
2. Cost-Benefit Analysis Report - Comparing strategies.
3. Risk Analysis - Understanding potential risks.
4. Implementation Roadmap - Deployment timeline.
Final Thoughts: Why CBAM Helps Decision-Making
✅ Aligns technical decisions with business goals.
✅ Helps prioritize architecture changes based on ROI.
✅ Minimizes risks by evaluating trade-offs early.
✅ Provides a structured decision-making framework.
3.10 Basis for the CBAM (Cost Benefit Analysis Method)
Basis for the CBAM (Cost Benefit Analysis Method)
Understanding the economic foundations and decision-making principles of CBAM
Core Foundations of CBAM
• Economic Decision Theory
• Utility Theory
• Cost-Benefit Analysis
• Trade-Off Analysis
• Sensitivity Analysis
Economic Decision Theory
• Organizations allocate resources based on ROI.
• Cost-benefit analysis helps choose cost-effective solutions.
Example:
• Upgrading a database costs $200K but increases revenue by $500K.
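The database example above, worked as a simple ROI calculation (the slide's figures; no discounting or time horizon assumed):

```python
def roi(cost, gain):
    """Return on investment as a fraction: (gain - cost) / cost."""
    return (gain - cost) / cost

cost, gain = 200_000, 500_000
print(f"Net benefit: ${gain - cost:,}, ROI: {roi(cost, gain):.0%}")
```

A $200K upgrade that produces $500K in revenue yields a $300K net benefit, i.e., a 150% return, which is the kind of figure CBAM compares across options.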
Utility Theory
• CBAM measures the value of architectural strategies.
• Decisions balance expected benefits and risks.
Example:
• Improving performance may increase cloud costs but boost user
engagement.
Cost-Benefit Analysis
• Each option is evaluated for cost, benefit, and risk.
• Stakeholders prioritize maximum benefit at the lowest cost.
Example:
CDN Caching: $50K cost, 9/10 benefit, Low risk.
Trade-Off Analysis
• Helps balance competing priorities.
Example:
• Faster performance (CDN) improves load time but increases costs.
• Stronger security (MFA) improves safety but impacts usability.
Sensitivity Analysis
• Measures how external changes impact architecture.
Example:
• If cloud costs increase by 50%, will the system remain cost-effective?
Case Study: XYZ Retail
• Problem: High costs and slow performance.
• Goal: Improve speed, reduce cloud expenses.
• Solution: CDN caching, DB optimization, microservices.
Cost-Benefit Analysis of Solutions
• CDN Caching: $50K, 9/10 benefit, Low risk.
• DB Optimization: $150K, 8/10 benefit, Medium risk.
• Microservices: $500K, 7/10 benefit, High risk.
Implementation Plan
• Short-term: Deploy CDN caching.
• Mid-term: Optimize database.
• Long-term: Migrate to microservices.
Evaluation of Results
• Checkout load time reduced by 60%.
• Cart abandonment dropped by 33%.
• Cloud costs reduced by 20%.
• Sales conversion increased.
3.11 Evaluating Software Architecture – SAAM (Software
Architecture Analysis Method)
Evaluating Software Architecture – SAAM
Software Architecture Analysis Method
What is SAAM?
• SAAM (Software Architecture Analysis Method) is an early
architecture evaluation method that focuses on analyzing a system’s
modifiability and adaptability by comparing different architectural
options.
• ✅ Main Goal: Evaluate modifiability, extensibility, and functional
suitability.
• ✅ Best Used For: Assessing if an architecture supports future
changes with minimal cost.
Key Characteristics of SAAM
• 🔹 Focuses on modifiability and extensibility.
• 🔹 Uses scenarios to evaluate impact.
• 🔹 Helps compare architectural designs.
• 🔹 Emphasizes stakeholder collaboration.
Steps in SAAM Evaluation
• 📌 Step 1: Identify the Architecture
• 📌 Step 2: Define Scenarios
• 📌 Step 3: Scenario Classification
• 📌 Step 4: Scenario Evaluation
• 📌 Step 5: Compare Architectural Options
• 📌 Step 6: Generate Evaluation Results
Scenario Classification Example
Scenario Type Impact
Add a new report generation ✅ Direct Low
feature
Integrate with third-party ❌ Indirect High
payment gateway
Change database from ❌ Indirect High
MySQL to MongoDB
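The classification bookkeeping behind a table like this can be sketched in a few lines: tally how many change scenarios the architecture absorbs directly versus how many force modifications. The scenario data mirrors the table above:

```python
# Each scenario: (description, classification, impact) as in the SAAM table.
scenarios = [
    ("Add a new report generation feature",        "Direct",   "Low"),
    ("Integrate with third-party payment gateway", "Indirect", "High"),
    ("Change database from MySQL to MongoDB",      "Indirect", "High"),
]

def summarize(scenarios):
    """Count direct vs. indirect scenarios; indirect ones mark where
    future change will be costly (modifiability hotspots)."""
    indirect = [desc for desc, kind, _ in scenarios if kind == "Indirect"]
    return {
        "direct": sum(1 for _, kind, _ in scenarios if kind == "Direct"),
        "indirect": len(indirect),
        "modifiability_hotspots": indirect,
    }

report = summarize(scenarios)
print(f"{report['direct']} direct, {report['indirect']} indirect scenarios")
```

A high indirect count, as here, is SAAM's signal that the architecture needs restructuring before those changes arrive.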
Real-World Example: SAAM for a Banking System
• Case: Evaluating BankXYZ’s Core Banking Architecture
• Goal: Assess future digital payments & mobile banking.
• Scenarios:
• ✅ Add a mobile banking app (Direct)
• ❌ Integrate with cryptocurrency payments (Indirect)
• ❌ Upgrade security compliance (Indirect)
• Results: Suggested API-based design & modular system.
3.12 SAAM Evaluation Process (Software Architecture Analysis
Method)
Software Architecture Analysis Method (SAAM)
Evaluating Modifiability and Maintainability
Overview of SAAM
• SAAM is used to evaluate the modifiability and maintainability of
software architecture.
• ✅ Goal: Assess how well an architecture supports future changes.
• ✅ Method: Follows a structured process to analyze system flexibility.
Steps in SAAM Evaluation Process
• 📌 Step 1: Describe the Architecture
• 📌 Step 2: Identify Stakeholders
• 📌 Step 3: Develop Scenarios
• 📌 Step 4: Classify & Evaluate Scenarios
• 📌 Step 5: Compare Alternative Architectures
• 📌 Step 6: Generate Final Evaluation Results
Step 1: Describe the Architecture
• Define components, connectors, and interactions.
• Document the current system structure (e.g., monolithic, layered,
microservices).
• Example: A banking system with modules for account
management, transactions, and customer support.
Step 2: Identify Stakeholders
• Involve key stakeholders such as:
✅ Architects – Ensure technical feasibility.
✅ Developers – Provide implementation insights.
✅ Business Managers – Align technical decisions with business goals.
✅ Users – Provide real-world use cases.
Step 3: Develop Scenarios
• Identify real-world modifications that may occur in the future.
• Categorize scenarios as:
✅ Direct Scenarios – No major architectural changes needed.
❌ Indirect Scenarios – Require significant modifications.
• Example:
o ✅ Add mobile banking app integration.
o ❌ Support blockchain-based payments.
Step 4: Classify & Evaluate Scenarios
Scenario Type Impact
Add a reporting feature ✅ Direct Low
Integrate with third-party ❌ Indirect High
payment gateway
Switch database from ❌ Indirect High
MySQL to MongoDB
Step 5: Compare Alternative Architectures
Architecture Pros Cons
Monolithic Easier to develop Harder to scale, Less
flexible
Microservices More scalable, Flexible More complex, Requires
DevOps
Step 6: Generate Final Evaluation Results
• Identify strengths and weaknesses of the current architecture.
• Provide recommendations for improvement (e.g., modularization,
API-based design).
• Example Outcomes:
✅ Architecture supports most direct scenarios well.
❌ Needs refactoring to handle high-modifiability requirements.
3.13 ARID – Active Reviews for Intermediate Designs
Active Reviews for Intermediate Designs (ARID)
A Lightweight Architecture Evaluation Method
Overview of ARID
ARID is used to evaluate intermediate or incomplete software designs
before full implementation.
✅ Combines elements of ATAM and Active Design Reviews.
✅ Helps assess design usability and suitability early in development.
ARID Participants
1. ARID Review Team:
• Facilitator – Guides the process.
• Scribe – Records feedback.
• Questioners – Raise questions and create scenarios.
2. Software Architect / Lead Designer:
• Presents the design and answers questions.
3. Reviewers (Stakeholders & Developers):
• Assess usability and applicability.
ARID Evaluation Process
📌 Phase 1: Preparation (Pre-Review Meeting)
📌 Phase 2: Review Meeting
• 9 structured steps ensure a thorough design evaluation.
Phase 1: Preparation Steps
✅ Step 1: Identify Reviewers
✅ Step 2: Prepare Design Presentation
✅ Step 3: Prepare Seed Scenarios
✅ Step 4: Prepare for the Review Meeting
Phase 2: Review Meeting Steps
✅ Step 5: Present ARID Method
✅ Step 6: Present the Design
✅ Step 7: Brainstorm and Prioritize Scenarios
✅ Step 8: Perform the Review
✅ Step 9: Present Conclusions
Diagram of ARID Process
Phase 1: Preparation:
Identify Reviewers → Prepare Presentation → Prepare Seed Scenarios → Prepare Review Meeting
Phase 2: Review Meeting:
Present ARID Method → Present the Design → Brainstorm Scenarios → Perform the Review → Present Conclusions
Real-World Example: ARID for an E-Commerce System
Case: Evaluating a Shopping Cart API Design
✅ Scenarios Suggested by Reviewers:
• Integrate a discount coupon system.
• Add a new payment method (Apple Pay).
• Support international currencies.
Findings:
✅ Strengths: API is well-structured for general use.
❌ Weaknesses: Lacks flexibility for future payment integrations.
🚀 Recommendation: Implement an adapter pattern for payment modules.
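The recommended adapter pattern might look like this minimal sketch; the provider names and method signatures are hypothetical, not from a real gateway SDK:

```python
from abc import ABC, abstractmethod

class PaymentAdapter(ABC):
    """Uniform interface the cart API codes against."""
    @abstractmethod
    def charge(self, amount_cents: int, currency: str) -> str: ...

class ApplePayAdapter(PaymentAdapter):
    def charge(self, amount_cents, currency):
        # Would call the Apple Pay SDK here; stubbed for illustration.
        return f"applepay:{currency}:{amount_cents}"

class CardAdapter(PaymentAdapter):
    def charge(self, amount_cents, currency):
        return f"card:{currency}:{amount_cents}"

def checkout(adapter: PaymentAdapter, amount_cents: int, currency: str = "USD") -> str:
    # New payment methods only require a new adapter class;
    # the checkout flow itself never changes.
    return adapter.charge(amount_cents, currency)

print(checkout(ApplePayAdapter(), 1999))
```

This is exactly the flexibility the reviewers found missing: adding Apple Pay or a new currency becomes a new adapter rather than a change to the cart API.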
ARID vs. Other Evaluation Methods
Method Focus Best For
ARID Review of partially designed Early-stage validation with
architectures developers
ATAM Trade-off analysis Evaluating complete
architectures
SAAM Modifiability and Long-term evolution of
maintainability systems
CBAM Cost-benefit analysis Decision-making based on
cost-effectiveness
Benefits of ARID
✅ ARID is a lightweight, early-stage evaluation method.
✅ Engages developers in reviewing usability before full development.
✅ Helps refine designs early, saving time and costs.