Requirements Analysis
The process of discovering, documenting, and validating what a system should do; the bridge between a vague idea and a concrete plan for what to build
Cost of Fixing Requirements Errors (During Requirements Phase)
1x cost to fix — the cheapest time to catch and correct mistakes
Cost of Fixing Requirements Errors (During Design Phase)
5x cost to fix compared to catching it in the requirements phase
Cost of Fixing Requirements Errors (During Implementation)
10x cost to fix compared to catching it in the requirements phase
Cost of Fixing Requirements Errors (During Testing)
20x cost to fix compared to catching it in the requirements phase
Cost of Fixing Requirements Errors (After Deployment)
100x cost to fix — the most expensive time to correct a requirements mistake
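The multipliers on the cost cards above can be made concrete with a quick calculation; the $200 base cost is an invented figure for illustration, not from the source.

```python
# Relative cost multipliers from the cards above (requirements phase = 1x)
MULTIPLIERS = {
    "requirements": 1,
    "design": 5,
    "implementation": 10,
    "testing": 20,
    "deployment": 100,
}

def fix_cost(phase: str, base_cost: float) -> float:
    """Estimated cost to fix a requirements error caught in the given phase."""
    return base_cost * MULTIPLIERS[phase]

# Assumed: fixing the error during requirements would cost $200
for phase in MULTIPLIERS:
    print(f"{phase}: ${fix_cost(phase, 200):,.0f}")
```

The same $200 mistake becomes a $20,000 fix if it survives to deployment, which is the whole argument for investing in requirements analysis early.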
Extractive Approach to Requirements
Treating users as mere sources of requirements to mine; asking "what features do you need?" and then building in isolation — often leads to mismatched solutions
Participatory Approach to Requirements
Treating users as design partners; collaboratively exploring problems and possibilities so solutions emerge that neither party would have imagined alone
Domain Model
A shared understanding of the concepts, relationships, and rules that govern the problem space; built collaboratively by users and developers to create a common language
Risk Dimension 1: Understanding
How well do we understand what's needed? High risk when requirements are ambiguous, complex, or interpreted differently by different stakeholders
Risk Dimension 2: Scope
How much are we committing to build? High risk when a seemingly simple feature hides dozens of interconnected decisions and hidden workflows
Risk Dimension 3: Volatility
How likely are requirements to change? High risk when requirements depend on external APIs, regulations, politics, or unproven technologies
High Understanding Risk (Example)
"The system shall ensure grading quality through meta-reviews" — undefined terms like "meta-review" and "quality" lead to completely different interpretations by professor, TA, and students
Strategies for Reducing Understanding Risk
Ask for concrete examples, create prototypes, define terms in a glossary, look for inconsistencies in how different stakeholders use the same term
High Scope Risk (Example)
"Students can request regrades" — hides dozens of questions: who can request, when, how many times, what's the escalation path, how are group assignments handled, what audit trail is needed, etc.
Managing Scope Risk
Break features into smallest units, identify dependencies explicitly, look for hidden workflows, prioritize ruthlessly (essential vs. nice-to-have), plan for incremental delivery
High Volatility Risk (Example)
Integrating with a university's new AI plagiarism detector still in contract negotiations — the API, vendor, legal requirements, and ethics approval can all change week to week
Managing Volatility Risk
Isolate volatile requirements behind interfaces, build the stable core first, design for flexibility, defer commitment on volatile items, document assumptions explicitly
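The first tactic above, isolating volatile requirements behind interfaces, can be sketched in code. This is a minimal illustration using the plagiarism-detector example from the volatility cards; the `PlagiarismDetector` interface and all names in it are hypothetical, not a real vendor API.

```python
from abc import ABC, abstractmethod

class PlagiarismDetector(ABC):
    """Stable interface the grading core depends on.

    The volatile vendor API lives behind this boundary, so contract,
    legal, or API changes touch only the adapter, never the core.
    """

    @abstractmethod
    def similarity_score(self, submission_text: str) -> float:
        """Return a similarity score in the range [0, 1]."""

class StubDetector(PlagiarismDetector):
    """Placeholder while the vendor contract is still in negotiation."""

    def similarity_score(self, submission_text: str) -> float:
        return 0.0  # treat everything as original until the real detector ships

def flag_for_review(detector: PlagiarismDetector, text: str,
                    threshold: float = 0.8) -> bool:
    # The stable core: written and tested now, unchanged when the vendor lands.
    return detector.similarity_score(text) >= threshold
```

When the vendor API is finally settled, a single adapter class implementing `PlagiarismDetector` is all that needs to be written; this is also how "build the stable core first" and "defer commitment" show up in the code.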
Options When Facing High-Risk Requirements
Clarify (reduce understanding risk), Simplify (reduce scope risk), Stabilize (reduce volatility risk), Defer (push to later releases), Eliminate (question if truly needed)
Stakeholder
Anyone who is affected by the system or can affect its success; missing a key stakeholder leads to systems that don't meet real-world needs
Primary Stakeholders
Direct users of the system — in the grading system example: Students, Graders (TAs), and Instructors
Secondary Stakeholders
Indirect users affected by the system — e.g., Meta-Graders, Department Administrators, IT Support
Hidden Stakeholders
Often-forgotten parties affected by the system — e.g., Parents, Future Employers, Accreditation Bodies
Students' Core Values (as stakeholders)
Fairness, transparency, timeliness; want fast feedback, consistent grading, clear explanations, and a fair regrade process
TAs' Core Values (as stakeholders)
Efficiency, accuracy, workload management; want clear rubrics, automated testing, reusable feedback, and protection from frivolous complaints
Instructors' Core Values (as stakeholders)
Educational outcomes, academic integrity, oversight; want grading aligned to learning objectives, plagiarism detection, and minimal admin overhead
Student vs. TA Conflict
Students want unlimited regrade requests for fairness; TAs want protection from frivolous requests that waste their time
Instructor vs. Student Conflict
Instructors want detailed analytics on student performance; students want privacy of their grades and mistakes
Administrator vs. TA Conflict
Administrators want detailed audit logs for compliance; TAs want quick, simple grade entry without overhead
"Success Disaster" Risk
When a system becomes very popular, a large new set of unanticipated users emerges with needs that weren't considered in the initial requirements — managing this risk requires thinking about future stakeholders early
Elicitation Method: Interviews
One-on-one conversations using open-ended questions, critical incident technique, probing, and silence to surface needs, workflows, and pain points
Critical Incident Technique
An interview technique asking users to describe a specific time when things went wrong — reveals edge cases and hidden requirements that users otherwise forget to mention
Interview Pitfalls
Leading questions ("Wouldn't X be better?"), using technical jargon, and accepting vague answers without probing for specifics
Elicitation Method: Observation (Ethnography)
Watching users perform their actual work to discover what they really do (vs. what they say they do) — reveals file format issues, context switching, tool integration problems, etc.
Elicitation Method: Workshops / Focus Groups
Bringing multiple stakeholders together to brainstorm, resolve conflicts, and vote on priorities using activities like affinity mapping and dot voting
Affinity Mapping
A workshop activity where participants group similar ideas from brainstorming into clusters to identify themes and patterns in requirements
Dot Voting
A workshop prioritization technique where each participant gets a fixed number of dots to place on the features they consider most important — reveals collective priorities
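Tallying a dot-voting round is simple enough to sketch; the ballots and feature names below are invented for illustration, assuming each participant received three dots and may stack them on one feature.

```python
from collections import Counter

# Hypothetical ballots: each participant placed 3 dots across features
votes = [
    ["regrade-requests", "rubric-editor", "regrade-requests"],  # student rep
    ["bulk-grade-entry", "rubric-editor", "audit-log"],         # TA
    ["audit-log", "plagiarism-check", "rubric-editor"],         # instructor
]

# Flatten all ballots and count dots per feature
tally = Counter(dot for ballot in votes for dot in ballot)

for feature, dots in tally.most_common():
    print(feature, dots)
```

Sorting by dot count surfaces the collective priority (here, the rubric editor) even when no single stakeholder ranked it first, which is exactly what the technique is for.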
Elicitation Method: Prototyping
Building quick mock-ups (paper or digital) to make requirements concrete and surface specific feedback — reveals gaps users couldn't articulate in the abstract
Elicitation Method: Document Analysis
Studying existing documents (rubrics, email complaints, appeal forms, syllabi, university policies) to understand current processes and discover implicit constraints
Elicitation Method: Scenarios and Use Cases
Writing concrete stories about how the system will be used, which reveal urgency handling, escalation paths, integration needs, and time-sensitive requirements
Why Multiple Elicitation Methods Are Needed
Each method has blind spots: interviews reveal what people say, observation reveals what they do, workshops resolve conflicts, prototypes validate understanding, documents uncover constraints
Red Flags in Requirements Elicitation
Only talking to managers, using leading questions, relying on a single method, doing no iteration, missing stakeholders, and jumping to solutions before understanding the problem
Solution Jumping
A red flag where stakeholders say "we need a database" instead of "we need to track grades" — focuses on implementation rather than the underlying problem to be solved
Requirements as Ongoing Conversation
Requirements elicitation is not a one-time phase; as stakeholders see prototypes and early versions they remember forgotten needs, realize what they actually want, and change their minds
Scope Negotiation (Professional Skill)
When requirements exceed resources, collaborating with stakeholders to prioritize: "Given our constraints, how do we deliver the most value?" — not saying "no" but finding creative trade-offs
Adaptability (Professional Skill)
Requirements will change — the professional response is to design systems that can evolve, communicate clearly when changes affect timelines, and maintain composure under shifting priorities
Christopher Alexander's Influence on Requirements
1970s architect who argued inhabitants should participate in designing their spaces; his participatory design philosophy influenced software patterns and the idea that users are domain experts, not just requirement sources
Pattern Languages
Shared vocabularies that allow experts and users to communicate about design — in software, they enable developers and users to build a common language (domain model) together
Inter-Rater Reliability
The technical term for grading consistency across TAs — an example of how participatory requirements discovery can turn vague values ("fairness") into precise, actionable concepts
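One common way to quantify inter-rater reliability is Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch for two TAs grading the same submissions; the grade data is invented for illustration.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e is the agreement expected by chance,
    computed from each rater's marginal grade frequencies.
    """
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[g] * freq_b[g] for g in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Invented grades from two TAs on the same five submissions
ta1 = ["A", "B", "B", "C", "A"]
ta2 = ["A", "B", "C", "C", "A"]
print(round(cohens_kappa(ta1, ta2), 2))  # agreement beyond chance
```

A kappa near 1 means the TAs grade consistently; a value near 0 means their agreement is no better than chance, which is the kind of precise signal a calibration exercise is meant to improve.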
Calibration Exercise
A solution that emerged from participatory requirements discussion: TAs review each other's grading decisions on example submissions before each assignment to align on rubric interpretation