310 Q1 UserStory

125 Terms

1
New cards
Acceptance gate
DoD is the shared contract for sign-off between devs and product owner.
2
New cards
Agile board (Kanban)
Visual board with Product Backlog, WIP limits, and pull-based flow.
3
New cards
Agile context
User stories are widely used within agile methodologies to structure work.
4
New cards
Agile tie-in (broken code)
Maps to working software, responding to change, and individuals & interactions.
5
New cards
Anti-pattern: giant story
High risk of missing iteration; split to reduce uncertainty.
6
New cards
Anti-pattern: no DoD
Leads to scope creep and unverifiable completion.
7
New cards
Anti-pattern: solution masquerading as story
E.g., ‘Build React page’ without user value.
8
New cards
Anti-pattern: vague notes
Notes that restate the role/goal add no scoping value.
9
New cards
Avoid solution bias
State the user problem and value before naming technologies.
10
New cards
Bad story: ‘Add extra tests’
No user, no value, no acceptance criteria.
11
New cards
Bad story: ‘Design team creation interface’
Describes a design task, not user value; not a user story.
12
New cards
Bad story: ‘Implement TeamCreator’
Unclear; solution-oriented; lacks user, value, and DoD.
13
New cards
Benefits of small stories
Clear DoD, quicker failure detection, legal traceability, easy work distribution.
14
New cards
Broken code: 3 rules
1) Does its job (no bugs) 2) Affords change 3) Understandable to others/future you.
15
New cards
Code review checklist
Design, functionality, complexity, tests, naming, comments, style, documentation.
16
New cards
Code review—author step
Open PR; self-review; choose reviewers; give context and rationale.
17
New cards
Cohesion benefit
Stories increase cohesion between customer/product owner and the development team.
18
New cards
Comment severity levels
Nit (minor), Optional/Consider (non-blocking), FYI (future idea/awareness).
19
New cards
Concrete example — Role/Goal/Benefit
As a prof, I want to create repositories so students can do their work.
20
New cards
Consistent terminology
Use the same stakeholder and domain terms across stories.
21
New cards
Continuous improvement
Refine INVEST and DoD quality in retrospectives.
22
New cards
Core template
As a [role], I want [goal] so that [benefit].
23
New cards
Cross-team constraints
Document interfaces and contracts in engineering notes.
24
New cards
Customer understanding via stories
Gives customers a concrete view of what the team will build for a feature.
25
New cards
Data for trade-offs
Stories return cost/value data so customers can weigh trade-offs across features.
26
New cards
Definition of Done (DoD)
Concrete, verifiable conditions that prove the story is complete and correct.
27
New cards
Definition of Done ties to validation
DoD specifies exactly how the customer will validate done/complete/correct.
28
New cards
Definition reminder
User stories describe features; they are not requirements specs or designs.
29
New cards
Dependency handling
Prefer independent stories; if blocked, note prerequisites explicitly.
30
New cards
E — Estimatable
Has enough detail to judge feasibility and effort.
31
New cards
Edge cases in notes
Document non-happy paths to bound scope and testing.
32
New cards
Effort estimate as value lens
Makes cost explicit so value vs. cost conversations are concrete.
33
New cards
Effort estimate attachment
Attach hours or points to help planning and prioritization.
34
New cards
Example — DoD (automated tests)
Provide automated test cases; the feature is programmatically verifiable.
35
New cards
Example — DoD (single command)
Feature runs as a single command with required parameters.
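A minimal sketch of what this DoD criterion could look like in practice, assuming a hypothetical `create-team-repo` command; all names and parameters are illustrative, not from the source:

```python
import argparse

def build_parser():
    # Hypothetical CLI for the repository-creation story: every input is a
    # required parameter, so the feature completes as a single command.
    parser = argparse.ArgumentParser(prog="create-team-repo")
    parser.add_argument("--team-name", required=True)
    parser.add_argument("--repo", required=True)
    parser.add_argument("--members", nargs="+", required=True,
                        help="GitHub usernames of the team members")
    return parser

args = build_parser().parse_args(
    ["--team-name", "team42", "--repo", "project1",
     "--members", "alice", "bob"])
print(args.team_name, args.repo, args.members)
```

Because every parameter is declared `required=True`, the parser itself enforces the "with required parameters" half of the criterion.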
38
New cards
Engineering notes purpose
List integrations, constraints, interactions with other parts, domain knowledge needed.
39
New cards
Engineering tasks vs stories
Tasks are dev-facing steps (e.g., set up DB); not phrased as user value.
40
New cards
Engineering tasks/notes purpose
Record interactions with other features/subsystems and technical considerations.
41
New cards
Estimating story points (simple table)
Use structured categories (e.g., simple/complex × single/multiple locations).
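One way to sketch such a structured-category table, as a simple lookup; the point values below are illustrative assumptions, not a standard scale:

```python
# Hypothetical grid: change complexity x number of code locations touched.
STORY_POINTS = {
    ("simple", "single"): 1,
    ("simple", "multiple"): 2,
    ("complex", "single"): 3,
    ("complex", "multiple"): 5,
}

def estimate(complexity, locations):
    # Look up the point value for a story classified on both axes.
    return STORY_POINTS[(complexity, locations)]

print(estimate("complex", "single"))  # -> 3
```

The table keeps estimates consistent across the team because two people classifying the same story land on the same point value.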
42
New cards
Example DoD — CLI
Single command accepts params and completes the task.
43
New cards
Example DoD — Integration test
Ensures compatibility with the GitHub API.
44
New cards
Example DoD — Script tests
Provide scripts for command-line aspects.
45
New cards
Example DoD — Unit tests
Errors covered for org, team name, and members.
46
New cards
Example DoD — Verifiability
Success is programmatically verifiable.
47
New cards
Example Engineering note — Config
Changeable constants live in config.js.
48
New cards
Example Engineering note — Future UI
API will be used by a future UI; design accordingly.
49
New cards
Example Engineering note — Integration
Must integrate with existing GitHubManager.
50
New cards
Example — Benefit
So the team can start working on the project.
52
New cards
Example — Estimate
Estimate: 1.5 units (per team’s scale).
53
New cards
Example — Notes
Inputs: team name, initial repo, team member GitHub IDs.
54
New cards
Example — Notes specifics
Inputs: list of repository names and list of student IDs (GitHub usernames).
55
New cards
Example — Role/Goal
As a prof, I want to create a repository for a 310 team.
56
New cards
Feature value vs cost decisions
Explicit costs enable deferring low-value, high-cost features to later.
57
New cards
Fibonacci for points
Use 1,2,3,5,8,… because estimation uncertainty grows with size.
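A small sketch of why the widening Fibonacci gaps help: raw estimates get snapped to the nearest value on the scale, so large stories are deliberately coarse. The helper name and tie-breaking are assumptions for illustration:

```python
FIB_SCALE = [1, 2, 3, 5, 8, 13]

def to_scale(raw):
    # Snap a raw estimate to the nearest point on the Fibonacci scale;
    # min() resolves ties toward the smaller value because it appears first.
    return min(FIB_SCALE, key=lambda p: abs(p - raw))

print(to_scale(6))   # -> 5
print(to_scale(12))  # -> 13
```

Between 8 and 13 there is no "10 or 11" to argue over, which matches the idea that estimation uncertainty grows with size.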
58
New cards
Five parts overview
(1) Role–Goal–Benefit, (2) Limitations/Clarifications, (3) Definition of Done, (4) Engineering tasks/notes, (5) Effort estimate.
59
New cards
How to split big stories
Slice by workflow step, happy path vs. edge cases, or data subset.
60
New cards
I — Independent
Story can be reordered and implemented without tight coupling.
61
New cards
Independent ≠ unsequenced
Within one sprint, stories shouldn’t block each other even if long-term they build up.
62
New cards
INVEST: Estimable action
If not estimable, add details, split the story, or run a spike until it is.
63
New cards
INVEST: Estimatable nuance
Story must be precise enough to timebox; vague = risk of overruns.
64
New cards
INVEST: Independence nuance
Avoid precedence chains so teams can freely select stories from the backlog.
65
New cards
INVEST: Independent nuance
Independent to prioritize freely, but can still be sequenced across sprints.
66
New cards
INVEST: Negotiable details
Client-friendly language + time estimates enable real trade-offs.
67
New cards
INVEST: Negotiable nuance
Stories are living documents; negotiation often adds scope detail during planning.
68
New cards
INVEST: Small advantages
Smaller stories are easier to estimate, reason about, negotiate, and decouple.
69
New cards
INVEST: Small nuance
Sized to fit a sprint; too tiny can cause new dependencies—aim for balance.
70
New cards
INVEST: Testable link
Testability flows from a clear DoD agreed by customer and developer.
71
New cards
INVEST: Testable nuance
Define how to test and what pass criteria mean; avoid vague goals like 'make user happy'.
72
New cards
INVEST: Valuable nuance
Frame refactors in terms of customer value; exceptions exist but are rare.
73
New cards
INVEST: Value over time
Re-evaluate feature value as product/customer needs evolve before pulling it in.
74
New cards
INVEST acronym
Independent, Negotiable, Valuable, Estimatable, Small, Testable.
75
New cards
INVEST walkthrough context
Company email system with a contacts DB.
76
New cards
INVEST walkthrough user story
As an employee, I want to search contacts by name so that I can message them.
77
New cards
Iteration size constraint
Each story’s cost must fit within a single iteration (e.g., a ~2-week Scrum sprint).
78
New cards
Limitations/Clarifications purpose
Scopes the story to the subset of situations that matter; reduces ambiguity.
79
New cards
Link DoD to tests
Acceptance criteria should map to automated and/or scriptable checks.
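A sketch of mapping one DoD criterion (the example "errors covered for org, team name, and members") onto scriptable checks; `create_team` and `rejects` are hypothetical names introduced for this illustration:

```python
def create_team(org, team_name, members):
    # Error cases the example DoD calls out: org, team name, and members.
    if not org:
        raise ValueError("org is required")
    if not team_name:
        raise ValueError("team name is required")
    if not members:
        raise ValueError("at least one member is required")
    return {"org": org, "team": team_name, "members": list(members)}

def rejects(org, team_name, members):
    # True when the call fails the way the DoD requires.
    try:
        create_team(org, team_name, members)
        return False
    except ValueError:
        return True

# Each acceptance criterion becomes one automated check:
print(rejects("", "team42", ["alice"]))    # -> True
print(rejects("cs310", "", ["alice"]))     # -> True
print(rejects("cs310", "team42", []))      # -> True
```

When every criterion has a check like this, "done" is verified by running the suite rather than by discussion.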
80
New cards
Local estimation scale
1,2,3,5,8 with shared team definitions for each size.
81
New cards
N — Negotiable
Scope/details can be adjusted via conversation during planning.
82
New cards
Negotiation outcome
Capture agreed scope changes in notes and DoD immediately.
83
New cards
Points vs hours
Some teams avoid hours; others map points roughly to developer-hours.
84
New cards
Pre-release validation
Use DoD as checklist for demos and release readiness.
85
New cards
Prioritization signal
Value statement helps product owner rank stories.
86
New cards
Problem vs. Solution domain
Role–Goal–Benefit + DoD are problem-domain; engineering notes are solution-domain.
87
New cards
Reordering flexibility
Independent stories allow reprioritization between iterations.
88
New cards
Role–Goal–Benefit meaning
Who it’s for, what it does, and why it matters.
89
New cards
Role–Goal–Benefit purpose
Forces clarity on who benefits, what they’re trying to achieve, and why (value).
90
New cards
S — Small
Should take about half a day to half an iteration; split if larger.
91
New cards
Schedule slip rationale
Doubling a 2-week sprint is manageable; doubling a 1-year feature devastates schedules.
92
New cards
Stories as feedback vehicles
Because they’re small, they support quick customer feedback and iteration.
93
New cards
Stories drive backlog selection
Clear role/value and cost help prioritize which stories enter the next sprint.
94
New cards
Story points inform hours
Points help infer hours and prioritize stories for a sprint.
95
New cards
Story readiness checklist
Has role/goal/benefit, key notes, DoD, and a preliminary estimate.
96
New cards
Story ‘small & valuable’ summary
This example is small, clearly valuable, testable, and costed, fitting INVEST.
97
New cards
Subsystem interaction awareness
Engineering notes ensure cross-feature and subsystem impacts are captured.
98
New cards
T — Testable
Clear DoD enables objective verification.
99
New cards
Test review specifics
Are tests present, correct, at right level, and failing when they should?
100
New cards
Testable vs not testable
“Select x items in y seconds” is testable; “make user happy” isn’t.