Human Factors and Software Engineering Studies


Description and Tags

OBSIDIAN LINKS: Software crisis


49 Terms

1

what is the software crisis

A term coined in early computing but still applicable now: the difficulty of producing useful, efficient, well-designed programs in the required time, caused by the rapid increase in computer power and in the complexity of the problems that could be attempted

2

what caused the software crisis

  • As software complexity increased, many software problems arose because existing methods were inadequate.

  • Also caused by the fact that computing power has outpaced programmers' ability to use those capabilities effectively.

    • processes and methodologies have been created to try to improve this, but large projects will always be vulnerable

3

what are the signs of the software crisis

  • over budget

  • over schedule

  • inefficient software

  • low quality software

  • not meeting requirements

  • hard to maintain

  • never delivered

4

what is a software engineering study

  • consists of an aim - what the study tries to answer or do

  • methodology - how the study's design and implementation are conducted

  • results

  • conclusions - must be based on evidence and critical thinking, taking the threats to validity into account

  • threats to validity - anything which could affect the credibility of the study's findings

5

which of these bathtub curves is the best

  • people can have different effects on the bathtub curve

  • the red curve is best

6

what data is needed to construct a bathtub curve

  • requires accurate failure data (not always easy to gather)

  • need to measure time according to a subjective timeline decided by the team, e.g.

    • per release

    • hourly, daily, monthly, yearly
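
A minimal sketch of the idea (function name and data invented for illustration): bucket recorded failure timestamps into fixed periods, then plot the counts to get an empirical bathtub curve.

```python
from collections import Counter

def failure_rate_by_period(failure_days, period_length=30):
    """Bucket failure timestamps (days since release) into fixed periods.

    Plotting the counts against the period index gives the empirical
    bathtub curve for whatever timeline the team chose.
    """
    buckets = Counter(day // period_length for day in failure_days)
    last = max(buckets) if buckets else 0
    return [(period, buckets.get(period, 0)) for period in range(last + 1)]

# Invented data: many early failures (burn-in), a quiet middle, a wear-out rise.
failures = [1, 2, 3, 5, 8, 40, 95, 300, 310, 315, 320]
print(failure_rate_by_period(failures))
```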

7

why is it hard to gather accurate failure data

  • hard to know which failures to include

    • minor, major, critical failures?

    • the severity of each is subjective

  • if all 3 failure types are combined in 1 bathtub, it becomes difficult to compare bathtubs on a common basis, as minor failures become equivalent to showstoppers

  • if the types are tracked individually, a separate bathtub needs to be created for each

8

how does Brooks' law impact the bathtub curve

a bathtub curve can be drawn for a system under development as well, not just a released system; the rise at the end coincides with more people being added to the project (Brooks' law) and a higher failure rate

9

how do refactoring practices impact the bathtub curve

  • when you refactor there is a temporary dip in the failure rate, but once the benefit wears off the rate rises again, producing a spiky, sawtooth-like bathtub curve - this shows that refactoring's effect is temporary, and later in a system's life more effort is needed for the same results as earlier

    • bathtubs are not always a smooth line, but can have bumps and zigzags representing tech debt and refactoring changes

10

what is a productivity drain

any wasted activity or situation that negatively affects a programmer's ability to code on time and on budget

11

why are productivity drains bad

  • bad because productivity drains impact how well programmers can do tasks like refactoring

  • they show how significant an impact human factors have on software engineering - many of its principles depend on whether humans can follow them effectively

    • often they cannot, contributing to the software crisis

12

what are the 6 productivity drains

  • misaligned work

  • excess work in progress

  • rework

  • demand vs. capacity imbalance

  • repetitive and manual effort

  • aged and cancelled effort

13

what is the drain - misaligned work

  • basically ‘productive procrastination’

  • based on unclear communication and a poor grasp of priorities

  • developers tackle tasks that don't directly contribute to the important work

  • can tech debt and TODOs be classed as misaligned work? there's a balance between what we should do now and what we should leave

14

what is the drain - excess work in progress

  • no penalty for overloading developers

  • this creates a silent stressor as incomplete tasks pile up and project deadlines slip (tech debt)

15

what is the drain - rework

  • having to redo work - possibly refactoring; again, it is debatable whether refactoring is rework or useful work

  • a roadblock to productivity (blocker) caused by

    • a tangled web of unclear requirements

    • poor communication between teams

    • inadequate testing practices

    • unaddressed tech debt

16

what is the drain - demand vs. capacity imbalance

  • mismatch between the demand for work and the available capacity to handle it (basically either too much work or none at all - no balance)

  • occurs when one stage in a connected workflow operates too quickly or too slowly for the next stage

17

what is the drain - repetitive manual effort

  • repetitive tasks like manual testing, data entry, and low value work

  • this work steals valuable time from better, more innovative tasks and could simply be automated

18

what is the drain - aged and cancelled effort

  • miscommunication/inflexibility can cause work to be abandoned or made obsolete (binned) before completion

  • caused by a lack of adaptation to new information or feedback

19

what is software cost estimation

estimating the time, cost, and effort it will take to develop a system

20

what is the cone of uncertainty in estimation

the concept that, over time, the error in estimating how long it will take to complete something goes down

  • estimation accuracy increases as you become more knowledgeable in what the system is going to do

21

what is a BDUF project

big design up front

  • associated with waterfall methodology where design must be completed and perfected before implementation

22

what is an iterative project

  • the project is developed in iterations or stages that can be repeated as needed, associated with agile

23

how does the cone of uncertainty differ between BDUF and iterative projects

BDUF - greater estimation error

iterative - better estimation accuracy

24

what are the different software estimation techniques

  • algorithmic cost modelling

  • expert judgement

  • estimation by analogy

  • Parkinson’s law

  • specification-based

25

what is algorithmic cost modelling estimation

  • a formulaic approach based on historical cost information, generally based on software size

  • cost/effort is estimated via a formula combining product, project, and process attributes (A, B, M)

  • values are estimated by the project manager

  • most models are similar but have different A, B, M values

  • doesn't always involve experts, just people who collect the data

26

what is the formula for algorithmic cost modelling

  • Effort = A * (B/M)

    • A - the anticipated system complexity, decided by the PM - again, subjective to define

    • B - the number of hours available

    • M - a value reflecting the product, process, and people attributes
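
A minimal sketch of this formula in Python (the function name and sample values are invented for illustration; real A, B, M values would come from the project manager and historical data):

```python
def estimate_effort(a, b, m):
    """Effort = A * (B / M), per the card's formula.

    a: anticipated system complexity (subjective, set by the PM)
    b: number of hours available
    m: value reflecting product, process, and people attributes
    """
    return a * (b / m)

# Hypothetical values: complexity 1.5, 400 hours available, attribute factor 2.
print(estimate_effort(1.5, 400, 2))  # 300.0 effort units
```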

27

what is expert judgement estimation

  • 1 or more experts in both software development and the application domain use their experience to predict software costs

  • the process iterates until consensus is reached

  • similar to algorithmic cost modelling, but intuition only and no formula

28

what are the pros of expert judgement

  • cheap estimation method

  • can be accurate if experts have direct experience of similar systems

29

what are the cons of expert judgement

  • very inaccurate if there are no true experts

30

what is estimation by analogy

  • the cost of a project is computed by comparing the project to a similar project in the same application domain

  • NASA does this a lot since many systems are similar, so past systems act as a good guide for future system estimation
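
As a hedged illustration (all figures invented), the simplest form of analogy is scaling a comparable past project's cost by relative size:

```python
def estimate_by_analogy(past_cost, past_size, new_size):
    """Scale a comparable past project's cost by relative size.

    Assumes cost grows roughly linearly with size - a simplification;
    real analogy-based estimates also adjust for differences in
    domain, team, and technology.
    """
    return past_cost * (new_size / past_size)

# Invented figures: past system took 120 person-months for 40 KLOC;
# the new system is expected to be around 55 KLOC.
print(estimate_by_analogy(120, 40, 55))  # 165.0 person-months
```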

31

what are the pros of estimation by analogy

  • accurate if previous project data available

32

what are the cons of estimation by analogy

  • impossible if no comparable project has been done, so you need to start from the beginning

  • needs systematically maintained cost data

33

what is Parkinson’s law estimation

  • states that the work expands to fill the time available, so whatever costs are set, they’ll always be used up

34

how do we combat Parkinson’s law

  • state up front that the project will cost whatever resources are available and that's all it's getting (applying a fixed cost)

35

what are the pros of Parkinson’s law

  • no risk of overspending

36

what are the cons of Parkinson’s law

  • the system is usually unfinished because the cost is determined by the available resources rather than by the objectives

37

what is specification-based estimation

  • project cost is agreed on the basis of an outline proposal, and the development is constrained by that cost

  • a detailed specification may be negotiated

  • may be the only appropriate technique when other detailed information is lacking - with no prior knowledge, the specification itself is the guide to how long it will take, e.g. writing a spec at uni, where you are your own expert

38

what are some issues with Cyclomatic Complexity

  • entirely different programs can have the same CC value

  • can't be calculated consistently where case and switch statements are used for conditions

  • people argue it is no better than LOC, so might as well use LOC

  • hard to interpret, e.g. CC 10 vs. CC 20

    • is that 2x as complex?

  • some classes have a very high CC but can be simple and easy to understand, e.g. doing 1 thing that has many conditions

  • developed for procedural languages like C - should it be used for OO?

    • for classes, CC is often replaced by avg(CC), the average CC of the methods in the class

    • this does not represent the spread of CC across the class, e.g. {10,1,1} vs. {4,4,4} (see the sketch below)
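
A small sketch of that spread problem, using the values from this card: two classes whose methods have the same average CC but very different distributions.

```python
from statistics import mean, stdev

class_a = [10, 1, 1]  # one very complex method, two trivial ones
class_b = [4, 4, 4]   # the same total complexity, spread evenly

# avg(CC) is identical for both classes, hiding the difference in spread.
print(mean(class_a), mean(class_b))    # 4 4
print(stdev(class_a), stdev(class_b))  # ~5.2 vs 0.0
```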

39

what are some issues with DIT

  • nobody has come up with a good threshold value for DIT, e.g. the earlier inheritance study is not the only correct answer

    • some systems need lots of inheritance and some don't, e.g. GUIs have high DIT vs. maths-based systems with low DIT

  • most developers in industry don't see the value of DIT, so what's the point in collecting it? LOC and CC are more favoured - most systems have a median DIT of 1

  • most systems collapse to DIT 1 as a result of merging sub- and superclasses

  • DIT is meant to reflect how humans structure information, but in reality it doesn't really show this - we don't know what level of inheritance is good (see the sketch below)
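
For reference, a minimal sketch of computing DIT in Python (class names invented; the convention that a class directly below the root has DIT 1 is an assumption matching the "median DIT = 1" observation above):

```python
def depth_of_inheritance(cls):
    """Longest path from cls up to the root of the inheritance tree.

    object itself counts as depth 0, so a class inheriting directly
    from object has DIT 1.
    """
    if cls is object:
        return 0
    return 1 + max(depth_of_inheritance(base) for base in cls.__bases__)

class Widget: pass              # DIT 1 (Widget -> object)
class Button(Widget): pass      # DIT 2
class IconButton(Button): pass  # DIT 3

print(depth_of_inheritance(IconButton))  # 3
```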

40

what is the Di Penta 2020 refactoring study about

so many things were developed in the 60s-80s, because nobody has come up with a better alternative since

41

what is Tuckman’s Theory of Teams

  • suggests that every team goes through four key stages as it works on a task before it can become effective

  • these phases are all necessary and inevitable for a team to grow, face up to challenges, tackle problems, find solutions, plan work, and deliver results

  • as the team develops maturity and ability, relationships become established, and the leadership style changes to a more collaborative or shared one

42

what are the phases in Tuckman's theory of teams

  • forming

  • storming

  • norming

  • performing

  • adjourning

43

what is the forming phase

team spends time getting acquainted

open discussion on the purpose, methods, and problems of the task

44

what is the storming phase

open debate - conflict may arise over the goals, leadership, and methods of the task

most challenging stage - lack of trust, personality clashes, and productivity nosedives from misaligned work

45

what is the norming phase

a degree of consensus enables the team to move on

differences are worked through; the team learns to cooperate and take responsibility - focus shifts to team goals and productivity

harmony between members

46

what is the performing phase

team is effective

open compliance with group norms and practices emerges, allowing the team to be flexible, functional, and constructive

the team requires minimal assistance from leadership and can be effective in productivity and performance

47

what is the adjourning phase

the team agrees it has completed the task

optionally followed by a mourning stage (not in the original stages, but added later) where the team feels nostalgic that it's over, if it was closely bonded

48

what is the role of the project manager in Tuckman's theory of teams

needs to support the team throughout the different stages in different ways

  • forming - be open and encourage open communication; give high guidance as roles are unestablished

  • storming - be patient, focus the energy, support the wounded; use authoritative leadership to push things back on track

  • norming - a chance to coordinate the consensus; can be relaxed, but ensure the team doesn't slip back into storming

  • performing - maintain the team's momentum (not easy); little guidance needed

49

just a student study on inheritance
