SQA - Pre-Final


62 Terms

1
New cards

Review

is a process for evaluating a documented software project product

2
New cards

Formal Design Reviews, Peer Reviews, Expert Opinions

Give the (3) Methodologies of Review

3
New cards

Formal design reviews

Also called design reviews (DRs); they differ from all other review methods in that their approval is required before the project can proceed to the next development phase

4
New cards

Review Leader

Appropriate candidates for review team leadership

5
New cards

Review Team

All members of this team should be selected from among the senior members of the project team

6
New cards

Inspection participants, Walkthrough participants

Give the (2) recommended peer review participant groups

7
New cards

Inspection Participants

The number of participants in an inspection ranges from a minimum of three to a maximum of six

8
New cards

Review Leader (Moderator)

Leader of the inspection responsible for planning the inspection and coordinating it

9
New cards

The Author

is, without exception, a participant in every type of peer review

10
New cards

Designer, Coder or Implementer, Tester

Give the (3) Specialized Professionals

11
New cards

A Designer

The system analyst responsible for analyzing and designing the software system under review.

12
New cards

A Coder or Implementer

A professional who is thoroughly acquainted with coding tasks, preferably the designated coding team leader

13
New cards

A Tester

This experienced professional, preferably the leader of the assigned testing team, focuses on identifying design errors of the kind usually detected during the testing phase

14
New cards

Walkthrough Participants

The participants are led through the material in one of two formats

15
New cards

Review Leader (Coordinator)

Candidates for the coordinator position should have traits similar to those of the inspection moderator.

16
New cards

The Author

is, without exception, a participant. In many cases, he/she serves as the coordinator

17
New cards

A Standards Enforcer

This team member specializes in developing standards and procedures and focuses on the reviewed product's compliance with them

18
New cards

A Maintenance Expert

called upon to focus on maintainability, flexibility, and testability issues

19
New cards

A User Representative

Participation of an internal or external user representative in the walkthrough team contributes to the review’s validity

20
New cards

Expert opinions

prepared by outside experts, support quality evaluation by introducing additional capabilities to the internal review staff

21
New cards

Software testing

is an activity in which a system or component is executed under specified conditions.

22
New cards

Software testing levels

are the different stages of the software development lifecycle where testing is conducted

23
New cards

Unit Testing

A level of the software testing process where individual units of software are tested
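To make this level concrete, here is a minimal sketch using Python's standard `unittest` framework; the `apply_discount` function and its expected values are hypothetical examples, not part of the card set:

```python
import unittest

# Hypothetical unit under test: a simple discount calculator.
def apply_discount(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

class TestApplyDiscount(unittest.TestCase):
    def test_typical_discount(self):
        self.assertAlmostEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertAlmostEqual(apply_discount(80.0, 0), 80.0)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(50.0, 120)

if __name__ == "__main__":
    unittest.main(argv=["unit-test-sketch"], exit=False)
```

Each test exercises the single unit in isolation, which is the defining property of this testing level.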

24
New cards

Integration Testing

A level of the software testing process where individual units are combined and tested as a group
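A sketch of the step up from unit to integration testing, using two hypothetical units (`parse_age` and `validate_age`) that are combined and exercised as a group through `accept_age_field`:

```python
# Hypothetical integration sketch: two units, each assumed to be
# unit-tested on its own, are now tested working together.
def parse_age(text):
    """Unit 1: convert raw form input to an integer."""
    return int(text.strip())

def validate_age(age):
    """Unit 2: check the parsed value against a plausible range."""
    return 0 <= age <= 130

def accept_age_field(text):
    # Integration point: the output of one unit feeds the next.
    return validate_age(parse_age(text))

assert accept_age_field(" 42 ") is True   # units cooperate correctly
assert accept_age_field("999") is False   # out-of-range value rejected
```

The asserts target the interface between the units rather than either unit alone, which is what distinguishes this level from unit testing.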

25
New cards

System Testing

A level of the software testing process where a complete, integrated system is tested

26
New cards

Acceptance Testing

A level of the software testing process where a system is tested for acceptability

27
New cards

Incremental Testing

tests the software in steps – software modules as they are completed (unit tests), then groups of modules (integration tests), and finally the entire package (system tests)

28
New cards

Big Bang Testing

tests the software as a whole once the completed package is available

29
New cards

Black Box Testing

is also called “functional testing”. It identifies bugs only according to software malfunctions as they are revealed in the software's erroneous outputs.

30
New cards

White Box Testing

examines internal calculation paths in order to identify bugs
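A minimal sketch of the idea: the `shipping_fee` function below (hypothetical, not from the cards) has two internal calculation paths, and the tests are chosen with knowledge of the code so that both paths are executed, rather than only comparing outputs to a specification:

```python
# Hypothetical unit with two internal calculation paths.
def shipping_fee(weight_kg):
    if weight_kg <= 5:          # path A: light parcel, flat fee
        return 4.0
    return 4.0 + (weight_kg - 5) * 1.5  # path B: per-kg surcharge

# White-box test selection: one input per internal path.
assert shipping_fee(3) == 4.0    # exercises path A
assert shipping_fee(9) == 10.0   # exercises path B: 4.0 + 4 * 1.5
```

Coverage of the internal paths, not just the observable input/output behavior, is what makes this a white-box (structural) approach.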

31
New cards

Manual Testing

is the process of testing software by hand to learn more about it, to find what is and isn’t working

32
New cards

Automated Testing

is the process of testing the software using an automation tool to find the defects

33
New cards

Alpha Site Testing

are tests performed by potential users at the developer’s site on a new software package.

34
New cards

Beta Site Testing

are tests of a new software package performed by real users at their own sites; beta site testing is much more commonly applied than alpha site testing.

35
New cards

Software Operation Contract Reviews

are based on the contract draft

36
New cards

Software operation services plans

are annual plans that direct the management regarding the required resources

37
New cards

Software operation quality metrics

are used to identify trends in software operation services efficiency, effectiveness, and customer satisfaction, and as basic information for planning and budgeting.

38
New cards

Relevant

Related to an attribute of substantial importance

39
New cards

Valid

Measures the required attribute

40
New cards

Reliable

Produces similar results when applied under similar conditions

41
New cards

Comprehensive

Applicable to a large variety of implementations and situations

42
New cards

Mutually exclusive

Does not measure attributes measured by other metrics

43
New cards

Easy and simple

The implementation of the metrics data collection is simple and performed with minimal resources

44
New cards

Does not require independent data collection

Metrics data collection can be integrated with other project data collection systems

45
New cards

Immune to biased interventions by interested parties

The data collection and processing system is protected from unwanted changes

46
New cards

Software Product Metrics

are a quantitative representation of the attributes of software products or intermediate products

47
New cards

Software Product Size Metrics, Software Attributes Metrics

The (2) Classifications of Software Product Metrics

48
New cards

Thousand Lines of Code (KLOC)

This metric is based on the physical size of the completed software
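In practice KLOC commonly serves as the denominator of size-normalized product metrics such as code error density; the sketch below is illustrative, and the error count and code size are invented for the example:

```python
# Illustrative use of KLOC as a normalizing size measure:
# code error density = code errors found / thousands of lines of code.
def code_error_density(code_errors, lines_of_code):
    """Errors per thousand lines of code (KLOC)."""
    kloc = lines_of_code / 1000
    return code_errors / kloc

# e.g. 42 code errors detected in a 28,000-line component:
density = code_error_density(42, 28_000)
assert density == 1.5  # 1.5 errors per KLOC
```

Normalizing by size lets components or projects of different sizes be compared on an equal footing.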

49
New cards

Function Points

This metric represents the result of applying a measure from the group of functional size measurement methods

50
New cards

Software Functionality Metrics

relate to the following aspects: suitability, accuracy, interoperability, security, and functionality compliance.

51
New cards

Software Reliability Metrics

User metrics distinguish between: Full Reliability, Vital Reliability, Total Unreliability

52
New cards

Full Reliability

When all software system functions perform properly

53
New cards

Vital Reliability

When all vital functions function properly

54
New cards

Total Unreliability

When all software system functions fail
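The three user reliability measures on the cards above can be read as fractions of a service period; the sketch below uses hypothetical hour counts and function names to show the arithmetic:

```python
# Illustrative sketch: express the three user reliability measures as
# time fractions over a service period (all values hypothetical).
def reliability_fractions(total_hours, all_ok_hours,
                          vital_ok_hours, all_failed_hours):
    return {
        "full_reliability": all_ok_hours / total_hours,     # all functions OK
        "vital_reliability": vital_ok_hours / total_hours,  # vital functions OK
        "total_unreliability": all_failed_hours / total_hours,  # all failed
    }

# 1,000 service hours: 940 fully operational, 990 with all vital
# functions working, 2 with a complete outage.
m = reliability_fractions(1000, 940, 990, 2)
assert m["full_reliability"] == 0.94
assert m["vital_reliability"] == 0.99
assert m["total_unreliability"] == 0.002
```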

55
New cards

Software Usability Metrics

relate to the following aspects: understandability, learnability, operability, attractiveness, and usability compliance.

56
New cards

Software Efficiency Metrics

relate to the following aspects: behavior over time, resource utilization, and efficiency compliance

57
New cards

Software Maintainability Metrics

relate to the following aspects: analyzability, changeability, stability, testability, and maintainability compliance.

58
New cards

Software Portability Metrics

relate to the following aspects: adaptability, installability, coexistence, replaceability, and portability compliance.

59
New cards

Software Effectiveness Metrics

relate to a variety of implementation situations: corrections and changes of the original software product

60
New cards

Software Productivity Metrics

relate to a variety of implementation situations allowing comparison between tasks and teams and between time periods

61
New cards

Software Safety Metrics

relate to the risk of users being injured as a result of software safety failures.

62
New cards

Software Satisfaction Metrics

relate to user satisfaction, where the level of satisfaction is measured by a questionnaire.