CCEA GCSE Digital Technology - Chapter 16 Testing & Chapter 17 Evaluation


19 Terms

1

Waterfall Model

Represents a sequential approach to application development. A series of stages need to be completed in a fixed order and each stage must be completed before the next can begin.

2

Iterative Development Approach

A step-by-step approach to the development of an application. Each step repeats the life cycle of analysis, design, development, testing, installation and review, adding more functionality each time until the application is complete.

3

Bugs

Another word for an error or fault in a program's code, which leads to errors when the program or application is executed.

4

Testing Evidence

Documentation which illustrates the outcome of tests applied to application software; this may be in the form of annotated screenshots.

5

Null Data

Used to test that the system can cope when no data is entered.

6

Extreme Data

Used to test that the system can cope with very large or very small data values.
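The null data and extreme data cards above can be illustrated with a small validation routine. The function name `validate_mark` and the valid range of 1 to 100 are hypothetical examples chosen for this sketch, not taken from the CCEA specification.

```python
def validate_mark(raw):
    """Return an int mark between 1 and 100, or None if the input is unusable."""
    if raw is None or raw == "":   # null data: nothing was entered
        return None
    try:
        mark = int(raw)
    except ValueError:             # invalid data: not a number at all
        return None
    if 1 <= mark <= 100:           # normal and boundary data
        return mark
    return None                    # extreme data outside the expected range

# Null data: the system copes when no data is entered
print(validate_mark(""))          # None
# Extreme data: very large and very small values
print(validate_mark("999999"))    # None
print(validate_mark("-999999"))   # None
# Normal data for comparison
print(validate_mark("55"))        # 55
```

A robust system would reject the null and extreme inputs gracefully, as here, rather than crashing.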

7

Black Box Testing

Where the tester is unaware of the internal structure of the application they are testing.

8

White Box Testing

A method of testing which examines the underlying structure of the application or code which has been developed.

9

Dry Run

A paper-based exercise which allows the programmer to work through the solution step by step. A dry run will highlight any errors in the logic of the solution.

10

Trace Tables

Created during a dry run, containing all data items and outputs used in the section of code being reviewed. The value of each data item is documented after each line of the solution is executed.

11

System Testing

Carried out on a completed and fully integrated system to ensure correct outputs are produced in compliance with the user requirements document.

12

Beta Testing

Carried out just after alpha testing and before the final version of the application is released commercially. Selected members of the target audience will test the system for errors.

13

Alpha Testing

Involves simulating the real-world environment the application has been designed for, normally carried out by a small number of users who have not been involved in the development. Alpha testing takes place before beta testing.

14

A/B Testing

End users are presented with different versions of a digital application; statistical analysis is carried out to determine which is most successful.
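A sketch of the comparison behind A/B testing. The visitor and click counts below are invented figures for illustration, and "most successful" is measured here simply as the higher click rate; a real analysis would also test whether the difference is statistically significant.

```python
# Hypothetical results: visitors shown each version, and how many clicked.
version_a = {"shown": 1000, "clicked": 120}
version_b = {"shown": 1000, "clicked": 150}

def success_rate(results):
    """Fraction of users who clicked in this version of the application."""
    return results["clicked"] / results["shown"]

rate_a = success_rate(version_a)   # 0.12
rate_b = success_rate(version_b)   # 0.15

winner = "A" if rate_a > rate_b else "B"
print(f"Version {winner} is more successful ({rate_a:.0%} vs {rate_b:.0%})")
```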

15

Robustness

A measure of the system's ability to continue to run when high volumes of valid, exceptional or invalid data are entered.

16

Evaluation

A document which considers the success of a project in relation to how complete the solution is, how efficient it is, how well it meets the end user's requirements and how well it operates on specified platforms.

17

Robust System

A system can be considered to be robust if it does not crash when processing high levels of valid, invalid or exceptional data.

18

Qualitative User Requirements

Relate to the quality of the solution and may be subjectively assessed, i.e. not everyone will assess them in the same way.

19

Quantitative User Requirements

Requirements which can be easily measured, for example in terms of time.