2.1.1 Thinking abstractly

68 Terms

1
New cards

define abstraction

the process of removing unnecessary details from a problem

2
New cards

representational abstraction

analysing what is relevant and simplifying a problem based on this information

3
New cards

Abstraction by generalisation

grouping together similarities within a problem to identify what kind of problem it is

4
New cards

why do abstraction by generalisation

allows certain problems to be categorised as being of a particular type so a common solution can be used to solve these problems

5
New cards

Data abstraction

details about how data is being stored are hidden. Programmers can make use of abstract data structures without concerning themselves about how they are implemented.
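A minimal Python sketch of data abstraction (class and method names are hypothetical): the caller uses push and pop without needing to know that a list stores the items internally.

```python
class Stack:
    def __init__(self):
        self._items = []          # implementation detail, hidden from the caller

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()  # last in, first out

    def is_empty(self):
        return len(self._items) == 0

s = Stack()
s.push(3)
s.push(7)
print(s.pop())   # 7 - the caller never touches the underlying list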

6
New cards

Procedural abstraction

Programmers can perform functions without having knowledge of the code used to implement this functionality. Once a procedure has been coded, it can be reused as a black box
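For illustration, a hypothetical procedure used as a black box: callers rely on what it does, not how it does it.

```python
def average(numbers):
    """Return the mean of a non-empty list of numbers."""
    return sum(numbers) / len(numbers)

# Reused without re-reading (or re-writing) its implementation:
print(average([4, 8, 15]))   # 9.0
print(average([2.5, 3.5]))   # 3.0
```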

7
New cards

Levels of abstraction

the highest levels of abstraction are closest to the user and are usually responsible for providing an interface for the user to interact with hardware. The lowest levels of abstraction are responsible for actually performing these tasks through the execution of machine code

8
New cards

Pros of abstraction

allows non-experts to make use of a range of systems or models by hiding information that is too complex; more efficient design as programmers can focus on the key elements; less time spent coding

9
New cards

why are levels of abstraction used

to handle large, complex problems by separating tasks - higher levels deal with user interaction while low levels execute machine code

10
New cards

what is meant by layers of abstraction in programming languages

the distinction between low-level and high-level languages - each layer hides complexity from the one above it

11
New cards

low level vs high level languages

low-level languages are closer to machine code, harder to write and require hardware knowledge; high-level languages are easier to learn, closer to natural language and hide machine-level detail

12
New cards

how does the TCP/IP stack use abstraction

it divides network communication into four layers, each handling a different function and hiding details from the others

13
New cards

why are standards important in layered models like TCP/IP?

they ensure compatibility and allow layers to work independently while following agreed protocols

14
New cards

abstraction vs reality 

abstraction simplifies real-world entities into computational models like variables or objects 

15
New cards

how does OOP use abstraction

objects represent real-world entities, attributes model their characteristics and methods model their actions
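A hypothetical example of a real-world entity abstracted into an object, with attributes for characteristics and methods for actions:

```python
class BankAccount:
    def __init__(self, owner, balance=0):
        self.owner = owner          # attribute modelling a characteristic
        self.balance = balance

    def deposit(self, amount):      # method modelling an action
        self.balance += amount

    def withdraw(self, amount):
        if amount <= self.balance:
            self.balance -= amount

account = BankAccount("Aisha", 100)
account.deposit(50)
print(account.balance)   # 150
```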

16
New cards

what does designing a solution involve

thinking ahead about how different components of a problem will be handled in the best way possible

17
New cards

why is thinking ahead important 

it helps developers anticipate potential issues and make programs easy and intuitive to use

18
New cards

what are the three main parts of any computational problem

inputs, processing and outputs
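A minimal sketch showing all three parts in one short program (the VAT calculation is a hypothetical example):

```python
price = float(input("Enter the price: "))      # input
total = price * 1.2                            # processing: add 20% VAT
print(f"Price including VAT: {total:.2f}")     # output
```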

19
New cards

what are the inputs in a computational problem

data entered into a system by the user that is required to solve a problem

20
New cards

what are outputs in a computational problem 

the results produced after the inputs have been processed

21
New cards

what must designers decide when handling input and output

suitable data types, data structures and methods to capture and present the solution

22
New cards

what order should designers consider inputs and outputs

start by identifying required outputs, then determine the inputs and processing needed to achieve them

23
New cards

what are preconditions 

requirements that must be met before a program or subroutine executes

24
New cards

what happens if preconditions aren't met

the program may fail to run or return invalid results

25
New cards

why are preconditions useful

they let a subroutine safely assume that input arguments meet certain criteria
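A minimal sketch, assuming a precondition stated in the subroutine's documentation and checked before the calculation, so the body can safely assume a valid argument (the function is hypothetical):

```python
def square_root(x):
    """Return the square root of x.

    Precondition: x must be a non-negative number.
    """
    assert x >= 0, "precondition violated: x must be >= 0"
    return x ** 0.5

print(square_root(16))   # 4.0
# square_root(-4) would stop with an AssertionError rather than
# returning an invalid result.
```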

26
New cards

where can preconditions be defined 

within the code itself or in the documentation accompanying the program

27
New cards

what does including preconditions in the documentation achieve

reduces program length and complexity, and saves debugging and maintenance time

28
New cards

why do preconditions promote reusability

they ensure necessary checks are made before execution so the subroutine can be safely reused

29
New cards

what is caching

storing recently used instructions or data in cache memory so they can be accessed faster later 

30
New cards

why is caching useful

it avoids repeatedly fetching data from slower secondary storage, saving time and improving performance

31
New cards

eg of caching

web browsers cache frequently visited pages to speed up loading and reduce bandwidth usage
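A minimal caching sketch (the function names and URL are hypothetical): results are stored in a dictionary the first time they are produced, so repeat requests are served without going back to the slow source.

```python
cache = {}

def slow_download(url):
    # stands in for a slow network request or secondary-storage read
    return f"<html>contents of {url}</html>"

def get_page(url):
    if url in cache:              # cache hit: no slow fetch needed
        return cache[url]
    page = slow_download(url)     # cache miss: fetch from the slow source
    cache[url] = page             # store the result for next time
    return page

print(get_page("https://example.com"))   # fetched, then cached
print(get_page("https://example.com"))   # served straight from the cache
```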

32
New cards

what is prefetching

a technique where algorithms predict and preload data or instructions likely to be needed soon

33
New cards

what limits the effectiveness of prefetching and caching

the accuracy of prediction algorithms and the limited size and search speed of cache memory

34
New cards

main advantage of caching and prefetching

significantly improved performance if implemented correctly

35
New cards

what are reusable program components 

pre-written, tested pieces of code that can be used in multiple programs 

36
New cards

where are reusable components stored

in libraries

37
New cards

why is decomposition useful when designing software

it breaks the problem into smaller tasks, making it easier to identify reusable components

38
New cards

eg of reusable components 

stacks, queues, classes, subroutines 
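For illustration, reusable components from Python's standard library, already written and tested, so a queue or a stack can be used without implementing either from scratch:

```python
from collections import deque

queue = deque()
queue.append("job1")        # enqueue
queue.append("job2")
print(queue.popleft())      # dequeue -> "job1"

stack = []
stack.append("page1")       # push
stack.append("page2")
print(stack.pop())          # pop -> "page2"
```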

39
New cards

pros of reusable components

saves time, money and resources since they're already tested and reliable; can be reused in future projects, reducing development costs

40
New cards

limitation of reusable components

compatibility issues may arise, which may require modification work that is more costly than developing components in-house

41
New cards

why is thinking procedurally useful

it simplifies the program design, allows easier testing and debugging, enables teamwork by dividing tasks, and makes it easier to modify and reuse components later

42
New cards

what is the first stage of thinking procedurally 

problem decomposition - taking the problem defined by the user and breaking it down into smaller subproblems that are easier to solve

43
New cards

define problem decomposition

the process of splitting a large, complex problem into smaller, more manageable subproblems, each of which can be solved separately

44
New cards

pros of problem decomposition

makes the overall problem easier to manage, allows different people to work on different parts according to their skills, and supports modular, testable program design

45
New cards

top-down design 

involves starting with the overall problem and progressively breaking it into smaller and smaller subproblems until each can be implemented as a single, specific task

46
New cards

pros of top down design

It helps structure complex problems into logical levels, where higher levels show the overall structure, and lower levels define detailed tasks. This allows clearer design, independent development, and easier testing.

47
New cards

aim of top-down design

to keep decomposing the problem until each subproblem can be represented as a single, self-contained task or subroutine that can be coded and tested individually
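A hypothetical skeleton of a top-down design: the overall problem is a short high-level routine, and each subproblem becomes its own subroutine that can be coded and tested individually.

```python
def get_scores():
    return [int(s) for s in input("Scores, comma separated: ").split(",")]

def calculate_average(scores):
    return sum(scores) / len(scores)

def display_result(average):
    print(f"Average score: {average:.1f}")

def main():                       # top level: shows the overall structure
    scores = get_scores()
    average = calculate_average(scores)
    display_result(average)

main()
```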

48
New cards

what happens after all the components are identified

They are combined and integrated into a complete working program, ensuring that each subroutine interacts correctly with others

49
New cards

Why is the order of operations important when combining subroutines?

Some subroutines depend on data from others. Executing them in the wrong order can cause errors or illogical results, so programmers must plan the correct sequence of operations

50
New cards

when can subroutines be executed simultaneously

if they don't depend on the same data or inputs, multiple subroutines can be executed simultaneously (parallel execution)

51
New cards

summarise thinking procedurally 

  • Problem decomposition → split the problem into subproblems

  • Top-down design → refine subproblems into simple tasks

  • Identify components → decide how to implement each part

  • Combine solution → integrate subroutines into a full program

  • Determine execution order → arrange subroutines logically and efficiently

52
New cards

define a decision

The result reached after some consideration

53
New cards

why is decision making important

it allows programmers to choose the most effective approach for solving a problem

54
New cards

eg of making decisions in software development

choosing the programming paradigm, choosing input and output devices, deciding how to collect and process data

55
New cards

how to simplify the decision making process

by narrowing down possible options based on feasibility, functionality and familiarity - eg choosing a programming language that suits the problem

56
New cards

why is it important to identify where decisions need to be made

so enough information can be gathered about the available options, allowing informed decisions that lead to more effective and efficient solutions

57
New cards

main factors influencing the outcome of decisions

effectiveness, convenience, feasibility 

58
New cards

how to evaluate factors when making decisions

rank the factors by importance; this prioritisation helps determine the most suitable option

59
New cards

what is the effect of decisions on a program’s flow

they determine the route the program follows. Depending on the user's input or internal logic, different sections of the code may execute, leading to varied outcomes
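A minimal sketch of a decision directing program flow (the grade boundaries are hypothetical): different input values cause different branches to execute.

```python
mark = int(input("Enter the mark: "))

if mark >= 70:
    print("Distinction")
elif mark >= 40:
    print("Pass")
else:
    print("Fail")
```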

60
New cards

thinking logically meaning 

identifying where and when decisions must be made, planning all possible outcomes and structuring the program to handle each scenario appropriately

61
New cards

summarise decision making

  • Identify where decisions need to be made

  • Gather information and options

  • Evaluate options based on effectiveness, convenience, and feasibility

  • Prioritise key factors

  • Choose and implement the best solution

  • Reflect on the decision’s outcome and adjust if necessary

62
New cards

what is concurrent processing 

completing more than one task at a given time. This doesn't mean that multiple tasks have to be done at once; it means different tasks are given slices of processor time to give the illusion that tasks are being performed simultaneously
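A minimal concurrency sketch using Python threads (the task names are hypothetical): the two tasks are interleaved by the scheduler, so neither has to wait for the other to finish completely before making progress.

```python
import threading
import time

def task(name):
    for i in range(3):
        print(f"{name}: step {i}")
        time.sleep(0.1)          # while one task waits, the other can run

t1 = threading.Thread(target=task, args=("download",))
t2 = threading.Thread(target=task, args=("render",))
t1.start()
t2.start()
t1.join()
t2.join()
```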

63
New cards

what is concurrent thinking

the mindset that allows you to spot patterns and parts of problems where concurrency can be applied

64
New cards

how to determine the parts of the problem that can be solved concurrently

first assess which parts of the problem are related, and these can often be dealt with concurrently

65
New cards

difference between concurrent processing and concurrent thinking 

concurrent processing uses a computer processor while thinking uses your brain 

66
New cards

concurrent processing vs parallel processing

parallel processing uses multiple processors to complete more than one task simultaneously, but concurrent processing is when each task is given a slice of processor time to make it look like tasks are being completed simultaneously when in reality they are executed sequentially

67
New cards

pros of concurrent processing

the number of tasks completed in a given time is increased; less time is wasted waiting for an input or user interaction as other tasks can be completed

68
New cards

cons of concurrent processing 

it can take longer to complete when large numbers of users or tasks are involved, as processes cannot all be completed at once; there is an overhead in coordinating and switching between processes, which reduces program throughput; just as with parallel processing, not all tasks are suited to being broken up and performed concurrently