Concurrency


46 Terms

1
New cards
  • Machine instruction level

  • High-level language statement level

  • Unit level

  • Program level

Concurrency can occur at four levels:

2
New cards

language issues in instruction- and program-level concurrency

Because there are no —, they are not addressed here

3
New cards
  • input and output operations

Multiprocessor Architectures

  • Late 1950s - one general-purpose processor and one or more special-purpose processors for —

4
New cards
  • program-level concurrency

Multiprocessor Architectures

  • Early 1960s - multiple complete processors, used for —

5
New cards
  • instruction-level concurrency

Multiprocessor Architectures

  • Mid-1960s - multiple partial processors, used for —

6
New cards
  • Single-Instruction Multiple-Data

  • Multiple-Instruction Multiple-Data

Multiprocessor Architectures

  • SIMD machines

  • MIMD machines

7
New cards

Physical concurrency

Categories of Concurrency:

  • Multiple independent processors (multiple threads of control)

8
New cards

Logical concurrency

Categories of Concurrency:

  • The appearance of physical concurrency is presented by timesharing one processor (software can be designed as if there were multiple threads of control)

9
New cards

thread of control

— in a program is the sequence of program points reached as control flows through the program

10
New cards
  • physical concurrency

  • concurrent execution

Motivations for the Use of Concurrency

  • Multiprocessor computers capable of — are now widely used

  • Even if a machine has just one processor, a program written to use — can be faster than the same program written for nonconcurrent execution

11
New cards
  • designing software

  • locally or over a network

Motivations for the Use of Concurrency

  • Involves a different way of — that can be very useful; many real-world situations involve concurrency

  • Many program applications are now spread over multiple machines, either —

12
New cards

task or process or thread

Introduction to Subprogram-Level Concurrency

  • A — is a program unit that can be in concurrent execution with other program units

  • tasks usually work together

13
New cards
  • implicitly started

  • suspended

  • not return to the caller

Introduction to Subprogram-Level Concurrency

  • Tasks differ from ordinary subprograms in that:

    • A task may be —

    • When a program unit starts the execution of a task, it is not necessarily —

    • When a task’s execution is completed, control may —

14
New cards

Heavyweight tasks

Two General Categories of Tasks

  • — execute in their own address space

15
New cards

Lightweight tasks

Two General Categories of Tasks

  • — all run in the same address space (more efficient)
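A minimal sketch of a lightweight task, assuming Python threads: the spawned thread reads and writes the spawning program's data directly, because all threads of one process share a single address space (the variable name is illustrative).

```python
import threading

shared = {"count": 0}  # lives in the one address space all threads share

def bump():
    # A lightweight task sees the caller's data directly: no copying,
    # no inter-process communication needed.
    shared["count"] += 1

t = threading.Thread(target=bump)  # a lightweight task (thread)
t.start()
t.join()
print(shared["count"])  # 1 — the thread mutated the shared data in place
```

A heavyweight task, by contrast, would run in its own address space (e.g., a separate process) and would need explicit communication to see this data.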

16
New cards

disjoint

A task is — if it does not communicate with or affect the execution of any other task in the program in any way

17
New cards

Task Synchronization

A mechanism that controls the order in which tasks execute

18
New cards
  • Cooperation synchronization

  • Competition synchronization

Two kinds of task synchronization

19
New cards
  • Cooperation

  • producer-consumer problem

Kinds of synchronization

  • —: Task A must wait for task B to complete some specific activity before task A can continue its execution

  • e.g., the —
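The producer-consumer problem above can be sketched with Python's thread-safe `queue.Queue`, whose blocking `put`/`get` supply the cooperation synchronization: the consumer waits for the producer to deposit an item, and the producer waits when the bounded buffer is full. Buffer size, item values, and the `None` sentinel are illustrative assumptions.

```python
import queue
import threading

buf = queue.Queue(maxsize=4)  # bounded buffer shared by both tasks
consumed = []

def producer():
    for item in range(8):
        buf.put(item)     # blocks while the buffer is full
    buf.put(None)         # sentinel: tell the consumer to stop

def consumer():
    while True:
        item = buf.get()  # blocks while the buffer is empty
        if item is None:
            break
        consumed.append(item)

p = threading.Thread(target=producer)
c = threading.Thread(target=consumer)
p.start(); c.start()
p.join(); c.join()
print(consumed)  # [0, 1, 2, 3, 4, 5, 6, 7]
```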

20
New cards
  • Competition

  • shared counter

  • mutually exclusive access

Kinds of synchronization

  • —: Two or more tasks must use some resource that cannot be simultaneously used

  • e.g., a —

  • is usually provided by —

21
New cards

Depending on order, there could be four different results

Need for Competition Synchronization:

Task A: TOTAL = TOTAL + 1

Task B: TOTAL = 2 * TOTAL
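One way to see the four results is to enumerate the interleavings directly: each task is a fetch of TOTAL followed by a store, and the outcome depends on which fetches and stores interleave. The initial value of TOTAL (3 here) is an assumption for illustration.

```python
def outcomes(initial):
    """Enumerate the results of interleaving Task A (TOTAL = TOTAL + 1)
    and Task B (TOTAL = 2 * TOTAL), each being a fetch then a store."""
    results = set()
    results.add(2 * (initial + 1))  # A runs entirely before B
    results.add(2 * initial + 1)    # B runs entirely before A
    results.add(initial + 1)        # A fetches, B fetches+stores, A stores a stale sum
    results.add(2 * initial)        # B fetches, A fetches+stores, B stores a stale product
    return sorted(results)

print(outcomes(3))  # [4, 6, 7, 8] — four different results, depending on order
```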

22
New cards

Scheduler

Providing synchronization requires a mechanism for delaying task execution

23
New cards

scheduler

Task execution control is maintained by a program called the —, which maps task execution onto available processors

24
New cards
  • new

  • ready

  • running

  • blocked

  • dead

5 Task Execution States

25
New cards

New

Task Execution States

— created but not yet started

26
New cards

Ready

Task Execution States

— to run but not currently running (no available processor)

27
New cards

Blocked

Task Execution States

— has been running, but cannot now continue (usually waiting for some event to occur)

28
New cards

Dead

Task Execution States

— no longer active in any sense

29
New cards

Task Execution States

Task —

30
New cards
  • Liveness

  • complete its execution

  • — is a characteristic that a program unit may or may not have

  • In sequential code, it means the unit will eventually —

31
New cards

deadlock

In a concurrent environment, a task can easily lose its liveness

If all tasks in a concurrent environment lose their liveness, it is called —

32
New cards
  • Semaphores

  • Monitors

  • Message Passing

3 Methods of Providing Synchronization

33
New cards

semaphore

a data structure consisting of a counter and a queue for storing task descriptors

34
New cards

task descriptor

a data structure that stores all of the relevant information about the execution state of the task

35
New cards

guards on the code

Semaphores can be used to implement — that accesses shared data structures

36
New cards

wait and release (or signal)

Semaphores have only two operations

37
New cards

competition and cooperation synchronization

Semaphores can be used to provide both
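A sketch of competition synchronization with a binary semaphore, using Python's `threading.Semaphore` (its `acquire`/`release` play the roles of wait and release/signal). The semaphore guards the code that accesses the shared counter; the thread and iteration counts are illustrative.

```python
import threading

guard = threading.Semaphore(1)  # counter starts at 1: a binary semaphore
total = 0

def add_many(n):
    global total
    for _ in range(n):
        guard.acquire()   # wait: decrement the counter, or block at zero
        total += 1        # critical section on the shared data
        guard.release()   # release/signal: increment, wake one waiting task

threads = [threading.Thread(target=add_many, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(total)  # 40000 — the guard made the updates mutually exclusive
```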

38
New cards

encapsulate

Monitor
The idea: — the shared data and its operations to restrict access

39
New cards

monitor

an abstract data type for shared data
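A monitor can be sketched as a class that encapsulates the shared data together with its operations, with every public operation taking an internal lock so at most one task is active inside the monitor at a time. `CounterMonitor` is a hypothetical example, not a library type.

```python
import threading

class CounterMonitor:
    """Monitor: shared data reachable only through mutually exclusive operations."""

    def __init__(self):
        self._lock = threading.Lock()  # the monitor's implicit mutual exclusion
        self._value = 0                # shared data hidden inside the monitor

    def increment(self):
        with self._lock:               # only one task inside at a time
            self._value += 1

    def value(self):
        with self._lock:
            return self._value

m = CounterMonitor()
threads = [threading.Thread(target=lambda: [m.increment() for _ in range(1000)])
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(m.value())  # 4000
```

Because callers cannot touch `_value` except through these methods, the competition synchronization is built in rather than left to each caller, as with semaphores.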

40
New cards

Message passing

a general model for concurrency

41
New cards
  • semaphores and monitors

  • competition synchronization

Message passing

  • It can model both —

  • It is not just for —

42
New cards

task communication

Message passing

Central idea: — is like seeing a doctor: most of the time she waits for you or you wait for her, but when you are both ready, you get together, or rendezvous
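The doctor/patient rendezvous can be sketched with message queues, assuming Python threads: the patient sends a request and then blocks until the doctor replies, so each exchange only completes when both tasks are ready. The names, messages, and `None` shutdown sentinel are illustrative.

```python
import queue
import threading

requests = queue.Queue()  # messages sent to the doctor task
replies = queue.Queue()   # messages sent back to the patient

def doctor():
    while True:
        msg = requests.get()          # wait until a patient arrives
        if msg is None:
            break
        replies.put(f"treated: {msg}")  # reply completes the rendezvous

out = []
d = threading.Thread(target=doctor)
d.start()
for ailment in ["cough", "fever"]:
    requests.put(ailment)   # send a message to the doctor...
    out.append(replies.get())  # ...and wait here until she answers
requests.put(None)          # tell the doctor task to finish
d.join()
print(out)  # ['treated: cough', 'treated: fever']
```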

43
New cards

Concurrent execution

Summary

— can be at the instruction, statement, or subprogram level

44
New cards

Physical concurrency

Summary

— occurs when multiple processors are used to execute concurrent units

45
New cards

Logical concurrency

Summary

— occurs when concurrent units are executed on a single processor

46
New cards

semaphores, monitors, rendezvous, threads

Summary

4 Mechanisms: —