Parallel and Distributed Computing Concepts

These flashcards cover key concepts, definitions, and principles of parallel and distributed computing, providing a comprehensive study aid.

16 Terms

1

What is parallel computing?

The simultaneous use of multiple compute resources to solve a computational problem.

2

Why do we use parallel computing?

To save time and/or money, solve larger/more complex problems, provide concurrency, and better utilize underlying parallel hardware.

3

What distinguishes distributed computing from parallel computing?

Distributed computing coordinates multiple independent computers, often geographically dispersed, that communicate over a network, whereas parallel computing typically uses tightly coupled processors within a single machine, often sharing memory.

4

What is Client-Server Architecture?

A model where clients request services or resources from centralized servers, commonly used in web applications.
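
A minimal sketch of the request/response pattern using Python's standard socket module; the host, port, message contents, and echo behavior are illustrative assumptions, not part of any particular application:

import socket
import sys

def run_server(host="127.0.0.1", port=5000):  # hypothetical address and port
    # Server: waits for a client request and sends back a response (echo)
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()                 # accept one client connection
        with conn:
            request = conn.recv(1024)          # client's request
            conn.sendall(b"echo: " + request)  # server's response

def run_client(host="127.0.0.1", port=5000):
    # Client: requests a service from the centralized server
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((host, port))
        cli.sendall(b"hello server")
        print(cli.recv(1024))                  # prints b'echo: hello server'

if __name__ == "__main__":
    # Run once with the argument "server" in one terminal, then without it in another.
    if len(sys.argv) > 1 and sys.argv[1] == "server":
        run_server()
    else:
        run_client()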

5

What is Peer to Peer (P2P) architecture?

A network architecture where nodes act as both clients and servers, sharing resources directly without centralized coordination.

6

What is cluster computing?

Connecting multiple computers (nodes) within a single location to work together as a unified system, typically for scientific research and data analysis.

7

What is data parallelism?

A form of parallelism where the same operation is performed on multiple pieces of data concurrently.
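
A minimal sketch of data parallelism using Python's multiprocessing.Pool: the same operation (a hypothetical square function) is applied to different pieces of the data by separate worker processes:

from multiprocessing import Pool

def square(x):
    return x * x  # the same operation is applied to every data element

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # the input data is split across the workers; each applies square()
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, ..., 81]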

8

What is task parallelism?

Executing different tasks, which may be independent of one another or have dependencies, at the same time in a parallel computing environment.
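
A minimal sketch of task parallelism with Python's concurrent.futures: two different, independent tasks (the function names and workloads are illustrative) run concurrently:

from concurrent.futures import ThreadPoolExecutor

def summarize_numbers():
    return sum(range(1_000_000))  # one kind of work

def build_report():
    return "report ready"         # a different kind of work

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=2) as pool:
        sum_future = pool.submit(summarize_numbers)     # task 1
        report_future = pool.submit(build_report)       # task 2, runs alongside task 1
        print(sum_future.result(), report_future.result())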

9

What is Flynn's taxonomy?

A classification of parallel computer architectures based on the number of concurrent instruction streams and data streams, yielding four classes: SISD, SIMD, MISD, and MIMD.

10

What is Amdahl's Law?

A formula for the maximum speedup achievable by parallelizing a fixed-size problem; the speedup is limited by the fraction of the program that must remain serial.
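
In the usual formulation, if p is the fraction of the program that can be parallelized and N is the number of processors, the speedup is

S(N) = 1 / ((1 - p) + p / N)

For example, with p = 0.9 and N = 8, S = 1 / (0.1 + 0.1125) ≈ 4.7, and even as N grows without bound the speedup is capped at 1 / (1 - p) = 10.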

11

What is Gustafson's Law?

A law observing that the problem size can grow with the number of processors, so the achievable (scaled) speedup keeps increasing; it highlights scalability for large workloads.
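
In the usual formulation, if p is the parallel fraction of the scaled workload and N is the number of processors, the scaled speedup is

S(N) = (1 - p) + p * N    (equivalently N - (1 - p)(N - 1))

For example, with p = 0.9 and N = 8, S = 0.1 + 7.2 = 7.3, so the speedup keeps growing with N as long as the problem size grows too.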

12

What is the CAP theorem?

It states that a distributed system can provide at most two of the following three guarantees: Consistency, Availability, and Partition tolerance.

13

What is meant by strong consistency?

A model that ensures all nodes see the same data at the same time, providing a single, up-to-date view of data.

14

What are the challenges in synchronization in distributed systems?

Deadlock prevention, network latency, and the complexity of ensuring consistent states across nodes.

15

What is a message-passing interface (MPI)?

A standardized, portable specification for message-passing parallel programming that lets separate processes, often on different nodes, communicate by sending and receiving messages.
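
A minimal point-to-point sketch using mpi4py, a common Python binding for MPI (assumes mpi4py and an MPI runtime are installed; the message text is illustrative). Rank 0 sends a message that rank 1 receives:

from mpi4py import MPI

comm = MPI.COMM_WORLD      # communicator containing all launched processes
rank = comm.Get_rank()     # this process's id within the communicator

if rank == 0:
    comm.send("hello from rank 0", dest=1, tag=0)   # point-to-point send
elif rank == 1:
    msg = comm.recv(source=0, tag=0)                # matching receive
    print("rank 1 got:", msg)

Launched with something like: mpiexec -n 2 python hello_mpi.py (the file name is illustrative).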

16

What is the importance of fault tolerance in distributed systems?

Ensures continued operation despite failures, crucial for maintaining system reliability, availability, and data integrity.