These flashcards cover key concepts, definitions, and principles of parallel and distributed computing, providing a comprehensive study aid.
What is parallel computing?
The simultaneous use of multiple compute resources to solve a computational problem.
Why do we use parallel computing?
To save time and/or money, solve larger/more complex problems, provide concurrency, and better utilize underlying parallel hardware.
What distinguishes distributed computing from parallel computing?
Distributed computing leverages multiple independent computers that are often geographically dispersed, whereas parallel computing focuses on a single machine with tightly coupled processors.
What is Client-Server Architecture?
A model where clients request services or resources from centralized servers, commonly used in web applications.
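The request/response pattern can be sketched with Python's standard socket module; the echo-style service and payload below are hypothetical, and a real server would add framing, timeouts, and error handling:

```python
# Minimal client-server sketch: one server thread on localhost answers a
# single client's request, illustrating the centralized-server model.
import socket
import threading

def run_server(server_sock):
    # Serve one client: read its request, send back a response.
    conn, _ = server_sock.accept()
    with conn:
        request = conn.recv(1024)
        conn.sendall(b"served: " + request)

# Bind to an ephemeral port on localhost so the example is self-contained.
server_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_sock.bind(("127.0.0.1", 0))
server_sock.listen(1)
port = server_sock.getsockname()[1]
threading.Thread(target=run_server, args=(server_sock,)).start()

# The client requests a resource from the centralized server.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"GET /resource")
    reply = client.recv(1024)
print(reply.decode())
```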
What is Peer to Peer (P2P) architecture?
A network architecture where nodes act as both clients and servers, sharing resources directly without centralized coordination.
What is cluster computing?
Connecting multiple computers (nodes) within a single location to work together as a unified system, typically for scientific research and data analysis.
What is data parallelism?
A form of parallelism where the same operation is performed on multiple pieces of data concurrently.
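A minimal sketch in Python: the same operation (squaring) is applied to every item of a collection concurrently. A thread pool is used here for brevity; CPU-bound work in CPython would typically use processes instead.

```python
# Data parallelism: one operation, many data items, executed concurrently.
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

data = [1, 2, 3, 4, 5]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, data))
print(results)  # [1, 4, 9, 16, 25]
```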
What is task parallelism?
Executing different tasks simultaneously across processors; the tasks may be independent of one another or linked by dependencies.
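In contrast to data parallelism, a task-parallel sketch runs two *different* independent tasks at the same time; the tasks here (an aggregation and a search) are illustrative:

```python
# Task parallelism: distinct functions submitted to run concurrently.
from concurrent.futures import ThreadPoolExecutor

def summarize(numbers):      # task 1: aggregate
    return sum(numbers)

def find_longest(words):     # task 2: search
    return max(words, key=len)

with ThreadPoolExecutor(max_workers=2) as pool:
    total_future = pool.submit(summarize, [1, 2, 3])
    longest_future = pool.submit(find_longest, ["ant", "bison", "cat"])
    total, longest = total_future.result(), longest_future.result()
print(total, longest)  # 6 bison
```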
What is Flynn's taxonomy?
A classification system for parallel computer architectures based on the number of instruction streams and data streams, yielding four classes: SISD, SIMD, MISD, and MIMD.
What is Amdahl's Law?
A formula that predicts the maximum speedup achievable by parallelizing a fixed-size problem, showing that speedup is ultimately limited by the program's serial fraction.
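The law itself: with parallelizable fraction p of the work and n processors, speedup is S(n) = 1 / ((1 - p) + p / n). A quick calculation shows how the serial fraction caps the gain:

```python
# Amdahl's Law: fixed-size problem, parallelizable fraction p, n processors.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A 95%-parallel program on 8 processors gains under 6x, and even with
# unlimited processors it is capped at 1 / 0.05 = 20x.
print(round(amdahl_speedup(0.95, 8), 2))  # 5.93
```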
What is Gustafson's Law?
A law that considers how the problem size can increase with the number of processors, highlighting scalability.
What is the CAP theorem?
It states that a distributed system cannot simultaneously guarantee all three of Consistency, Availability, and Partition tolerance; when a network partition occurs, the system must trade consistency against availability.
What is meant by strong consistency?
A model that ensures all nodes see the same data at the same time, providing a single, up-to-date view of data.
What are the challenges in synchronization in distributed systems?
Deadlock prevention, network latency, and the complexity of ensuring consistent states across nodes.
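One common deadlock-prevention technique is to acquire locks in a fixed global order, so no circular wait can form. The sketch below uses local threads for simplicity; distributed systems face the same hazard across nodes and, because of network latency, often rely on lock managers or timeouts instead:

```python
# Deadlock prevention by lock ordering: both workers take lock_a before
# lock_b, so a circular wait between them is impossible.
import threading

lock_a, lock_b = threading.Lock(), threading.Lock()
shared = []

def worker(name):
    with lock_a:
        with lock_b:
            shared.append(name)

threads = [threading.Thread(target=worker, args=(n,)) for n in ("t1", "t2")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(shared))  # ['t1', 't2']
```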
What is a message-passing interface (MPI)?
A standardized, portable API for message-passing parallel programming, letting processes on the same or different machines exchange data through explicit send and receive operations.
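The send/receive pattern that MPI standardizes can be mimicked with Python's standard queue module; this is only an analogy of the model, not the MPI API (real MPI code would use a binding such as mpi4py and launch multiple processes under mpiexec):

```python
# Message-passing sketch: each worker "rank" sends one message to a shared
# channel, and a receiver gathers one message per rank.
import threading
import queue

inbox = queue.Queue()   # plays the role of a communication channel

def worker(rank):
    # Analogous to an MPI send: post (rank, payload) to the channel.
    inbox.put((rank, rank * rank))

threads = [threading.Thread(target=worker, args=(r,)) for r in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Analogous to receiving in a loop: collect one message per rank.
received = dict(inbox.get() for _ in range(4))
print(received)  # ranks 0-3 mapped to their squares
```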
What is the importance of fault tolerance in distributed systems?
Ensures continued operation despite failures, crucial for maintaining system reliability, availability, and data integrity.