These flashcards cover the key concepts and definitions related to distributed systems, parallel computing, and scalability from the lecture notes.
Distributed System
A collection of independent computers (nodes) that appear to users as a single coherent system.
Parallel System
A computing system where multiple processors or CPU cores execute computations simultaneously.
Concurrent System
A system where multiple processes execute within overlapping time periods, possibly interacting with each other.
Scalability
The ability of a system to handle increased load by adding resources without negatively impacting performance.
Latency
The time delay between a client issuing a request and receiving a response in a distributed system.
Fault Tolerance
The ability of a system to continue operating correctly even when some of its components fail.
CAP Theorem
A fundamental principle stating that a distributed data store can guarantee at most two of the following three properties at the same time: Consistency, Availability, and Partition Tolerance. Because network partitions cannot be prevented, a system must in practice trade off consistency against availability when a partition occurs.
Concurrency vs Parallelism
Concurrency means tasks overlap in time (their execution may be interleaved on one processor), while parallelism means tasks literally run at the same instant on multiple processors or cores.
Load Balancing
The practice of distributing incoming network traffic or computational workload across multiple backend servers to prevent bottlenecks.
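A minimal sketch of one common load-balancing strategy, round-robin, which cycles through backends so each receives an equal share of requests. The server addresses here are hypothetical placeholders.

```python
import itertools

# Hypothetical backend addresses, for illustration only.
servers = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]

# Round-robin: hand out backends in a repeating cycle.
rotation = itertools.cycle(servers)

def next_server():
    """Return the backend that should handle the next request."""
    return next(rotation)

# Six incoming requests spread evenly: two per backend.
assigned = [next_server() for _ in range(6)]
```

Real load balancers layer health checks and weighting on top of a base strategy like this.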
Client-Server Systems
A distributed system model where clients send requests and servers provide services.
Peer-to-Peer (P2P) Systems
Distributed network architecture where each node acts as both a client and a server.
Publish-Subscribe (Pub/Sub) Systems
A messaging pattern where publishers send events and subscribers receive them via a broker.
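The broker's role can be sketched with a toy in-memory implementation: publishers and subscribers never reference each other, only a shared topic name. The class and topic names are invented for illustration.

```python
from collections import defaultdict

class Broker:
    """Toy in-memory pub/sub broker: routes each published
    event to every callback subscribed to its topic."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, event):
        for callback in self._subs[topic]:
            callback(event)

broker = Broker()
received = []
broker.subscribe("orders", received.append)
broker.publish("orders", {"id": 1})   # delivered to the subscriber
broker.publish("stock", {"id": 2})    # no subscribers: silently dropped
```

Production systems such as Kafka or Redis Pub/Sub add persistence, delivery guarantees, and network transport, but the decoupling shown here is the core of the pattern.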
Cluster Computing
A type of distributed system where a group of machines work together as a single powerful system.
Cloud Computing
A model for delivering computing resources over the internet, typically using centralized virtualized infrastructure.
Edge Computing
Computing that is performed at or near the data source, aimed at reducing latency.