Big Idea 4: 4.3 - Parallel and Distributed Computing
A computer needs to handle many tasks as it operates:
System tasks: the operating system has tasks like scheduling (what to do next), managing hardware, working with network, etc.
User tasks: executing programs that the user has selected, such as running computer games
Tasks need to be scheduled by the operating system, which balances them so that all CPUs are used evenly and fully. Tasks can be run sequentially, in parallel, or distributed to other computers.
Sequential:
Tasks are done one after another, in order.
Why?
Limited hardware. (ex. one CPU)
Tasks are dependent. (B depends on A. C depends on B.)
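The dependent-task case can be sketched in Python. The task names and values here are made up for illustration; the point is that each step needs the previous step's result, so the steps cannot run at the same time.

```python
# Three dependent tasks: B needs A's result and C needs B's result,
# so they must run sequentially, one after another.
def task_a():
    return 10            # e.g., load a value

def task_b(a_result):
    return a_result * 2  # depends on A's output

def task_c(b_result):
    return b_result + 5  # depends on B's output

a = task_a()
b = task_b(a)
c = task_c(b)
print(c)  # 25
```

Because of the dependency chain, adding more processors would not speed this up: there is never more than one task ready to run.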
Parallel:
The program is broken into multiple smaller, sequential computing operations, some of which are performed simultaneously.
Same computer.
Executed at the same time.
Why?
Faster operations.
Hardware driven. (Several processors, cores)
A lot of data can be processed in the same way.
Can scale more effectively.
A parallel solution takes as long as its longest task (since the other tasks finish while that one is still running).
Efficiency can be compared by looking at run time.
Speedup is how much faster parallel computing is than sequential computing: the sequential run time divided by the parallel run time.
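The timing rules above can be worked through with assumed task times (the numbers here are made up for illustration): sequentially the times add up, in parallel the total equals the longest task, and speedup is the ratio of the two.

```python
# Hypothetical task times in seconds, one task per processor core.
task_times = [60, 30, 50]

# Sequential: tasks run one after another, so the times add up.
sequential_time = sum(task_times)   # 140 seconds

# Parallel: all tasks start together, so the run ends
# when the longest task finishes.
parallel_time = max(task_times)     # 60 seconds

# Speedup = sequential time / parallel time.
speedup = sequential_time / parallel_time
print(speedup)  # about 2.33, i.e. parallel is ~2.33x faster
```

Note the parallel time is limited by the 60-second task: making the other tasks faster would not help until that longest task is split up.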
Distributed Computing:
One computer sends work to one or more others; multiple devices cooperate to run a single program.
Can mix sequential and parallel computing, based on how it sends out tasks to other computers.
Allows problems that cannot be solved with just one computer to be solved on multiple devices.
Example: web search → Google sends the request to thousands of its servers in the background
Can take longer, because the devices must communicate with each other over a network.
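The divide-and-combine idea behind distributed computing can be simulated on one machine. This is only a sketch: the worker processes below stand in for separate computers, and the data-splitting scheme is an assumption, not how any real search system works.

```python
from concurrent.futures import ProcessPoolExecutor

# Each "device" processes its own slice of the data.
def process_chunk(chunk):
    return sum(x * x for x in chunk)

def main():
    data = list(range(1000))
    n_devices = 4

    # Split the work into one chunk per simulated device.
    size = len(data) // n_devices
    chunks = [data[i * size:(i + 1) * size] for i in range(n_devices)]

    # Worker processes stand in for separate networked computers.
    with ProcessPoolExecutor(max_workers=n_devices) as pool:
        partial_results = list(pool.map(process_chunk, chunks))

    # Combine the partial answers, like a search engine
    # merging results coming back from many servers.
    total = sum(partial_results)
    print(total)

if __name__ == "__main__":
    main()
```

In a real distributed system the chunks would travel over a network, which adds communication time; that overhead is why a distributed solution can take longer than expected.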