1. Introduction to Parallel Computing
Motivating Parallelism
The role of parallelism in accelerating computing speeds has been recognized for several decades.
Parallelism provides a multiplicity of datapaths and increased access to storage elements, which has proved significant in commercial applications.
The scalable performance and lower cost of parallel platforms are reflected in a wide variety of applications.
The Computational Power Argument
Moore's law predicts an exponential increase in the complexity of integrated circuits.
Translating this growing transistor count into useful operations per second increasingly requires parallelism.
Explicit parallelism, specified by the programmer rather than inferred automatically by the hardware, is particularly important.
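As an illustrative sketch (not from the notes), explicit parallelism means the programmer divides the work and assigns it to workers directly. Here a sum of squares is split into four chunks handled by a process pool; the chunk boundaries and worker count are arbitrary choices for the example.

```python
# Explicit parallelism sketch: the programmer partitions the work and
# distributes it to worker processes, then combines the partial results.
from multiprocessing import Pool

def partial_sum(bounds):
    # Each worker sums squares over its own half-open range [lo, hi).
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n = 1_000_000
    # Split [0, n) into four contiguous chunks, one per worker.
    chunks = [(i * n // 4, (i + 1) * n // 4) for i in range(4)]
    with Pool(processes=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    assert total == sum(i * i for i in range(n))
```

The decomposition, assignment, and combination steps are all written out by the programmer, which is exactly what distinguishes explicit parallelism from hardware-extracted instruction-level parallelism.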
The Memory/Disk Speed Argument
Although processor clock rates have increased rapidly, DRAM access times have not improved at the same pace, creating a performance bottleneck at the memory interface.
Parallel platforms provide increased memory system bandwidth and higher aggregate caches.
Principles of data locality and bulk access guide parallel algorithm design and memory optimization.
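A minimal sketch of the locality principle (the array size and access patterns are illustrative assumptions): the same reduction over a 2-D array written with unit-stride, row-major access and with strided, column-major access. Both compute the same result, but on cached hardware the unit-stride loop touches memory contiguously and is the pattern locality-aware algorithm design favors.

```python
# Data locality sketch: row-major traversal walks consecutive elements
# of each row (good spatial locality); column-major traversal jumps
# between rows on every inner step (poor locality on large arrays).
N = 256
a = [[i + j for j in range(N)] for i in range(N)]

def sum_row_major(m):
    # Inner loop over j: consecutive elements of one row.
    return sum(m[i][j] for i in range(len(m)) for j in range(len(m[0])))

def sum_col_major(m):
    # Inner loop over i: one element from each row in turn.
    return sum(m[i][j] for j in range(len(m[0])) for i in range(len(m)))

assert sum_row_major(a) == sum_col_major(a)
```

In low-level languages the row-major version can be several times faster on large arrays purely because of cache behavior; the effect is muted in pure Python but the access-pattern distinction is the same one that guides parallel memory optimization.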
The Data Communication Argument
The Internet is seen as a large computing platform exploited by applications like SETI@home and Folding@home.
Other applications, such as databases and data mining, involve volumes of data too large to move to a central site, so the analysis itself must be performed in parallel where the data reside.
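The "move the computation, not the data" idea can be sketched as follows (a hypothetical example, not from the notes): each partition computes a small local summary, and only these summaries, rather than the raw records, are combined centrally.

```python
# Sketch of analyzing data in place: workers reduce their own partition
# to a tiny (count, sum) summary; only summaries cross the network.
from multiprocessing import Pool

def local_summary(partition):
    # Runs "near" the data; returns a few numbers, not the partition.
    return (len(partition), sum(partition))

if __name__ == "__main__":
    # Three hypothetical data partitions that are too large to move.
    partitions = [list(range(k, k + 1000)) for k in (0, 1000, 2000)]
    with Pool(processes=3) as pool:
        summaries = pool.map(local_summary, partitions)
    count = sum(c for c, _ in summaries)
    total = sum(s for _, s in summaries)
    mean = total / count  # global statistic built from local summaries
```

The same pattern underlies distributed query processing and MapReduce-style data mining: communication cost scales with the size of the summaries, not the size of the data.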
Scope of Parallel Computing Applications
Applications in Engineering and Design
Optimization of airfoils, engines, circuits, and structures.
Design and simulation of micro- and nano-scale systems.
Process optimization and operations research.
Scientific Applications
Characterization of genes and proteins.
Computational physics and chemistry for materials science and astrophysics.
Weather modeling, mineral prospecting, and flood prediction.
Commercial Applications
Data mining for business and marketing decisions.
Large-scale servers and information retrieval systems.
Wall Street trading systems.
Applications in Computer Systems
Network intrusion detection, cryptography, and multiparty computations.
Embedded systems and modern automobiles.
Peer-to-peer networks and distributed control algorithms.
Organization and Contents of this Course
Fundamentals
Basic parallel platforms, algorithm design principles, group communication, and analytical modeling techniques.
Parallel Programming
Programming with message passing libraries and threads.
Parallel Algorithms
Matrix computations, graph algorithms, sorting, discrete optimization, and dynamic programming.