I. BASIC TECHNIQUES.
1. Parallel Computers.
2. Message-Passing Computing.
3. Embarrassingly Parallel Computations.
4. Partitioning and Divide-and-Conquer Strategies.
5. Pipelined Computations.
6. Synchronous Computations.
7. Load Balancing and Termination Detection.
8. Programming with Shared Memory.
9. Distributed Shared Memory Systems and Programming.
II. ALGORITHMS AND APPLICATIONS.
10. Sorting Algorithms.
11. Numerical Algorithms.
12. Image Processing.
13. Searching and Optimization.
Appendix A: Basic MPI Routines.
Appendix B: Basic Pthread Routines.
Appendix C: OpenMP Directives, Library Functions, and Environment Variables.
Index.
- Chapter on Distributed Shared Memory (DSM) programming: Describes techniques and tools for shared-memory programming on clusters. Enables programs to be written in the shared-memory paradigm, which has advantages over traditional message-passing programming (see the brief sketch following this list).
- Content revisions throughout: Provides students with the most current and concise information possible.
- Updated Companion Website: Includes revised step-by-step instructions for students and extensive support materials for instructors, such as PowerPoint slides and assignments. Provides a resource that complements the text in a variety of ways, helping both students and professors in and out of the classroom.
- Required software (MPI, PVM, DSM) available FREE! Students are provided with all the learning materials necessary for success in the course.
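For readers unfamiliar with the shared-memory paradigm promoted by the DSM and OpenMP material, the following minimal OpenMP sketch (not taken from the book) illustrates the contrast with message passing: every thread reads the same array directly and the partial sums are combined with a reduction clause, rather than being exchanged through explicit send and receive calls.

```c
/* Minimal shared-memory example in C with OpenMP (illustrative only).
 * All threads operate on the shared array a[]; no messages are exchanged. */
#include <stdio.h>
#include <omp.h>

#define N 1000

int main(void)
{
    double a[N];
    double sum = 0.0;
    int i;

    for (i = 0; i < N; i++)            /* shared data, visible to every thread */
        a[i] = (double) i;

    #pragma omp parallel for reduction(+:sum)
    for (i = 0; i < N; i++)
        sum += a[i];                   /* threads share a[]; partial sums reduced */

    printf("sum = %f\n", sum);         /* expected: 499500.0 */
    return 0;
}
```

In a message-passing version of the same computation, the array would be partitioned among processes and the partial sums returned to a root process with explicit communication calls; the shared-memory form avoids that bookkeeping, which is the advantage the feature above refers to.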