This lesson is still being designed and assembled (Pre-Alpha version)

Introduction to Parallel Programming using MPI

The Message Passing Interface (MPI) is the standard method for parallel programming across many computers with distributed memory, where each process has its own private memory and data is exchanged by passing explicit messages. Weather and climate models, studies of galaxy formation, and molecular dynamics simulations all use MPI to take advantage of the processing power and memory of many computers.
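The core idea of message passing can be sketched with nothing but Python's standard library. The multiprocessing module below is a conceptual stand-in for MPI (real MPI code needs an MPI library and a launcher such as mpirun), but the pattern is the same one MPI_Send and MPI_Recv provide: processes with separate memory exchanging data only by explicit send and receive. The rank number and message text are purely illustrative.

```python
# Conceptual sketch of the message-passing model using Python's
# standard multiprocessing module (a stand-in for MPI, not MPI itself).
from multiprocessing import Process, Pipe

def worker(rank, conn):
    # Each process has its own private memory; the only way data
    # moves between processes is via explicit messages.
    msg = conn.recv()                            # like MPI_Recv
    conn.send("rank %d got: %s" % (rank, msg))   # like MPI_Send

if __name__ == "__main__":
    parent_end, child_end = Pipe()               # a two-way message channel
    p = Process(target=worker, args=(1, child_end))
    p.start()
    parent_end.send("hello")                     # like MPI_Send from rank 0
    print(parent_end.recv())                     # prints the worker's reply
    p.join()
```

MPI generalises this pattern to many processes spread across many machines, with the MPI runtime managing the connections between them.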

Prerequisites

Experience with a programming language is required: knowledge of Fortran, C, C++ or Python. A working knowledge of Linux is essential.

Schedule

Setup   Download files required for the lesson
00:00   1. Introduction
          What is a parallel computer?
          What parallel programming models are there?
          How do I get performance?
00:30   2. MPI standard
          Why was MPI developed?
          How can I use MPI?
          What is the basic code required?
01:00   3. MPI point-to-point communication
          How do I send a message?
          How do I know if it was successful?
02:00   4. MPI collective communication
          How do I avoid having to use multiple recvs and sends?
          What operations can be performed by all processors at once?
03:00   5. Advanced topics in MPI
          What is the best way to read and write data to disk?
          Can MPI optimise communications by itself?
          How can I use OpenMP and MPI together?
03:45   6. Summary
          Where can I go for further information?
03:50   Finish

The actual schedule may vary slightly depending on the topics and exercises chosen by the instructor.