What is Parallel Computing?
In parallel computing, a program is divided into discrete parts that run concurrently on multiple compute resources. You can establish parallel computing by adding multiple processors to a single computer or by connecting multiple computers in a network.
What is MPI?
MPI stands for Message Passing Interface. It is a standardized message-passing library specification whose implementations can be called from programming languages such as C and Fortran 77.
About MPI Parallel Computing
MPI is suitable for parallel computing on a single parallel machine as well as on clusters of different workstations; through MPI, such a workstation cluster can be used as one parallel resource. Rather than relying on shared memory, MPI establishes explicit message passing between processes, each with its own address space. MPI parallel computing involves two main innovations: MPI communicators and user-defined (derived) data types.
Data communication between the processes involved in a parallel computation takes place within communicators. Point-to-point messages are sent and received with the functions MPI_Send and MPI_Recv, each of which operates within a communicator such as MPI_COMM_WORLD.
Many parallel programs need to send noncontiguous data as messages. Before sending, MPI must describe (or pack) the noncontiguous data as a single message, and on the receiving end the message must be unpacked. MPI supports this through derived data types, as well as the MPI_Pack and MPI_Unpack routines.
Advantages of MPI Parallel Computing
- MPI supports portable, platform-independent computing.
- MPI offers the capability of cross-platform development.
- Using MPI, you can achieve heterogeneous communication that is both cluster friendly and grid capable.
- MPI implementations can target many transport layers while adding little communication overhead, so performance is excellent.