1) What are asynchronous concurrent processes? 2) Explain the difference between concurrency and synchronization.

 1) 

Synchronization:

Synchronization means acquiring a reentrant lock on an object. The lock is released either when the end of the synchronized block is reached or when the thread goes into a waiting state. The lock is reentrant in the sense that the same thread can acquire it again and again, but a different thread cannot. So the synchronized keyword essentially guards a piece of code from being accessed by multiple threads simultaneously.
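As a minimal Java sketch of this reentrancy (the class and method names are illustrative): both methods below lock on the same object, yet the owning thread can enter the inner method from the outer one without deadlocking, because the intrinsic lock is reentrant.

```java
public class ReentrantDemo {
    int depth = 0;

    // outer() and inner() both synchronize on `this`. A thread that
    // already holds the lock in outer() may re-acquire it in inner();
    // a *different* thread would block until the lock is released.
    public synchronized void outer() {
        depth++;
        inner(); // re-acquires the same intrinsic lock: no deadlock
    }

    public synchronized void inner() {
        depth++;
    }

    public static void main(String[] args) {
        ReentrantDemo d = new ReentrantDemo();
        d.outer(); // acquires the lock twice on the same thread
        System.out.println(d.depth); // 2
    }
}
```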

Concurrency:

Concurrent collections are exactly the opposite. They are called concurrent because they allow multiple threads to access the same data concurrently (without causing side effects). For example, ConcurrentHashMap allows multiple threads to perform write operations in parallel as long as the writes happen on different segments.
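A small sketch of that behaviour: two threads write disjoint key ranges into one ConcurrentHashMap with no external locking. (A plain HashMap could be corrupted by the same access pattern.)

```java
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentWrites {
    // Two threads write disjoint key ranges into the same map
    // concurrently; ConcurrentHashMap handles this safely without
    // any synchronized block around the puts.
    static ConcurrentHashMap<Integer, Integer> fill() throws InterruptedException {
        ConcurrentHashMap<Integer, Integer> map = new ConcurrentHashMap<>();
        Thread t1 = new Thread(() -> {
            for (int i = 0; i < 1000; i++) map.put(i, i);
        });
        Thread t2 = new Thread(() -> {
            for (int i = 1000; i < 2000; i++) map.put(i, i);
        });
        t1.start(); t2.start();
        t1.join(); t2.join();
        return map;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(fill().size()); // 2000: no writes were lost
    }
}
```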


********************************************


2).


Message Passing Interface (MPI) is a communication protocol for parallel programming. MPI is specifically used to allow applications to run in parallel across a number of separate computers connected by a network.

In the MPI programming model, a computation comprises one or more processes that communicate by calling library routines to send messages to and receive messages from other processes. In most MPI implementations, a fixed set of processes is created at program initialization, with one process per processor. However, these processes may execute different programs. Hence, the MPI programming model is sometimes referred to as multiple program multiple data (MPMD) to distinguish it from the SPMD model, in which every processor executes the same program.
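MPI itself is typically used from C or Fortran, but the send/receive model can be sketched in Java using a BlockingQueue as a stand-in for a point-to-point channel (the names and message text here are illustrative, not actual MPI API): `put()` plays the role of a blocking send and `take()` the role of a blocking receive.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class SendReceiveSketch {
    // One "process" (thread) sends a message; the other receives it.
    // The queue stands in for an MPI point-to-point channel:
    // put() ~ MPI_Send, take() ~ MPI_Recv (both block, like the
    // standard blocking MPI calls).
    static String exchange() throws InterruptedException {
        BlockingQueue<String> channel = new ArrayBlockingQueue<>(1);

        Thread sender = new Thread(() -> {
            try {
                channel.put("hello from rank 0"); // "send" the message
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        sender.start();

        String msg = channel.take(); // "receive": blocks until a message arrives
        sender.join();
        return msg;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(exchange()); // hello from rank 0
    }
}
```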

Because the number of processes in an MPI computation is normally fixed, our focus in this chapter is on the mechanisms used to communicate data between processes. Processes can use point-to-point communication operations to send a message from one named process to another; these operations can be used to implement local and unstructured communications. A group of processes can call collective communication operations to perform commonly used global operations such as summation and broadcast. MPI's ability to probe for messages supports asynchronous communication. Probably MPI's most important feature from a software engineering viewpoint is its support for modular programming. A mechanism called a communicator allows the MPI programmer to define modules that encapsulate internal communication structures.
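The collective summation mentioned above can be sketched in shared-memory Java (again as an analogy, not actual MPI): each worker thread sums its own slice of the data and then contributes its partial total, which is roughly the pattern of MPI_Reduce with MPI_SUM across ranks. All names below are illustrative.

```java
import java.util.concurrent.atomic.AtomicLong;

public class ReduceSketch {
    // Sum `data` using `workers` threads: each thread sums one slice
    // (the "local" phase), then adds its partial result to a shared
    // total (the "combine" phase) -- roughly what MPI_Reduce with
    // MPI_SUM does across processes, simulated here in shared memory.
    static long parallelSum(int[] data, int workers) throws InterruptedException {
        AtomicLong total = new AtomicLong();
        Thread[] ts = new Thread[workers];
        int chunk = data.length / workers;
        for (int w = 0; w < workers; w++) {
            final int start = w * chunk;
            final int end = (w == workers - 1) ? data.length : start + chunk;
            ts[w] = new Thread(() -> {
                long partial = 0;
                for (int i = start; i < end; i++) partial += data[i]; // local sum
                total.addAndGet(partial); // combine step
            });
            ts[w].start();
        }
        for (Thread t : ts) t.join();
        return total.get();
    }

    public static void main(String[] args) throws InterruptedException {
        int[] data = new int[1000];
        for (int i = 0; i < data.length; i++) data[i] = i + 1; // 1..1000
        System.out.println(parallelSum(data, 4)); // 500500
    }
}
```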



 

