Abstract

Parallel processing systems rely on data parallelism to achieve high-performance data processing. Data parallelism typically operates on arrays that are distributed across separate nodes, so efficient inter-node communication is required to initialize the distribution. In this paper, we propose a computation and communication overlapping technique that reduces the communication overhead of array distribution. The technique uses task parallelism, with an initiator task and worker tasks, together with the synchronization mechanism provided by Chapel. To show that the technique is effective, we design and implement a parallel version of the Mandelbrot set program and evaluate the benefit of overlapping against its execution time. The comparison shows that the proposed overlapping technique effectively reduces the impact of communication during the initial array distribution.
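
The overlap pattern described above can be illustrated with a short Chapel sketch. This is a minimal sketch under stated assumptions, not the paper's implementation: the array size, the sync variable named ready, and the per-locale placeholder work are illustrative only; they show an initiator task distributing an array with a begin statement while worker tasks perform communication-free local work and then wait on a sync variable.

    // Minimal sketch (assumed names and sizes), not the authors' code.
    use BlockDist;

    config const n = 256;

    // Block-distributed 2D array: initializing it involves inter-node communication
    const D = {1..n, 1..n} dmapped Block(boundingBox={1..n, 1..n});
    var A: [D] real;

    var ready: sync bool;            // empty until the initiator task writes it

    begin {                          // initiator task: performs the distribution
      A = 0.0;                       // touches every locale that owns a block
      ready.writeEF(true);           // mark "full" to release the worker tasks
    }

    // worker tasks start immediately and overlap local work with the transfer
    coforall loc in Locales do on loc {
      var localPrep = 0.0;           // stand-in for communication-free setup work
      for i in 1..1000 do localPrep += i;
      const go = ready.readFF();     // block until the initiator signals completion
      // per-locale computation on this locale's block of A would follow here
    }

Because readFF leaves the sync variable full, every worker task can observe the single signal written by the initiator, so the local setup work on each locale is overlapped with the initial array distribution.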