Performance results of running parallel applications on the InteGrade grid middleware
Author(s) -
Cáceres E. N.,
Mongelli H.,
Loureiro L.,
Nishibe C.,
Song S. W.
Publication year - 2010
Publication title -
Concurrency and Computation: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.309
H-Index - 67
eISSN - 1532-0634
pISSN - 1532-0626
DOI - 10.1002/cpe.1524
Subject(s) - computer science, overhead (engineering), distributed computing, grid computing, middleware (distributed applications), grid, knapsack problem, exploit, parallel computing, throughput, matrix multiplication, operating system, algorithm, wireless, physics, geometry, mathematics, computer security, quantum mechanics, quantum
Abstract - The InteGrade middleware is designed to exploit the idle time of computing resources in computer laboratories. In this work we investigate the performance of running parallel applications that require communication among processors on the InteGrade grid. Since costly communication on a grid can be prohibitive, we explore the so-called systolic or wavefront paradigm to design parallel algorithms that use no global communication. To evaluate the InteGrade middleware we considered three parallel algorithms that solve the matrix chain product problem, the 0–1 knapsack problem, and the local sequence alignment problem, respectively. We show that these three applications running under the InteGrade middleware and MPI take only slightly more time than the same applications running on a cluster with LAM-MPI support alone. The time difference between the two settings is not substantial, so the results can be considered promising. The overhead of the InteGrade middleware is acceptable in view of the benefits it provides to users of grid computing, including job submission, checkpointing, security, and job migration. Copyright © 2009 John Wiley & Sons, Ltd.
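
The wavefront design mentioned in the abstract applies naturally to dynamic-programming problems such as local sequence alignment: all cells on the same anti-diagonal of the DP table are mutually independent, so each anti-diagonal can be computed in parallel and a processor only needs data from its neighbors, never a global exchange. Below is a minimal serial sketch of this traversal order for Smith-Waterman local alignment in C; the scoring constants and example sequences are illustrative assumptions, and this is not the authors' MPI implementation, which distributes the computation across InteGrade nodes.

#include <stdio.h>
#include <string.h>

#define MAX(a, b) ((a) > (b) ? (a) : (b))

/* Hypothetical scoring scheme, not taken from the paper. */
#define MATCH     2
#define MISMATCH -1
#define GAP      -1

int main(void) {
    const char *a = "GGTTGACTA";   /* example sequences (illustrative only) */
    const char *b = "TGTTACGG";
    int m = (int)strlen(a), n = (int)strlen(b);
    int H[32][32] = {{0}};         /* DP table; row/column 0 stay 0 for local alignment */
    int best = 0;

    /* Wavefront order: visit the table anti-diagonal by anti-diagonal
       (d = i + j). Cells on one diagonal depend only on the two previous
       diagonals, so a parallel version can compute each diagonal
       concurrently with only neighbor-to-neighbor communication. */
    for (int d = 2; d <= m + n; d++) {
        for (int i = 1; i <= m; i++) {
            int j = d - i;
            if (j < 1 || j > n) continue;
            int s = (a[i - 1] == b[j - 1]) ? MATCH : MISMATCH;
            int h = MAX(0, H[i - 1][j - 1] + s);   /* diagonal: match/mismatch */
            h = MAX(h, H[i - 1][j] + GAP);         /* gap in b */
            h = MAX(h, H[i][j - 1] + GAP);         /* gap in a */
            H[i][j] = h;
            if (h > best) best = h;                /* track best local score */
        }
    }
    printf("best local alignment score: %d\n", best);
    return 0;
}

The matrix chain product and 0–1 knapsack recurrences admit similar wavefront schedules, which is why the paper evaluates all three applications under the same paradigm.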
