Parallel Computation of a Weather Model in a Cluster Environment
Author(s) -
Chang HsiYa,
Huang KuoChan,
Shen CherngYeu,
Tcheng ShouCheng,
Chou ChaurYi
Publication year - 2001
Publication title -
Computer-Aided Civil and Infrastructure Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.773
H-Index - 82
eISSN - 1467-8667
pISSN - 1093-9687
DOI - 10.1111/0885-9507.00239
Subject(s) - computer science , parallel computing , ibm , exploit , computation , parallelism (grammar) , domain decomposition methods , cluster analysis , software , decomposition , message passing interface , cluster (spacecraft) , domain (mathematical analysis) , computer cluster , operating system , message passing , algorithm , ecology , mathematical analysis , materials science , physics , computer security , mathematics , finite element method , machine learning , biology , thermodynamics , nanotechnology
Recently, the superior and continuously improving cost-performance ratio of commodity hardware and software has made PC clustering a popular alternative for high-performance computing in both academic institutes and industrial organizations. The purpose of this work is to solve a weather-prediction model in parallel on PC clusters; the results are also compared with those obtained on conventional parallel platforms such as the Fujitsu VPP300, IBM SP2 (160 and 120 MHz), and HP SPP2200. Techniques of domain decomposition and data communication are used to exploit the parallelism of the model. Interprocessor data communication is handled by Message Passing Interface (MPI) communication library routines. Two versions of the parallel code, one with longitude decomposition and the other with latitude decomposition, are tested and compared. Speedups of the parallel weather model on these machines with various numbers of processors show that substantial reductions in computation time can be achieved compared with sequential runs.
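The longitude decomposition described in the abstract can be sketched as a 1-D block partition of the grid columns, with each process exchanging one "halo" column with its neighbors before applying a stencil. The sketch below is hypothetical (it is not the authors' code, and the function names `block_decompose` and `smooth_decomposed` are illustrative); in the actual model the halo exchange would be done with MPI send/receive calls, whereas here the subdomains are plain Python lists so the idea can be shown self-contained.

```python
def block_decompose(n_lon, n_procs):
    """Split n_lon longitude columns into n_procs contiguous blocks.

    Returns a list of (start, count) pairs, one per process, with
    any remainder columns spread over the first few processes.
    """
    base, extra = divmod(n_lon, n_procs)
    bounds, start = [], 0
    for rank in range(n_procs):
        count = base + (1 if rank < extra else 0)
        bounds.append((start, count))
        start += count
    return bounds


def smooth_serial(field):
    """Reference sequential 3-point average with periodic longitude."""
    n = len(field)
    return [(field[(i - 1) % n] + field[i] + field[(i + 1) % n]) / 3.0
            for i in range(n)]


def smooth_decomposed(field, n_procs):
    """Same stencil, computed block-by-block with halo columns.

    Each block fetches one ghost column from each periodic neighbor
    (standing in for an MPI halo exchange) and then updates only its
    own interior columns.
    """
    n = len(field)
    out = []
    for start, count in block_decompose(n, n_procs):
        left = field[(start - 1) % n]           # ghost from left neighbor
        right = field[(start + count) % n]      # ghost from right neighbor
        local = [left] + field[start:start + count] + [right]
        out.extend((local[i - 1] + local[i] + local[i + 1]) / 3.0
                   for i in range(1, count + 1))
    return out
```

Because each block performs exactly the same arithmetic on the same values as the sequential loop, the decomposed result matches the serial one regardless of the number of processes; a latitude decomposition would partition the other grid dimension in the same way.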