Toward high‐performance computational chemistry: I. Scalable Fock matrix construction algorithms
Author(s) -
Foster Ian T.,
Tilson Jeffrey L.,
Wagner Albert F.,
Shepard Ron L.,
Harrison Robert J.,
Kendall Rick A.,
Littlefield Rik J.
Publication year - 1996
Publication title -
Journal of Computational Chemistry
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.907
H-Index - 188
eISSN - 1096-987X
pISSN - 0192-8651
DOI - 10.1002/(sici)1096-987x(19960115)17:1<109::aid-jcc9>3.0.co;2-v
Subject(s) - massively parallel , computer science , algorithm , fock matrix , parallel computing , computation , scalability , cluster analysis , scaling , matrix (chemical analysis) , computational science , theoretical computer science , hartree–fock method , mathematics , artificial intelligence , computational chemistry , chemistry , chromatography , database , geometry
Several parallel algorithms for Fock matrix construction are described. The algorithms calculate only the unique integrals, distribute the Fock and density matrices over the processors of a massively parallel computer, use blocking techniques to construct the distributed data structures, and use clustering techniques on each processor to maximize data reuse. Algorithms based on both square and row‐blocked distributions of the Fock and density matrices are described and evaluated. Variants of the algorithms are discussed that use either triple‐sort or canonical ordering of integrals, and dynamic or static task clustering schemes. The algorithms are shown to adapt to screening, with communication volume scaling down with computation costs. Modeling techniques are used to characterize algorithm performance. Given the characteristics of existing massively parallel computers, all the algorithms are shown to be highly efficient for problems of moderate size. The algorithms using the row‐blocked data distribution are the most efficient. © 1996 by John Wiley & Sons, Inc.
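The central idea above — computing only the unique two-electron integrals and scattering each one's contribution into the Fock matrix — rests on the eightfold permutational symmetry (ij|kl) = (ji|kl) = (ij|lk) = (kl|ij) = …. The paper's algorithms distribute this work and the matrices across processors; the sketch below is only the serial, replicated-data core of that idea, written as an illustration rather than the authors' implementation. It loops over canonically ordered unique quadruples (i ≥ j, k ≥ l, ij ≥ kl), halves each integral once per coincident index pair so that the full eight-permutation orbit carries unit weight, and accumulates Coulomb (J) and exchange (K) contributions. All names and the dense `eri` array are illustrative assumptions.

```python
import numpy as np

def build_JK(eri, D):
    """Build Coulomb (J) and exchange (K) matrices from unique two-electron
    integrals only, exploiting eightfold permutational symmetry.

    eri : dense (n,n,n,n) array with full 8-fold symmetry (illustrative stand-in
          for integrals computed on the fly); D : symmetric density matrix.
    """
    n = D.shape[0]
    J = np.zeros((n, n))
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1):                      # canonical ordering: i >= j
            ij = i * (i + 1) // 2 + j
            for k in range(i + 1):
                for l in range(k + 1):              # k >= l
                    kl = k * (k + 1) // 2 + l
                    if kl > ij:                     # and ij >= kl
                        continue
                    v = eri[i, j, k, l]
                    # Halve once per coincident pair so that summing over all
                    # eight permutations gives each integral unit total weight.
                    if i == j:
                        v *= 0.5
                    if k == l:
                        v *= 0.5
                    if ij == kl:
                        v *= 0.5
                    # Coulomb: J[pq] += D[rs] * (pq|rs), folded over permutations.
                    J[i, j] += 2.0 * D[k, l] * v
                    J[j, i] += 2.0 * D[k, l] * v
                    J[k, l] += 2.0 * D[i, j] * v
                    J[l, k] += 2.0 * D[i, j] * v
                    # Exchange: K[pr] += D[qs] * (pq|rs), one update per permutation.
                    K[i, k] += D[j, l] * v
                    K[j, k] += D[i, l] * v
                    K[i, l] += D[j, k] * v
                    K[j, l] += D[i, k] * v
                    K[k, i] += D[l, j] * v
                    K[l, i] += D[k, j] * v
                    K[k, j] += D[l, i] * v
                    K[l, j] += D[k, i] * v
    return J, K

if __name__ == "__main__":
    # Check against brute-force contractions over the full integral tensor.
    rng = np.random.default_rng(0)
    n = 5
    T = rng.standard_normal((n, n, n, n))
    perms = [(0, 1, 2, 3), (1, 0, 2, 3), (0, 1, 3, 2), (1, 0, 3, 2),
             (2, 3, 0, 1), (3, 2, 0, 1), (2, 3, 1, 0), (3, 2, 1, 0)]
    eri = sum(np.transpose(T, p) for p in perms) / 8.0   # impose 8-fold symmetry
    D = rng.standard_normal((n, n))
    D = D + D.T                                          # symmetric density
    J, K = build_JK(eri, D)
    print(np.allclose(J, np.einsum('ijkl,kl->ij', eri, D)),
          np.allclose(K, np.einsum('ikjl,kl->ij', eri, D)))
```

In the distributed algorithms the abstract describes, the `J`/`K` updates above are the scattered accumulations that drive communication: each unique integral touches six distinct matrix elements, which is why the choice of square versus row-blocked distribution of `D` and `F`, and of clustering to maximize reuse of fetched blocks, governs communication volume.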
