GPU‐based branch‐and‐bound method to solve large 0‐1 knapsack problems with data‐centric strategies
Author(s) - Jingcheng Shen, Kentaro Shigeoka, Fumihiko Ino, Kenichi Hagihara
Publication year - 2018
Publication title - Concurrency and Computation: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.309
H-Index - 67
eISSN - 1532-0634
pISSN - 1532-0626
DOI - 10.1002/cpe.4954
Subject(s) - computer science, overhead (engineering), parallel computing, speedup, knapsack problem, central processing unit, CUDA, graphics processing unit, data transmission, branch and bound, transfer (computing), multi-core processor, algorithm, computer hardware, operating system
Summary - An out-of-core branch-and-bound (B&B) method to solve large 0-1 knapsack problems on a graphics processing unit (GPU) is proposed. Given a large problem that produces many subproblems, the proposed method dynamically swaps subproblems to CPU memory. Because such a CPU-centric subproblem management scheme increases CPU-GPU data transfer, we adopt three data-centric strategies to eliminate this side effect. The first is an out-of-order search (O3S) strategy that reduces the data transfer overhead by adaptively transferring subproblems between the CPU and GPU. The second is an explicitly-managed pipelining strategy that hides the data transfer overhead by overlapping data transfer with GPU-based B&B operations. The third is a GPU-based stream compaction strategy that reduces the sparseness of arrays to be transferred. Experimental results demonstrate that the proposed out-of-core method stored 41 times as many subproblems as a previous in-core method that manages subproblems in GPU memory, solving approximately twice as many problem instances on the GPU. In addition, compared to a previous breadth-first search (BFS) strategy, the proposed O3S strategy achieved an average speedup of 7.5 times.
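To illustrate the core technique the abstract builds on, the following is a minimal, single-threaded sketch of branch-and-bound for the 0-1 knapsack problem, using a fractional-relaxation upper bound for pruning. The function name `knapsack_bb` and all details are illustrative; the paper's method expands many subproblems in parallel on the GPU and swaps the subproblem pool between GPU and CPU memory, none of which is modeled here.

```python
import heapq

def knapsack_bb(values, weights, capacity):
    """Best-first branch-and-bound for the 0-1 knapsack problem.

    Items are pre-sorted by value density so that greedily filling the
    remaining capacity (allowing one fractional item) yields a valid
    upper bound; any subproblem whose bound cannot beat the incumbent
    is pruned.
    """
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    n = len(v)

    def bound(level, profit, room):
        # Fractional (LP-relaxation) fill of the remaining items.
        b = profit
        for i in range(level, n):
            if w[i] <= room:
                room -= w[i]
                b += v[i]
            else:
                return b + v[i] * room / w[i]
        return b

    best = 0
    # Max-heap on the upper bound: entries are (-bound, level, profit, room).
    heap = [(-bound(0, 0, capacity), 0, 0, capacity)]
    while heap:
        neg_b, level, profit, room = heapq.heappop(heap)
        if -neg_b <= best or level == n:
            continue  # prune: this subtree cannot beat the incumbent
        # Branch 1: include item `level` if it fits.
        if w[level] <= room:
            p, r = profit + v[level], room - w[level]
            best = max(best, p)  # partial selections are feasible solutions
            heapq.heappush(heap, (-bound(level + 1, p, r), level + 1, p, r))
        # Branch 2: exclude item `level`.
        heapq.heappush(heap, (-bound(level + 1, profit, room),
                              level + 1, profit, room))
    return best
```

On the classic instance with values (60, 100, 120), weights (10, 20, 30), and capacity 50, the optimum of 220 is found after pruning most of the tree. The heap here plays the role of the subproblem pool that, in the paper, grows large enough to require spilling to CPU memory.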
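The third strategy, stream compaction, densifies sparse arrays before transfer. A hedged sequential sketch of the idea follows; the function `compact` and the `alive` flag array are illustrative assumptions, and on a GPU the scatter step runs in parallel (e.g., via a prefix sum followed by a scatter, or a primitive such as `thrust::copy_if`).

```python
from itertools import accumulate

def compact(subproblems, alive):
    """Prefix-sum-based stream compaction.

    `alive[i]` is 1 if slot i still holds a live subproblem, 0 if it was
    pruned. An exclusive prefix sum of `alive` gives each survivor its
    output index, so live entries are packed contiguously and only the
    dense prefix needs to be transferred between CPU and GPU.
    """
    # Exclusive prefix sum: output position for each surviving element.
    offsets = [0] + list(accumulate(alive))[:-1]
    out = [None] * sum(alive)
    for i, keep in enumerate(alive):
        if keep:
            out[offsets[i]] = subproblems[i]
    return out
```

For example, `compact(['a', 'b', 'c', 'd'], [1, 0, 1, 1])` packs the three live slots into a dense array of length 3, so the transfer size tracks the number of live subproblems rather than the pool capacity.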
