Open Access
Hierarchical collaborative caching in 5G networks
Author(s) - Tang Qinqin, Xie Renchao, Huang Tao, Liu Yunjie
Publication year - 2018
Publication title - IET Communications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.355
H-Index - 62
eISSN - 1751-8636
pISSN - 1751-8628
DOI - 10.1049/iet-com.2018.5553
Subject(s) - computer science , cache , latency (audio) , server , distributed computing , computer network , mobile edge computing , enhanced data rates for gsm evolution , cache algorithms , cpu cache , telecommunications
Caching in mobile networks can reduce redundant data transmission and cope with the explosive growth of mobile data traffic. It has been considered a promising technology in 5G networks and has attracted considerable attention in recent years. Although many existing works address the content placement or cache optimisation problem, most do not consider hierarchical collaborative caching. Collaborative caching can further alleviate traffic pressure and reduce user‐perceived latency by reducing duplicate content transmission. Therefore, in this study, the authors consider a hierarchical collaborative caching framework with caches deployed at the distributed gateway and at mobile edge computing servers, and then design a novel caching strategy based on this framework. They formulate the hierarchical collaborative content placement problem as an optimisation problem that maximises the latency saving under the constraint of limited cache capacity. Since finding the optimal solution is NP‐hard, they propose a genetic placement algorithm that finds a near‐optimal solution at reduced computational complexity. Numerical experiments show that the proposed algorithm significantly outperforms the reference algorithms.
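The paper's exact formulation is not reproduced on this page, but the general approach it names (a genetic algorithm searching for a near‐optimal content placement under a cache capacity constraint) can be illustrated with a minimal sketch. The Python below is a hypothetical, simplified single‐cache instance: the content sizes, latency‐saving values, capacity, and GA parameters are all illustrative assumptions, not values from the paper, and the paper's hierarchical two‐tier (gateway plus MEC server) collaboration is omitted for brevity.

```python
import random

# Hypothetical problem data (illustrative only, not from the paper):
# N candidate contents, each with a size and a latency saving if cached,
# and a single cache of limited capacity. A placement is a 0/1 vector.
random.seed(0)
N = 20
CAPACITY = 50
sizes = [random.randint(1, 10) for _ in range(N)]
savings = [random.uniform(1.0, 10.0) for _ in range(N)]

def fitness(placement):
    """Total latency saving of a placement; infeasible ones score zero."""
    used = sum(s for s, p in zip(sizes, placement) if p)
    if used > CAPACITY:
        return 0.0
    return sum(v for v, p in zip(savings, placement) if p)

def crossover(a, b):
    """Single-point crossover of two parent placements."""
    point = random.randrange(1, N)
    return a[:point] + b[point:]

def mutate(placement, rate=0.05):
    """Flip each caching decision with a small probability."""
    return [1 - g if random.random() < rate else g for g in placement]

def genetic_placement(pop_size=40, generations=200):
    """Evolve placements: keep the fittest quarter, refill by
    crossover and mutation, and return the best placement found."""
    population = [[random.randint(0, 1) for _ in range(N)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elite = population[: pop_size // 4]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            children.append(mutate(crossover(a, b)))
        population = elite + children
    return max(population, key=fitness)

best = genetic_placement()
print("placement:", best)
print("latency saving: %.2f" % fitness(best))
```

This toy version already captures why a genetic algorithm suits the NP‐hard placement problem: it explores the exponential space of 0/1 placements stochastically rather than exhaustively, trading optimality guarantees for tractable runtime, which matches the near‐optimal, reduced‐complexity goal stated in the abstract.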
