Predicting future location in mobile cache based on variable order of prediction-by-partial-matching algorithm
Author(s) -
Lincan Li,
Chiew Foong Kwong,
Fenghu Chen,
Qianyu Liu,
Jing Wang
Publication year - 2018
Publication title -
Nottingham ePrints (University of Nottingham)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.1049/cp.2018.1729
Subject(s) - computer science , cache , cache algorithms , matching (statistics) , real time computing , computer network , algorithm , cpu cache , artificial intelligence
Mobile caching at the edge of the wireless network has been regarded as an ideal approach to alleviate user access latency. However, if a user terminal (UT) moves too fast when entering a serving cache area, it may not have enough time to acquire the required data from the cache. One solution is to predict the UT's future location and pre-place the requested content at the cache devices that will appear on the UT's future path. Once the UT arrives at the serving cache area, it can acquire the data directly, since the data is already at that location, rather than sending a request to update the cache. The key to achieving this reliably is the accuracy of the location prediction. This paper presents a location prediction model based on the prediction-by-partial-matching (PPM) algorithm for mobile cache design. The performance of this model is compared using first-order, second-order, and third-order contexts, respectively. All the models are evaluated on a real-world dataset.
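The core idea the abstract describes — predicting the UT's next location from a history of visited cache areas via PPM with variable context order — can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the class name, the cell-ID trace, and the majority-vote prediction rule are assumptions; the defining PPM behaviour shown is counting contexts of length 1..k and backing off from the longest matching context to shorter ones.

```python
from collections import defaultdict


class PPMPredictor:
    """Order-k prediction-by-partial-matching over a location sequence.

    Counts every context of length 1..k observed in the training path.
    At prediction time it backs off from the longest matching context
    to progressively shorter ones until one with observations is found.
    """

    def __init__(self, order):
        self.order = order
        # counts[context][next_location] -> number of occurrences
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, path):
        for i in range(1, len(path)):
            for k in range(1, self.order + 1):
                if i - k < 0:
                    break
                ctx = tuple(path[i - k:i])
                self.counts[ctx][path[i]] += 1

    def predict(self, recent):
        # Back off: try the longest context first, then shorter ones.
        for k in range(min(self.order, len(recent)), 0, -1):
            ctx = tuple(recent[-k:])
            if ctx in self.counts:
                successors = self.counts[ctx]
                return max(successors, key=successors.get)
        return None  # history never observed; no prediction


# Hypothetical trace of serving-cell IDs, for illustration only.
trace = ["A", "B", "C", "A", "B", "C", "A", "B"]
model = PPMPredictor(order=2)
model.train(trace)
print(model.predict(["A", "B"]))  # context ("A", "B") was always followed by "C"
```

A higher order captures longer movement patterns but needs more training data before contexts repeat, which is presumably why the paper compares first-, second-, and third-order contexts on a real-world dataset.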
