On detecting TCP path saturation in LTE networks
Author(s) -
Schulte Lennart,
Boz Eren,
Varis Nuutti,
Manner Jukka
Publication year - 2017
Publication title -
international journal of communication systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.344
H-Index - 49
eISSN - 1099-1131
pISSN - 1074-5351
DOI - 10.1002/dac.3334
Subject(s) - computer science , computer network , transmission control protocol , tcp tuning , tcp acceleration , tcp friendly rate control , tcp global synchronization , tcp westwood plus , cellular network , network packet , path (computing)
Summary A significant part of the data in mobile networks is transferred as bulk data over the transmission control protocol (TCP), for example app or video downloads. When a video takes too long to start, users are more prone to abandon watching, which eventually leads to decreased revenue for the content provider. While it is widely known that TCP has performance issues in mobile networks, end-to-end measurement studies, especially those based on real data, are still needed. In this paper, we measure the efficiency of TCP in long term evolution (LTE) networks and analyze the reasons for suboptimal performance based on 235 000 measurements from deployed mobile networks. For this purpose, we propose an algorithm for LTE networks that detects the periods during a TCP connection where the path is saturated, i.e., where the network is the limiting factor. From this data, we find that TCP itself is a source of suboptimal performance, and reasons assumed in other papers are partly confirmed and partly refuted. Most importantly, we find that the amount of queueing on the path has a significant impact on the achieved protocol performance. Lastly, we learn from the LTE experience and put the findings into a 5G context.
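The abstract's central idea is flagging the periods of a TCP connection during which the path, rather than the sender or application, limits throughput. The paper's actual detector is not reproduced here; the sketch below is a hypothetical illustration of the general principle, assuming per-interval samples of goodput and RTT from a connection trace: when RTT is inflated well above its baseline (indicating queueing on the path) while goodput has plateaued near its running maximum, the path is likely saturated. The function name, parameters, and thresholds (`rtt_factor`, `tput_frac`) are illustrative assumptions, not the paper's algorithm.

```python
# Hypothetical path-saturation detector, not the paper's algorithm.
# Assumes a list of (goodput, rtt) samples taken at fixed intervals
# over one TCP connection; goodput in bytes/s, rtt in ms.

def detect_saturation(samples, rtt_factor=1.5, tput_frac=0.9):
    """Return one bool per interval: True if the path looks saturated.

    Heuristic: the minimum observed RTT approximates the propagation
    delay; an interval counts as saturated when RTT exceeds that
    baseline by `rtt_factor` (queue build-up) AND goodput sits within
    `tput_frac` of the running peak (throughput has plateaued).
    """
    if not samples:
        return []
    base_rtt = min(rtt for _, rtt in samples)  # propagation-delay estimate
    peak = 0.0
    flags = []
    for goodput, rtt in samples:
        peak = max(peak, goodput)
        queued = rtt > rtt_factor * base_rtt     # RTT inflated by queueing
        near_peak = goodput >= tput_frac * peak  # goodput has flattened
        flags.append(queued and near_peak)
    return flags


# Example trace: goodput ramps up, plateaus while RTT inflates, then drops.
trace = [(10, 50), (20, 50), (30, 80), (30, 90), (30, 95), (15, 55)]
print(detect_saturation(trace))  # middle intervals flagged as saturated
```

The abstract's finding that queueing strongly affects protocol performance maps directly onto the RTT-inflation test here: deep on-path buffers delay loss signals, so loss-based congestion control only sees the saturation long after the queue has grown.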
