Modern cellular networks based on the Long Term Evolution (LTE) and upcoming 5G sets of standards face an ever-increasing demand for low-latency mobile data from connected devices. Header compression is employed to minimise the overhead of IP-based cellular network traffic, thereby decreasing overall bandwidth usage and, consequently, transmission delays. Here, we employ machine learning approaches to predict the compression utility of Robust Header Compression (ROHC) versions 1 and 2 for VoIP transmissions, which allows the compression to adapt dynamically to varying channel conditions. We evaluate various regression models using R² and mean squared error (MSE) scores alongside model complexity (number of coefficients), based on an RTP-specific training data set and separately captured live VoIP audio calls. We find that the proposed weighted Ridge regression model explains at least 50 % of the observed results, and the accuracy score reaches up to 94 % for some of the VoIP transmissions.
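As an illustration of the kind of model the abstract describes, the sketch below fits a weighted Ridge regression in closed form and scores it with R² and MSE. This is a minimal, hypothetical example on synthetic data, not the paper's actual pipeline: the feature matrix, sample weights, and regularisation strength `alpha` are stand-ins, since the RTP-specific training data and the authors' weighting scheme are not given here.

```python
import numpy as np

def weighted_ridge(X, y, sample_weight, alpha=1.0):
    """Closed-form weighted Ridge fit: solves (X'WX + alpha*I) c = X'Wy."""
    W = np.diag(sample_weight)
    A = X.T @ W @ X + alpha * np.eye(X.shape[1])
    b = X.T @ W @ y
    return np.linalg.solve(A, b)

def r2_score(y_true, y_pred):
    """Coefficient of determination: fraction of variance explained."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def mse(y_true, y_pred):
    """Mean squared error of the predictions."""
    return np.mean((y_true - y_pred) ** 2)

# Synthetic stand-in for RTP/VoIP-derived features (hypothetical data).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
coef_true = np.array([1.5, -2.0, 0.5])
y = X @ coef_true + rng.normal(scale=0.1, size=200)
w = np.ones(200)                      # uniform sample weights for the sketch

coef = weighted_ridge(X, y, w, alpha=0.1)
pred = X @ coef
print(f"R2 = {r2_score(y, pred):.3f}, MSE = {mse(y, pred):.4f}, "
      f"coefficients = {coef.size}")
```

In practice, non-uniform `sample_weight` values would emphasise packets or call segments deemed more important, and model complexity is read off directly as the number of fitted coefficients.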
State: Published - 2017